CN116029716A - Remote payment method, electronic equipment and system - Google Patents
Remote payment method, electronic equipment and system
- Publication number
- CN116029716A (application CN202111248039.1A)
- Authority
- CN
- China
- Prior art keywords
- terminal
- video stream
- payment
- code
- user
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G06Q20/32 — Payment architectures, schemes or protocols characterised by the use of specific devices or networks using wireless devices
- G06Q20/38 — Payment protocols; details thereof
- G06Q20/40 — Authorisation, e.g. identification of payer or payee, verification of customer or shop credentials; review and approval of payers, e.g. check credit lines or negative lists
- H04N21/44 — Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs
- H04N21/4405 — Processing of video elementary streams involving video stream decryption
- H04N21/488 — Data services, e.g. news ticker
Abstract
The application provides a remote payment method, electronic equipment and a system. The method includes the following steps: a first terminal and a second terminal establish a communication connection; the first terminal acquires a first video stream and a second video stream, where the second video stream includes a payment code; the first terminal synthesizes the first video stream and the second video stream to obtain a composite video stream; the first terminal sends the composite video stream to the second terminal; and the second terminal completes payment for the payment code based on the composite video stream. By sending the payment code to the payer as part of a composite video stream, the embodiments of the application ensure that remote payment is both secure and effective.
Description
Technical Field
The embodiment of the application relates to the technical field of electronics, in particular to a remote payment method, electronic equipment and a system.
Background
With the continuous development of mobile payment technology, more and more people conduct transactions through electronic payment, which makes daily life more convenient. Electronic payment typically takes one of two forms. In the collection-code form, the payee presents a collection code and the payer scans it with a terminal device to complete the payment. In the payment-code form, the payer presents a payment code and the payee scans it with a code-scanning device to complete the collection.
However, in some scenarios the payer cannot complete an electronic transaction independently, for example when the payer's mobile phone has no payment application installed, or when the payer has no payment account (some elderly people, people with disabilities, and young children have difficulty applying for one). In such scenarios, another person often needs to complete the transaction by paying remotely on the payer's behalf. For example, the collection code may be sent through communication software to the substitute payer's mobile phone, and the substitute payer scans it to complete payment; or the payer receives a payment code from the substitute payer and presents it to the payee, who scans it to complete the payment.
While the collection code or payment code is being transmitted between the payer and the substitute payer, the code may arrive blurry or incomplete, and it may be intercepted and maliciously tampered with by a third party, so that the payer cannot accurately identify the collection code or payment code and complete a secure payment. How to realize secure and effective remote payment is a technical problem that currently needs to be solved.
Disclosure of Invention
The application provides a remote payment method, electronic equipment and a system, which can send a composite video stream, synthesized from a first video stream and a second video stream, to the payer. The second video stream includes a payment code, which may be either a collection code or a payment code, so that secure and effective remote payment is realized.
In a first aspect, an embodiment of the present application provides a remote payment method, where the method is applied to a communication system including a first terminal and a second terminal, and the method includes:
the first terminal and the second terminal establish communication connection;
the first terminal acquires a first video stream and a second video stream through a first camera and a second camera respectively, wherein the second video stream includes a collection code;
the first terminal synthesizes the first video stream and the second video stream to obtain a synthesized video stream;
the first terminal sends the synthesized video stream to the second terminal;
the second terminal identifies the collection code from the composite video stream;
the second terminal completes payment for the collection code.
In this method, the first video stream and the second video stream are synthesized and the composite video stream is sent to the payer. This prevents tampering by a third party, ensuring the security of the collection code during transmission. Moreover, because the payment code is updated in real time over the video connection, the payer is guaranteed to receive a valid payment code and will not fail to recognize it because it has expired. Secure and effective remote payment can therefore be realized.
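The steps above can be sketched end to end as a toy simulation (the function names, the string frames, and the `CODE:` marker are illustrative assumptions, not part of the disclosure):

```python
# Toy end-to-end simulation of the claimed flow. Frames are strings; the
# collection code is carried as the content of the second stream's frames.
def compose(stream_a, stream_b):
    # pair same-time frames from the two streams into composite frames
    return [(a, b) for a, b in zip(stream_a, stream_b)]

def identify_collection_code(composite_stream):
    # the second terminal scans composite frames for a collection code
    for _, rear in composite_stream:
        if rear.startswith("CODE:"):
            return rear[len("CODE:"):]
    return None

first_cam = ["face-frame-0", "face-frame-1"]             # first camera: user's face
second_cam = ["CODE:merchant-123", "CODE:merchant-123"]  # second camera: collection code
composite = compose(first_cam, second_cam)               # synthesize
code = identify_collection_code(composite)               # identify the collection code
payment_done = code == "merchant-123"                    # pay the code's account
```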
With reference to the first aspect, in one possible implementation manner, after the first terminal and the second terminal establish a communication connection, the method further includes:
when the first terminal detects a first user operation, it sends a payment request to the second terminal, the payment request being used to request payment;
after receiving the payment request, the second terminal displays prompt information and a control, the prompt information being used to prompt the user whether to receive a collection code from the first terminal;
when the second terminal detects a second user operation on the control, it sends an agreement message to the first terminal;
the first terminal acquiring the first video stream and the second video stream through the first camera and the second camera respectively includes: after receiving the agreement message, the first terminal acquires the first video stream and the second video stream through the first camera and the second camera respectively.
With reference to the first aspect, in one possible implementation manner, the first video stream includes a face of a user of the first terminal.
In the embodiment of the application, the first terminal can capture the user's face with a camera to obtain the first video stream, so that the user of the second terminal can confirm from the face that the payment is safe before completing it, which improves payment security.
With reference to the first aspect, in one possible implementation manner, the first terminal synthesizes the first video stream and the second video stream to obtain a synthesized video stream, including:
the first terminal synthesizes two frames of images with the same shooting time in the first video stream and the second video stream into one frame of image in the composite video stream.
With reference to the first aspect, in one possible implementation manner, the size of an image in the first video stream is smaller than the size of an image in the second video stream; the first image and the second image are two frames with the same shooting time in the first video stream and the second video stream, respectively; and the first terminal synthesizing the two frames with the same shooting time into one frame of the composite video stream includes:
the first terminal replaces the pixels of a preset area in the second image with the pixels of the first image to obtain one frame of the composite video stream, the size of the preset area being the size of the first image.
In the embodiment of the application, the sizes of the first video stream and the second video stream are preset, so that images can be synthesized directly by pixel replacement without scaling, which improves the efficiency of video synthesis.
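The pixel-replacement composition described above can be sketched as follows (the frame layout and region placement are illustrative assumptions; real frames would be camera buffers, not nested lists):

```python
# Minimal sketch: a frame is an H x W grid of pixels. The smaller front-camera
# frame replaces a preset region of the larger rear-camera frame, as described
# for the picture-in-picture composition, with no scaling.
from copy import deepcopy

def compose_frame(rear_frame, front_frame, top=0, left=0):
    """Return a composite frame: front_frame pasted into rear_frame
    at the preset region starting at (top, left)."""
    out = deepcopy(rear_frame)
    for r, row in enumerate(front_frame):
        for c, pixel in enumerate(row):
            out[top + r][left + c] = pixel  # direct pixel replacement
    return out

# Usage: a 4x4 rear frame and a 2x2 front frame (values stand in for pixels).
rear = [[0] * 4 for _ in range(4)]
front = [[1, 1], [1, 1]]
composite = compose_frame(rear, front, top=0, left=2)
```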
With reference to the first aspect, in one possible implementation manner, the first terminal sending the composite video stream to the second terminal includes: the first terminal encrypts the composite video stream to obtain an encrypted composite video stream, and sends the encrypted composite video stream to the second terminal;
the method further includes: when the second terminal receives the encrypted composite video stream, it decrypts the stream to obtain the composite video stream.
In the embodiment of the application, encrypting the composite video stream prevents the collection code from being tampered with by a third party.
With reference to the first aspect, in one possible implementation manner, the encrypting, by the first terminal, the composite video stream to obtain an encrypted composite video stream includes:
the first terminal encrypts the composite video stream based on the private key to obtain an encrypted composite video stream;
the first terminal sends the encrypted composite video stream to the second terminal, comprising: the first terminal sends the encrypted synthesized video stream and the identity of the first terminal to the second terminal;
and when the second terminal receives the encrypted composite video stream, decrypting it to obtain the composite video stream includes:
when receiving the encrypted composite video stream and the identity of the first terminal, the second terminal acquires the public key corresponding to the identity of the first terminal based on a stored correspondence between identities and public keys;
and the second terminal decrypts the encrypted composite video stream through the obtained public key to obtain the composite video stream.
In the embodiment of the application, the first terminal encrypts the composite video stream with a private key that only the first terminal holds, and the second terminal decrypts the encrypted stream with the public key corresponding to that private key. If a third party tampers with the composite video stream, it does not have the private key; when the second terminal receives a stream sent by the third party, it cannot decrypt it with the public key and can determine that the stream was not sent by the first terminal. The method thus guarantees that the composite video stream was sent by the first terminal and avoids accepting unsafe payment requests.
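The scheme described here acts, in effect, as a digital signature: only the holder of the private key can produce a stream that the public key will verify. Below is a stdlib-only sketch of the identity-lookup-and-verify flow, using HMAC as a symmetric stand-in for the asymmetric signature (an assumption for illustration; a real implementation would use an asymmetric algorithm such as RSA or Ed25519, and the registry would map identities to public keys):

```python
import hmac
import hashlib

# Hypothetical registry mapping a terminal's identity to its verification key.
KEY_REGISTRY = {"terminal-1": b"shared-verification-key"}

def sign_stream(stream_bytes, key):
    """First terminal: attach an authentication tag to the stream."""
    return hmac.new(key, stream_bytes, hashlib.sha256).hexdigest()

def verify_stream(identity, stream_bytes, tag):
    """Second terminal: look up the key for `identity` and check the tag.
    A failed check means the stream was not sent by that terminal."""
    key = KEY_REGISTRY.get(identity)
    if key is None:
        return False
    expected = hmac.new(key, stream_bytes, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, tag)

stream = b"composite-video-frames"
tag = sign_stream(stream, KEY_REGISTRY["terminal-1"])
ok = verify_stream("terminal-1", stream, tag)         # genuine stream
bad = verify_stream("terminal-1", b"tampered", tag)   # tampered stream
```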
With reference to the first aspect, in one possible implementation manner, the second terminal completing payment for the collection code based on the composite video stream includes:
the second terminal identifies the collection code from the composite video stream;
the second terminal displays the payment page corresponding to the collection code;
and the second terminal, in response to a user operation on the payment page, pays the account corresponding to the collection code.
With reference to the first aspect, in one possible implementation manner, the method further includes:
the second terminal identifies the environmental information in each frame of the composite video stream, the environmental information including at least one feature;
the second terminal determines that payment is safe when the number of features shared by every two adjacent frames of the composite video stream is larger than a preset number;
the second terminal completing payment for the collection code based on the composite video stream includes: the second terminal completes payment for the collection code based on the composite video stream when it determines that payment is safe.
In the embodiment of the application, after receiving the composite video stream, the second terminal can identify the environmental information in it, and it completes payment for the collection code based on the composite video stream only when payment is determined to be safe. The method further ensures payment security.
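The adjacent-frame check can be sketched as follows (feature extraction is assumed to have already happened; representing each frame by the set of features detected in it is an illustrative simplification):

```python
# Sketch of the adjacent-frame environmental consistency check. Each frame is
# represented by the set of environmental features detected in it; payment is
# considered safe only if every adjacent pair shares enough features.
def payment_is_safe(frame_features, preset_number):
    """Return True if every pair of adjacent frames shares more than
    `preset_number` environmental features."""
    for prev, curr in zip(frame_features, frame_features[1:]):
        if len(prev & curr) <= preset_number:
            return False  # scene changed abruptly: possible splicing/tampering
    return True

frames = [{"desk", "lamp", "window"},
          {"desk", "lamp", "window", "cup"},
          {"desk", "lamp", "cup"}]
safe = payment_is_safe(frames, preset_number=1)  # adjacent frames share >= 2 features
```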
With reference to the first aspect, in a possible implementation manner, the method further includes:
the second terminal identifies a front portion and a rear portion in each frame of the composite video stream, the front portion being the image captured by the first camera and the rear portion being the image captured by the second camera;
the second terminal calculates the motion vectors of the front portion and the rear portion across two adjacent frames of the composite video stream;
the second terminal determines that payment is safe when the direction of the motion vector of the front portion is consistent with the direction of the motion vector of the rear portion;
the second terminal completing payment for the collection code based on the composite video stream includes: the second terminal completes payment for the collection code based on the composite video stream when it determines that payment is safe.
In the embodiment of the application, after receiving the composite video stream, the second terminal can compare the motion vectors of the front and rear portions, and it completes payment for the collection code based on the composite video stream only when payment is determined to be safe. If both portions come from the same physical device, camera motion should move them in a consistent direction; inconsistent motion suggests that one portion was spliced in. The method further ensures payment security.
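The direction-consistency test can be sketched as follows (the motion vectors are assumed to be already estimated, e.g. by block matching, and the angle threshold is an illustrative assumption):

```python
import math

# Sketch of the motion-consistency check: the front-portion and rear-portion
# motion vectors are "consistent" when the angle between them is small.
def directions_consistent(front_vec, rear_vec, max_angle_deg=30.0):
    fx, fy = front_vec
    rx, ry = rear_vec
    fn = math.hypot(fx, fy)
    rn = math.hypot(rx, ry)
    if fn == 0 or rn == 0:
        return True  # no motion in one portion: nothing to contradict
    cos_angle = (fx * rx + fy * ry) / (fn * rn)
    return cos_angle >= math.cos(math.radians(max_angle_deg))

same = directions_consistent((1.0, 0.1), (2.0, 0.0))    # both roughly rightward
split = directions_consistent((1.0, 0.0), (-1.0, 0.0))  # opposite directions
```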
In a second aspect, an embodiment of the present application provides a remote payment method, where the method is applied to a communication system including a first terminal and a second terminal, and the method includes:
the first terminal and the second terminal establish communication connection;
the first terminal acquires a first video stream through a camera and generates a second video stream based on a payment code, the second video stream including the payment code;
the first terminal synthesizes the first video stream and the second video stream to obtain a composite video stream, the composite video stream including the payment code;
The first terminal sends the synthesized video stream to the second terminal;
the second terminal displays the composite video stream.
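Generating the second video stream from a static payment code can be as simple as repeating the rendered code image once per captured camera frame, so the two streams stay time-aligned for frame-by-frame synthesis (a sketch; the rendered code image is represented by a placeholder string):

```python
# Sketch: produce a "video stream" for a static payment code by emitting the
# rendered code image once per camera-frame timestamp, so both streams have
# matching timestamps for later frame-by-frame synthesis.
def payment_code_stream(code_image, timestamps):
    return [(t, code_image) for t in timestamps]

camera_timestamps = [0, 33, 66, 100]  # ms; one entry per first-stream frame
code_stream = payment_code_stream("QR(pay-456)", camera_timestamps)
```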
With reference to the second aspect, in one possible implementation manner, after the first terminal and the second terminal establish a communication connection, the method further includes:
when the second terminal detects a first user operation, it sends a payment request to the first terminal, the payment request being used to request a payment code;
after receiving the payment request, the first terminal displays prompt information and a control, the prompt information being used to prompt the user whether to send a payment code to the second terminal;
the first terminal acquiring the first video stream through the camera and generating the second video stream based on the payment code includes: when the first terminal detects a second user operation on the control, it acquires the first video stream through the camera and generates the second video stream from the payment code.
With reference to the second aspect, in one possible implementation manner, the first video stream includes a face of a user of the first terminal.
With reference to the second aspect, in one possible implementation manner, the first terminal synthesizes the first video stream and the second video stream to obtain a synthesized video stream, including:
the first terminal synthesizes two frames of images acquired at the same time in the first video stream and the second video stream into one frame of image in the synthesized video stream.
With reference to the second aspect, in one possible implementation manner, the size of an image in the first video stream is smaller than the size of an image in the second video stream, the first image and the second image are two frames of images with the same acquisition time in the first video stream and the second video stream, respectively, and the first terminal synthesizes the two frames of images with the same acquisition time in the first video stream and the second video stream into one frame of image in the composite video stream, including:
the first terminal replaces pixels of a preset area in the second image with pixels of the first image to obtain a frame of image in the synthesized video stream; the size of the preset area is the size of the first image.
With reference to the second aspect, in one possible implementation manner, the sending, by the first terminal, the composite video stream to the second terminal includes: the first terminal encrypts the synthesized video stream to obtain an encrypted synthesized video stream; the first terminal sends the encrypted synthesized video stream to the second terminal;
the method further includes: when the second terminal receives the encrypted composite video stream, it decrypts the stream to obtain the composite video stream.
With reference to the second aspect, in one possible implementation manner, the encrypting, by the first terminal, the composite video stream to obtain an encrypted composite video stream includes:
The first terminal encrypts the composite video stream based on the private key to obtain an encrypted composite video stream;
the first terminal sends the encrypted composite video stream to the second terminal, comprising: the first terminal sends the encrypted synthesized video stream and the identity of the first terminal to the second terminal;
and when the second terminal receives the encrypted composite video stream, decrypting it to obtain the composite video stream includes:
when receiving the encrypted composite video stream and the identity of the first terminal, the second terminal acquires the public key corresponding to the identity of the first terminal based on a stored correspondence between identities and public keys;
and the second terminal decrypts the encrypted composite video stream through the obtained public key to obtain the composite video stream.
With reference to the second aspect, in one possible implementation manner, the method further includes:
the second terminal identifies the environmental information in each frame of the composite video stream, the environmental information including at least one feature;
the second terminal determines that payment is safe when the number of features shared by every two adjacent frames of the composite video stream is larger than a preset number;
the second terminal displaying the composite video stream includes: the second terminal displays the composite video stream when it determines that payment is safe.
With reference to the second aspect, in a possible implementation manner, the method further includes:
the second terminal identifies a front portion and a rear portion in each frame of the composite video stream, the front portion being the image captured by the first camera and the rear portion being the image captured by the second camera;
the second terminal calculates the motion vectors of the front portion and the rear portion across two adjacent frames of the composite video stream;
the second terminal determines that payment is safe when the direction of the motion vector of the front portion is consistent with the direction of the motion vector of the rear portion;
the second terminal displaying the composite video stream includes: the second terminal displays the composite video stream when it determines that payment is safe.
In a third aspect, an embodiment of the present application provides a remote payment method, applied to a first terminal, where the method includes:
after the first terminal establishes a communication connection with the second terminal, it acquires a first video stream and a second video stream through a first camera and a second camera respectively, the second video stream including a collection code;
the first terminal synthesizes the first video stream and the second video stream to obtain a composite video stream;
the first terminal sends the composite video stream to the second terminal, the composite video stream being used by the second terminal to identify the collection code from it and complete payment for the collection code.
With reference to the third aspect, in one possible implementation manner, after the first terminal establishes a communication connection with the second terminal, the method further includes:
when the first terminal detects a first user operation, it sends a payment request to the second terminal, the payment request being used to request payment;
the first terminal acquiring the first video stream and the second video stream through the first camera and the second camera respectively includes: after receiving an agreement message, the first terminal acquires the first video stream and the second video stream through the first camera and the second camera respectively, the agreement message being generated by the second terminal based on a second user operation after receiving the payment request.
With reference to the third aspect, in one possible implementation manner, the first video stream includes a face of a user of the first terminal.
With reference to the third aspect, in one possible implementation manner, the first terminal synthesizes the first video stream and the second video stream to obtain a synthesized video stream, including:
the first terminal synthesizes two frames of images with the same shooting time in the first video stream and the second video stream into one frame of image in the composite video stream.
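The application does not specify how the two time-aligned frames are merged. As an illustrative sketch only (not part of the claimed embodiments), the following composes each rear-camera frame into a corner of the front-camera frame captured at the same time, picture-in-picture style; the frame representation (2-D lists of grayscale pixels) and the placement at the top-left origin are assumptions made here for illustration.

```python
def compose_frame(front, rear, origin=(0, 0)):
    """Embed the rear-camera frame into the front-camera frame at the
    given (row, col) origin, picture-in-picture style.  Frames are 2-D
    lists of grayscale pixel values; the front frame is not modified."""
    r0, c0 = origin
    out = [row[:] for row in front]          # copy the front frame
    for r, row in enumerate(rear):
        for c, px in enumerate(row):
            out[r0 + r][c0 + c] = px         # overwrite with rear pixels
    return out

def compose_streams(front_stream, rear_stream):
    """Pair frames captured at the same time (same index here) and
    compose each pair into one frame of the composite video stream."""
    return [compose_frame(f, r) for f, r in zip(front_stream, rear_stream)]
```

In practice the pairing would be done by capture timestamp rather than by list index, and the composition by a hardware video pipeline; the sketch only shows the per-frame merge described above.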
With reference to the third aspect, in one possible implementation manner, the sending, by the first terminal, the composite video stream to the second terminal includes:
The first terminal encrypts the synthesized video stream to obtain an encrypted synthesized video stream;
the first terminal sends the encrypted composite video stream to the second terminal, and the encrypted composite video stream is used for the second terminal to acquire the composite video stream.
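The application does not fix an encryption algorithm for the composite video stream. As a toy stand-in (deliberately simple and not cryptographically suitable for real payments), the sketch below XORs the serialized stream bytes with a SHA-256-derived keystream, so applying the same operation with the same key decrypts.

```python
import hashlib

def keystream(key: bytes, length: int) -> bytes:
    """Derive a keystream of the given length by hashing the key with a
    running counter (toy construction, for illustration only)."""
    out = bytearray()
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(out[:length])

def xor_crypt(key: bytes, data: bytes) -> bytes:
    """Encrypt (or, applied again, decrypt) data with the keystream."""
    ks = keystream(key, len(data))
    return bytes(a ^ b for a, b in zip(data, ks))
```

A real implementation would use an authenticated cipher; the point here is only the round trip: the second terminal recovers the composite video stream by reversing the first terminal's encryption with the corresponding key.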
In a fourth aspect, an embodiment of the present application provides a remote payment method, applied to a second terminal, where the method includes:
after the second terminal establishes a communication connection with the first terminal, receiving a composite video stream sent by the first terminal, where the composite video stream is synthesized by the first terminal from a first video stream and a second video stream acquired through a first camera and a second camera respectively, and the second video stream includes a checkout code;
the second terminal identifies a checkout code from the composite video stream;
the second terminal completes payment for the checkout code.
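The identification step amounts to scanning the received frames until one yields a decodable code. The sketch below is illustrative only: the `decode` callable is an assumption standing in for a real detector (in practice something like OpenCV's `QRCodeDetector` applied to each frame), since the application does not name a decoding method.

```python
def find_code(frames, decode):
    """Scan composite frames in order; return the first payload the
    supplied decoder recognizes, or None if no frame contains a code.

    `decode` is a hypothetical helper: it takes one frame and returns
    the code payload, or None when the frame holds no readable code."""
    for frame in frames:
        payload = decode(frame)
        if payload is not None:
            return payload
    return None
```

Scanning every frame makes the identification robust to motion blur or occlusion in individual frames: one clean frame anywhere in the stream suffices.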
With reference to the fourth aspect, in one possible implementation manner, before the second terminal receives the composite video stream sent by the first terminal, the method includes:
after receiving a payment request from the first terminal, the second terminal displays prompt information and a control, where the prompt information is used to prompt the user whether to receive a checkout code from the first terminal;
and when the second terminal detects the second user operation for the control, sending an agreement message to the first terminal, where the agreement message is used to instruct the first terminal to send the composite video stream to the second terminal.
With reference to the fourth aspect, in one possible implementation manner, the receiving, by the second terminal, the composite video stream sent by the first terminal includes:
the second terminal receives the encrypted composite video stream sent by the first terminal;
and the second terminal decrypts the encrypted composite video stream to obtain the composite video stream.
In a fifth aspect, an embodiment of the present application provides a remote payment method, applied to a first terminal, where the method includes:
after the first terminal establishes communication connection with the second terminal, acquiring a first video stream through a camera, and generating a second video stream based on a payment code, wherein the second video stream comprises the payment code;
the first terminal synthesizes the first video stream and the second video stream to obtain a composite video stream, where the composite video stream includes the payment code;
the first terminal transmits the composite video stream to the second terminal so that the second terminal displays the composite video stream.
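Generating the second video stream "based on a payment code" can be pictured as replicating the code image into a timestamped sequence of frames, so each code frame can later be paired with the camera frame captured at the same instant. The sketch below is an assumption about this step, not the application's stated implementation; the `(timestamp, image)` tuple representation is invented here for illustration.

```python
def payment_code_stream(code_image, n_frames, fps=30.0, start=0.0):
    """Build a synthetic video stream whose every frame carries the same
    payment-code image, with timestamps spaced at the frame rate so each
    frame pairs with the camera frame captured at the same time."""
    return [(start + i / fps, code_image) for i in range(n_frames)]
```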
With reference to the fifth aspect, in one possible implementation manner, after the first terminal establishes a communication connection with the second terminal, the method further includes:
after receiving a payment request from a second terminal, the first terminal displays prompt information and a control, wherein the prompt information is used for prompting a user whether to send a payment code to the second terminal, the payment request is generated by the second terminal based on the operation of the first user, and the payment request is used for requesting the payment code;
The first terminal obtains the first video stream through the camera and generates the second video stream based on the payment code, comprising: when the first terminal detects the second user operation for the control, the first terminal obtains the first video stream through the camera and generates the second video stream according to the payment code.
With reference to the fifth aspect, in one possible implementation manner, the first video stream includes a face of a user of the first terminal.
With reference to the fifth aspect, in one possible implementation manner, the first terminal synthesizes the first video stream and the second video stream to obtain a synthesized video stream, including:
the first terminal synthesizes two frames of images acquired at the same time in the first video stream and the second video stream into one frame of image in the synthesized video stream.
With reference to the fifth aspect, in one possible implementation manner, the sending, by the first terminal, the composite video stream to the second terminal includes:
the first terminal encrypts the synthesized video stream to obtain an encrypted synthesized video stream;
the first terminal sends the encrypted composite video stream to the second terminal, and the encrypted composite video stream is used for the second terminal to acquire the composite video stream.
In a sixth aspect, an embodiment of the present application provides a remote payment method, applied to a second terminal, where the method includes:
After the second terminal establishes communication connection with the first terminal, receiving a synthesized video stream sent by the first terminal, wherein the synthesized video stream comprises a payment code, the synthesized video stream is synthesized by the first video stream and a second video stream, the first video stream is acquired by the first terminal through a camera, the second video stream is generated by the first terminal based on the payment code, and the second video stream comprises the payment code;
the second terminal displays the composite video stream.
With reference to the sixth aspect, in one possible implementation manner, before the second terminal receives the composite video stream sent by the first terminal, the method includes:
and when the second terminal detects the operation of the first user, sending a payment request to the first terminal, wherein the payment request is used for requesting a payment code.
With reference to the sixth aspect, in one possible implementation manner, the first video stream includes a face of a user of the first terminal.
In a seventh aspect, the present application provides an electronic device. The electronic device may include a memory and a processor. The memory may be used to store a computer program. The processor may be configured to invoke the computer program to cause the electronic device to perform the method as in the third aspect or any of the possible implementations of the third aspect, or to perform the method as in the fourth aspect or any of the possible implementations of the fourth aspect.
In an eighth aspect, the present application provides an electronic device. The electronic device may include a memory and a processor. Wherein the memory may be used to store a computer program. The processor may be configured to invoke a computer program to cause the electronic device to perform as in the fifth aspect or any of the possible implementations of the fifth aspect or to perform as in the sixth aspect or any of the possible implementations of the sixth aspect.
In a ninth aspect, the present application provides a computer program product comprising instructions, characterized in that the computer program product, when run on an electronic device, causes the electronic device to perform the method as in the third aspect or any one of the possible implementations of the third aspect, or to perform the method as in the fourth aspect or any one of the possible implementations of the fourth aspect.
In a tenth aspect, the present application provides a computer program product comprising instructions, characterized in that the above-mentioned computer program product, when run on an electronic device, causes the electronic device to perform as in the fifth aspect or any one of the possible implementations of the fifth aspect, or to perform as in the sixth aspect or any one of the possible implementations of the sixth aspect.
In an eleventh aspect, the present application provides a computer readable storage medium comprising instructions, characterized in that the instructions, when run on an electronic device, cause the electronic device to perform the method as in the third aspect or any one of the possible implementations of the third aspect, or to perform the method as in the fourth aspect or any one of the possible implementations of the fourth aspect.
In a twelfth aspect, the present application provides a computer readable storage medium comprising instructions, characterized in that the instructions, when run on an electronic device, cause the electronic device to perform the method as in the fifth aspect or any one of the possible implementation manners of the fifth aspect, or to perform the method as in the sixth aspect or any one of the possible implementation manners of the sixth aspect.
In a thirteenth aspect, an embodiment of the present application provides a payment system, where the payment system includes a first electronic device and a second electronic device, where the first electronic device is an electronic device described in the third aspect, and the second electronic device is an electronic device described in the fourth aspect.
In a fourteenth aspect, an embodiment of the present application provides a payment system, where the payment system includes a first electronic device and a second electronic device, where the first electronic device is an electronic device described in the fifth aspect, and the second electronic device is an electronic device described in the sixth aspect.
It will be appreciated that the electronic devices provided in the seventh aspect and the eighth aspect, the computer program products provided in the ninth aspect and the tenth aspect, and the computer readable storage media provided in the eleventh aspect and the twelfth aspect are all configured to perform the methods provided by the embodiments of the present application. Therefore, for the advantageous effects they can achieve, reference may be made to those of the corresponding methods, which are not described again here.
Drawings
FIG. 1A is an architecture diagram of a remote payment system provided in an embodiment of the present application;
FIG. 1B is an architecture diagram of another remote payment system provided in an embodiment of the present application;
fig. 2 is a schematic hardware structure of an electronic device 100 according to an embodiment of the present application;
fig. 3 is a software block diagram of an electronic device 100 according to an embodiment of the present application;
FIG. 4 is an application scenario for remote payment based on a checkout code provided in an embodiment of the present application;
FIGS. 5A-5E, 6A-6D illustrate some user interfaces implemented on a first terminal and a second terminal;
FIG. 7 is a flow chart of a remote payment method provided by an embodiment of the present application;
fig. 8 is a schematic diagram of video stream synthesis according to an embodiment of the present application;
FIG. 9 is a schematic diagram of an image composition provided in an embodiment of the present application;
fig. 10 is an application scenario of remote payment based on a payment code provided in an embodiment of the present application;
FIGS. 11A-11D illustrate some user interfaces implemented on a first terminal and a second terminal;
fig. 12 is a flowchart of another remote payment method provided in an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described clearly and thoroughly below with reference to the accompanying drawings. In the description of the embodiments of the present application, unless otherwise indicated, "/" means "or"; for example, A/B may represent A or B. The text "and/or" merely describes an association relation between associated objects, indicating that three relations may exist; for example, A and/or B may indicate the three cases where A exists alone, both A and B exist, and B exists alone. In addition, in the description of the embodiments of the present application, "plural" means two or more.
The terms "first," "second," and the like are used below for descriptive purposes only and are not to be construed as indicating or implying relative importance, or as implicitly indicating the number of technical features indicated. Thus, a feature defined by "first" or "second" may explicitly or implicitly include one or more such features. In the description of the embodiments of the present application, unless otherwise indicated, "a plurality" means two or more.
The term "user interface (UI)" in the following embodiments of the present application is a media interface for interaction and information exchange between an application program or an operating system and a user, which enables conversion between an internal form of information and a form acceptable to the user. A user interface is defined by source code written in a specific computer language, such as Java or extensible markup language (XML); the interface source code is parsed and rendered on the electronic device and is finally presented as content that the user can recognize. A commonly used presentation form of the user interface is a graphical user interface (GUI), which refers to a user interface, displayed in a graphical manner, that is related to computer operations. It may be a visual interface element such as text, an icon, a button, a menu, a tab, a text box, a dialog box, a status bar, a navigation bar, or a widget displayed on the display of the electronic device.
In order to more clearly and in detail describe the remote payment method provided by the embodiment of the present application, the remote payment system provided by the embodiment of the present application is described below.
Referring to fig. 1A, fig. 1A is a schematic diagram of a remote payment system according to an embodiment of the present application. As shown in fig. 1A, the system includes a first terminal 10 and a second terminal 20. Wherein:
the first terminal 10 is provided with a front camera and a rear camera. The first terminal has a front face and a back face, which are opposite to each other: the front camera is located on the front face of the first terminal, which is the face on which the display screen of the first terminal is located; the rear camera is located on the back face of the first terminal, which is the face opposite to the one on which the display screen is located.
In some embodiments, the first terminal 10 and the second terminal 20 may establish a communication connection; the first terminal 10 may record video through the front camera and the rear camera respectively, for example, the front camera capturing face images and the rear camera capturing the checkout code; further, the first terminal 10 may fuse the images captured by the front camera and the rear camera to obtain a composite image; the first terminal 10 encrypts the composite image; the first terminal 10 may transmit the encrypted composite image to the second terminal 20 based on the above communication connection, so that the second terminal 20 verifies the identity of the first terminal 10 through the face image and completes payment based on the checkout code. For the specific implementation by which the first terminal 10 obtains the composite image and the like, reference may be made to the detailed description of the subsequent method embodiments, which is not repeated here.
In other embodiments, the remote payment system may also include a code scanning device 30. Specifically, the first terminal 10 and the second terminal 20 may establish a communication connection; the first terminal 10 may generate a payment code and capture an image through its camera; further, the first terminal 10 may fuse the image captured by the camera with the payment code to obtain a composite image; the first terminal 10 encrypts the composite image; the first terminal 10 may transmit the encrypted composite image to the second terminal 20 based on the above communication connection, so that the second terminal 20 verifies the identity of the first terminal 10 through the composite image and presents the payment code to complete the payment.
The code scanning device 30 is used to read the payment code. By the type of payment code read, code scanning devices 30 can be divided into bar code scanning devices, two-dimensional code scanning devices, and the like; by physical reading principle, they can be divided into infrared code scanning devices, laser code scanning devices, and the like. The user may also set a payment password (including but not limited to a numeric password, a graphical password, a swipe password, a fingerprint password, a face recognition password, a voiceprint recognition password, an iris recognition password, etc.) for confirming the transaction authorization when the payment is completed. When the user has set password-free payment, the payment can be made directly without the user entering a password.
The first terminal 10 and the second terminal 20 are electronic devices, including but not limited to a smartphone, a tablet computer, a personal digital assistant (PDA), a wearable electronic device with a wireless communication function (such as a smart watch or smart glasses), an augmented reality (AR) device, a virtual reality (VR) device, and the like. Exemplary embodiments of the electronic device include, but are not limited to, devices running Linux or another operating system. The electronic device may also be another portable electronic device, such as a laptop computer. It should also be appreciated that in other embodiments, the electronic device may be a desktop computer or the like instead of a portable electronic device.
The communication connection between the first terminal 10 and the second terminal 20 may be a wired connection or a wireless connection. The wireless connection may be a short-range connection, such as a wireless fidelity (Wi-Fi) connection, a Bluetooth connection, an infrared connection, an NFC connection, a ZigBee connection, or a constrained application protocol (CoAP) connection for the internet of things, or a long-range connection (including, but not limited to, a mobile network supporting 2G, 3G, 4G, 5G, and subsequent standard protocols).
Referring to fig. 1B, fig. 1B is a schematic diagram of another remote payment system according to an embodiment of the present application. As shown in fig. 1B, the system includes a first terminal 10, a second terminal 20, a code scanning device 30, a key server 40, and a payment server 50. Wherein:
the specific contents of the first terminal 10, the second terminal 20, and the code scanning device 30 may be described with reference to fig. 1A.
The key server 40 is used to provide keys to the first terminal 10 and the second terminal 20. The key server 40 may be a public key database, for example, a key management center database, a cloud server, or the like. For example, the first terminal 10 may send the identity of the first terminal 10 to the key server 40; after receiving the identity of the first terminal 10, the key server 40 may generate a public/private key pair and send the private key to the first terminal 10, so that the first terminal 10 can encrypt a composite video based on the private key, where the composite video may be generated by fusing images captured by the front camera and the rear camera of the first terminal 10, and at least one frame of the composite video includes a checkout code; the key server 40 may store the identity of the first terminal 10 together with the public key and, when the second terminal 20 requests the public key of the first terminal 10, send the public key to the second terminal 20, so that the second terminal 20 can decrypt the encrypted composite video with the public key to obtain the checkout code sent by the first terminal 10. It will be appreciated that this method ensures that what the second terminal 20 decrypts is the checkout code sent by the first terminal 10, preventing tampering with the checkout code by a third party.
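The key server's role, stripped to its essentials, is to issue a key to the sending terminal and later hand the matching verification key to the receiving terminal. The sketch below is a toy simplification: a real deployment would issue an asymmetric key pair as described above, whereas here a shared HMAC key stands in for the pair (both sides get related keys from the same authority) purely to keep the illustration in the standard library; the class and function names are invented for this sketch.

```python
import hashlib
import hmac
import secrets

class KeyServer:
    """Toy key authority: registers a terminal identity and issues keys.
    Stands in for the asymmetric key server 40; uses one shared HMAC key
    per terminal instead of a public/private pair (simplification)."""
    def __init__(self):
        self._keys = {}

    def register(self, terminal_id: str) -> bytes:
        key = secrets.token_bytes(32)
        self._keys[terminal_id] = key
        return key                       # handed to the sending terminal

    def verification_key(self, terminal_id: str) -> bytes:
        return self._keys[terminal_id]   # handed to the receiving terminal

def sign(key: bytes, data: bytes) -> bytes:
    """Tag the composite video bytes so tampering is detectable."""
    return hmac.new(key, data, hashlib.sha256).digest()

def verify(key: bytes, data: bytes, tag: bytes) -> bool:
    """Check the tag in constant time; False means data was altered."""
    return hmac.compare_digest(sign(key, data), tag)
```

The anti-tampering property claimed in the text corresponds to `verify` failing for any third-party modification of the composite video bytes.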
The payment server 50 may be a third-party server, such as a banking server, and is used to complete the payment. For example, when the first terminal 10 transmits a checkout code to the second terminal 20, the second terminal 20 may send a payment request to the payment server 50, so that the payment server 50 transfers the digital asset in the account corresponding to the second terminal 20 to the account corresponding to the checkout code, completing the payment.
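The account transfer the payment server performs can be sketched as a minimal ledger operation. The dictionary-based ledger and the overdraft rule below are assumptions made for illustration; the application does not describe the payment server's internal bookkeeping.

```python
def transfer(ledger, payer, payee, amount):
    """Move `amount` from the payer account to the payee account in a
    simple in-memory ledger (account name -> balance), rejecting
    overdrafts.  Returns True on success, False otherwise."""
    if amount < 0 or ledger.get(payer, 0) < amount:
        return False
    ledger[payer] -= amount
    ledger[payee] = ledger.get(payee, 0) + amount
    return True
```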
Fig. 2 shows a schematic hardware configuration of the electronic device 100.
It should be noted that, the first terminal 10 and the second terminal 20 shown in fig. 1A or fig. 1B may be the electronic device 100.
The embodiment will be specifically described below taking the electronic device 100 as an example. It should be understood that electronic device 100 may have more or fewer components than shown, may combine two or more components, or may have a different configuration of components. The various components shown in the figures may be implemented in hardware, software, or a combination of hardware and software, including one or more signal processing and/or application specific integrated circuits.
The electronic device 100 may include: processor 110, external memory interface 120, internal memory 121, universal serial bus (universal serial bus, USB) interface 130, charge management module 140, power management module 141, battery 142, antenna 1, antenna 2, mobile communication module 150, wireless communication module 160, audio module 170, speaker 170A, receiver 170B, microphone 170C, headset interface 170D, sensor module 180, keys 190, motor 191, indicator 192, camera 193, display 194, and subscriber identity module (subscriber identification module, SIM) card interface 195, etc. The sensor module 180 may include a pressure sensor 180A, a gyro sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
It is to be understood that the structure illustrated in the embodiments of the present application does not constitute a specific limitation on the electronic device 100. In other embodiments of the present application, electronic device 100 may include more or fewer components than shown, or certain components may be combined, or certain components may be split, or different arrangements of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 110 may include one or more processing units, such as: the processor 110 may include an application processor (application processor, AP), a modem processor, a graphics processor (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), a controller, a memory, a video codec, a digital signal processor (digital signal processor, DSP), a baseband processor, and/or a neural network processor (neural-network processing unit, NPU), etc. Wherein the different processing units may be separate devices or may be integrated in one or more processors.
The controller may be the nerve center and command center of the electronic device 100. The controller can generate operation control signals according to instruction operation codes and timing signals, to complete the control of instruction fetching and instruction execution.
A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache. The memory may hold instructions or data that the processor 110 has just used or used cyclically. If the processor 110 needs to use the instructions or data again, it can call them directly from this memory. This avoids repeated accesses and reduces the waiting time of the processor 110, thereby improving the efficiency of the system.
In some embodiments, the processor 110 may include one or more interfaces. The interfaces may include an integrated circuit (inter-integrated circuit, I2C) interface, an integrated circuit built-in audio (inter-integrated circuit sound, I2S) interface, a pulse code modulation (pulse code modulation, PCM) interface, a universal asynchronous receiver transmitter (universal asynchronous receiver/transmitter, UART) interface, a mobile industry processor interface (mobile industry processor interface, MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (subscriber identity module, SIM) interface, and/or a universal serial bus (universal serial bus, USB) interface, among others.
The I2C interface is a bi-directional synchronous serial bus comprising a serial data line (SDA) and a serial clock line (SCL). In some embodiments, the processor 110 may contain multiple sets of I2C buses. The processor 110 may be coupled to the touch sensor 180K, charger, flash, camera 193, etc., respectively, through different I2C bus interfaces. For example: the processor 110 may be coupled to the touch sensor 180K through an I2C interface, such that the processor 110 communicates with the touch sensor 180K through an I2C bus interface to implement a touch function of the electronic device 100.
The I2S interface may be used for audio communication. In some embodiments, the processor 110 may contain multiple sets of I2S buses. The processor 110 may be coupled to the audio module 170 via an I2S bus to enable communication between the processor 110 and the audio module 170. In some embodiments, the audio module 170 may transmit an audio signal to the wireless communication module 160 through the I2S interface, to implement a function of answering a call through the bluetooth headset.
PCM interfaces may also be used for audio communication to sample, quantize and encode analog signals. In some embodiments, the audio module 170 and the wireless communication module 160 may be coupled through a PCM bus interface. In some embodiments, the audio module 170 may also transmit audio signals to the wireless communication module 160 through the PCM interface to implement a function of answering a call through the bluetooth headset. Both the I2S interface and the PCM interface may be used for audio communication.
The UART interface is a universal serial data bus for asynchronous communications. The bus may be a bi-directional communication bus. It converts the data to be transmitted between serial communication and parallel communication. In some embodiments, a UART interface is typically used to connect the processor 110 with the wireless communication module 160. For example: the processor 110 communicates with a bluetooth module in the wireless communication module 160 through a UART interface to implement a bluetooth function. In some embodiments, the audio module 170 may transmit an audio signal to the wireless communication module 160 through a UART interface, to implement a function of playing music through a bluetooth headset.
The MIPI interface may be used to connect the processor 110 to peripheral devices such as a display 194, a camera 193, and the like. The MIPI interfaces include camera serial interfaces (camera serial interface, CSI), display serial interfaces (display serial interface, DSI), and the like. In some embodiments, processor 110 and camera 193 communicate through a CSI interface to implement the photographing functions of electronic device 100. The processor 110 and the display 194 communicate via a DSI interface to implement the display functionality of the electronic device 100.
The GPIO interface may be configured by software. The GPIO interface may be configured as a control signal or as a data signal. In some embodiments, a GPIO interface may be used to connect the processor 110 with the camera 193, the display 194, the wireless communication module 160, the audio module 170, the sensor module 180, and the like. The GPIO interface may also be configured as an I2C interface, an I2S interface, a UART interface, an MIPI interface, etc.
The SIM interface may be used to communicate with the SIM card interface 195 to perform functions of transferring data to or reading data from the SIM card.
The USB interface 130 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type-C interface, or the like. The USB interface 130 may be used to connect a charger to charge the electronic device 100, and may also be used to transfer data between the electronic device 100 and a peripheral device. It can also be used to connect a headset and play audio through the headset. The interface may also be used to connect other electronic devices, such as AR devices.
It should be understood that the interfacing relationship between the modules illustrated in the embodiments of the present application is only illustrative, and does not limit the structure of the electronic device 100. In other embodiments of the present application, the electronic device 100 may also use different interfacing manners, or a combination of multiple interfacing manners in the foregoing embodiments.
The charge management module 140 is configured to receive a charge input from a charger. The charger can be a wireless charger or a wired charger.
The power management module 141 is used for connecting the battery 142, and the charge management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140 and provides power to the processor 110, the internal memory 121, the external memory, the display 194, the camera 193, the wireless communication module 160, and the like.
The wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device 100 may be used to cover a single or multiple communication bands. Different antennas may also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed into a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution for wireless communication including 2G/3G/4G/5G, etc., applied to the electronic device 100. The mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (low noise amplifier, LNA), etc. The mobile communication module 150 may receive electromagnetic waves from the antenna 1, perform processes such as filtering, amplifying, and the like on the received electromagnetic waves, and transmit the processed electromagnetic waves to the modem processor for demodulation. The mobile communication module 150 can amplify the signal modulated by the modem processor, and convert the signal into electromagnetic waves through the antenna 1 to radiate. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be provided in the same device as at least some of the modules of the processor 110.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating the low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then transmits the demodulated low frequency baseband signal to the baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs sound signals through an audio device (not limited to the speaker 170A, the receiver 170B, etc.), or displays images or video through the display screen 194. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be provided in the same device as the mobile communication module 150 or other functional module, independent of the processor 110.
The wireless communication module 160 may provide solutions for wireless communication applied to the electronic device 100, including wireless local area networks (WLAN) (such as wireless fidelity (Wi-Fi) networks), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR) technology, and the like. The wireless communication module 160 may be one or more devices integrating at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, performs frequency modulation and filtering on the electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, frequency-modulate and amplify it, and convert it into electromagnetic waves for radiation via the antenna 2.
In some embodiments, the antenna 1 of the electronic device 100 is coupled to the mobile communication module 150, and the antenna 2 is coupled to the wireless communication module 160, so that the electronic device 100 may communicate with networks and other devices through wireless communication technologies. The wireless communication technologies may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technologies, among others. The GNSS may include the global positioning system (GPS), the global navigation satellite system (GLONASS), the BeiDou navigation satellite system (BDS), the quasi-zenith satellite system (QZSS), and/or satellite based augmentation systems (SBAS).
The electronic device 100 implements display functions through a GPU, a display screen 194, an application processor, and the like. The GPU is a microprocessor for image processing, and is connected to the display 194 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
The display screen 194 is used to display images, videos, and the like. The display screen 194 includes a display panel. The display panel may employ a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a MiniLED, a MicroLED, a Micro-OLED, quantum dot light-emitting diodes (QLED), or the like. In some embodiments, the electronic device 100 may include 1 or N display screens 194, where N is a positive integer greater than 1.
The electronic device 100 may implement photographing functions through an ISP, a camera 193, a video codec, a GPU, a display screen 194, an application processor, and the like.
The ISP is used to process data fed back by the camera 193. For example, when photographing, the shutter is opened and light is transmitted through the lens to the camera's photosensitive element, which converts the optical signal into an electrical signal and transmits it to the ISP for processing, where it is converted into an image visible to the naked eye. The ISP can also optimize the image's noise, brightness, and skin tone, as well as parameters such as the exposure and color temperature of the shooting scene. In some embodiments, the ISP may be disposed in the camera 193.
The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image onto the photosensitive element. The photosensitive element may be a charge coupled device (charge coupled device, CCD) or a Complementary Metal Oxide Semiconductor (CMOS) phototransistor. The photosensitive element converts the optical signal into an electrical signal, which is then transferred to the ISP to be converted into a digital image signal. The ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into an image signal in a standard RGB, YUV, or the like format. In some embodiments, electronic device 100 may include 1 or N cameras 193, N being a positive integer greater than 1.
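The DSP's conversion of the digital image signal into standard RGB or YUV formats follows well-known colorimetric formulas. As an illustrative sketch only (full-range BT.601 coefficients; this is not the device's actual pipeline code), an RGB-to-YUV conversion for a single pixel can be written as:

```python
def rgb_to_yuv(r, g, b):
    """Convert one full-range RGB pixel to BT.601 YUV (all values 0-255)."""
    y = 0.299 * r + 0.587 * g + 0.114 * b
    u = -0.169 * r - 0.331 * g + 0.500 * b + 128
    v = 0.500 * r - 0.419 * g - 0.081 * b + 128
    # Clamp to the representable 8-bit range, as a real DSP would.
    clamp = lambda x: max(0, min(255, round(x)))
    return clamp(y), clamp(u), clamp(v)

# White and black are achromatic: U and V sit at the 128 midpoint.
print(rgb_to_yuv(255, 255, 255))  # (255, 128, 128)
print(rgb_to_yuv(0, 0, 0))        # (0, 128, 128)
```

Hardware codecs apply the same arithmetic per pixel, typically in fixed point and on planar buffers.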
The digital signal processor is used for processing digital signals; besides digital image signals, it can process other digital signals. For example, when the electronic device 100 selects a frequency bin, the digital signal processor is used to perform a Fourier transform on the frequency bin energy, and the like.
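The frequency-bin energy computation mentioned above can be illustrated with a naive discrete Fourier transform. This is a sketch of the general technique, not the device's DSP firmware:

```python
import math

def dft_bin_energies(x):
    """Return |X[k]|^2 for each bin k of a naive DFT of the real signal x."""
    n = len(x)
    energies = []
    for k in range(n):
        re = sum(x[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
        im = -sum(x[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
        energies.append(re * re + im * im)
    return energies

# A pure tone at bin 2 concentrates its energy there (and in the mirror bin).
n = 8
tone = [math.cos(2 * math.pi * 2 * t / n) for t in range(n)]
e = dft_bin_energies(tone)  # e[2] is (n/2)^2 = 16, other low bins are ~0
```

A production DSP would use a fast Fourier transform, but the bin energies it selects among are the same quantities.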
Video codecs are used to compress or decompress digital video. The electronic device 100 may support one or more video codecs. In this way, the electronic device 100 may play or record video in a variety of encoding formats, such as moving picture experts group (MPEG)-1, MPEG-2, MPEG-3, MPEG-4, and so on.
The NPU is a neural-network (NN) computing processor. By drawing on the structure of biological neural networks, for example the transmission mode between human brain neurons, it rapidly processes input information and can also continuously self-learn. Intelligent cognition applications of the electronic device 100, such as image recognition, face recognition, speech recognition, and text understanding, can be implemented through the NPU.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to enable expansion of the memory capabilities of the electronic device 100. The external memory card communicates with the processor 110 through an external memory interface 120 to implement data storage functions. For example, files such as music, video, etc. are stored in an external memory card.
The internal memory 121 may be used to store computer executable program code including instructions. The processor 110 executes various functional applications of the electronic device 100 and data processing by executing instructions stored in the internal memory 121. The internal memory 121 may include a storage program area and a storage data area. The storage program area may store an operating system, an application required for at least one function (such as a face recognition function, a fingerprint recognition function, a mobile payment function, etc.), and the like. The storage data area may store data created during use of the electronic device 100 (e.g., face information template data, fingerprint information templates, etc.), and so on. In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (universal flash storage, UFS), and the like.
The electronic device 100 may implement audio functions through an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, an application processor, and the like. Such as music playing, recording, etc.
The audio module 170 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or a portion of the functional modules of the audio module 170 may be disposed in the processor 110.
The speaker 170A, also referred to as a "loudspeaker", is used to convert audio electrical signals into sound signals. The electronic device 100 may play music or conduct hands-free calls through the speaker 170A.
The receiver 170B, also referred to as an "earpiece", is used to convert the audio electrical signal into a sound signal. When the electronic device 100 is used to answer a telephone call or play a voice message, the voice may be heard by placing the receiver 170B close to the ear.
The earphone interface 170D is used to connect a wired earphone. The earphone interface 170D may be the USB interface 130, a 3.5 mm open mobile terminal platform (OMTP) standard interface, or a cellular telecommunications industry association of the USA (CTIA) standard interface.
The pressure sensor 180A is used to sense a pressure signal and may convert the pressure signal into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed on the display screen 194. There are various types of pressure sensors 180A, such as resistive, inductive, and capacitive pressure sensors. A capacitive pressure sensor may comprise at least two parallel plates of conductive material. When a force is applied to the pressure sensor 180A, the capacitance between the electrodes changes, and the electronic device 100 determines the strength of the pressure from the change in capacitance. When a touch operation acts on the display screen 194, the electronic device 100 detects the intensity of the touch operation via the pressure sensor 180A. The electronic device 100 may also calculate the location of the touch based on the detection signal of the pressure sensor 180A. In some embodiments, touch operations that act on the same touch location but with different intensities may correspond to different operation instructions. For example, when a touch operation whose intensity is less than a first pressure threshold acts on the short message application icon, an instruction to view the short message is executed; when a touch operation whose intensity is greater than or equal to the first pressure threshold acts on the short message application icon, an instruction to create a new short message is executed.
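The pressure-dependent instruction selection described above reduces to a threshold comparison on the measured intensity. A minimal sketch (the threshold value and action names here are hypothetical placeholders, not values from the patent):

```python
FIRST_PRESSURE_THRESHOLD = 0.5  # normalized pressure units (hypothetical)

def instruction_for_touch(pressure, target):
    """Map a touch on a control to an instruction based on touch intensity."""
    if target != "sms_icon":
        return "open:" + target
    if pressure < FIRST_PRESSURE_THRESHOLD:
        return "view_message"    # light press: view the short message
    return "create_message"      # firm press: create a new short message

print(instruction_for_touch(0.2, "sms_icon"))  # view_message
print(instruction_for_touch(0.8, "sms_icon"))  # create_message
```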
The gyro sensor 180B may be used to determine the motion posture of the electronic device 100. In some embodiments, the angular velocity of the electronic device 100 about three axes (i.e., the x, y, and z axes) may be determined by the gyro sensor 180B. The gyro sensor 180B may be used for photographic anti-shake: when the shutter is pressed, the gyro sensor 180B detects the shake angle of the electronic device 100, calculates the distance the lens module needs to compensate according to that angle, and lets the lens counteract the shake of the electronic device 100 through reverse motion, thereby realizing anti-shake. The gyro sensor 180B may also be used in navigation and somatosensory game scenarios.
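The anti-shake compensation can be approximated geometrically: for a small shake angle θ, the image shifts on the sensor by roughly f·tan(θ) for focal length f, and the lens is driven the opposite way. A simplified sketch under these assumptions (not the actual optical image stabilization algorithm):

```python
import math

def ois_compensation_mm(shake_deg, focal_length_mm):
    """Lens displacement (mm) that cancels an angular shake of shake_deg."""
    shift = focal_length_mm * math.tan(math.radians(shake_deg))
    return -shift  # move the lens opposite to the image shift

# A 0.5 degree shake with a 4 mm lens needs roughly 0.035 mm of travel.
```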
The air pressure sensor 180C is used to measure air pressure. In some embodiments, electronic device 100 calculates altitude from barometric pressure values measured by barometric pressure sensor 180C, aiding in positioning and navigation.
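The altitude calculation from barometric pressure typically uses the international barometric formula; the sketch below assumes a standard sea-level reference pressure of 101325 Pa:

```python
def altitude_m(pressure_pa, sea_level_pa=101325.0):
    """Approximate altitude from barometric pressure (international formula)."""
    return 44330.0 * (1.0 - (pressure_pa / sea_level_pa) ** (1.0 / 5.255))

print(round(altitude_m(101325.0)))  # 0
# At roughly half sea-level pressure the formula gives about 5.5 km.
```

In practice the reference pressure varies with weather, so devices often calibrate it from GNSS or network data before using the result to aid positioning.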
The magnetic sensor 180D includes a Hall sensor. The electronic device 100 may detect the opening and closing of a flip cover using the magnetic sensor 180D. In some embodiments, when the electronic device 100 is a flip phone, the electronic device 100 may detect the opening and closing of the flip according to the magnetic sensor 180D. Features such as automatic unlocking upon opening the flip can then be set according to the detected open or closed state of the case or flip cover.
The acceleration sensor 180E may detect the magnitude of the acceleration of the electronic device 100 in various directions (typically three axes). The magnitude and direction of gravity may be detected when the electronic device 100 is stationary. It can also be used to recognize the posture of the electronic device, and is applied to landscape/portrait switching, pedometers, and other applications.
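Landscape/portrait switching can be decided from which axis carries most of the gravity vector while the device is (nearly) static. A minimal sketch of that decision (real systems add hysteresis and filtering):

```python
def screen_orientation(ax, ay):
    """Classify device orientation from x/y gravity components (m/s^2)."""
    return "portrait" if abs(ay) >= abs(ax) else "landscape"

print(screen_orientation(0.3, 9.8))  # portrait: gravity along the long axis
print(screen_orientation(9.8, 0.4))  # landscape
```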
A distance sensor 180F for measuring a distance. The electronic device 100 may measure the distance by infrared or laser. In some embodiments, the electronic device 100 may range using the distance sensor 180F to achieve quick focus.
The proximity light sensor 180G may include, for example, a light-emitting diode (LED) and a light detector, such as a photodiode. The light-emitting diode may be an infrared light-emitting diode. The electronic device 100 emits infrared light outward through the light-emitting diode and uses the photodiode to detect infrared light reflected from nearby objects. When sufficient reflected light is detected, it may be determined that there is an object near the electronic device 100; when insufficient reflected light is detected, the electronic device 100 may determine that there is no object nearby. The electronic device 100 can use the proximity light sensor 180G to detect that the user is holding the electronic device 100 close to the ear, and automatically turn off the screen to save power. The proximity light sensor 180G may also be used in holster mode and pocket mode to automatically unlock and lock the screen.
The ambient light sensor 180L is used to sense ambient light level. The electronic device 100 may adaptively adjust the brightness of the display 194 based on the perceived ambient light level. The ambient light sensor 180L may also be used to automatically adjust white balance when taking a photograph. Ambient light sensor 180L may also cooperate with proximity light sensor 180G to detect whether electronic device 100 is in a pocket to prevent false touches.
The fingerprint sensor 180H is used to collect a fingerprint. The electronic device 100 may utilize the collected fingerprint feature to unlock the fingerprint, access the application lock, photograph the fingerprint, answer the incoming call, etc.
The temperature sensor 180J is used to detect temperature. In some embodiments, the electronic device 100 executes a temperature processing strategy using the temperature detected by the temperature sensor 180J. For example, when the temperature reported by the temperature sensor 180J exceeds a threshold, the electronic device 100 reduces the performance of a processor located near the temperature sensor 180J in order to reduce power consumption and implement thermal protection. In other embodiments, when the temperature is below another threshold, the electronic device 100 heats the battery 142 to prevent low temperature from causing an abnormal shutdown. In still other embodiments, when the temperature is below a further threshold, the electronic device 100 boosts the output voltage of the battery 142 to avoid an abnormal shutdown caused by low temperature.
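The temperature processing strategy above is a set of threshold rules. A sketch of such a policy (the threshold values are hypothetical placeholders, not figures from the patent):

```python
HIGH_TEMP_C = 45.0       # hypothetical thresholds for illustration only
LOW_TEMP_C = 0.0
CRITICAL_LOW_C = -10.0

def thermal_actions(temp_c):
    """Return the list of protective actions for a reported temperature."""
    actions = []
    if temp_c > HIGH_TEMP_C:
        actions.append("reduce_processor_performance")  # thermal protection
    if temp_c < LOW_TEMP_C:
        actions.append("heat_battery")                  # avoid cold shutdown
    if temp_c < CRITICAL_LOW_C:
        actions.append("boost_battery_output_voltage")  # deeper cold protection
    return actions
```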
The touch sensor 180K is also referred to as a "touch panel". The touch sensor 180K may be disposed on the display screen 194; together, the touch sensor 180K and the display screen 194 form what is also called a "touch screen". The touch sensor 180K is used to detect a touch operation acting on or near it. The touch sensor may pass the detected touch operation to the application processor to determine the type of touch event. Visual output related to the touch operation may be provided through the display screen 194. In other embodiments, the touch sensor 180K may also be disposed on the surface of the electronic device 100 at a location different from the display screen 194.
The keys 190 include a power-on key, a volume key, etc. The keys 190 may be mechanical keys. Or may be a touch key. The electronic device 100 may receive key inputs, generating key signal inputs related to user settings and function controls of the electronic device 100.
The motor 191 may generate a vibration cue. The motor 191 may be used for incoming call vibration alerting as well as for touch vibration feedback. For example, touch operations acting on different applications (e.g., photographing, audio playing, etc.) may correspond to different vibration feedback effects. The motor 191 may also correspond to different vibration feedback effects by touching different areas of the display screen 194. Different application scenarios (such as time reminding, receiving information, alarm clock, game, etc.) can also correspond to different vibration feedback effects. The touch vibration feedback effect may also support customization.
The indicator 192 may be an indicator light and may be used to indicate the charging status, battery level changes, messages, missed calls, notifications, and the like.
The SIM card interface 195 is used to connect a SIM card. The SIM card may be inserted into the SIM card interface 195, or removed from the SIM card interface 195 to enable contact and separation with the electronic device 100. The electronic device 100 may support 1 or N SIM card interfaces, N being a positive integer greater than 1. The SIM card interface 195 may support Nano SIM cards, micro SIM cards, and the like. The same SIM card interface 195 may be used to insert multiple cards simultaneously. The types of the plurality of cards may be the same or different. The SIM card interface 195 may also be compatible with different types of SIM cards. The SIM card interface 195 may also be compatible with external memory cards. The electronic device 100 interacts with the network through the SIM card to realize functions such as communication and data communication.
In this embodiment, the electronic device 100 may execute the remote payment method through the processor 110, and the display screen 194 displays the relevant interfaces of the remote payment method. For example, the electronic device may use a Kirin chip, whose inSE security engine can implement hardware-level encryption and decryption.
Fig. 3 is a software architecture block diagram of the electronic device 100 provided in the embodiment of the present application.
The layered architecture divides the software into several layers, each with a clear role and division of labor. The layers communicate with each other through software interfaces. In some embodiments, the system is divided into four layers: from top to bottom, the application layer, the application framework layer, the runtime and system libraries, and the kernel layer.
The application layer may include a series of application packages.
As shown in fig. 3, the application packages may include applications (which may also be referred to as apps) such as a remote payment application, three-party payment applications, camera, gallery, calendar, phone, maps, navigation, WLAN, Bluetooth, music, video, and short messages.
In some embodiments, the remote payment application may be a system video chat application or a three-party video application. For example, the remote payment application on a Huawei mobile phone may be the phone's built-in video chat application, such as MeeTime; as another example, the remote payment software may be an application installed on the mobile phone, such as an instant messaging app.
In other embodiments, the remote payment application may also be payment enabled. It will be appreciated that the user does not have to invoke the payment functions of other applications when making payments using the remote payment software.
The application framework layer provides an application programming interface (application programming interface, API) and programming framework for application programs of the application layer. The application framework layer includes a number of predefined functions.
As shown in FIG. 3, the application framework layer may include a window manager, a content provider, a view system, a telephony manager, a resource manager, a notification manager, and the like.
The window manager is used for managing window programs. The window manager can acquire the size of the display screen, judge whether a status bar exists, lock the screen, intercept the screen and the like.
The content provider is used to store and retrieve data and make such data accessible to applications. The data may include video, images, audio, calls made and received, browsing history and bookmarks, phonebooks, etc.
The view system includes visual controls, such as controls to display text, controls to display pictures, and the like. The view system may be used to build applications. The display interface may be composed of one or more views. For example, a display interface including a text message notification icon may include a view displaying text and a view displaying a picture.
The telephony manager is used to provide the communication functions of the electronic device 100, such as management of the call status (including connected, hung up, etc.).
The resource manager provides various resources for the application program, such as localization strings, icons, pictures, layout files, video files, and the like.
The notification manager allows applications to display notification information in the status bar. It can be used to convey notification-type messages that disappear automatically after a short stay without user interaction; for example, the notification manager is used to notify of download completion, message alerts, and the like. The notification manager may also present notifications in the system top status bar in the form of a chart or scroll-bar text, such as notifications of applications running in the background, or present notifications on the screen in the form of a dialog interface. For example, text information is prompted in the status bar, a prompt tone sounds, the electronic device vibrates, or an indicator light blinks.
The runtime includes core libraries and a virtual machine. The runtime is responsible for scheduling and management of the system.
The core library consists of two parts: one part is the function that the programming language (e.g., the java language) needs to call, and the other part is the core library of the system.
The application layer and the application framework layer run in a virtual machine. The virtual machine executes the programming files (e.g., java files) of the application layer and the application framework layer as binary files. The virtual machine is used for executing the functions of object life cycle management, stack management, thread management, security and exception management, garbage collection and the like.
The system library may include a plurality of functional modules. For example: surface manager (surface manager), media Libraries (Media Libraries), three-dimensional graphics processing Libraries (e.g., openGL ES), two-dimensional graphics engines (e.g., SGL), etc.
The surface manager is used to manage the display subsystem and provides a fusion of two-Dimensional (2D) and three-Dimensional (3D) layers for multiple applications.
The media libraries support playback and recording of a variety of commonly used audio and video formats, as well as still image files and the like. The media libraries may support a variety of audio and video encoding formats, such as MPEG-4, H.264, MP3, AAC, AMR, JPG, and PNG.
In the embodiment of the application, the media library is used for encoding and decoding the video.
The three-dimensional graphic processing library is used for realizing 3D graphic drawing, image rendering, synthesis, layer processing and the like.
In the embodiment of the application, the three-dimensional graphics processing library may be used for replacing pixels of the video stream, for example, pixels of a preset area in an image shot by a rear camera are replaced by a face image shot by a front camera.
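The pixel replacement described above, substituting a preset region of the rear-camera image with the front-camera face image, can be sketched on plain 2D pixel arrays. In practice the three-dimensional graphics processing library would perform this on GPU textures; this is only an illustration of the operation:

```python
def replace_region(frame, overlay, top, left):
    """Overwrite the region of `frame` starting at (top, left) with `overlay`."""
    out = [row[:] for row in frame]  # copy so the original frame is untouched
    for i, row in enumerate(overlay):
        for j, px in enumerate(row):
            out[top + i][left + j] = px
    return out

frame = [[0] * 4 for _ in range(4)]  # stand-in for a rear-camera frame
face = [[7, 7], [7, 7]]              # stand-in for the front-camera face image
mixed = replace_region(frame, face, 1, 1)
```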
The 2D graphics engine is a drawing engine for 2D drawing.
The kernel layer is a layer between hardware and software. The kernel layer at least comprises a display driver, a camera driver, an audio driver, a sensor driver and a virtual card driver.
In this embodiment of the present application, the kernel layer may further include a hardware encryption module, where the hardware encryption module is configured to implement a function of encrypting and decrypting the data trust zone, for example, decrypting the encrypted video stream.
The workflow of the electronic device 100 software and hardware is illustrated below in connection with capturing a photo scene.
When the touch sensor 180K receives a touch operation, a corresponding hardware interrupt is issued to the kernel layer. The kernel layer processes the touch operation into a raw input event (including information such as the touch coordinates and the timestamp of the touch operation). The raw input event is stored at the kernel layer. The application framework layer acquires the raw input event from the kernel layer and identifies the control corresponding to the input event. Taking the example where the touch operation is a tap and the corresponding control is the camera application icon: the camera application calls an interface of the application framework layer to start the camera application, which in turn starts the camera driver by calling the kernel layer, and the camera 193 captures a still image or video.
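The workflow above (raw input event from the kernel layer, control identification in the framework layer, application launch) can be caricatured as a hit test plus dispatch. The control rectangles and names here are invented for illustration:

```python
# (x, y, width, height) of on-screen controls -> the app each one launches
CONTROLS = {
    "camera_icon": ((100, 200, 80, 80), "camera_app"),
    "sms_icon": ((200, 200, 80, 80), "sms_app"),
}

def dispatch_touch(x, y):
    """Identify the control under a raw touch event and report its app."""
    for name, ((cx, cy, w, h), app) in CONTROLS.items():
        if cx <= x < cx + w and cy <= y < cy + h:
            return app  # the framework would now start this application
    return None  # touch landed on no control

print(dispatch_touch(120, 230))  # camera_app
```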
Fig. 4 illustrates an application scenario for remote payment based on a checkout code.
As shown in fig. 4, the scenario includes a first terminal and a second terminal, where a first user (e.g., A) is the user of the first terminal and a second user (e.g., B) is the user of the second terminal. For example, when A shops at a venue such as a mall that displays the merchant's payment code, A can capture the payment code with the first terminal and initiate a payment request to the second terminal, so that B can scan the merchant's payment code through the second terminal to pay the merchant.
The payment code may be a two-dimensional code or a bar code. By scanning a third user's payment code, a terminal can acquire the third user's user information and thereby transfer money to the third user. The third user may be a merchant, an individual, or the like; the third user's user information indicates the third user's transaction account. For example, the second terminal can acquire the merchant's user information by recognizing the payment code, display a page for entering the amount, and then transfer money to the merchant's account in response to the user's operation on that page.
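Conceptually, scanning the code yields the payee's account identifier. The payload format in the sketch below is entirely hypothetical (real payment-code schemes are operator-specific and typically signed), and serves only to show the parse-then-pay flow:

```python
def parse_payment_code(payload):
    """Parse a hypothetical 'pay://<account>?name=<display name>' payload."""
    if not payload.startswith("pay://"):
        raise ValueError("not a payment code")
    rest = payload[len("pay://"):]
    account, _, query = rest.partition("?")
    name = query.partition("name=")[2] if "name=" in query else ""
    return {"account": account, "name": name}

info = parse_payment_code("pay://merchant-001?name=Mall Cafe")
# The paying terminal would then show an amount-entry page for info["name"]
# and transfer the entered amount to info["account"].
```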
Based on the above scenario, some user interfaces on the first terminal and the second terminal provided in the embodiments of the present application are described below. Fig. 5A-5E and fig. 6A-6D illustrate some user interfaces implemented on the first and second terminals in the remote payment scenario.
Fig. 5A-5E illustrate one manner in which a first terminal and a second terminal establish a common video call. Fig. 5A-5C are user interfaces implemented on a first terminal, and fig. 5D and 5E are user interfaces implemented on a second terminal.
Fig. 5A shows an exemplary user interface 51 on the first terminal for displaying installed applications. The user interface 51 displays: a status bar, a calendar indicator, a weather indicator, a tray with commonly used application icons, a remote payment application icon 511, a camera application icon, other application icons, and so on. The status bar may include: one or more signal strength indicators 512A and 512B for mobile communication signals (which may also be referred to as cellular signals), an operator name (e.g., "China Mobile"), one or more signal strength indicators 512C for Wi-Fi signals, a battery status indicator 512D, a time indicator 512E, and the like. In some embodiments, the user interface 51 exemplarily shown in fig. 5A may be a home screen.
As shown in fig. 5A, the first terminal may detect a user operation of an icon 511 acting on the remote payment application and display a user interface 52 shown in fig. 5B in response to the user operation.
The user interface 52 may be a main interface provided by a remote payment application, the user interface 52 may include a number of contact correspondence information bars 521, and the information bars 521 may include icons 521A of contact avatars, contact names 521B, and call controls 521C.
As shown in fig. 5B, the first terminal may detect a user operation acting on the call control 521C of the electronic device, and in response to the user operation, the first terminal transmits a request for establishing a normal video call to the electronic device (second terminal) corresponding to the call control 521C, and displays the user interface 53 as shown in fig. 5C, and the user interface 53 may display the hang-up control 531.
Referring to fig. 5D, fig. 5D shows a user interface 54 displayed by the second terminal after the second terminal receives a request for setting up a normal video call sent by the first terminal. The user interface 54 includes: identity information of the first terminal, hang-up control 541, and answer control 542.
As shown in fig. 5D, the identity information of the first terminal may be the avatar and name of the first user. The second user can determine from this identity information the identity of the first terminal that issued the request. The second terminal may detect a user operation acting on the answer control 542, agree to establish a normal video call with the first terminal in response to the user operation, turn on the cameras, and display the user interface 55 shown in fig. 5E.
As shown in fig. 5E, the user interface 55 includes: an image taken by the first terminal displayed full screen, a switch control 551, a hang-up control 552, a payment request control 553, and a display box 554. The switching control 551 is used for switching display positions of the image shot by the first terminal and the image shot by the second terminal on the user interface; the hang-up control 552 is used to end the call; the payment request control 553 is used for sending a request for requesting payment to the opposite terminal; the display box 554 is used for displaying an image photographed by the second terminal.
Fig. 6A-6D illustrate one manner in which a first terminal and a second terminal establish a remote payment connection. Fig. 6A is a user interface implemented on a first terminal, and fig. 6B-6D are user interfaces implemented on a second terminal.
After the second terminal agrees to establish the above-described normal video call with the first terminal (e.g., the second user clicks the answer control 542 shown in the user interface 54 shown in fig. 5D), the first terminal displays the user interface 61 shown in fig. 6A. As shown in fig. 6A, the user interface 61 includes: an image captured by the second terminal displayed full screen, a switch control 611, a hang-up control 612, a payment request control 613, and a display box 614. The switching control 611 is used for switching the display positions of the image shot by the first terminal and the image shot by the second terminal on the user interface; the hang-up control 612 is used to end the call; the payment request control 613 is configured to send a request for payment to the peer; the display frame 614 is used to display an image photographed by the first terminal.
As shown in fig. 6A, the first terminal may detect a user operation on the payment request control 613 of the electronic device, and in response to the user operation, the first terminal transmits a request to establish a remote payment connection to the second terminal.
Referring to fig. 6B, fig. 6B shows a user interface 62 displayed by the second terminal after receiving the request sent by the first terminal for establishing a communication connection. The user interface 62 includes a window 621, which is used to prompt the user that the first terminal requests to establish a remote payment connection. The second terminal may detect an operation confirming the establishment of a remote payment connection with the first terminal, such as the user clicking the control 621A shown in fig. 6B, and in response, establish a communication connection for remote payment with the first terminal. Alternatively, the second terminal detects an operation refusing to establish the connection, for example the user clicking the control 621B shown in fig. 6B, and in response refuses to establish the communication connection for remote payment with the first terminal.
Without limitation, the user may trigger the second terminal to display the user interface 62 in other manners. For example, the user interface shown in fig. 5B may include a remote payment connection control, and the user may trigger the second terminal to display the user interface 62 by touching that control; this embodiment is not limited thereto.
As shown in fig. 6B, the second terminal may detect a user operation on control 621A, in response to which the second terminal establishes a remote payment connection with the first terminal, displaying user interface 63 as shown in fig. 6C.
As shown in fig. 6C, the user interface 63 may include: the image sent by the first terminal to the second terminal, displayed full screen; a hang-up control 631; a display window 632; and a return control 633. The display window 632 is used for displaying an image shot by the camera of the second terminal, and the return control 633 is used for returning to the normal video call.
In some embodiments, the second terminal displays a user interface 64 as shown in fig. 6D when the checkout code is identified in the image sent by the first terminal to the second terminal. The user interface 64 includes a display field 641, an input box 642, and a confirmation control 643. The display field 641 is used for displaying prompt information, the input box 642 is used for inputting an amount, and the confirmation control 643 is used for confirming that the amount entered in the input box 642 is paid to the user corresponding to the checkout code.
For example, the second user may enter a specified amount of money in the input box 642 of the user interface 64, click the confirmation control 643, and the second terminal completes the payment in response to the second user clicking the confirmation control 643. For another example, the second terminal may display the user interface 55 shown in fig. 5E after payment is completed, i.e., return to the normal video call interface. For another example, the second user may click on the return control 633 on the user interface 64, and the second terminal displays the user interface 55 shown in fig. 5E in response to the user operation.
In other embodiments, when the second terminal identifies the payment code in the image sent from the first terminal to the second terminal, a prompt message may be displayed on the user interface 63, where the prompt message is used to ask the second user whether to jump to the payment page corresponding to the payment code; after the second user confirms the prompt, the second terminal displays a user interface 64 as shown in fig. 6D.
It should be noted that the user interface 64 may be a preset display interface of a payment code, for example, a scan interface preset by a merchant for the merchant's payment code. The user interface 64 may further include prompt information generated by the second terminal. For example, if the request sent by the first terminal to the second terminal carries a target amount, the second terminal may display a window for prompting that amount on the user interface 64; the second user may then enter the prompted amount in the input box 642 based on the prompt displayed on the user interface 64 and click the confirmation control 643, and the second terminal completes the payment in response to the operation of the second user clicking the confirmation control 643. It should be understood that the user interface 64 is merely an exemplary user interface provided by the embodiments of the present application and does not limit the embodiments of the present application.
The remote payment method provided in the embodiment of the present application is described in detail below with reference to the scenario shown in fig. 4 based on the schematic diagram of the remote payment system shown in fig. 1A or fig. 1B, and the hardware and software structure schematic diagrams of the electronic device shown in fig. 2 and fig. 3.
Fig. 7 illustrates a remote payment method provided in an embodiment of the present application. The remote payment method may include some or all of the following steps:
S701, the first terminal and the second terminal establish a common video call.
In some embodiments, the first terminal and the second terminal may be installed with a remote payment application. The remote payment application may be a communication application with a remote payment function. Specifically, the first user may send a request for a common video call to the second terminal through the application program, and after the second terminal agrees to the request, the first terminal establishes a common video call with the second terminal. For example, the first terminal initiates a common video call request to the second terminal in response to a user operation by the first user on the remote payment application. Correspondingly, the second terminal establishes the video call with the first terminal in response to a user operation of the second user agreeing to the request. Reference may be made to the relevant descriptions of fig. 5A-5E, which are not repeated here.
After the first terminal and the second terminal establish the common video call, the first user and the second user can perform the video call through the first terminal and the second terminal. For example, the first user may be located in a mall, and the first user may photograph an environment in the mall or photograph a face of the first user through the first terminal so that the second user may see the contents photographed by the first user on the second terminal to verify the identity of the first user.
S702, the first terminal sends a payment request to the second terminal, wherein the payment request is used for requesting payment.
In some embodiments, after the first terminal establishes a normal video call with the second terminal, the first user may initiate a payment request to the second terminal based on the remote payment application, where the payment request is used to request payment. For example, the first user may click on a payment request control 613 displayed on the first terminal as shown in fig. 6A, and accordingly, the first terminal generates a payment request in response to the user operation, and in turn, sends the payment request to the second terminal, the payment request being used to request payment. Reference may be made specifically to the relevant descriptions of fig. 6A-6D, which are not repeated here.
In other embodiments, the first terminal may initiate a call request to the second terminal, the call request including the above payment request. For example, the first terminal detects a user operation of the first user clicking, in a certain application, a control for requesting establishment of a remote payment connection, and initiates a call request to the second terminal. Accordingly, the second terminal displays the call request for remote payment from the first user and, after detecting a user operation of the second user agreeing to the call request, establishes a remote payment connection with the first terminal, causing the second terminal to display the user interface 63 shown in fig. 6C. That is, a user may directly initiate a call requesting remote payment to another user through the remote payment application, and the other user enters a remote payment page upon agreeing to answer the call.
It should be noted that, the first terminal may also send the payment request to the second terminal through other methods, which is not limited herein.
S703, the second terminal, in response to the payment request, sends an agreement message to the first terminal, where the agreement message is used to instruct the first terminal to send a composite video stream to the second terminal.
In some embodiments, the second terminal receives the payment request from the first terminal and displays it, for example, displaying the user interface 62 shown in fig. 6B. The second user may click a control on the second terminal that agrees to the payment request, such as the control 621A shown in the user interface 62. Correspondingly, the second terminal generates an agreement message in response to the user operation and sends it to the first terminal. The agreement message may be, for example, acknowledgement information.
S704, the first terminal acquires a first video stream and a second video stream, wherein the first video stream is a video stream acquired through a front camera, the second video stream is a video stream acquired through a rear camera, and at least one frame of image in the second video stream includes a checkout code.
In some embodiments, after receiving the agreement message from the second terminal, the first terminal may start the front camera and the rear camera. For example, the first user adjusts the shooting angle of the front camera to shoot a face, obtaining the first video stream, and adjusts the rear camera to shoot the checkout code, obtaining the second video stream.
The content shot by the front camera is used for verifying payment safety. The first user may control the front camera to shoot the first user's face, shoot the environment where the first user is located, or shoot other information such as a storefront; the specific content shot by the front camera is not limited.
In one implementation, the front camera captures face information of the first user, and the second user can verify the identity of the first user based on the face information. For example, the first user may shoot the first user's face by controlling the front camera, and the second user may then see the first user's face through the chat interface. Further, when the second user confirms through the chat interface that it is the first user who requests payment, the second user may determine that the current payment is safe and execute the payment operation; when the second user cannot confirm through the chat interface that the first user requests payment, or the displayed face is not that of the first user, the second user may exit the current chat interface and refuse to pay for the first user.
In another implementation, the front camera shoots the environment in which the first user is located, and the second user can verify whether the current payment environment is safe based on that environment. For example, the first user may control the front camera to capture the surrounding environment, and the second user may then judge whether the first user is in a secure payment environment. Further, when the second user determines through the chat interface that the environment is not abnormal, the second user may determine that the current payment is secure and execute the payment operation; when the second user determines through the chat interface that the environment is abnormal, the second user may exit the current chat interface and refuse to pay for the first user. For example, when the first user is purchasing product A on behalf of the second user, the second user may refuse to pay if the second user determines, based on the front camera footage, that the store where the first user is located is not a regular authorized store for product A.
Optionally, the first terminal may adjust shooting parameters of the front camera and the rear camera to adjust a size of a shot image. For example, the first terminal may control the rear camera to collect images in a full-screen size, and control the front camera to collect images in a non-full-screen size, so that the images collected by the front camera are fused in the images collected by the rear camera during subsequent synthesis. It can be understood that the first terminal can reduce operations such as image compression or cutting by image synthesis and improve processing efficiency by controlling the size of the image acquired by the camera.
S705, the first terminal synthesizes the first video stream and the second video stream to obtain a composite video stream.
In some embodiments, the first terminal may fuse two frames of images with the same time stamp in the first video stream and the second video stream to obtain an image with the time stamp in the composite video stream, where the time stamp of the image is the time when the image is acquired. It should be understood that a video stream is a sequence of images arranged in time order from a plurality of frames of images. That is, the first terminal may fuse two frames of images acquired by the front camera and the rear camera at the same time to obtain a composite image at the time; further, the composite images at each time shot by the first terminal are arranged in time sequence, and a composite video stream is obtained.
Referring to fig. 8, fig. 8 is a schematic diagram of video stream synthesis according to an embodiment of the present application. As shown in (a) of fig. 8, images in the first video stream including a first image of a first time stamp, a second image of a second time stamp, and a third image of a third time stamp are represented by dotted arrows. As shown in (B) of fig. 8, the images in the second video stream are represented by solid arrows, and the second video stream includes a fourth image of the first time stamp, a fifth image of the second time stamp, and a sixth image of the third time stamp. As shown in (C) of fig. 8, the first terminal may fuse the first image and the fourth image to obtain a first composite image, fuse the second image and the fifth image to obtain a second composite image, fuse the third image and the sixth image to obtain a third composite image, and sort the composite images according to the time stamp to obtain a composite video stream.
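The timestamp pairing of fig. 8 can be sketched as follows. The list-of-(timestamp, frame) representation and the `fuse` callback are illustrative assumptions, not the terminal's actual implementation:

```python
def synthesize(front_stream, rear_stream, fuse):
    """front_stream/rear_stream: lists of (timestamp, frame) tuples.
    Fuses only frames captured at the same instant by both cameras,
    then orders the composite frames by timestamp."""
    rear_by_ts = dict(rear_stream)
    composite = [
        (ts, fuse(front, rear_by_ts[ts]))
        for ts, front in front_stream
        if ts in rear_by_ts          # drop frames with no same-timestamp partner
    ]
    composite.sort(key=lambda pair: pair[0])
    return composite

# Symbolic frames mirroring fig. 8: fuse simply pairs the two images.
stream = synthesize(
    [(1, "first"), (2, "second"), (3, "third")],
    [(1, "fourth"), (2, "fifth"), (3, "sixth")],
    fuse=lambda front, rear: (front, rear),
)
```

Dropping unmatched timestamps keeps the composite stream consistent when one camera briefly stalls; a real pipeline might instead match nearest timestamps within a tolerance.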
In one implementation, the first terminal may replace a preset area in the image in the second video stream with the image in the first video stream. Referring to fig. 9, the first image is an image acquired by the front camera, for example, the image with the first timestamp in the first video stream; the fourth image is the image acquired by the rear camera when the front camera acquired the first image, for example, the image with the first timestamp in the second video stream; the size of the first image is the size of the preset area, and the size of the fourth image is the preset screen size. Specifically, the first terminal may clear the pixels in the preset area of the fourth image and replace them with the pixels of the first image, obtaining the first composite image shown in fig. 9. The position of the preset area may be preset, for example, in the upper right corner of the image; the method for determining the preset area is not limited herein and may be determined by an algorithm, for example, after determining the position of the two-dimensional code, setting the preset area in an area other than the two-dimensional code area.
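A minimal sketch of the clear-and-replace fusion just described, using nested lists as a stand-in for real pixel buffers (an assumption; a production implementation would operate on hardware frame buffers):

```python
def replace_region(rear, front, top, left):
    """Overlay `front` (front-camera frame) onto `rear` (rear-camera frame)
    at the preset region whose top-left corner is (top, left).
    Frames are lists of rows of pixel values; returns a new composite frame."""
    composite = [row[:] for row in rear]          # copy so the input stays intact
    for r, front_row in enumerate(front):
        for c, pixel in enumerate(front_row):
            composite[top + r][left + c] = pixel  # clear-and-replace in one step
    return composite

rear = [[0] * 4 for _ in range(4)]    # 4x4 rear-camera frame
front = [[9, 9], [9, 9]]              # 2x2 front-camera frame
# Preset area in the upper-right corner, as in the example above.
img = replace_region(rear, front, top=0, left=2)
```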
In yet another implementation, the first terminal may splice two frames of images having the same time stamp in the first video stream and the second video stream to obtain the image having that time stamp in the composite video stream. For example, the first terminal may acquire a first image with a first preset size through the front camera and a second image with a second preset size through the rear camera, wherein the sum of the first preset size and the second preset size equals a preset screen size; the first image and the second image are then spliced to obtain a composite image.
In some embodiments, the first terminal may display the composite video stream so that the first user may adjust the shooting angles of the front camera and the rear camera by previewing it. For example, the first user shoots a merchant's checkout code through the rear camera and shoots the first user's portrait through the front camera; the first user can preview the image synthesized from the checkout code image and the face image on the first terminal, and then adjust the positions of the front camera and the rear camera so that the second terminal can display a clear face image and a complete checkout code. It can be appreciated that when the first user directly sends a photographed image including the checkout code to the other party, image quality problems often occur, such as an unclear image or an incomplete code; by previewing and sending video, the first user can adjust the shooting angle of the checkout code in real time and update it in real time, ensuring that the second terminal receives a valid checkout code that it can identify to finish payment.
S706, the first terminal encrypts the synthesized video stream to obtain a third video stream.
In some embodiments, the first terminal may obtain the private key from the key server; the first terminal may then encrypt the composite video stream through an encryption algorithm based on the private key to obtain the third video stream.
The first terminal may acquire the private key as follows: when the first user logs in to the remote payment application, the first terminal applies for a private key from the key server. The key server may generate a public key and a private key for the first terminal based on the encryption algorithm, transmit the private key to the first terminal, and publish the public key, so that the first terminal can encrypt data using the private key and other users can acquire the public key of the first terminal from the key server to decrypt the encrypted data. The key server may be a public key database, for example, a key management center database or a cloud server.
The encryption algorithms include the symmetric Data Encryption Standard (DES), the Advanced Encryption Standard (AES), the RSA encryption algorithm, Base64 encoding, the MD5 message-digest algorithm, the SHA-256 algorithm, the XXTEA encryption algorithm, digital signature algorithms, and the like, which are not limited herein.
Taking the RSA encryption algorithm (also called RSA signature technology) as an example, the signature process is as follows:
(1) A generates a pair of keys (a public key and a private key). The private key is not disclosed and is retained by A; the public key is public and available to anyone.
(2) A signs the message with A's private key to form a signature, and transmits the signature together with the message to B.
(3) After receiving the message, B obtains A's public key and verifies the signature; if the verified signature content is consistent with the message, this proves that the message was indeed sent by A.
In this embodiment of the present application, A may be the first terminal and B the second terminal. Alternatively, A may be the first terminal together with a key server, where the key server performs step (1), the first terminal performs step (2), and B is the second terminal.
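The three steps above can be illustrated with textbook RSA. The tiny primes and byte-sum digest below are deliberately insecure stand-ins for real key sizes and a real hash such as SHA-256, used only to make the sign/verify relationship concrete:

```python
# Step (1): A generates a key pair. Textbook parameters -- NOT secure.
p, q = 61, 53
n = p * q                           # 3233, part of the public key (n, e)
e = 17                              # public exponent
d = pow(e, -1, (p - 1) * (q - 1))   # private exponent, kept secret by A

def digest(message: bytes) -> int:
    return sum(message) % n         # toy stand-in for a cryptographic hash

def sign(message: bytes) -> int:
    # Step (2): A signs the digest with its private key.
    return pow(digest(message), d, n)

def verify(message: bytes, signature: int) -> bool:
    # Step (3): B recomputes the digest and checks it against the signature
    # opened with A's public key.
    return pow(signature, e, n) == digest(message)

msg = b"pay 30 to merchant"
sig = sign(msg)
ok = verify(msg, sig)               # untampered message verifies
forged = verify(b"pay 9999", sig)   # a tampered message fails verification
```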
S707, the first terminal sends the third video stream and the payment request parameter to the second terminal, where the payment request parameter includes the identity of the first terminal and the indication information of the encryption algorithm.
The identity of the first terminal is used for acquiring the public key of the first terminal, and may be an International Mobile Equipment Identity (IMEI), an account number, or the like; the indication information of the encryption algorithm indicates the encryption algorithm used when the first terminal encrypted the composite video stream.
Correspondingly, the second terminal receives the third video stream and the payment request parameter from the first terminal.
Optionally, the payment request parameter may further include a request amount, a request time, and validity period information, where the validity period information is used to indicate a validity time of the third video stream.
It will be appreciated that the first terminal may also encrypt audio data, where the audio data is a recording made by the first terminal while capturing the first video stream and the second video stream. For example, the first terminal may package the audio data together with the composite video stream, encrypt them, and send them to the second terminal, so that the second terminal can communicate with the first user in real time based on the audio data while playing the composite video stream, for example, determining the request amount based on the audio to complete payment.
S708, the second terminal decrypts the third video stream based on the identity of the first terminal and the indication information of the encryption algorithm to obtain the composite video stream.
In some embodiments, after receiving the payment request parameter and the third video stream sent by the first terminal, the second terminal obtains the identity of the first terminal and the indication information of the encryption algorithm from the payment request parameter; determining an encryption algorithm based on the indication information of the encryption algorithm, and acquiring a public key of the first terminal from a key server based on the identity of the first terminal; and then decrypting the third video stream based on the public key and the encryption algorithm to obtain a composite video stream.
For example, the remote payment software may be a Huawei application, and the key server may be a Huawei cloud server. When the first user logs in to the application with a Huawei account, the first terminal sends a request for applying for a private key to the Huawei cloud server, the request including the Huawei account. Based on the request, the Huawei cloud server generates a private key and a public key for the first terminal, sends the private key to the first terminal, and stores the public key in association with the Huawei account; when the second terminal requests the public key corresponding to that Huawei account from the Huawei cloud server, the stored public key is transmitted to the second terminal.
Optionally, the second terminal may verify the payment request parameter and decrypt the third video stream based on the identity of the first terminal only when the verification passes; when the verification fails, it returns to the video call of step S701. For example, when the payment request parameter includes the request time and validity period information, the second terminal determines that the verification passes when the interval between the request time and the current time is within the validity period, and fails otherwise. For instance, if the interval between the request time and the current time is five minutes and the validity period is one minute, the second terminal determines that the payment request parameter fails the verification and does not decrypt the third video stream.
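The validity-period check in this example can be sketched as follows (the function name and the use of wall-clock datetimes are assumptions):

```python
from datetime import datetime, timedelta

def request_is_valid(request_time: datetime, validity: timedelta,
                     now: datetime) -> bool:
    """Decrypt the third video stream only while the interval since the
    request time stays within the validity period."""
    return timedelta(0) <= now - request_time <= validity

now = datetime(2021, 10, 26, 12, 5)
# The example from the text: a five-minute-old request with a one-minute
# validity period fails the check; a thirty-second-old request passes.
stale = request_is_valid(datetime(2021, 10, 26, 12, 0), timedelta(minutes=1), now)
fresh = request_is_valid(datetime(2021, 10, 26, 12, 4, 30), timedelta(minutes=1), now)
```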
S709, the second terminal displays the composite video stream.
The user interface for displaying the composite video stream by the second terminal may refer to the user interface shown in fig. 6C, which is not described herein. It can be understood that when the second terminal displays the composite video stream, the second user can judge the image sent by the first terminal, and confirm whether the identity and environment of the first user are abnormal, so as to ensure payment security.
Optionally, the second terminal may also perform security verification on the composite video stream in real time; when the security verification fails, the remote payment video is interrupted and switched back to the normal video call, for example, displaying the video call page of step S701.
In one implementation, the second terminal may identify environmental information in each frame of image in the composite video stream and determine that the security verification failed when there is a significant difference in the environmental information. For example, the second terminal may input each frame of the composite video stream into a trained neural network model to obtain the environmental information in each frame, where the environmental information includes recognized features, for example, objects such as license plates or text information; the environmental information in consecutive frames is compared, and when more than a preset number of features differ, the security verification is determined to have failed.
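A minimal sketch of this frame-to-frame comparison, assuming an upstream model has already produced a set of recognized features per frame (the set representation and threshold are illustrative):

```python
def environment_changed(prev_features, curr_features, max_diff=2):
    """Compare the recognized features (license plates, text, etc.) of two
    consecutive composite frames; report a failed security check when more
    than `max_diff` features differ between them."""
    differing = set(prev_features) ^ set(curr_features)  # symmetric difference
    return len(differing) > max_diff

# Identical surroundings between frames: no alarm.
unchanged = environment_changed({"plate:A123", "sign:EXIT"},
                                {"plate:A123", "sign:EXIT"})
# Every recognized feature replaced between frames: verification fails.
suspicious = environment_changed({"plate:A123", "sign:EXIT", "logo:shop"},
                                 {"plate:B999", "sign:CLOSED", "logo:other"})
```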
In another implementation, the second terminal may identify a front portion and a rear portion in each composite frame of the composite video stream, calculate motion vectors of the front portion and the rear portion across two consecutive composite frames, and determine that the security verification failed when the motion vector directions of the two portions are inconsistent. The front portion is the image acquired by the front camera, and the rear portion is the image acquired by the rear camera; the second terminal may calculate the motion vectors between two consecutive frames by a block matching method, a quadratic optical flow prediction method, or the like, which is not limited herein. It can be appreciated that since the front and rear cameras are located on the same terminal device, the movement trends of the front portion and the rear portion should be consistent; when they are not, the composite video stream is at risk of having been tampered with.
For example, the first terminal may transmit, to the second terminal, indication information of the positions of the front portion and the rear portion in the composite image, so that the second terminal determines the front portion and the rear portion in each frame of the composite image based on the indication information. The second terminal divides the front portion and the rear portion into blocks, calculates the motion vector of each block across two consecutive frames, takes the average of the motion vectors of all blocks in the front portion as the motion vector of the front portion, and takes the average of the motion vectors of all blocks in the rear portion as the motion vector of the rear portion. The motion vector of the front portion is then compared with that of the rear portion: if their difference is within a preset range, the security verification succeeds; otherwise, it fails.
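The block-averaged motion-vector comparison can be sketched as follows; the per-block vectors are assumed to come from an upstream block-matching step, and the tolerance stands in for the preset range:

```python
def mean_vector(block_vectors):
    """Average the per-block (dx, dy) motion vectors of one image portion."""
    xs = [v[0] for v in block_vectors]
    ys = [v[1] for v in block_vectors]
    return (sum(xs) / len(xs), sum(ys) / len(ys))

def motion_consistent(front_blocks, rear_blocks, tolerance=1.0):
    """The front and rear cameras sit on the same device, so the averaged
    motion of the front portion and the rear portion should agree; a large
    difference suggests the composite stream was tampered with."""
    fx, fy = mean_vector(front_blocks)
    rx, ry = mean_vector(rear_blocks)
    return abs(fx - rx) <= tolerance and abs(fy - ry) <= tolerance

# Both portions drift right together: consistent with a handheld device.
same_motion = motion_consistent([(2, 0), (2, 1)], [(2, 1), (2, 0)])
# Portions moving in opposite directions: possible tampering.
tampered = motion_consistent([(2, 0), (2, 0)], [(-3, 0), (-3, 0)])
```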
S710, the second terminal displays a payment page when the checkout code is identified in the composite video stream.
In some embodiments, after receiving the composite video stream, the second terminal invokes an interface of the payment application to identify the two-dimensional code in the composite video stream, and then analyzes the two-dimensional code, and displays a payment page corresponding to the two-dimensional code, such as the payment page shown in fig. 6D.
In other embodiments, the second terminal first identifies the checkout code from the composite video stream through the remote payment application; when the checkout code is identified in the composite video stream, it starts the payment application, identifies the checkout code through the payment application, and displays the payment page corresponding to the two-dimensional code, such as the payment page shown in fig. 6D. Alternatively, when the checkout code is identified in the composite video stream, the second terminal may display an interface including an option box that may offer a plurality of payment means, such as a plurality of installed payment applications; further, in response to a user operation on the option box, the payment application selected by the user is launched and the payment page is displayed.
In still other embodiments, the remote payment application has a payment function, and after receiving the composite video stream, the second terminal may directly identify the two-dimensional code in the composite video stream, further analyze the two-dimensional code, and display a payment page corresponding to the two-dimensional code, such as the payment page shown in fig. 6D.
Note that the payment page may also display the requested amount sent by the first terminal; the contents of the payment page are not limited herein.
S711, the second terminal completes the payment for the checkout code in response to an operation of the second user.
In some embodiments, the second user may enter the target amount in the input box 642 of the user interface 64; upon receiving a confirmation operation of the second user (e.g., clicking the control 643 shown in the user interface 64), the second terminal completes the payment for the checkout code.
In one implementation, the second terminal may obtain the account corresponding to the checkout code by recognizing the checkout code. The second terminal generates a payment request based on the account corresponding to the checkout code, the target amount and the account of the second terminal, where the payment request is used for requesting that the digital asset of the target amount in the account of the second terminal be transferred to the account corresponding to the checkout code. The second terminal sends the payment request to a third party server; after receiving the payment request, the third party server transfers the digital asset of the target amount in the account of the second terminal to the account corresponding to the checkout code; after completing the transfer of the digital asset, the third party server may send an indication of payment success to the second terminal. The third party server is a server of a payment platform, such as a server of a bank, and is used for completing the transfer of digital assets.
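The transfer flow above can be sketched as follows. This is an illustrative toy model only: the `PaymentRequest` fields, the `ThirdPartyServer` class, and the in-memory balance table are hypothetical stand-ins for the payment platform's actual interfaces, which the embodiment does not specify.

```python
from dataclasses import dataclass

@dataclass
class PaymentRequest:
    payer_account: str   # account of the second terminal's user
    payee_account: str   # account obtained by recognizing the checkout code
    amount: int          # target amount, in the smallest currency unit

class ThirdPartyServer:
    """Toy stand-in for the payment-platform server that executes the
    transfer of digital assets between accounts."""
    def __init__(self, balances):
        self.balances = dict(balances)

    def handle(self, req: PaymentRequest) -> bool:
        # Reject the transfer when the payer lacks sufficient assets.
        if self.balances.get(req.payer_account, 0) < req.amount:
            return False
        self.balances[req.payer_account] -= req.amount
        self.balances[req.payee_account] = (
            self.balances.get(req.payee_account, 0) + req.amount)
        return True  # payment-success indication returned to the terminal
```

The boolean result models the payment-success indication the server sends back to the second terminal.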
In some embodiments, after the second terminal completes the payment, it may display transaction credential information and send the transaction credential information to the first terminal, so that the first terminal displays the transaction credential information. The transaction credential information may include the payment amount, transaction item information, payee information, and the like.
Fig. 10 illustrates an application scenario for remote payment based on a payment code.
As shown in fig. 10, the scenario includes a code scanning device, a first terminal and a second terminal, where the first user (e.g., A) is a user of the first terminal, and the second user (e.g., B) is a user of the second terminal. For example, when shopping in a place such as a mall, A may request a payment code from B, and the second terminal transmits the payment code of B to the first terminal; A may then present the payment code to, for example, a merchant; the merchant can scan the payment code displayed by the first terminal through the code scanning device to complete collection.
The payment code may be any information for mobile payment, such as bar codes, two-dimensional codes, payment code digit strings, etc. Note that the payment code may also be referred to as a checkout code.
Based on the above scenario, some user interfaces on the first terminal and the second terminal provided in the embodiments of the present application are described below. Fig. 11A-11D illustrate some user interfaces implemented on a first terminal and a second terminal in this remote payment scenario.
Fig. 5A to 5E illustrate a manner in which the first terminal and the second terminal establish a common video call, which is not described herein.
Fig. 11A-11D illustrate one manner in which a first terminal and a second terminal establish a remote payment connection. Fig. 11A is a user interface implemented on a first terminal, and fig. 11B to 11D are user interfaces implemented on a second terminal.
After the second terminal agrees to establish the above-described ordinary video call with the first terminal (e.g., the second user clicks the answer control 542 in the user interface 54 shown in fig. 5D), the first terminal may display the user interface 11 shown in fig. 11A. As shown in fig. 11A, the user interface 11 includes: an image captured by the second terminal displayed full screen, a switch control 111, a hang-up control 112, a payment request control 113 and a display box 114. The switch control 111 is used for switching the display positions, on the user interface, of the image shot by the first terminal and the image shot by the second terminal; the hang-up control 112 is used to end the call; the payment request control 113 is used to send a payment request to the peer; the display box 114 is used for displaying the image shot by the first terminal.
As shown in fig. 11A, the first terminal may detect a user operation acting on the payment request control 113, and in response to the user operation, the first terminal sends a request for establishing a remote payment connection to the second terminal.
Referring to fig. 11B, fig. 11B shows the user interface 12 displayed by the second terminal after the second terminal receives the request for establishing the communication connection sent by the first terminal. The user interface 12 includes a window 121. Window 121 is used to prompt the user for a first terminal to request establishment of a remote payment connection. The second terminal may detect an operation of determining to establish a remote payment connection with the first terminal, such as the user clicking on control 121A shown in fig. 11B, in response to which the second terminal establishes a communication connection for remote payment with the first terminal. Alternatively, the second terminal detects an operation of refusing to establish a communication connection for remote payment with the first terminal, for example, an operation of clicking the control 121B by the user as shown in fig. 11B, and in response to the operation, the second terminal refuses to establish a communication connection for remote payment with the first terminal.
The user may trigger the second terminal to display the user interface 12 in other manners without being limited to the interface shown in fig. 11A, which is not limited in the embodiment of the present application.
As shown in fig. 11B, the second terminal may detect a user operation acting on the control 121A, and in response to the user operation, the second terminal establishes a remote payment connection with the first terminal, displaying the user interface 13 shown in fig. 11C or the user interface 14 shown in fig. 11D.
In some embodiments, the second terminal detects a user operation on control 121A on user interface 12, displays user interface 13 as shown in fig. 11C, and the first terminal still displays a generic video call interface as shown in fig. 11A.
As shown in fig. 11C, the user interface 13 may include: a composite image displayed full screen by the second terminal, a control 131, a window 132 and a return control 133, where the composite image is synthesized from the image shot by the front camera of the second terminal and the payment code generated by the second terminal; the control 131 is configured to receive a long-press operation of the user to send the composite video stream; the window 132 is used for displaying the image sent by the first terminal; the return control 133 is used to return to the above-described ordinary video call. As shown in fig. 11C, the payment code is a bar code, and the front camera of the second terminal captures a face image of the second user.
As shown in fig. 11C, the second terminal may detect a user operation acting on the control 131, and in response to the user operation, send the composite video stream to the first terminal. For example, when the user presses the control 131 at time 9:00 and releases it at time 9:01, the second terminal determines that the composite video stream of the period from 9:00 to 9:01 is the composite video stream to be sent to the first terminal; when the user releases the control 131, the video stream of the period from 9:00 to 9:01 is sent to the first terminal.
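The press-and-release window selection above can be sketched as follows. The `(timestamp, frame)` representation and the function name are illustrative assumptions, not part of the embodiment.

```python
def clip_stream(frames, press_ts, release_ts):
    """frames: list of (timestamp, frame) pairs from the composite stream.

    Returns only the frames composited while the user held the send
    control, i.e. those with timestamps in [press_ts, release_ts]; this
    is the portion sent to the peer terminal on release."""
    return [frame for ts, frame in frames if press_ts <= ts <= release_ts]
```

For instance, with a press at 9:00 and a release at 9:01, only frames stamped inside that one-minute window are transmitted.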
Upon receiving the composite video stream sent by the second terminal, the first terminal may switch the ordinary video call interface to a user interface for displaying the composite video stream, such as the user interface 14 shown in fig. 11D. It should be noted that the first terminal may pop up a prompt message after receiving the composite video stream, where the prompt message is used to indicate that payment code information sent by the second terminal has been received; further, when the first user confirms displaying the payment code information, the composite video stream is displayed.
Fig. 11D shows the user interface 14 displayed by the first terminal after receiving the composite video stream described above. The user interface 14 may display the video stream sent by the second terminal and the return control 141 in full screen, where the return control 141 is used to return to the above-mentioned ordinary video call.
In other embodiments, the second terminal detects a user operation acting on the control 121A on the user interface 12, synthesizes the payment code with the image captured by the front-facing camera to obtain a synthesized video stream, and sends the synthesized video stream to the first terminal; the first terminal and the second terminal may each display a user interface 14 as shown in fig. 11D, and the user interface 14 may display the composite video stream and the return control 141 in full screen, where the return control 141 is used to return to the normal video call.
The user may also trigger the second terminal to display the user interface 14 in other ways, not limited to the interfaces shown in fig. 11B and 11C, as the embodiments of the present application are not limited. It should be understood that user interface 14 is merely one exemplary user interface provided by embodiments of the present application and is not limiting of embodiments of the present application.
The remote payment method provided in the embodiment of the present application is described in detail below with reference to the scenario shown in fig. 10 based on the schematic diagram of the remote payment system shown in fig. 1A or fig. 1B, and the hardware and software structure schematic diagrams of the electronic device shown in fig. 2 and fig. 3.
Fig. 12 illustrates a remote payment method provided in an embodiment of the present application. The remote payment method may include some or all of the following steps:
s801, a first terminal and a second terminal establish a common video call.
For details, refer to the related content of step S701 above; it is not repeated here.
S802, the first terminal sends a payment request to the second terminal, wherein the payment request is used for requesting a payment code.
For details, refer to the related content of step S702 above; it is not repeated here.
S803, the second terminal responds to the payment request to acquire a first video stream and a second video stream, wherein the first video stream is a video stream acquired through a front-facing camera, and the second video stream is a payment code generated by the second terminal.
In some embodiments, after receiving the payment request from the first terminal, the second terminal displays the user interface 12 as shown in fig. 11B; when the user clicks the control 121A displayed on the user interface 12, the second terminal, in response to the user operation, starts the front camera to record a video, thereby obtaining the first video stream. Meanwhile, the second terminal generates the payment code in real time through the payment application to obtain the second video stream, where the second video stream is a time-ordered sequence of payment code images. The payment code may be a bar code or a two-dimensional code.
In some embodiments, the second terminal may adjust the shooting parameters of the front camera to adjust the size of the captured image. For example, the second terminal may generate a bar code image of a preset screen size, and control the front camera to capture images at a non-full-screen size, so that the image captured by the front camera can be fused into the bar code image during subsequent synthesis. It can be understood that, by controlling the size of the image captured by the camera, the second terminal can reduce operations such as image compression or cropping during image synthesis and improve processing efficiency.
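The fusion step enabled by controlling the capture size can be sketched as follows: because the camera frame is captured smaller than the code image, the composition reduces to a direct paste with no compression or cropping. The 2-D pixel-list representation and the function name are illustrative assumptions.

```python
def composite_without_resizing(code_img, camera_img, top=0, left=0):
    """code_img / camera_img: 2-D lists of pixel values.

    Since the camera was configured to capture at a sub-screen size, the
    captured frame already fits inside the payment-code image, so each
    row is copied straight into place -- no scaling step is required."""
    out = [row[:] for row in code_img]   # keep the original code image intact
    for dy, row in enumerate(camera_img):
        out[top + dy][left:left + len(row)] = row
    return out
```

Skipping the resample is exactly the processing-efficiency gain the paragraph above describes.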
S804, the second terminal synthesizes the first video stream and the second video stream to obtain a synthesized video stream.
For details, refer to the related content of video synthesis by the first terminal in step S705 above; it is not repeated here.
In some embodiments, the second terminal may display the composite video stream after it has been obtained, e.g., the second terminal may display a user interface 63 as shown in fig. 6C.
S805, the second terminal encrypts the synthesized video stream to obtain a third video stream.
The specific encryption process may refer to the content related to the encryption of the video stream by the first terminal in step S706, which is not described herein.
In some embodiments, the second terminal determines a partial video stream from the composite video stream in response to a user operation, and encrypts the partial video stream to obtain the third video stream. For example, the second terminal may display the composite video stream through the user interface 63 as shown in fig. 6C; while the user long-presses the control 631, the second terminal determines the composite video stream displayed during the long press as the partial video stream, and when the user releases the control 631, the second terminal encrypts the partial video stream to obtain the third video stream.
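The embodiment does not name the encryption algorithm, so the following is only an illustrative sketch of a symmetric scheme: a keystream is derived from a shared key (which could, per S806 below, be negotiated from the second terminal's identity) and XORed over the stream bytes. This construction is for illustration and should not be read as a vetted cipher; all names are hypothetical.

```python
import hashlib

def keystream(key: bytes, n: int) -> bytes:
    """Derive n pseudorandom bytes from `key` via a hashed counter."""
    out = bytearray()
    counter = 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(out[:n])

def xor_crypt(data: bytes, key: bytes) -> bytes:
    """Symmetric: the same call encrypts and decrypts the stream bytes."""
    ks = keystream(key, len(data))
    return bytes(a ^ b for a, b in zip(data, ks))
```

Because the operation is its own inverse, the receiving terminal recovers the composite video stream by applying `xor_crypt` again with the same key, matching the decryption step in S807.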
S806, the second terminal sends the third video stream and a payment request parameter to the first terminal, where the payment request parameter includes the identity of the second terminal and the indication information of the encryption algorithm.
In some embodiments, the second terminal may display the composite video stream after obtaining it; for example, the second terminal may display the user interface 63 as shown in fig. 6C. Further, the second terminal may receive a user operation on the control 631 and send the third video stream to the first terminal. Specifically, the user long-presses the control 631, the second terminal determines the video displayed during the long press as the partial composite video stream, encrypts the partial composite video stream when the user releases the control 631 to obtain the third video stream, and sends the third video stream to the first terminal.
In other embodiments, the second terminal detects a user operation acting on the control 121A on the user interface 12, synthesizes the payment code with the image captured by the front camera to obtain the composite video stream, and sends the composite video stream to the first terminal; the first terminal and the second terminal may each display the user interface 14 as shown in fig. 11D, and the user interface 14 may display the composite video stream and the return control 141 in full screen, where the return control 141 is used to return to the ordinary video call. For details, refer to the description of the interface diagrams above; they are not repeated here.
The indication information of the encryption algorithm is used for indicating an encryption algorithm used by the second terminal when encrypting the composite video stream.
Correspondingly, the first terminal receives the third video stream and the payment request parameter from the second terminal.
Optionally, the payment request parameter may further include a request amount, a request time, and validity period information, where the validity period information is used to indicate a validity time of the third video stream.
S807, the first terminal decrypts the third video stream based on the identity of the second terminal and the indication information of the encryption algorithm, and the composite video stream is obtained.
The method can specifically refer to the step S707 of decrypting the related content of the video stream by the second terminal, which is not described herein.
S808, the first terminal displays the composite video stream so that the code scanning device scans the payment code displayed by the first terminal to finish collection.
The user interface for displaying the composite video stream by the first terminal may refer to the user interface shown in fig. 11D, which is not described herein.
For example, the merchant scans the payment code displayed by the first terminal through the code scanning device to complete collection of the target amount. Specifically, the code scanning device scans the payment code to obtain the account corresponding to the payment code; the code scanning device generates a payment request based on the account corresponding to the payment code, the target amount and the account of the merchant, where the payment request is used for requesting that the digital asset of the target amount in the account corresponding to the payment code be transferred to the account of the merchant; the code scanning device sends the payment request to a third party server; after receiving the payment request, the third party server transfers the digital asset of the target amount in the account corresponding to the payment code to the account of the merchant; after completing the transfer of the digital asset, the third party server may send an indication of payment success to the code scanning device. The third party server is a server of a payment platform, such as a server of a bank, and is used for completing the transfer of digital assets. It will be appreciated that the third party server may also send a deduction-success message to the second terminal after the transfer of the digital asset is completed.
The embodiment of the application also provides electronic equipment, which comprises one or more processors and one or more memories; wherein the one or more memories are coupled to the one or more processors, the one or more memories being operable to store computer program code comprising computer instructions that, when executed by the one or more processors, cause the electronic device to perform the methods described in the above embodiments.
Embodiments of the present application also provide a computer program product comprising instructions which, when run on an electronic device, cause the electronic device to perform the method described in the above embodiments.
Embodiments of the present application also provide a computer-readable storage medium including instructions that, when executed on an electronic device, cause the electronic device to perform the method described in the above embodiments.
It is understood that the embodiments of the present application may be arbitrarily combined to achieve different technical effects.
In the above embodiments, the implementation may be realized in whole or in part by software, hardware, firmware, or any combination thereof. When implemented in software, it may be realized in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer program instructions are loaded and executed on a computer, the processes or functions described in the present application are produced in whole or in part. The computer may be a general purpose computer, a special purpose computer, a computer network, or another programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center by wired (e.g., coaxial cable, optical fiber, digital subscriber line) or wireless (e.g., infrared, radio, microwave) means. The computer-readable storage medium may be any available medium that can be accessed by a computer, or a data storage device such as a server or data center that integrates one or more available media. The available medium may be a magnetic medium (e.g., floppy disk, hard disk, magnetic tape), an optical medium (e.g., DVD), a semiconductor medium (e.g., solid state disk), or the like.
Those of ordinary skill in the art will appreciate that implementing all or part of the above-described method embodiments may be accomplished by a computer program to instruct related hardware, the program may be stored in a computer readable storage medium, and the program may include the above-described method embodiments when executed. And the aforementioned storage medium includes: ROM or random access memory RAM, magnetic or optical disk, etc.
In summary, the foregoing is merely exemplary embodiments of the present application and is not intended to limit the scope of the present application. Any modification, equivalent replacement, improvement, etc. made according to the disclosure of the present application should be included in the protection scope of the present application.
Claims (35)
1. A remote payment method, the method being applied to a communication system comprising a first terminal and a second terminal, the method comprising:
the first terminal and the second terminal establish communication connection;
the first terminal acquires a first video stream and a second video stream through a first camera and a second camera respectively, wherein the second video stream comprises a checkout code;
the first terminal synthesizes the first video stream and the second video stream to obtain a synthesized video stream;
The first terminal sends the composite video stream to the second terminal;
the second terminal identifying the checkout code from the composite video stream;
the second terminal completes payment for the checkout code.
2. The method of claim 1, wherein after the first terminal and the second terminal establish a communication connection, the method further comprises:
the first terminal sends a payment request to the second terminal when detecting a first user operation, wherein the payment request is used for requesting payment;
after receiving the payment request, the second terminal displays prompt information and a control, wherein the prompt information is used for prompting the user whether to receive a checkout code from the first terminal;
when the second terminal detects a second user operation for the control, sending an agreement message to the first terminal;
the first terminal obtains a first video stream and a second video stream through a first camera and a second camera respectively, and the method comprises the following steps: and after receiving the agreement message, the first terminal acquires the first video stream and the second video stream through the first camera and the second camera respectively.
3. The method according to claim 1 or 2, wherein the first video stream comprises a face of a user of the first terminal.
4. A method according to any of claims 1-3, wherein the first terminal synthesizes the first video stream and the second video stream to obtain a synthesized video stream, comprising:
and the first terminal synthesizes two frames of images with the same shooting time in the first video stream and the second video stream into one frame of image in the synthesized video stream.
5. The method according to any one of claims 1-4, wherein,
the first terminal sending the composite video stream to the second terminal, comprising: the first terminal encrypts the synthesized video stream to obtain an encrypted synthesized video stream; the first terminal sends the encrypted synthesized video stream to the second terminal;
the method further comprises the steps of: and the second terminal decrypts the encrypted composite video stream when receiving the encrypted composite video stream to obtain the composite video stream.
6. The method according to any one of claims 1-5, further comprising:
The second terminal identifies environmental information in each frame of image in the composite video stream; the environmental information includes at least one feature quantity;
the second terminal determines payment safety when the number of the same characteristic quantity of every two adjacent frames of images in the synthesized video stream is larger than a preset number;
the second terminal completing payment for the checkout code based on the composite video stream, comprising: and the second terminal completes payment for the checkout code based on the composite video stream when determining payment security.
7. The method according to any one of claims 1-5, further comprising:
the second terminal determines a front part and a rear part in each frame of image in the composite video stream, wherein the front part is an image shot by the first camera, and the rear part is an image shot by the second camera;
the second terminal calculates motion vectors of a front part and a rear part of two adjacent frames of images in the synthesized video stream respectively;
the second terminal determines payment security when the direction of the motion vector of the front part is consistent with that of the motion vector of the rear part;
The second terminal completing payment for the checkout code based on the composite video stream, comprising: and the second terminal completes payment for the checkout code based on the composite video stream when determining payment security.
8. A remote payment method, the method being applied to a communication system comprising a first terminal and a second terminal, the method comprising:
the first terminal and the second terminal establish communication connection;
the first terminal obtains a first video stream through a camera, and generates a second video stream based on a payment code, wherein the second video stream comprises the payment code;
the first terminal synthesizes the first video stream and the second video stream to obtain a synthesized video stream, wherein the synthesized video stream comprises the payment code;
the first terminal sends the composite video stream to the second terminal;
the second terminal displays the composite video stream.
9. The method of claim 8, wherein after the first terminal and the second terminal establish a communication connection, the method further comprises:
the second terminal sends a payment request to the first terminal when detecting a first user operation, wherein the payment request is used for requesting a payment code;
After receiving the payment request, the first terminal displays prompt information and a control, wherein the prompt information is used for prompting a user whether to send a payment code to the second terminal;
the first terminal obtains a first video stream through a camera, generates a second video stream based on a payment code, and comprises: and when the first terminal detects the second user operation aiming at the control, acquiring the first video stream through the camera, and generating the second video stream according to the payment code.
10. The method according to claim 8 or 9, wherein the first video stream comprises a face of a user of the first terminal.
11. The method according to any one of claims 8-10, wherein the first terminal synthesizes the first video stream and the second video stream to obtain a synthesized video stream, comprising:
and the first terminal synthesizes two frames of images acquired at the same time in the first video stream and the second video stream into one frame of image in the synthesized video stream.
12. The method according to any one of claims 8 to 11, wherein,
the first terminal sending the composite video stream to the second terminal, comprising: the first terminal encrypts the synthesized video stream to obtain an encrypted synthesized video stream; the first terminal sends the encrypted synthesized video stream to the second terminal;
The method further comprises the steps of: and the second terminal decrypts the encrypted composite video stream when receiving the encrypted composite video stream to obtain the composite video stream.
13. The method according to any one of claims 8-12, further comprising:
the second terminal identifies environmental information in each frame of image in the composite video stream; the environmental information includes at least one feature quantity;
the second terminal determines payment safety when the number of the same characteristic quantity of every two adjacent frames of images in the synthesized video stream is larger than a preset number;
the second terminal displaying the composite video stream, comprising: and the second terminal displays the composite video stream when determining payment security.
14. The method according to any one of claims 8-12, further comprising:
the second terminal determines a front part and a rear part in each frame of image in the composite video stream, wherein the front part is an image shot by the first camera, and the rear part is an image shot by the second camera;
the second terminal calculates motion vectors of a front part and a rear part of two adjacent frames of images in the synthesized video stream respectively;
The second terminal determines payment security when the direction of the motion vector of the front part is consistent with that of the motion vector of the rear part;
the second terminal displaying the composite video stream, comprising: and the second terminal displays the composite video stream when determining payment security.
15. A remote payment method, applied to a first terminal, the method comprising:
after the first terminal establishes a communication connection with the second terminal, the first terminal acquires a first video stream and a second video stream through a first camera and a second camera respectively, wherein the second video stream comprises a checkout code;
the first terminal synthesizes the first video stream and the second video stream to obtain a synthesized video stream;
the first terminal sends the composite video stream to the second terminal, the composite video stream being used by the second terminal to identify the checkout code from the composite video stream and complete payment for the checkout code.
16. The method of claim 15, wherein the first terminal, after establishing the communication connection with the second terminal, further comprises:
the first terminal sends a payment request to the second terminal when detecting a first user operation, wherein the payment request is used for requesting payment;
The first terminal obtains a first video stream and a second video stream through a first camera and a second camera respectively, and the method comprises the following steps: and after receiving an agreement message, the first terminal acquires the first video stream and the second video stream through the first camera and the second camera respectively, wherein the agreement message is generated by the second terminal based on a second user operation after receiving the payment request.
17. The method according to claim 15 or 16, wherein the first video stream comprises a face of a user of the first terminal.
18. The method according to any one of claims 15-17, wherein the first terminal synthesizes the first video stream and the second video stream to obtain a synthesized video stream, comprising:
and the first terminal synthesizes two frames of images with the same shooting time in the first video stream and the second video stream into one frame of image in the synthesized video stream.
19. The method according to any one of claims 15-18, wherein the first terminal sending the composite video stream to the second terminal comprises:
the first terminal encrypts the composite video stream to obtain an encrypted composite video stream;
the first terminal sends the encrypted composite video stream to the second terminal, wherein the encrypted composite video stream is used by the second terminal to obtain the composite video stream.
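As an illustration only (not part of the claims), the encrypt-then-send step of claim 19 and the matching decryption of claim 22 can be sketched per frame. The application does not name a cipher; a real implementation would use a vetted authenticated cipher such as AES-GCM. So that this sketch stays standard-library-only, a toy XOR keystream derived with HMAC-SHA256 stands in for the cipher, and it must not be read as security guidance.

```python
# Hypothetical per-frame encrypt/decrypt sketch for claims 19 and 22.
# The XOR keystream is a stand-in for a real AEAD cipher (assumption).
import hashlib
import hmac
import secrets

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    """Derive a keystream of the requested length from (key, nonce)."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hmac.new(key, nonce + counter.to_bytes(8, "big"),
                        hashlib.sha256).digest()
        counter += 1
    return out[:length]

def encrypt_frame(key: bytes, frame: bytes):
    """First terminal: encrypt one composite frame under a fresh nonce."""
    nonce = secrets.token_bytes(12)
    ks = keystream(key, nonce, len(frame))
    return nonce, bytes(a ^ b for a, b in zip(frame, ks))

def decrypt_frame(key: bytes, nonce: bytes, ciphertext: bytes) -> bytes:
    """Second terminal: recover the composite frame (claim 22)."""
    ks = keystream(key, nonce, len(ciphertext))
    return bytes(a ^ b for a, b in zip(ciphertext, ks))

key = secrets.token_bytes(32)
frame = b"composite video frame bytes"
nonce, ct = encrypt_frame(key, frame)
assert decrypt_frame(key, nonce, ct) == frame
```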
20. A remote payment method, applied to a second terminal, the method comprising:
after the second terminal establishes a communication connection with the first terminal, the second terminal receives a composite video stream sent by the first terminal, wherein the composite video stream comprises a collection code, the composite video stream is combined by the first terminal from a first video stream and a second video stream obtained through a first camera and a second camera respectively, and the second video stream comprises the collection code;
the second terminal identifies the collection code from the composite video stream;
the second terminal completes a payment for the collection code.
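As an illustration only (not part of the claims), the receiver side of claim 20 can be sketched as scanning composite frames until a collection code is decoded, then handing that code to a payment handler. Real code identification would use a QR/barcode decoder (for example ZXing or ZBar); here `decode_code` is a hypothetical stand-in and frames carry an optional payload directly, purely so the sketch runs.

```python
# Hypothetical sketch of claim-20 code identification on the second
# terminal. `decode_code` stands in for a real QR decoder (assumption);
# frames are dicts, and a "code" key simulates a decodable frame.

def decode_code(frame):
    """Stand-in for QR detection on one image frame."""
    return frame.get("code")

def identify_collection_code(composite_stream):
    """Scan frames in order; return the first decoded collection code."""
    for frame in composite_stream:
        code = decode_code(frame)
        if code is not None:
            return code
    return None

# A blurry frame followed by a frame in which the code is readable;
# "PAY-123" is an invented illustrative value.
stream = [{"pixels": "..."}, {"pixels": "...", "code": "PAY-123"}]
assert identify_collection_code(stream) == "PAY-123"
```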
21. The method of claim 20, wherein before the second terminal receives the composite video stream from the first terminal, the method further comprises:
after receiving a payment request from the first terminal, the second terminal displays prompt information and a control, wherein the prompt information is used to prompt a user whether to receive a collection code from the first terminal;
when the second terminal detects a second user operation on the control, the second terminal sends an agreement message to the first terminal, wherein the agreement message is used to instruct the first terminal to send the composite video stream to the second terminal.
22. The method according to claim 20 or 21, wherein the second terminal receiving the composite video stream from the first terminal comprises:
the second terminal receives an encrypted composite video stream sent by the first terminal;
the second terminal decrypts the encrypted composite video stream to obtain the composite video stream.
23. A remote payment method, applied to a first terminal, the method comprising:
after the first terminal establishes a communication connection with the second terminal, the first terminal obtains a first video stream through a camera and generates a second video stream based on a payment code, wherein the second video stream comprises the payment code;
the first terminal combines the first video stream and the second video stream to obtain a composite video stream, wherein the composite video stream comprises the payment code;
the first terminal sends the composite video stream to the second terminal, so that the second terminal displays the composite video stream.
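As an illustration only (not part of the claims), the second-stream generation of claim 23 can be sketched as rendering the payment code into a frame for each capture timestamp, so the code stream can be composed with the camera stream frame by frame. `render_code_frame` is a hypothetical stand-in for drawing an actual QR code image, and all names are assumptions.

```python
# Hypothetical sketch of claim-23 second-stream generation: one rendered
# payment-code frame per camera timestamp. The QR rendering is stubbed.

def render_code_frame(payment_code: str, timestamp: float):
    """Stand-in for rendering the payment code as a QR image frame."""
    return {"t": timestamp, "code_image": f"QR({payment_code})"}

def generate_code_stream(payment_code: str, timestamps):
    """Generate the second video stream from the payment code."""
    return [render_code_frame(payment_code, t) for t in timestamps]

# One code frame per camera frame at ~30 fps; "code-xyz" is illustrative.
frames = generate_code_stream("code-xyz", [0.0, 0.033, 0.066])
assert len(frames) == 3
assert frames[0]["code_image"] == "QR(code-xyz)"
```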
24. The method of claim 23, wherein after the first terminal establishes the communication connection with the second terminal, the method further comprises:
after receiving a payment request from the second terminal, the first terminal displays prompt information and a control, wherein the prompt information is used to prompt a user whether to send a payment code to the second terminal, the payment request is generated by the second terminal based on a first user operation, and the payment request is used to request the payment code;
wherein the first terminal obtaining the first video stream through the camera and generating the second video stream based on the payment code comprises: when the first terminal detects a second user operation on the control, the first terminal obtains the first video stream through the camera and generates the second video stream according to the payment code.
25. The method according to claim 23 or 24, wherein the first video stream comprises a face of a user of the first terminal.
26. The method according to any one of claims 23-25, wherein the first terminal combining the first video stream and the second video stream to obtain the composite video stream comprises:
the first terminal combines two frames captured at the same time in the first video stream and the second video stream into one frame of the composite video stream.
27. The method according to any one of claims 23-26, wherein the first terminal sending the composite video stream to the second terminal comprises:
the first terminal encrypts the composite video stream to obtain an encrypted composite video stream;
the first terminal sends the encrypted composite video stream to the second terminal, wherein the encrypted composite video stream is used by the second terminal to obtain the composite video stream.
28. A remote payment method, applied to a second terminal, the method comprising:
after the second terminal establishes a communication connection with the first terminal, the second terminal receives a composite video stream sent by the first terminal, wherein the composite video stream comprises a payment code, the composite video stream is combined from a first video stream and a second video stream, the first video stream is obtained by the first terminal through a camera, the second video stream is generated by the first terminal based on the payment code, and the second video stream comprises the payment code;
the second terminal displays the composite video stream.
29. The method of claim 28, wherein before the second terminal receives the composite video stream from the first terminal, the method further comprises:
the second terminal sends a payment request to the first terminal upon detecting a first user operation, wherein the payment request is used to request a payment code.
30. The method according to claim 28 or 29, wherein the first video stream comprises a face of a user of the first terminal.
31. An electronic device comprising one or more processors and one or more memories; wherein the one or more memories are coupled to the one or more processors, the one or more memories for storing computer program code comprising computer instructions that, when executed by the one or more processors, cause the electronic device to perform the method of any of claims 15-22 or 23-30.
32. A computer program product comprising instructions which, when run on an electronic device, cause the electronic device to perform the method of any of claims 15-22 or 23-30.
33. A computer readable storage medium comprising instructions which, when run on an electronic device, cause the electronic device to perform the method of any of claims 15-22 or 23-30.
34. A payment system comprising a first electronic device for performing the method of any of claims 15-19 and a second electronic device for performing the method of any of claims 20-22.
35. A payment system comprising a first electronic device for performing the method of any of claims 23-27 and a second electronic device for performing the method of any of claims 28-30.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111248039.1A CN116029716A (en) | 2021-10-26 | 2021-10-26 | Remote payment method, electronic equipment and system |
PCT/CN2022/127046 WO2023071985A1 (en) | 2021-10-26 | 2022-10-24 | Remote payment method, electronic device, and system |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111248039.1A CN116029716A (en) | 2021-10-26 | 2021-10-26 | Remote payment method, electronic equipment and system |
Publications (1)
Publication Number | Publication Date |
---|---|
CN116029716A true CN116029716A (en) | 2023-04-28 |
Family
ID=86073224
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202111248039.1A Pending CN116029716A (en) | 2021-10-26 | 2021-10-26 | Remote payment method, electronic equipment and system |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN116029716A (en) |
WO (1) | WO2023071985A1 (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN118053243B (en) * | 2024-01-18 | 2024-07-12 | 重庆嗨客网络科技有限公司 | Payment collection voice broadcast detection method, system, terminal and storage medium |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108712602A (en) * | 2018-04-24 | 2018-10-26 | Oppo广东移动通信有限公司 | Camera control method, device, mobile terminal and storage medium |
CN108629586B (en) * | 2018-05-04 | 2023-07-07 | 努比亚技术有限公司 | Payment verification method, device and computer readable storage medium |
CN109151180B (en) * | 2018-07-27 | 2020-09-01 | 维沃移动通信有限公司 | Object identification method and mobile terminal |
CN112732994B (en) * | 2021-01-07 | 2022-01-28 | 上海携宁计算机科技股份有限公司 | Method, device and equipment for extracting webpage information and storage medium |
- 2021-10-26: CN application CN202111248039.1A (CN116029716A) filed, status Pending
- 2022-10-24: PCT application PCT/CN2022/127046 (WO2023071985A1) filed, Application Filing
Also Published As
Publication number | Publication date |
---|---|
WO2023071985A1 (en) | 2023-05-04 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN110321790B (en) | Method for detecting countermeasure sample and electronic equipment | |
US11910197B2 (en) | Service processing method and device | |
CN111046365A (en) | Face image transmission method, numerical value transfer method, device and electronic equipment | |
CN110111101B (en) | Electronic payment method and electronic device | |
CN114040242B (en) | Screen projection method, electronic equipment and storage medium | |
CN113408016B (en) | Method and device for storing ciphertext | |
CN114553814B (en) | Method and device for processing push message | |
CN110751503A (en) | Advertisement processing method and electronic equipment | |
US20220335107A1 (en) | Method for Processing Application Program and Related Product | |
CN110138999B (en) | Certificate scanning method and device for mobile terminal | |
EP4181003A1 (en) | Permission management method and terminal device | |
WO2023071985A1 (en) | Remote payment method, electronic device, and system | |
CN111027374B (en) | Image recognition method and electronic equipment | |
CN116669020B (en) | Password management method, password management system and electronic equipment | |
CN116305093B (en) | Method for operating applet and electronic device | |
US20240370537A1 (en) | Cross-Device Authentication Method and Apparatus | |
EP4250141A1 (en) | Page display method and apparatus, and electronic device and readable storage medium | |
CN116527266A (en) | Data aggregation method and related equipment | |
CN115239322A (en) | Payment method | |
CN113867851A (en) | Electronic equipment operation guide information recording method, electronic equipment operation guide information acquisition method and terminal equipment | |
CN111741040A (en) | Connection establishing method, address obtaining method, device, equipment and storage medium | |
CN117234351B (en) | Touch processing method, terminal equipment and storage medium | |
US20240233933A1 (en) | Contact tracing method and related device | |
CN117641359A (en) | Data processing method and electronic equipment | |
CN114117458A (en) | Key using method and related product |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||