
CN116301541A - Method for sharing file, electronic device and computer readable storage medium - Google Patents


Info

Publication number
CN116301541A
Authority
CN
China
Prior art keywords
application
file
window
user
sharing
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310226143.3A
Other languages
Chinese (zh)
Inventor
官睿
许昭宇
屠子恂
洪英明
赵玉航
肖袁圆
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd filed Critical Huawei Technologies Co Ltd
Priority to CN202310226143.3A priority Critical patent/CN116301541A/en
Publication of CN116301541A publication Critical patent/CN116301541A/en
Priority to PCT/CN2023/120145 priority patent/WO2024178962A1/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/0486Drag-and-drop
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04842Selection of displayed objects or displayed text elements
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00Arrangements for program control, e.g. control units
    • G06F9/06Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44Arrangements for executing specific programs
    • G06F9/451Execution arrangements for user interfaces

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Software Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present application relates to the field of terminal technologies and discloses a method for sharing files, an electronic device, and a computer readable storage medium. The method for sharing files is applied to an electronic device and includes: displaying a conference window of conference software; detecting a drag operation performed by a user on a first file; and, in response to the drag operation, causing the conference software to open the first file through a target application and setting the window of the opened first file as a shared window on the conference window. With this method, when a user shares a file in a conference, the target file to be shared does not need to be opened in advance, and no cumbersome sharing operation within the conference software is required: the user only needs to drag the target file to be shared into an area of the conference software window. The file sharing process is simple and quick, and convenient for the user to operate.

Description

Method for sharing file, electronic device and computer readable storage medium
Technical Field
The present disclosure relates to the field of terminal technologies, and in particular, to a method for sharing files, an electronic device, and a computer readable storage medium.
Background
Teleconferencing has become an indispensable part of enterprise communication and collaboration, and various kinds of conference software or conference applications are installed on the terminal electronic devices, such as mobile phones and computers, that users use for teleconferencing. When communicating through conference software, conference materials such as documents and pictures often need to be shared, so that all parties in the conference can accurately understand the communication content and cooperate efficiently. For convenience of description, shared documents, pictures, and other conference materials may be referred to below as target files. It can be understood that, with some conference software, the user usually first has to open the target file on the electronic device, such as a mobile phone, through an application program capable of opening it, such as a document (Word) or slide (PPT) application. The user can then click the sharing control on the conference window and, following the prompts, select the window of the corresponding document or picture to share the file.
However, the above process of sharing files involves too many user operation steps, and it depends on the target file already having been opened by the corresponding application. When the target file has not been opened, no corresponding window is displayed on the electronic device, such as a mobile phone, and the user cannot find an opened window of the target file on the conference window to share, resulting in a poor user experience.
Disclosure of Invention
The present application aims to provide a method for sharing files, an electronic device, and a computer readable storage medium.
A first aspect of the present application provides a method for sharing a file, applied to a first electronic device, where the method includes: displaying a first application interface of a first application; detecting a first operation of a user on a first file; and, in response to the first operation, displaying a second application interface of a second application in the first application interface, where the second application interface includes the opened first file.
It can be appreciated that, in one embodiment, the first application may be, for example, conference software, and the first application interface may be, for example, a conference window opened by the conference software. When the user shares the first file, the first file is, for example, dragged to the conference window through the first operation. The first file is then opened by the second application, which displays a second application interface, for example an interface window showing the first file. The interface window of the first file is then displayed on the conference window, thereby achieving the purpose of sharing the first file in the conference software.
In a possible implementation of the first aspect, the first application interface includes a first sharing area, and detecting the first operation of the user on the first file includes: detecting an operation of the user dragging the first file to the first sharing area; and, in response to the first operation, displaying the second application interface of the second application in the first application interface includes: displaying the second application interface in the first sharing area.
It can be appreciated that, in one embodiment, the first sharing area may be, for example, an all-participants sharing area into which the user drags the first file. For example, when the user drags the first file into the first sharing area and releases it, the electronic device can detect the coordinate position at which the user released the first file, calculate the drop point position information of the first file from that coordinate position, and determine from the drop point position information that the first file falls within the first sharing area. When the drop point of the first file is determined to be in the first sharing area, the conference window can share the second application interface with all participants, for example, the interface window of the opened first file.
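For illustration, the hit test described above can be sketched in a few lines of Kotlin. This is a minimal sketch assuming a rectangular sharing area; all type and function names (Rect, DropEvent, isDropInShareArea) are invented for this example and do not come from the patent.

```kotlin
// Minimal sketch of the drop-point hit test; names are illustrative only.
data class Rect(val left: Int, val top: Int, val right: Int, val bottom: Int) {
    fun contains(x: Int, y: Int): Boolean = x in left..right && y in top..bottom
}

data class DropEvent(val x: Int, val y: Int, val filePath: String)

// Returns true when the release coordinates fall inside the
// all-participants sharing area, i.e. the drop should trigger sharing.
fun isDropInShareArea(event: DropEvent, shareArea: Rect): Boolean =
    shareArea.contains(event.x, event.y)
```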
In a possible implementation of the first aspect, the first sharing area displays a third application interface of a third application and includes a first area that triggers split-screen sharing and a second area that triggers replacement sharing; and the method includes: detecting an operation of the user dragging the first file to the first area, and displaying the second application interface and the third application interface in split screen in the first sharing area; and detecting an operation of the user dragging the first file to the second area, and replacing the third application interface displayed in the first sharing area with the second application interface.
It can be appreciated that, in one embodiment, the third application interface may be, for example, a file being shared by the user or another participant. When the user wants to share a further file, the first file can be dragged to the split-screen sharing area or the replacement sharing area within the first sharing area, so as to share the first file in different ways, as in the sketch below. When the user drags the first file to the split-screen sharing area, the first sharing area of the conference window shares, in split screen, the window of the first file dragged by the user and the file window being shared by the other participant. When the user drags the first file into the replacement sharing area, the first sharing area of the conference window shares only the window of the first file dragged by the user.
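Continuing the previous sketch (and reusing its Rect and DropEvent types), the choice between split-screen and replacement sharing reduces to testing which sub-area the drop landed in; the enum and region names are again invented for illustration.

```kotlin
// Illustrative dispatch between the two trigger areas; assumes the Rect and
// DropEvent types from the previous sketch.
enum class ShareMode { SPLIT_SCREEN, REPLACE }

fun shareModeFor(event: DropEvent, splitArea: Rect, replaceArea: Rect): ShareMode? =
    when {
        splitArea.contains(event.x, event.y)   -> ShareMode.SPLIT_SCREEN
        replaceArea.contains(event.x, event.y) -> ShareMode.REPLACE
        else                                   -> null // dropped outside both trigger areas
    }
```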
In a possible implementation of the first aspect, the first sharing area includes a plurality of sharing windows displaying different contents, and the first application interface includes at least one window control for indicating switching to the corresponding sharing window; and the method includes: detecting a user operation acting on a first window control among the at least one window control, and switching the content displayed in the first sharing area to the display content of the first window indicated by the first window control.
It can be appreciated that, in one embodiment, the user shares multiple windows on the conference window, in which case a switch control is displayed on the conference window. Through the switch control, the user can display the desired sharing window as the foreground sharing window on the conference window, while the other windows are temporarily hidden as background sharing windows.
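As a rough illustration of this foreground/background switching, the shared windows can be modeled as a list with a single foreground index; the class below is a hypothetical sketch, not the patent's implementation.

```kotlin
// Hypothetical model of multiple shared windows with one foreground window.
class SharedWindowSet<W>(private val windows: List<W>) {
    var foregroundIndex: Int = 0
        private set

    // The window currently rendered in the sharing area.
    val foreground: W get() = windows[foregroundIndex]

    // Called when the user taps the window control for window i; the other
    // windows remain shared but are temporarily hidden in the background.
    fun switchTo(i: Int) {
        require(i in windows.indices) { "no shared window at index $i" }
        foregroundIndex = i
    }
}
```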
In a possible implementation of the first aspect, the first application interface includes at least one second sharing area, where the second sharing area includes an identification of a first user using the first application, the first user identification indicating the first user's viewing authority for the content displayed in the second sharing area, and detecting the first operation of the user on the first file includes: detecting an operation of the user dragging the first file to the at least one second sharing area; and, in response to the first operation, displaying the second application interface of the second application in the first application interface includes: in response to the first operation, displaying the second application interface in the second sharing area on which the first operation acts.
It can be appreciated that, in one embodiment, the second sharing area may be, for example, a participant's avatar area, and the first user identification may be, for example, a participant's avatar. By dragging the first file onto the avatar of a designated participant, the user can share the first file with that single participant.
In a possible implementation of the first aspect, the first electronic device detects the first operation of the user on the first file in the following manner: acquiring first position information of the first file corresponding to the first operation; and when it is determined that the position indicated by the first position information is located in a target area of the first application interface that triggers file sharing, detecting the first operation.
It can be appreciated that, in one embodiment, the first operation may represent, for example, an operation in which the user drags the first file for sharing. When the user releases the first file, if the drop point of the first file is located within the conference window, the user's operation can be determined to be a file sharing operation. The drop point of the first file may be calculated based on the position where the user released the first file.
In a possible implementation of the first aspect, after detecting the first operation of the user on the first file, the method further includes: in response to the first operation, displaying a first prompt interface, where the first prompt interface is used to prompt the user to drag the first file to a designated position or a designated area within the area covered by the first application interface.
It can be appreciated that, in one embodiment, the first prompt interface may include, for example, prompt information indicating the release position to which the user should drag the first file for sharing, or instructing the user to drag and release the first file in different areas so as to share the first file in different ways.
In a possible implementation of the first aspect, the first application is an application running on a second electronic device, and displaying the first application interface of the first application includes: receiving, by the first electronic device, a first information stream sent by the second electronic device, where the first information stream is used for displaying the first application interface; and displaying the first application interface.
It can be appreciated that, in one embodiment, the first application may be, for example, conference software running on a vehicle-mounted terminal, and the first application interface may be, for example, the conference software interface. The first electronic device may be, for example, a mobile phone, and the first information stream may include, for example, process information of the running conference software application. After the mobile phone receives the first information stream, a conference window of the conference software can be displayed, for example, on a multi-device collaboration interface.
In a possible implementation of the first aspect, the first electronic device runs a fourth application, a first file is displayed on a fourth application interface provided by the fourth application, and a fifth application interface provided by the fourth application displays the first application interface of the first application run by the second electronic device; and the first operation includes: a second operation of dragging the first file from the fourth application interface to the fifth application interface, and a third operation of dragging the first file within the first application interface, where the second operation and the third operation are continuous operations.
It can be appreciated that, in one embodiment, the first application is, for example, an application whose interface is shown after the mobile phone enters the multi-device collaboration interface. The fourth application interface may be, for example, the interface of the multi-device collaboration interface corresponding to an application running on the mobile phone; after the first file is opened on the mobile phone, the opened first file is displayed on the fourth application interface. The fifth application interface may be, for example, the interface of the multi-device collaboration interface corresponding to an application running on the vehicle; the conference window of the conference software running on the vehicle is displayed on the fifth application interface. It can be understood that the fourth application interface and the fifth application interface both belong to the multi-device collaboration interface but correspond to applications running on different electronic devices. The second operation switches the fourth application interface to the fifth application interface on the multi-device collaboration interface, and the third operation drags the corresponding first file from the fourth application interface to the conference software on the fifth application interface, thereby sharing the file across devices.
In a possible implementation of the first aspect, the second operation of dragging the first file from the fourth application interface to the fifth application interface includes: a fourth operation of dragging the first file from the fourth application interface onto a second window control for indicating switching to the fifth application interface, and a fifth operation of dragging the first file into the corresponding area of the fifth application interface, where the fourth operation and the fifth operation are continuous operations.
It can be appreciated that, in some embodiments, switching between the fourth application interface and the fifth application interface in the multi-device collaboration interface may be accomplished through a touch control. For example, the fourth application interface may be switched to the fifth application interface by dragging the first file onto the control corresponding to the fifth application interface through the fourth operation. The fifth operation is the drag operation on the first file after the fourth application interface has been switched to the fifth application interface, for example dragging the first file onto the conference window of the conference software.
In a possible implementation of the first aspect, the method further includes: the first application includes any one of an instant messaging application, conference software, and a live application.
A second aspect of the present application provides an electronic device comprising: one or more processors; one or more memories; the one or more memories store one or more programs that, when executed by the one or more processors, cause the electronic device to perform the method of sharing files described above.
A third aspect of the present application provides a computer-readable storage medium having stored thereon instructions that, when executed on a computer, cause the computer to perform the method of sharing files described above.
Drawings
FIG. 1 illustrates an application scenario diagram of sharing files in a teleconference;
FIG. 2 shows a schematic diagram of a user sharing a file according to an embodiment of the present application;
FIG. 3 illustrates a schematic structural diagram of an electronic device according to an embodiment of the present application;
FIG. 4 is a schematic flow chart of an implementation of a method for sharing files according to embodiment 1 of the present application;
FIG. 5 is an interface schematic diagram illustrating a user performing a sharing operation on a target file on a mobile phone according to embodiment 1 of the present application;
FIG. 6 is an interface schematic diagram illustrating a user performing a sharing operation on a target file at a task management interface of a mobile phone according to embodiment 1 of the present application;
FIG. 7 is a schematic interface diagram illustrating setting a target file as a shared window according to embodiment 1 of the present application;
FIG. 8 is a schematic diagram of a user sharing a target file on a computer in different sharing manners according to embodiment 1 of the present application;
FIG. 9a is a schematic diagram illustrating the operation of packet sharing according to embodiment 1 of the present application;
FIG. 9b is a schematic diagram illustrating another packet sharing operation according to embodiment 1 of the present application;
FIG. 10 is a schematic diagram showing a user switching between different shared windows according to embodiment 1 of the present application;
FIG. 11 is a schematic diagram of an interface for completing sharing of a target file by a user performing a mobile operation on a computer according to embodiment 1 of the present application;
FIG. 12 shows a software architecture block diagram of an electronic device according to embodiment 1 of the present application;
FIG. 13 shows an interactive flow diagram of a file sharing process according to embodiment 1 of the present application;
FIG. 14 illustrates a schematic diagram of a multi-device collaboration interface, according to embodiment 2 of the present application;
FIG. 15 illustrates an interactive flow diagram for sharing files across devices according to embodiment 2 of the present application;
FIG. 16 is a schematic diagram of an interface for performing a move operation on a window to complete window sharing in a multi-device collaboration interface according to embodiment 2 of the present application.
Detailed Description
In order to make the objects, technical solutions, and advantages of the embodiments of the present application clearer, the technical solutions of the embodiments of the present application are described in detail below with reference to the accompanying drawings and specific embodiments of the present application.
Fig. 1 shows a schematic view of an application scenario in which files are shared in a teleconference.
As shown in fig. 1, the scenario includes a computer 10. The computer 10 runs conference software and displays a conference window 11. As shown in interface 10a of the computer 10, when a user wants to share a target file, the target file first needs to be opened and displayed on the screen, for example in the form of a window 12. The user then clicks the sharing control 13 on the conference window 11 through operation 14. Next, as shown in interface 10b of the computer 10, the conference software generates, in response to operation 14, a selection window 15 for selecting the target file; the selection window 15 includes window icons for all windows opened by the user, such as window icon 17 for window 12. After the user selects window icon 17 of window 12 through operation 16, the conference software shares window 12 in response to operation 16, and online participants can view the file content displayed in window 12.
It can be seen that, in this process of sharing files, the target file that the user wants to share in the meeting must first be opened by a corresponding application program; for example, a document the user wants to share must first be opened by a Word application or the like. The user can then select the application window in which the target file is opened, through the sharing control provided on the conference software window, to share it. Therefore, for an unopened target file, the sharing operation during the conference is cumbersome, which degrades the user experience.
In order to solve the above problems, the present application provides a method for sharing files, applied to an electronic device on which conference software is installed. Specifically, when a specific user operation indicating sharing of a target file in a conference is detected, if it is determined that the target file is not open, the file can be opened by a pre-installed application program capable of opening it, and the window in which the target file is opened is then set as the sharing window, so that the target file can be quickly shared during the conference. The specific user operation may be, for example, an operation of dragging the target file to the conference window created by the conference software currently running on the electronic device, such as a computer or a mobile phone. The target file may be, for example, a file in the format of a Word document, a PPT document, a drawing file, or the like, and the pre-installed application program capable of opening the target file may be, for example, a Word application, a PPT application, a drawing application, or the like, which is not limited here. Therefore, when sharing the target file, the user does not need to open it first; performing the specific user operation, for example dragging the target file into the conference window, triggers the corresponding application to open the target file, and the window in which the target file is opened is automatically set as the sharing window and displayed in the conference window for sharing, which is convenient for the user.
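The overall flow can be summarized in a short Kotlin sketch: if a window already shows the target file it is reused, otherwise a pre-installed target application is asked to open the file, and the resulting window is promoted to the sharing window. Everything here (the Window, TargetApplication, and Conference types and the extension-based lookup) is an assumption made for illustration, not the patent's implementation.

```kotlin
// Hypothetical types standing in for the real window manager, target
// application, and conference software; none of these names are from the patent.
interface Window
interface TargetApplication { fun open(filePath: String): Window }
interface Conference { fun setSharedWindow(window: Window) }

class FileShareController(
    private val openWindows: MutableMap<String, Window>,        // filePath -> window
    private val appForExtension: (String) -> TargetApplication  // e.g. "docx" -> Word app
) {
    // Invoked when a drag of `filePath` is released on the conference window.
    fun onFileDropped(filePath: String, conference: Conference) {
        // Reuse the window if the file is already open; otherwise launch the
        // matching target application to open it.
        val window = openWindows.getOrPut(filePath) {
            appForExtension(filePath.substringAfterLast('.')).open(filePath)
        }
        // Set the window showing the target file as the sharing window.
        conference.setSharedWindow(window)
    }
}
```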
Further, when the user switches away from the sharing window or opens another interface, the electronic device may control the sharing window to continue displaying the target file shared by the user, or control the sharing window to display blank content or a gray-screen window, rather than the other interfaces opened by the user. This protects the user's privacy and improves the user experience.
Setting the window in which the target file is opened as the sharing window, in the area where the user operation released the target file, can be implemented by identifying the drop point position information at which the user operation released the target file. For example, after receiving the user's release of the dragged target file, the electronic device responds to the release operation by recording the drop point position information of the target file and generating a sharing broadcast, where the sharing broadcast includes the drop point position information of the target file and receiving permission information, and the receiving permission information indicates that the conference software can receive the sharing broadcast. After receiving the sharing broadcast, the conference software determines, from the drop point position information of the target file, whether the user wants to share the target file. The conference software determines the attribute of the target file from the sharing broadcast, then determines the window displaying the content of the target file according to that attribute, and sets that window as the sharing window. When the target file is not open, the electronic device may first run a preset application program to open the target file and create a window displaying the content of the target file, and then set the newly created window as the sharing window.
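On Android-based devices, one plausible realization of such a "sharing broadcast" is a permission-protected Intent broadcast: Context.sendBroadcast(Intent, String) delivers the broadcast only to receivers holding the named permission, which mirrors the "receiving permission information" above. The action string, extra keys, and permission name below are invented for this sketch; the patent does not specify a broadcast format.

```kotlin
import android.content.Context
import android.content.Intent

// Invented identifiers; the patent does not specify the actual broadcast format.
const val ACTION_SHARE_DROP = "com.example.share.ACTION_SHARE_DROP"
const val PERMISSION_RECEIVE_SHARE = "com.example.share.permission.RECEIVE_SHARE"

fun sendSharingBroadcast(context: Context, dropX: Float, dropY: Float, fileUri: String) {
    val intent = Intent(ACTION_SHARE_DROP)
        .putExtra("drop_x", dropX)      // drop point position of the target file
        .putExtra("drop_y", dropY)
        .putExtra("file_uri", fileUri)  // lets the receiver determine the file's attribute
    // Only receivers holding PERMISSION_RECEIVE_SHARE (e.g. the conference
    // software) are delivered this broadcast.
    context.sendBroadcast(intent, PERMISSION_RECEIVE_SHARE)
}
```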
It can be understood that, in the process of sharing files through the conference software, the target file is the object on which the user performs the file sharing operation. Accordingly, the application program used to open the target file may be referred to below as the target application.
For example, fig. 2 shows a schematic diagram of an operation interface for sharing files by a user according to the present embodiment.
As shown in fig. 2, a user shares a file through conference software on the computer 20. As shown in interface 20a of the computer 20, the user drags picture 24 from gallery 23 into the area of the conference window 21 of the conference software and releases it through operation 22. The computer 20 obtains the drop point position information of picture 24 and then generates a sharing broadcast, in which the receiving permission information indicates that the conference software can receive the sharing broadcast. After receiving the sharing broadcast, the conference software obtains the drop point position information of picture 24 and determines that the drop point of the picture is within the range of the conference window 21. The conference software parses the sharing broadcast, determines that the attribute of picture 24 is a picture, and then invokes gallery 23 to open picture 24. As shown in interface 20b of the computer 20, the conference software sets the opened picture 24 as the sharing interface 25, so that the sharing interface 25 is displayed full screen on the conference window 21. An operation control 26 is also provided in the sharing interface 25, for the user to operate on the target file during sharing.
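The "attribute" determination in this example can be pictured as a simple mapping from the file's extension to the application that should open it; the mapping below is invented and only meant to illustrate the idea.

```kotlin
// Illustrative attribute-to-application dispatch; the entries are made up.
fun targetApplicationNameFor(fileName: String): String =
    when (fileName.substringAfterLast('.').lowercase()) {
        "jpg", "jpeg", "png", "bmp" -> "gallery"           // pictures -> gallery app
        "doc", "docx"               -> "word application"  // documents -> word processor
        "ppt", "pptx"               -> "ppt application"   // slides -> presentation app
        else                        -> "default viewer"
    }
```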
The structure of an electronic device according to some embodiments of the present application is described in detail below.
Fig. 3 shows a schematic structural diagram of an electronic device according to an embodiment of the present application.
It can be appreciated that the method for sharing files provided in the embodiments of the present application is applicable to electronic devices that may include, but are not limited to: a mobile phone, a tablet, a computer, a smart watch, a vehicle-mounted conference terminal, a desktop computer, a laptop computer, a handheld computer, a netbook, an augmented reality (AR)/virtual reality (VR) device, a smart television, a smart watch or other wearable device, a server, a portable game device, a portable music player, a reader device, or another electronic device with one or more processors embedded or coupled therein, or capable of accessing a network. In the embodiments of the present application, the electronic device 100 shown in fig. 3 may be, for example, a mobile phone or a computer in various possible implementations. To distinguish the descriptions, different numbers are appended to each electronic device below, which is explained here collectively.
As shown in fig. 3, the electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (universal serial bus, USB) interface 130, a charge management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, keys 190, a motor 191, an indicator 192, a camera 193, a display 194, a subscriber identification module (subscriber identification module, SIM) card interface 195, and the like. The sensor module 180 may include a pressure sensor 180A, a gyro sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
It should be understood that the structure illustrated in the embodiments of the present application does not constitute a specific limitation on the electronic device 100. In other embodiments of the present application, the electronic device 100 may include more or fewer components than shown, may combine certain components, may split certain components, or may arrange the components differently. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 110 may include one or more processing units, such as: the processor 110 may include an application processor (application processor, AP), a modem processor, a graphics processor (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), a controller, a video codec, a digital signal processor (digital signal processor, DSP), a baseband processor, and/or a neural network processor (neural-network processing unit, NPU), etc. Wherein the different processing units may be separate devices or may be integrated in one or more processors.
The controller can generate operation control signals according to the instruction operation code and timing signals, thereby controlling instruction fetching and instruction execution. In this embodiment of the present application, the processor 110 may control the opening of the target file through the controller; for the specific execution process, refer to the related description below, which is not repeated here.
A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that the processor 110 has just used or uses cyclically. If the processor 110 needs to use the instructions or data again, it can call them directly from the memory. This avoids repeated accesses and reduces the waiting time of the processor 110, thereby improving the efficiency of the system. In this embodiment of the present application, the instructions stored in the memory include, for example, sending out a sharing broadcast when the user performs the operation of releasing the target file; for the specific execution process, refer to the related description below, which is not repeated here.
In some embodiments, the processor 110 may include one or more interfaces. The interfaces may include an integrated circuit (inter-integrated circuit, I2C) interface, an integrated circuit built-in audio (inter-integrated circuit sound, I2S) interface, a pulse code modulation (pulse code modulation, PCM) interface, a universal asynchronous receiver transmitter (universal asynchronous receiver/transmitter, UART) interface, a mobile industry processor interface (mobile industry processor interface, MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (subscriber identity module, SIM) interface, and/or a universal serial bus (universal serial bus, USB) interface, among others.
The I2C interface is a bi-directional synchronous serial bus comprising a serial data line (serial data line, SDA) and a serial clock line (serial clock line, SCL). In some embodiments, the processor 110 may contain multiple sets of I2C buses. The processor 110 may be coupled to the touch sensor 180K, the charger, the flash, the camera 193, etc., respectively, through different I2C bus interfaces. For example: the processor 110 may be coupled to the touch sensor 180K through an I2C interface, such that the processor 110 communicates with the touch sensor 180K through an I2C bus interface to implement the touch function of the electronic device 100.
The I2S interface may be used for audio communication. In some embodiments, the processor 110 may contain multiple sets of I2S buses. The processor 110 may be coupled to the audio module 170 via an I2S bus to enable communication between the processor 110 and the audio module 170. In some embodiments, the audio module 170 may transmit an audio signal to the wireless communication module 160 through the I2S interface, to implement a function of answering a call through the bluetooth headset.
PCM interfaces may also be used for audio communication to sample, quantize and encode analog signals. In some embodiments, the audio module 170 and the wireless communication module 160 may be coupled through a PCM bus interface. In some embodiments, the audio module 170 may also transmit audio signals to the wireless communication module 160 through the PCM interface to implement a function of answering a call through the bluetooth headset. Both the I2S interface and the PCM interface may be used for audio communication.
The UART interface is a universal serial data bus for asynchronous communications. The bus may be a bi-directional communication bus. It converts the data to be transmitted between serial communication and parallel communication. In some embodiments, a UART interface is typically used to connect the processor 110 with the wireless communication module 160. For example: the processor 110 communicates with a bluetooth module in the wireless communication module 160 through a UART interface to implement a bluetooth function. In some embodiments, the audio module 170 may transmit an audio signal to the wireless communication module 160 through a UART interface, to implement a function of playing music through a bluetooth headset.
The MIPI interface may be used to connect the processor 110 to peripheral devices such as a display 194, a camera 193, and the like. The MIPI interfaces include camera serial interfaces (camera serial interface, CSI), display serial interfaces (display serial interface, DSI), and the like. In some embodiments, processor 110 and camera 193 communicate through a CSI interface to implement the photographing functions of electronic device 100. The processor 110 and the display 194 communicate via a DSI interface to implement the display functionality of the electronic device 100.
The GPIO interface may be configured by software. The GPIO interface may be configured as a control signal or as a data signal. In some embodiments, a GPIO interface may be used to connect the processor 110 with the camera 193, the display 194, the wireless communication module 160, the audio module 170, the sensor module 180, and the like. The GPIO interface may also be configured as an I2C interface, an I2S interface, a UART interface, an MIPI interface, etc.
The USB interface 130 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type-C interface, or the like. The USB interface 130 may be used to connect a charger to charge the electronic device 100, and may also be used to transfer data between the electronic device 100 and a peripheral device. It can also be used to connect an earphone and play audio through the earphone. The interface may also be used to connect other electronic devices, such as AR devices.
It should be understood that the interfacing relationship between the modules illustrated in the embodiments of the present application is only illustrative, and does not limit the structure of the electronic device 100. In other embodiments of the present application, the electronic device 100 may also use different interfacing manners, or a combination of multiple interfacing manners in the foregoing embodiments.
The charge management module 140 is configured to receive a charge input from a charger. The charger can be a wireless charger or a wired charger. In some wired charging embodiments, the charge management module 140 may receive a charging input of a wired charger through the USB interface 130. In some wireless charging embodiments, the charge management module 140 may receive wireless charging input through a wireless charging coil of the electronic device 100. The charging management module 140 may also supply power to the electronic device through the power management module 141 while charging the battery 142.
The power management module 141 is used for connecting the battery 142, and the charge management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140 to power the processor 110, the internal memory 121, the display 194, the camera 193, the wireless communication module 160, and the like. The power management module 141 may also be configured to monitor battery capacity, battery cycle number, battery health (leakage, impedance) and other parameters. In other embodiments, the power management module 141 may also be provided in the processor 110. In other embodiments, the power management module 141 and the charge management module 140 may be disposed in the same device.
The wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device 100 may be used to cover a single or multiple communication bands. Different antennas may also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed into a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution for wireless communication including 2G/3G/4G/5G, etc., applied to the electronic device 100. The mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (low noise amplifier, LNA), etc. The mobile communication module 150 may receive electromagnetic waves from the antenna 1, perform processes such as filtering, amplifying, and the like on the received electromagnetic waves, and transmit the processed electromagnetic waves to the modem processor for demodulation. The mobile communication module 150 can amplify the signal modulated by the modem processor, and convert the signal into electromagnetic waves through the antenna 1 to radiate. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be provided in the same device as at least some of the modules of the processor 110.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating the low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then transmits the demodulated low frequency baseband signal to the baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs sound signals through an audio device (not limited to the speaker 170A, the receiver 170B, etc.), or displays images or video through the display screen 194. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be provided in the same device as the mobile communication module 150 or other functional module, independent of the processor 110.
The wireless communication module 160 may provide solutions for wireless communication including wireless local area network (wireless local area networks, WLAN) (e.g., wireless fidelity (wireless fidelity, wi-Fi) network), bluetooth (BT), global navigation satellite system (global navigation satellite system, GNSS), frequency modulation (frequency modulation, FM), near field wireless communication technology (near field communication, NFC), infrared technology (IR), etc., as applied to the electronic device 100. The wireless communication module 160 may be one or more devices that integrate at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, modulates the electromagnetic wave signals, filters the electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, frequency modulate it, amplify it, and convert it to electromagnetic waves for radiation via the antenna 2.
In some embodiments, antenna 1 and mobile communication module 150 of electronic device 100 are coupled, and antenna 2 and wireless communication module 160 are coupled, such that electronic device 100 may communicate with a network and other devices through wireless communication techniques. The wireless communication techniques may include the Global System for Mobile communications (global system for mobile communications, GSM), general packet radio service (general packet radio service, GPRS), code division multiple access (code division multiple access, CDMA), wideband code division multiple access (wideband code division multiple access, WCDMA), time division code division multiple access (time-division code division multiple access, TD-SCDMA), long term evolution (long term evolution, LTE), BT, GNSS, WLAN, NFC, FM, and/or IR techniques, among others. The GNSS may include a global satellite positioning system (global positioning system, GPS), a global navigation satellite system (global navigation satellite system, GLONASS), a beidou satellite navigation system (beidou navigation satellite system, BDS), a quasi zenith satellite system (quasi-zenith satellite system, QZSS) and/or a satellite based augmentation system (satellite based augmentation systems, SBAS).
The electronic device 100 implements display functions through a GPU, a display screen 194, an application processor, and the like. The GPU is a microprocessor for image processing, and is connected to the display 194 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 110 may include one or more GPUs that execute program instructions to generate or change display information. In the embodiment of the application, when the window for opening the target file is set as the shared window, the conference software renders the corresponding interface on the conference software into the display window of the target file through the GPU.
The display screen 194 is used to display images, videos, and the like. The display 194 includes a display panel. The display panel may employ a liquid crystal display (liquid crystal display, LCD), an organic light-emitting diode (organic light-emitting diode, OLED), an active-matrix organic light-emitting diode (active-matrix organic light-emitting diode, AMOLED), a flexible light-emitting diode (flexible light-emitting diode, FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (quantum dot light emitting diodes, QLED), or the like. In some embodiments, the electronic device 100 may include 1 or N display screens 194, N being a positive integer greater than 1.
The electronic device 100 may implement photographing functions through an ISP, a camera 193, a video codec, a GPU, a display screen 194, an application processor, and the like.
The ISP is used to process data fed back by the camera 193. For example, when photographing, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electric signal, and the camera photosensitive element transmits the electric signal to the ISP for processing and is converted into an image visible to naked eyes. ISP can also optimize the noise, brightness and skin color of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, the ISP may be provided in the camera 193.
The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image onto the photosensitive element. The photosensitive element may be a charge coupled device (charge coupled device, CCD) or a Complementary Metal Oxide Semiconductor (CMOS) phototransistor. The photosensitive element converts the optical signal into an electrical signal, which is then transferred to the ISP to be converted into a digital image signal. The ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into an image signal in a standard RGB, YUV, or the like format. In some embodiments, electronic device 100 may include 1 or N cameras 193, N being a positive integer greater than 1.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to enable expansion of the memory capabilities of the electronic device 100. The external memory card communicates with the processor 110 through an external memory interface 120 to implement data storage functions. For example, files such as music, video, etc. are stored in an external memory card.
The internal memory 121 may be used to store computer executable program code including instructions. The internal memory 121 may include a storage program area and a storage data area. The storage program area may store an application program (such as a sound playing function, an image playing function, etc.) required for at least one function of the operating system, etc. The storage data area may store data created during use of the electronic device 100 (e.g., audio data, phonebook, etc.), and so on. In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (universal flash storage, UFS), and the like. The processor 110 performs various functional applications of the electronic device 100 and data processing by executing instructions stored in the internal memory 121 and/or instructions stored in a memory provided in the processor.
The electronic device 100 may implement audio functions through an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, an application processor, and the like. Such as music playing, recording, etc.
The audio module 170 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or a portion of the functional modules of the audio module 170 may be disposed in the processor 110.
The speaker 170A, also referred to as a "horn," is used to convert audio electrical signals into sound signals. The electronic device 100 may listen to music, or to hands-free conversations, through the speaker 170A.
A receiver 170B, also referred to as a "receiver," is used to convert the audio electrical signal into a sound signal. When electronic device 100 is answering a telephone call or voice message, voice may be received by placing receiver 170B in close proximity to the human ear.
Microphone 170C, also referred to as a "mic" or "mike", is used to convert sound signals into electrical signals. When making a call or transmitting voice information, the user can speak near the microphone 170C, inputting a sound signal into the microphone 170C. The electronic device 100 may be provided with at least one microphone 170C. In other embodiments, the electronic device 100 may be provided with two microphones 170C, which can implement a noise reduction function in addition to collecting sound signals. In other embodiments, the electronic device 100 may also be provided with three, four, or more microphones 170C to implement sound signal collection, noise reduction, sound source identification, directional recording functions, and the like.
The earphone interface 170D is used to connect a wired earphone. The earphone interface 170D may be the USB interface 130, or a 3.5 mm open mobile terminal platform (open mobile terminal platform, OMTP) standard interface, or a Cellular Telecommunications Industry Association of the USA (cellular telecommunications industry association of the USA, CTIA) standard interface.
The pressure sensor 180A is used to sense a pressure signal and may convert the pressure signal into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed on the display screen 194. There are many types of pressure sensor 180A, such as a resistive pressure sensor, an inductive pressure sensor, and a capacitive pressure sensor. A capacitive pressure sensor may include at least two parallel plates made of conductive material. When a force is applied to the pressure sensor 180A, the capacitance between the electrodes changes, and the electronic device 100 determines the strength of the pressure from the change in capacitance. When a touch operation is applied to the display screen 194, the electronic device 100 detects the intensity of the touch operation through the pressure sensor 180A. The electronic device 100 may also calculate the location of the touch based on the detection signal of the pressure sensor 180A. In some embodiments, touch operations that act on the same touch location but with different touch operation intensities may correspond to different operation instructions. For example: when a touch operation whose intensity is smaller than a first pressure threshold acts on the window of the conference software, the conference software executes an instruction to enter a small-window mode; when a touch operation whose intensity is greater than or equal to the first pressure threshold acts on the window of the conference software, the conference software executes a split-screen instruction.
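The pressure-dependent behavior in the last two sentences amounts to comparing the measured intensity against the first pressure threshold; the sketch below assumes a single threshold and invented mode names.

```kotlin
// Sketch of the pressure-dependent dispatch; threshold and names are placeholders.
enum class ConferenceWindowAction { ENTER_SMALL_WINDOW_MODE, SPLIT_SCREEN }

fun actionForTouch(pressure: Float, firstPressureThreshold: Float): ConferenceWindowAction =
    if (pressure < firstPressureThreshold) ConferenceWindowAction.ENTER_SMALL_WINDOW_MODE
    else ConferenceWindowAction.SPLIT_SCREEN
```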
The touch sensor 180K is also referred to as a "touch device". The touch sensor 180K may be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touchscreen, also called a "touch screen". The touch sensor 180K is used to detect a touch operation acting on or near it. The touch sensor may pass the detected touch operation to the application processor to determine the touch event type. Visual output related to the touch operation may be provided through the display screen 194. For example, the user selects and drags the target file on the display screen 194; the touch sensor 180K detects the user operation and passes it to the application processor, the touch event type is determined to be a drag operation, and visual output of the target file moving along the user's drag path is provided through the display screen 194. In other embodiments, the touch sensor 180K may also be disposed on the surface of the electronic device 100, at a location different from that of the display screen 194.
The keys 190 include a power-on key, a volume key, etc. The keys 190 may be mechanical keys. Or may be a touch key. The electronic device 100 may receive key inputs, generating key signal inputs related to user settings and function controls of the electronic device 100.
The motor 191 may generate a vibration cue. The motor 191 may be used for incoming call vibration alerting as well as for touch vibration feedback. For example, touch operations acting on different applications (e.g., photographing, audio playing, etc.) may correspond to different vibration feedback effects. The motor 191 may also correspond to different vibration feedback effects by touching different areas of the display screen 194. Different application scenarios (such as time reminding, receiving information, alarm clock, game, etc.) can also correspond to different vibration feedback effects. The touch vibration feedback effect may also support customization.
The indicator 192 may be an indicator light and may be used to indicate a charging state or a change in battery level, or to indicate messages, missed calls, notifications, and the like.
The SIM card interface 195 is used to connect a SIM card. A SIM card may be inserted into or removed from the SIM card interface 195 to make contact with or separate from the electronic device 100. The electronic device 100 may support 1 or N SIM card interfaces, N being a positive integer greater than 1. The SIM card interface 195 may support Nano SIM cards, Micro SIM cards, and the like. Multiple cards may be inserted into the same SIM card interface 195 simultaneously; the cards may be of the same type or of different types. The SIM card interface 195 may also be compatible with different types of SIM cards, as well as with external memory cards. The electronic device 100 interacts with the network through the SIM card to implement functions such as calls and data communication. In some embodiments, the electronic device 100 employs an eSIM, i.e., an embedded SIM card; the eSIM card can be embedded in the electronic device 100 and cannot be separated from the electronic device 100.
Based on the scenario shown in fig. 2 and the electronic device structure shown in fig. 3, the specific implementation of the method for sharing files provided in the present application on a single electronic device is first described in conjunction with Embodiment 1.
Embodiment 1
FIG. 4 is a flow chart illustrating an implementation of a method for sharing files according to an embodiment of the present application.
It should be understood that the execution body of each step in the flowchart shown in fig. 4 may be the electronic device 100 having the structure shown in fig. 3, for example a computer or a mobile phone, which is not limited herein. To simplify the description, the execution body of each step is not repeated when the steps of the flow shown in fig. 4 are described.
It will be appreciated that the target file in this embodiment represents the object that the user intends to share and on which the user can perform operations.
In one example, the target file may be the content itself that the user expects to share, e.g., a document or a picture on the desktop or within an opened folder, or an already opened window.
In another example, the target file may be an identifier pointing to the content the user intends to share, such as a shortcut or an application icon, through which that content can be located.
It will be appreciated that the drop point position in this embodiment refers to the coordinate position of the target file on the mobile phone screen when the user's finger leaves the screen.
In one example, the drop point position may be the release position of the user's operation: if the operating system of the mobile phone does not support inertial movement, the release position of the user's operation is the drop point position. For example, when a user drags a picture across the mobile phone screen with a finger, the coordinate position of the target file when the finger leaves the screen is the drop point position of the target file.
In another example, the drop point position may be a coordinate position calculated from the release position of the user's operation: if the mobile phone's operating system supports inertial movement, the target file continues to move forward some distance along the direction in which the user's finger was moving, so the drop point position lies some distance beyond the release position. The operating system calculates the drop point position of the target file from the user's release position and its inertial-movement algorithm. For example, suppose the user drags a picture with a finger toward the window range of the conference software on the display screen of a mobile phone whose operating system supports inertial movement, and the finger leaves the screen outside the window range of the conference software; the drop point position of the picture is then calculated from the operating system's inertial-movement algorithm, the coordinate position at which the finger left the screen, and the direction in which the picture was being dragged.
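As a minimal illustration of such an inertial calculation, the following Java sketch estimates a drop point from the release position and the drag velocity under a constant-deceleration model; the model and the friction constant are assumptions made here for illustration, not the operating system's actual algorithm.

    public final class DropPointEstimator {
        private static final float FRICTION = 0.002f; // assumed deceleration, px/ms^2

        /** Returns {x, y} where a dragged item comes to rest after release. */
        public static float[] estimate(float releaseX, float releaseY,
                                       float velocityX, float velocityY) {
            float speed = (float) Math.hypot(velocityX, velocityY); // px/ms at release
            if (speed == 0f) {
                return new float[] {releaseX, releaseY}; // no inertia: drop point = release point
            }
            // Distance travelled under constant deceleration: v^2 / (2a).
            float distance = (speed * speed) / (2 * FRICTION);
            // The item continues along the direction of motion at release.
            return new float[] {
                releaseX + distance * (velocityX / speed),
                releaseY + distance * (velocityY / speed)
            };
        }
    }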
Specifically, as shown in fig. 4, the method for sharing files includes the following steps:
S401: Detect a user operation of dragging the target file, and calculate drop point position information of the target file.
Illustratively, when a user wants to share a file, the user drags the target file into the range of the conference window; the electronic device detects the drag operation in real time and, when the user releases the target file, records its drop point position information. For example, a user drags a picture in a gallery application; the mobile phone detects the coordinate position of the dragged picture in real time, and when the user's finger leaves the display screen, the mobile phone calculates the drop point position information of the picture from the coordinates at which the finger left the display screen.
Fig. 5 is an interface schematic diagram illustrating a user performing a sharing operation on a target file on a mobile phone according to an embodiment of the present application.
As shown in fig. 5, in some embodiments, a user shares a picture 513 in gallery application 512 on mobile phone 510 through conference software 511; at this time, gallery application 512 and conference software 511 are displayed on the interface of mobile phone 510 in a split-screen manner. The user drags the picture 513 into the range of the conference software 511 via operation 514 and releases it.
In other embodiments, the user shares the application window 523 of gallery application 522 on mobile phone 520 through conference software 521; at this time, gallery application 522 and conference software 521 are displayed on the interface of mobile phone 520 in a split-screen manner. The user drags the application window 523 into the range of the conference software 521 via operation 524 and releases it.
In other embodiments, the user shares a picture 533 in the gallery application 532 on mobile phone 530 through conference software 531; here the window of the conference software 531 is displayed over the window of the gallery application 532 as a small window. The user drags the picture 533 into the range of the conference software 531 via operation 534 and releases it.
In some embodiments, the target file is a window thumbnail/card on the task management interface, and the user drags the thumbnail/card on the task management interface to share the window. The task management interface displays thumbnails/cards of multiple application windows; dragging the thumbnail/card of one application window onto the thumbnail/card of the conference software sets that application window as the window to be shared.
For example, fig. 6 is an interface schematic diagram of a user performing a sharing operation on a target file at a task management interface of a mobile phone according to an embodiment of the present application.
As shown in fig. 6, the task management interface 601 of the mobile phone 600 may display a conference window 602 of the conference software running on the mobile phone 600 and a gallery window 603. The user selects the gallery window 603 through operation 604, and the mobile phone 600, in response to operation 604, displays the gallery window 603 as an icon 605. The user then drags the icon 605 of the gallery window 603 into the area of the conference window 602 through operation 606; in response to operation 606, the conference software displays a "drag here to share" prompt on the conference window 602. When the user releases the icon 605 of the gallery window 603 within the range of the conference window 602, the mobile phone 600 can calculate the drop point position from the coordinates at which the user released the icon 605.
S402: Judge, based on the drop point position information of the target file released by the user, whether the user operation is an operation indicating file sharing.
Illustratively, whether the user operation is an operation indicating file sharing is judged by whether the drop point position information falls within the window range of the conference software.
If the judgment result is yes, for example when the drop point position information indicates that the drop point is located within the conference window currently created by the conference software, the user operation is an operation indicating file sharing. In this case, the electronic device may continue to execute step S403 described below, opening the target file and triggering sharing.
If the judgment result is no, for example when the drop point position information indicates that the drop point does not fall within the conference window currently created by the conference software, the user operation is not an operation indicating file sharing, and the target file does not need to be shared. In this case, the electronic device may end the file-sharing flow.
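As a minimal Java sketch of this S402 judgment, assuming the conference window's on-screen bounds are available as an android.graphics.Rect (how they are obtained is left open here):

    import android.graphics.Rect;

    public final class DropJudge {
        /** Returns true if the drop point falls within the conference window. */
        public static boolean isShareOperation(Rect conferenceWindowBounds,
                                               int dropX, int dropY) {
            // Rect.contains() is inclusive of left/top and exclusive of right/bottom.
            return conferenceWindowBounds.contains(dropX, dropY);
        }
    }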
S403: Open the target file and trigger sharing based on the attributes of the target file.
By way of example, the attributes of the target file may include at least one of the type of the target file, its storage location, and an application identifier of the target application. It will be appreciated that the conference software reads the attributes of the target file directly from the operating system of the electronic device after the target file is dragged for sharing. For example, when the target file is a document, the electronic device can read attributes such as the document's type and storage path directly from the operating system. When the target file is a window, the operating system of the electronic device determines attributes such as the application identifier of the application to which the window belongs. When the target file is a shortcut, the operating system determines that the type of the target file is a shortcut and reads attributes such as the storage path of the file the shortcut points to. When the target file is an application icon, the application icon can be regarded as a shortcut, and its attributes are essentially the same as those of a shortcut.
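For illustration, these attributes can be pictured as a small value object in Java; the field names and the Kind values below are assumptions for this sketch, not an API defined by the present application.

    public final class TargetFileAttributes {
        public enum Kind { FILE, WINDOW, SHORTCUT, APP_ICON }

        public final Kind kind;
        public final String fileType;      // e.g. "jpg" or "xlsx"; null for an opened window
        public final String storagePath;   // e.g. a path under internal storage; null for an opened window
        public final String appIdentifier; // e.g. an installation package name; null for an unopened file

        public TargetFileAttributes(Kind kind, String fileType,
                                    String storagePath, String appIdentifier) {
            this.kind = kind;
            this.fileType = fileType;
            this.storagePath = storagePath;
            this.appIdentifier = appIdentifier;
        }
    }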
Illustratively, when the conference software determines that the attributes of the target file include a storage path, the target file is a file that has not yet been opened. The electronic device first finds the target file in storage according to the storage path, and then requests the operating system to open/view the content of the target file, i.e., to display the content of the target file on the screen in the form of a window. The operating system determines, from the type of the target file, a target application that supports opening the content of the target file, and feeds the application identifier of the target application back to the conference software. If several applications in the operating system can open the content of the target file, there may be several candidate target applications. Using the application identifier, the conference software calls the target application according to the inter-application protocol to open the target file, so that the target file is displayed on the screen in the form of a window, and then sets the newly opened window as the shared window. If several applications all support the conference software's protocol, the user may select the target application for opening the target file, or the application the user most recently used to open the target file may serve as the target application by default.
If the conference software determines that the attributes of the target file include an application identifier, the target file is an already opened window; the conference software directly calls up the window of the target file through the application identifier and then sets it as the shared window.
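The two branches just described can be sketched as the following dispatch over the attributes above; the helper methods are hypothetical stand-ins for the operating-system query and the inter-application-protocol calls, not real system APIs.

    abstract class ShareDispatcher {
        void openAndShare(TargetFileAttributes attrs) {
            if (attrs.appIdentifier != null) {
                // Already-opened window: call it up by application identifier and share it.
                shareWindow(callUpWindow(attrs.appIdentifier));
            } else {
                // Unopened file: ask the OS which application supports this file type,
                // have it open the file as a window, then set that window as shared.
                String opener = resolveOpenerForType(attrs.fileType);
                shareWindow(openWithApp(opener, attrs.storagePath));
            }
        }

        abstract Object callUpWindow(String appIdentifier);          // hypothetical
        abstract String resolveOpenerForType(String fileType);       // hypothetical OS query
        abstract Object openWithApp(String app, String storagePath); // hypothetical protocol call
        abstract void shareWindow(Object window);                    // hypothetical share setter
    }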
FIG. 7 illustrates an interface diagram for setting a target file as a shared window according to an embodiment of the present application.
As shown in fig. 7, in interface 700a of the mobile phone 700, a conference window 701 of the conference software and a gallery window 702 of the gallery are displayed in a split-screen manner, and the user drags a photo 703 in the gallery window 702 to share it. The photo 703 has not yet been opened, and the storage path indicated by its attributes is "internal storage/DCIM/Camera/img_20221115_123456.jpg". The conference software first finds the photo 703 dragged by the user according to the storage path, and then requests the operating system of the mobile phone 700 to open and view the photo 703. The operating system determines from the picture format of the photo 703 that the opening application is the gallery, and feeds the gallery's application identifier back to the conference software. Using the application identifier, the conference software calls up the gallery according to the inter-application protocol to open the photo 703, and sets the opened window 704 as the shared window.
As shown in fig. 7, in interface 700b of the mobile phone 700, after the shared window 704 is set, the conference window 701 and the gallery window 702 exit windowed display, and the shared window 704 is displayed full screen and shared. The shared window 704 also provides an operation control 705, through which the user can perform operations such as annotating, stopping sharing, sharing audio, and switching the shared content.
To prevent other participants from viewing private content when the user switches away from the shared window 704, a gray screen may be shared, or the shared window 704 may continue to be displayed. It will be appreciated that when the user exits the current target application (e.g., the gallery window 702 in fig. 7), the conference software is called up and the conference window 701 is displayed in a small-window/split-screen manner. At this time, the originally shared photo 703 is still displayed in the small-window/split-screen window; alternatively, a gray screen can be displayed directly, so that the other participants cannot see content outside the small-window/split-screen window, which protects the user's privacy.
S404: Determine the sharing mode of the target file based on the drop point position information of the target file, and display the shared window.
Illustratively, the drop point position at which the user releases the target file determines the sharing mode: different areas of the conference software correspond to different sharing modes.
For example, in some examples, as long as the drop point position is within the conference window, the opened window of the target file is set as the latest shared window. If the conference software currently has no shared window, the opened window of the target file is directly set as the shared window.
If the conference software is currently sharing a window and can share only one window at a time, the opened window of the target file preempts the rights of the original shared window.
If the conference software is currently sharing windows and supports sharing several windows at the same time, the conference software places the original shared window in the background and takes the opened window of the target file as the current foreground shared window.
In some examples, when the conference software has multiple shared windows, a switch control may be provided to switch between the foreground shared window and the background shared windows. The foreground shared window is the window currently displayed on the conference software interface; a background shared window is temporarily hidden and not displayed on the conference software interface.
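A minimal sketch of this foreground/background bookkeeping, assuming windows are tracked by an identifier string; a real conference client would additionally drive the capture and transmission pipeline.

    import java.util.ArrayDeque;
    import java.util.Deque;

    public final class SharedWindowStack {
        private final Deque<String> background = new ArrayDeque<>();
        private String foreground;

        /** A newly dropped file's window becomes the foreground share (S404). */
        public void shareNew(String windowId) {
            if (foreground != null) {
                background.push(foreground); // the previous share is hidden, not stopped
            }
            foreground = windowId;
        }

        /** A switch control brings a background share back to the foreground. */
        public void switchTo(String windowId) {
            if (background.remove(windowId)) {
                if (foreground != null) {
                    background.push(foreground); // demote the current foreground share
                }
                foreground = windowId;
            }
        }

        public String foreground() { return foreground; }
    }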
In some examples, in response to a drag operation on the target file, the conference software displays at least one area in its main window, each area corresponding to one sharing mode, and determines the sharing mode based on the area in which the drop point position falls. For example, the main window of the conference software may be divided into two areas: a split-screen sharing area and a replacement sharing area. When the conference software detects a drag operation on the target file, corresponding prompt information is displayed in each of the two areas, indicating the sharing mode of that area. Here, the main window is the area of the conference software used to display the shared window.
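For illustration, mapping a drop point to a sharing mode might look like the following sketch; splitting the main window into left and right halves is an assumption made here, since the embodiment only requires that each area correspond to one mode.

    import android.graphics.Rect;

    public final class ShareModeResolver {
        public enum ShareMode { SPLIT_SCREEN, REPLACE, NONE }

        public static ShareMode modeForDrop(Rect mainWindow, int dropX, int dropY) {
            if (!mainWindow.contains(dropX, dropY)) {
                return ShareMode.NONE; // outside the main window: no sharing triggered
            }
            int midX = mainWindow.left + mainWindow.width() / 2;
            return dropX < midX ? ShareMode.SPLIT_SCREEN : ShareMode.REPLACE;
        }
    }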
In this kind of sharing, the drop point position is located in the main window of the conference software, and the shared window is shared with all participants.
In some examples, the window of the conference software further includes participant windows, each displaying a participant's avatar or video picture. When the drop point position falls within a participant window, the target file is shared only with that participant, and the shared windows of the other participants display a gray screen or the most recently shared content.
When the user expects to share the target file with several participants, the sliding trajectory of the target file can be used to determine the shared objects: when the sliding trajectory passes through a participant's window, that participant is determined to be a shared object, and after the drag operation the window of the target file is shared with the shared objects.
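A sketch of this trajectory-based selection in Java, assuming the participant windows' bounds are known and the drag path is sampled into points; the data representation is illustrative.

    import android.graphics.Rect;
    import java.util.LinkedHashSet;
    import java.util.List;
    import java.util.Map;
    import java.util.Set;

    public final class TrajectorySelector {
        /** Every participant window the sampled drag path passes through is selected. */
        public static Set<String> shareTargets(Map<String, Rect> participantWindows,
                                               List<int[]> trajectory /* sampled {x, y} */) {
            Set<String> selected = new LinkedHashSet<>();
            for (int[] p : trajectory) {
                for (Map.Entry<String, Rect> e : participantWindows.entrySet()) {
                    if (e.getValue().contains(p[0], p[1])) {
                        selected.add(e.getKey()); // the trajectory passed through this window
                    }
                }
            }
            return selected;
        }
    }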
It can be understood that when several participants at one conference site access the conference through a single conference terminal, selecting the participant window corresponding to that conference terminal shares the target file with that specific conference site.
With respect to step S404 above, the different sharing modes of the target file are described below with reference to the corresponding drawings.
When a user drags a file to share it, different sharing modes can be achieved by releasing the file in different areas of the conference software window. In some examples, the user is in a video conference using conference software on an electronic device, file content has already been shared by another participant or by the user, and the user wants to share another file. The user can then drag the file to different areas of the conference software window, and the conference software, in response to the user's operation, performs replacement sharing or split-screen sharing.
Specifically, fig. 8 is a schematic diagram showing a user sharing a target file on a computer through different sharing modes according to an embodiment of the present application.
As shown in fig. 8, a conference window 801 of the conference software may be displayed on the computer 800, and the conference window 801 is already sharing a second file 802. When the user drags the target file 803 to the conference window 801 to share it, the conference window 801 displays a split-screen sharing area 804 and a replacement sharing area 805. The user may then move the target file 803 into the split-screen sharing area 804 or the replacement sharing area 805 to share it.
When the user moves the target file 803 into the split-screen sharing area 804, the conference window 801 shares the target file 803 and the existing second file 802 in a split-screen manner.
When the user moves the target file 803 into the replacement sharing area 805, the sharing operation applies to the conference software to preempt the sharing rights; once the application succeeds, sharing of the existing second file 802 is canceled in the conference window 801 and only the currently dragged target file 803 is shared. It will be appreciated that, when an application to preempt the sharing rights is made, the conference software may authenticate the user's ID and decide whether to perform the replacement sharing based on the authentication result.
As shown in fig. 8, when the user drags the target file 803 into the conference window 801, the split-screen sharing area 804 in the conference window 801 displays the prompt "split-screen sharing", and the replacement sharing area 805 displays the prompt "replacement sharing".
It can be appreciated that in other embodiments the sharing modes are not limited to split-screen sharing and replacement sharing, nor is the conference window limited to displaying areas for split-screen sharing and replacement sharing; any scheme in which the user drags the target file to the conference window through a move operation and the conference software calls the corresponding application to open the target file and sets up sharing is within the scope of the present application.
In some examples, when sharing files on a computer using conference software, a user may want to share a file with only one participant or with some of the participants. The user can drag the file onto the avatar of the designated participant in the conference software window, and the conference software, in response to the user's operation, shares the file with that participant.
Specifically, fig. 9a and 9b show schematic diagrams of the operation procedure of grouped sharing according to an embodiment of the present application.
As shown in fig. 9a, on computer 910, when a user shares a file through the conference software, the conference window 911 of the conference software may display a participant avatar interface 916 and a full-member sharing interface 914. The user moves the target file 913 into the area of the full-member sharing interface 914 through drag operation 912, thereby sharing the target file 913 with all participants. In other examples, when the user drags the target file 913 into the area of the participant avatar interface 916 through operation 915, the target file 913 is shared with the participant corresponding to that avatar. For example, if the user drags the target file 913 onto the avatar of participant 4, the target file 913 is shared with participant 4. When the terminal of participant 4 is a conference-site terminal, the target file 913 is shared only with that conference site.
As shown in fig. 9b, on computer 920, the conference window 921 of the conference software may display a participant avatar interface 924 and a full-member sharing interface 925. When the user wants to share the target file 922 with several participants, the user may, for example, drag the target file 922 over the participant avatar interface 924 through operation 923 and hover over the window of a participant the user wants to share with; the conference software then selects that participant. After several participants have been selected in turn, the file is shared with all of them. When the participant windows to be shared with are contiguous, a quick selection can be achieved, for example, by hovering and then sliding. For example, in fig. 9b, the user moves the target file 922 to hover first over the window of participant 2; the conference software selects the window of participant 2 and adds a selected identification, such as the shaded background on participant 2 in fig. 9b. The drag trajectory then slides in turn through the windows of participants 3 and 4, and when the file is released on the window of participant 4, the dragged target file 922 is visible only to participants 2, 3, and 4, and not to participants 1, 5, and 6.
In some examples, when a user shares files on a computer using conference software and several files need to be shared, switch controls may be generated on the window of the conference software. The user touches different switch controls, and the conference software, in response to the user's operation, displays the sharing interface corresponding to each switch control.
Specifically, fig. 10 shows a schematic diagram of a user switching between different shared windows according to an embodiment of the present application.
As shown in fig. 10, on computer 1000, a first window 1002 and controls for indicating window switching may be displayed on the conference window 1001 of the conference software, such as a first-window control 1003, a second-window control 1004, and a stop-sharing control 1005. The user can quickly switch between the first window 1002 and the second window by clicking the control of the corresponding window. For example, when an operation of the user clicking the first-window control 1003 is detected, the conference window 1001, in response to the user's operation, sets the second window as the background shared window and the first window 1002 as the foreground shared window, and the first window 1002 is displayed on the conference window 1001. Thus, when the user wants to share the earlier first window 1002 again, there is no need to repeat a sharing procedure such as dragging; the user can simply click the first-window control 1003 to switch.
It will be appreciated that in other embodiments there may be one or more shared windows; likewise, the number of window controls corresponds to the number of shared windows, which is not limited herein. The operation of switching the shared window can also be implemented on the mobile phone 700 shown in fig. 7: the operation control 705 includes a control for switching the shared content, and the user clicks that control to switch between different target files.
FIG. 11 is a diagram illustrating an interface for a user to complete sharing of a target file by performing a mobile operation on a computer according to an embodiment of the present application.
As shown in FIG. 11, on computer 1100, the user expects to share the "third-quarter summary report.ppt" file 1102 on the desktop 1101. The user does not need to open the file 1102 in advance or configure sharing through multiple steps in the conference software; the user only needs to select the file 1102, press and hold the left mouse button, and drag the file 1102 through operation 1104 to the conference window 1103 of the conference software. During the movement, the computer 1100 acquires the coordinates of the cursor on the desktop 1101 in real time; the position of the cursor is the position of the file 1102. When the file 1102 enters the conference window 1103, a "drag here to share" prompt is displayed in the conference window 1103. After detecting the photoelectric signal indicating that the mouse button is no longer pressed, the computer 1100 records the position of the cursor at the release of operation 1104. If inertial movement is supported, the final drop point position is calculated according to the inertial-motion model from the cursor position at the release of operation 1104, and the drop point position is recorded. The computer 1100 judges whether the drop point position is within the conference window 1103, thereby determining whether the conference software responds to operation 1104 by calling, according to the attributes of the file 1102, the corresponding application to open the file 1102 and sharing the opened window. For example, in this embodiment, the conference software calls the PowerPoint application according to the inter-application protocol to open "third-quarter summary report.ppt".
In some examples, the target file is, for example, a shortcut 1105 to a second file, a window 1106, or an icon 1107 corresponding to the window 1106. After determining that the attribute of the target file (for example, the shortcut 1105 to the second file) is a shortcut, the computer 1100 further obtains the storage location and file type of the second file it points to from the information recorded in the shortcut; on a computer, for example, this information can be read directly from the shortcut's properties. If the file the shortcut points to is "D:\XXXX conference report\second file.xlsx", then the storage path of the target file is "D:\XXXX conference report\second file.xlsx" and the file type is xlsx; the computer 1100 calls up the corresponding spreadsheet application to open the second file, displays it on the screen in the form of a window, and then sets the opened window of the second file as the shared window. Upon determining that the attribute of the target file (for example, the window 1106) is a window, the computer 1100 calls up the window 1106 according to its application identifier and shares it. Upon determining that the attribute of the target file (for example, the icon 1107 of the window 1106) is an application icon, the computer 1100 obtains the storage location pointing to the window 1106 from the information recorded in the icon 1107; since the window 1106 is already open, the window 1106 can be set directly as the shared window.
Fig. 12 shows a software architecture block diagram of an electronic device according to an embodiment of the present application.
The operating system of the electronic device 100 may employ a layered architecture, an event-driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture. In this embodiment, the software structure of the electronic device 100 is described taking an Android system with a layered architecture as an example.
The layered architecture divides the software into several layers, each with a clear role and division of labor. The layers communicate with each other through software interfaces. In some embodiments, the Android system is divided into four layers: from top to bottom, the application layer, the application framework layer, the Android runtime (Android Runtime) and system libraries, and the kernel layer.
As shown in fig. 12, the application layer may include a series of application packages. The application packages may include applications such as conference, gallery, calendar, calls, maps, navigation, WLAN, Bluetooth, music, video, and short messages.
The application framework layer provides an application programming interface (application programming interface, API) and programming framework for application programs of the application layer. The application framework layer includes a number of predefined functions.
As shown in fig. 12, the application framework layer may include a window manager, a content provider, a view system, a phone manager, a resource manager, a notification manager, and the like.
The window manager is used for managing window programs. The window manager can acquire the size of the display screen, judge whether a status bar exists, lock the screen, intercept the screen and the like.
The content provider is used to store and retrieve data and make such data accessible to applications. The data may include video, images, audio, calls made and received, browsing history and bookmarks, phonebooks, etc.
The view system includes visual controls, such as controls to display text, controls to display pictures, and the like. The view system may be used to build applications. The display interface may be composed of one or more views. For example, a display interface including a text message notification icon may include a view displaying text and a view displaying a picture.
The telephony manager is used to provide the communication functions of the electronic device 100, for example the management of call status (including connected, hung up, etc.).
The resource manager provides various resources for the application program, such as localization strings, icons, pictures, layout files, video files, and the like.
The notification manager allows an application to display notification information in the status bar. It can be used to convey notification-type messages, which can automatically disappear after a short stay without user interaction. For example, the notification manager is used to give notice of download completion, message alerts, etc. The notification manager may also present notifications in the form of a chart or scroll-bar text in the status bar at the top of the system, such as notifications from applications running in the background, or notifications in the form of a dialog window on the screen. For example, text information is prompted in the status bar, a prompt tone is emitted, the electronic device vibrates, or an indicator light blinks.
The Android runtime includes a core library and a virtual machine, and is responsible for scheduling and management of the Android system.
The core library consists of two parts: one part is the functions that the Java language needs to call, and the other part is the core library of Android.
The application layer and the application framework layer run in the virtual machine. The virtual machine executes the Java files of the application layer and the application framework layer as binary files, and performs functions such as object lifecycle management, stack management, thread management, security and exception management, and garbage collection.
The system library may include a plurality of functional modules, for example: a surface manager (surface manager), media libraries (Media Libraries), three-dimensional graphics processing libraries (e.g., OpenGL ES), and 2D graphics engines (e.g., SGL).
The surface manager is used to manage the display subsystem and provides a fusion of 2D and 3D layers for multiple applications.
The media libraries support playback and recording of a variety of commonly used audio and video formats, as well as still image files, etc. The media libraries can support a variety of audio and video encoding formats, such as MPEG4, H.264, MP3, AAC, AMR, JPG, and PNG.
The three-dimensional graphic processing library is used for realizing three-dimensional graphic drawing, image rendering, synthesis, layer processing and the like.
The 2D graphics engine is a drawing engine for 2D drawing.
The kernel layer is a layer between hardware and software. The kernel layer contains at least a display driver, a camera driver, a touch driver, an audio driver, and sensor drivers.
The workflow of the electronic device 100 software and hardware is illustrated below in connection with a scenario in which a user drags a file.
When the touch sensor 180K receives a touch operation, a corresponding hardware interrupt is issued to the kernel layer. The kernel layer processes the touch operation into a raw input event (including the touch coordinates, a timestamp of the touch operation, etc.), and the raw input event is stored at the kernel layer. The application framework layer obtains the raw input event from the kernel layer and identifies the control corresponding to the input event. Taking as an example a touch operation that is a drag operation whose corresponding control is a picture on the gallery application interface: the gallery application calls the interface of the application framework layer to start the view system, which in turn calls the kernel layer to start the display driver, and visual output of the picture moving along the user's drag path is provided through the display screen 194.
Based on the software architecture block diagram shown in fig. 12, fig. 13 shows an interaction flow diagram of a file-sharing process according to an embodiment of the present application. The execution body of each step in the flowchart shown in fig. 13 may be the electronic device 100 having the structure shown in fig. 3, for example a computer or a mobile phone, which is not limited herein. To simplify the description, the execution body of each step is not repeated when the steps of the interaction flow shown in fig. 13 are described.
Specifically, as shown in fig. 13, the interaction flow includes the following steps:
S1301: The target application 1310 and/or the conference software 1320 detects an operation by which the user instructs exiting full-screen display, triggering windowed display of the interface.
When the user uses the electronic device to share a file, the user touches the interface of the conference software 1320 for a preset time or slides on it, and the conference software 1320 enters windowed display in response to the user operation. The target application 1310 is the application corresponding to the target file that the user wants to share; the way the target application 1310 enters windowed display is similar to that of the conference software 1320. At least one of the interfaces of the target application 1310 and the conference software 1320 enters windowed display.
Illustratively, the user runs the conference software 1320 and a gallery application on the mobile phone and wants to share a picture in the gallery application through the conference software 1320; the gallery application is then the target application 1310, and the picture in the gallery application is the target file. The user may trigger windowed display of the conference software 1320 interface by, for example, touching the interface of the conference software 1320 with a finger for 1 s, upon which the conference software 1320 enters small-window mode in response to the trigger operation. In other embodiments, the user may also cause the conference software 1320 to enter small-window mode in other ways, which are not limited herein.
S1302: The target application 1310 detects, in real time, the user's operation of moving the target file.
After the target application 1310 and/or the conference software 1320 are displayed in windows, the user can drag the target file. The target application 1310 detects in real time the coordinate position to which the user drags the target file, and can still detect the target file even when the user moves it outside the window of the target application 1310.
S1303, the target application 1310 issues a shared broadcast in response to a release operation of the user.
After detecting the user's operation of releasing the target file, the target application 1310, in response to that operation, sends to all applications a shared broadcast that includes at least one of receiving-rights information, the attributes of the target file, and the drop point position information.
The receiving-rights information indicates which applications may receive the shared broadcast, for example via an application tag/type; only a tag/type field needs to be added to the shared broadcast. For example, in this embodiment, the conference software 1320 may receive the shared broadcast.
The attributes of the target file include at least one of the type of the target file, its storage location, and the application identifier of the target application 1310. The application identifier of the target application 1310 uniquely points to the dragged target application 1310 and serves the scenario of sharing a dragged window. For example, the installation package name of the target application 1310 is provided so that the target application 1310 can be launched by its package name, thereby calling up the window that the target application 1310 has already opened.
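On Android, such a shared broadcast could be realized with an Intent broadcast, as in the sketch below (reusing the illustrative TargetFileAttributes from the sketch in step S403); the action string and extra keys are made-up names for illustration, not identifiers defined by the present application.

    import android.content.Context;
    import android.content.Intent;

    public final class ShareBroadcaster {
        /** Sent by the target application in S1303 after the user releases the drag. */
        public static void sendShareBroadcast(Context ctx, TargetFileAttributes attrs,
                                              int dropX, int dropY) {
            Intent intent = new Intent("com.example.action.DRAG_SHARE"); // assumed action
            intent.putExtra("receiver_type", "conference");  // receiving-rights info: who may respond
            intent.putExtra("storage_path", attrs.storagePath);
            intent.putExtra("app_identifier", attrs.appIdentifier);
            intent.putExtra("drop_x", dropX);
            intent.putExtra("drop_y", dropY);
            ctx.sendBroadcast(intent);
        }
    }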
S1304: The conference software 1320 receives the shared broadcast and determines whether to respond to it.
The target application 1310 sends the shared broadcast to all applications running on the electronic device, but not every application can receive it; the receiving-rights information indicates that the conference software 1320 can receive the shared broadcast. After receiving the shared broadcast, the conference software 1320 determines whether to respond to it according to the drop point position carried in the broadcast.
When the drop point position of the target file is within the window range of the conference software 1320, the conference software 1320 responds to the shared broadcast and determines the target file to be the content that the user wants to share.
When the drop point location of the target file is not within the window of the conference software 1320, the conference software 1320 does not respond to the shared broadcast.
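The receiving side of S1304 could then be sketched as a dynamically registered BroadcastReceiver; the extra keys match the assumed broadcast above, and the window bounds are kept up to date elsewhere.

    import android.content.BroadcastReceiver;
    import android.content.Context;
    import android.content.Intent;
    import android.graphics.Rect;

    public final class ShareBroadcastReceiver extends BroadcastReceiver {
        private final Rect conferenceWindowBounds; // updated as the window moves

        public ShareBroadcastReceiver(Rect bounds) {
            this.conferenceWindowBounds = bounds;
        }

        @Override
        public void onReceive(Context ctx, Intent intent) {
            int dropX = intent.getIntExtra("drop_x", -1);
            int dropY = intent.getIntExtra("drop_y", -1);
            if (conferenceWindowBounds.contains(dropX, dropY)) {
                // Drop landed in the conference window: parse the attributes and share (S1305).
                String storagePath = intent.getStringExtra("storage_path");
                String appIdentifier = intent.getStringExtra("app_identifier");
                // ... open the target file and set its window as the shared window (see S403)
            }
            // Otherwise the broadcast is ignored, as in S1304.
        }
    }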
S1305, the conference software 1320 parses the sharing broadcast, and performs a corresponding sharing operation.
After the conference software 1320 determines to respond to the shared broadcast, it further parses the attributes of the target file from the broadcast. Based on those attributes, the conference software 1320 calls the corresponding application according to the inter-application protocol to open the target file, and sets the opened window of the target file as the shared window; for the operations of opening the file and setting the shared window, refer to step S403.
S1306, the conference software 1320 issues a notification broadcast of successful sharing.
After the conference software 1320 completes the sharing operation of the target file, a notification broadcast of the sharing success is sent to the target application 1310.
S1307, the target application 1310 and the conference software 1320 exit the windowed display.
After the conference software 1320 completes sharing, it unregisters the broadcast listener. Both the target application 1310 and the conference software 1320 exit windowed display, and the shared window is displayed full screen and shared.
The specific implementation of the method for sharing files provided in the present application in a cross-device scenario is described below in conjunction with another embodiment, Embodiment 2.
Embodiment 2
In some scenarios, a user uses one electronic device for a video conference, but the file is stored on another electronic device. A typical scenario is a user using an in-vehicle conference terminal in a car, while the files, rather than having been transferred to the car machine, are stored on the mobile phone. In this embodiment, the user conducts the video conference through the car machine, and the target file to be shared is stored on the mobile phone.
Specifically, fig. 14 shows a schematic diagram of a multi-device collaboration interface, according to an embodiment of the present application.
As shown in fig. 14, the user enters a multi-device collaboration interface 1401 on the mobile phone 1400, and application windows running on devices logged in to the same cloud account can be displayed on the multi-device collaboration interface 1401, for example a settings window 1413, gallery window 1423, document window 1433, and form window 1443 running on the mobile phone 1400, and a music window 1453, conference window 1463, and video window 1473 running on the car machine. The user therefore needs to first open the target file to be shared on the mobile phone 1400, so that the application window corresponding to the target file is displayed on the multi-device collaboration interface 1401. For example, if the content the user wants to share is a picture, the gallery application needs to run on the mobile phone 1400 to open the picture; after the user enters the multi-device collaboration interface 1401 on the mobile phone 1400, the gallery window 1423 with the opened picture can be displayed on the multi-device collaboration interface 1401.
At the top of the multi-device collaboration interface 1401 of the mobile phone 1400, controls 1402 for all devices logged in to the current user account are displayed, such as "mobile phone", "car machine", and "computer". Upon entering the multi-device collaboration interface 1401, the application windows running on the mobile phone 1400 are displayed by default. The user may click the control 1402 representing a different device, via operation 1404, to view the application windows running on that device. For example, after the user clicks "car machine", the multi-device collaboration interface 1401 displays the application windows running on the car machine.
For example, the user may drag the gallery window 1423 on the multi-device collaboration interface 1401 onto the control 1402 corresponding to "car machine"; the multi-device collaboration interface 1401 follows the user's drag operation and displays the application windows corresponding to the car machine. The user then drags the picture into the conference window 1463 to complete the cross-device sharing operation.
Specifically, fig. 15 shows an interaction flow diagram for sharing files across devices according to an embodiment of the present application.
As shown in fig. 15, the interaction flow includes the following steps:
S1501: A communication connection is established among the mobile phone 1510, the cloud 1520, and the collaborative device 1530.
In the multi-device interaction process, both the mobile phone 1510 and the collaborative device 1530 need to establish communication connections with the cloud 1520, so that the cloud 1520 can obtain process information from the devices logged in to the same cloud account and determine, from that process information, the applications running on the collaborative device 1530 and their application interfaces. The collaborative device 1530 may be, for example, a car machine. When the user needs to share a file on the mobile phone 1510, the application corresponding to the file to be shared, i.e., the target application, is found first. The user opens the file through the target application so that the file's content is displayed in the form of a window; on the one hand this facilitates the subsequent sharing operation, and on the other hand the user can check whether this is the file they want to share. Correspondingly, the collaborative device needs to run the conference software in order to receive the file.
S1502, the mobile phone 1510 displays a multi-device collaboration interface.
Illustratively, the user presses the middle of the window on the interface of the mobile phone 1510 and then slides upward, and the window of the mobile phone 1510 enters the multi-device collaboration interface in response to the user operation. In other embodiments, the way of entering the multi-device collaboration interface is not limited to this operation; the user can also enter it through a multi-finger slide on the screen or a customized gesture, which is not limited herein. The mobile phone 1510 then obtains, through the cloud 1520, the collaborative devices 1530 logged in to the cloud 1520 account, as well as the applications and application windows running on those collaborative devices 1530, and displays the application windows on the multi-device collaboration interface of the mobile phone 1510. As shown in fig. 14, a control 1402 for switching between devices is provided on the multi-device collaboration interface 1401; when the user touches the control 1402 of a device, the multi-device collaboration interface 1401 switches, in response to the touch operation, to the display interface of the corresponding device and displays the application windows running on that device.
S1503, the mobile phone 1510 detects the operation of the user.
When sharing a file, the user selects the target file on the multi-device collaboration interface of the mobile phone 1510 and drags it onto the control corresponding to the collaborative device 1530 to be shared with, i.e., the control corresponding to the receiving device. The mobile phone 1510 records the path along which the user drags the target file in response to the user's move operation. The multi-device collaboration interface of the mobile phone 1510 then switches to the interface of the receiving device, which can display the applications running on the receiving device, so that the user drags the target file onto the receiving object and releases it. The mobile phone 1510 records the drop point position of the target file in response to the user's release operation.
Specifically, fig. 16 is a schematic diagram of an interface for completing window sharing by performing a moving operation on a window in a multi-device collaboration interface according to an embodiment of the present application.
As shown in fig. 16, after the mobile phone 1600 enters the multi-device collaboration interface 1601, device names indicating the devices logged in to the same cloud account are provided at the top of the multi-device collaboration interface 1601, such as the mobile phone control 1604a, the computer control 1604b, and the car machine control 1604c shown in fig. 16. When a user operation on the mobile phone control 1604a is detected, the application windows corresponding to the applications running on the mobile phone 1600 are displayed on the multi-device collaboration interface 1601, for example a document window 1602a, a settings window 1602b, a gallery window 1602c, and a table window 1602d running on the mobile phone 1600. When a user operation on the car machine control 1604c is detected, the application windows corresponding to the applications running on the car machine are displayed on the multi-device collaboration interface, for example a conference window 1605c, a video window 1605b, and a music window 1605a running on the car machine.
Illustratively, the user shares the document window 1602a on the mobile phone 1600 to the conference window 1605c on the car machine as follows: on the multi-device collaboration interface 1601 indicated by the mobile phone control 1604a, the user drags the document window 1602a onto the car machine control 1604c via operation 1603a, thereby selecting the corresponding receiving device. Here the receiving device is the "car machine", and the mobile phone 1600 records the movement path of the document window 1602a in response to the user's move operation.
On the multi-device collaboration interface 1601 indicated by the car machine control 1604c, after the user moves the document window 1602a onto the car machine control 1604c, the multi-device collaboration interface 1601 switches to display the application windows running on the car machine. The user continues to drag the document window 1602a via operation 1603c until it reaches the area of the conference window 1605c, the receiving object, and releases it. The mobile phone 1600 then calculates the drop point position of the document window 1602a in response to the user's release operation.
S1504, the mobile phone 1510 determines the receiving device and the receiving object based on the movement path and the release position of the user's operation.
After the user completes the move operation on the target file, the mobile phone 1510 determines the receiving device from the movement path information and obtains the interface of the receiving device through the cloud 1520; the distribution of the applications and application windows running on the receiving device can be displayed on that interface. The mobile phone 1510 then determines the receiving object, i.e., which application on the receiving device is to receive the data of the target file, from the application window distribution on the receiving device's interface and the drop point position information.
S1505: The mobile phone 1510 packs the receiving-rights information and the target file into a shared information stream and sends it to the cloud 1520.
After determining the receiving device and the receiving object, the mobile phone 1510 encapsulates the target file into an application data stream containing the data of the target file, and then packs the application data stream and the receiving-rights information into a shared information stream; the receiving-rights information indicates that the receiving object may receive the application data stream. After the application data stream is encapsulated into the shared information stream, the shared information stream is transmitted to the receiving device through the data channel of the cloud 1520.
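The framing of such a shared information stream is not specified by this embodiment; a minimal sketch, assuming a simple length-prefixed layout, might be:

    import java.io.ByteArrayOutputStream;
    import java.io.DataOutputStream;
    import java.io.IOException;

    public final class ShareStreamPacker {
        /** Packs receiving-rights info and the target file's data into one stream (S1505). */
        public static byte[] packShareStream(String receivingObject, byte[] applicationData)
                throws IOException {
            ByteArrayOutputStream buf = new ByteArrayOutputStream();
            DataOutputStream out = new DataOutputStream(buf);
            out.writeUTF(receivingObject);       // receiving-rights info, e.g. the conference software
            out.writeInt(applicationData.length);
            out.write(applicationData);          // the application data stream: the file's bytes
            out.flush();
            return buf.toByteArray();
        }
    }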
S1506, the collaborative device 1530 determines, based on the receiving-rights information, that the conference software is to open the application data stream.
After receiving the shared information stream, the receiving device parses it to obtain the receiving-rights information and the application data stream, and then sends the application data stream to the receiving object according to the receiving-rights information. In this embodiment, the data stream of the document is sent to the conference software.
S1507, the collaborative device 1530 parses the application data stream and performs sharing according to the preset operation.
After receiving the application data stream, the conference software on the receiving device determines the attributes of the target file, then calls the corresponding application to open the target file based on those attributes and the inter-application protocol, and sets the window of the target file as the shared window; for the operations of opening the target file and setting the shared window, refer to step S403.
For example, after receiving the data stream of a picture, the conference software on the "car machine" in fig. 16 determines from the application data stream that the target file is a picture and that the application supporting opening it is the gallery; it then calls the gallery application on the "car machine" to open the picture, so that the picture is displayed in the form of a window, and the conference software sets the opened window as the shared window.
It can be understood that, for the application data stream, the mobile phone sends the data of the target file to the receiving device through the data channel of the cloud; after receiving the data of the target file, the receiving device stores the target file and then opens it with the relevant local application. The target file may be a document, a picture, a web page link, etc.
In summary, according to the method for sharing files provided by the embodiments of the present application, the electronic device determines that the user is sharing a target file by detecting the operation of the user dragging the target file into the window of the conference software. The conference software can call the corresponding target application to open the target file and set the opened window of the target file as the shared window, thereby completing the sharing of the target file. Thus, when sharing a target file, the user neither needs to open the target file in advance nor needs to perform multiple sharing-setting operations in the conference software, and the file-sharing process is simple.
It will be understood that, although the terms "first," "second," etc. may be used herein to describe various features, these features should not be limited by these terms. These terms are used merely for distinguishing and are not to be construed as indicating or implying relative importance.
Furthermore, various operations are described as multiple discrete operations in a manner that is most helpful in understanding the illustrative embodiments; however, the order of description should not be construed as implying that these operations are necessarily order-dependent, and many of the operations may be performed in parallel, concurrently, or simultaneously with other operations. Furthermore, the order of the operations may be rearranged. The process may be terminated when the described operations are completed, but may also have additional operations not included in the figures. The processes may correspond to methods, functions, procedures, subroutines, and the like.
In the drawings, some structural or methodological features may be shown in a particular arrangement and/or order. However, it should be understood that such a particular arrangement and/or ordering is not required. Rather, in some embodiments, these features may be described in a different manner and/or order than shown in the illustrative figures. Furthermore, the inclusion of structural or methodological features in a particular figure does not imply that all embodiments need to include such features.
The embodiments of the present application have been described in detail above with reference to the accompanying drawings, but the application of the technical solution of the present application is not limited to the applications mentioned in the embodiments of the present application, and various structures and modifications can be easily implemented with reference to the technical solution of the present application, so as to achieve the various beneficial effects mentioned herein. Various changes, which may be made by those of ordinary skill in the art without departing from the spirit of the present application, are intended to be covered by the claims herein.

Claims (13)

1. A method for sharing files, applied to a first electronic device, characterized in that the method comprises:
displaying a first application interface of a first application;
detecting a first operation of a user on a first file;
in response to the first operation, displaying a second application interface of a second application in the first application interface, wherein the second application interface comprises the opened first file.
2. The method of claim 1, wherein the first application interface comprises a first sharing area, and wherein the detecting a first operation of the user on a first file comprises:
detecting an operation of the user dragging the first file to the first sharing area; and
the displaying, in response to the first operation, a second application interface of a second application in the first application interface comprises:
displaying the second application interface in the first sharing area.
3. The method of claim 2, wherein the first sharing area displays a third application interface of a third application, and wherein the first sharing area comprises a first area triggering split-screen sharing and a second area triggering replacement sharing; and
the method further comprises:
detecting an operation of the user dragging the first file into the first area, and displaying the second application interface and the third application interface in a split screen in the first sharing area; and
detecting an operation of the user dragging the first file to the second area, and replacing the third application interface displayed in the first sharing area with the second application interface.
4. The method of claim 2, wherein the first sharing area comprises a plurality of sharing windows displaying different content, and the first application interface comprises at least one window control for indicating a switch to a corresponding sharing window; and
the method further comprises:
detecting a user operation acting on a first window control among the at least one window control, and switching the content displayed in the first sharing area to the display content of the first window indicated by the first window control.
5. The method of claim 1, wherein the first application interface comprises at least one second sharing area, wherein the second sharing area comprises a first user identifier of a first user using the first application, the first user identifier indicating the first user's viewing rights to content displayed in the second sharing area, and wherein the detecting a first operation of the user on a first file comprises:
detecting an operation of the user dragging the first file to the at least one second sharing area; and
the displaying, in response to the first operation, a second application interface of a second application in the first application interface comprises:
in response to the first operation, displaying the second application interface in the second sharing area on which the first operation acts.
6. The method of any one of claims 1 to 5, wherein the first electronic device detects the first operation of the user on the first file by:
acquiring first position information of the first file corresponding to the first operation; and
determining that the position indicated by the first position information is located within a target area of the first application interface that triggers file sharing, thereby detecting the first operation.
7. The method according to any one of claims 1 to 6, wherein after detecting a first operation of the user on the first file, the method further comprises:
in response to the first operation, displaying a first prompt interface, wherein the first prompt interface is used to prompt the user to drag the first file to a designated position or a designated area within the area covered by the first application interface.
8. The method of claim 1, wherein the first application is an application running on a second electronic device, and wherein,
the displaying a first application interface of the first application comprises:
receiving, by the first electronic device, a first information stream sent by the second electronic device, wherein the first information stream is used for displaying the first application interface; and
displaying the first application interface.
9. The method of claim 8, wherein the first electronic device runs a fourth application, and wherein a fourth application interface provided by the fourth application displays the first file and a fifth application interface provided by the fourth application displays the first application interface of the first application run by the second electronic device; and
the first operation comprises:
a second operation of dragging the first file from the fourth application interface to the fifth application interface, and
a third operation of dragging the first file into the first application interface, wherein the second operation and the third operation are continuous operations.
10. The method of claim 9, wherein the second operation of dragging the first file from the fourth application interface to the fifth application interface comprises:
a fourth operation of dragging the first file from the fourth application interface onto a second window control for indicating a switch to the fifth application interface, and
a fifth operation of dragging the first file into the area corresponding to the fifth application interface, wherein the fourth operation and the fifth operation are continuous operations.
11. The method of any one of claims 1 to 10, wherein the first application comprises any one of an instant messaging application, conference software, or a live streaming application.
12. An electronic device, comprising: one or more processors; and one or more memories, wherein the one or more memories store one or more programs that, when executed by the one or more processors, cause the electronic device to perform the method for sharing files according to any one of claims 1 to 10.
13. A computer-readable storage medium having instructions stored thereon which, when executed on a computer, cause the computer to perform the method for sharing files according to any one of claims 1 to 10.
CN202310226143.3A 2023-02-28 2023-02-28 Method for sharing file, electronic device and computer readable storage medium Pending CN116301541A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202310226143.3A CN116301541A (en) 2023-02-28 2023-02-28 Method for sharing file, electronic device and computer readable storage medium
PCT/CN2023/120145 WO2024178962A1 (en) 2023-02-28 2023-09-20 Method for sharing file, electronic device, and computer-readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310226143.3A CN116301541A (en) 2023-02-28 2023-02-28 Method for sharing file, electronic device and computer readable storage medium

Publications (1)

Publication Number Publication Date
CN116301541A true CN116301541A (en) 2023-06-23

Family

ID=86828220

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310226143.3A Pending CN116301541A (en) 2023-02-28 2023-02-28 Method for sharing file, electronic device and computer readable storage medium

Country Status (2)

Country Link
CN (1) CN116301541A (en)
WO (1) WO2024178962A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2024178962A1 (en) * 2023-02-28 2024-09-06 华为技术有限公司 Method for sharing file, electronic device, and computer-readable storage medium

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2013062582A1 (en) * 2011-10-28 2013-05-02 Hewlett-Packard Development Company, L.P. Grouping a participant and a resource
US20160299671A1 (en) * 2015-04-10 2016-10-13 Yu An Opening New Application Window in Response to Remote Resource Sharing
CN114371896B (en) * 2021-12-30 2023-05-16 北京字跳网络技术有限公司 Prompting method, device, equipment and medium based on document sharing
CN114816293A (en) * 2022-03-17 2022-07-29 联想(北京)有限公司 File sharing method and file sharing equipment
CN116301541A (en) * 2023-02-28 2023-06-23 华为技术有限公司 Method for sharing file, electronic device and computer readable storage medium

Also Published As

Publication number Publication date
WO2024178962A1 (en) 2024-09-06

Similar Documents

Publication Publication Date Title
US20240168624A1 (en) Screen capture method and related device
US11722449B2 (en) Notification message preview method and electronic device
WO2021018067A1 (en) Floating window management method and related device
WO2020143408A1 (en) Display method and related apparatus
WO2020119492A1 (en) Message processing method and related apparatus
WO2020062159A1 (en) Wireless charging method and electronic device
WO2022179405A1 (en) Screen projection display method and electronic device
EP4227784A1 (en) Notification message management method and electronic device
CN116360725B (en) Display interaction system, display method and device
CN114040242B (en) Screen projection method, electronic device and storage medium
CN112995727A (en) Multi-screen coordination method and system and electronic equipment
CN114115770B (en) Display control method and related device
CN114356195B (en) File transmission method and related equipment
US20240370218A1 (en) Screen sharing method and related device
WO2022127670A1 (en) Call method and system, and related device
WO2024045801A1 (en) Method for screenshotting, and electronic device, medium and program product
CN115756270B (en) A method, device and system for content sharing
WO2023005711A1 (en) Service recommendation method and electronic device
WO2024178962A1 (en) Method for sharing file, electronic device, and computer-readable storage medium
CN115543163A (en) Screen projection display method and electronic equipment
CN115242994B (en) Video call system, method and device
WO2023020012A1 (en) Data communication method between devices, device, storage medium, and program product
CN115328592B (en) Display method and related device
CN117724640B (en) Split screen display method, electronic equipment and storage medium
US20250060865A1 (en) Screen capture method, electronic device, medium, and program product

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination