
CN113672135A - Object control method and device and electronic equipment - Google Patents


Info

Publication number
CN113672135A
CN113672135A
Authority
CN
China
Prior art keywords
control
manipulation
target
input
function
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110939732.7A
Other languages
Chinese (zh)
Inventor
解晓鹏
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Vivo Mobile Communication Hangzhou Co Ltd
Original Assignee
Vivo Mobile Communication Hangzhou Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vivo Mobile Communication Hangzhou Co Ltd filed Critical Vivo Mobile Communication Hangzhou Co Ltd
Priority to CN202110939732.7A priority Critical patent/CN113672135A/en
Publication of CN113672135A publication Critical patent/CN113672135A/en
Priority to PCT/CN2022/111049 priority patent/WO2023020328A1/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04817 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance using icons
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/0485 Scrolling or panning
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The embodiments of this application disclose an object control method, an object control device, and an electronic device. The method comprises: receiving a first input to a target control object, where the target control object comprises a first control object and a second control object; and, in response to the first input, controlling a sub-control object in the first control object according to a target function in the second control object, where the target function is a function associated with the first control object. The embodiments of this application can improve control efficiency.

Description

Object control method and device and electronic equipment
Technical Field
The embodiment of the application relates to the field of information processing, in particular to an object control method and device and electronic equipment.
Background
With the increasing number of control objects (such as information items and applications) in electronic devices, users may need to use multiple control objects at once. When a user wants to operate a plurality of control objects, each control object must be opened separately, which makes the operation cumbersome.
Disclosure of Invention
The embodiments of this application provide an object control method, an object control device, and an electronic device, to solve the problem that controlling a plurality of control objects requires cumbersome, separate operations.
In order to solve the technical problem, the present application is implemented as follows:
in a first aspect, an embodiment of the present application provides an object manipulation method, which may include:
receiving a first input to a target manipulation object, wherein the target manipulation object comprises a first manipulation object and a second manipulation object;
and responding to the first input, and controlling the sub control objects in the first control object according to the target functions in the second control object, wherein the target functions are functions associated with the first control object.
In a second aspect, an embodiment of the present application provides an object manipulating apparatus, which may include:
the receiving module is configured to receive a first input to a target control object, where the target control object comprises a first control object and a second control object;
and the control module is used for responding to the first input and controlling the sub control objects in the first control object according to the target functions in the second control object, wherein the target functions are functions related to the first control object.
In a third aspect, an embodiment of the present application provides an electronic device, which includes a processor, a memory, and a program or instructions stored on the memory and executable on the processor, and when executed by the processor, the program or instructions implement the steps of the method according to the first aspect.
In a fourth aspect, embodiments of the present application provide a readable storage medium, on which a program or instructions are stored, which when executed by a processor implement the steps of the method according to the first aspect.
In a fifth aspect, an embodiment of the present application provides a chip, where the chip includes a processor and a communication interface, where the communication interface is coupled to the processor, and the processor is configured to execute a program or instructions to implement the method according to the first aspect.
In the embodiments of this application, a first input to a target control object is received, where the target control object comprises a first control object and a second control object; in response to the first input, the sub-control objects in the first control object are controlled according to the target function in the second control object that is associated with the first control object. The target function can thus be used to quickly control the sub-control objects in the first control object, without opening each control object separately, which improves control efficiency.
Drawings
The present application may be better understood from the following description of specific embodiments of the application taken in conjunction with the accompanying drawings, in which like or similar reference numerals identify like or similar features.
Fig. 1 is a flowchart of an object manipulation method according to an embodiment of the present disclosure;
fig. 2 is a schematic view illustrating an object manipulation according to an embodiment of the present disclosure;
fig. 3 is a schematic diagram of a fast login provided in an embodiment of the present application;
fig. 4 is a schematic diagram of file transmission provided in an embodiment of the present application;
fig. 5 is a schematic structural diagram of an object manipulating device according to an embodiment of the present disclosure;
fig. 6 is a schematic hardware structure diagram of an electronic device according to an embodiment of the present disclosure;
fig. 7 is a schematic hardware structure diagram of another electronic device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some, but not all, embodiments of the present application. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The terms "first", "second", and the like in the description and claims of this application are used to distinguish between similar objects, and not necessarily to describe a particular sequence or chronological order. It is to be understood that the data so used are interchangeable under appropriate circumstances, so that the embodiments of this application can operate in sequences other than those illustrated or described herein. In addition, "and/or" in the description and claims denotes at least one of the connected objects, and the character "/" generally indicates that the preceding and succeeding objects are in an "or" relationship.
In view of the problems in the related art, the embodiments of this application provide an object control method, an object control device, an electronic device, and a storage medium, to solve the problem in the related art that controlling a plurality of control objects requires complicated operations.
The method provided by the embodiments of this application can be applied to the scenario described above, and to any scenario in which controlling a plurality of control objects would otherwise require complicated operations.
According to the method provided by the embodiments of this application, a first input to a target control object is received, where the target control object comprises a first control object and a second control object; in response to the first input, the sub-control objects in the first control object are controlled according to the target function in the second control object that is associated with the first control object. The target function can be used to quickly control the sub-control objects in the first control object, without opening each control object separately, which improves control efficiency.
Based on the application scenario, the object manipulation method provided by the embodiment of the present application is described in detail below.
Fig. 1 is a flowchart of an object manipulation method according to an embodiment of the present disclosure.
As shown in fig. 1, the object manipulating method may include steps 110 to 120, and the method is applied to an object manipulating apparatus, and specifically as follows:
step 110, receiving a first input to a target manipulation object, where the target manipulation object includes a first manipulation object and a second manipulation object.
And 120, responding to the first input, and controlling the sub control objects in the first control object according to the target functions in the second control object, wherein the target functions are functions associated with the first control object.
The object control method provided by this application receives a first input to a target control object, where the target control object comprises a first control object and a second control object; in response to the first input, the sub-control objects in the first control object are controlled according to the target function in the second control object that is associated with the first control object. The target function can be used to quickly control the sub-control objects in the first control object, without opening each control object separately, which improves control efficiency.
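The two-step flow described above (receive a first input naming two control objects, then run the associated function of the second object on the first object's sub-objects) can be sketched as a minimal dispatcher. This is an illustrative assumption about one way such a flow could be organized; all class and function names here are hypothetical, not part of the disclosed method.

```python
# Illustrative sketch of the two-step flow: step 110 receives an input
# naming the first and second control objects, and step 120 runs the
# function of the second object that is associated with the first.
# All names are hypothetical.

class ObjectManipulator:
    def __init__(self):
        # maps (first object, second object) -> associated target function
        self.associations = {}

    def associate(self, first, second, func):
        self.associations[(first, second)] = func

    def receive_first_input(self, first, second):
        # step 110: receive a first input to the target control object
        func = self.associations.get((first, second))
        if func is None:
            raise KeyError("no target function associated with this pair")
        # step 120: control the sub-objects of the first object using the
        # target function found in the second object
        return func(first)

manipulator = ObjectManipulator()
manipulator.associate("album", "chat_app", lambda obj: f"share files from {obj}")
result = manipulator.receive_first_input("album", "chat_app")
# result == "share files from album"
```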
The contents of steps 110 to 120 are described below:
first, step 110 is involved.
Wherein, the above mentioned first and second manipulation objects may include: files and applications, etc.
In a possible embodiment, in the case that the interface of the first control object and the object identifier of the second control object are displayed, step 110 may specifically include the following steps:
receiving a first input to a first target object in the interface of the first control object and to the object identifier of the second control object;
correspondingly, step 120 may specifically include the following steps:
and responding to the first input, calling a target function of a second control object to perform control processing on the second target object, wherein the first target object is associated with the second target object.
Illustratively, a first input is received that drags the object identifier of a first target object (a commodity in shopping application A) onto shopping application B; in response to the first input, a target function of the second control object (such as a search function, a favorites function, or an add-to-cart function) is invoked to control the second target object (the commodity in shopping application B). The first target object and the second target object are associated; for example, they are the same commodity, or commodities of the same type.
Therefore, in response to the first input to the first target object in the interface of the first control object and to the object identifier of the second control object, the target function of the second control object can be quickly invoked to control the second target object associated with the first target object.
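The shopping-application example above can be sketched as follows. The catalog, product identifiers, and function names are made-up assumptions for illustration; the sketch only shows how a function of application B could be dispatched onto a product associated with the dragged one.

```python
# Hypothetical sketch: dragging a product identifier from shopping app A
# onto shopping app B invokes one of app B's functions (search, favorite,
# add to cart) on the associated product in app B.

APP_B_CATALOG = {"P100": "running shoes", "P200": "water bottle"}

def find_associated_product(product_id):
    # the first and second target objects are "associated", e.g. the same
    # commodity listed in both applications
    return APP_B_CATALOG.get(product_id)

def invoke_target_function(function_name, product_id):
    product = find_associated_product(product_id)
    if product is None:
        return "product not found in app B"
    actions = {
        "search": f"searched app B for {product}",
        "favorite": f"added {product} to favorites",
        "add_to_cart": f"added {product} to cart",
    }
    return actions[function_name]

print(invoke_target_function("add_to_cart", "P100"))
```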
In a possible embodiment, step 110 may specifically include:
receiving a drag input of an object identifier of a target manipulation object displayed on an interface;
after receiving the first input, the display position of the object identifier of the target manipulation object on the interface is maintained unchanged.
The first input may be a drag input on the object identifier of the target control object displayed on the interface, or a simultaneous press input on the target control object, which helps prevent accidental touches. In addition, after the first input is received, the display position of the object identifier of the target control object on the interface remains unchanged; that is, the arrangement of apps in the display area is unaffected.
Next, step 120 is involved.
The related sub control objects can include function controls, display areas, files and the like in the first control object.
In a possible embodiment, the first manipulation object is a first icon, the second manipulation object is a second icon, and the step 120 may specifically include the following steps:
and responding to the first input, and controlling the sub-control object in the first application according to the target function in the second application, wherein the first application corresponds to the first icon, and the second application corresponds to the second icon.
Specifically, the first control object is a file library in which files are stored, and the second control object is a communication application. When the file library is detected being dragged onto the communication application, the file library is opened automatically and the files in it are displayed (for example, pictures in an album, text in a document, or a web page in a browser), and at the same time a sharing page of the communication application is displayed so that a communication object can be selected for file sharing.
Illustratively, as shown in FIG. 2, first, in response to a first input of dragging a file library (such as a photo album) onto App3, a first interface corresponding to the file library and a second interface corresponding to App3 are displayed in a split screen mode; the first interface comprises at least one file, and the second interface comprises at least one communication object. Then, receiving a third input to the target file in the first interface and the target communication object in the second interface; the at least one file comprises a target file and the at least one communication object comprises a target communication object; finally, the target file is sent to the target communication object in response to the third input.
Therefore, the target function can be used for controlling the sub-control object in the first application quickly.
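The Fig. 2 flow (first input splits the screen into a file-library interface and a communication interface; a third input picks one file and one contact and sends the file) can be sketched as below. The data structures and function names are illustrative assumptions.

```python
# Hypothetical sketch of the Fig. 2 flow: a first input displays the file
# library and the communication app in split screen, and a third input
# selects a target file and a target communication object and sends it.

def split_screen(files, contacts):
    # first input: display both interfaces side by side
    return {"first_interface": list(files), "second_interface": list(contacts)}

def send_on_third_input(screen, target_file, target_contact, sent_log):
    # third input: the target file and target contact must come from the
    # interfaces shown by the first input
    assert target_file in screen["first_interface"]
    assert target_contact in screen["second_interface"]
    sent_log.append((target_file, target_contact))

log = []
screen = split_screen(["photo1.jpg", "photo2.jpg"], ["Alice", "Bob"])
send_on_third_input(screen, "photo1.jpg", "Bob", log)
# log == [("photo1.jpg", "Bob")]
```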
In a possible embodiment, the first control object is an object identifier in the first application, and the second control object is a second icon, and step 120 may specifically include the following steps:
and responding to the first input, calling a target function of a second application to control and process the object identification in the first application, wherein the second application corresponds to the second icon, and the object identification corresponds to the target function.
Specifically, in response to the first input, a target function (collection function) of the second application corresponding to the second icon is called to perform control processing on an object identifier (commodity identifier) in the first application, where the object identifier (commodity identifier) corresponds to the target function (collection function).
For example, the first application (shopping application a) and the second application (shopping application B) are both shopping applications, and in response to the first input, the collection function of the shopping application B is invoked to perform manipulation processing (collection processing) on a certain product identifier in the first application, where the certain product identifier corresponds to the collection function.
In a possible embodiment, step 120 may specifically include the following steps:
displaying a function identification of at least one function in the second control object in response to the first input; at least one function is associated with the first input;
receiving a second input to the function identifier of the target function, where the function identifier of the at least one function comprises the function identifier of the target function;
and responding to the first input, and controlling the sub control objects in the first control object according to the target function.
Specifically, first, in response to a first input, displaying a function identifier of at least one function associated with the first input in the second manipulation object; then, in response to a second input of the function identifier of the target function, the sub-manipulation objects in the first manipulation object are manipulated according to the target function.
Alternatively, when the first input to the target control object is received for the first time, all the function controls of the second control object may be traversed in advance, the functions associated with the first control object displayed (A, B, C, D, etc.), and one target function selected by the user; the drag relationship between the first control object and the second control object is then recorded. When a first input to the target control object is received again later, the target function in the second control object can be used directly.
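The first-use flow just described (list the associated functions on the first drag, let the user pick one, record the pair, and reuse it on later drags) can be sketched as a small cache. The cache shape and names are illustrative assumptions.

```python
# Hypothetical sketch of recording the drag relationship: the first drag
# asks the user to choose among the associated functions; subsequent
# drags of the same pair reuse the recorded choice directly.

drag_cache = {}  # (first_object, second_object) -> chosen target function

def on_drag(first, second, available_functions, choose):
    key = (first, second)
    if key in drag_cache:
        return drag_cache[key]            # later drags: use directly
    chosen = choose(available_functions)  # first drag: user selects one
    drag_cache[key] = chosen
    return chosen

funcs = ["A", "B", "C", "D"]
first_pick = on_drag("album", "editor", funcs, lambda fs: fs[1])
# the second call's chooser is ignored: the recorded relationship wins
second_pick = on_drag("album", "editor", funcs, lambda fs: fs[0])
# first_pick == second_pick == "B"
```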
In a possible embodiment, the first control object is a third application, the second control object is a communication application, and the step of controlling the sub-control objects in the first control object according to the target function may specifically include the following steps:
acquiring user login information in a second control object;
and logging in the third application according to the user login information.
Here, the communication application needs to expose a login-authorization service and provide an authorization interface that returns the user's login information. When the communication application is detected being dragged onto another control object, that control object is opened automatically, its login interface is invoked, and the user login information transmitted by the communication application is passed directly to it; once the other control object receives the account information, the login operation is complete. After login, the user can use the other application without further steps.
Exemplarily, as shown in fig. 3, a first input to a target control object is received, a second control object (App3) is a communication application, and in response to the first input, user login information in the second control object is acquired; and logging in the third application according to the user login information, and entering an App3 interface corresponding to the user login information after logging in.
Therefore, the third application can be quickly logged in by acquiring the user login information in the second control object and logging in the third application according to the user login information.
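The quick-login flow can be sketched as below: the communication application exposes an authorization interface returning login information, and the third application logs in with it directly. The credential format and class names are illustrative assumptions, not a real authorization protocol.

```python
# Hypothetical sketch of quick login: the communication app's
# authorization interface returns the user's login information, which is
# passed directly to the third application's login interface.

class CommunicationApp:
    def authorize(self):
        # exposed login-authorization service (illustrative return value)
        return {"user": "demo_user", "token": "abc123"}

class ThirdApp:
    def __init__(self):
        self.logged_in_as = None

    def login_with(self, credentials):
        # the third application receives the account information and
        # completes the login without the user re-entering anything
        self.logged_in_as = credentials["user"]

comm, third = CommunicationApp(), ThirdApp()
third.login_with(comm.authorize())
# third.logged_in_as == "demo_user"
```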
In a possible embodiment, the first control object is a file library icon, the second control object is a communication application icon, and the step 120 may specifically include the following steps:
responding to the first input, displaying an interface corresponding to the first control object, wherein the interface comprises at least one file, and the at least one file comprises a target file;
receiving a third input to the interface of the first control object;
responding to a third input of the user, and determining a target file in the interface corresponding to the first control object;
and sending the target file to a preset contact in the second control object, and/or publishing the target file in an information sharing platform of the second control object.
The second control object may pre-record a control function for each third-party control object (for example, publishing to an information sharing platform for the camera, or sending a target file to a preset contact for the album). Thus, when the first control object is detected being dragged onto the second control object, the target function in the second control object associated with the first control object can be obtained based on the first control object; the first control object is opened first, and after its function has been used (for example, a photo has been taken), the associated page of the second control object is opened for further control.
Exemplarily, as shown in fig. 4, first, in response to a first input, an interface of a first manipulation object is displayed; then, responding to a third input of the interface of the first control object (photo album), determining a target file in the interface of the first control object; and finally, sending the target file to a preset contact in the second control object, and/or publishing the target file in an information sharing platform of the second control object.
Therefore, the target file in the interface of the first control object is determined by responding to the third input of the interface of the first control object, so that the target file can be rapidly sent to the preset contact in the second control object, and/or the target file is published in the information sharing platform of the second control object.
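The pre-recorded mapping from each third-party object to its follow-up function can be sketched as a lookup table. The table entries and names are illustrative assumptions.

```python
# Hypothetical sketch: the communication app keeps a table mapping each
# third-party control object to the follow-up function it should run, so
# dragging a file-library icon onto it opens the right page automatically.

FOLLOW_UP = {
    "camera": "publish to information-sharing platform",
    "album": "send target file to preset contact",
}

def on_drop(first_object):
    follow_up = FOLLOW_UP.get(first_object)
    if follow_up is None:
        return ["no recorded function"]
    # open the first object first, then the associated page of the second
    # control object
    return [f"open {first_object}", follow_up]

steps = on_drop("album")
# steps == ["open album", "send target file to preset contact"]
```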
In a possible embodiment, in a case that the interface of the first manipulation object and the object identifier of the second manipulation object are displayed, receiving a fourth input to the first function control in the interface of the first manipulation object and the object identifier of the second manipulation object; the first control object and the second control object are the same type of control objects; and responding to the fourth input, displaying an interface of a second control object, wherein the interface of the second control object comprises a second function control, and the first function control is associated with the second function control.
Exemplarily, first, in a case where an interface of a first manipulation object (shopping class application a) and an object identifier of a second manipulation object (shopping class application B) are displayed, a fourth input to a first function control (shopping cart) and an object identifier of the second manipulation object in the interface of the first manipulation object is received; the first control object and the second control object are the same type of control objects, for example, both are shopping applications, both are music playing applications, or both are video applications. Then, in response to a fourth input, an interface of a second manipulation object (shopping class application B) is displayed, the interface of the second manipulation object including a second functionality control (shopping cart), the first functionality control being associated with the second functionality control.
Therefore, the interface of the second control object, which comprises the second function control, is displayed in response to the fourth input of the object identifications of the first function control and the second control object in the interface of the first control object, and the second function control of the second control object can be opened quickly.
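The same-type function-control mapping (dragging app A's shopping cart onto app B's icon opens app B's shopping cart) can be sketched as follows. App names and the type table are illustrative assumptions.

```python
# Hypothetical sketch: dragging a function control (e.g. the shopping
# cart) of app A onto app B's icon opens app B's associated control,
# provided both apps are control objects of the same type.

APP_TYPES = {"shop_a": "shopping", "shop_b": "shopping", "music_a": "music"}

def open_associated_control(first_app, control, second_app):
    if APP_TYPES[first_app] != APP_TYPES[second_app]:
        return None  # only same-type objects share associated controls
    return f"{second_app}:{control}"

opened = open_associated_control("shop_a", "cart", "shop_b")
# opened == "shop_b:cart"
```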
In addition, the above step related to the manipulation of the sub-manipulation object in the first manipulation object according to the target function in the second manipulation object may further include the following embodiments:
dragging an album onto a picture-editing application to quickly edit pictures; dragging files such as web pages or text onto a communication application to quickly share them; dragging files such as web pages or text onto an editing application to quickly edit them; and dragging files such as web pages or text onto a voice application to have them read aloud quickly.
In summary, in the embodiments of this application, a first input to a target control object is received, where the target control object comprises a first control object and a second control object; in response to the first input, the sub-control objects in the first control object are controlled according to the target function in the second control object that is associated with the first control object. The target function can thus be used to quickly control the sub-control objects in the first control object, without opening each control object separately, which improves control efficiency. It should be noted that, in the object manipulation method provided by the embodiments of this application, the execution body may be an object manipulation device, or a control module in the object manipulation device configured to execute the object manipulation method. In the embodiments of this application, an object manipulation device executing the method is taken as an example to describe the object manipulation method provided herein.
In addition, based on the object manipulation method, an object manipulation device is further provided in the embodiments of the present application, which is specifically described in detail with reference to fig. 5.
Fig. 5 is a schematic structural diagram of an object manipulating device according to an embodiment of the present disclosure.
As shown in fig. 5, the object manipulating apparatus 500 may include:
the receiving module 510 is configured to receive a first input to a target manipulation object, where the target manipulation object includes a first manipulation object and a second manipulation object.
The manipulation module 520 is configured to, in response to the first input, manipulate the sub-manipulation object in the first manipulation object according to a target function in the second manipulation object, where the target function is a function associated with the first manipulation object.
In a possible embodiment, the manipulation module 520 includes:
a display module, configured to display, in response to the first input, a function identifier of at least one function in the second control object, where the at least one function is associated with the first manipulation object.
The receiving module 510 is further configured to receive a second input of a function identifier of the target function, where the function identifier of at least one function includes the function identifier of the target function.
The manipulation module 520 is specifically configured to, in response to the first input, manipulate the sub-manipulation objects in the first manipulation object according to the target function.
In a possible embodiment, the first manipulation object is a first icon, the second manipulation object is a second icon, and the manipulation module 520 is specifically configured to:
manipulate, in response to the first input, the sub-manipulation object in the first application according to the target function in the second application, where the first application corresponds to the first icon and the second application corresponds to the second icon.
In a possible embodiment, the first control object is an object identifier in the first application, the second control object is a second icon, and the control module 520 is specifically configured to:
invoke, in response to the first input, the target function of the second application to perform manipulation processing on the object identifier in the first application, where the second application corresponds to the second icon and the object identifier corresponds to the target function.
In a possible embodiment, the first control object is a file library icon, the second control object is a communication application icon, and the control module 520 includes:
a display module, configured to display, in response to the first input, an interface corresponding to the first control object, where the interface includes at least one file, and the at least one file includes a target file.
The receiving module 510 is further configured to receive a third input to the interface of the first control object.
A determining module, configured to determine, in response to the third input of the user, the target file in the interface corresponding to the first control object.
A sending module, configured to send the target file to a preset contact in the second control object, and/or publish the target file on the information sharing platform of the second control object.
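The file-sharing embodiment above can be sketched as follows. The function name `share_target_file` and the example file names, contact, and platform are hypothetical; the sketch only illustrates selecting a target file from the file library interface and then sending it to a preset contact and/or publishing it on the sharing platform.

```python
# Hypothetical sketch of the file-sharing embodiment: a third input selects a
# target file from the first control object's interface (a file library), and
# the target file is sent to a preset contact and/or published on the second
# control object's information sharing platform.

def share_target_file(files, selected_index, preset_contact=None, platform=None):
    """Return the list of sharing actions performed for the target file."""
    target_file = files[selected_index]          # determining module
    actions = []
    if preset_contact is not None:               # sending module, branch 1
        actions.append(f"sent {target_file} to {preset_contact}")
    if platform is not None:                     # sending module, branch 2
        actions.append(f"published {target_file} on {platform}")
    return actions

print(share_target_file(["a.png", "b.png"], 1,
                        preset_contact="Bob", platform="Moments"))
```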
In a possible embodiment, the first control object is a third application, the second control object is a communication application, and the control module 520 includes:
and the acquisition module is used for acquiring the user login information in the second control object.
And the login module is used for logging in the third application according to the user login information.
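The login embodiment above can be sketched as follows. The class names, the token format, and the `alice` example are hypothetical; the sketch only illustrates reusing the login information held by the second control object (a communication application) to log in to the third application without re-entering credentials.

```python
# Hypothetical sketch of the login embodiment: user login information stored
# in the second control object (a communication application) is acquired and
# then used to log in to the third application.

class CommunicationApp:
    def __init__(self, user_id, token):
        self._login_info = {"user_id": user_id, "token": token}

    def get_login_info(self):
        # acquiring module: expose the stored user login information
        return dict(self._login_info)

class ThirdApp:
    def __init__(self):
        self.logged_in_user = None

    def login(self, login_info):
        # login module: accept the borrowed credentials
        if not login_info.get("token"):
            raise ValueError("missing token")
        self.logged_in_user = login_info["user_id"]
        return True

messenger = CommunicationApp("alice", "token-123")
app = ThirdApp()
app.login(messenger.get_login_info())
print(app.logged_in_user)
```

A production design would of course exchange a scoped token rather than pass raw credentials between applications; the sketch omits that for brevity.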
In a possible embodiment, in a case that an interface of a first control object and an object identifier of a second control object are displayed, the receiving module 510 is specifically configured to:
a first input is received of object identifications of a first target object and a second manipulation object in an interface of a first manipulation object.
The control module 520 is specifically configured to:
and responding to the first input, calling a target function of a second control object to perform control processing on the second target object, wherein the first target object is associated with the second target object.
In a possible embodiment, in a case that an interface of a first control object and an object identifier of a second control object are displayed, the receiving module 510 is specifically configured to:
receive a fourth input of a first function control in the interface of the first control object and of the object identifier of the second control object, where the first control object and the second control object are control objects of the same type.
A display module is configured to display, in response to the fourth input, an interface of the second control object, where the interface of the second control object includes a second function control, and the first function control is associated with the second function control.
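The same-type-object embodiment above can be sketched as follows. The control names (`night_mode`, `dark_theme`, etc.) and the mapping table are hypothetical; the sketch only illustrates opening the second object's interface at the function control associated with the one selected in the first object.

```python
# Hypothetical sketch of the same-type-object embodiment: a fourth input on a
# function control in the first control object, together with the second
# control object's identifier, opens the second object's interface at the
# associated function control (e.g., the corresponding settings page in two
# applications of the same type).

FUNCTION_CONTROL_MAP = {
    # first object's function control -> associated control in second object
    "night_mode": "dark_theme",
    "font_size": "text_scale",
}

def open_associated_control(first_control, second_object_controls):
    """Return the second object's function control to display, if any."""
    second_control = FUNCTION_CONTROL_MAP.get(first_control)
    if second_control in second_object_controls:
        return second_control
    return None

print(open_associated_control("night_mode", {"dark_theme", "text_scale"}))
```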
In summary, the object control device provided in the embodiment of the present application receives a first input to a target control object, where the target control object includes a first control object and a second control object; in response to the first input, the sub-control object in the first control object is manipulated according to the target function in the second control object that is associated with the first control object. The target function can thus be used to control the sub-control objects quickly, without opening each control object separately, which improves control efficiency.
The object control device in the embodiment of the present application may be a device, or may be a component, an integrated circuit, or a chip in a terminal. The device may be a mobile electronic device or a non-mobile electronic device. By way of example, the mobile electronic device may be a mobile phone, a tablet computer, a notebook computer, a palmtop computer, a vehicle-mounted electronic device, a wearable device, an ultra-mobile personal computer (UMPC), a netbook, or a Personal Digital Assistant (PDA); the non-mobile electronic device may be a server, a Network Attached Storage (NAS), a Personal Computer (PC), a Television (TV), a teller machine, or a self-service machine; the embodiments of the present application are not specifically limited in this respect.
The object manipulation device in the embodiment of the present application may be a device having an operating system. The operating system may be an Android operating system, an iOS operating system, or another possible operating system, which is not specifically limited in the embodiments of the present application.
The object control device provided in the embodiment of the present application can implement each process implemented by the object control device in the method embodiments of fig. 2 to fig. 4, and is not described here again to avoid repetition.
Optionally, as shown in fig. 6, an electronic device 600 is further provided in the embodiment of the present application, including a processor 601, a memory 602, and a program or instruction stored in the memory 602 and executable on the processor 601. When executed by the processor 601, the program or instruction implements each process of the above object manipulation method embodiment and can achieve the same technical effect; to avoid repetition, details are not described here again.
It should be noted that the electronic devices in the embodiments of the present application include the mobile electronic devices and the non-mobile electronic devices described above.
Fig. 7 is a schematic hardware structure diagram of another electronic device according to an embodiment of the present application.
The electronic device 700 includes, but is not limited to: a radio frequency unit 701, a network module 702, an audio output unit 703, an input unit 704, a sensor 705, a display unit 706, a user input unit 707, an interface unit 708, a memory 709, a processor 710, a power supply 711, and the like. The input unit 704 may include a graphics processor 7041 and a microphone 7042; the display unit 706 may include a display panel 7061; the user input unit 707 may include a touch panel 7071 and other input devices 7072; the memory 709 may store applications and an operating system.
Those skilled in the art will appreciate that the power supply 711 (e.g., a battery) supplies power to the various components and may be logically connected to the processor 710 via a power management system, so that charging, discharging, and power consumption are managed via the power management system. The electronic device structure shown in fig. 7 does not constitute a limitation of the electronic device; the electronic device may include more or fewer components than shown, combine some components, or arrange the components differently, and details are not described here again.
A user input unit 707 for receiving a first input to a target manipulation object, the target manipulation object comprising a first manipulation object and a second manipulation object.
The processor 710 is configured to manipulate, in response to the first input, the sub-manipulation object in the first manipulation object according to a target function in the second manipulation object, where the target function is a function associated with the first manipulation object.
Optionally, the display unit 706 is configured to display a function identifier of at least one function in the second control object in response to the first input; at least one function is associated with the first manipulation object.
The user input unit 707 is further configured to receive a second input of a function identifier of the target function, the function identifier of the at least one function comprising the function identifier of the target function.
The processor 710 is specifically configured to, in response to the first input, manipulate the sub manipulation object in the first manipulation object according to the target function.
Optionally, the processor 710 is configured to, in response to the first input, manipulate a sub manipulation object in the first application according to a target function in the second application, where the first application corresponds to the first icon and the second application corresponds to the second icon.
Optionally, the processor 710 is configured to, in response to the first input, invoke a target function of a second application to perform manipulation processing on an object identifier in the first application, where the second application corresponds to the second icon and the object identifier corresponds to the target function.
Optionally, the display unit 706 is configured to display, in response to the first input, an interface corresponding to the first control object, where the interface includes at least one file, and the at least one file includes the target file.
The user input unit 707 is further configured to receive a third input to the interface of the first manipulation object.
And the processor 710 is configured to determine, in response to a third input by the user, a target file in the interface corresponding to the first control object.
And the processor 710 is configured to send the target file to a preset contact in the second control object, and/or publish the target file in the information sharing platform of the second control object.
Optionally, the processor 710 is configured to obtain user login information in the second control object.
And a processor 710 for logging in the third application according to the user login information.
Optionally, the user input unit 707 is configured to receive a first input of object identifications of the first target object and the second manipulation object in the interface of the first manipulation object.
The processor 710 is specifically configured to invoke, in response to the first input, the target function of the second manipulation object to perform manipulation processing on the second target object, where the first target object is associated with the second target object.
Optionally, the user input unit 707 is configured to receive a fourth input of object identifications of the first function control and the second manipulation object in the interface of the first manipulation object; the first control object and the second control object are the same type of control objects.
The display unit 706 is configured to display, in response to the fourth input, an interface of a second manipulation object, where the interface of the second manipulation object includes a second function control, and the first function control is associated with the second function control.
In the embodiment of the application, a first input to a target control object is received, where the target control object includes a first control object and a second control object; in response to the first input, the sub-control object in the first control object is manipulated according to the target function in the second control object that is associated with the first control object. The target function can thus be used to control the sub-control objects quickly, without opening each control object separately, which improves control efficiency.

The embodiment of the present application further provides a readable storage medium. A program or instruction is stored on the readable storage medium, and when executed by a processor, the program or instruction implements each process of the object control method embodiment and can achieve the same technical effect; to avoid repetition, details are not repeated here.
The processor is the processor in the electronic device described in the above embodiment. The readable storage medium includes a computer readable storage medium, such as a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and so on.
The embodiment of the present application further provides a chip, where the chip includes a processor and a communication interface, the communication interface is coupled to the processor, and the processor is configured to run a program or an instruction to implement each process of the embodiment of the object control method, and can achieve the same technical effect, and the details are not repeated here to avoid repetition.
It should be understood that the chips mentioned in the embodiments of the present application may also be referred to as system-on-chip, system-on-chip or system-on-chip, etc.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element preceded by "comprising a ..." does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element. Further, it should be noted that the scope of the methods and apparatuses in the embodiments of the present application is not limited to performing the functions in the order illustrated or discussed; the functions may be performed in a substantially simultaneous manner or in reverse order depending on the functions involved. For example, the described methods may be performed in an order different from that described, and various steps may be added, omitted, or combined. In addition, features described with reference to certain examples may be combined in other examples.
Through the above description of the embodiments, those skilled in the art will clearly understand that the methods of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly also by hardware alone, but in many cases the former is the better implementation. Based on such an understanding, the technical solutions of the present application may be embodied in the form of a software product stored in a storage medium (such as a ROM/RAM, a magnetic disk, or an optical disk), which includes instructions for enabling a terminal (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the methods according to the embodiments of the present application.
While the present embodiments have been described with reference to the accompanying drawings, it is to be understood that the invention is not limited to the precise embodiments described above, which are meant to be illustrative and not restrictive, and that various changes may be made therein by those skilled in the art without departing from the spirit and scope of the invention as defined by the appended claims.

Claims (10)

1. An object manipulation method, comprising:
receiving a first input to a target manipulation object, the target manipulation object comprising a first manipulation object and a second manipulation object;
and responding to the first input, and manipulating a sub manipulation object in the first manipulation object according to a target function in the second manipulation object, wherein the target function is a function associated with the first manipulation object.
2. The method of claim 1, wherein the manipulating of the child manipulation object of the first manipulation object according to the target function in response to the first input comprises:
displaying a function identifier of at least one function in the second control object in response to the first input; the at least one function is associated with the first manipulation object;
receiving a second input of a function identifier of the target function, wherein the function identifier of the at least one function comprises the function identifier of the target function;
and responding to the first input, and controlling the sub control objects in the first control object according to the target function.
3. The method of claim 1, wherein the first manipulation object is a first icon and the second manipulation object is a second icon, and wherein, in response to the first input, manipulating the sub-manipulation objects in the first manipulation object according to the target function in the second manipulation object comprises:
and responding to the first input, and controlling a sub control object in the first application according to a target function in the second application, wherein the first application corresponds to the first icon, and the second application corresponds to the second icon.
4. The method of claim 1, wherein the first manipulation object is an object identifier in a first application, the second manipulation object is a second icon, and the manipulating, in response to the first input, the sub-manipulation objects in the first manipulation object according to the target function in the second manipulation object comprises:
and responding to the first input, calling the target function of a second application to perform control processing on an object identifier in the first application, wherein the second application corresponds to the second icon, and the object identifier corresponds to the target function.
5. The method of claim 1, wherein the first control object is a document library icon, the second control object is a communication application icon, and the controlling the sub-control objects in the first control object according to the target function in response to the first input comprises:
responding to the first input, and displaying an interface corresponding to the first control object, wherein the interface comprises at least one file, and the at least one file comprises a target file;
receiving a third input to the interface of the first manipulation object;
responding to a third input of a user, and determining the target file in the interface corresponding to the first control object;
and sending the target file to a preset contact in the second control object, and/or publishing the target file in an information sharing platform of the second control object.
6. The method according to claim 1, wherein the first control object is a third application, the second control object is a communication application, and the controlling the sub-control objects in the first control object according to the target function comprises:
acquiring user login information in the second control object;
and logging in the third application according to the user login information.
7. The method of claim 1, wherein receiving a first input to a target manipulation object while displaying an interface of the first manipulation object and an object identifier of a second manipulation object comprises:
receiving a first input of object identifications of a first target object and a second target object in an interface of the first control object;
the responding to the first input, and controlling the sub-control objects in the first control object according to the target function, includes:
and responding to the first input, calling the target function of the second control object to perform control processing on the second target object, wherein the first target object is associated with the second target object.
8. The method according to claim 1, wherein when the interface of the first manipulation object and the object identifier of the second manipulation object are displayed, the method further comprises:
receiving a fourth input of the object identifications of the first function control and the second control object in the interface of the first control object; the first control object and the second control object are the same type of control object;
in response to the fourth input, displaying an interface of a second manipulation object, the interface of the second manipulation object including a second functionality control, the first functionality control being associated with the second functionality control.
9. An object manipulation apparatus, comprising:
the device comprises a receiving module, a first input module and a second input module, wherein the receiving module is used for receiving a first input of a target control object, and the target control object comprises a first control object and a second control object;
and the control module is used for responding to the first input and controlling the sub control objects in the first control object according to the target functions in the second control object, wherein the target functions are functions related to the first control object.
10. An electronic device comprising a processor, a memory, and a program or instructions stored on the memory and executable on the processor, the program or instructions, when executed by the processor, implementing the steps of the object handling method according to any one of claims 1-8.
CN202110939732.7A 2021-08-16 2021-08-16 Object control method and device and electronic equipment Pending CN113672135A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202110939732.7A CN113672135A (en) 2021-08-16 2021-08-16 Object control method and device and electronic equipment
PCT/CN2022/111049 WO2023020328A1 (en) 2021-08-16 2022-08-09 Object manipulating method and apparatus, and electronic device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110939732.7A CN113672135A (en) 2021-08-16 2021-08-16 Object control method and device and electronic equipment

Publications (1)

Publication Number Publication Date
CN113672135A true CN113672135A (en) 2021-11-19

Family

ID=78543158

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110939732.7A Pending CN113672135A (en) 2021-08-16 2021-08-16 Object control method and device and electronic equipment

Country Status (2)

Country Link
CN (1) CN113672135A (en)
WO (1) WO2023020328A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023020328A1 (en) * 2021-08-16 2023-02-23 维沃移动通信(杭州)有限公司 Object manipulating method and apparatus, and electronic device

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106775334A (en) * 2016-11-14 2017-05-31 北京奇虎科技有限公司 File call method, device and mobile terminal on mobile terminal
CN110752981A (en) * 2019-09-29 2020-02-04 维沃移动通信有限公司 Information control method and electronic equipment
CN111062024A (en) * 2019-11-25 2020-04-24 泰康保险集团股份有限公司 Application login method and device
CN111459384A (en) * 2020-03-31 2020-07-28 维沃移动通信有限公司 Dialing control method and electronic equipment
CN111880713A (en) * 2020-07-25 2020-11-03 Oppo广东移动通信有限公司 Object processing method, related device, terminal and computer storage medium
CN112527221A (en) * 2019-09-18 2021-03-19 华为技术有限公司 Data transmission method and related equipment
CN113055525A (en) * 2021-03-30 2021-06-29 维沃移动通信有限公司 File sharing method, device, equipment and storage medium

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113672135A (en) * 2021-08-16 2021-11-19 维沃移动通信(杭州)有限公司 Object control method and device and electronic equipment

Also Published As

Publication number Publication date
WO2023020328A1 (en) 2023-02-23

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination