CN114722010B - Folder processing method, intelligent terminal and storage medium - Google Patents
- Publication number
- CN114722010B (application CN202210653895.3A)
- Authority
- CN
- China
- Prior art keywords
- target
- folder
- application
- type
- identifier
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/10—File systems; File servers
- G06F16/16—File or folder operations, e.g. details of user interfaces specifically adapted to file systems
- G06F16/168—Details of user interfaces specifically adapted to file systems, e.g. browsing and visualisation, 2d or 3d GUIs
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/04817—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, using icons
- G06F3/0482—Interaction with lists of selectable items, e.g. menus
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, using a touch-screen or digitiser, e.g. input of commands through traced gestures
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Data Mining & Analysis (AREA)
- Databases & Information Systems (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
The application provides a folder processing method, an intelligent terminal and a storage medium. The method comprises the following steps: S11: outputting candidate identification information in response to a first operation; S12: determining a target identifier in the candidate identification information; S13: in response to a second operation on the target identifier, adding a target object corresponding to the target identifier to a target folder based on an adding mode corresponding to the second operation. The method can simplify the folder processing flow and thereby improve folder processing efficiency.
Description
Technical Field
The application relates to the technical field of communication, in particular to a folder processing method, an intelligent terminal and a storage medium.
Background
With the increasing number of applications and files on mobile terminals, users need to store and edit files on their devices, and activities such as web browsing, chatting, gaming, and sending and receiving mail are ever more frequent, so the applications and folders on a mobile terminal continuously increase.
In the course of conceiving and implementing the present application, the inventors found at least the following problem: faced with the numerous applications and files in a mobile terminal, current approaches to processing files are inefficient.
The foregoing description is provided for general background information and does not necessarily constitute prior art.
Disclosure of Invention
In view of the above technical problems, the present application provides a folder processing method, an intelligent terminal and a storage medium, which can simplify the folder processing flow, thereby improving the folder processing efficiency.
In order to solve the above technical problem, the present application provides a folder processing method, including the following steps:
s11: outputting candidate identification information in response to a first operation;
s12: determining a target identifier in the candidate identification information;
s13: in response to a second operation on the target identifier, adding the target object corresponding to the target identifier to a target folder based on an adding mode corresponding to the second operation.
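As a minimal illustrative sketch only (the patent does not prescribe an implementation; all names and the operation-to-mode mapping below are assumptions), the S11-S13 flow can be modeled as a dispatch from the detected second operation to an adding mode:

```python
# Hypothetical sketch of the S11-S13 flow; names and the operation->mode
# mapping are illustrative, not taken from the patent.

# Adding modes keyed by the operation type of the second operation (S13).
ADD_MODES = {
    "single_touch": "add_icon_directly",
    "multi_touch": "merge_then_add",
}

def output_candidate_identifiers(first_operation, all_identifiers):
    """S11: output candidate identification information for a first operation."""
    # Here every known identifier is a candidate; a real device would filter.
    return list(all_identifiers)

def process_folder(first_operation, second_operation, all_identifiers,
                   target_index, folder):
    candidates = output_candidate_identifiers(first_operation, all_identifiers)  # S11
    target_id = candidates[target_index]                                         # S12
    mode = ADD_MODES[second_operation["type"]]                                   # S13
    folder.setdefault(mode, []).append(target_id)
    return folder

folder = {}
process_folder({"type": "long_press"}, {"type": "single_touch"},
               ["mail", "chat", "browser"], 1, folder)
# folder now records "chat" under the direct-add mode
```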
Optionally, the S11 includes:
outputting an opening interface of the target folder;
and in response to a first operation on the opening interface, outputting candidate identification information.
Optionally, outputting candidate identification information in response to the first operation on the opening interface includes:
identifying or determining that the first operation on the opening interface is a preset operation, and outputting candidate identification information; and/or
identifying or determining that the operation area of the first operation is a preset area, and outputting candidate identification information.
Optionally, the S13 includes:
in response to a second operation on the target identifier, identifying or determining that the operation type of the second operation is single-point touch;
outputting a target application icon corresponding to the target identifier;
and in response to a third operation on the target application icon, adding the target application icon to the target folder.
Optionally, adding the target application icon to the target folder in response to the third operation on the target application icon includes:
in response to the third operation on the target application icon, detecting the number of application icons of the same type in the target folder;
when the number of application icons of the same type in the target folder is detected to be greater than a preset value, generating a homologous (same-type) folder in the target folder, and adding the target application icon to the homologous folder;
and when the number of application icons of the same type in the target folder is detected to be less than or equal to the preset value, displaying the target application icon at a preset position of the target folder according to the operation position corresponding to the third operation.
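The count check above can be sketched as follows. This is an illustrative model only: the threshold value, type labels, and data shapes are assumptions, not taken from the patent.

```python
# Illustrative sketch of the same-type count check; PRESET_VALUE and the
# dict shapes are hypothetical.
PRESET_VALUE = 3  # hypothetical preset number of same-type icons

def add_with_grouping(folder, new_icon):
    """Add new_icon to folder, grouping same-type icons into a subfolder
    once their count exceeds PRESET_VALUE."""
    same_type = [i for i in folder["icons"] if i["type"] == new_icon["type"]]
    if len(same_type) > PRESET_VALUE:
        # Create (or reuse) a same-type subfolder and move the icons there.
        sub = folder["subfolders"].setdefault(new_icon["type"], [])
        sub.extend(same_type + [new_icon])
        folder["icons"] = [i for i in folder["icons"]
                           if i["type"] != new_icon["type"]]
    else:
        folder["icons"].append(new_icon)
    return folder

folder = {"icons": [{"name": f"game{n}", "type": "game"} for n in range(4)],
          "subfolders": {}}
add_with_grouping(folder, {"name": "game4", "type": "game"})
# 4 existing "game" icons exceed PRESET_VALUE, so all five move to a subfolder
```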
Optionally, the S13 includes:
identifying or determining that the operation type of the second operation is multi-point touch;
merging the target objects corresponding to the target identifiers according to the operation trajectory of the second operation, and outputting a merged folder corresponding to the object merging result;
and adding the merged folder to the target folder.
Optionally, merging the target objects corresponding to the target identifiers according to the operation trajectory of the second operation and outputting a merged folder corresponding to the object merging result includes:
in response to the second operation on the target identifiers, determining the application icon corresponding to each target identifier;
and adding the application icons to the merged folder.
Optionally, merging the target objects according to the operation trajectory of the second operation and outputting a merged folder corresponding to the object merging result includes:
in response to the second operation on the target objects, identifying or determining that the application type of the target objects is a target type;
and adding the application icons whose application type is the target type to the merged folder.
Optionally, after adding the application icons whose application type is the target type to the merged folder, the method further includes:
hiding, in the target folder, the application icons whose application type is the target type.
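The merge-by-type-then-hide behavior above can be sketched as one move operation. Again this is a hypothetical model, not the claimed implementation; the type labels and folder shape are assumptions.

```python
# Hypothetical sketch: move every icon of the target application type into
# a merged folder, hiding it from the target folder's top level.
def merge_by_type(target_folder, target_type):
    """Return target_folder with same-type icons moved into a merged folder."""
    merged = [i for i in target_folder["icons"] if i["type"] == target_type]
    # Hiding step: the merged icons no longer appear among the top-level icons.
    target_folder["icons"] = [i for i in target_folder["icons"]
                              if i["type"] != target_type]
    target_folder["merged"] = merged
    return target_folder

folder = {"icons": [{"name": "chess", "type": "game"},
                    {"name": "mail", "type": "office"},
                    {"name": "racer", "type": "game"}]}
merge_by_type(folder, "game")
```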
Optionally, the S13 includes:
in response to a second operation on the target object, adding an object folder corresponding to the target object to the target folder;
and displaying the object folder in the target folder, and adjusting the object folder from a first control state to a second control state.
The application also provides a folder processing method, which comprises the following steps:
s21: responding to the first operation, and triggering the target object to enter a preset state;
s22: outputting a target menu or a target interface according to the preset state;
s23: in response to a second operation on the target menu, adding the target object to a first folder; or, in response to a third operation, adding the target object to the first folder of the target interface.
Optionally, the S22 includes:
identifying or determining that the state type of the target object is a suspension state, and outputting a target menu at a preset position of the current interface;
and identifying or determining that the state type of the target object is a zooming state, triggering entry into a split-screen mode, displaying the current interface in a first split-screen area, and displaying the target interface in a second split-screen area.
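The S22 branch above (output depends on the target object's preset state) can be sketched as a simple dispatch. The state names and output descriptors are illustrative assumptions, not values fixed by the patent.

```python
# Illustrative dispatch for S22: suspension state -> target menu;
# zooming state -> split-screen with current and target interfaces.
def output_for_state(state_type):
    if state_type == "suspended":
        # Suspension state: show a target menu at a preset position.
        return {"output": "target_menu", "position": "preset"}
    if state_type == "zooming":
        # Zooming state: enter split-screen; current interface in area 1,
        # target interface in area 2.
        return {"output": "split_screen",
                "area1": "current_interface", "area2": "target_interface"}
    raise ValueError(f"unknown state: {state_type}")
```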
Optionally, adding the target object to the first folder in response to the second operation on the target menu includes:
identifying or determining that the operation position of the second operation is a first menu item of the target menu;
identifying or determining that the operation trajectory of the second operation points to the first menu item;
outputting the first folder corresponding to the first menu item;
and outputting an animation of the target object being added to the first folder.
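The menu-item check above amounts to a hit test on the operation position: the add succeeds only when the operation lands on a menu item. A minimal sketch, with hypothetical rectangular menu-item bounds (the patent does not specify any geometry):

```python
# Illustrative hit test for the second operation on the target menu.
def handle_drop(menu_items, drop_position, target_object):
    """Return the folder that receives target_object, or None on a miss."""
    for item in menu_items:
        x0, y0, x1, y1 = item["bounds"]
        if x0 <= drop_position[0] <= x1 and y0 <= drop_position[1] <= y1:
            item["folder"].append(target_object)  # play "added" animation here
            return item["folder"]
    return None                                   # play "failed" feedback here

menu = [{"bounds": (0, 0, 100, 40), "folder": []},
        {"bounds": (0, 40, 100, 80), "folder": []}]
hit = handle_drop(menu, (50, 20), "notes_app")    # lands on the first item
miss = handle_drop(menu, (150, 20), "notes_app")  # outside every item
```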
Optionally, the method further comprises:
in response to a fourth operation on the target menu, identifying or determining a second menu item corresponding to the fourth operation;
and previewing a second folder corresponding to the second menu item.
Optionally, the method further comprises:
identifying or determining that the operation position of the second operation is not the first menu item of the target menu;
and outputting an animation and/or prompt information indicating that the addition failed.
Optionally, the adding the target object to the first folder of the target interface in response to the third operation includes:
in response to a third operation on the target interface, identifying or determining the first folder;
and adding the target object to the first folder of the target interface.
Optionally, the adding the target object to the first folder of the target interface in response to the third operation includes:
in response to a fifth operation on a target control in the target interface, identifying or determining the first folder of the target interface;
and outputting an animation of the target object being added to the first folder of the target interface.
The application also provides an intelligent terminal, including a memory and a processor, wherein the memory stores a folder processing program which, when executed by the processor, implements the steps of the methods described above.
The present application also provides a storage medium storing a computer program which, when executed by a processor, performs the steps of the method as described above.
As described above, the folder processing method of the present application outputs candidate identification information in response to a first operation; determines a target identifier in the candidate identification information; and, in response to a second operation on the target identifier, adds the target object corresponding to the target identifier to the target folder based on the adding mode corresponding to the second operation.
As described above, the folder processing method of the present application may trigger the target object to enter a preset state in response to a first operation, output a target menu or a target interface according to that preset state, and then add the target object to a first folder either in response to a second operation on the target menu or, in response to a third operation, to the first folder of the target interface. By outputting the target menu or target interface according to the state of the target object and adding the object via the second or third operation, this technical scheme simplifies the folder processing flow and thereby improves folder processing efficiency.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present application and together with the description, serve to explain the principles of the application. In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings needed to be used in the description of the embodiments will be briefly described below, and it is obvious for those skilled in the art to obtain other drawings based on these drawings without inventive exercise.
Fig. 1 is a schematic hardware structure diagram of a mobile terminal implementing various embodiments of the present application;
fig. 2 is a diagram illustrating a communication network system architecture according to an embodiment of the present application;
fig. 3 is a schematic flowchart of a first implementation of a folder processing method provided in an embodiment of the present application;
fig. 4 to fig. 9 are schematic interface diagrams of a first implementation manner of a folder processing method according to an embodiment of the present application;
fig. 10 is a schematic flowchart of a second implementation manner of a folder processing method according to an embodiment of the present application;
fig. 11 to fig. 30 are schematic interface diagrams of a second implementation manner of a folder processing method according to an embodiment of the present application;
FIG. 31 is a schematic structural diagram of a first implementation of a folder handling device according to an embodiment of the present application;
fig. 32 is a schematic structural diagram of a second implementation manner of a folder processing device according to an embodiment of the present application.
The implementation, functional features and advantages of the objectives of the present application will be further explained with reference to the accompanying drawings. Specific embodiments of the present application have been shown by way of example in the drawings and will be described in more detail below. These drawings and written description are not intended to limit the scope of the inventive concepts in any manner, but rather to illustrate the inventive concepts to those skilled in the art by reference to specific embodiments.
Detailed Description
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The implementations described in the following exemplary examples do not represent all implementations consistent with the present application. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the application, as detailed in the appended claims.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element. Elements having the same designation may or may not have the same meaning in different embodiments of the application, the particular meaning being determined by its interpretation in the particular embodiment or by further reference to the context of that embodiment.
It should be understood that although the terms first, second, third, etc. may be used herein to describe various information, such information should not be limited by these terms. These terms are only used to distinguish one type of information from another. For example, first information may also be referred to as second information, and similarly, second information may also be referred to as first information, without departing from the scope herein. The word "if," as used herein, may be interpreted as "when" or "upon" or "in response to a determination," depending on the context. Also, as used herein, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context indicates otherwise. It will be further understood that the terms "comprises," "comprising," "includes" and/or "including," when used in this specification, specify the presence of stated features, steps, operations, elements, components, items, species, and/or groups, but do not preclude the presence or addition of one or more other features, steps, operations, elements, components, items, species, and/or groups thereof. As used herein, the terms "or," "and/or," "including at least one of the following," and the like are to be construed as inclusive, meaning any one or any combination. For example, "includes at least one of: A, B, C" means "any of the following: A; B; C; A and B; A and C; B and C; A and B and C"; likewise, "A, B or C" or "A, B and/or C" means "any one of the following: A; B; C; A and B; A and C; B and C; A and B and C". An exception to this definition occurs only when a combination of elements, functions, steps or operations is inherently mutually exclusive in some way.
It should be understood that, although the steps in the flowcharts in the embodiments of the present application are shown in sequence as indicated by the arrows, they are not necessarily performed in that sequence. Unless explicitly stated herein, the order of these steps is not strictly limited, and they may be performed in other orders. Moreover, at least some of the steps in the figures may include multiple sub-steps or multiple stages that are not necessarily performed at the same time but may be performed at different times and in different orders, alternately or in alternation with other steps or with at least part of the sub-steps or stages of other steps.
The word "if" as used herein may be interpreted as "when" or "upon" or "in response to a determination" or "in response to a detection," depending on the context. Similarly, the phrases "if determined" or "if (a stated condition or event) is detected" may be interpreted as "when determined" or "in response to a determination" or "when (a stated condition or event) is detected" or "in response to detection of (a stated condition or event)," depending on the context.
It should be noted that step numbers such as S11 and S12 are used herein for the purpose of more clearly and briefly describing the corresponding contents, and do not constitute a substantial limitation on the sequence, and those skilled in the art may perform S12 first and then S11 in the specific implementation, but these should be within the protection scope of the present application.
It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application.
In the following description, suffixes such as "module", "component", or "unit" used to denote elements are used only for the convenience of description of the present application, and have no specific meaning in themselves. Thus, "module", "component" or "unit" may be used mixedly.
The intelligent terminal may be implemented in various forms. For example, the smart terminal described in the present application may include mobile terminals such as a mobile phone, a tablet computer, a notebook computer, a palmtop computer, a Personal Digital Assistant (PDA), a Portable Media Player (PMP), a navigation device, a wearable device, a smart band, a pedometer, and the like, and fixed terminals such as a Digital TV, a desktop computer, and the like.
The following description will be given taking a mobile terminal as an example, and it will be understood by those skilled in the art that the configuration according to the embodiment of the present application can be applied to a fixed type terminal in addition to elements particularly used for mobile purposes.
Referring to fig. 1, which is a schematic diagram of a hardware structure of a mobile terminal for implementing various embodiments of the present application, the mobile terminal 100 may include: RF (Radio Frequency) unit 101, WiFi module 102, audio output unit 103, A/V (audio/video) input unit 104, sensor 105, display unit 106, user input unit 107, interface unit 108, memory 109, processor 110, and power supply 111. Those skilled in the art will appreciate that the mobile terminal architecture shown in fig. 1 is not intended to be limiting of mobile terminals, and that a mobile terminal may include more or fewer components than shown, combine some components, or arrange components differently.
The following describes each component of the mobile terminal in detail with reference to fig. 1:
the radio frequency unit 101 may be configured to receive and transmit signals during information transmission and reception or during a call, and specifically, receive downlink information of a base station and then process the downlink information to the processor 110; in addition, uplink data is transmitted to the base station. Typically, radio frequency unit 101 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like. In addition, the radio frequency unit 101 can also communicate with a network and other devices through wireless communication. The wireless communication may use any communication standard or protocol, including but not limited to GSM (Global System for Mobile communications), GPRS (General Packet Radio Service), CDMA2000 (Code Division Multiple Access 2000 ), WCDMA (Wideband Code Division Multiple Access), TD-SCDMA (Time Division-Synchronous Code Division Multiple Access), FDD-LTE (Frequency Division duplex-Long Term Evolution), TDD-LTE (Time Division duplex-Long Term Evolution), 5G (Time Division duplex-Long Term Evolution), and the like.
WiFi belongs to short-range wireless transmission technology. Through the WiFi module 102, the mobile terminal can help a user receive and send e-mails, browse webpages, access streaming media, and the like, providing wireless broadband Internet access. Although fig. 1 shows the WiFi module 102, it is understood that it does not belong to the essential constitution of the mobile terminal and may be omitted as needed without changing the essence of the application.
The audio output unit 103 may convert audio data received by the radio frequency unit 101 or the WiFi module 102 or stored in the memory 109 into an audio signal and output as sound when the mobile terminal 100 is in a call signal reception mode, a call mode, a recording mode, a voice recognition mode, a broadcast reception mode, or the like. Also, the audio output unit 103 may also provide audio output related to a specific function performed by the mobile terminal 100 (e.g., a call signal reception sound, a message reception sound, etc.). The audio output unit 103 may include a speaker, a buzzer, and the like.
The A/V input unit 104 is used to receive audio or video signals. The A/V input unit 104 may include a Graphics Processing Unit (GPU) 1041 and a microphone 1042. The graphics processor 1041 processes image data of still pictures or video obtained by an image capturing device (e.g., a camera) in a video capturing mode or an image capturing mode. The processed image frames may be displayed on the display unit 106, stored in the memory 109 (or other storage medium), or transmitted via the radio frequency unit 101 or the WiFi module 102. The microphone 1042 may receive sound (audio data) in a phone call mode, a recording mode, a voice recognition mode, or the like, and can process such sound into audio data. In the phone call mode, the processed audio (voice) data may be converted into a format transmittable to a mobile communication base station via the radio frequency unit 101 for output. The microphone 1042 may implement various types of noise cancellation (or suppression) algorithms to cancel (or suppress) noise or interference generated in the course of receiving and transmitting audio signals.
The mobile terminal 100 also includes at least one sensor 105, such as a light sensor, a motion sensor, and other sensors. Optionally, the light sensor includes an ambient light sensor and a proximity sensor, the ambient light sensor may adjust the brightness of the display panel 1061 according to the brightness of ambient light, and the proximity sensor may turn off the display panel 1061 and/or the backlight when the mobile terminal 100 moves to the ear. As one of the motion sensors, the accelerometer sensor can detect the magnitude of acceleration in each direction (generally three axes), detect the magnitude and direction of gravity when stationary, and can be used for applications of recognizing gestures of a mobile phone (such as horizontal and vertical screen switching, related games, magnetometer gesture calibration), vibration recognition related functions (such as pedometers and taps), and the like; as for other sensors such as a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer, and an infrared sensor, which can be configured on the mobile phone, further description is omitted here.
The display unit 106 is used to display information input by a user or information provided to the user. The Display unit 106 may include a Display panel 1061, and the Display panel 1061 may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), or the like.
The user input unit 107 may be used to receive input numeric or character information and generate key signal inputs related to user settings and function control of the mobile terminal. Alternatively, the user input unit 107 may include a touch panel 1071 and other input devices 1072. The touch panel 1071, also referred to as a touch screen, may collect a touch operation performed by a user on or near the touch panel 1071 (e.g., an operation performed by the user on or near the touch panel 1071 using a finger, a stylus, or any other suitable object or accessory), and drive a corresponding connection device according to a predetermined program. The touch panel 1071 may include two parts of a touch detection device and a touch controller. Optionally, the touch detection device detects a touch orientation of a user, detects a signal caused by a touch operation, and transmits the signal to the touch controller; the touch controller receives touch information from the touch sensing device, converts the touch information into touch point coordinates, sends the touch point coordinates to the processor 110, and can receive and execute commands sent by the processor 110. In addition, the touch panel 1071 may be implemented in various types such as resistive, capacitive, infrared, and surface acoustic wave. The user input unit 107 may include other input devices 1072 in addition to the touch panel 1071. Optionally, other input devices 1072 may include, but are not limited to, one or more of a physical keyboard, function keys (e.g., volume control keys, switch keys, etc.), a trackball, a mouse, a joystick, and the like, and are not limited thereto.
Alternatively, the touch panel 1071 may cover the display panel 1061, and when the touch panel 1071 detects a touch operation thereon or nearby, the touch panel 1071 transmits the touch operation to the processor 110 to determine the type of the touch event, and then the processor 110 provides a corresponding visual output on the display panel 1061 according to the type of the touch event. Although in fig. 1, the touch panel 1071 and the display panel 1061 are two independent components to implement the input and output functions of the mobile terminal, in some embodiments, the touch panel 1071 and the display panel 1061 may be integrated to implement the input and output functions of the mobile terminal, which is not limited herein.
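The touch pipeline described above (touch controller reports coordinates; the processor 110 determines the type of the touch event before producing visual output) can be sketched as a small classifier. The thresholds and event names below are illustrative assumptions, not values given in the specification.

```python
# Hypothetical sketch of touch-event classification by the processor:
# contact count and press duration determine the event type.
LONG_PRESS_MS = 500  # illustrative long-press threshold

def classify_touch(points, duration_ms):
    """Classify a touch by number of contact points and duration."""
    if len(points) > 1:
        return "multi_touch"   # e.g. the multi-point merge gesture
    return "long_press" if duration_ms >= LONG_PRESS_MS else "tap"

events = [classify_touch([(10, 20)], 120),            # quick single contact
          classify_touch([(10, 20)], 650),            # held single contact
          classify_touch([(10, 20), (30, 40)], 200)]  # two contacts
```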
The interface unit 108 serves as an interface through which at least one external device is connected to the mobile terminal 100. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 108 may be used to receive input (e.g., data information, power, etc.) from external devices and transmit the received input to one or more elements within the mobile terminal 100 or may be used to transmit data between the mobile terminal 100 and external devices.
The memory 109 may be used to store software programs as well as various data. The memory 109 may mainly include a program storage area and a data storage area. Optionally, the program storage area may store an operating system, an application program required for at least one function (such as a sound playing function, an image playing function, and the like), and so on; the data storage area may store data created according to the use of the mobile phone (such as audio data, a phonebook, etc.), and the like. Further, the memory 109 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid-state storage device.
The processor 110 is the control center of the mobile terminal; it connects the various parts of the entire mobile terminal using various interfaces and lines, and performs the various functions of the mobile terminal and processes data by running or executing software programs and/or modules stored in the memory 109 and calling data stored in the memory 109, thereby monitoring the mobile terminal as a whole. The processor 110 may include one or more processing units; preferably, the processor 110 may integrate an application processor and a modem processor. Optionally, the application processor mainly handles the operating system, user interfaces, application programs, etc., and the modem processor mainly handles wireless communications. It will be appreciated that the modem processor described above may also not be integrated into the processor 110.
The mobile terminal 100 may further include a power supply 111 (e.g., a battery) for supplying power to the various components. Preferably, the power supply 111 may be logically connected to the processor 110 via a power management system, so as to manage charging, discharging, and power consumption via the power management system.
Although not shown in fig. 1, the mobile terminal 100 may further include a bluetooth module or the like, which is not described in detail herein.
In order to facilitate understanding of the embodiments of the present application, a communication network system on which the mobile terminal of the present application is based is described below.
Referring to fig. 2, fig. 2 is an architecture diagram of a communication network system provided in an embodiment of the present application. The communication network system is an LTE (Long Term Evolution) system of the universal mobile telecommunications technology, and the LTE system includes a UE (User Equipment) 201, an E-UTRAN (Evolved UMTS Terrestrial Radio Access Network) 202, an EPC (Evolved Packet Core) 203, and an IP service 204 of an operator, which are communicatively connected in sequence.
Optionally, the UE201 may be the terminal 100 described above, and is not described herein again.
The E-UTRAN202 includes eNodeB2021 and other eNodeBs 2022, among others. Optionally, the eNodeB2021 may connect with other enodebs 2022 via backhaul (e.g., X2 interface), the eNodeB2021 connects to the EPC203, and the eNodeB2021 may provide the UE201 access to the EPC 203.
The EPC203 may include an MME (Mobility Management Entity) 2031, an HSS (Home Subscriber Server) 2032, other MMEs 2033, an SGW (Serving Gateway) 2034, a PGW (PDN Gateway) 2035, a PCRF (Policy and Charging Rules Function) 2036, and the like. Optionally, the MME2031 is a control node that handles signaling between the UE201 and the EPC203, providing bearer and connection management. The HSS2032 is used to provide registers for managing functions such as a home location register (not shown), and holds user-specific information about service characteristics, data rates, and the like. All user data may be sent through the SGW2034; the PGW2035 may provide IP address assignment for the UE201 and other functions; and the PCRF2036 is the policy and charging control policy decision point for service data flows and IP bearer resources, which selects and provides available policy and charging control decisions for a policy and charging enforcement function (not shown).
The IP services 204 may include the internet, intranets, IMS (IP Multimedia Subsystem), or other IP services, among others.
Although the LTE system is described as an example, it should be understood by those skilled in the art that the present application is not limited to the LTE system, but may also be applied to other wireless communication systems, such as GSM, CDMA2000, WCDMA, TD-SCDMA, and future new network systems (e.g. 5G), and the like.
Based on the above mobile terminal hardware structure and communication network system, various embodiments of the present application are provided.
The user can add a corresponding application to the same folder as desired, so as to facilitate subsequent processing of the folder and/or the application.
Taking adding an application into a folder as an example: in the current scheme, after selecting a target application, the user needs to drag the target application into the target folder. If too many applications are installed on the mobile terminal, the user needs to perform a page-turning operation during the drag, which not only easily causes misoperation, but also may invalidate the operation when a finger touches another application during page turning. The operation is therefore cumbersome, and the efficiency of folder processing is low.
In order to solve the foregoing technical problem, an embodiment of the present application provides a folder processing method: the mobile terminal outputs candidate identification information in response to a first operation, determines a target identifier in the candidate identification information, and, in response to a second operation for the target identifier, adds the target object corresponding to the target identifier to the target folder based on the addition manner corresponding to the second operation.
Optionally, the present application further provides a folder processing method, where the mobile terminal outputs candidate identification information in response to the first operation, and the mobile terminal adds the target object to the target folder in response to the second operation for the candidate identification information.
Hereinafter, the technical means shown in the present application will be described in detail by specific examples. It should be noted that the following embodiments may exist alone or in combination with each other, and description of the same or similar contents is not repeated in different embodiments.
Referring to fig. 3, fig. 3 is a schematic flowchart illustrating a first implementation manner of a folder processing method according to an embodiment of the present disclosure. The folder processing method may specifically include:
S11: in response to the first operation, candidate identification information is output.
Optionally, the first operation may be an operation for a folder, a display interface, and/or the current device. Optionally, the first operation may be at least one of: a long press, a click, a double click, a continuous click, a sliding operation in a preset direction, a drag, an air gesture, fingerprint identification, and voice control. Optionally, the long press refers to a long-press operation on a screen page of the mobile phone by a finger or a touch tool.
Optionally, the double-click refers to a double-click operation performed on a screen page of the mobile phone by a finger or a touch tool.
Optionally, the continuous clicking refers to continuously clicking on the screen page of the mobile phone by a finger or a touch tool, such as continuously clicking 3 times or more.
Optionally, the sliding operation in the preset direction means that the sliding direction is preset by software, and the user slides on the screen page of the mobile phone with a finger or a touch tool according to the preset sliding direction. Optionally, the preset direction may be from top to bottom, from bottom to top, first up-down and then left-right, first left-right and then top-down, a clockwise upper semicircle, a counterclockwise lower semicircle, a clockwise arc, and the like. The embodiment of the present application does not particularly limit the specific form of the preset direction, which can be adjusted according to specific requirements.
Optionally, the air gesture is performed according to a preset gesture within a certain distance from the screen page of the mobile phone by a finger or a stylus. Optionally, the air gesture may be at least one of: drawing a circle in the air, drawing a circular arc in the air, drawing a semicircle in the air, drawing a straight line in the air, drawing a curve in the air, drawing a check mark in the air, drawing a character in the air, and the like.
Optionally, the preset state may be a display state, a control state, or the like; for example, in response to the first operation, the target object is triggered to enter a highlighted state, or, in response to the first operation, the target object is triggered to enter a blurred display state.
Alternatively, the target object may be at least one of: an application icon, a popup window, a prompt information list, a control, and a miniature folder. The size of the miniature folder is smaller than that of the first folder and can be set according to actual requirements, and a detailed description is omitted here.
The candidate identification information includes a plurality of candidate identifications, each candidate identification corresponds to a candidate object, and the candidate object may be at least one of the following objects: application icons, popup windows, prompt information lists, controls and miniature folders.
Optionally, in some embodiments, the step of "outputting candidate identification information in response to the first operation" may specifically include:
outputting an opening interface of the target folder;
and responding to the first operation aiming at the opening interface, and outputting the candidate identification information.
Optionally, the step of "outputting the candidate identification information in response to the first operation for the opening interface" may specifically include:
recognizing or determining that a first operation aiming at an opening interface is a preset operation, and outputting candidate identification information; and/or the presence of a gas in the atmosphere,
and identifying or determining the operation area of the first operation as a preset area, and outputting candidate identification information.
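The two conditions above can be sketched as follows; this is a minimal illustration in Python, and the preset operation, the preset area, and all names here are assumptions for the example rather than values from the patent.

```python
# Candidate identification information is output only when the first
# operation on the opening interface is the preset operation and/or its
# operation area falls inside the preset area (illustrative values).

PRESET_OPERATION = "long_press"
PRESET_AREA = (0, 0, 1080, 400)  # x, y, width, height of the preset region


def in_preset_area(x, y, area=PRESET_AREA):
    """Return True if the touch point lies inside the preset area."""
    ax, ay, w, h = area
    return ax <= x < ax + w and ay <= y < ay + h


def output_candidates(operation, x, y, candidates):
    """Return the candidate identification information, or None when the
    first operation satisfies neither condition."""
    if operation == PRESET_OPERATION or in_preset_area(x, y):
        return list(candidates)  # e.g. app icons, mini folders, controls
    return None
```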
S12: a target identifier is determined in the candidate identification information.
Optionally, in response to a selection operation for a candidate identifier in the candidate identifier information, the selected candidate identifier is determined as the target identifier. Optionally, the selection operation may be at least one operation of clicking, long pressing, continuous clicking, shaking, and voice, and may be specifically selected according to the actual situation.
S13: in response to a second operation for the target identifier, the target object corresponding to the target identifier is added to the target folder based on the addition manner corresponding to the second operation.
Optionally, the second operation may be at least one of the operations described above; for specific embodiments, reference may be made to the related description in the foregoing embodiments, and details are not repeated herein.
Optionally, the addition manner corresponding to the second operation may be preset. The addition manner may be a single-target-object addition manner, a multiple-target-object addition manner, an associated-target-object addition manner, and the like, and may be set according to actual conditions. For example, in response to a click operation for the target identifier, the application icon corresponding to the target identifier is added to the target folder; that is, step S13 may specifically include:
responding to a second operation aiming at the target identification, and identifying or determining that the operation type of the second operation is single-point touch;
outputting a target application icon corresponding to the target identifier;
and responding to a third operation aiming at the target application icon, and adding the target application icon into the target folder.
Optionally, referring to fig. 4, it is recognized or determined that the operation type of the second operation is single-point touch, that is, the second operation is a click operation for the target identifier T, the target application icon S corresponding to the target identifier T is output, and the target application icon is added to the target folder D in response to a third operation for the target application icon S.
Optionally, the second operation may also be a long-press operation, in this case, the number of application icons of the same type in the target folder may be detected in response to the long-press operation, and based on a detection result, a corresponding object addition result is output in the target folder, that is, the step "add the target application icon into the target folder in response to a third operation on the target application icon" may specifically include:
responding to a third operation aiming at the target application icon, and detecting the number of application icons of the same type in the target folder;
when it is detected that the number of application icons of the same type in the target folder is larger than a preset value, generating a homologous folder in the target folder, and adding the target application icon and the application icons of the same type into the homologous folder;
and when it is detected that the number of application icons of the same type in the target folder is smaller than or equal to the preset value, displaying the target application icon at a position in the target folder according to the operation position corresponding to the third operation.
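The branching above can be sketched in a few lines; the dictionary layout, field names, and the default preset value below are illustrative assumptions, not structures defined by the patent.

```python
# If the target folder already holds more same-type icons than the preset
# value, group them (plus the target icon) into a homologous sub-folder;
# otherwise place the target icon at the position of the third operation.

PRESET_VALUE = 2


def add_to_target_folder(folder, icon, position, preset=PRESET_VALUE):
    """Add `icon` to `folder` per the detection result (illustrative)."""
    same_type = [i for i in folder["icons"] if i["type"] == icon["type"]]
    if len(same_type) > preset:
        # Generate the homologous folder holding the same-type icons.
        folder["icons"] = [i for i in folder["icons"]
                           if i["type"] != icon["type"]]
        folder.setdefault("homologous", []).extend(same_type + [icon])
    else:
        # Display the target icon at the operation position.
        folder["icons"].insert(position, icon)
    return folder
```

With three music-type icons already present, a preset value of 2 triggers the homologous folder (the fig. 5 case), while a preset value of 3 keeps the icon in place (the fig. 6 case).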
Optionally, referring to fig. 5, in response to a third operation on the target application icon, the number of application icons of the same type in the target folder is detected. Optionally, if the target application icon S is of a music type, the number of music-type application icons in the target folder is detected. For example, if the application icon A, the application icon B, and the application icon C in the target folder are all music-type application icons and the preset value is 2, a homologous folder Q is generated in the target folder D. The homologous folder Q is a large folder; it should be noted that an application program in the large folder can be opened directly in one step, without first opening the folder and then clicking the application program again, thereby facilitating the subsequent opening of the application programs in the homologous folder Q.
Optionally, referring to fig. 6, in response to a third operation on the target application icon, the number of application icons of the same type in the target folder is detected. Optionally, if the target application icon S is of a music type, the number of music-type application icons in the target folder is detected. For example, if the application icon A, the application icon B, and the application icon C in the target folder are all music-type application icons and the preset value is 3, the target application icon S is displayed in the target folder D according to the operation position corresponding to the third operation.
Optionally, in some embodiments, the second operation may be a multi-point touch operation, and the target objects corresponding to the target identifiers may be merged according to the touch points corresponding to the multi-point touch operation, and the merging result is added to the target folder, that is, step S13 may specifically include:
identifying or determining that the operation type of the second operation is multi-point touch;
merging the target objects corresponding to the target identifications according to the operation track of the second operation, and outputting a merged folder corresponding to the object merging result;
and adding the merged folder into the target folder.
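A minimal sketch of the multi-point merge steps above; how the touch points select identifiers, and the dictionary layout used here, are assumptions for illustration only.

```python
# Merge the objects selected by the multi-point second operation into a
# new folder and add that folder to the target folder. Icons already in
# the target folder are hidden there, per the fig. 7 example.

def merge_into_target(target_folder, selected_icons):
    """Return the target folder after adding the merged folder H."""
    merged = {"name": "H", "icons": list(selected_icons)}
    # Hide duplicates that the target folder already contains.
    target_folder["icons"] = [
        i for i in target_folder["icons"] if i not in selected_icons
    ]
    target_folder.setdefault("folders", []).append(merged)
    return target_folder
```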
Optionally, referring to fig. 7, the operation type of the second operation is identified or determined as multi-point touch, the target application identifier A and the target application identifier B selected by the second operation are determined, the application icon A and the application icon B corresponding to the target identifiers are merged, a merged folder H containing the application icon A and the application icon B is output, and the merged folder H is then added to the target folder. Optionally, when the application icon A or the application icon B is already included in the target folder, the duplicate icon in the target folder is hidden when the merged folder H is added.
Optionally, in some embodiments, the step "merging the target objects corresponding to the target identifiers according to the operation trajectory of the second operation, and outputting the merged folder corresponding to the object merging result" may specifically include:
responding to a second operation aiming at the target identification, and determining an application icon corresponding to each target identification;
and adding the application icon into the merged folder.
Optionally, referring to fig. 8, the operation type of the second operation is identified or determined as multi-point touch, and the target application identifier A and the target application identifier B selected by the second operation are determined. Then, the application type corresponding to the application icon A1 and the application type corresponding to the application icon B1 corresponding to the target identifiers are identified, and finally the application icons of the target type are added to the merged folder H; as shown in the figure, in this example, the application icon A2 and the application icon B2 are added to the merged folder H.
Optionally, in some embodiments, the step "merging the target objects corresponding to the target identifiers according to the operation track of the second operation, and outputting the merged folder corresponding to the object merging result" may specifically include:
responding to a second operation aiming at the target object, and identifying or determining the application type of the target object as a target type;
and adding the application icon with the application type as the target type into the combined folder.
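The type-based variant above reduces to a filter over the candidate icons; the field names below are illustrative assumptions.

```python
# Identify the application type of the target object as the target type,
# then add every application icon of that type to the merged folder.

def merge_by_type(target_type, candidates):
    """Collect the names of icons whose application type matches."""
    return {"name": "H",
            "icons": [c["name"] for c in candidates
                      if c["type"] == target_type]}
```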
Optionally, referring to fig. 9, when the target folder D is in the expanded state, candidate identification information is output in response to the first operation. The candidate identification information may be displayed in a list or through a compass-style layout; for convenience of description, fig. 9 takes the list form as an example. In response to a click operation on the candidate object list S, the candidate object corresponding to the clicked candidate object identifier is added to the target folder D, thereby achieving the purpose of quickly adding the target object to the target folder D and improving the efficiency of folder processing.
As described above, the folder processing method according to the present application may output the candidate identifier information in response to the first operation, determine the target identifier in the candidate identifier information, and add the target object corresponding to the target identifier to the target folder based on the addition manner corresponding to the second operation in response to the second operation for the target identifier. According to the folder processing method, the second operation aiming at the candidate identification information can be received, the target object is added into the target folder, and through the technical scheme, the folder processing flow can be simplified, so that the folder processing efficiency is improved.
Referring to fig. 10, fig. 10 is a schematic flowchart illustrating a second implementation manner of a folder processing method according to an embodiment of the present application. The folder processing method may specifically include:
and S21, responding to the first operation, and triggering the target object to enter a preset state.
Optionally, the first operation may be at least one of the operations described above; for specific embodiments, reference may be made to the related description in the foregoing embodiments, and details are not repeated herein.
Optionally, referring to fig. 11, the target object is a short message popup window T, and the short message popup window T is triggered to enter a highlighted state in response to a shaking operation for the mobile terminal, where an edge of the short message popup window T may be highlighted, and a short message content of the short message popup window T may also be highlighted.
Optionally, referring to fig. 12, the target object is an application icon a, and the application icon a is triggered to enter a floating display state in response to a continuous click operation for the mobile terminal.
Optionally, referring to fig. 13, the target object is an application icon a, and the application icon a is triggered to enter a blurring display state in response to a continuous click operation for the mobile terminal.
Optionally, referring to fig. 14, the target object is an application icon a, and the application icon a is triggered to enter a blurring display state in response to a gesture operation for the mobile terminal.
S22: a target menu or a target interface is output according to the preset state.
Optionally, the preset state is a preset state of the target object, such as a floating state, a zoom state, a blurred display state, and the like. For example, when the state type of the target object is the zoom state, in order to facilitate user operation, the smart device may be triggered to enter a split-screen mode, displaying the current interface in a first split-screen area and the target interface in a second split-screen area; it should be noted that the current interface is the interface before the target object entered the preset state. For another example, in order to improve the efficiency of folder processing, when the state type of the target object is the floating state, a target menu may be output, so as to facilitate the subsequent addition of the target object to the first folder through the target menu. That is, step S22 may specifically include:
identifying or determining that the state type of the target object is a floating state, and outputting a target menu at a preset position of the current interface;
and identifying or determining that the state type of the target object is a zooming state, triggering to enter a split screen mode, displaying the current interface in a first split screen area, and displaying the target interface in a second split screen area.
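The two branches of step S22 amount to a dispatch on the recognized state type; this sketch uses made-up state names and return values purely for illustration.

```python
# A floating target object yields a target menu at a preset position;
# a zoomed target object triggers split-screen mode, with the current
# interface in the first area and the target interface in the second.

def output_for_state(state_type):
    """Map the state type of the target object to the output to produce."""
    if state_type == "floating":
        return {"kind": "target_menu", "position": "preset"}
    if state_type in ("zoom_in", "zoom_out"):
        return {"kind": "split_screen",
                "first_area": "current_interface",
                "second_area": "target_interface"}
    return None  # other state types are handled elsewhere
```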
The zoom state includes a zoom-out state and a zoom-in state. Since the change in the size of the target object in the zoom state may lead to user misoperation, the current interface A is displayed in the first split-screen area and the target interface B is displayed in the second split-screen area; as shown in fig. 15, the target object T is displayed in the current interface, and at least one first folder D may be displayed in the target interface B.
Optionally, referring to fig. 16, recognizing or determining that the state type of the target object T is a floating state, and outputting a target menu S at a preset position of the current interface a, where the target menu S includes at least one first folder D, the preset position may be an upper area, a middle area, or a lower area of the current interface a, and may be adaptively adjusted according to the position of the target object T on the current interface a.
S23: in response to a second operation for the target menu, the target object is added to the first folder; or, in response to a third operation, the target object is added to the first folder of the target interface.
Optionally, the second operation may be at least one of the following operations for the target menu: long press, click, double click, continuous click, sliding operation in a preset direction, dragging, air gesture, fingerprint identification and voice control.
Optionally, the second operation may be at least one of the following operations for the target interface or the smart terminal: long press, click, double click, continuous click, sliding operation in a preset direction, dragging, air gesture, fingerprint identification and voice control.
Optionally, adding the target object to the first folder includes the following steps:
in one mode
In response to the second operation on the target menu, the target object is added to the first folder, that is, optionally, the step "add the target object to the first folder in response to the second operation on the target menu" may specifically include:
outputting a target menu;
and responding to a second operation aiming at the target menu, and adding the target object into the first folder.
Optionally, referring to fig. 17, a target menu L is output at a preset position, where the target menu L includes folder identifiers a of all folders, and in response to a sliding operation for the folder identifiers a, a target icon Z is slid into a folder (i.e., a first folder) corresponding to the sliding operation. Optionally, in some embodiments, each menu item of the target menu corresponds to a folder identification.
Optionally, referring to fig. 18, a target menu L is output at a preset position, where the target menu L includes all the micro folders a corresponding to all the folders, a target icon Z is added to the micro folder a (i.e., the first folder) corresponding to the click operation in response to the click operation for the micro folder a, and a preview of the micro folder a after the target icon is added is displayed.
Optionally, in some embodiments, the step "add the target object to the first folder in response to the second operation on the target menu" may specifically include:
identifying or determining the operation position of the second operation as a first menu item of the target menu;
and adding the target object to a first folder corresponding to the first menu item.
Optionally, operation information of the second operation is extracted, where the operation information includes an operation start point, an operation end point, and an operation trajectory. In some embodiments, only the operation end point of the second operation within the target menu may be identified as the first menu item of the target menu, and finally the target object is added to the first folder corresponding to the first menu item.
When the user adds the target object to the first folder, misoperation may occur; for example, the folder to which the object was added is not the folder the user intended. In this case, the user needs to re-execute the corresponding operation to add the target object to the intended first folder.
Optionally, in some embodiments, the adding the target object to the first folder according to the operation track corresponding to the second operation, that is, the step "adding the target object to the first folder corresponding to the first menu item" may specifically include:
identifying or determining that the operation track of the second operation points to the first menu item;
outputting a first folder corresponding to the first menu item;
and outputting the dynamic effect of the target object added into the first folder.
Alternatively, referring to fig. 19, it is identified or determined that the operation trajectory of the sliding operation points to the first menu item J, then the first folder S corresponding to the first menu item J is displayed, and finally, the animation of the target object added to the first folder is output.
Alternatively, a sliding trend corresponding to the operation trajectory of the sliding operation may be identified, and it may be determined from the sliding trend that the operation trajectory of the sliding operation points to the first menu item J. For example, the sliding direction corresponding to the sliding operation may be determined; based on the sliding direction, an extension line of the operation trajectory is constructed with the trajectory end point as the start point of the extension line and the menu item as its end point; finally, it is determined from the extension line that the operation trajectory points to the first menu item J, and the target object is added to the first folder corresponding to the first menu item J.
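The extension-line idea above can be sketched geometrically: take the sliding direction from the last two trajectory points, march outward from the trajectory end point, and report the first menu item whose bounding box the extension line enters. The ray-marching step and all names here are assumptions for illustration, not the patent's algorithm.

```python
# Determine which menu item the sliding trajectory points to by
# extending the trajectory along its final direction.

def pointed_menu_item(trajectory, menu_items, steps=100, step_len=5.0):
    """trajectory: list of (x, y) points; menu_items: name -> (x, y, w, h)."""
    (x0, y0), (x1, y1) = trajectory[-2], trajectory[-1]
    dx, dy = x1 - x0, y1 - y0
    norm = (dx * dx + dy * dy) ** 0.5
    if norm == 0:
        return None  # no sliding trend to extend
    dx, dy = dx / norm, dy / norm
    x, y = x1, y1
    for _ in range(steps):  # march along the extension line
        x, y = x + dx * step_len, y + dy * step_len
        for name, (ax, ay, w, h) in menu_items.items():
            if ax <= x <= ax + w and ay <= y <= ay + h:
                return name
    return None
```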
Alternatively, the manner of outputting the first folder may be various, for example, referring to fig. 20, a thumbnail of the first folder is output. For another example, the first folder D corresponding to the first menu item J is dynamically output. As another example, the first folder D is highlighted.
Optionally, in some embodiments, the step "outputting the first folder corresponding to the first menu item" may specifically include:
acquiring a first folder corresponding to a first menu item;
and highlighting the first folder in a preset area.
Optionally, in some embodiments, when a folder contains many applications, the user adding the target object to the first folder may not know which applications the folder contains, which makes it inconvenient to find the target object after it has been added. Therefore, the folder processing method provided by the present application may further include:
responding to a fourth operation aiming at the target menu, and identifying or determining a second menu item corresponding to the fourth operation;
and previewing a second folder corresponding to the second menu item.
Alternatively, referring to fig. 21, in response to a long-press operation on a menu item in the target menu L, a second menu item K corresponding to the long-press operation is identified or determined, and then a preview image of the second folder is output on the second menu item K.
Alternatively, referring to fig. 22, in response to a long-press operation for a menu item in the target menu L, the second menu item K corresponding to the long-press operation is identified or determined, and then a preview image of the second folder is output at a preset position.
Alternatively, referring to fig. 23, in response to a long-press operation on a menu item in the target menu L, the second menu item K corresponding to the long-press operation is identified or determined, and a mini folder of the second folder is then output at a preset position. After the mini folder is output, the user may adjust the layout within the mini folder.
Optionally, in some embodiments, the folder processing method provided by the present application may further include:
identifying or determining that the operation position of the second operation is not the first menu item of the target menu;
and outputting the dynamic effect and/or prompt information of the failed addition.
Optionally, referring to fig. 24, the operation position of the sliding operation is identified or determined to be an operation point a, and the operation point a does not correspond to any menu item in the target menu L. In this case, a dynamic effect indicating the failed addition is output. The dynamic effect may be an explosion effect, as shown in fig. 24; of course, another dynamic effect may also be used, which may be set according to the actual situation.
Optionally, referring to fig. 25, the operation position of the sliding operation is identified or determined to be the operation point a, and the operation point a does not correspond to any menu item in the target menu L. In this case, a prompt message indicating that the addition failed is output, for example, "failed to add icon".
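A minimal sketch of the drop handling above — add to the folder of the menu item under the operation point, or emit failure feedback when no item is hit. The names and the rectangle-based hit test are illustrative assumptions:

```python
def handle_drop(drop_point, menu_items, target_object):
    """Dispatch a drop: add to the hit menu item's folder, or report failure.

    menu_items: dict mapping item name -> (x, y, w, h) bounding rectangle.
    Returns a small result dict describing the outcome.
    """
    x, y = drop_point
    for name, (ix, iy, w, h) in menu_items.items():
        if ix <= x <= ix + w and iy <= y <= iy + h:
            return {"status": "added", "folder": name, "object": target_object}
    # No menu item under the drop point: output failure feedback
    # (e.g. an explosion animation and/or a "failed to add icon" prompt).
    return {"status": "failed", "prompt": "failed to add icon"}
```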
Alternatively, for a mobile terminal with a large number of installed applications, the target menu may contain a large number of folders, and on some mobile terminals a single page of the target menu may not be able to display all of the folders. Therefore, in some embodiments of the application, the folders contained in the target menu are displayed page by page in response to the fourth operation on the target menu.
Optionally, referring to fig. 26, a page turning control Q is disposed in the target menu L, and the target menu L is displayed by turning pages in response to a page turning operation performed on the page turning control Q. Optionally, the page turning control Q may be a floating control or a scroll bar control, and may be specifically set according to an actual situation.
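The page-turning display of folders can be sketched as simple pagination; the function names and the six-folders-per-page count are assumptions for illustration:

```python
def folder_page(folders, page, per_page=6):
    """Return the folders shown on a given page of the target menu."""
    start = page * per_page
    return folders[start:start + per_page]

def page_count(folders, per_page=6):
    # Number of pages needed to show all folders (ceiling division).
    return max(1, -(-len(folders) // per_page))
```

A page-turning control such as Q would simply increment or decrement `page` within `[0, page_count - 1]`.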
Mode two
Optionally, the target object is added to the corresponding first folder by performing a corresponding operation on the target object.
Alternatively, referring to fig. 27, in response to a drag operation for the application icon a, the application icon a is added to the first folder D according to a drag end point of the drag operation. Optionally, the application icon a may also be added to the first folder D through a sliding operation, which is not limited herein.
Optionally, the step "responding to the second operation, and adding the target object to the first folder of the target menu" may specifically include:
identifying or determining the operation position of the second operation as a first folder of the target menu;
and adding the target object into the first folder.
Optionally, referring to fig. 28, in response to a continuous click operation on the application icon a, a target interface S containing a plurality of first folders D is output. The target interface may be displayed separately from the target menu: for example, the target menu is displayed in the first window C1 and the target interface S is displayed in the second window C2, as shown in fig. 28; as another example, the target interface S is displayed in a floating window X, as shown in fig. 29. In response to a drag operation (i.e., a fifth operation) on the application icon, the application icon a is added to the first folder D, as shown in figs. 28 and 29.
Optionally, in some embodiments, the target object may be further displayed in a floating manner, and a target interface including the target object after floating is output, that is, the step "responding to the second operation on the target object and outputting the target interface" may specifically include:
responding to a second operation aiming at the target object, and displaying the target object in a floating mode;
and outputting a target interface containing the levitated target object.
Optionally, referring to fig. 30, in response to a click operation on an application icon a, the application icon a is displayed in a floating manner, and a target interface S containing the floated application icon a is displayed. A first folder D is then displayed in the target interface S. In response to an addition operation (i.e., a fifth operation) on the floated application icon a, the floated application icon a is added to the first folder D. The addition operation may be a sliding operation, a click operation, and/or a pressing operation, and may be set according to the actual situation, which is not described herein again.
Displaying the target object in a floating manner lets the user perceive which object has been selected for subsequent addition to the first folder, avoiding misoperation and improving folder processing efficiency.
Optionally, after the target object is added to the first folder, a dynamic effect of the target object being added to the first folder of the target interface may be output, so that the user can perceive that the selected target object has been added to the corresponding first folder, further improving the interactivity of the folder processing scheme.
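The floating-selection flow of fig. 30 — float the object on the second operation, then add it to the first folder on the fifth operation and play a feedback animation — may be sketched as follows; the state dictionary and function names are hypothetical:

```python
def on_second_operation(state, target_object):
    # Display the target object in a floating manner and show the target interface.
    state["floating"] = target_object
    state["interface_shown"] = True
    return state

def on_fifth_operation(state, first_folder):
    # Add the floated object to the first folder and play an "added" animation.
    obj = state.pop("floating", None)
    if obj is None:
        return state  # nothing is floated; ignore the operation
    first_folder.append(obj)
    state["animation"] = "added"
    return state
```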
As described above, the folder processing method of the present application may trigger the target object to enter the preset state in response to the first operation, output the target menu or the target interface according to the preset state, add the target object to the first folder in response to the second operation on the target menu, or add the target object to the first folder on the target interface in response to the third operation.
Correspondingly, the present application further provides a folder processing apparatus, please refer to fig. 31, where fig. 31 is a schematic structural diagram of a first implementation manner of the folder processing apparatus provided in the embodiment of the present application, the folder processing apparatus 30 may be integrated in an intelligent terminal, and the folder processing apparatus 30 may include an output module 301, a determination module 302, and an adding module 303.
The output module 301 is configured to output the candidate identification information in response to the first operation.
Optionally, the first operation may be at least one of: long press, click, double click, continuous click, sliding operation in a preset direction, dragging, air gesture, fingerprint identification and voice control.
A determining module 302, configured to determine a target identifier in the candidate identifier information.
Optionally, in response to a selection operation for a candidate identifier in the candidate identifier information, the selected candidate identifier is determined as the target identifier.
An adding module 303, configured to, in response to a second operation for the target identifier, add the target object corresponding to the target identifier to the target folder based on an adding manner corresponding to the second operation.
As described above, the folder processing apparatus of the present application may output the candidate identification information in response to the first operation, determine the target identifier in the candidate identification information, and, in response to a second operation for the target identifier, add the target object corresponding to the target identifier to the target folder based on the addition manner corresponding to the second operation. The apparatus can thus receive the second operation for the candidate identification information and add the target object to the target folder, which simplifies the folder processing flow and thereby improves folder processing efficiency.
Referring to fig. 32, fig. 32 is a schematic structural diagram of a second implementation manner of the folder processing device provided in the embodiment of the present application, where the folder processing device 40 may be integrated in an intelligent terminal, and the folder processing device 40 may include a triggering module 401, an output module 402, and an adding module 403.
The triggering module 401 may be configured to trigger the target object to enter a preset state in response to the first operation.
An output module 402, configured to output a target menu or a target interface according to a preset state.
Optionally, the preset state is a preset state of the target object, such as a floating state, an enlarged state, or a blurred display state. For example, when the state type of the target object is the enlarged state, to facilitate user operation, the smart device may be triggered to enter a split-screen mode, displaying the current interface in the first split-screen area and the target interface in the second split-screen area; it should be noted that the current interface is the interface before the target object entered the preset state. For another example, to improve folder processing efficiency, when the state type of the target object is the enlarged state, a target menu may be output, so that the target object can subsequently be added to the first folder through the target menu.
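A sketch of choosing the output according to the state type follows. The text above maps the enlarged state to either a split screen or a target menu, so the mapping below is an arbitrary illustrative assignment, not the disclosed behavior:

```python
def output_for_state(state_type):
    """Decide what to output for a target object's preset state type (sketch only)."""
    if state_type == "enlarged":
        # Enter split screen: current interface in the first area,
        # target interface in the second area.
        return {"mode": "split_screen",
                "first_area": "current_interface",
                "second_area": "target_interface"}
    if state_type in ("floating", "blurred"):
        # Output the target menu so the object can be added to a first folder.
        return {"mode": "target_menu"}
    return {"mode": "none"}
```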
The adding module 403 may be configured to add the target object to the first folder in response to the second operation on the target menu, or add the target object to the first folder of the target interface in response to the third operation.
Optionally, the adding module 403 adds the target object to the corresponding first folder; for details, refer to the foregoing embodiments.
As described above, the folder processing apparatus of the present application may trigger the target object to enter the preset state in response to the first operation, output the target menu or the target interface according to the preset state, add the target object to the first folder in response to the second operation on the target menu, or add the target object to the first folder on the target interface in response to the third operation.
The present application further provides an intelligent terminal. The intelligent terminal comprises a memory and a processor, wherein the memory stores a folder processing program, and the folder processing program, when executed by the processor, implements the steps of the folder processing method in any of the above embodiments.
The present application further provides a storage medium, in which a folder processing program is stored, and when being executed by a processor, the folder processing program implements the steps of the folder processing method in any of the above embodiments.
The embodiments of the intelligent terminal and the storage medium provided in the present application may include all the technical features of any of the embodiments of the folder processing method. Their expanded descriptions are substantially the same as those of the method embodiments and are not repeated here.
Embodiments of the present application also provide a computer program product, which includes computer program code, when the computer program code runs on a computer, the computer is caused to execute the method in the above various possible embodiments.
Embodiments of the present application further provide a chip, which includes a memory and a processor, where the memory is used to store a computer program, and the processor is used to call and run the computer program from the memory, so that a device in which the chip is installed executes the method in the above various possible embodiments.
It is to be understood that the foregoing scenarios are only examples, and do not constitute a limitation on application scenarios of the technical solutions provided in the embodiments of the present application, and the technical solutions of the present application may also be applied to other scenarios. For example, as a person having ordinary skill in the art can know, with the evolution of the system architecture and the emergence of new service scenarios, the technical solutions provided in the embodiments of the present application are also applicable to similar technical problems.
The above-mentioned serial numbers of the embodiments of the present application are merely for description and do not represent the merits of the embodiments.
The steps in the method of the embodiment of the application can be sequentially adjusted, combined and deleted according to actual needs.
The units in the device in the embodiment of the application can be merged, divided and deleted according to actual needs.
In the present application, the same or similar term concepts, technical solutions, and/or application scenario descriptions are generally described in detail only at their first occurrence. For brevity, they are not described in detail again when repeated later; for any term concept, technical solution, and/or application scenario description not detailed later, reference may be made to the earlier related detailed description.
In the present application, each embodiment is described with emphasis, and reference may be made to the description of other embodiments for parts that are not described or illustrated in any embodiment.
The technical features of the technical solutions of the present application may be combined arbitrarily. For brevity of description, not all possible combinations of the technical features in the embodiments are described; however, as long as there is no contradiction between the combinations of these technical features, such combinations should be considered as falling within the scope described in the present application.
Through the description of the foregoing embodiments, it is clear to those skilled in the art that the methods of the foregoing embodiments may be implemented by software plus a necessary general hardware platform, and certainly may also be implemented by hardware, but in many cases the former is the better implementation. Based on such understanding, the technical solutions of the present application, or the portions contributing to the prior art, may be embodied in the form of a software product. The computer software product is stored in a storage medium (such as a ROM/RAM, a magnetic disk, or an optical disk) as described above, and includes several instructions to enable a terminal device (which may be a mobile phone, a computer, a server, a controlled terminal, or a network device) to execute the method of each embodiment of the present application.
In the above embodiments, all or part of the implementation may be realized by software, hardware, firmware, or any combination thereof. When implemented in software, it may be implemented in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. The procedures or functions according to the embodiments of the present application are generated in whole or in part when the computer program instructions are loaded and executed on a computer. The computer may be a general purpose computer, a special purpose computer, a network of computers, or another programmable device. The computer instructions may be stored on a computer storage medium or transmitted from one computer storage medium to another, for example, from one website, computer, server, or data center to another website, computer, server, or data center via wired (e.g., coaxial cable, optical fiber, digital subscriber line) or wireless (e.g., infrared, radio, microwave) means. A computer storage medium may be any available medium that can be accessed by a computer, or a data storage device, such as a server or a data center, that incorporates one or more available media. The usable medium may be a magnetic medium (e.g., floppy disk, storage disk, magnetic tape), an optical medium (e.g., DVD), or a semiconductor medium (e.g., Solid State Disk (SSD)), among others.
The above description is only a preferred embodiment of the present application, and not intended to limit the scope of the present application, and all modifications of equivalent structures and equivalent processes, which are made by the contents of the specification and the drawings of the present application, or which are directly or indirectly applied to other related technical fields, are included in the scope of the present application.
Claims (7)
1. A folder processing method is characterized by comprising the following steps:
s11: responding to the first operation, and outputting a candidate object identification list;
s12: determining a target identifier in the candidate object identifier list;
s13: identifying or determining an operation type of a second operation for the target identification; adding the target object corresponding to the target identification into a merged folder or a homologous folder based on the operation type;
wherein, the adding the target object corresponding to the target identifier to the merged folder or the homologous folder based on the operation type includes:
identifying or determining that the operation type of the second operation aiming at the target identifier is single-point touch, and displaying a target application icon corresponding to the target identifier in the candidate object identifier list; responding to a third operation aiming at the target application icon, and detecting the number of application icons of the same type in the target folder; when the number of the application icons of the same type in the target folder is detected to be larger than a preset value, generating a homologous folder in the target folder, adding the application icons of the same type into the homologous folder, and when the number of the application icons of the same type in the target folder is detected to be smaller than or equal to the preset value, displaying the target application icon in the preset position of the target folder according to an operation position corresponding to the third operation, wherein the homologous folder is a large folder, and the large folder is a folder through which an application program is opened in one step;
identifying or determining that the operation type of the second operation aiming at the target identification is multi-point touch; determining a target application identifier selected by the second operation, merging application icons corresponding to the target application identifier, and outputting a merged folder containing the application icons corresponding to the target application identifier; and adding the merged folder into the target folder.
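For illustration only (not part of the claims), the same-type counting logic recited in claim 1 — when the number of same-type application icons in the target folder exceeds a preset value, gather them into a homologous folder — may be sketched as follows; the data layout and the preset value of 3 are assumptions:

```python
def add_icon(target_folder, icon, icon_type, preset_value=3):
    """Add icon to target_folder; if the count of same-type icons exceeds
    preset_value, move them all into a homologous sub-folder keyed by type.

    target_folder: dict with 'icons' (list of (name, type) pairs)
    and 'subfolders' (dict type -> list of (name, type) pairs).
    """
    target_folder["icons"].append((icon, icon_type))
    same = [i for i in target_folder["icons"] if i[1] == icon_type]
    if len(same) > preset_value:
        # Generate (or reuse) the homologous folder and move same-type icons into it.
        sub = target_folder["subfolders"].setdefault(icon_type, [])
        sub.extend(same)
        target_folder["icons"] = [i for i in target_folder["icons"] if i[1] != icon_type]
    return target_folder
```

When the count stays at or below the preset value, the icon simply remains at its position in the target folder, as the claim recites.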
2. The method according to claim 1, wherein the S11 comprises:
outputting an opening interface of the target folder;
and responding to the first operation aiming at the opening interface, and outputting candidate identification information.
3. The method of claim 2, wherein outputting candidate identification information in response to the first operation on the open interface comprises:
recognizing or determining that a first operation aiming at the opening interface is a preset operation, and outputting candidate identification information; and/or the presence of a gas in the gas,
and identifying or determining the operation area of the first operation as a preset area, and outputting candidate identification information.
4. The method according to claim 1, wherein the S13 comprises:
responding to a second operation aiming at the target object, and identifying or determining the application type of the target object as a target type;
and adding the application icon with the application type as the target type into the merged folder.
5. The method of claim 4, wherein after adding the application icon with the application type of the target type into the merged folder, further comprising:
and hiding the application icon with the application type being the target type in the target folder.
6. An intelligent terminal, characterized in that the intelligent terminal comprises: a memory and a processor, wherein the memory has stored thereon a folder processing program which, when executed by the processor, implements the steps of the folder processing method as claimed in any one of claims 1 to 5.
7. A computer-readable storage medium, characterized in that the storage medium has stored thereon a computer program which, when being executed by a processor, carries out the steps of the folder processing method as claimed in any one of claims 1 to 5.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210653895.3A CN114722010B (en) | 2022-06-10 | 2022-06-10 | Folder processing method, intelligent terminal and storage medium |
Publications (2)
Publication Number | Publication Date |
---|---|
CN114722010A CN114722010A (en) | 2022-07-08 |
CN114722010B true CN114722010B (en) | 2022-11-29 |
Family
ID=82232334
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202210653895.3A Active CN114722010B (en) | 2022-06-10 | 2022-06-10 | Folder processing method, intelligent terminal and storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN114722010B (en) |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106250109A (en) * | 2016-07-18 | 2016-12-21 | 乐视控股(北京)有限公司 | A kind of multipad display packing, device and mobile terminal |
CN109032720A (en) * | 2018-06-27 | 2018-12-18 | 奇酷互联网络科技(深圳)有限公司 | Folder icon display methods, system, readable storage medium storing program for executing and terminal |
CN110032307A (en) * | 2019-02-26 | 2019-07-19 | 华为技术有限公司 | A kind of moving method and electronic equipment of application icon |
CN113126838A (en) * | 2021-03-15 | 2021-07-16 | 维沃移动通信有限公司 | Application icon sorting method and device and electronic equipment |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20120012541A (en) * | 2010-08-02 | 2012-02-10 | 삼성전자주식회사 | Method and apparatus for operating folder in a touch device |
CN113835580A (en) * | 2021-09-26 | 2021-12-24 | 维沃移动通信有限公司 | Application icon display method and device, electronic equipment and storage medium |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN107241494A (en) | A kind of quick inspection method of data content, mobile terminal and storage medium | |
CN114625707A (en) | Processing method, intelligent terminal and storage medium | |
CN113407081A (en) | Display method, mobile terminal and storage medium | |
CN112558826A (en) | Shortcut operation method, mobile terminal and storage medium | |
CN114860674B (en) | File processing method, intelligent terminal and storage medium | |
CN107506103A (en) | A kind of terminal control method, terminal and computer-readable recording medium | |
CN114722010B (en) | Folder processing method, intelligent terminal and storage medium | |
CN114741359A (en) | Processing method, intelligent terminal and storage medium | |
CN115494997A (en) | Information reading method, intelligent terminal and storage medium | |
CN115914719A (en) | Screen projection display method, intelligent terminal and storage medium | |
CN114138144A (en) | Control method, intelligent terminal and storage medium | |
CN114443199A (en) | Interface processing method, intelligent terminal and storage medium | |
CN114741361A (en) | Processing method, intelligent terminal and storage medium | |
CN113867765A (en) | Application management method, intelligent terminal and storage medium | |
CN113867588A (en) | Icon processing method, intelligent terminal and storage medium | |
CN113342244A (en) | Interface display method, mobile terminal and storage medium | |
CN113342246A (en) | Operation method, mobile terminal and storage medium | |
WO2023092343A1 (en) | Icon area management method, intelligent terminal and storage medium | |
CN115718580A (en) | File opening method, intelligent terminal and storage medium | |
CN115617229A (en) | Application classification method, mobile terminal and storage medium | |
CN117008767A (en) | Control method, intelligent terminal and storage medium | |
CN114741362A (en) | Processing method, intelligent terminal and storage medium | |
CN114995730A (en) | Information display method, mobile terminal and storage medium | |
CN114722009A (en) | Folder processing method, intelligent terminal and storage medium | |
CN114327219A (en) | Application control method, intelligent terminal and storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||