
US20160062557A1 - Method of processing content and electronic device thereof - Google Patents


Info

Publication number
US20160062557A1
Authority
US
United States
Prior art keywords
content
electronic device
processor
display
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US14/843,387
Inventor
Kyung-Hwan Kim
Dong-Jeon KIM
Jin-Hong JEONG
Hye-Soon Jeong
Se-Jun Song
Yo-Han Lee
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Samsung Electronics Co Ltd
Original Assignee
Samsung Electronics Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Samsung Electronics Co Ltd filed Critical Samsung Electronics Co Ltd
Assigned to SAMSUNG ELECTRONICS CO., LTD. reassignment SAMSUNG ELECTRONICS CO., LTD. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: Jeong, Hye-Soon, JEONG, JIN-HONG, KIM, KYUNG-HWAN, LEE, YO-HAN, Song, Se-Jun, KIM, DONG-JEON
Publication of US20160062557A1
Priority to US17/097,587, now US11847292B2
Priority to US18/540,251, published as US20240118781A1
Legal status: Abandoned

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 — Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 — Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 — Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 — Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G — PHYSICS
    • G02 — OPTICS
    • G02F — OPTICAL DEVICES OR ARRANGEMENTS FOR THE CONTROL OF LIGHT BY MODIFICATION OF THE OPTICAL PROPERTIES OF THE MEDIA OF THE ELEMENTS INVOLVED THEREIN; NON-LINEAR OPTICS; FREQUENCY-CHANGING OF LIGHT; OPTICAL LOGIC ELEMENTS; OPTICAL ANALOGUE/DIGITAL CONVERTERS
    • G02F 1/00 — Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics
    • G02F 1/01 — Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics for the control of the intensity, phase, polarisation or colour
    • G02F 1/13 — Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics for the control of the intensity, phase, polarisation or colour based on liquid crystals, e.g. single liquid crystal display cells
    • G02F 1/133 — Constructional arrangements; Operation of liquid crystal cells; Circuit arrangements
    • G02F 1/1333 — Constructional arrangements; Manufacturing methods
    • G02F 1/13338 — Input devices, e.g. touch panels
    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 — Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 — Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 — Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 — Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/0482 — Interaction with lists of selectable items, e.g. menus
    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 — Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 — Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 — Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484 — Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/04842 — Selection of displayed objects or displayed text elements
    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F 9/00 — Arrangements for program control, e.g. control units
    • G06F 9/06 — Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/46 — Multiprogramming arrangements
    • G06F 9/54 — Interprogram communication
    • G06F 9/543 — User-generated data transfer, e.g. clipboards, dynamic data exchange [DDE], object linking and embedding [OLE]
    • H — ELECTRICITY
    • H04 — ELECTRIC COMMUNICATION TECHNIQUE
    • H04L — TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 67/00 — Network arrangements or protocols for supporting network services or applications
    • H04L 67/01 — Protocols
    • H04L 67/06 — Protocols specially adapted for file transfer, e.g. file transfer protocol [FTP]
    • H — ELECTRICITY
    • H10 — SEMICONDUCTOR DEVICES; ELECTRIC SOLID-STATE DEVICES NOT OTHERWISE PROVIDED FOR
    • H10K — ORGANIC ELECTRIC SOLID-STATE DEVICES
    • H10K 59/00 — Integrated devices, or assemblies of multiple devices, comprising at least one organic light-emitting element covered by group H10K50/00
    • H10K 59/40 — OLEDs integrated with touch screens

Definitions

  • the present disclosure relates to an electronic device for processing content.
  • TV: television
  • DMB: digital multimedia broadcasting
  • DVD: digital video disk
  • MPEG-1 or MPEG-2: Moving Picture Experts Group phase 1 or phase 2
  • MP3: MPEG-1 Audio Layer 3
  • an aspect of the present disclosure is to provide a method and an electronic device for allowing the user to easily use the memo function or the scrap function.
  • in the related art, a user must execute many steps to perform a memo function or a scrap function, and thus cannot easily use the memo function or the scrap function.
  • a method for processing a content at an electronic device includes generating a first content corresponding to a user input applied to content displayed via the electronic device, and displaying a floating user interface (UI), which displays first scrap information on the first content, on a screen of the electronic device.
  • an electronic device for processing a content includes a display and a processor configured to generate a first content corresponding to a user input applied to content displayed via the display, and control the display to display a floating UI, which displays first scrap information on the first content, on a screen.
  • a machine-readable storage medium recording a program for executing a method of processing a content by an electronic device.
  • the method includes generating a first content corresponding to a user input applied to content displayed via the electronic device, and displaying a floating UI, which displays first scrap information on the first content, on a screen of the electronic device.
  • the present disclosure can provide a method and an electronic device for allowing the user to easily use a memo function or a scrap function
  • FIG. 1 illustrates a network environment including an electronic device according to various embodiments of the present disclosure
  • FIG. 2 is a flowchart of a method of processing contents by an electronic device according to various embodiments of the present disclosure
  • FIG. 3 is a flowchart of another method of processing the contents by the electronic device according to various embodiments of the present disclosure
  • FIGS. 4A, 4B, 4C, 4D, and 4E illustrate an example of processing contents by the electronic device according to various embodiments of the present disclosure
  • FIGS. 5A, 5B, 5C, and 5D illustrate an example of processing contents by the electronic device according to various embodiments of the present disclosure
  • FIGS. 6A, 6B, 6C, and 6D illustrate an example of selecting contents according to a user input by the electronic device according to various embodiments of the present disclosure
  • FIGS. 7A, 7B, 7C, and 7D illustrate another example of processing contents by the electronic device according to various embodiments of the present disclosure
  • FIGS. 8A and 8B illustrate another example of processing contents by the electronic device according to various embodiments of the present disclosure
  • FIG. 9 illustrates another example of processing contents by the electronic device according to various embodiments of the present disclosure.
  • FIGS. 10A, 10B, 10C, 10D, 10E, 10F, 10G, and 10H illustrate another example of processing contents by the electronic device according to various embodiments of the present disclosure
  • FIGS. 11A, 11B, 11C, and 11D illustrate another example of processing contents by the electronic device according to various embodiments of the present disclosure
  • FIGS. 12A and 12B illustrate another example of processing contents by the electronic device according to various embodiments of the present disclosure
  • FIGS. 13A, 13B, and 13C illustrate another example of processing contents by the electronic device according to various embodiments of the present disclosure
  • FIGS. 14A and 14B illustrate another example of processing contents by the electronic device according to various embodiments of the present disclosure
  • FIGS. 15A, 15B, 15C, and 15D illustrate another example of processing contents by the electronic device according to various embodiments of the present disclosure
  • FIG. 16 illustrates another example of processing contents by the electronic device according to various embodiments of the present disclosure
  • FIG. 17 illustrates another example of processing contents by the electronic device according to various embodiments of the present disclosure
  • FIGS. 18A and 18B illustrate another example of processing contents by the electronic device according to various embodiments of the present disclosure
  • FIGS. 19A, 19B, and 19C illustrate another example of processing contents by the electronic device according to various embodiments of the present disclosure
  • FIG. 20 is a block diagram of an electronic device according to various embodiments of the present disclosure.
  • FIG. 21 is a block diagram of a program module according to various embodiments of the present disclosure.
  • the expression “have”, “may have”, “include” or “may include” refers to existence of a corresponding feature (e.g., numerical value, function, operation, or components such as elements), and does not exclude existence of additional features.
  • the expression “A or B”, “at least one of A or/and B”, or “one or more of A or/and B” may include all possible combinations of the items listed.
  • the expression “A or B”, “at least one of A and B”, or “at least one of A or B” refers to all of (1) including at least one A, (2) including at least one B, or (3) including all of at least one A and at least one B.
  • the terms “a first”, “a second”, “the first”, or “the second” used in various embodiments of the present disclosure may modify various components regardless of the order and/or the importance but do not limit the corresponding components.
  • the expressions may be used to distinguish a component element from another component element.
  • a first user device and a second user device indicate different user devices although both of them are user devices.
  • a first element may be termed a second element, and similarly, a second element may be termed a first element without departing from the scope of the present disclosure.
  • when one element (e.g., a first element) is referred to as being connected to another element (e.g., a second element), the one element is directly connected to the other element or is indirectly connected to it via yet another element (e.g., a third element).
  • when one element is referred to as being directly connected to another element, a third component element does not exist between the first component element and the second component element.
  • the expression “configured to” used in the present disclosure may be exchanged with, for example, “suitable for”, “having the capacity to”, “designed to”, “adapted to”, “made to”, or “capable of”, according to the situation.
  • the term “configured to” may not necessarily imply “specifically designed to” in hardware.
  • the expression “device configured to” may mean that the device, together with other devices or components, “is able to”.
  • the phrase “processor adapted (or configured) to perform A, B, and C” may mean a dedicated processor (e.g. embedded processor) only for performing the corresponding operations or a generic-purpose processor (e.g., central processing unit (CPU) or application processor (AP)) that can perform the corresponding operations by executing one or more software programs stored in a memory device.
  • the electronic device may include at least one of a smartphone, a tablet personal computer (PC), a mobile phone, a video phone, an electronic book (e-book) reader, a desktop PC, a laptop PC, a netbook computer, a personal digital assistant (PDA), a portable multimedia player (PMP), an MP3 player, a mobile medical appliance, a camera, and a wearable device (e.g., a head-mounted-device (HMD) such as electronic glasses, electronic clothes, an electronic bracelet, an electronic necklace, an electronic appcessory, electronic tattoos, or a smart watch).
  • the electronic device may be a smart home appliance.
  • the home appliance may include at least one of, for example, a television, a digital video disk (DVD) player, an audio, a refrigerator, an air conditioner, a vacuum cleaner, an oven, a microwave oven, a washing machine, an air cleaner, a set-top box, a home automation control panel, a security control panel, a TV box (e.g., Samsung HomeSync™, Apple TV™, or Google TV™), a game console (e.g., Xbox™ and PlayStation™), an electronic dictionary, an electronic key, a camcorder, and an electronic photo frame.
  • the electronic device may include at least one of various medical devices (e.g., various portable medical measuring devices (a blood glucose monitoring device, a heart rate monitoring device, a blood pressure measuring device, a body temperature measuring device, etc.), a magnetic resonance angiography (MRA), a magnetic resonance imaging (MRI), a computed tomography (CT) machine, and an ultrasonic machine), a navigation device, a global positioning system (GPS) receiver, an event data recorder (EDR), a flight data recorder (FDR), a vehicle infotainment device, electronic devices for a ship (e.g., a navigation device for a ship, and a gyro-compass), avionics, security devices, an automotive head unit, a robot for home or industry, an automatic teller machine (ATM), a point of sales (POS) in a shop, or an Internet of Things device (e.g., a light bulb, various sensors, an electric or gas meter, a sprinkler, etc.).
  • the electronic device may include at least one of a part of furniture or a building/structure, an electronic board, an electronic signature receiving device, a projector, and various kinds of measuring instruments (e.g., a water meter, an electric meter, a gas meter, and a radio wave meter).
  • the electronic device according to various embodiments of the present disclosure may be a combination of one or more of the aforementioned various devices.
  • the electronic device according to various embodiments of the present disclosure may be a flexible device. Further, the electronic device according to an embodiment of the present disclosure is not limited to the aforementioned devices, and may include a new electronic device according to the development of technology.
  • the term “user” may indicate a person who uses an electronic device or a device (e.g., an artificial intelligence electronic device, an automated device for performing testing operations, etc.) that uses an electronic device.
  • FIG. 1 illustrates a network environment including an electronic device according to various embodiments of the present disclosure.
  • a network environment 100 includes an electronic device 101 .
  • the electronic device 101 may include a bus 110 , a processor 120 , a memory 130 , an input/output interface 150 , a display 160 , and a communication interface 170 .
  • the electronic device 101 may omit at least some of the above components or further include other components.
  • the bus 110 may include a circuit for connecting the components 120 to 170 and transmitting communication between the components (for example, control messages and/or data).
  • the processor 120 may include one or more of a CPU, an AP, and a communication processor (CP).
  • the processor 120 may carry out operations or data processing related to control and/or communication of at least one other component of the electronic device 101 .
  • the processor 120 may control the electronic device 101 to perform a memo function or a scrap function. According to an embodiment, the electronic device 101 may execute a scrap mode for storing or temporarily storing data, for example, text or an image selected by the user. Further, the processor 120 may control the electronic device 101 to inform the user that a stored or temporarily stored content exists by controlling the display 160 to display a floating user interface (UI).
  • the memory 130 may include a volatile memory and/or a non-volatile memory.
  • the memory 130 may store, for example, commands or data related to one or more other components of the electronic device 101 .
  • the memory 130 may store software and/or a program 140 .
  • the program 140 may include, for example, a kernel 141, middleware 143, an application programming interface (API) 145, and/or at least one application program (or “application”) 147.
  • At least some of the kernel 141, the middleware 143, and the API 145 may be referred to as an operating system (OS).
  • the kernel 141 may control or manage system resources (for example, the bus 110 , the processor 120 , or the memory 130 ) used for executing an operation or function implemented by other programs (for example, the middleware 143 , the API 145 , or the application 147 ). Furthermore, the kernel 141 may provide an interface through which the middleware 143 , the API 145 , or the application program 147 may access individual components of the electronic device 101 to control or manage system resources.
  • the middleware 143 may serve as, for example, an intermediary such that the API 145 or the application program 147 communicate with the kernel 141 to transmit/receive data. Furthermore, in regard to task requests received from the application program 147 , the middleware 143 may perform a control (for example, scheduling or load balancing) for the task requests using, for example, a method of assigning a priority for using the system resources (for example, the bus 110 , the processor 120 , or the memory 130 ) of the electronic device 101 to at least one application.
  • the API 145 is an interface by which the applications 147 control functions provided from the kernel 141 or the middleware 143 , and may include, for example, at least one interface or function (for example, instructions) for file control, window control, image processing, or text control.
  • the input/output interface 150 may serve as an interface that transfers commands or data, which are input from the user or another external device, to the other components of the electronic device 101. Further, the input/output interface 150 may output commands or data of the electronic device 101 to the user or another external device.
  • the display 160 may include, for example, a liquid crystal display (LCD), a light emitting diode (LED) display, an organic light emitting diode (OLED) display, a micro electro mechanical system (MEMS) display, or an electronic paper display.
  • the display 160 may display various contents (for example, text, images, videos, icons, or symbols).
  • the display 160 may include a touch screen and receive, for example, a touch input, a gesture input, a proximity input, or a hovering input using an electronic pen or a user's body part.
  • the communication interface 170 may enable communication between the electronic device 101 and an external device (for example, a first external electronic device 102 , a second external electronic device 104 , or a server 106 ).
  • the communication interface 170 may be connected to a network 162 through wireless or wired communication to communicate with the external device (for example, the second external electronic device 104 or the server 106 ).
  • the wireless communication may include, for example, at least one of long term evolution (LTE), LTE-advanced (LTE-A), code division multiple access (CDMA), wideband CDMA (WCDMA), universal mobile telecommunications system (UMTS), wireless broadband (WiBro), and global system for mobile communications (GSM) as a cellular communication protocol.
  • the wired communication may include at least one of, for example, a universal serial bus (USB), a high definition multimedia interface (HDMI), recommended standard 232 (RS-232), and a plain old telephone service (POTS).
  • the network 162 may include at least one of communication networks such as a computer network (for example, a local area network (LAN) or a wide area network (WAN)), the Internet, and a telephone network.
  • Each of the first and second external electronic devices 102 and 104 may be the same as or different from the electronic device 101 .
  • the server 106 may include a group of one or more servers.
  • all or some of the operations performed by the electronic device 101 may be performed by another electronic device or a plurality of electronic devices (for example, the electronic device 102 or 104 or the server 106 ).
  • the electronic device 101 may request another device (for example, the electronic device 102 or 104 or the server 106) to remotely perform at least some of the functions.
  • the other electronic device (for example, the electronic device 102 or 104 or the server 106 ) may perform the additional function and transfer the result to the electronic device 101 .
  • the electronic device 101 may provide the requested functions or services based on the received result or after additionally processing the received result.
  • cloud computing, distributed computing, or client-server computing technology may be used.
  • the user input received by the electronic device 101 may be one or more of the touch input, the hovering input, the voice input, and the gesture input.
  • the touch input or the hovering input may be made by a user's body (for example, a finger or the like) or a stylus pen.
  • the voice input may be received from the user through a microphone in the electronic device 101 and the gesture input may be received from the user through a camera in the electronic device.
  • an electronic device for processing a content is provided.
  • FIG. 2 is a flowchart of a method of processing contents by an electronic device according to various embodiments of the present disclosure.
  • the processor 120 may determine a text or image corresponding to a user input, for example, a selection of text or images made through the display 160, in operation S 202.
  • the processor 120 may capture the text or image and determine the captured text or image as a content in operation S 204 .
  • the content may be a captured image including both the text and an image selected by the user.
  • the user selects one area including 100 characters and 2 still images that are displayed on the screen in operation S 202 .
  • the processor 120 may capture the 100 characters and the 2 still images and determine the captured area as the selected content in operation S 204 .
  • the processor 120 may store the content in operation S 206 .
  • the processor 120 may temporarily store the content in the memory 130 .
  • the processor 120 may process the content according to a user input in operation S 208 .
  • the processor 120 may display the content on the display, or store, edit, share, or delete the content according to the user input in operation S 208 .
  • the processor 120 may store or temporarily store a plurality of contents, and process the plurality of stored or temporarily stored contents individually or simultaneously.
  • the electronic device 101 may store or temporarily store one or more contents.
  • the processor 120 may store or temporarily store up to a predetermined number of contents. For example, the processor 120 may store or temporarily store 20 contents and process each of the 20 contents according to a user input.
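The flow above maps naturally onto a small capture-and-store component. The following Kotlin sketch is illustrative only: the names ScrapContent, ScrapStore, and captureSelection are assumptions rather than the patent's implementation, and the 20-item cap mirrors the example given above.

```kotlin
import android.graphics.Bitmap
import android.graphics.Canvas
import android.graphics.Rect
import android.view.View

// A captured piece of content plus the time it was stored (operation S206).
data class ScrapContent(val image: Bitmap, val storedAt: Long = System.currentTimeMillis())

// Bounded store: keeps at most `maxItems` contents, dropping the oldest first.
class ScrapStore(private val maxItems: Int = 20) {
    private val items = ArrayDeque<ScrapContent>()
    val count: Int get() = items.size

    fun add(content: ScrapContent) {
        if (items.size == maxItems) items.removeFirst()
        items.addLast(content)
    }

    fun latest(): ScrapContent? = items.lastOrNull()
}

// Operations S202/S204: render the current view hierarchy and crop the
// user-selected rectangle out of it as the captured content.
fun captureSelection(root: View, selection: Rect): Bitmap {
    val full = Bitmap.createBitmap(root.width, root.height, Bitmap.Config.ARGB_8888)
    root.draw(Canvas(full))
    return Bitmap.createBitmap(full, selection.left, selection.top,
                               selection.width(), selection.height())
}
```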
  • FIG. 3 is a flowchart of another method of processing contents by the electronic device according to various embodiments of the present disclosure.
  • the electronic device 101 may execute a scrap mode according to a user input (first input) in operation S 302 .
  • the processor 120 of the electronic device 101 may determine whether a user input (second input) for selecting at least a part of the image (for example, still image or dynamic image) or text displayed on the screen is received in operation S 304 .
  • the process returns to operation S 304 until a second input is received.
  • the processor 120 may maintain the scrap mode without performing any operation.
  • the processor 120 may capture the text or image according to the second input and determine the captured text or image as a first content in operation S 306 .
  • the processor 120 may store the first content in operation S 308 .
  • the first content may remain within the electronic device 101 even though power of the electronic device 101 is turned off or the scrap mode is terminated.
  • the first content may be stored in a temporary folder and, when power is turned on, the processor 120 may load the first content and display the loaded first content on the display 160 .
  • the first content may be stored temporarily.
  • the first content may be automatically deleted when power of the electronic device 101 is turned off or the scrap mode is terminated.
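A minimal sketch of the two retention policies described above, assuming an Android file layout: files under filesDir stand in for the stored case that survives power-off, while files under cacheDir stand in for the temporary case discarded when the scrap mode ends. The directory and file names are invented for illustration.

```kotlin
import android.content.Context
import android.graphics.Bitmap
import java.io.File

// Persistent contents survive power-off; temporary ones are deleted when the
// scrap mode is terminated (or may be cleared by the platform at any time).
fun saveContent(ctx: Context, image: Bitmap, persistent: Boolean): File {
    val dir = if (persistent) File(ctx.filesDir, "scrap")
              else File(ctx.cacheDir, "scrap_tmp")
    dir.mkdirs()
    val out = File(dir, "content_${System.currentTimeMillis()}.png")
    out.outputStream().use { image.compress(Bitmap.CompressFormat.PNG, 100, it) }
    return out
}

// Called when the scrap mode ends: drop everything stored only temporarily.
fun clearTemporary(ctx: Context) {
    File(ctx.cacheDir, "scrap_tmp").deleteRecursively()
}
```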
  • the electronic device 101 may display a floating UI including scrap information on the first content and a second content, which was stored before the first content, on the screen in operation S 310.
  • the scrap information may include, for example, thumbnail images of the first content and the second content, and the number of the first and second contents.
  • the processor 120 may display thumbnail images corresponding to the first content and the second content in the floating UI as a radial menu.
  • the processor 120 may display the floating UI on the uppermost layer (i.e., having the highest z-order).
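One plausible way to realize the uppermost layer on Android is an overlay window added through WindowManager, as sketched below. This is an assumption about the implementation (and requires the SYSTEM_ALERT_WINDOW permission on modern Android); the patent text only specifies the z-order.

```kotlin
import android.content.Context
import android.graphics.PixelFormat
import android.view.View
import android.view.WindowManager

// Adds `ui` as an overlay window so it is drawn above other application
// windows, i.e., with the highest z-order.
fun showFloatingUi(ctx: Context, ui: View) {
    val wm = ctx.getSystemService(Context.WINDOW_SERVICE) as WindowManager
    val params = WindowManager.LayoutParams(
        WindowManager.LayoutParams.WRAP_CONTENT,
        WindowManager.LayoutParams.WRAP_CONTENT,
        WindowManager.LayoutParams.TYPE_APPLICATION_OVERLAY,
        WindowManager.LayoutParams.FLAG_NOT_FOCUSABLE, // touches elsewhere pass through
        PixelFormat.TRANSLUCENT
    )
    wm.addView(ui, params)
}
```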
  • content information, which corresponds to information on the corresponding content, may be stored in the memory 130.
  • the content information may include the number of images included in each of the first content and the second content, a time when each of the first content and the second content is stored, and information on the application to which the image or text included in each content is provided.
  • the content information may be displayed together with the corresponding content.
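The content information described above could be modeled as a small record; the field names below are illustrative, not taken from the patent.

```kotlin
// Per-content metadata kept in memory alongside each stored content.
data class ContentInfo(
    val imageCount: Int,      // number of images included in the content
    val storedAtMillis: Long, // time when the content was stored
    val sourceApp: String     // application the image or text came from
)
```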
  • the processor 120 may determine whether a user input (i.e., a third input) for processing the first or second content is received in operation S 312.
  • the processor 120 may control the electronic device 101 to maintain the current state without performing any operation.
  • the processor 120 may process the first content or the second content according to the third input in operation S 314 .
  • the processor 120 may display the first content or the second content on the screen, or store, edit, share, or delete the first content or the second content according to the user input in operation S 314 .
  • the processor 120 may convert colors of the image included in the first content or the second content into black and white, or extract the text included in the first content or the second content and store contacts such as a phone number and a mail address among the text in the memory 130 .
  • in another embodiment, the operation mode may be automatically switched to the scrap mode when the image or text is selected according to the user input.
  • the scrap mode may also be executed when a predetermined shortcut key is input. Further, the scrap mode may be executed when a predetermined gesture input or voice input for executing the scrap mode is received.
  • a method for processing a content at an electronic device includes: generating a content corresponding to a user input among one or more contents displayed through a display of the electronic device as a first content; and displaying a floating UI, which displays first scrap information on the first content, on a screen.
  • FIGS. 4A, 4B, 4C, 4D, and 4E illustrate an example of processing contents by the electronic device according to various embodiments of the present disclosure.
  • the electronic device 101 may execute a first mode 401, a second mode 402, a third mode 403, a fourth mode 404, and a scrap mode 405.
  • the processor 120 may display a mode selection table 400 for selecting the operation mode corresponding to a user input on the display 160 .
  • the user may select the operation mode of the electronic device 101 by selecting one of the modes 401 - 405 .
  • the electronic device 101 may receive a user input 411 for executing the scrap mode 405 from the user. At this time, the user input 411 may be, for example, tapping, clicking, or double-clicking the scrap mode 405 .
  • the processor 120 may load the mode selection table 400 according to a user input.
  • the processor 120 may load the mode selection table according to a touch input for selecting a preset position of the screen continuously for a predetermined time or longer.
  • the electronic device 101 may directly enter the scrap mode without loading the mode selection table 400 according to a user input. For example, when an area including the image or text is selected, the scrap mode 405 is automatically executed, and then the area may be captured and stored as content. Further, the floating UI showing scrap information on the content may be displayed.
  • an execution screen 421 according to a user input 410 in the scrap mode 405 is illustrated.
  • the user selects a part 430 of the text or image included in the execution screen 421 using a finger or a stylus pen 600 .
  • the user may select the image or text within the area 430 by drawing a line from top to bottom of the area 430 including the text or image, which the user desires to select, in the execution screen 421 by using the stylus pen 600 .
  • the processor 120 may generate a captured image by capturing the area 430 .
  • the processor 120 may control the display 160 to distinguish the area 430 , which is selected by the user, from other areas.
  • the display 160 may overlay the area 430 selected by the user with a layer of a color different from the background color of the execution screen 421 .
  • the captured image 430 is floated over the execution screen 421 .
  • the processor 120 may capture the area 430 including the text or image. Further, the processor 120 may float the captured image 430 to display it over the execution screen 421 .
  • the user may input a command for processing the captured image 430 into the electronic device 101 .
  • the processor 120 may display an icon 452 corresponding to a command for temporarily storing the captured image 430 as content.
  • the user inputs a user input 412 for selecting the icon 452 into the electronic device 101 .
  • the user input 412 may be, for example, clicking or double-clicking the icon 452 .
  • the processor 120 may temporarily store the captured image 430 as content.
  • the processor 120 may display an icon 451 corresponding to a command for extracting and parsing the text on the captured image 430.
  • the electronic device 101 may process the captured image 430 according to a user input by selecting at least one of the icons 441 , 442 , and 443 .
  • the processor 120 may store the captured image 430 in the memory 130 or delete the captured image 430 according to a user input by selecting the icon 441 .
  • the processor 120 may combine captured images into one file according to the user input by selecting the icon 441 . For example, if three contents are displayed on the display 160 , the processor 120 may merge the three contents into a single file.
  • the processor 120 may merge the contents into one image or document file (for example, a pdf file).
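Merging several captured images into one document file could, for example, use the platform's PdfDocument API, as in this sketch; sizing each page to its bitmap is a simplifying assumption.

```kotlin
import android.graphics.Bitmap
import android.graphics.pdf.PdfDocument
import java.io.File

// Writes each captured image onto its own page of a single PDF file.
fun mergeToPdf(images: List<Bitmap>, out: File) {
    val doc = PdfDocument()
    images.forEachIndexed { index, bmp ->
        val info = PdfDocument.PageInfo.Builder(bmp.width, bmp.height, index + 1).create()
        val page = doc.startPage(info)
        page.canvas.drawBitmap(bmp, 0f, 0f, null)
        doc.finishPage(page)
    }
    out.outputStream().use { doc.writeTo(it) }
    doc.close()
}
```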
  • the captured image 430 may be shared with another application or other devices (for example, the electronic devices 102 and 104 or the server 106 ) according to a user input by selecting the icon 442 .
  • the processor 120 may determine an application or a device with which the captured image 430 is shared according to a user input. When the application or the device with which the captured image 430 is shared is determined, the processor 120 may convert a format of the captured image 430 into a format suitable for the corresponding application or device. For example, when the captured image 430 is shared with a first application, the processor 120 may convert the captured image 430 into a jpg file.
  • the processor 120 may convert the captured image 430 into a bmp file.
  • the one captured image 430 may be simultaneously shared with a plurality of applications or devices.
  • the processor 120 may convert the format of the captured image 430 into a format suitable for each of the applications and devices.
  • the processor 120 may combine the captured images 430 into one file and share the file. For example, the processor 120 may combine three captured images 430 into one captured image and share the combined captured image. Further, the processor 120 may share each of the captured images 430 with each of the applications or devices. For example, when the number of captured images 430 is three, the processor 120 may sequentially transmit the three captured images 430 to the applications or the devices.
  • the processor 120 may extract characters from the captured image and share only the extracted characters with the applications or the devices.
  • the processor 120 may share the captured image 430 by transmitting each of the image and the text extracted from the captured image 430 to the applications or the devices.
  • the applications or the devices with which the captured image 430 is shared may display both the image and the text, or only one of the image and the text, according to the specifications of the corresponding applications or devices.
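A hedged sketch of the convert-then-share step: Bitmap.compress supports JPEG/PNG/WEBP out of the box (the bmp case mentioned above would need a custom encoder), and the content Uri is assumed to come from a FileProvider configured elsewhere.

```kotlin
import android.content.Intent
import android.graphics.Bitmap
import android.net.Uri
import java.io.File

// Re-encode the captured image in a format suitable for the target app/device.
fun convertForSharing(image: Bitmap, dir: File, format: Bitmap.CompressFormat): File {
    val ext = if (format == Bitmap.CompressFormat.JPEG) "jpg" else "png"
    val out = File(dir, "shared_content.$ext")
    out.outputStream().use { image.compress(format, 90, it) }
    return out
}

// Standard share intent handing the converted file to another application.
fun shareIntent(contentUri: Uri, mimeType: String): Intent =
    Intent(Intent.ACTION_SEND).apply {
        type = mimeType
        putExtra(Intent.EXTRA_STREAM, contentUri)
        addFlags(Intent.FLAG_GRANT_READ_URI_PERMISSION)
    }
```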
  • the processor 120 may edit information on the captured image 430 (for example, a tag including a photographing time of the image included in the captured image 430, a file size, a resolution, a file path, and the like) according to a user input by selecting the icon 443.
  • the electronic device 101 may display a floating UI 450 on the screen.
  • the floating UI 450 may be a UI which informs the user that the captured image captured in FIG. 4D is temporarily stored in the electronic device 101.
  • thumbnail images corresponding to one or more temporarily stored contents and the number of temporarily stored contents may be displayed through the floating UI 450 .
  • the floating UI 450 may have a form in which the thumbnail images overlap each other in a radial shape, a form in which the thumbnail images are horizontally arranged and overlap each other, or a form in which the thumbnail images are vertically arranged and overlap each other, and may have an animation in which the thumbnail images are rotated every preset time (for example, every one second).
  • the processor 120 may display the floating UI 450 on the uppermost layer (i.e., with the highest z-order). Accordingly, the floating UI 450 may be displayed in the entire area of the screen of the display 160 .
  • the processor 120 may display a thumbnail image of the content 430 through the floating UI 450 to inform the user that the content 430 is temporarily stored. Further, since one content 430 is temporarily stored, the processor 120 may display “1” via the floating UI to indicate the number of temporarily stored contents.
  • the electronic device 101 receives a user input 413 for selecting the floating UI 450 .
  • the processor 120 may control the display 160 to display the temporarily stored content 430 in the form of, for example, a popup window.
  • the temporarily stored content 430 in the form of the popup window is floated. Since the content 430 includes text, the processor 120 may display, on the content 430, an icon 451 corresponding to a command for extracting and parsing the text included in the content 430. Further, when the content is loaded through the floating UI 450, the processor 120 may control the display 160 to not display the floating UI 450, or to display the floating UI 450 on a layer lower than the layer including the content 430 so that the floating UI 450 is hidden by the content 430.
  • FIGS. 5A, 5B, 5C, and 5D illustrate another example of processing the contents by the electronic device according to various embodiments of the present disclosure.
  • an image selected according to a user input is captured and floated on an execution screen 521 .
  • the image may be captured, and the icon 452 corresponding to the command for temporarily storing the captured image 531 as a content may be displayed on the image.
  • the icon 451 corresponding to the command for parsing the text may not be displayed on the captured image 531 .
  • FIG. 5B illustrates the execution screen 521 after the captured image, that is, the content 531, is temporarily stored, and a scrap window 570 showing the temporarily stored contents 531 and 532 together.
  • the floating UI 450 may be displayed on the execution screen 521 .
  • although FIG. 5B illustrates the floating UI 450 on an upper right part of the execution screen 521, the floating UI 450 may be located at any part of the execution screen 521.
  • the floating UI 450 may display the number of temporarily stored contents and a thumbnail image corresponding to each of the temporarily stored contents.
  • a thumbnail image reduced from the captured image, that is, the content 531, is displayed through the floating UI 450.
  • the processor 120 displays another execution screen 522 on the display 160 after the content 531 is temporarily stored.
  • the processor 120 may display a floating window 560 including temporarily stored contents or thumbnail images 561 , 562 and 563 corresponding to the contents on the execution screen 522 .
  • the floating window 560 may include the thumbnail images corresponding to the temporarily stored contents. For example, when there are 30 stored contents, the floating window 560 may also display all 30 thumbnail images. Although only the three thumbnail images 561 , 562 , and 563 are displayed in the floating window 560 in FIG. 5C , the processor 120 may display more thumbnail images through the floating window 560 according to a user input for scrolling the floating window 560 .
  • the temporarily stored contents 531 , 532 , and 533 are displayed together.
  • the scrap screen may display the temporarily stored contents 531 , 532 , and 533 sequentially in an order of temporary storage (i.e., either ascending or descending in time).
  • the scrap screen may include icons 571 , 572 , and 573 corresponding to commands for processing the temporarily stored contents 531 - 533 .
  • the processor 120 may process the contents 531 - 533 according to a user input for selecting one of the icons 571 - 573 . For example, when the icon 571 is selected by the user, the processor 120 may switch the screen to an execution screen 531 . When the icon 572 is selected by the user, the processor 120 may share the corresponding content with another application or other devices (for example, the electronic devices 102 and 104 or the server 106 ). When the icon 573 is selected by the user, the processor 120 may delete the corresponding content.
  • the processor 120 may select one of the contents 531 , 532 , and 533 and then process the corresponding content according to a user input for selecting one of the icons 571 - 573 .
  • the user may select one of the contents 531 - 533 and request the electronic device 101 to process the corresponding content by dragging the selected content to one of the icons 571 - 573.
  • the user may also select one of the contents 531 - 533 and request the electronic device 101 to delete the content by dragging the corresponding content to the left or right side.
  • FIGS. 6A, 6B, 6C, and 6D illustrate an example of selecting a content according to a user input by the electronic device 101 according to various embodiments of the present disclosure.
  • FIG. 6A illustrates an operation mode of the electronic device 101 , and a scrap mode is selected according to a user input.
  • the electronic device 101 may execute the first to fourth modes 401 , 402 , 403 , and 404 , and the scrap mode 405 .
  • the processor 120 may display the mode selection table 400 for selecting the operation mode corresponding to a user input on the display 160 .
  • the user may select the operation mode of the electronic device 101 by selecting one of the modes 401 - 405 .
  • the electronic device 101 may receive a user input 601 for executing the scrap mode 405 from the user. At this time, the user input 601 may be, for example, clicking or double-clicking the scrap mode 405 .
  • the user input 601 may be made by the user selecting the scrap mode 405 using a stylus pen 600.
  • an area 641 is illustrated in which a content to be temporarily stored is selected according to a user input.
  • the user may select the area 641 including an image or text through the stylus pen 600 and capture the image or text included in the area 641 . Further, the user may input a command for temporarily storing the captured image as a content into the electronic device 101 .
  • the user selects the area 641 including the image or text by inputting, into the electronic device 101, a user input 602 for selecting a virtual OK button displayed on the execution screen 621 through the display 160.
  • in FIG. 6C, a user input 603 for selecting contents to be temporarily stored is illustrated.
  • the user draws a diagonal line 603 from the upper left part of the execution screen 621 by using the stylus pen 600 .
  • the processor 120 may determine a rectangle including the diagonal line as the area including the image or text which the user desires to temporarily store.
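Deriving that rectangle from the diagonal stroke reduces to taking the bounding box of the stroke's endpoints, as in this small sketch.

```kotlin
import android.graphics.Rect

// The selection rectangle spanned by the start (x0, y0) and end (x1, y1)
// points of the diagonal line drawn by the user.
fun rectFromDiagonal(x0: Int, y0: Int, x1: Int, y1: Int): Rect =
    Rect(minOf(x0, x1), minOf(y0, y1), maxOf(x0, x1), maxOf(y0, y1))
```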
  • in FIG. 6D, a captured image 631 generated by the user input 603 in FIG. 6C is illustrated.
  • the processor 120 may control the display 160 such that a layer with a color different from the background color of the execution screen 621 is located on the area 631 including the image or text.
  • FIGS. 7A, 7B, 7C, and 7D illustrate another example of processing the contents by the electronic device according to various embodiments of the present disclosure.
  • the electronic device 101 selects or temporarily stores text or an image among the images or text included in an execution screen 721 as content 730 according to a user input.
  • FIG. 7A illustrates icons 441 , 442 , 443 , 451 , and 452 corresponding to commands for processing the stored content 730 .
  • the processor 120 may capture the selected image or text, and separately float the captured image 731 on an execution screen 721.
  • the user may temporarily store the captured image 731 as a content by inputting a user input for selecting the icon 452 .
  • content 731 only stores an image.
  • content 732 stores an image and text.
  • content 733 only includes text.
  • the electronic device 101 may temporarily store the contents 731 , 732 , and 733 according to a user input by selecting the icon 452 . Further, in FIGS. 7B and 7C , the electronic device 101 may extract and parse text included in the contents 732 and 733 by selecting the icon 451 .
  • FIGS. 8A and 8B illustrate another example of processing the contents by the electronic device according to various embodiments of the present disclosure
  • a captured image 831 that is generated by capturing an area in an execution screen 821 is floated.
  • the processor 120 may extract text data from the content 831 through, for example, an OCR scheme. Referring to FIG. 8B, text data 832 extracted from the content 831 is illustrated. In FIGS. 8A and 8B, the processor 120 may temporarily store the captured image 831 as a content.
  • the processor 120 may parse the text data 832 .
  • the processor 120 may parse all of the characters, numbers, and special characters included in the text data 832.
  • the processor 120 may share the text data 832 with any application that can be executed by the electronic device 101, or with other devices (for example, the electronic devices 102 and 104 or the server 106), according to the user input for selecting the icon 442.
  • the processor 120 may extract contacts such as an address, a phone number, a mail address, and a website address included in the text data 832 and may determine the extracted data as scrap information on the content 831 .
  • the processor 120 may temporarily store the scrap information to correspond to the content 831 and display the scrap information through the floating UI 450 .
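Once the OCR text is available, extracting the contacts described above can lean on the platform's built-in patterns; the OCR step itself and the ParsedContacts shape are assumptions of this sketch.

```kotlin
import android.util.Patterns
import java.util.regex.Pattern

data class ParsedContacts(
    val phones: List<String>,
    val emails: List<String>,
    val urls: List<String>
)

// Scan the OCR'd text for phone numbers, mail addresses, and website addresses.
fun parseContacts(text: String): ParsedContacts {
    fun allMatches(p: Pattern): List<String> {
        val m = p.matcher(text)
        return generateSequence { if (m.find()) m.group() else null }.toList()
    }
    return ParsedContacts(
        phones = allMatches(Patterns.PHONE),
        emails = allMatches(Patterns.EMAIL_ADDRESS),
        urls = allMatches(Patterns.WEB_URL)
    )
}
```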
  • the processor 120 may display, on the execution screen 821, a predetermined image (for example, a loading image) indicating that the text data 832 is being extracted, until the text data 832 can be displayed on the display 160 as illustrated in FIG. 8B.
  • FIG. 9 illustrates another example of processing the content by the electronic device 101 according to various embodiments of the present disclosure.
  • the processor 120 may extract information from the text data 832 as results 931 - 935.
  • the processor 120 may separately extract only an image from the content 831.
  • the processor 120 may control the display 160 to display a result of analysis of the text data 832 close to the text data 832 .
  • results 931 to 935 of the analysis of the text data 832 are displayed on the lower part of the text data 832 .
  • the processor 120 may extract a phone number 931 from the text data 832. Further, the processor 120 may also display icons 941 corresponding to a command for processing the phone number 931. The processor 120 may share the phone number 931 with another application or an electronic device (for example, the electronic device 102 or 104, or the server 106) according to a user input applied to one of the icons 941. For example, the processor 120 may call the phone number 931, send a mail or a text message to the phone number 931, or add the phone number 931 to an address book by selecting one of the icons 941. According to an embodiment, when there are multiple phone numbers included in the text data 832, the processor 120 may sequentially display phone numbers 961 and 962 and display a mail address 963 together with the phone numbers 961 and 962.
  • the processor 120 may extract an email address 932 from the text data 832. Further, the processor 120 may also display icons 942 corresponding to a command for processing the email address 932. The processor 120 may share the email address 932 with another application or an electronic device (for example, the electronic device 102 or 104, or the server 106) by selecting one of the icons 942. Further, the processor 120 may share the email address 932, transmit an email to the email address 932, transmit a message to the email address 932, or add the email address 932 to the address book by selecting one of the icons 942.
  • the processor 120 may extract schedule information 933 from the text data 832 .
  • the processor 120 may manage the user's schedule by adding the schedule information 933 to a calendar app. Further, the processor 120 may also display icons 943 corresponding to a command for processing the schedule information 933.
  • the processor 120 may share the schedule information 933 with another application or an electronic device (for example, the electronic device 102 or 104 , or the server 106 ) by selecting one of the icons 943 . Further, the processor 120 may share the schedule information 933 or add the schedule information 933 to the calendar app preinstalled in the electronic device 101 by selecting one of the icons 943 .
  • the processor 120 may extract a website address from the text data 832 and display the website 934 below the text data 832 . Further, the processor 120 may also display icons 944 corresponding to a command for processing the website address. The processor 120 may share the website 934 with another application or an electronic device (for example, the electronic device 102 or 104 , or the server 106 ) by selecting one of the icons 944 . Further, the processor 120 may browse the website 934 through a browser or add the website 934 to a bookmark list by selecting one of the icons 944 .
  • the processor 120 may extract map data 935 (for example, an address or a GPS coordinate) from the text data 832 .
  • the processor 120 may display the map 935 including a position corresponding to the map data below the text data 832 . Further, the processor 120 may also display icons 945 corresponding to a command for processing the map data 935 together with the map data 935 .
  • the processor 120 may share the map data 935 with another application or an electronic device (for example, the electronic device 102 or 104 , or the server 106 ) by selecting one of the icons 945 . Further, the processor 120 may display an address included in the text data on the map 935 by selecting one of the icons 945 .
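Dispatching the parsed results 931 - 935 to other applications maps directly onto standard Android intents; the helper names below are illustrative, but the actions and URI schemes are the platform's own.

```kotlin
import android.content.Intent
import android.net.Uri

// Call the extracted phone number (result 931).
fun dialIntent(phone: String): Intent =
    Intent(Intent.ACTION_DIAL, Uri.parse("tel:$phone"))

// Compose a mail to the extracted address (result 932).
fun emailIntent(address: String): Intent =
    Intent(Intent.ACTION_SENDTO, Uri.parse("mailto:$address"))

// Open the extracted website in a browser (result 934).
fun browseIntent(url: String): Intent =
    Intent(Intent.ACTION_VIEW, Uri.parse(url))

// Show the extracted address on a map (result 935).
fun mapIntent(address: String): Intent =
    Intent(Intent.ACTION_VIEW, Uri.parse("geo:0,0?q=${Uri.encode(address)}"))
```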
  • FIGS. 10A, 10B, 10C, 10D, 10E, 10F, 10G, and 10H illustrate another example of processing the contents by the electronic device according to various embodiments of the present disclosure.
  • the processor 120 may capture an area including an image and text from an execution screen 1021 and temporarily store the captured image 1031 as a first content.
  • in FIG. 10A, the temporarily stored first content 1031 is floated and displayed.
  • a thumbnail image corresponding to the first content 1031 may be displayed in the floating UI 450 .
  • a floating UI 450 may display “1” as the number of temporarily stored contents.
  • the processor 120 may capture an area including one image from an execution screen 1022 and temporarily store the captured image 1032 as a second content.
  • in FIG. 10C, a user input for selecting the area including the image from the execution screen 1022 is illustrated. The user may select the area 1032 through the stylus pen 600 and capture the image or text included in the area 1032.
  • the captured image 1032 is floated on the execution screen 1022 .
  • the processor 120 may temporarily store the captured image 1032 . Further, the processor 120 may convert the captured image 1032 into the second content and temporarily store the second content.
  • the processor 120 may display, on the execution screen 1022 , the floating UI 450 to which a thumbnail image corresponding to the second content 1032 is added. Since the number of temporarily stored contents is 2, the floating UI 450 may display “2” as the number of temporarily stored contents.
  • the processor 120 may capture an area including three images from the execution screen 1022 and temporarily store the captured images 1033 as a third content.
  • a user input for selecting the area including the three images from the execution screen 1022 is illustrated.
  • the user may select the area 1033 through the stylus pen 600 to determine the area including the image or text to be temporarily stored.
  • the temporarily stored third content 1033 is floated on the execution screen 1022 .
  • the processor 120 may convert the captured image 1033 into the third content 1033 and temporarily store the third content.
  • the processor 120 may display, on the execution screen 1022 , the floating UI 450 to which a thumbnail image corresponding to the third content 1033 is added. Since the number of temporarily stored contents is 3, the floating UI 450 may display “3” as the number of temporarily stored contents. Referring to FIG. 10H , thumbnail images displayed through the floating UI 450 may be displayed to partially overlap each other. At this time, with respect to the thumbnail images corresponding to the temporarily stored contents, the floating UI 450 may first display the thumbnail image corresponding to the contents stored later. In contrast, with respect to the thumbnail images corresponding to the temporarily stored contents, the floating UI 450 may first display the thumbnail image corresponding to the contents stored relatively earlier.
  • FIGS. 11A, 11B, 11C, and 11D illustrate another example of processing the contents by the electronic device 101 according to various embodiments of the present disclosure.
  • a popup window 1151 including thumbnail images 1131 , 1132 , and 1133 corresponding to temporarily stored contents is displayed on an execution screen 1121 .
  • the popup window 1151 is an example of the floating UI 450 , and may be displayed on execution screens 1121 , 1122 , and 1123 including thumbnail images 1131 , 1132 , and 1133 of temporarily stored contents.
• the popup window 1151 may be displayed at any position of the execution screen 1121 and may be moved according to a user input.
  • the popup window 1151 is displayed on an upper right part of the execution screen 1121 .
  • a popup window 1152 including thumbnail images 1134 , 1135 , and 1136 corresponding to the temporarily stored contents is displayed on a lower left part of the execution screen 1122 .
• a user input 1101 for selecting one thumbnail image 1133 and dragging the selected thumbnail image 1133 in a diagonal direction is received.
• the processor 120 may move the contents corresponding to the thumbnail images 1131 to 1133 according to the user input 1101.
  • the popup window 1151 moves based on the user input 1101 and is positioned on a lower left part of the execution screen 1122 .
  • the popup window 1152 moved according to the user input 1101 is displayed on a lower left part of the execution screen 1122 .
• FIGS. 11C and 11D illustrate an example of moving temporarily stored contents 1134 to 1136 to the execution screen 1123 by using the popup window 1152.
  • the processor 120 displays a text message window according to a message transmission/reception application as the execution screen 1123 .
  • the electronic device 101 receives a user input 1102 for selecting the thumbnail image 1134 and dragging the selected thumbnail image 1134 in a diagonal direction. According to an embodiment, the electronic device 101 may move the thumbnail image 1134 to an input window 1171 of the execution screen 1123 .
  • the processor 120 may move the contents 1134 corresponding to the thumbnail image 1134 to the execution screen 1123 according to the user input 1102 as illustrated in FIG. 11C . That is, the processor 120 may allow the user to use the contents 1134 in an application by moving the contents 1134 according to the user input 1102 . Referring to FIG. 11D , the user may transmit the contents 1134 to another device through the message transmission/reception application.
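• A minimal sketch of the drag-and-drop behavior in FIGS. 11C and 11D, assuming hypothetical DropTarget and InputWindow types; an actual device would use the platform's drag-and-drop framework rather than these names.

```kotlin
// Maps a dragged thumbnail back to its temporarily stored content and hands
// the content to whatever accepts the drop (here, a message input window).
interface DropTarget { fun accept(content: String) }

class InputWindow : DropTarget {
    val pending = mutableListOf<String>()
    override fun accept(content: String) { pending.add(content) }
}

class ThumbnailDragController(private val contentByThumb: Map<String, String>) {
    // Returns true when the thumbnail resolved to a stored content.
    fun onDrop(thumbnailId: String, target: DropTarget): Boolean {
        val content = contentByThumb[thumbnailId] ?: return false
        target.accept(content)
        return true
    }
}

fun main() {
    val controller = ThumbnailDragController(
        mapOf("1134" to "captured image 1134") // thumbnail id -> stored content
    )
    val messageInput = InputWindow()           // analogue of input window 1171
    controller.onDrop("1134", messageInput)    // user input 1102 (diagonal drag)
    println(messageInput.pending)              // [captured image 1134]
}
```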
  • FIGS. 12A and 12B illustrate another example of processing the contents by the electronic device according to various embodiments of the present disclosure.
  • screens displaying a list 1210 of temporarily stored contents are illustrated.
  • the processor 120 may generate a list 1210 in which temporarily stored contents 1211 , 1212 , and 1213 are arranged based on when the contents 1211 - 1213 are stored.
  • the processor 120 may execute a command corresponding to each of the icons 441 , 442 , and 443 .
• the processor 120 may store, delete, share, or edit at least one of the contents 1211-1213 according to the user input.
• a user input 1201 for deleting the first content 1211 is input.
  • the processor 120 may remove the selected first content 1211 from the list 1210 .
• Referring to FIG. 12B, the list 1210 from which the first content 1211 has been removed is illustrated.
  • the processor 120 may delete the corresponding content from the list 1210 by receiving a user input for selecting the icon 1223 displayed on each of the contents 1211 - 1213 .
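• The list behavior of FIGS. 12A and 12B could be modeled as below; ScrapItem and ScrapList are illustrative names only, and timestamps stand in for whatever ordering key the device actually records.

```kotlin
// Items are kept in storage order; an item is removed in response to a
// per-item delete input such as the icon 1223.
data class ScrapItem(val id: Int, val storedAtMillis: Long, val label: String)

class ScrapList {
    private val items = mutableListOf<ScrapItem>()

    fun add(item: ScrapItem) {
        items.add(item)
        items.sortBy { it.storedAtMillis } // arranged by when contents were stored
    }

    fun delete(id: Int): Boolean = items.removeIf { it.id == id }

    fun labels(): List<String> = items.map { it.label }
}

fun main() {
    val list = ScrapList()
    list.add(ScrapItem(1211, 100, "first content"))
    list.add(ScrapItem(1212, 200, "second content"))
    list.add(ScrapItem(1213, 300, "third content"))
    list.delete(1211)        // user input 1201 deletes the first content
    println(list.labels())   // [second content, third content]
}
```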
• FIGS. 13A, 13B, and 13C illustrate another example of processing the contents by the electronic device according to various embodiments of the present disclosure.
  • the processor 120 may receive a user input for selecting text displayed on an execution screen 1311 through the stylus pen 600 .
  • the processor 120 may control the display 160 such that the selected text 1331 is underlined.
  • An area including the selected text or image may be captured and temporarily stored as a content.
  • the user may select a desired image or text by inputting a user input for underlining the desired image or text into the electronic device 101 .
  • the processor 120 may display an editing menu 1350 .
  • the user may determine a method of processing the selected image or text by selecting one of the menus included in the editing menu 1350 .
  • the user selects the “see together” menu.
• Referring to FIG. 13C, an operation according to the menu selected in FIG. 13B is executed.
  • the execution screen displays only the image or the text selected by the user in FIG. 13A .
  • FIGS. 14A and 14B illustrate another example of processing the contents by the electronic device according to various embodiments of the present disclosure.
  • a user input 1401 for the “see together” function is provided.
  • the processor 120 may control the display 160 to display only the image or the text selected by the user according to the user input 1401 .
  • the processor 120 may control the display 160 to distinguish between texts included in different paragraphs.
  • a text included in a first paragraph and a text included in a second paragraph are selected according to a user input.
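• The "see together" filtering described above might look like the following sketch, where a Selection pair (paragraph index plus text) is an assumed representation; the disclosure does not fix how selections are encoded.

```kotlin
// Keeps only the user-selected spans and separates spans that came from
// different paragraphs, as in FIGS. 14A and 14B.
data class Selection(val paragraphIndex: Int, val text: String)

fun seeTogether(selections: List<Selection>): String =
    selections
        .groupBy { it.paragraphIndex }
        .toSortedMap()
        .values
        .joinToString("\n\n") { group ->        // blank line between paragraphs
            group.joinToString(" ") { it.text }
        }

fun main() {
    val picked = listOf(
        Selection(0, "text selected from the first paragraph"),
        Selection(1, "text selected from the second paragraph")
    )
    println(seeTogether(picked))
}
```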
• FIGS. 15A, 15B, 15C, and 15D illustrate another example of processing the contents by the electronic device according to various embodiments of the present disclosure.
  • an execution screen 1521 illustrates execution of a first application.
  • the processor 120 may capture an area 1531 including an image or a text corresponding to a user input and temporarily store the captured image as content 1532 .
  • the processor 120 may move the content 1532 to an execution screen that executes an application different from the first application according to a user input.
• the content 1532 is moved to an execution screen 1522 that executes a second application. Since the content 1532 includes both the image and the text, the second application may receive both the image and the text included in the content 1532. According to another embodiment, scrap information on the content 1532 may be input into the second application. The user may move the content 1532 to the second application by dragging the selected image or text to the execution screen 1522.
  • the content 1532 is displayed on the execution screen 1522 that executes the second application.
  • the processor 120 may copy the content 1532 to the second application.
  • the content 1532 is moved or copied to an application other than the first application.
  • a third application is a messenger.
  • the user may display the content 1532 in a messenger window 1523 by dragging the content 1532 to the messenger window 1523 .
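• Because the content 1532 carries both an image and text, the hand-off to another application can be modeled as a single mixed payload. The sketch below is platform-neutral; on an actual device the payload would travel through the system's inter-application mechanisms (for example, clip data attached to an intent), and every name here is hypothetical.

```kotlin
// A mixed capture (image + text) delivered to a second application in one unit.
class ScrapPayload(val imageBytes: ByteArray?, val text: String?)

interface AppTarget { fun receive(payload: ScrapPayload) }

class MessengerApp : AppTarget {
    override fun receive(payload: ScrapPayload) {
        println("messenger received text=${payload.text}, image=${payload.imageBytes?.size} bytes")
    }
}

fun main() {
    val content1532 = ScrapPayload(ByteArray(1024), "caption captured with the image")
    MessengerApp().receive(content1532) // drag of content 1532 into window 1523
}
```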
  • FIG. 16 illustrates another example of processing the contents by the electronic device according to various embodiments of the present disclosure.
  • the processor 120 may extract a website address 1620 displayed in the browser window 1610 according to a user input.
  • the user input may be a flick on the browser window 1610 by using a finger or the stylus pen 600 , and another type of input (for example, a hovering input, a voice input, a gesture input, or the like) may be recognized by the electronic device 101 as the user input for extracting the website address 1620 .
  • the processor 120 may temporarily store the website address 1620 and display the website address 1620 on the right side of the browser window 1610 as illustrated in FIG. 16 .
• the user may delete the website address 1620 by a drag input on the website address 1620 from top to bottom or from bottom to top.
  • FIG. 17 illustrates another example of processing the contents by the electronic device according to various embodiments of the present disclosure.
• when the electronic device 101 is implemented to have dual monitors, the processor 120 may temporarily store the extracted website address 1620 and copy and input the website address 1620 into an execution screen 1720 of a different application. For example, the processor 120 may copy the website address 1620 extracted from the browser window 1610 and input the website address 1620 into an input window 1721 of the execution screen 1720.
  • FIGS. 18A and 18B illustrate another example of processing the contents by the electronic device according to various embodiments of the present disclosure.
• the processor 120 may determine the position (for example, a line) of the hypertext markup language (HTML) document at which the part selected by the user is located.
• the processor 120 may insert "f727t732f890t932" into a uniform resource locator (URL) corresponding to lines 727-732 and 890-932 of the HTML document and store the URL or transmit the URL to another external device (for example, the electronic device 102 or 104, or the server 106).
  • the processor 120 may transmit “http://ss.com/253#f728t760f820t884” to another external electronic device.
  • the processor 120 may show images or text corresponding to lines 727-732 and lines 890-932.
• the processor 120 may control the display 160 to show the corresponding area by scrolling the Internet page, so that the user can intuitively identify, by using the URL, the area including the images or text corresponding to lines 727-732 and lines 890-932.
• the processor 120 may show the area including the images or text corresponding to lines 727-732 and lines 890-932 in contrast with other areas of the Internet page.
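• The fragment format above appears to encode each selected line range as an "f{from}t{to}" pair appended to the URL. The sketch below encodes and decodes under that inferred convention; the text gives examples but no normative grammar, so the parsing rules are an assumption.

```kotlin
// Encodes line ranges as f{from}t{to} pairs, e.g.
// http://ss.com/253#f728t760f820t884, and parses them back.
fun encodeRanges(url: String, ranges: List<IntRange>): String =
    url + "#" + ranges.joinToString("") { "f${it.first}t${it.last}" }

fun decodeRanges(url: String): List<IntRange> {
    val fragment = url.substringAfter('#', missingDelimiterValue = "")
    return Regex("f(\\d+)t(\\d+)").findAll(fragment)
        .map { it.groupValues[1].toInt()..it.groupValues[2].toInt() }
        .toList()
}

fun main() {
    val url = encodeRanges("http://ss.com/253", listOf(728..760, 820..884))
    println(url)                // http://ss.com/253#f728t760f820t884
    println(decodeRanges(url))  // [728..760, 820..884]
}
```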
• FIGS. 19A, 19B, and 19C illustrate another example of processing the contents by the electronic device according to various embodiments of the present disclosure.
• an execution screen 1921 includes characters or text, and the user selects text by using the stylus pen 600. Further, the processor 120 may display a menu bar 1925 for processing the selected text 1932. In FIG. 19A, the user selects a "memo" function from the menu bar 1925.
  • the processor 120 may control the display 160 to float and display a memo window 1940 on the execution screen 1921 .
  • the processor 120 may distinguish the text 1932 selected by the user from texts, which are not selected, through the display 160 .
  • the electronic device 101 may receive a memo through the user's finger or the stylus pen 600 . At this time, the memo may be related to the selected text 1932 .
  • the processor 120 may hide the memo window 1941 . Further, the processor 120 may inform the user that the memo related to the text 1932 exists by displaying a memo icon 1942 .
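• Attaching a memo to a selection and surfacing the memo icon 1942 only when a memo exists could be organized as below; MemoStore and TextSelection are assumed names used purely for illustration.

```kotlin
// Associates memos with text selections; hasMemo() drives whether the
// memo icon should be displayed for a given selection.
data class TextSelection(val start: Int, val end: Int)

class MemoStore {
    private val memos = mutableMapOf<TextSelection, String>()

    fun attach(selection: TextSelection, memo: String) { memos[selection] = memo }

    fun hasMemo(selection: TextSelection): Boolean = selection in memos

    fun memoFor(selection: TextSelection): String? = memos[selection]
}

fun main() {
    val store = MemoStore()
    val selected = TextSelection(start = 42, end = 97) // analogue of text 1932
    store.attach(selected, "check this passage later")
    println(store.hasMemo(selected)) // true -> display memo icon 1942
    println(store.memoFor(selected))
}
```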
  • FIG. 20 is a block diagram of an electronic device according to various embodiments of the present disclosure.
  • an electronic device 2001 may include, for example, all or some of the electronic device 101 illustrated in FIG. 1 .
  • the electronic device 2001 may include at least one AP 2010 , a communication module 2020 , a subscriber identification module (SIM) card 2024 , a memory 2030 , a sensor module 2040 , an input device 2050 , a display 2060 , an interface 2070 , an audio module 2080 , a camera module 2091 , a power management module 2095 , a battery 2096 , an indicator 2097 , and a motor 2098 .
  • the AP 2010 may control a plurality of hardware or software components connected thereto by driving an operating system or an application program and perform a variety of data processing and calculations.
  • the AP 2010 may be embodied as, for example, a system on chip (SoC).
  • the AP 2010 may further include a graphic processing unit (GPU) and/or an image signal processor.
  • the AP 2010 may include some of the components (for example, the cellular module 2021 ) illustrated in FIG. 20 .
  • the AP 2010 may load instructions or data, received from at least one other element (for example, a non-volatile memory), in a volatile memory to process the loaded instructions or data and may store various types of data in a non-volatile memory.
• the communication module 2020 may be similar to the communication interface 170 of FIG. 1.
  • the communication module 2020 may include, for example, a cellular module 2021 , a Wi-Fi module 2023 , a Bluetooth (BT) module 2025 , a GPS module 2027 , a near field communication (NFC) module 2028 , and a radio frequency (RF) module 2029 .
  • the cellular module 2021 may provide a voice call, image call, short messaging service (SMS), or Internet service through, for example, a communication network. According to an embodiment, the cellular module 2021 may distinguish between and authenticate electronic devices 2001 within a communication network using a subscriber identification module (for example, the SIM card 2024 ). According to an embodiment of the present disclosure, the cellular module 2021 may perform at least some of functions that the AP 2010 may provide. According to an embodiment of the present disclosure, the cellular module 2021 may include a CP.
• the Wi-Fi module 2023, the BT module 2025, the GPS module 2027, and the NFC module 2028 may include, for example, a processor for processing data transmitted/received through the corresponding module. According to an embodiment, at least some (two or more) of the cellular module 2021, the Wi-Fi module 2023, the BT module 2025, the GPS module 2027, and the NFC module 2028 may be included in one integrated chip (IC) or IC package.
  • the RF module 2029 may transmit/receive, for example, a communication signal (e.g., an RF signal).
  • the RF module 2029 may include, for example, a transceiver, a power amp module (PAM), a frequency filter, a low noise amplifier (LNA), or an antenna.
• at least one of the cellular module 2021, the Wi-Fi module 2023, the BT module 2025, the GPS module 2027, and the NFC module 2028 may transmit/receive an RF signal through a separate RF module 2029.
  • the SIM card 2024 may include, for example, a card including a subscriber identification module and/or an embedded SIM, and may further include unique identification information (for example, an Integrated Circuit Card Identifier (ICCID)) or subscriber information (for example, International Mobile Subscriber Identity (IMSI)).
  • the memory 2030 may include, for example, an internal memory 2032 or an external memory 2034 .
  • the internal memory 2032 may include at least one of a volatile memory (for example, a dynamic random access memory (DRAM), a static RAM (SRAM), a synchronous dynamic RAM (SDRAM), and the like) and a non-volatile memory (for example, a one time programmable read only memory (OTPROM), a programmable ROM (PROM), an erasable and programmable ROM (EPROM), an electrically erasable and programmable ROM (EEPROM), a mask ROM, a flash ROM, a flash memory (for example, a NAND flash memory or a NOR flash memory), a hard disc drive, a solid state drive (SSD), and the like).
  • the external memory 2034 may further include a flash drive, for example, a compact flash (CF), a secure digital (SD), a micro secure digital (Micro-SD), a mini secure digital (Mini-SD), an extreme Digital (xD), a memory stick or the like.
  • the external memory 2034 may be functionally and/or physically connected to the electronic device 2001 through various interfaces.
  • the sensor module 2040 may measure a physical quantity or detect an operation state of the electronic device 2001 , and may convert the measured or detected information to an electrical signal.
• the sensor module 2040 may include at least one of, for example, a gesture sensor 2040A, a gyro sensor 2040B, an atmospheric pressure sensor 2040C, a magnetic sensor 2040D, an acceleration sensor 2040E, a grip sensor 2040F, a proximity sensor 2040G, a color sensor 2040H (for example, a red/green/blue (RGB) sensor), a biometric sensor 2040I, a temperature/humidity sensor 2040J, an illumination sensor 2040K, and an ultra violet (UV) sensor 2040M.
• the sensor module 2040 may include an E-nose sensor, an electromyography (EMG) sensor, an electroencephalogram (EEG) sensor, an electrocardiogram (ECG) sensor, an infrared (IR) sensor, an iris sensor, and/or a fingerprint sensor.
  • the sensor module 2040 may further include a control circuit for controlling at least one sensor included therein.
  • the electronic device 2001 may further include a processor configured to control the sensor module 2040 as a part of or separately from the AP 2010 , and may control the sensor module 2040 while the AP 2010 is in a sleep state.
  • the input device 2050 may include, for example, a touch panel 2052 , a (digital) pen sensor 2054 , a key 2056 , or an ultrasonic input device 2058 .
  • the touch panel 2052 may use at least one of, for example, a capacitive type, a resistive type, an infrared type, and an ultrasonic type.
  • the touch panel 2052 may further include a control circuit.
  • the touch panel 2052 may further include a tactile layer and provide a tactile reaction to the user.
  • the (digital) pen sensor 2054 may be, for example, a part of the touch panel 2052 or may include a separate recognition sheet.
  • the key 2056 may include, for example, a physical button, an optical key or a keypad.
• the ultrasonic input device 2058 may identify data by using a microphone (for example, the microphone 2088) of the electronic device 2001 to detect an acoustic wave generated by an input unit that produces an ultrasonic signal.
  • the display 2060 may include a panel 2062 , a hologram device 2064 , or a projector 2066 .
  • the panel 2062 may include a component equal or similar to the display 160 of FIG. 1 .
• The panel 2062 may be embodied to be, for example, flexible, transparent, or wearable.
  • the panel 2062 may also be integrated with the touch panel 2052 as a single module.
  • the hologram device 2064 may show a stereoscopic image in the air using interference of light.
  • the projector 2066 may project light onto a screen to display an image.
  • the screen may be located inside or outside the electronic device 2001 .
  • the display 2060 may further include a control circuit (not shown) for controlling the panel 2062 , the hologram device 2064 , or the projector 2066 .
  • the interface 2070 may include, for example, an HDMI 2072 , a USB 2074 , an optical interface 2076 , or a D-subminiature (D-sub) 2078 .
• the interface 2070 may be included in, for example, the communication interface 170 illustrated in FIG. 1. Additionally or alternatively, the interface 2070 may include, for example, a mobile high-definition link (MHL) interface, an SD card/multi-media card (MMC) interface, or an infrared data association (IrDA) standard interface.
• the audio module 2080 may bilaterally convert, for example, a sound and an electrical signal. At least some components of the audio module 2080 may be included in, for example, the input/output interface 150 illustrated in FIG. 1.
  • the audio module 2080 may process voice information input or output through, for example, a speaker 2082 , a receiver 2084 , earphones 2086 , or the microphone 2088 .
  • the camera module 2091 may photograph a still image and a dynamic image.
• the camera module 2091 may include one or more image sensors (for example, a front sensor or a back sensor), a lens, an image signal processor (ISP), or a flash (for example, an LED or a xenon lamp).
  • the power management module 2095 may manage, for example, power of the electronic device 2001 .
  • the power management module 2095 may include a power management integrated circuit (PMIC), a charger IC, or a battery or fuel gauge.
  • the PMIC may have a wired and/or wireless charging scheme.
• a magnetic resonance scheme, a magnetic induction scheme, or an electromagnetic wave scheme may be used as the wireless charging scheme, and an additional circuit for wireless charging, such as a coil loop, a resonance circuit, or a rectifier, may be added.
  • the battery gauge may measure, for example, a residual quantity of the battery 2096 , a voltage, a current, or a temperature during charging.
  • the battery 2096 may include, for example, a rechargeable battery or a solar battery.
• the indicator 2097 may indicate a particular status of the electronic device 2001 or a part thereof (for example, the AP 2010), for example, a booting status, a message status, a charging status, or the like.
  • the motor 2098 may convert an electrical signal into mechanical vibrations, and may generate a vibration or haptic effect.
  • the electronic device 2001 may include a processing device (for example, a GPU) for supporting mobile TV.
• the processing unit for supporting mobile TV may process media data according to a standard such as digital multimedia broadcasting (DMB), digital video broadcasting (DVB), or media flow.
  • Each of the above-described elements of the electronic device 2001 may be implemented by one or more components and the name of the corresponding element may vary depending on the type of the electronic device 2001 .
  • the electronic device 2001 may include at least one of the above-described elements, and may exclude some of the elements or further include other additional elements. Further, some of the elements of the electronic device 2001 according to various embodiments may be combined into one entity, so that the functions of the corresponding elements may be performed in the same way as those before they are combined.
  • FIG. 21 is a block diagram of a program module according to various embodiments of the present disclosure.
  • a program module 2110 may include an OS for controlling resources related to the electronic device (for example, the electronic device 101 ) and/or various applications (for example, the application programs 147 ) executed in the OS.
  • the OS may be, for example, Android®, iOS®, Windows®, Symbian®, Tizen®, Bada®, or the like.
  • the program module 2110 may include a kernel 2120 , middleware 2130 , an API 2160 , and/or at least one application 2170 . At least some of the program module 2110 may be preloaded in the electronic device or downloaded from a server (for example, the server 106 ).
  • the kernel 2120 may include, for example, a system resource manager 2121 or a device driver 2123 .
• the system resource manager 2121 may perform control, allocation, or collection of the system resources.
  • the system resource manager 2121 may include a process manager, a memory manager, or a file system manager.
  • the device driver 2123 may include, for example, a display driver, a camera driver, a BT driver, a shared-memory driver, a USB driver, a keypad driver, a Wi-Fi driver, an audio driver, or an inter-process communication (IPC) driver.
  • the middleware 2130 may provide a function required by the application 2170 through the API 2160 so that the application 2170 can efficiently use limited system resources of the electronic device.
• the middleware 2130 (for example, the middleware 143) may include, for example, at least one of a runtime library 2135, an application manager 2141, a window manager 2142, a multimedia manager 2143, a resource manager 2144, a power manager 2145, a database manager 2146, a package manager 2147, a connectivity manager 2148, a notification manager 2149, a location manager 2150, a graphic manager 2151, and a security manager 2152.
  • the runtime library 2135 may include, for example, a library module that a compiler uses to add new functions through a programming language while executing the application 2170 .
  • the runtime library 2135 may perform input/output management, memory management, or an arithmetic function.
  • the application manager 2141 may manage, for example, a life cycle of the application 2170 .
  • the window manager 2142 may manage a graphical user interface (GUI) resource used in the screen.
  • the multimedia manager 2143 may detect a format required for reproducing various media files and encode or decode a media file using a codec appropriate for the corresponding format.
  • the resource manager 2144 may manage resources such as a source code, a memory or a storage space of the application 2170 .
  • the power manager 2145 may operate together with a basic input/output system (BIOS) to manage a battery or power and may provide power information required for the operation of the electronic device.
  • the database manager 2146 may generate, search for, or change a database to be used by at least one of the applications 2170 .
  • the package manager 2147 may manage the installation or the updating of applications distributed in the form of package file.
  • the connectivity manager 2148 may manage wireless connections, such as Wi-Fi or BT.
• the notification manager 2149 may display or notify a user of an event, such as a received message, an appointment, or a proximity notification, in a manner that does not disturb the user.
  • the location manager 2150 may manage location information of the electronic device.
  • the graphic manager 2151 may manage graphic effects to be provided to a user and user interfaces.
  • the security manager 2152 may provide various security functions required for system security or user authentication.
  • the middleware 2130 may further include a telephony manager for managing a voice call function or a video call function of the electronic device.
  • the middleware 2130 may include a middleware module that forms a combination of various functions of the aforementioned components.
  • the middleware 2130 may provide modules specialized according to types of operating systems in order to provide differentiated functions.
• the middleware 2130 may dynamically remove some existing components or add new components.
  • the API 2160 (for example, the API 145 ), which is a set of API programming functions, may include different configurations according to operating systems. For example, with respect to each platform, one API set may be provided in a case of Android or iOS, and two or more API sets may be provided in a case of Tizen®.
• the at least one application 2170 may include one or more of a home application 2171, a diary application 2172, an SMS/multimedia message service (MMS) application 2173, an instant message (IM) application 2174, a browser application 2175, a camera application 2176, an alarm application 2177, a contact application 2178, a voice dial application 2179, an e-mail application 2180, a calendar application 2181, a media player application 2182, an album application 2183, a clock application 2184, a health care application (for example, an application for measuring a work rate or blood sugar) (not shown), and an environmental information application (for example, an application for providing atmospheric pressure, humidity, or temperature information) (not shown).
  • the applications 2170 may include an application (hereinafter, referred to as an “information exchange application” for convenience of the description) supporting information exchange between the electronic device (for example, the electronic device 101 ) and external electronic devices (for example, the electronic devices 102 and 104 ).
  • the information exchange application may include, for example, a notification relay application for transmitting specific information to the external electronic device, or a device management application for managing the external electronic device.
  • the notification relay application may include a function of transferring, to the external electronic device (for example, the electronic device 102 or 104 ), notification information generated from other applications of the electronic device 101 (for example, an SMS/MMS application, an e-mail application, a health management application, or an environmental information application). Further, the notification relay application may receive notification information from, for example, the external electronic device and provide the received notification information to the user.
  • the device management application may manage (for example, install, delete, or update), for example, a function for at least a part of the external electronic device (for example, the electronic device 104 ) communicating with the electronic device (for example, turning on/off the external electronic device itself (or some elements thereof) or adjusting brightness (or resolution) of a display), applications executed in the external electronic device, or services provided from the external electronic device (for example, a telephone call service or a message service).
  • the applications 2170 may include an application (for example, health management application) designated according to attributes (for example, attributes of the electronic device such as the type of electronic device which corresponds to a mobile medical device) of the external electronic device (for example, the electronic device 102 or 104 ).
  • the applications 2170 may include an application received from the external electronic device (for example, the server 106 , or the electronic device 102 or 104 ).
  • the applications 2170 may include a preloaded application or a third party application that can be downloaded from the server. Names of the components of the program module 2110 according to the above described embodiments may vary depending on the type of operating system.
• At least some of the program module 2110 may be implemented in software, firmware, hardware, or a combination of two or more thereof. At least some of the program module 2110 may be implemented (for example, executed) by, for example, the processor (for example, the AP 2010). At least some of the program module 2110 may include, for example, a module, a program, a routine, sets of instructions, or a process for performing one or more functions.
• In accordance with another aspect of the present disclosure, a machine-readable storage medium recording a program for executing a method of processing a content by an electronic device is provided.
• the method includes: generating, as a first content, a content corresponding to a user input among one or more contents displayed through a display of the electronic device; and displaying, on a screen, a floating UI which displays first scrap information on the first content.
• The term “module” used in the present disclosure may refer to, for example, a unit including one or more combinations of hardware, software, and firmware.
  • the “module” may be interchangeable with a term, such as a unit, a logic, a logical block, a component, or a circuit.
  • the “module” may be the smallest unit of an integrated component or a part thereof.
• the “module” may be a minimum unit for performing one or more functions or a part thereof.
  • the “module” may be mechanically or electronically implemented.
• the “module” may include at least one of an application-specific integrated circuit (ASIC) chip, a field-programmable gate array (FPGA), and a programmable-logic device for performing operations which have been known or are to be developed hereinafter.
  • At least some of the devices (for example, modules or functions thereof) or the method (for example, operations) according to the present disclosure may be implemented by a command stored in a computer-readable storage medium in a programming module form.
  • the command is executed by one or more processors (for example, the processor 120 )
  • the one or more processors may execute a function corresponding to the command.
  • the computer-readable storage medium may be, for example, the memory 130 .
• the computer-readable recording medium may include a hard disk, a floppy disk, magnetic media (e.g., a magnetic tape), optical media (e.g., a compact disc read only memory (CD-ROM) and a digital versatile disc (DVD)), magneto-optical media (e.g., a floptical disk), a hardware device (e.g., a read only memory (ROM), a random access memory (RAM), a flash memory), and the like.
• the program instructions may include high-level language code, which can be executed in a computer by using an interpreter, as well as machine code generated by a compiler.
  • the aforementioned hardware device may be configured to operate as one or more software modules in order to perform the operation of the present disclosure, and vice versa.
  • the programming module may include one or more of the aforementioned components or may further include other additional components, or some of the aforementioned components may be omitted.
  • Operations executed by a module, a programming module, or other component elements according to various embodiments of the present disclosure may be executed sequentially, in parallel, repeatedly, or in a heuristic manner. Further, some operations may be executed according to another order or may be omitted, or other operations may be added.

Abstract

A method for processing contents at an electronic device is provided. The method includes generating a first content corresponding to a user input applied to content via the electronic device, and displaying a floating user interface (UI), which displays first scrap information on the first content, on a screen of the electronic device.

Description

    CROSS-REFERENCE TO RELATED APPLICATION(S)
  • This application claims the benefit under 35 U.S.C. §119 (a) of a Korean patent application filed on Sep. 2, 2014 in the Korean Intellectual Property Office and assigned Serial number 10-2014-0116514, the entire disclosure of which is hereby incorporated by reference.
  • TECHNICAL FIELD
  • The present disclosure relates to an electronic device for processing content.
  • BACKGROUND
• Based on the development of information communication technologies and semiconductor technologies, the supply and use of various portable terminals has rapidly increased. Particularly, recent portable terminals include various functions such as a television (TV) viewing function (for example, mobile broadcasting such as digital multimedia broadcasting (DMB) or digital video broadcasting (DVB)), an audio playback function (for example, Moving Picture Experts Group phase 1 or phase 2 (MPEG-1 or MPEG-2) audio layer-3 (MP3)), a picture function, a data communication function, and an audio communication function such as voice calling or message transmission/reception.
  • The above information is presented as background information only to assist with an understanding of the present disclosure. No determination has been made, and no assertion is made, as to whether any of the above might be applicable as prior art with regard to the present disclosure.
  • SUMMARY
  • Aspects of the present disclosure are to address at least the above-mentioned problems and/or disadvantages and to provide at least the advantages described below. Accordingly, an aspect of the present disclosure is to provide a method and an electronic device for allowing the user to easily use the memo function or the scrap function.
  • A user executes many steps to perform a memo function or scrap function of the related art, and thus, the user cannot easily use the memo function or the scrap function.
  • In accordance with an aspect of the present disclosure, a method for processing a content at an electronic device is provided. The method includes generating a first content corresponding to a user input applied to content displayed via the electronic device, and displaying a floating user interface (UI), which displays first scrap information on the first content, on a screen of the electronic device.
  • In accordance with another aspect of the present disclosure, an electronic device for processing a content is provided. The electronic device includes a display and a processor configured to generate a first content corresponding to a user input applied to content displayed via the display, and control the display to display a floating UI, which displays first scrap information on the first content, on a screen.
• In accordance with another aspect of the present disclosure, a machine-readable storage medium recording a program for executing a method of processing a content by an electronic device is provided. The method includes generating a first content corresponding to a user input applied to content displayed via the electronic device, and displaying a floating UI, which displays first scrap information on the first content, on a screen of the electronic device.
• The present disclosure can provide a method and an electronic device for allowing the user to easily use a memo function or a scrap function.
  • Other aspects, advantages, and salient features of the disclosure will become apparent to those skilled in the art from the following detailed description, which, taken in conjunction with the annexed drawings, discloses various embodiments of the present disclosure.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The above and other aspects, features, and advantages of certain embodiments of the present disclosure will be more apparent from the following description taken in conjunction with the accompanying drawings, in which:
  • FIG. 1 illustrates a network environment including an electronic device according to various embodiments of the present disclosure;
  • FIG. 2 is a flowchart of a method of processing contents by an electronic device according to various embodiments of the present disclosure;
  • FIG. 3 is a flowchart of another method of processing the contents by the electronic device according to various embodiments of the present disclosure;
  • FIGS. 4A, 4B, 4C, 4D, and 4E illustrate an example of processing contents by the electronic device according to various embodiments of the present disclosure;
• FIGS. 5A, 5B, 5C, and 5D illustrate an example of processing contents by the electronic device according to various embodiments of the present disclosure;
  • FIGS. 6A, 6B, 6C, and 6D illustrate an example of selecting contents according to a user input by the electronic device according to various embodiments of the present disclosure;
  • FIGS. 7A, 7B, 7C, and 7D illustrate another example of processing contents by the electronic device according to various embodiments of the present disclosure;
  • FIGS. 8A and 8B illustrate another example of processing contents by the electronic device according to various embodiments of the present disclosure;
  • FIG. 9 illustrates another example of processing contents by the electronic device according to various embodiments of the present disclosure;
  • FIGS. 10A, 10B, 10C, 10D, 10E, 10F, 10G, and 10H illustrate another example of processing contents by the electronic device according to various embodiments of the present disclosure;
  • FIGS. 11A, 11B, 11C, and 11D illustrate another example of processing contents by the electronic device according to various embodiments of the present disclosure;
  • FIGS. 12A and 12B illustrate another example of processing contents by the electronic device according to various embodiments of the present disclosure;
  • FIGS. 13A, 13B, and 13C illustrate another example of processing contents by the electronic device according to various embodiments of the present disclosure;
  • FIGS. 14A and 14B illustrate another example of processing contents by the electronic device according to various embodiments of the present disclosure;
  • FIGS. 15A, 15B, 15C and 15D illustrate another example of processing contents by the electronic device according to various embodiments of the present disclosure;
  • FIG. 16 illustrates another example of processing contents by the electronic device according to various embodiments of the present disclosure;
  • FIG. 17 illustrates another example of processing contents by the electronic device according to various embodiments of the present disclosure;
  • FIGS. 18A and 18B illustrate another example of processing contents by the electronic device according to various embodiments of the present disclosure;
  • FIGS. 19A, 19B, and 19C illustrate another example of processing contents by the electronic device according to various embodiments of the present disclosure;
  • FIG. 20 is a block diagram of an electronic device according to various embodiments of the present disclosure; and
  • FIG. 21 is a block diagram of a program module according to various embodiments of the present disclosure.
  • Throughout the drawings, like reference numerals will be understood to refer to like parts, components, and structures.
  • DETAILED DESCRIPTION
  • The following description with reference to the accompanying drawings is provided to assist in a comprehensive understanding of various embodiments of the present disclosure as defined by the claims and their equivalents. It includes various specific details to assist in that understanding but these are to be regarded as merely exemplary. Accordingly, those of ordinary skill in the art will recognize that various changes and modifications of the various embodiments described herein can be made without departing from the scope and spirit of the present disclosure. In addition, descriptions of well-known functions and constructions may be omitted for clarity and conciseness.
  • The terms and words used in the following description and claims are not limited to the bibliographical meanings, but, are merely used by the inventor to enable a clear and consistent understanding of the present disclosure. Accordingly, it should be apparent to those skilled in the art that the following description of various embodiments of the present disclosure is provided for illustration purpose only and not for the purpose of limiting the present disclosure as defined by the appended claims and their equivalents.
  • It is to be understood that the singular forms “a,” “an,” and “the” include plural references unless the context clearly dictates otherwise. Thus, for example, reference to “a component surface” includes reference to one or more of such surfaces.
  • In the present disclosure, the expression “have”, “may have”, “include” or “may include” refers to existence of a corresponding feature (e.g., numerical value, function, operation, or components such as elements), and does not exclude existence of additional features.
• Throughout the present disclosure, the words “comprise” and “contain” and variations of the words, for example “comprising” and “comprises”, mean “including but not limited to”, and are not intended to (and do not) exclude other moieties, additives, components, integers or steps.
  • In the present disclosure, the expression “A or B”, “at least one of A or/and B”, or “one or more of A or/and B” may include all possible combinations of the items listed. For example, the expression “A or B”, “at least one of A and B”, or “at least one of A or B” refers to all of (1) including at least one A, (2) including at least one B, or (3) including all of at least one A and at least one B.
  • The expression “a first”, “a second”, “the first”, or “the second” used in various embodiments of the present disclosure may modify various components regardless of the order and/or the importance but does not limit the corresponding components. The expressions may be used to distinguish a component element from another component element. For example, a first user device and a second user device indicate different user devices although both of them are user devices. For example, a first element may be termed a second element, and similarly, a second element may be termed a first element without departing from the scope of the present disclosure.
  • When one element (e.g., a first element) is “(operatively or communicatively) coupled with/to or connected to” another element (e.g., a second element), it should be construed that the one element is directly connected to another element or the one element is indirectly connected to another element via yet another element (e.g., a third element). Conversely, when one component element is “directly coupled” or “directly connected” to another component element, it may be construed that a third component element does not exist between the first component element and the second component element.
• The expression “configured to” used in the present disclosure may be exchanged with, for example, “suitable for”, “having the capacity to”, “designed to”, “adapted to”, “made to”, or “capable of”, according to the situation. The term “configured to” may not necessarily imply “specifically designed to” in hardware. Alternatively, in some situations, the expression “device configured to” may mean that the device, together with other devices or components, “is able to”. For example, the phrase “processor adapted (or configured) to perform A, B, and C” may mean a dedicated processor (e.g., embedded processor) only for performing the corresponding operations or a generic-purpose processor (e.g., central processing unit (CPU) or application processor (AP)) that can perform the corresponding operations by executing one or more software programs stored in a memory device.
  • Features, integers, characteristics, compounds, chemical moieties or groups described in conjunction with a particular aspect, embodiment or example of the disclosure are to be understood to be applicable to any other aspect, embodiment or example described herein unless incompatible therewith.
  • The terms used herein are merely for the purpose of describing particular embodiments and are not intended to limit the scope of other embodiments. As used herein, singular forms may include plural forms as well unless the context clearly indicates otherwise. Unless defined otherwise, all terms used herein, including technical and scientific terms, have the same meaning as those commonly understood by a person skilled in the art to which the present disclosure pertains. Such terms as those defined in a generally used dictionary are to be interpreted to have the meanings equal to the contextual meanings in the relevant field of the art, and are not to be interpreted to have ideal or excessively formal meanings unless clearly defined in the present disclosure. In some cases, even the term defined in the present disclosure should not be interpreted to exclude embodiments of the present disclosure.
  • For example, the electronic device may include at least one of a smartphone, a tablet personal computer (PC), a mobile phone, a video phone, an electronic book (e-book) reader, a desktop PC, a laptop PC, a netbook computer, a personal digital assistant (PDA), a portable multimedia player (PMP), an MP3 player, a mobile medical appliance, a camera, and a wearable device (e.g., a head-mounted-device (HMD) such as electronic glasses, electronic clothes, an electronic bracelet, an electronic necklace, an electronic appcessory, electronic tattoos, or a smart watch).
• According to various embodiments, the electronic device may be a smart home appliance. The home appliance may include at least one of, for example, a television, a digital video disk (DVD) player, an audio, a refrigerator, an air conditioner, a vacuum cleaner, an oven, a microwave oven, a washing machine, an air cleaner, a set-top box, a home automation control panel, a security control panel, a TV box (e.g., Samsung HomeSync™, Apple TV™, or Google TV™), a game console (e.g., Xbox™ and PlayStation™), an electronic dictionary, an electronic key, a camcorder, and an electronic photo frame.
• According to another embodiment, the electronic device may include at least one of various medical devices (e.g., various portable medical measuring devices (a blood glucose monitoring device, a heart rate monitoring device, a blood pressure measuring device, a body temperature measuring device, etc.), a magnetic resonance angiography (MRA), a magnetic resonance imaging (MRI), a computed tomography (CT) machine, and an ultrasonic machine), a navigation device, a global positioning system (GPS) receiver, an event data recorder (EDR), a flight data recorder (FDR), a vehicle infotainment device, electronic devices for a ship (e.g., a navigation device for a ship, and a gyro-compass), avionics, security devices, an automotive head unit, a robot for home or industry, an automatic teller's machine (ATM), a point of sales (POS) in a shop, or an Internet of things device (e.g., a light bulb, various sensors, an electric or gas meter, a sprinkler device, a fire alarm, a thermostat, a streetlamp, a toaster, sporting goods, a hot water tank, a heater, a boiler, etc.).
  • According to various embodiments, the electronic device may include at least one of a part of furniture or a building/structure, an electronic board, an electronic signature receiving device, a projector, and various kinds of measuring instruments (e.g., a water meter, an electric meter, a gas meter, and a radio wave meter). The electronic device according to various embodiments of the present disclosure may be a combination of one or more of the aforementioned various devices. The electronic device according to various embodiments of the present disclosure may be a flexible device. Further, the electronic device according to an embodiment of the present disclosure is not limited to the aforementioned devices, and may include a new electronic device according to the development of technology.
  • Hereinafter, an electronic device according to various embodiments of the present disclosure will be described with reference to the accompanying drawings. As used herein, the term “user” may indicate a person who uses an electronic device or a device (e.g., an artificial intelligence electronic device, an automated device for performing testing operations, etc.) that uses an electronic device.
  • FIG. 1 illustrates a network environment including an electronic device according to various embodiments of the present disclosure.
  • Referring to FIG. 1, a network environment 100 includes an electronic device 101. The electronic device 101 may include a bus 110, a processor 120, a memory 130, an input/output interface 150, a display 160, and a communication interface 170. According to various embodiments, the electronic device 101 may omit at least some of the above components or further include other components.
  • The bus 110 may include a circuit for connecting the components 120 to 170 and transmitting communication between the components (for example, control messages and/or data).
  • The processor 120 may include one or more of a CPU, an AP, and a communication processor (CP). For example, the processor 120 may carry out operations or data processing related to control and/or communication of at least one other component of the electronic device 101.
• The processor 120 may control the electronic device 101 to perform a memo function or a scrap function. According to an embodiment, the electronic device 101 may execute a scrap mode for storing or temporarily storing data, for example, text or an image selected by the user. Further, the processor 120 may control the electronic device 101 to inform the user that a stored or temporarily stored content exists by controlling the display 160 to display a floating user interface (UI).
• The memory 130 may include a volatile memory and/or a non-volatile memory. The memory 130 may store, for example, commands or data related to one or more other components of the electronic device 101. According to an embodiment, the memory 130 may store software and/or a program 140. The program 140 may include, for example, a kernel 141, middleware 143, an application programming interface (API) 145, and/or at least one application program (or “applications”) 147. At least some of the kernel 141, the middleware 143, and the API 145 may be referred to as an operating system (OS).
  • The kernel 141 may control or manage system resources (for example, the bus 110, the processor 120, or the memory 130) used for executing an operation or function implemented by other programs (for example, the middleware 143, the API 145, or the application 147). Furthermore, the kernel 141 may provide an interface through which the middleware 143, the API 145, or the application program 147 may access individual components of the electronic device 101 to control or manage system resources.
  • The middleware 143 may serve as, for example, an intermediary such that the API 145 or the application program 147 communicate with the kernel 141 to transmit/receive data. Furthermore, in regard to task requests received from the application program 147, the middleware 143 may perform a control (for example, scheduling or load balancing) for the task requests using, for example, a method of assigning a priority for using the system resources (for example, the bus 110, the processor 120, or the memory 130) of the electronic device 101 to at least one application.
  • The API 145 is an interface by which the applications 147 control functions provided from the kernel 141 or the middleware 143, and may include, for example, at least one interface or function (for example, instructions) for file control, window control, image processing, or text control.
  • For example, the input/output interface 150 may serve as an interface that may transfer commands or data, which is input from the user or another external device, to the other components of the electronic device 101. Further, the input/output interface 150 may output commands or data of the electronic device 101 to the user or another external device.
  • The display 160 may include, for example, a liquid crystal display (LCD), a light emitting diode (LED) display, an organic light emitting diode (OLED) display, a micro electro mechanical system (MEMS) display, or an electronic paper display. The display 160 may display various contents (for example, text, images, videos, icons, or symbols). The display 160 may include a touch screen and receive, for example, a touch input, a gesture input, a proximity input, or a hovering input using an electronic pen or a user's body part.
  • The communication interface 170 may enable communication between the electronic device 101 and an external device (for example, a first external electronic device 102, a second external electronic device 104, or a server 106). For example, the communication interface 170 may be connected to a network 162 through wireless or wired communication to communicate with the external device (for example, the second external electronic device 104 or the server 106).
  • The wireless communication may include, for example, at least one of long term evolution (LTE), LTE-advanced (LTE-A), code division multiple access (CDMA), wideband CDMA (WCDMA), universal mobile telecommunications system (UMTS), wireless broadband (WiBro), and global system for mobile communications (GSM) as a cellular communication protocol. The wired communication may include at least one of, for example, a universal serial bus (USB), a high definition multimedia interface (HDMI), recommended standard 232 (RS-232), and a plain old telephone service (POTS). The network 162 may include at least one of communication networks such as a computer network (for example, a local area network (LAN) or a wide area network (WAN)), the Internet, and a telephone network.
  • Each of the first and second external electronic devices 102 and 104 may be the same as or different from the electronic device 101. According to an embodiment, the server 106 may include a group of one or more servers.
  • According to various embodiments, all or some of the operations performed by the electronic device 101 may be performed by another electronic device or a plurality of electronic devices (for example, the electronic device 102 or 104 or the server 106). According to an embodiment, when the electronic device 101 must perform a function or service automatically or by request, the electronic device 101 may request that another device (for example, the electronic device 102 or 104 or the server 106) perform at least some of the related functions on its behalf. The other electronic device (for example, the electronic device 102 or 104 or the server 106) may perform the requested function and transfer the result to the electronic device 101. The electronic device 101 may provide the requested functions or services based on the received result or after additionally processing the received result. To this end, for example, cloud computing, distributed computing, or client-server computing technology may be used.
  • Hereinafter, the user input received by the electronic device 101 may be one or more of the touch input, the hovering input, the voice input, and the gesture input. The touch input or the hovering input may be made by a user's body (for example, a finger or the like) or a stylus pen. The voice input may be received from the user through a microphone in the electronic device 101 and the gesture input may be received from the user through a camera in the electronic device.
  • In accordance with an aspect of the present disclosure, an electronic device for processing a content is provided.
  • FIG. 2 is a flowchart of a method of processing contents by an electronic device according to various embodiments of the present disclosure.
  • Referring to FIG. 2, the processor 120 may determine a text or image corresponding to a user input, for example, a selection of text or images through the display 160, in operation S202. When the text or image is selected according to the user input, the processor 120 may capture the text or image and determine the captured text or image as a content in operation S204. The content may be a captured image including both the text and an image selected by the user. For example, suppose the user selects, in operation S202, one area including 100 characters and two still images displayed on the screen. The processor 120 may then capture the 100 characters and the two still images and determine the captured area as the selected content in operation S204.
  • Thereafter, the processor 120 may store the content in operation S206. According to an embodiment, the processor 120 may temporarily store the content in the memory 130. Further, the processor 120 may process the content according to a user input in operation S208. According to an embodiment, the processor 120 may display the content on the display, or store, edit, share, or delete the content according to the user input in operation S208.
  • According to an embodiment, the processor 120 may store or temporarily store a plurality of contents, and process the plurality of stored or temporarily stored contents individually or simultaneously. Before operation S202, the electronic device 101 may store or temporarily store one or more contents. The processor 120 may store or temporarily store up to a predetermined number of contents. For example, the processor 120 may store or temporarily store 20 contents and process each of the 20 contents according to a user input, as in the sketch below.
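  • The flow of FIG. 2 amounts to a small capture-and-store pipeline. The following minimal Kotlin sketch illustrates it under stated assumptions: ScrapContent and ScrapStore are hypothetical names (the disclosure does not name its data structures), and the capacity of 20 mirrors the example above.

```kotlin
import java.time.Instant

// Hypothetical representation of one captured content (operation S204).
data class ScrapContent(val capturedImage: ByteArray, val capturedAt: Instant = Instant.now())

// Minimal store for operation S206: keeps at most `capacity` contents
// (the "predetermined number", e.g. 20), evicting the oldest on overflow.
class ScrapStore(private val capacity: Int = 20) {
    private val items = ArrayDeque<ScrapContent>()

    fun add(content: ScrapContent) {
        if (items.size == capacity) items.removeFirst() // evict the oldest content
        items.addLast(content)
    }

    fun count(): Int = items.size
    fun all(): List<ScrapContent> = items.toList()
}

fun main() {
    val store = ScrapStore()
    store.add(ScrapContent(capturedImage = byteArrayOf())) // store a captured area
    println("temporarily stored contents: ${store.count()}")
}
```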
  • FIG. 3 is a flowchart of another method of processing contents by the electronic device according to various embodiments of the present disclosure.
  • Referring to FIG. 3, the electronic device 101 may execute a scrap mode according to a user input (first input) in operation S302. When the scrap mode is executed, the processor 120 of the electronic device 101 may determine whether a user input (second input) for selecting at least a part of the image (for example, still image or dynamic image) or text displayed on the screen is received in operation S304.
  • When the second input is not received in operation S304, the process returns to operation S304 until a second input is received. In another embodiment, if the second input is not received in operation S304, the processor 120 may maintain the scrap mode without performing any operation. When the second input is received in operation S304, the processor 120 may capture the text or image according to the second input and determine the captured text or image as a first content in operation S306.
  • Subsequently, the processor 120 may store the first content in operation S308. The first content may remain within the electronic device 101 even though power of the electronic device 101 is turned off or the scrap mode is terminated. According to an embodiment, the first content may be stored in a temporary folder and, when power is turned on, the processor 120 may load the first content and display the loaded first content on the display 160.
  • According to another embodiment, the first content may be stored temporarily. When the first content is temporarily stored, the first content may be automatically deleted when power of the electronic device 101 is turned off or the scrap mode is terminated.
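  • A hedged sketch of the two storage policies just described: persistent contents survive power-off or scrap-mode termination, while temporary contents are deleted at either event. The Policy and ScrapLifecycle names are illustrative, not taken from the disclosure.

```kotlin
enum class Policy { PERSISTENT, TEMPORARY }

data class Scrap(val id: Int, val policy: Policy)

class ScrapLifecycle {
    private val scraps = mutableListOf<Scrap>()

    fun store(scrap: Scrap) { scraps += scrap }

    // Power-off or scrap-mode termination: temporary scraps are dropped;
    // persistent scraps remain available for the next power-on.
    fun onScrapModeTerminated() {
        scraps.removeAll { it.policy == Policy.TEMPORARY }
    }

    // On power-on, persistent scraps may be loaded and redisplayed.
    fun loadForDisplay(): List<Scrap> = scraps.filter { it.policy == Policy.PERSISTENT }
}

fun main() {
    val lifecycle = ScrapLifecycle()
    lifecycle.store(Scrap(id = 1, policy = Policy.PERSISTENT))
    lifecycle.store(Scrap(id = 2, policy = Policy.TEMPORARY))
    lifecycle.onScrapModeTerminated()
    println(lifecycle.loadForDisplay()) // only the persistent scrap survives
}
```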
  • The electronic device 101 may display a floating UI including scrap information on the first content and a second content, which was stored before the first content, on the screen in operation S310. The scrap information may include, for example, thumbnail images of the first content and the second content and the number of stored contents. For example, the processor 120 may display thumbnail images corresponding to the first content and the second content in the floating UI as a radial menu. According to an embodiment, the processor 120 may display the floating UI on the uppermost layer (i.e., having the highest z-order).
  • When the first content or the second content is stored, content information, which corresponds to information on the corresponding content, may be stored in the memory 130. The content information may include the number of images included in each of the first content and the second content, a time when each of the first content and the second content was stored, and information on the application that provided the image or text included in each of the first content and the second content (a minimal sketch of such a record follows). When each of the first content and the second content is displayed on the display 160, the content information may be displayed together with the corresponding content.
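  • A minimal sketch of the content information record described above, assuming hypothetical field names (the disclosure specifies only the kinds of information, not a structure):

```kotlin
import java.time.Instant

// Hedged sketch of the per-content metadata: image count, storage time,
// and the application that provided the captured image or text.
data class ContentInfo(
    val imageCount: Int,           // number of images in the content
    val storedAt: Instant,         // time when the content was stored
    val sourceApplication: String  // application the image/text came from
)

fun main() {
    val info = ContentInfo(2, Instant.now(), "browser")
    println("${info.imageCount} image(s) captured from ${info.sourceApplication} at ${info.storedAt}")
}
```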
  • Thereafter, the processor 120 may determine whether a user input (i.e., a third input) for processing the first or second content is received in operation S312. When the third input is not received in operation S312, the processor 120 may control the electronic device 101 to maintain the current state without performing any operation.
  • When the third input is received in operation S312, the processor 120 may process the first content or the second content according to the third input in operation S314. According to an embodiment, the processor 120 may display the first content or the second content on the screen, or store, edit, share, or delete the first content or the second content according to the user input in operation S314. For example, the processor 120 may convert colors of the image included in the first content or the second content into black and white, or extract the text included in the first content or the second content and store contacts such as a phone number and a mail address among the text in the memory 130.
  • Although the scrap mode is first executed and then the content corresponding to the user input is determined in FIG. 3, in another embodiment the operation mode may be automatically switched to the scrap mode when the image or text is selected according to the user input. The scrap mode may also be executed when a predetermined shortcut key is input, or when a predetermined gesture input or voice input for executing the scrap mode is received.
  • In accordance with an aspect of the present disclosure, a method for processing a content at an electronic device is provided. The method includes: generating a content corresponding to a user input among one or more contents displayed through a display of the electronic device as a first content; and displaying a floating UI, which displays first scrap information on the first content, on a screen.
  • FIGS. 4A, 4B, 4C, 4D, and 4E illustrate an example of processing contents by the electronic device according to various embodiments of the present disclosure.
  • Referring to FIG. 4A, the electronic device 101 may execute a first mode 401, a second mode 402, a third mode 403, a fourth mode 404, and a scrap mode 405. According to an embodiment, the processor 120 may display a mode selection table 400 for selecting the operation mode corresponding to a user input on the display 160. The user may select the operation mode of the electronic device 101 by selecting one of the modes 401-405. In FIG. 4A, the electronic device 101 may receive a user input 411 for executing the scrap mode 405 from the user. At this time, the user input 411 may be, for example, tapping, clicking, or double-clicking the scrap mode 405.
  • According to an embodiment, the processor 120 may load the mode selection table 400 according to a user input. For example, the processor 120 may load the mode selection table according to a touch input for selecting a preset position of the screen continuously for a predetermined time or longer. Further, the electronic device 101 may directly enter the scrap mode without loading the mode selection table 400 according to a user input. For example, when an area including the image or text is selected, the scrap mode 405 is automatically executed, and then the area may be captured and stored as content. Further, the floating UI showing scrap information on the content may be displayed.
  • Referring to FIGS. 4B, 4C, 4D, and 4E, an execution screen 421 according to a user input 410 in the scrap mode 405 is illustrated.
  • Referring to FIG. 4B, the user selects a part 430 of the text or image included in the execution screen 421 using a finger or a stylus pen 600. The user may select the image or text within the area 430 by drawing a line from top to bottom of the area 430 including the text or image, which the user desires to select, in the execution screen 421 by using the stylus pen 600. Further, the processor 120 may generate a captured image by capturing the area 430.
  • According to an embodiment, the processor 120 may control the display 160 to distinguish the area 430, which is selected by the user, from other areas. In FIG. 4B, the display 160 may overlay the area 430 selected by the user with a layer of a color different from the background color of the execution screen 421.
  • Referring to FIG. 4C, the captured image 430 is floated over the execution screen 421. As illustrated in FIG. 4C, when the text or image is selected, the processor 120 may capture the area 430 including the text or image. Further, the processor 120 may float the captured image 430 to display it over the execution screen 421.
  • The user may input a command for processing the captured image 430 into the electronic device 101. According to an embodiment, the processor 120 may display an icon 452 corresponding to a command for temporarily storing the captured image 430 as a content. In FIG. 4C, the user inputs a user input 412 for selecting the icon 452 into the electronic device 101. At this time, the user input 412 may be, for example, clicking or double-clicking the icon 452. When the icon 452 is selected, the processor 120 may temporarily store the captured image 430 as a content. According to an embodiment, when the captured image 430 includes text, the processor 120 may display an icon 451 corresponding to a command for extracting and parsing the text on the captured image 430. Further, the electronic device 101 may process the captured image 430 according to a user input selecting at least one of the icons 441, 442, and 443.
  • For example, the processor 120 may store the captured image 430 in the memory 130 or delete the captured image 430 according to a user input by selecting the icon 441. The processor 120 may combine captured images into one file according to the user input by selecting the icon 441. For example, if three contents are displayed on the display 160, the processor 120 may merge the three contents into a single file. For example, the processor 120 may merge the contents into one image or document file (for example, a pdf file).
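  • One plausible way to realize the merge-into-one-document behavior on an Android-based device is android.graphics.pdf.PdfDocument, as sketched below; the disclosure does not name a mechanism, so this is an assumption, with one page rendered per captured image.

```kotlin
import android.graphics.Bitmap
import android.graphics.pdf.PdfDocument
import java.io.OutputStream

// Merge several captured images into a single PDF, one page per image.
fun mergeToPdf(capturedImages: List<Bitmap>, out: OutputStream) {
    val document = PdfDocument()
    capturedImages.forEachIndexed { index, bitmap ->
        val pageInfo = PdfDocument.PageInfo.Builder(bitmap.width, bitmap.height, index + 1).create()
        val page = document.startPage(pageInfo)
        page.canvas.drawBitmap(bitmap, 0f, 0f, null) // draw the captured image
        document.finishPage(page)
    }
    document.writeTo(out) // e.g. a FileOutputStream for the merged .pdf file
    document.close()
}
```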
  • Further, the captured image 430 may be shared with another application or other devices (for example, the electronic devices 102 and 104 or the server 106) according to a user input by selecting the icon 442. According to an embodiment, the processor 120 may determine an application or a device with which the captured image 430 is shared according to a user input. When the application or the device with which the captured image 430 is shared is determined, the processor 120 may convert a format of the captured image 430 into a format suitable for the corresponding application or device. For example, when the captured image 430 is shared with a first application, the processor 120 may convert the captured image 430 into a jpg file. Further, when the captured image 430 is shared with a second application, the processor 120 may convert the captured image 430 into a bmp file. In addition, the one captured image 430 may be simultaneously shared with a plurality of applications or devices. The processor 120 may convert the format of the captured image 430 into a format suitable for each of the applications and devices.
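  • A hedged sketch of the per-target format conversion: Android's Bitmap.compress supports JPEG, PNG, and WEBP (not BMP), so JPEG and PNG stand in for the jpg/bmp pair mentioned above, and the mapping from target application to format is an assumption.

```kotlin
import android.graphics.Bitmap
import java.io.OutputStream

// Convert a captured image into the format suited to the sharing target.
fun exportForTarget(capturedImage: Bitmap, targetApp: String, out: OutputStream) {
    val format = when (targetApp) {
        "firstApp" -> Bitmap.CompressFormat.JPEG // lossy but compact
        else       -> Bitmap.CompressFormat.PNG  // lossless fallback
    }
    capturedImage.compress(format, 90, out) // 90 = quality hint (ignored by PNG)
}
```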
  • According to another embodiment, when the number of captured images 430 is two or more, the processor 120 may combine the captured images 430 into one file and share the file. For example, the processor 120 may combine three captured images 430 into one captured image and share the combined captured image. Further, the processor 120 may share each of the captured images 430 with each of the applications or devices. For example, when the number of captured images 430 is three, the processor 120 may sequentially transmit the three captured images 430 to the applications or the devices.
  • According to another embodiment, when the captured image 430 corresponds to an image generated by capturing only text, the processor 120 may extract characters from the captured image and share only the extracted characters with the applications or the devices. When the captured image 430 includes both the image and the text, the processor 120 may share the captured image 430 by transmitting each of the image and the text extracted from the captured image 430 to the applications or the devices. According to an embodiment, the applications or the devices, with which the captured image 430 is shared, may display both the image and the text or display only one of the image and the text according to the specification of the corresponding applications or devices.
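  • On Android, sharing an image together with its extracted text could use the standard share Intent, as in the sketch below; how the receiving application renders the two parts is up to that application, matching the paragraph above. The content Uri for the stored captured image is an assumed input.

```kotlin
import android.content.Context
import android.content.Intent
import android.net.Uri

// Share a captured image and the text extracted from it in one Intent.
fun shareCapturedContent(context: Context, imageUri: Uri, extractedText: String) {
    val send = Intent(Intent.ACTION_SEND).apply {
        type = "image/*"
        putExtra(Intent.EXTRA_STREAM, imageUri)    // the image part
        putExtra(Intent.EXTRA_TEXT, extractedText) // the text part
        addFlags(Intent.FLAG_GRANT_READ_URI_PERMISSION)
    }
    context.startActivity(Intent.createChooser(send, "Share scrap"))
}
```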
  • Further, the processor 120 may edit information on the captured image 430 (for example, a tag including a photographing time of the image included in the captured image 430, a file size, a resolution, a file path, and the like) according to a user input selecting the icon 443.
  • Referring to FIG. 4D, when the captured image 430 is temporarily stored as a content, the electronic device 101 may display a floating UI 450 on the screen. The floating UI 450 may be a UI which informs the user that the captured image 430 is temporarily stored in the electronic device 101. According to an embodiment, thumbnail images corresponding to one or more temporarily stored contents and the number of temporarily stored contents may be displayed through the floating UI 450. The floating UI 450 may have a form in which the thumbnail images overlap each other in a radial shape, a form in which the thumbnail images are horizontally arranged and overlap each other, or a form in which the thumbnail images are vertically arranged and overlap each other, and may have an animation in which the thumbnail images are rotated every preset time (for example, every second). According to an embodiment, the processor 120 may display the floating UI 450 on the uppermost layer (i.e., with the highest z-order). Accordingly, the floating UI 450 may be displayed in the entire area of the screen of the display 160.
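  • One way to keep a floating UI on the uppermost layer of an Android device is an application overlay window, sketched below. This is an assumption rather than the disclosed mechanism, and it requires the SYSTEM_ALERT_WINDOW permission on recent Android versions.

```kotlin
import android.content.Context
import android.graphics.PixelFormat
import android.view.View
import android.view.WindowManager

// Attach a view as an overlay window so it stays above other app layers
// (i.e., with the highest z-order), like the floating UI 450.
fun showFloatingUi(context: Context, floatingView: View) {
    val params = WindowManager.LayoutParams(
        WindowManager.LayoutParams.WRAP_CONTENT,
        WindowManager.LayoutParams.WRAP_CONTENT,
        WindowManager.LayoutParams.TYPE_APPLICATION_OVERLAY, // uppermost layer
        WindowManager.LayoutParams.FLAG_NOT_FOCUSABLE,       // don't steal input focus
        PixelFormat.TRANSLUCENT
    )
    val wm = context.getSystemService(Context.WINDOW_SERVICE) as WindowManager
    wm.addView(floatingView, params)
}
```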
  • Referring to FIG. 4E, the processor 120 may display a thumbnail image of the content 430 through the floating UI 450 to inform the user that the content 430 is temporarily stored. Further, since one content, the content 430, is temporarily stored, the processor 120 may display "1" via the floating UI to indicate the number of temporarily stored contents.
  • Referring to FIG. 4D, the electronic device 101 receives a user input 413 for selecting the floating UI 450. When the electronic device 101 receives the user input 413, the processor 120 may control the display 160 to display the temporarily stored content 430 in the form of, for example, a popup window.
  • Referring to FIG. 4E, the temporarily stored content 430 is floated in the form of the popup window. Since the content 430 includes text, the processor 120 may display, on the content 430, an icon 451 corresponding to a command for extracting and parsing the text included in the content 430. Further, when the content is loaded through the floating UI 450, the processor 120 may control the display 160 to not display the floating UI 450, or to display the floating UI 450 on a layer below the layer including the content 430 so that the floating UI 450 is hidden by the content 430.
  • FIGS. 5A, 5B, 5C, and 5D illustrate another example of processing the contents by the electronic device according to various embodiments of the present disclosure.
  • Referring to FIG. 5A, an image selected according to a user input is captured and floated on an execution screen 521. When an image on the execution screen 521 is selected according to a user input, the image may be captured, and the icon 452 corresponding to the command for temporarily storing the captured image 531 as a content may be displayed on the image. According to an embodiment, since the captured image 531 does not include text in FIG. 5A, the icon 451 corresponding to the command for parsing the text may not be displayed on the captured image 531.
  • FIG. 5B illustrates an execution screen 521 after the captured image, that is, the content 531, is temporarily stored, and a scrap window 570 showing the temporarily stored contents 531 and 532 together. Referring to FIG. 5B, the floating UI 450 may be displayed on the execution screen 521. Although FIG. 5B illustrates the floating UI 450 on an upper right part of the execution screen 521, the floating UI 450 may be located at any part of the execution screen 521. The floating UI 450 may display the number of temporarily stored contents and a thumbnail image corresponding to each of the temporarily stored contents. In FIG. 5B, a thumbnail image reduced from the captured image, that is, the content 531, is displayed through the floating UI 450.
  • Referring to FIG. 5C, the processor 120 displays another execution screen 522 on the display 160 after the content 531 is temporarily stored. In FIG. 5C, it is assumed that a plurality of contents are temporarily stored. The processor 120 may display, on the execution screen 522, a floating window 560 including the temporarily stored contents or thumbnail images 561, 562, and 563 corresponding to the contents. The floating window 560 may include the thumbnail images corresponding to the temporarily stored contents. For example, when there are 30 stored contents, the floating window 560 may display all 30 thumbnail images. Although only the three thumbnail images 561, 562, and 563 are displayed in the floating window 560 in FIG. 5C, the processor 120 may display more thumbnail images through the floating window 560 according to a user input for scrolling the floating window 560.
  • Referring to FIG. 5D, the temporarily stored contents 531, 532, and 533 are displayed together on a scrap screen. As illustrated in FIG. 5D, the scrap screen may display the temporarily stored contents 531, 532, and 533 sequentially in the order of temporary storage (i.e., either ascending or descending in time).
  • The scrap screen may include icons 571, 572, and 573 corresponding to commands for processing the temporarily stored contents 531-533. The processor 120 may process the contents 531-533 according to a user input for selecting one of the icons 571-573. For example, when the icon 571 is selected by the user, the processor 120 may switch the screen to an execution screen 531. When the icon 572 is selected by the user, the processor 120 may share the corresponding content with another application or other devices (for example, the electronic devices 102 and 104 or the server 106). When the icon 573 is selected by the user, the processor 120 may delete the corresponding content.
  • According to an embodiment, the processor 120 may select one of the contents 531, 532, and 533 and then process the corresponding content according to a user input for selecting one of the icons 571-573. According to another embodiment, the user may select one of the contents 531-533 and request the electronic device 101 to process the corresponding content by dragging the selected content to one of the icons 571-573. According to another embodiment, the user may select one of the contents 531-533 and then request the electronic device 101 to delete the corresponding content by dragging it to the left or right side.
  • FIGS. 6A, 6B, 6C, and 6D illustrate an example of selecting a content according to a user input by the electronic device 101 according to various embodiments of the present disclosure.
  • FIG. 6A illustrates an operation mode of the electronic device 101 in which a scrap mode is selected according to a user input. Referring to FIG. 6A, the electronic device 101 may execute the first to fourth modes 401, 402, 403, and 404, and the scrap mode 405. According to an embodiment, the processor 120 may display the mode selection table 400 for selecting the operation mode corresponding to a user input on the display 160. The user may select the operation mode of the electronic device 101 by selecting one of the modes 401-405. The electronic device 101 may receive a user input 601 for executing the scrap mode 405 from the user. At this time, the user input 601 may be, for example, clicking or double-clicking the scrap mode 405. The user may make the user input 601 by selecting the scrap mode 405 with a stylus pen 600.
  • Referring to FIG. 6B, an area 641 is illustrated in which a content to be temporarily stored is selected according to a user input. The user may select the area 641 including an image or text through the stylus pen 600 and capture the image or text included in the area 641. Further, the user may input a command for temporarily storing the captured image as a content into the electronic device 101.
  • Referring to FIG. 6B, the user selects the area 641 including the image or text by inputting, into the electronic device 101, a user input 602 for selecting a virtual OK button displayed on the execution screen 621 through the display 160.
  • Referring to FIG. 6C, a user input 603 for selecting contents to be temporarily stored is illustrated.
  • Referring to FIG. 6C, the user draws a diagonal line 603 from the upper left part of the execution screen 621 by using the stylus pen 600. The processor 120 may determine a rectangle including the diagonal line as the area including the image or text which the user desires to temporarily store. Referring to FIG. 6D, a captured image 631 generated by the user input 603 in FIG. 6C is illustrated. In order to indicate that the image or text is selected according to the user input 603, the processor 120 may control the display 160 such that a layer with a color different from the background color of the execution screen 621 is located on the area 631 including the image or text.
  • FIGS. 7A, 7B, 7C, and 7D illustrate another example of processing the contents by the electronic device according to various embodiments of the present disclosure.
  • Referring to FIG. 7A, the electronic device 101 selects text or an image from among the images or text included in an execution screen 721 and stores or temporarily stores it as content 730 according to a user input.
  • FIG. 7A illustrates icons 441, 442, 443, 451, and 452 corresponding to commands for processing the stored content 730. When an image or text is selected in the execution screen 721, the processor 120 may capture the selected image or text, and separately float the captured image 731 on the execution screen 721. The user may temporarily store the captured image 731 as a content by inputting a user input for selecting the icon 452.
  • Referring to FIG. 7B, content 731 includes only an image. Referring to FIG. 7C, content 732 includes an image and text. Referring to FIG. 7D, content 733 includes only text. As illustrated in FIGS. 7B to 7D, the electronic device 101 may temporarily store the contents 731, 732, and 733 according to a user input selecting the icon 452. Further, in FIGS. 7C and 7D, the electronic device 101 may extract and parse the text included in the contents 732 and 733 when the icon 451 is selected.
  • FIGS. 8A and 8B illustrate another example of processing the contents by the electronic device according to various embodiments of the present disclosure.
  • Referring to FIG. 8A, a captured image 831 that is generated by capturing an area in an execution screen 821 is floated.
  • Since the content 831 illustrated in FIG. 8A is a still image, the text included in the content 831 is also included as a part of the still image. The processor 120 may extract text data from the content 831 through, for example, an optical character recognition (OCR) scheme. Referring to FIG. 8B, the text data 832 extracted from the content 831 is illustrated. In FIGS. 8A and 8B, the processor 120 may temporarily store the captured image 831 as a content.
  • According to an embodiment, the processor 120 may parse the text data 832. The processor 120 may parse all of the characters, numbers, and special characters included in the text data 832. The processor 120 may share the text data 832 with any application that can be executed by the electronic device 101, or with other devices (for example, the electronic devices 102 and 104 or the server 106), according to the user input for selecting the icon 442. Further, the processor 120 may extract contacts such as an address, a phone number, a mail address, and a website address included in the text data 832 and may determine the extracted data as scrap information on the content 831 (a parsing sketch follows below). The processor 120 may temporarily store the scrap information to correspond to the content 831 and display the scrap information through the floating UI 450.
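  • A hedged sketch of the parsing step, using Android's built-in android.util.Patterns regexes to pull phone numbers, mail addresses, and website addresses out of the OCR'd text; the disclosure does not name a parsing mechanism, and the ScrapInfo name is illustrative.

```kotlin
import android.util.Patterns
import java.util.regex.Pattern

// Extract contact-like tokens from text data such as the text data 832.
data class ScrapInfo(val phones: List<String>, val emails: List<String>, val urls: List<String>)

fun parseScrapInfo(textData: String): ScrapInfo {
    fun matchesOf(pattern: Pattern): List<String> {
        val matcher = pattern.matcher(textData)
        val found = mutableListOf<String>()
        while (matcher.find()) found += matcher.group()
        return found
    }
    return ScrapInfo(
        phones = matchesOf(Patterns.PHONE),
        emails = matchesOf(Patterns.EMAIL_ADDRESS),
        urls = matchesOf(Patterns.WEB_URL)
    )
}
```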
  • According to an embodiment, the processor 120 may display, on the execution screen 821, a predetermined image (for example, a loading image) indicating that the text data 832 is being extracted until the text data 832 can be displayed on the display 160 as illustrated in FIG. 8B.
  • FIG. 9 illustrates another example of processing the content by the electronic device 101 according to various embodiments of the present disclosure.
  • Referring to FIG. 9, an example is illustrated in which the processor 120 extracts information from the text data 832 as results 931-935. According to an embodiment, the processor 120 may separately extract only an image from the content 831. Further, the processor 120 may control the display 160 to display a result of the analysis of the text data 832 close to the text data 832. In FIG. 9, the results 931 to 935 of the analysis of the text data 832 are displayed on the lower part of the text data 832.
  • The processor 120 may extract a phone number 931 from the text data 832. Further, the processor 120 may also display icons 941 corresponding to a command for processing the phone number 931. The processor 120 may share the phone number 931 with another application or an electronic device (for example, the electronic device 102 or 104, or the server 106) according to a user input applied to one of the icons 941. For example, the processor 120 may call the phone number 931, send a mail, send a text message to the phone number 931, or add the phone number 931 to an address book when one of the icons 941 is selected. According to an embodiment, when there are multiple phone numbers included in the text data 832, the processor 120 may sequentially display phone numbers 961 and 962 and display a mail address 963 together with the phone numbers 961 and 962.
  • The processor 120 may extract an email address 932 from the text data 832. Further, the processor 120 may also display icons 942 corresponding to a command for processing the email address 932. The processor 120 may share the email address 932 with another application or an electronic device (for example, the electronic device 102 or 104, or the server 106) by selecting one of the icons 942. Further, the processor 120 may share the email address 932, transmit an email to the email address 932, transmit a message to the email address 932, or add the email address 932 to the address book by selecting one of the icons 942.
  • The processor 120 may extract schedule information 933 from the text data 832. When the schedule information 933 is extracted, the processor 120 may manage the user's schedule by adding the schedule information 933 to a calendar app. Further, the processor 120 may also display icons 943 corresponding to a command for processing the schedule information 933. The processor 120 may share the schedule information 933 with another application or an electronic device (for example, the electronic device 102 or 104, or the server 106) by selecting one of the icons 943. Further, the processor 120 may share the schedule information 933 or add the schedule information 933 to the calendar app preinstalled in the electronic device 101 by selecting one of the icons 943.
  • The processor 120 may extract a website address from the text data 832 and display the website 934 below the text data 832. Further, the processor 120 may also display icons 944 corresponding to a command for processing the website address. The processor 120 may share the website 934 with another application or an electronic device (for example, the electronic device 102 or 104, or the server 106) by selecting one of the icons 944. Further, the processor 120 may browse the website 934 through a browser or add the website 934 to a bookmark list by selecting one of the icons 944.
  • The processor 120 may extract map data 935 (for example, an address or a GPS coordinate) from the text data 832. The processor 120 may display the map 935 including a position corresponding to the map data below the text data 832. Further, the processor 120 may also display icons 945 corresponding to a command for processing the map data 935 together with the map data 935. The processor 120 may share the map data 935 with another application or an electronic device (for example, the electronic device 102 or 104, or the server 106) by selecting one of the icons 945. Further, the processor 120 may display an address included in the text data on the map 935 by selecting one of the icons 945.
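  • The per-item actions described for the icons 941-944 map naturally onto standard Android Intents; a hedged illustration follows (the disclosure does not state that these Intents are the mechanism used).

```kotlin
import android.content.Context
import android.content.Intent
import android.net.Uri

// Dial an extracted phone number (e.g. the phone number 931).
fun dial(context: Context, phone: String) =
    context.startActivity(Intent(Intent.ACTION_DIAL, Uri.parse("tel:$phone")))

// Compose a mail to an extracted address (e.g. the email address 932).
fun composeMail(context: Context, address: String) =
    context.startActivity(Intent(Intent.ACTION_SENDTO, Uri.parse("mailto:$address")))

// Open an extracted website (e.g. the website 934) in a browser.
fun openWebsite(context: Context, url: String) =
    context.startActivity(Intent(Intent.ACTION_VIEW, Uri.parse(url)))
```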
  • FIGS. 10A, 10B, 10C, 10D, 10E, 10F, 10G, and 10H illustrate another example of processing the contents by the electronic device according to various embodiments of the present disclosure.
  • Referring to FIG. 10A, the processor 120 may capture an area including an image and text from an execution screen 1021 and temporarily store the captured image 1031 as a first content. In FIG. 10A, the temporarily stored first content 1031 is floated and displayed. When the first content 1031 is temporarily stored, a thumbnail image corresponding to the first content 1031 may be displayed in the floating UI 450.
  • Referring to FIG. 10B, when it is assumed that the only temporarily stored content is the first content 1031, the floating UI 450 may display "1" as the number of temporarily stored contents.
  • Subsequently, the processor 120 may capture an area including one image from an execution screen 1022 and temporarily store the captured image 1032 as a second content. Referring to FIG. 10C, a user input for selecting the area including the image from the execution screen 1022 is illustrated. The user may select the area 1032 through the stylus pen 600 and capture the image or text included in the area 1032.
  • Referring to FIG. 10D, the captured image 1032 is floated on the execution screen 1022. The processor 120 may temporarily store the captured image 1032. Further, the processor 120 may convert the captured image 1032 into the second content and temporarily store the second content.
  • Referring to FIG. 10E, the processor 120 may display, on the execution screen 1022, the floating UI 450 to which a thumbnail image corresponding to the second content 1032 is added. Since the number of temporarily stored contents is 2, the floating UI 450 may display “2” as the number of temporarily stored contents.
  • Subsequently, the processor 120 may capture an area including three images from the execution screen 1022 and temporarily store the captured image 1033 as a third content. Referring to FIG. 10F, a user input for selecting the area including the three images from the execution screen 1022 is illustrated. The user may select the area 1033 through the stylus pen 600 to determine the area including the image or text to be temporarily stored. Referring to FIG. 10G, the temporarily stored third content 1033 is floated on the execution screen 1022. The processor 120 may convert the captured image 1033 into the third content 1033 and temporarily store the third content.
  • The processor 120 may display, on the execution screen 1022, the floating UI 450 to which a thumbnail image corresponding to the third content 1033 is added. Since the number of temporarily stored contents is 3, the floating UI 450 may display "3" as the number of temporarily stored contents. Referring to FIG. 10H, the thumbnail images displayed through the floating UI 450 may partially overlap each other. At this time, among the thumbnail images corresponding to the temporarily stored contents, the floating UI 450 may display first the thumbnail image corresponding to the most recently stored content. In contrast, the floating UI 450 may instead display first the thumbnail image corresponding to the earliest stored content.
  • FIGS. 11A, 11B, 11C, and 11D illustrate another example of processing the contents by the electronic device 101 according to various embodiments of the present disclosure.
  • Referring to FIG. 11A, a popup window 1151 including thumbnail images 1131, 1132, and 1133 corresponding to temporarily stored contents is displayed on an execution screen 1121. The popup window 1151 is an example of the floating UI 450 and may be displayed on the execution screens 1121, 1122, and 1123 together with thumbnail images of temporarily stored contents. The popup window 1151 may be displayed at any position of the execution screen 1121 and may be moved according to a user input. In FIG. 11A, the popup window 1151 is displayed on an upper right part of the execution screen 1121. Referring to FIG. 11B, a popup window 1152 including thumbnail images 1134, 1135, and 1136 corresponding to the temporarily stored contents is displayed on a lower left part of the execution screen 1122.
  • Referring to FIG. 11A, a user input 1101 for selecting one thumbnail image 1133 and dragging the selected thumbnail image 1133 in a diagonal direction is input. The processor 120 may move the popup window 1151, together with the thumbnail images 1131-1133, according to the user input 1101. Referring to FIG. 11B, the popup window 1152, moved according to the user input 1101, is displayed on a lower left part of the execution screen 1122. FIGS. 11C and 11D illustrate an example of moving the temporarily stored contents 1134 to 1136 to the execution screen 1123 by using the popup window.
  • Referring to FIG. 11C, the processor 120 displays a text message window according to a message transmission/reception application as the execution screen 1123. The electronic device 101 receives a user input 1102 for selecting the thumbnail image 1134 and dragging the selected thumbnail image 1134 in a diagonal direction. According to an embodiment, the electronic device 101 may move the thumbnail image 1134 to an input window 1171 of the execution screen 1123.
  • The processor 120 may move the content 1134 corresponding to the thumbnail image 1134 to the execution screen 1123 according to the user input 1102, as illustrated in FIG. 11C. That is, the processor 120 may allow the user to use the content 1134 in an application by moving the content 1134 according to the user input 1102. Referring to FIG. 11D, the user may transmit the content 1134 to another device through the message transmission/reception application.
  • FIGS. 12A and 12B illustrate another example of processing the contents by the electronic device according to various embodiments of the present disclosure. In FIGS. 12A and 12B, screens displaying a list 1210 of temporarily stored contents are illustrated.
  • Referring to FIG. 12A, the processor 120 may generate a list 1210 in which the temporarily stored contents 1211, 1212, and 1213 are arranged based on when the contents 1211-1213 were stored. The processor 120 may execute a command corresponding to each of the icons 441, 442, and 443. For example, the processor 120 may store, delete, share, or edit at least one of the contents 1211-1213 according to the user input. Referring to FIG. 12A, a user input 1201 for deleting the first content 1211 is input. The processor 120 may remove the selected first content 1211 from the list 1210. Referring to FIG. 12B, the list 1210 from which the first content 1211 has been removed is illustrated.
  • According to another embodiment, the processor 120 may delete the corresponding content from the list 1210 by receiving a user input for selecting the icon 1223 displayed on each of the contents 1211-1213.
  • FIGS. 13A, 13B, and 13C illustrate another example of processing the contents by the electronic device according to various embodiments of the present disclosure.
  • Referring to FIG. 13A, the processor 120 may receive a user input for selecting text displayed on an execution screen 1311 through the stylus pen 600. The processor 120 may control the display 160 such that the selected text 1331 is underlined. An area including the selected text or image may be captured and temporarily stored as a content. As illustrated in FIG. 13A, the user may select a desired image or text by inputting a user input for underlining the desired image or text into the electronic device 101.
  • Referring to FIG. 13B, when the text or image is selected, the processor 120 may display an editing menu 1350. The user may determine a method of processing the selected image or text by selecting one of the menus included in the editing menu 1350. In FIG. 13B, the user selects the "see together" menu. Referring to FIG. 13C, an operation according to the menu selected in FIG. 13B is executed, and the execution screen displays only the image or the text selected by the user in FIG. 13A.
  • FIGS. 14A and 14B illustrate another example of processing the contents by the electronic device according to various embodiments of the present disclosure.
  • Referring to FIG. 14A, a user input 1401 for the “see together” function is provided. The processor 120 may control the display 160 to display only the image or the text selected by the user according to the user input 1401.
  • Referring to FIG. 14B, when the image or the text is selected according to the user input, the processor 120 may control the display 160 to distinguish between texts included in different paragraphs. In FIG. 14B, a text included in a first paragraph and a text included in a second paragraph are selected according to a user input.
  • FIGS. 15A, 15B, 15C, and 15D illustrate another example of processing the contents by the electronic device according to various embodiments of the present disclosure.
  • Referring to FIG. 15A, an execution screen 1521 illustrates execution of a first application.
  • Referring to FIG. 15A, the processor 120 may capture an area 1531 including an image or a text corresponding to a user input and temporarily store the captured image as content 1532.
  • The processor 120 may move the content 1532 to an execution screen that executes an application different from the first application according to a user input.
  • Referring to FIG. 15B, the content 1532 is moved to an execution screen 1522 that executes a second application. Since the content 1532 includes both the image and the text, the second application may receive both the image and the text included in the content 1532. According to another embodiment, scrap information on the content 1532 may be input into the second application. The user may move the content 1532 to the second application by dragging the selected image or text to the execution screen 1522.
  • Referring to FIG. 15C, the content 1532 is displayed on the execution screen 1522 that executes the second application. According to an embodiment, the processor 120 may copy the content 1532 to the second application.
  • Referring to FIG. 15D, the content 1532 is moved or copied to an application other than the first application. In FIG. 15D, a third application is a messenger. The user may display the content 1532 in a messenger window 1523 by dragging the content 1532 to the messenger window 1523.
  • FIG. 16 illustrates another example of processing the contents by the electronic device according to various embodiments of the present disclosure.
  • Referring to FIG. 16, the processor 120 may extract a website address 1620 displayed in the browser window 1610 according to a user input. For example, the user input may be a flick on the browser window 1610 by using a finger or the stylus pen 600, and another type of input (for example, a hovering input, a voice input, a gesture input, or the like) may be recognized by the electronic device 101 as the user input for extracting the website address 1620.
  • When the website address 1620 is extracted, the processor 120 may temporarily store the website address 1620 and display the website address 1620 on the right side of the browser window 1610 as illustrated in FIG. 16. According to another embodiment, the user may delete the website address 1620 by a directional drag input on the website address 1620 from top to bottom or from bottom to top.
  • FIG. 17 illustrates another example of processing the contents by the electronic device according to various embodiments of the present disclosure.
  • Referring to FIG. 17, when the electronic device 101 is implemented to have dual monitors, the processor 120 may temporarily store the extracted website address 1620 and copy and input the website address 1620 into an execution screen 1720 of a different application. For example, the processor 120 may copy the website address 1620 extracted from the browser window 1610 and input the website address 1620 into an input window 1721 of the execution screen 1720.
  • FIGS. 18A and 18B illustrate another example of processing the contents by the electronic device according to various embodiments of the present disclosure.
  • Referring to FIGS. 18A and 18B, an example of a hypertext markup language (HTML) document displayed by a browser application is illustrated. When text or images included in the Internet page are selected, the processor 120 may store the selected text or images as a content. The processor 120 may determine the position (for example, a line) of the HTML document at which the part selected by the user is located. The processor 120 may insert "f727t732f890t932" into a uniform resource locator (URL) corresponding to lines 727-732 and 890-932 of the HTML document and store the URL or transmit the URL to another external device (for example, the electronic device 102 or 104, or the server 106). For example, the processor 120 may transmit "http://ss.com/253#f728t760f820t884" to another external electronic device. The processor 120 may show the images or text corresponding to the lines 727-732 and the lines 890-932. For example, the processor 120 may control the display 160 to show the corresponding area by scrolling the Internet page, so that the user can intuitively identify the area including the images or text corresponding to the lines 727-732 and the lines 890-932 by using the URL (a sketch of the fragment format follows below). Further, the processor 120 may display the area including the images or text corresponding to the lines 727-732 and the lines 890-932 in contrast with the other areas of the Internet page.
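  • The line-range fragment can be encoded and decoded mechanically. The sketch below infers the grammar from the single example above (repeated "f<from>t<to>" pairs after "#"); the exact format is otherwise unspecified in the disclosure.

```kotlin
// Encode selected line ranges as a URL fragment like "f728t760f820t884".
fun encodeRanges(ranges: List<IntRange>): String =
    ranges.joinToString("") { "f${it.first}t${it.last}" }

// Decode such a fragment back into line ranges.
fun decodeRanges(fragment: String): List<IntRange> =
    Regex("""f(\d+)t(\d+)""").findAll(fragment)
        .map { it.groupValues[1].toInt()..it.groupValues[2].toInt() }
        .toList()

fun main() {
    val url = "http://ss.com/253#" + encodeRanges(listOf(728..760, 820..884))
    println(url)                                   // http://ss.com/253#f728t760f820t884
    println(decodeRanges(url.substringAfter('#'))) // [728..760, 820..884]
}
```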
  • FIGS. 19A, 19B, and 19C illustrate another example of processing the contents by the electronic device according to various embodiments of the present disclosure.
  • Referring to FIG. 19A, an execution screen 1921 includes characters or text, and the user selects text by using the stylus pen 600. Further, the processor 120 may display a menu bar 1925 for processing the selected text 1932. In FIG. 19A, the user selects a "memo" function from the menu bar 1925.
  • Referring to FIG. 19B, since the memo function is selected, the processor 120 may control the display 160 to float and display a memo window 1940 on the execution screen 1921. The processor 120 may distinguish, through the display 160, the text 1932 selected by the user from text that is not selected. Further, the electronic device 101 may receive a memo through the user's finger or the stylus pen 600. At this time, the memo may be related to the selected text 1932.
  • Referring to FIG. 19C, after the memo is completely input, the processor 120 may hide the memo window 1941. Further, the processor 120 may inform the user that the memo related to the text 1932 exists by displaying a memo icon 1942.
  • FIG. 20 is a block diagram of an electronic device according to various embodiments of the present disclosure.
  • Referring to FIG. 20, an electronic device 2001 may include, for example, all or some of the electronic device 101 illustrated in FIG. 1. The electronic device 2001 may include at least one AP 2010, a communication module 2020, a subscriber identification module (SIM) card 2024, a memory 2030, a sensor module 2040, an input device 2050, a display 2060, an interface 2070, an audio module 2080, a camera module 2091, a power management module 2095, a battery 2096, an indicator 2097, and a motor 2098.
  • The AP 2010 may control a plurality of hardware or software components connected thereto by driving an operating system or an application program and perform a variety of data processing and calculations. The AP 2010 may be embodied as, for example, a system on chip (SoC). According to an embodiment, the AP 2010 may further include a graphic processing unit (GPU) and/or an image signal processor. The AP 2010 may include some of the components (for example, the cellular module 2021) illustrated in FIG. 20. The AP 2010 may load instructions or data, received from at least one other element (for example, a non-volatile memory), in a volatile memory to process the loaded instructions or data and may store various types of data in a non-volatile memory.
  • The communication module 2020 may be similar to the communication interface 170 of FIG. 1. The communication module 2020 may include, for example, a cellular module 2021, a Wi-Fi module 2023, a Bluetooth (BT) module 2025, a GPS module 2027, a near field communication (NFC) module 2028, and a radio frequency (RF) module 2029.
  • The cellular module 2021 may provide a voice call, image call, short messaging service (SMS), or Internet service through, for example, a communication network. According to an embodiment, the cellular module 2021 may distinguish between and authenticate electronic devices 2001 within a communication network using a subscriber identification module (for example, the SIM card 2024). According to an embodiment of the present disclosure, the cellular module 2021 may perform at least some of functions that the AP 2010 may provide. According to an embodiment of the present disclosure, the cellular module 2021 may include a CP.
  • The Wi-Fi module 2023, the BT module 2025, the GPS module 2027, and the NFC module 2028 may each include, for example, a processor for processing data transmitted/received through the corresponding module. According to an embodiment, at least some (two or more) of the cellular module 2021, the Wi-Fi module 2023, the BT module 2025, the GPS module 2027, and the NFC module 2028 may be included in one integrated chip (IC) or IC package.
  • The RF module 2029 may transmit/receive, for example, a communication signal (e.g., an RF signal). The RF module 2029 may include, for example, a transceiver, a power amp module (PAM), a frequency filter, a low noise amplifier (LNA), or an antenna. According to another embodiment, at least one of the cellular module 2021, the Wi-Fi module 2023, the BT module 2025, the GPS module 2027, and the NFC module 2028 may transmit/receive an RF signal through a separate RF module.
  • The SIM card 2024 may include, for example, a card including a subscriber identification module and/or an embedded SIM, and may further include unique identification information (for example, an Integrated Circuit Card Identifier (ICCID)) or subscriber information (for example, International Mobile Subscriber Identity (IMSI)).
  • The memory 2030 (for example, the memory 130) may include, for example, an internal memory 2032 or an external memory 2034. The internal memory 2032 may include at least one of a volatile memory (for example, a dynamic random access memory (DRAM), a static RAM (SRAM), a synchronous dynamic RAM (SDRAM), and the like) and a non-volatile memory (for example, a one time programmable read only memory (OTPROM), a programmable ROM (PROM), an erasable and programmable ROM (EPROM), an electrically erasable and programmable ROM (EEPROM), a mask ROM, a flash ROM, a flash memory (for example, a NAND flash memory or a NOR flash memory), a hard disc drive, a solid state drive (SSD), and the like).
  • The external memory 2034 may further include a flash drive, for example, a compact flash (CF), a secure digital (SD), a micro secure digital (Micro-SD), a mini secure digital (Mini-SD), an extreme Digital (xD), a memory stick or the like. The external memory 2034 may be functionally and/or physically connected to the electronic device 2001 through various interfaces.
  • The sensor module 2040 may measure a physical quantity or detect an operation state of the electronic device 2001, and may convert the measured or detected information to an electrical signal. The sensor module 2040 may include at least one of, for example, a gesture sensor 2040A, a gyro sensor 2040B, an atmospheric pressure sensor 2040C, a magnetic sensor 2040D, an acceleration sensor 2040E, a grip sensor 2040F, a proximity sensor 2040G, a color sensor 2040H (for example, a red/green/blue (RGB) sensor), a biometric sensor 2040I, a temperature/humidity sensor 2040J, an illumination sensor 2040K, and an ultraviolet (UV) sensor 2040M. Additionally or alternatively, the sensor module 2040 may include an E-nose sensor, an electromyography (EMG) sensor, an electroencephalogram (EEG) sensor, an electrocardiogram (ECG) sensor, an infrared (IR) sensor, an iris sensor, and/or a fingerprint sensor. The sensor module 2040 may further include a control circuit for controlling at least one sensor included therein. In some embodiments, the electronic device 2001 may further include a processor configured to control the sensor module 2040, as a part of or separately from the AP 2010, and may control the sensor module 2040 while the AP 2010 is in a sleep state.
  • The input device 2050 may include, for example, a touch panel 2052, a (digital) pen sensor 2054, a key 2056, or an ultrasonic input device 2058. The touch panel 2052 may use at least one of, for example, a capacitive type, a resistive type, an infrared type, and an ultrasonic type. The touch panel 2052 may further include a control circuit. The touch panel 2052 may further include a tactile layer and provide a tactile reaction to the user.
  • The (digital) pen sensor 2054 may be, for example, a part of the touch panel 2052 or may include a separate recognition sheet. The key 2056 may include, for example, a physical button, an optical key or a keypad. The ultrasonic input unit 2058 may identify data by detecting an acoustic wave with a microphone (for example, microphone 2088) of the electronic device 2001 through an input unit for generating an ultrasonic signal.
  • The display 2060 (for example, the display 160) may include a panel 2062, a hologram device 2064, or a projector 2066. The panel 2062 may include a component equal or similar to the display 160 of FIG. 1. The panel 2062 may be embodied to be, for example, flexible, transparent, or wearable. The panel 2062 may also be integrated with the touch panel 2052 as a single module. The hologram device 2064 may show a stereoscopic image in the air using interference of light. The projector 2066 may project light onto a screen to display an image. For example, the screen may be located inside or outside the electronic device 2001. According to an embodiment of the present disclosure, the display 2060 may further include a control circuit (not shown) for controlling the panel 2062, the hologram device 2064, or the projector 2066.
  • The interface 2070 may include, for example, an HDMI 2072, a USB 2074, an optical interface 2076, or a D-subminiature (D-sub) 2078. The interface 2070 may be included in, for example, the communication interface 170 illustrated in FIG. 1. Additionally or alternatively, the interface 2070 may include, for example, a mobile high-definition link (MHL) interface, an SD card/multi-media card (MMC) interface, or an infrared data association (IrDA) standard interface.
  • The audio module 2080 may bilaterally convert, for example, a sound and an electrical signal. At least some components of the audio module 2080 may be included in, for example, the input/output interface 150 illustrated in FIG. 1. The audio module 2080 may process voice information input or output through, for example, a speaker 2082, a receiver 2084, earphones 2086, or the microphone 2088.
  • The camera module 2091 may photograph a still image and a dynamic image. According to an embodiment, the camera module 2091 may include one or more image sensors (for example, a front sensor or a back sensor), a lens, an image signal processor (ISP), or a flash (for example, an LED or xenon lamp).
  • The power management module 2095 may manage, for example, power of the electronic device 2001. According to an embodiment, the power management module 2095 may include a power management integrated circuit (PMIC), a charger IC, or a battery or fuel gauge. The PMIC may have a wired and/or wireless charging scheme. A magnetic resonance scheme, a magnetic induction scheme, or an electromagnetic scheme may be exemplified as the wireless charging method, and an additional circuit for wireless charging, such as a coil loop circuit, a resonance circuit, a rectifier circuit, and the like may be added. The battery gauge may measure, for example, a residual quantity of the battery 2096, a voltage, a current, or a temperature during charging. The battery 2096 may include, for example, a rechargeable battery or a solar battery.
  • The indicator 2097 may indicate a particular status of the electronic device 2001 or a part thereof (for example, the AP 2010), for example, a booting status, a message status, a charging status, or the like. The motor 2098 may convert an electrical signal into mechanical vibrations, and may generate a vibration or haptic effect. Although not illustrated, the electronic device 2001 may include a processing device (for example, a GPU) for supporting mobile TV. The processing device for supporting mobile TV may process media data according to a standard such as digital multimedia broadcasting (DMB), digital video broadcasting (DVB), MediaFLO™, or the like.
  • Each of the above-described elements of the electronic device 2001 may be implemented by one or more components, and the name of the corresponding element may vary depending on the type of the electronic device 2001. In various embodiments of the present disclosure, the electronic device 2001 may include at least one of the above-described elements, may omit some of them, or may further include additional elements. Further, some of the elements of the electronic device 2001 according to various embodiments may be combined into one entity that performs the same functions as the corresponding elements performed before being combined.
  • FIG. 21 is a block diagram of a program module according to various embodiments of the present disclosure.
  • Referring to FIG. 21, a program module 2110 (for example, the program 140) may include an OS for controlling resources related to the electronic device (for example, the electronic device 101) and/or various applications (for example, the application programs 147) executed in the OS. The OS may be, for example, Android®, iOS®, Windows®, Symbian®, Tizen®, Bada®, or the like.
  • The program module 2110 may include a kernel 2120, middleware 2130, an API 2160, and/or at least one application 2170. At least some of the program module 2110 may be preloaded in the electronic device or downloaded from a server (for example, the server 106).
  • The kernel 2120 (for example, the kernel 141 of FIG. 1) may include, for example, a system resource manager 2121 or a device driver 2123. The system resource manager 2121 may perform the control, allocation, or collection of the system resources. According to an embodiment, the system resource manager 2121 may include a process manager, a memory manager, or a file system manager. The device driver 2123 may include, for example, a display driver, a camera driver, a BT driver, a shared-memory driver, a USB driver, a keypad driver, a Wi-Fi driver, an audio driver, or an inter-process communication (IPC) driver.
  • The middleware 2130 may provide a function required by the application 2170 through the API 2160 so that the application 2170 can efficiently use limited system resources of the electronic device. According to an embodiment, the middleware 2130 (for example, the middleware 143) may include, for example, at least one of a runtime library 2135, an application manager 2141, a window manager 2142, a multimedia manager 2143, a resource manager 2144, a power manager 2145, a database manager 2146, a package manager 2147, a connectivity manager 2148, a notification manager 2149, a location manager 2150, a graphic manager 2151, and a security manager 2152.
  • The runtime library 2135 may include, for example, a library module that a compiler uses to add new functions through a programming language while executing the application 2170. The runtime library 2135 may perform input/output management, memory management, or an arithmetic function.
  • The application manager 2141 may manage, for example, a life cycle of the application 2170. The window manager 2142 may manage graphical user interface (GUI) resources used on a screen. The multimedia manager 2143 may detect a format required for reproducing various media files and encode or decode a media file using a codec appropriate for the corresponding format. The resource manager 2144 may manage resources such as a source code, a memory, or a storage space of the application 2170.
  • The power manager 2145 may operate together with a basic input/output system (BIOS) to manage a battery or power and may provide power information required for the operation of the electronic device. The database manager 2146 may generate, search, or change a database to be used by at least one of the applications 2170. The package manager 2147 may manage the installation or updating of applications distributed in the form of a package file.
  • For example, the connectivity manager 2148 may manage wireless connections, such as Wi-Fi or BT. The notification manager 2149 may display or notify a user of an event, such as a received message, an appointment, or a proximity notification, in a manner that does not disturb the user. The location manager 2150 may manage location information of the electronic device. The graphic manager 2151 may manage graphic effects to be provided to a user, and the related user interfaces. The security manager 2152 may provide various security functions required for system security or user authentication. According to an embodiment, when the electronic device (for example, the electronic device 101) has a call function, the middleware 2130 may further include a telephony manager for managing a voice call function or a video call function of the electronic device.
  • The middleware 2130 may include a middleware module that forms a combination of various functions of the aforementioned components. The middleware 2130 may provide modules specialized according to the type of operating system in order to provide differentiated functions. In addition, the middleware 2130 may dynamically remove some existing components or add new components.
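  • As a rough illustration of this layering, each manager can be thought of as a narrow service interface that applications reach through the API layer. The interfaces below are hypothetical sketches of the roles described above, not APIs defined by this disclosure:

    /** Hypothetical sketches of middleware manager roles (illustrative only). */
    interface ApplicationManagerRole {
        /** Tracks an application's life cycle (e.g. created, resumed, destroyed). */
        void onLifecycleEvent(String packageName, String event);
    }

    interface NotificationManagerRole {
        /** Surfaces an event (message, appointment, proximity alert) unobtrusively. */
        void notifyUser(String eventType, String payload);
    }

    interface PackageManagerRole {
        /** Installs or updates an application distributed as a package file. */
        void installOrUpdate(java.io.File packageFile);
    }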
  • The API 2160 (for example, the API 145), which is a set of API programming functions, may have a different configuration according to the operating system. For example, one API set may be provided per platform in the case of Android® or iOS®, and two or more API sets may be provided in the case of Tizen®.
  • The at least one application 2170 (for example, the applications 147) may include one or more of a home application 2171, a diary application 2172, an SMS/multimedia message service (MMS) application 2173, an instant message (IM) application 2174, a browser application 2175, a camera application 2176, an alarm application 2177, a contact application 2178, a voice dial application 2179, an e-mail application 2180, a calendar application 2181, a media player application 2182, an album application 2183, a clock application 2184, a health care application (for example, an application for measuring a work rate or blood sugar) (not shown), and an environmental information application (for example, an application for providing atmospheric pressure, humidity, or temperature information) (not shown).
  • According to an embodiment, the applications 2170 may include an application (hereinafter referred to as an “information exchange application” for convenience of description) supporting information exchange between the electronic device (for example, the electronic device 101) and external electronic devices (for example, the electronic devices 102 and 104). The information exchange application may include, for example, a notification relay application for transmitting specific information to the external electronic device, or a device management application for managing the external electronic device.
  • For example, the notification relay application may include a function of transferring, to the external electronic device (for example, the electronic device 102 or 104), notification information generated from other applications of the electronic device 101 (for example, an SMS/MMS application, an e-mail application, a health management application, or an environmental information application). Further, the notification relay application may receive notification information from, for example, the external electronic device and provide the received notification information to the user. The device management application may manage (for example, install, delete, or update), for example, a function for at least a part of the external electronic device (for example, the electronic device 104) communicating with the electronic device (for example, turning on/off the external electronic device itself (or some elements thereof) or adjusting brightness (or resolution) of a display), applications executed in the external electronic device, or services provided from the external electronic device (for example, a telephone call service or a message service).
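  • The relay behavior described above can be pictured with a short sketch. All class and method names below are hypothetical illustrations of a notification relay, not part of the disclosure; in practice the transport to each external device would be a BT, Wi-Fi, or server connection:

    import java.util.List;

    /** Hypothetical relay that forwards local notification info to paired devices. */
    public class NotificationRelay {

        /** Minimal stand-in for a transport to one external electronic device. */
        public interface ExternalDevice {
            void send(String sourceApp, String notificationText);
        }

        private final List<ExternalDevice> pairedDevices;

        public NotificationRelay(List<ExternalDevice> pairedDevices) {
            this.pairedDevices = pairedDevices;
        }

        /** Called when another application (SMS/MMS, e-mail, ...) raises a notification. */
        public void onLocalNotification(String sourceApp, String text) {
            for (ExternalDevice device : pairedDevices) {
                device.send(sourceApp, text);
            }
        }
    }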
  • According to an embodiment, the applications 2170 may include an application (for example, health management application) designated according to attributes (for example, attributes of the electronic device such as the type of electronic device which corresponds to a mobile medical device) of the external electronic device (for example, the electronic device 102 or 104). According to an embodiment, the applications 2170 may include an application received from the external electronic device (for example, the server 106, or the electronic device 102 or 104). According to an embodiment, the applications 2170 may include a preloaded application or a third party application that can be downloaded from the server. Names of the components of the program module 2110 according to the above described embodiments may vary depending on the type of operating system.
  • According to various embodiments, at least some of the program module 2110 may be implemented in software, firmware, hardware, or a combination of two or more thereof. At least some of the program module 2110 may be implemented (for example, executed) by, for example, a processor (for example, the AP 2010). At least some of the program module 2110 may include, for example, a module, a program, a routine, a set of instructions, or a process for performing one or more functions.
  • In accordance with an aspect of the present disclosure, a machine-readable storage medium recording a program for executing a method of processing a content by an electronic device is provided. The method includes: generating, as a first content, a content corresponding to a user input from among one or more contents displayed through a display of the electronic device; and displaying, on a screen, a floating UI which displays first scrap information on the first content.
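  • As one plausible Android-style realization of this flow (illustrative only; the overlay type, permission model, and view contents here are assumptions, not the disclosed implementation), the floating UI can be a small view added to the window manager so that it stays on top of the displayed contents:

    import android.content.Context;
    import android.graphics.PixelFormat;
    import android.view.Gravity;
    import android.view.View;
    import android.view.WindowManager;

    /** Hypothetical controller: overlays scrap info for captured content as a floating view. */
    public class FloatingScrapUi {

        private final WindowManager windowManager;

        public FloatingScrapUi(Context context) {
            windowManager = (WindowManager) context.getSystemService(Context.WINDOW_SERVICE);
        }

        /** Adds the given badge view on top of whatever content is currently displayed. */
        public void show(View scrapBadge) {
            WindowManager.LayoutParams params = new WindowManager.LayoutParams(
                    WindowManager.LayoutParams.WRAP_CONTENT,
                    WindowManager.LayoutParams.WRAP_CONTENT,
                    // TYPE_APPLICATION_OVERLAY exists on API 26+; earlier releases
                    // used other overlay window types.
                    WindowManager.LayoutParams.TYPE_APPLICATION_OVERLAY,
                    WindowManager.LayoutParams.FLAG_NOT_FOCUSABLE,
                    PixelFormat.TRANSLUCENT);
            params.gravity = Gravity.TOP | Gravity.END;
            // Requires the SYSTEM_ALERT_WINDOW ("draw over other apps") permission.
            windowManager.addView(scrapBadge, params);
        }
    }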
  • The term “module” used in the present disclosure may refer to, for example, a unit including one or more combinations of hardware, software, and firmware. The “module” may be interchangeable with a term such as a unit, a logic, a logical block, a component, or a circuit. The “module” may be the smallest unit of an integrated component or a part thereof. The “module” may be a minimum unit for performing one or more functions or a part thereof. The “module” may be mechanically or electronically implemented. For example, the “module” according to the present disclosure may include at least one of an application-specific integrated circuit (ASIC) chip, a field-programmable gate array (FPGA), and a programmable-logic device for performing operations which are known or are to be developed hereinafter.
  • According to various embodiments, at least some of the devices (for example, modules or functions thereof) or the method (for example, operations) according to the present disclosure may be implemented by a command stored in a computer-readable storage medium in a programming module form. When the command is executed by one or more processors (for example, the processor 120), the one or more processors may execute a function corresponding to the command. The computer-readable storage medium may be, for example, the memory 130.
  • The computer-readable recording medium may include a hard disk, a floppy disk, magnetic media (e.g., a magnetic tape), optical media (e.g., a compact disc read only memory (CD-ROM) and a digital versatile disc (DVD)), magneto-optical media (e.g., a floptical disk), a hardware device (e.g., a read only memory (ROM), a random access memory (RAM), or a flash memory), and the like. In addition, the program instructions may include high-level language code that can be executed in a computer using an interpreter, as well as machine code generated by a compiler. The aforementioned hardware device may be configured to operate as one or more software modules in order to perform the operations of the present disclosure, and vice versa.
  • The program module according to the present disclosure may include one or more of the aforementioned components, may further include other additional components, or may omit some of the aforementioned components. Operations executed by a module, a program module, or other components according to various embodiments of the present disclosure may be executed sequentially, in parallel, repeatedly, or in a heuristic manner. Further, some operations may be executed in a different order or may be omitted, or other operations may be added.
  • While the present disclosure has been shown and described with reference to various embodiments thereof, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the spirit and scope of the present disclosure as defined by the appended claims and their equivalents.

Claims (21)

What is claimed is:
1. A method for processing contents at an electronic device, the method comprising:
generating a first content corresponding to a user input applied to content displayed via the electronic device; and
displaying a floating user interface (UI), which displays first scrap information on the first content, on a screen of the electronic device.
2. The method of claim 1, further comprising:
generating a second content; and
displaying second scrap information on the first content and the second content through the floating UI.
3. The method of claim 2, wherein displaying the second scrap information comprises displaying a number of contents included in the first content and a number of contents included in the second content.
4. The method of claim 1, wherein displaying the floating UI comprises displaying an image corresponding to the first content through the floating UI.
5. The method of claim 2, further comprising:
if a user input on the floating UI is received, displaying the first content and the second content sequentially in an order of generation of the contents.
6. The method of claim 1, wherein the floating UI is displayed over the one or more contents.
7. The method of claim 1, further comprising:
determining whether the first content includes text; and
if the first content includes the text, displaying a menu for providing a text function.
8. The method of claim 1, wherein the text function comprises:
extracting the text from the first content; and
extracting first information related to at least one of a phone number, a mail, a schedule, a website address, and map data from the text.
9. The method of claim 8, further comprising:
displaying a menu for executing a first application related to the first information; and
if a user input for the menu is received, executing the first application.
10. The method of claim 2, further comprising:
displaying a first image corresponding to the first content and a second image corresponding to the second content through the floating UI;
receiving a user input for selecting the second image; and
if a user input for sharing content corresponding to the second image is received, transmitting the content corresponding to the second image to a corresponding application or device.
11. An electronic device for processing a content, the electronic device comprising:
a display; and
a processor configured to:
generate a first content corresponding to a user input applied to content displayed via the display, and
control the display to display a floating user interface (UI), which displays first scrap information on the first content, on a screen.
12. The electronic device of claim 11, wherein the processor is further configured to generate a second content and control the display to display second scrap information on the first content and the second content through the floating UI.
13. The electronic device of claim 12, wherein the processor is further configured to control the display to display a number of contents included in the first content and a number of contents included in the second content through the floating UI.
14. The electronic device of claim 11, wherein the processor is further configured to control the display to display an image corresponding to the first content through the floating UI.
15. The electronic device of claim 12, wherein the processor is further configured to control the display to display the first content and the second content sequentially in an order of generation of the contents when a user input on the floating UI is received.
16. The electronic device of claim 11, wherein the floating UI is displayed over the one or more contents.
17. The electronic device of claim 11, wherein the processor is further configured to:
determine whether the first content includes text, and control the display to display a menu for providing a text function.
18. The electronic device of claim 11, wherein the processor is further configured to extract the text from the first content and extract first information on a phone number, a mail, a schedule, a website address, and map data from the text.
19. The electronic device of claim 18, wherein the processor is further configured to control the display to display a menu for executing a first application related to the first information, and execute the first application if a user input for the menu is received.
20. The electronic device of claim 12, wherein the processor is further configured to:
control the display to:
display a first image corresponding to the first content and a second image corresponding to the second content through the floating UI,
receive a user input for selecting the second image, and
receive a user input for sharing content corresponding to the second image, and
transmit the content corresponding to the second image to a corresponding application or device.
21. A machine-readable storage medium recording a program for executing a method of processing a content at an electronic device, the method comprising:
generating a first content corresponding to a user input applied to content displayed via the electronic device; and
displaying a floating UI, which displays scrap information on the first content, on a screen of the electronic device.
US14/843,387 2014-09-02 2015-09-02 Method of processing content and electronic device thereof Abandoned US20160062557A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US17/097,587 US11847292B2 (en) 2014-09-02 2020-11-13 Method of processing content and electronic device thereof
US18/540,251 US20240118781A1 (en) 2014-09-02 2023-12-14 Method of processing content and electronic device thereof

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020140116514A KR102411890B1 (en) 2014-09-02 2014-09-02 A mehtod for processing contents and an electronic device therefor
KR10-2014-0116514 2014-09-02

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US17/097,587 Continuation US11847292B2 (en) 2014-09-02 2020-11-13 Method of processing content and electronic device thereof

Publications (1)

Publication Number Publication Date
US20160062557A1 true US20160062557A1 (en) 2016-03-03

Family

ID=54105619

Family Applications (3)

Application Number Title Priority Date Filing Date
US14/843,387 Abandoned US20160062557A1 (en) 2014-09-02 2015-09-02 Method of processing content and electronic device thereof
US17/097,587 Active US11847292B2 (en) 2014-09-02 2020-11-13 Method of processing content and electronic device thereof
US18/540,251 Pending US20240118781A1 (en) 2014-09-02 2023-12-14 Method of processing content and electronic device thereof

Family Applications After (2)

Application Number Title Priority Date Filing Date
US17/097,587 Active US11847292B2 (en) 2014-09-02 2020-11-13 Method of processing content and electronic device thereof
US18/540,251 Pending US20240118781A1 (en) 2014-09-02 2023-12-14 Method of processing content and electronic device thereof

Country Status (6)

Country Link
US (3) US20160062557A1 (en)
EP (1) EP2993567B1 (en)
KR (3) KR102411890B1 (en)
CN (3) CN115097981B (en)
AU (1) AU2015312629B2 (en)
WO (1) WO2016036132A1 (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102097125B1 (en) * 2018-02-02 2020-04-06 네이버 주식회사 Method for clipping web page and terminal device using the same
KR20200106703A (en) * 2019-03-05 2020-09-15 삼성전자주식회사 Apparatus and method for providing information based on user selection
CN112162812B (en) * 2020-09-27 2024-05-31 维沃移动通信有限公司 Object adding method and device
WO2022086349A1 (en) * 2020-10-20 2022-04-28 Rifai Burneo Karim Portable system for the deployment of an intraconnected digital network in rural areas

Family Cites Families (82)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH1185706A (en) * 1997-09-08 1999-03-30 Sharp Corp Method and device for generating application data and recording medium recording application data generation program
US6683631B2 (en) * 1998-12-31 2004-01-27 International Business Machines Corporation System and method for selecting and deselecting information in an electronic document
JP4556301B2 (en) * 2000-07-25 2010-10-06 ソニー株式会社 Video signal processing device and video display device
US6823084B2 (en) * 2000-09-22 2004-11-23 Sri International Method and apparatus for portably recognizing text in an image sequence of scene imagery
US7884804B2 (en) * 2003-04-30 2011-02-08 Microsoft Corporation Keyboard with input-sensitive display device
KR100584344B1 (en) * 2003-06-10 2006-05-26 삼성전자주식회사 Method for recognizing a character in potable terminal having a image imput part
JP2005033346A (en) * 2003-07-09 2005-02-03 Hitachi Ltd Apparatus and method for processing information, and software
US7966352B2 (en) * 2004-01-26 2011-06-21 Microsoft Corporation Context harvesting from selected content
US8677274B2 (en) * 2004-11-10 2014-03-18 Apple Inc. Highlighting items for search results
US7559033B2 (en) * 2005-07-21 2009-07-07 International Business Machines Corporation Method and system for improving selection capability for user interface
US20070044013A1 (en) 2005-08-18 2007-02-22 Sony Ericsson Mobile Communications Ab Methods, devices and computer program products for saving content of a mobile terminal display
US20080086700A1 (en) * 2006-10-06 2008-04-10 Rodriguez Robert A Systems and Methods for Isolating On-Screen Textual Data
US7954047B2 (en) * 2007-08-06 2011-05-31 Apple Inc. Cutting and copying discontiguous selections of cells
US9137377B2 (en) * 2007-08-22 2015-09-15 Citrix Systems, Inc. Systems and methods for at least partially releasing an appliance from a private branch exchange
US20090055379A1 (en) * 2007-08-22 2009-02-26 Richard Murtagh Systems and Methods for Locating Contact Information
US20090055842A1 (en) * 2007-08-22 2009-02-26 Richard Murtagh Systems and Methods for Establishing a Communication Session
CA2598400A1 (en) * 2007-08-22 2009-02-22 Hazem Y. Abdelazim System and method for onscreen text recognition for mobile devices
KR100929236B1 (en) * 2007-09-18 2009-12-01 엘지전자 주식회사 Portable terminal with touch screen and operation control method thereof
EP2051173A3 (en) * 2007-09-27 2009-08-12 Magix Ag System and method for dynamic content insertion from the internet into a multimedia work
US20090125848A1 (en) * 2007-11-14 2009-05-14 Susann Marie Keohane Touch surface-sensitive edit system
US20090164493A1 (en) * 2007-12-24 2009-06-25 Johnsgard Todd J Apparatus and methods for editing content on a wireless device
US8650507B2 (en) * 2008-03-04 2014-02-11 Apple Inc. Selecting of text using gestures
KR20090102886A (en) 2008-03-27 2009-10-01 엘지전자 주식회사 Display apparatus and method for summarizing of electronic document
KR101504682B1 (en) * 2008-09-10 2015-03-20 엘지전자 주식회사 Controlling a Mobile Terminal with at least two display area
US20100122160A1 (en) * 2008-11-10 2010-05-13 International Business Machines Corporation Non-contiguous content duplication
JP2010160564A (en) * 2009-01-06 2010-07-22 Toshiba Corp Portable terminal
US9569001B2 (en) * 2009-02-03 2017-02-14 Massachusetts Institute Of Technology Wearable gestural interface
EP2219354A3 (en) * 2009-02-13 2014-05-14 Samsung Electronics Co., Ltd. Operation method and system of mobile terminal connected to electronic device
CN101576878A (en) * 2009-06-17 2009-11-11 董名垂 System and method for instantly translating web pages hooked by users
US8515185B2 (en) * 2009-11-25 2013-08-20 Google Inc. On-screen guideline-based selective text recognition
JP4851604B2 (en) * 2010-01-27 2012-01-11 京セラ株式会社 Portable electronic device and method for controlling portable electronic device
EP2383970B1 (en) * 2010-04-30 2013-07-10 beyo GmbH Camera based method for text input and keyword detection
JP5676942B2 (en) * 2010-07-06 2015-02-25 キヤノン株式会社 Image processing apparatus, image processing method, and program
KR101784877B1 (en) * 2010-07-12 2017-11-07 삼성전자주식회사 Method and apparatus for managing menu item in a portable terminal
JP5573457B2 (en) * 2010-07-23 2014-08-20 ソニー株式会社 Information processing apparatus, information processing method, and information processing program
KR101271143B1 (en) 2010-08-06 2013-06-04 주식회사 금성이앤씨 Power generator attached to the floodgate
KR101749282B1 (en) * 2010-09-17 2017-06-21 에스프린팅솔루션 주식회사 Management apparatus and method for controlling screen capture of management apparatus
US20120124524A1 (en) * 2010-11-15 2012-05-17 Microsoft Corporation Quick Data Share
KR20120076014A (en) 2010-12-29 2012-07-09 엘지전자 주식회사 Method for operating a communication terminal
KR20120102262A (en) * 2011-03-08 2012-09-18 삼성전자주식회사 The method for selecting a desired contents from text in portable terminal and device thererof
CN102855079B (en) * 2011-05-24 2016-06-08 Lg电子株式会社 Mobile terminal
KR101589650B1 (en) * 2011-07-05 2016-01-29 네이버 주식회사 Document link system for aligning copied document onto clipboard based on cloud and displaying document and services linkable with document and method thereof
US8745084B2 (en) * 2011-07-20 2014-06-03 Docscorp Australia Repository content analysis and management
KR101851239B1 (en) 2011-11-08 2018-04-23 삼성전자 주식회사 Device and method for processing an image expression of wireless terminal
US20130177295A1 (en) * 2012-01-09 2013-07-11 Microsoft Corporation Enabling copy and paste functionality for videos and other media content
US8704948B2 (en) * 2012-01-18 2014-04-22 Eldon Technology Limited Apparatus, systems and methods for presenting text identified in a video image
CN103294657B (en) * 2012-03-02 2017-10-27 富泰华工业(深圳)有限公司 Method for editing text and system
WO2013131171A1 (en) * 2012-03-07 2013-09-12 Quillsoft Ltd. Constrained digital text reader
US9570050B2 (en) * 2012-03-13 2017-02-14 Assima Switzerland Sa System and method for enhanced screen copy
CN103379212B (en) * 2012-04-24 2015-06-24 宇龙计算机通信科技(深圳)有限公司 Method for rapidly replying message and mobile terminal
CN103389874B (en) * 2012-05-07 2016-09-14 Lg电子株式会社 Mobile terminal and control method thereof
CN102768614B (en) * 2012-06-14 2015-07-15 广东步步高电子工业有限公司 Text processing method applied to touch screen mobile handheld device
KR102068604B1 (en) * 2012-08-28 2020-01-22 삼성전자 주식회사 Apparatus and method for recognizing a character in terminal equipment
KR102147203B1 (en) * 2012-09-10 2020-08-25 엘지전자 주식회사 Mobile terminal and control method therof
US20140071171A1 (en) * 2012-09-12 2014-03-13 Alcatel-Lucent Usa Inc. Pinch-and-zoom, zoom-and-pinch gesture control
US9164658B2 (en) * 2012-10-12 2015-10-20 Cellco Partnership Flexible selection tool for mobile devices
US9098127B2 (en) * 2012-10-17 2015-08-04 Blackberry Limited Electronic device including touch-sensitive display and method of controlling same
US9607289B2 (en) * 2012-11-14 2017-03-28 Facebook, Inc. Content type filter
KR20140062886A (en) 2012-11-15 2014-05-26 엘지전자 주식회사 Mobile terminal and control method thereof
US20140143681A1 (en) * 2012-11-16 2014-05-22 Rohit Chandra Method and apparatus for sharing user-selected content
KR102014778B1 (en) * 2012-12-14 2019-08-27 엘지전자 주식회사 Digital device for providing text messaging service and the method for controlling the same
KR101992194B1 (en) * 2012-12-31 2019-06-25 엘지전자 주식회사 Mobile terminal and controlling method thereof
US20140188802A1 (en) * 2012-12-31 2014-07-03 Appsense Limited Pull and paste
CN103136181B (en) * 2013-03-06 2017-04-12 惠州Tcl移动通信有限公司 Electronic reading device and text selecting and storing method thereof
US20160266769A1 (en) * 2013-03-14 2016-09-15 Motorola Mobility Llc Text display and selection system
US20150372286A1 (en) * 2013-03-15 2015-12-24 Applied Materials, Inc. Apparatus for material spray deposition of high solid percentage slurries for battery active material manufacture applications
CN104077273A (en) * 2013-03-27 2014-10-01 腾讯科技(深圳)有限公司 Method and device for extracting webpage contents
KR102088911B1 (en) 2013-04-18 2020-03-13 엘지전자 주식회사 Mobile terminal and control method thereof
JP2015014960A (en) * 2013-07-05 2015-01-22 ソニー株式会社 Information processor and storage medium
CN103324440B (en) * 2013-07-05 2016-06-08 广东欧珀移动通信有限公司 A kind of method utilizing multi-point touch to select word content
US9237410B1 (en) * 2013-08-15 2016-01-12 Mediaspark Incorporated Sharing mobile device content in a many-to-one manner
KR102065417B1 (en) * 2013-09-23 2020-02-11 엘지전자 주식회사 Wearable mobile terminal and method for controlling the same
GB2521338A (en) * 2013-09-26 2015-06-24 Ibm Text selection
JP2015069365A (en) * 2013-09-27 2015-04-13 シャープ株式会社 Information processing equipment and control program
US9329692B2 (en) * 2013-09-27 2016-05-03 Microsoft Technology Licensing, Llc Actionable content displayed on a touch screen
JP6419421B2 (en) * 2013-10-31 2018-11-07 株式会社東芝 Image display device, image display method, and program
US10915698B2 (en) * 2013-12-31 2021-02-09 Barnes & Noble College Booksellers, Llc Multi-purpose tool for interacting with paginated digital content
US10275050B2 (en) * 2014-05-23 2019-04-30 Microsoft Technology Licensing, Llc Ink for a shared interactive space
US9965559B2 (en) * 2014-08-21 2018-05-08 Google Llc Providing automatic actions for mobile onscreen content
KR101631999B1 (en) * 2014-08-29 2016-06-21 엘지전자 주식회사 Mobile terminal and method for controlling the same
US20160171106A1 (en) * 2014-12-11 2016-06-16 Microsoft Corporation Webpage content storage and review
US10409465B2 (en) * 2015-12-08 2019-09-10 International Business Machines Corporation Selecting areas of content on a touch screen

Patent Citations (60)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6178270B1 (en) * 1997-05-28 2001-01-23 Xerox Corporation Method and apparatus for selecting text and image data from video images
US6760728B1 (en) * 2000-09-27 2004-07-06 Palmsource, Inc. Method and apparatus for importing and exporting directory and calendar information to and from personal information management applications
US7568080B2 (en) * 2002-10-07 2009-07-28 Commvault Systems, Inc. Snapshot storage and management system with indexing and user interface
US20060284981A1 (en) * 2005-06-20 2006-12-21 Ricoh Company, Ltd. Information capture and recording system
US7600198B2 (en) * 2005-11-23 2009-10-06 Bluebeam Software, Inc. Method of tracking data objects using related thumbnails in a palette window
US20080002888A1 (en) * 2006-06-29 2008-01-03 Nokia Corporation Apparatus, method, device and computer program product providing enhanced text copy capability with touch input display
US20080082932A1 (en) * 2006-09-29 2008-04-03 Beumer Bradley R Computer-Implemented Clipboard
US20110035145A1 (en) * 2008-04-17 2011-02-10 Sanyo Electric Co., Ltd. Navigation device
US20120010995A1 (en) * 2008-10-23 2012-01-12 Savnor Technologies Web content capturing, packaging, distribution
US20100195131A1 (en) * 2009-02-03 2010-08-05 Canon Kabushiki Kaisha Information processing apparatus, and print control method
US20140141836A1 (en) * 2009-07-18 2014-05-22 Abbyy Software Ltd. Entering Information Through an OCR-Enabled Viewfinder
US20110081083A1 (en) * 2009-10-07 2011-04-07 Google Inc. Gesture-based selective text recognition
US20110107206A1 (en) * 2009-11-03 2011-05-05 Oto Technologies, Llc E-reader semantic text manipulation
US20110231796A1 (en) * 2010-02-16 2011-09-22 Jose Manuel Vigil Methods for navigating a touch screen device in conjunction with gestures
US20170220680A1 (en) * 2010-02-22 2017-08-03 Google Inc. Search query reformulation using a user-selected term
US20120042272A1 (en) * 2010-08-12 2012-02-16 Hong Jiyoung Mobile terminal and method of controlling the same
US20120110486A1 (en) * 2010-10-01 2012-05-03 Imerj LLC Universal clipboard
US20120131497A1 (en) * 2010-11-18 2012-05-24 Google Inc. Orthogonal Dragging on Scroll Bars
US20120144293A1 (en) * 2010-12-06 2012-06-07 Samsung Electronics Co., Ltd. Display apparatus and method of providing user interface thereof
US20120159334A1 (en) * 2010-12-21 2012-06-21 Microsoft Corporation Extensible system action for sharing while remaining in context
US20130013987A1 (en) * 2011-07-04 2013-01-10 Nhn Corporation System and method for linking web documents
US20130011055A1 (en) * 2011-07-08 2013-01-10 Qualcomm Incorporated Parallel Processing Method and Apparatus for Determining Text Information from an Image
US20130332878A1 (en) * 2011-08-08 2013-12-12 Samsung Electronics Co., Ltd. Apparatus and method for performing capture in portable terminal
US20130046935A1 (en) * 2011-08-18 2013-02-21 Microsoft Corporation Shared copy cache across networked devices
US8723824B2 (en) * 2011-09-27 2014-05-13 Apple Inc. Electronic devices with sidewall displays
US20140304599A1 (en) * 2011-10-06 2014-10-09 Sony Ericsson Mobile Communications Ab Method and Electronic Device for Manipulating a First or a Second User Interface Object
US20130104032A1 (en) * 2011-10-19 2013-04-25 Jiyoun Lee Mobile terminal and method of controlling the same
US20130104037A1 (en) * 2011-10-20 2013-04-25 Microsoft Corporation Merging and Fragmenting Graphical Objects
US9092428B1 (en) * 2011-12-09 2015-07-28 Guangsheng Zhang System, methods and user interface for discovering and presenting information in text content
US9201868B1 (en) * 2011-12-09 2015-12-01 Guangsheng Zhang System, methods and user interface for identifying and presenting sentiment information
US20130187873A1 (en) * 2012-01-25 2013-07-25 Samsung Electronics Co. Ltd. Scroll control apparatus and method of terminal
US20130254714A1 (en) * 2012-03-23 2013-09-26 Samsung Electronics Co., Ltd. Method and apparatus for providing floating user interface
US20160216862A1 (en) * 2012-04-25 2016-07-28 Amazon Technologies, Inc. Using gestures to deliver content to predefined destinations
US20130304815A1 (en) * 2012-05-10 2013-11-14 Intel Mobile Communications GmbH Method for transferring data between a first device and a second device
US20130346914A1 (en) * 2012-06-20 2013-12-26 Samsung Electronics Co., Ltd. Information display apparatus and method of user device
US20130346872A1 (en) * 2012-06-25 2013-12-26 Microsoft Corporation Input method editor application platform
US20140053061A1 (en) * 2012-08-16 2014-02-20 Realnetworks, Inc. System for clipping webpages
US20150207850A1 (en) * 2012-09-11 2015-07-23 Google Inc. Clipboard
US9165406B1 (en) * 2012-09-21 2015-10-20 A9.Com, Inc. Providing overlays based on text in a live camera view
US20150254518A1 (en) * 2012-10-26 2015-09-10 Blackberry Limited Text recognition through images and video
US20140123045A1 (en) * 2012-10-31 2014-05-01 Motorola Mobility Llc Mixed Type Text Extraction and Distribution
US9170714B2 (en) * 2012-10-31 2015-10-27 Google Technology Holdings LLC Mixed type text extraction and distribution
US20140132535A1 (en) * 2012-11-12 2014-05-15 Lg Electronics Inc. Mobile terminal and control method thereof
US20140157169A1 (en) * 2012-12-05 2014-06-05 Microsoft Corporation Clip board system with visual affordance
US20140168095A1 (en) * 2012-12-14 2014-06-19 Barnesandnoble.Com Llc Touch sensitive device with pinch-based archive and restore functionality
US20140223372A1 (en) * 2013-02-04 2014-08-07 602531 British Columbia Ltd. Method, system, and apparatus for executing an action related to user selection
US20140258838A1 (en) * 2013-03-11 2014-09-11 Sap Ag Expense input utilities, systems, and methods
US20140282106A1 (en) * 2013-03-13 2014-09-18 Cambridgesoft Corporation Systems and methods for gesture-based sharing of data between separate electronic devices
US20140280132A1 (en) * 2013-03-15 2014-09-18 Desire2Learn Incorporated Method and system for network enabled digital clipboard
US20140324943A1 (en) * 2013-04-30 2014-10-30 Adobe Systems Incorporated Drag-and-drop clipboard for html documents
US20140372952A1 (en) * 2013-06-14 2014-12-18 Microsoft Corporation Simplified Data Input in Electronic Documents
US20160154686A1 (en) * 2013-08-01 2016-06-02 Tencent Technology (Shenzhen) Company Limited Method and apparatus for presenting clipboard contents on a mobile terminal
US20150058369A1 (en) * 2013-08-23 2015-02-26 Samsung Electronics Co., Ltd. Electronic device and method for using captured image in electronic device
US20160328110A1 (en) * 2013-12-30 2016-11-10 Beijing Qihoo Technology Company Limited Method, system, equipment and device for identifying image based on image
US9429447B2 (en) * 2014-06-10 2016-08-30 Samsung Electronics Co., Ltd. Method of utilizing image based on location information of the image in electronic device and the electronic device thereof
US20170235445A1 (en) * 2014-07-14 2017-08-17 Samsung Electronics Co., Ltd. Electronic device for playing-playing contents and method thereof
US20160036966A1 (en) * 2014-08-01 2016-02-04 Lg Electronics Inc. Mobile terminal controlled by at least one touch and method of controlling therefor
US20160140095A1 (en) * 2014-11-18 2016-05-19 Samsung Electronics Co., Ltd. Device and method for managing reference information for provided content
US20160210013A1 (en) * 2015-01-21 2016-07-21 Lg Electronics Inc. Mobile terminal and method for controlling the same
US9645732B2 (en) * 2015-03-08 2017-05-09 Apple Inc. Devices, methods, and graphical user interfaces for displaying and using menus

Cited By (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
USD940756S1 (en) 2014-09-01 2022-01-11 Apple Inc. Display screen or portion thereof with animated graphical user interface
USD772297S1 (en) * 2014-09-01 2016-11-22 Apple Inc. Display screen or portion thereof with graphical user interface
US10528207B2 (en) * 2015-01-12 2020-01-07 Facebook, Inc. Content-based interactive elements on online social networks
US20170269800A1 (en) * 2015-01-13 2017-09-21 Lg Electronics Inc. Mobile terminal and control method thereof
US10423306B2 (en) * 2015-01-13 2019-09-24 Lg Electronics Inc. Mobile terminal and control method thereof
USD822042S1 (en) * 2016-02-08 2018-07-03 Bizagi Group Ltd. Display screen with graphical user interface
USD823326S1 (en) * 2016-02-08 2018-07-17 Bizagi Group Ltd. Display screen with graphical user interface
US20170315684A1 (en) * 2016-04-28 2017-11-02 Chiun Mai Communication Systems, Inc. Method for managing multiple types of data
US10809880B2 (en) * 2016-04-28 2020-10-20 Chiun Mai Communication Systems, Inc. Method for managing multiple types of data
US20180203825A1 (en) * 2017-01-16 2018-07-19 Seiko Epson Corporation Electronic apparatus, electronic system, method of controlling electronic apparatus, and computer-readable recording medium
US20180332216A1 (en) * 2017-05-12 2018-11-15 Microsoft Technology Licensing, Llc Synchronized display on hinged multi-screen device
US10440263B2 (en) * 2017-05-12 2019-10-08 Microsoft Technology Licensing, Llc Synchronized display on hinged multi-screen device
US11269483B2 (en) * 2017-05-16 2022-03-08 Apple Inc. Device, method, and graphical user interface for managing content items and associated metadata
USD861704S1 (en) 2017-09-11 2019-10-01 Apple Inc. Electronic device with graphical user interface
USD900833S1 (en) 2017-09-11 2020-11-03 Apple Inc. Electronic device with animated graphical user interface
USD956088S1 (en) 2017-09-11 2022-06-28 Apple Inc. Electronic device with animated graphical user interface
USD900145S1 (en) * 2018-01-05 2020-10-27 Google Llc Display screen or portion thereof with transitional graphical user interface
US11489944B2 (en) * 2020-04-21 2022-11-01 Acer Incorporated Electronic apparatus and data transmission method thereof based off marquees
EP4318299A4 (en) * 2021-04-29 2024-08-07 Vivo Mobile Communication Co Ltd Document generation method and apparatus and electronic device

Also Published As

Publication number Publication date
CN115097982B (en) 2023-12-26
CN106662969A (en) 2017-05-10
KR20220091446A (en) 2022-06-30
AU2015312629B2 (en) 2020-11-05
WO2016036132A1 (en) 2016-03-10
CN106662969B (en) 2022-06-03
KR102497195B1 (en) 2023-02-08
KR102411890B1 (en) 2022-06-23
AU2015312629A1 (en) 2017-01-19
KR20160027865A (en) 2016-03-10
EP2993567B1 (en) 2019-05-15
US20210064193A1 (en) 2021-03-04
KR20230025675A (en) 2023-02-22
US20240118781A1 (en) 2024-04-11
EP2993567A1 (en) 2016-03-09
CN115097981A (en) 2022-09-23
US11847292B2 (en) 2023-12-19
CN115097981B (en) 2024-02-23
CN115097982A (en) 2022-09-23

Similar Documents

Publication Publication Date Title
US11847292B2 (en) Method of processing content and electronic device thereof
KR102309175B1 (en) Scrapped Information Providing Method and Apparatus
US10509616B2 (en) Method for operating electronic device, and electronic device
US10416843B2 (en) Electronic device and method of controlling object in electronic device
EP3245583B1 (en) Electronic device and method of processing information in electronic device
US20190187758A1 (en) Flexible device and operating method therefor
US20160191694A1 (en) Method and apparatus for providing communication-related information
EP3107087B1 (en) Device for controlling multiple areas of display independently and method thereof
US20160313781A1 (en) Method for displaying user interface and electronic device thereof
US10719209B2 (en) Method for outputting screen and electronic device supporting the same
US10466856B2 (en) Electronic device having two displays and a method for executing a different application on each display of the electronic device based on simultaneous inputs into a plurality of application icons
US20170017359A1 (en) Electronic device for displaying image and control method thereof
US20190026017A1 (en) Method for displaying user interface and electronic device for supporting same
US20180129409A1 (en) Method for controlling execution of application on electronic device using touchscreen and electronic device for the same
US10613724B2 (en) Control method for selecting and pasting content
US20180173701A1 (en) Method for contents tagging and electronic device supporting the same
US20160100100A1 (en) Method for Configuring Screen, Electronic Device and Storage Medium
US10009421B2 (en) Contents control in electronic device
EP3070623A1 (en) Electronic device and method of processing information in electronic device
US11061544B2 (en) Method and electronic device for processing input
US20180074697A1 (en) Method for outputting screen according to force input and electronic device supporting the same

Legal Events

Date Code Title Description
AS Assignment

Owner name: SAMSUNG ELECTRONICS CO., LTD., KOREA, REPUBLIC OF

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:KIM, KYUNG-HWAN;KIM, DONG-JEON;JEONG, JIN-HONG;AND OTHERS;SIGNING DATES FROM 20150824 TO 20150826;REEL/FRAME:036480/0224

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION