
WO2022257889A1 - Display method and related apparatus - Google Patents

Display method and related apparatus

Info

Publication number
WO2022257889A1
WO2022257889A1 (PCT/CN2022/097177)
Authority
WO
WIPO (PCT)
Prior art keywords
screen
electronic device
preset
camera
user interface
Prior art date
Application number
PCT/CN2022/097177
Other languages
English (en)
French (fr)
Inventor
张羽翕
吴忠标
Original Assignee
Huawei Technologies Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co., Ltd.
Priority to US18/041,547 (published as US20240012451A1)
Priority to EP22819487.4A (published as EP4181494A4)
Priority to KR1020237005512A (published as KR20230038290A)
Publication of WO2022257889A1

Links

Images

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16Constructional details or arrangements
    • G06F1/1613Constructional details or arrangements for portable computers
    • G06F1/1633Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
    • G06F1/1637Details related to the display arrangement, including those related to the mounting of the display in the housing
    • G06F1/1641Details related to the display arrangement, including those related to the mounting of the display in the housing the display being formed by a plurality of foldable display components
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers
    • H04M1/72Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72448User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16Constructional details or arrangements
    • G06F1/1613Constructional details or arrangements for portable computers
    • G06F1/1615Constructional details or arrangements for portable computers with several enclosures having relative motions, each enclosure supporting at least one I/O or computing function
    • G06F1/1616Constructional details or arrangements for portable computers with several enclosures having relative motions, each enclosure supporting at least one I/O or computing function with folding flat displays, e.g. laptop computers or notebooks having a clamshell configuration, with body parts pivoting to an open position around an axis parallel to the plane they define in closed position
    • G06F1/1618Constructional details or arrangements for portable computers with several enclosures having relative motions, each enclosure supporting at least one I/O or computing function with folding flat displays, e.g. laptop computers or notebooks having a clamshell configuration, with body parts pivoting to an open position around an axis parallel to the plane they define in closed position the display being foldable up to the back of the other housing with a single degree of freedom, e.g. by 360° rotation over the axis defined by the rear edge of the base enclosure
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16Constructional details or arrangements
    • G06F1/1613Constructional details or arrangements for portable computers
    • G06F1/1633Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
    • G06F1/1637Details related to the display arrangement, including those related to the mounting of the display in the housing
    • G06F1/1652Details related to the display arrangement, including those related to the mounting of the display in the housing the display being flexible, e.g. mimicking a sheet of paper, or rollable
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16Constructional details or arrangements
    • G06F1/1613Constructional details or arrangements for portable computers
    • G06F1/1633Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
    • G06F1/1675Miscellaneous details related to the relative movement between the different enclosures or enclosure parts
    • G06F1/1677Miscellaneous details related to the relative movement between the different enclosures or enclosure parts for detecting open or closed state or particular intermediate positions assumed by movable parts of the enclosure, e.g. detection of display lid position with respect to main body in a laptop, detection of opening of the cover of battery compartment
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16Constructional details or arrangements
    • G06F1/1613Constructional details or arrangements for portable computers
    • G06F1/1633Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
    • G06F1/1684Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675
    • G06F1/1686Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675 the I/O peripheral being an integrated camera
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16Constructional details or arrangements
    • G06F1/1613Constructional details or arrangements for portable computers
    • G06F1/1633Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
    • G06F1/1684Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675
    • G06F1/1694Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675 the I/O peripheral being a single or a set of motion sensors for pointer control or gesture input obtained by sensing movements of the portable computer
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04842Selection of displayed objects or displayed text elements
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04845Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers
    • H04M1/02Constructional features of telephone sets
    • H04M1/0202Portable telephone sets, e.g. cordless phones, mobile phones or bar type handsets
    • H04M1/0206Portable telephones comprising a plurality of mechanically joined movable body parts, e.g. hinged housings
    • H04M1/0208Portable telephones comprising a plurality of mechanically joined movable body parts, e.g. hinged housings characterized by the relative motions of the body parts
    • H04M1/0214Foldable telephones, i.e. with body parts pivoting to an open position around an axis parallel to the plane they define in closed position
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers
    • H04M1/02Constructional features of telephone sets
    • H04M1/0202Portable telephone sets, e.g. cordless phones, mobile phones or bar type handsets
    • H04M1/0206Portable telephones comprising a plurality of mechanically joined movable body parts, e.g. hinged housings
    • H04M1/0241Portable telephones comprising a plurality of mechanically joined movable body parts, e.g. hinged housings using relative motion of the body parts to change the operational status of the telephone set, e.g. switching on/off, answering incoming call
    • H04M1/0243Portable telephones comprising a plurality of mechanically joined movable body parts, e.g. hinged housings using relative motion of the body parts to change the operational status of the telephone set, e.g. switching on/off, answering incoming call using the relative angle between housings
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers
    • H04M1/02Constructional features of telephone sets
    • H04M1/0202Portable telephone sets, e.g. cordless phones, mobile phones or bar type handsets
    • H04M1/026Details of the structure or mounting of specific components
    • H04M1/0264Details of the structure or mounting of specific components for a camera module assembly
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers
    • H04M1/02Constructional features of telephone sets
    • H04M1/0202Portable telephone sets, e.g. cordless phones, mobile phones or bar type handsets
    • H04M1/026Details of the structure or mounting of specific components
    • H04M1/0266Details of the structure or mounting of specific components for a display module assembly
    • H04M1/0268Details of the structure or mounting of specific components for a display module assembly including a flexible display panel
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/048Indexing scheme relating to G06F3/048
    • G06F2203/04803Split screen, i.e. subdividing the display area or the window area into separate subareas
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/048Indexing scheme relating to G06F3/048
    • G06F2203/04806Zoom, i.e. interaction techniques or interactors for controlling the zooming operation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/048Indexing scheme relating to G06F3/048
    • G06F2203/04808Several contacts: gestures triggering a specific function, e.g. scrolling, zooming, right-click, when the user establishes several contacts with the surface simultaneously; e.g. using several fingers or a combination of fingers and pen
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M2250/00Details of telephonic subscriber devices
    • H04M2250/52Details of telephonic subscriber devices including functional features of a camera

Definitions

  • The present application relates to the field of electronic technology, and in particular to a display method and related apparatus.
  • A larger folding screen is arranged on the front of the mobile phone.
  • When the folding screen is in the folded state, it can be folded into at least two screens; the electronic device can display content on one of these screens, and the user can control that screen's displayed content through touch operations or buttons.
  • When the folding screen is in the unfolded state, the electronic device can display across the entire folding screen, and the user can control the displayed content of the whole folding screen through touch operations or buttons.
  • The present application provides a display method that automatically starts the corresponding camera and display screen for real-time preview when a foldable electronic device is in a specific posture, avoiding cumbersome user operations and effectively improving the user experience.
  • The present application provides a display method applied to an electronic device.
  • The electronic device includes a first screen, a folding screen, a first camera, and a second camera.
  • The folding screen can be folded along the folding edge to form a second screen and a third screen.
  • The first screen and the second screen face opposite directions; the orientation of the first screen is consistent with the shooting direction of the first camera, and the orientation of the second screen is consistent with the shooting direction of the second camera.
  • The method includes:
  • Based on a detected first included angle between the second screen and the third screen, the electronic device determines that it is in a first preset posture; based on the first preset posture, the electronic device displays a first user interface on the first screen. The first preview display area of the first user interface is used to display the image captured by the first camera.
  • In the first preset posture, the first included angle is neither 0° nor 180°. Based on the detected first included angle, the electronic device determines that it is in a second preset posture; based on the second preset posture, the electronic device displays a second user interface on the second screen. The second preview display area of the second user interface is used to display the image captured by the second camera; in the second preset posture, the first included angle is likewise neither 0° nor 180°.
  • When the electronic device detects that it is in the first preset posture, it may start the first camera corresponding to that posture to capture images, and display the captured images on the first screen corresponding to that posture.
  • When the electronic device detects that it is in the second preset posture, it may start the second camera corresponding to that posture to capture images, and display the captured images on the second screen corresponding to that posture. It should be noted that when the electronic device is in the first preset posture it is convenient for the user to view the display content on the first screen, and when it is in the second preset posture it is convenient for the user to view the display content on the second screen.
  • In this way, the user can start the camera corresponding to the preset posture to take pictures while keeping both hands free, and preview in real time through the display screen corresponding to that posture, avoiding cumbersome user operations and effectively improving the user experience.
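The posture-to-camera mapping described above can be sketched as follows. This is a minimal illustration; the posture labels, camera names, and screen names are placeholders, not identifiers from the patent.

```python
# Hypothetical sketch of the posture-driven preview dispatch described above.
# All names are illustrative assumptions; the patent does not specify an API.

def select_preview(posture: str) -> tuple[str, str]:
    """Return (camera_to_start, screen_to_display_on) for a preset posture."""
    dispatch = {
        "first_preset":  ("first_camera",  "first_screen"),
        "second_preset": ("second_camera", "second_screen"),
    }
    if posture not in dispatch:
        raise ValueError(f"no preview mapping for posture {posture!r}")
    return dispatch[posture]
```

The point of the mapping is that detecting a posture is sufficient to pick both the camera and the screen, with no further user input.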
  • The method further includes: based on the detected first included angle, determining that the electronic device is in a third preset posture; based on the third preset posture, the electronic device displays content on the second screen and the third screen in split-screen mode. The display content of the second screen includes a third preview display area, which is used to display the images captured by the second camera.
  • In the third preset posture, the first included angle is neither 0° nor 180°.
  • When the electronic device detects that it is in the third preset posture, it may start the second camera corresponding to that posture to capture images, display the captured images on the second screen corresponding to that posture, and display interface elements associated with the images captured by the second camera on the third screen. It should be noted that the third preset posture makes it convenient for the user to view the display content of both the second screen and the third screen.
  • In this way, the user can start the second camera corresponding to the preset posture to take pictures while keeping both hands free, preview in real time through the second screen corresponding to that posture, and view the interface elements associated with the images captured by the second camera through the third screen, avoiding cumbersome user operations and effectively improving the user experience.
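The third-posture arrangement above could be represented as a small layout record, sketched here with assumed names:

```python
# Illustrative layout for the third preset posture: the second camera's
# preview goes to the second screen, and interface elements associated with
# that preview go to the third screen. Names are placeholder assumptions.

def third_posture_layout() -> dict:
    return {
        "camera": "second_camera",
        "preview_screen": "second_screen",
        "associated_elements_screen": "third_screen",
    }
```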
  • Displaying the first user interface on the first screen includes: when it is detected that the electronic device is in the first preset posture and the electronic device meets a first preset condition, the electronic device starts the first camera to capture images and displays the first user interface on the first screen. The first preset posture includes the first included angle being within a first preset range, and the first preset condition includes one or more of the following: the dwell time of the first included angle at its current value reaches a first preset time; while the first screen is lit, the electronic device receives no input operation on the first screen within a second preset time; a human face, or the face of a preset user, is detected in the image captured by the camera corresponding to the first screen; a first preset gesture is detected in the image captured by the camera corresponding to the first screen.
  • After the user places the electronic device in the first preset posture and makes it meet the first preset condition, the user can start the first camera corresponding to the first preset posture to shoot while keeping both hands free, and preview in real time on the first screen corresponding to that posture, which avoids tedious user operations and effectively improves the user experience.
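The "one or more of the following" trigger conditions above have any-of semantics, which can be sketched as below. The field names and both threshold values are assumptions; the text names the thresholds only as the first and second preset times.

```python
# Hedged sketch of the first preset condition: any one listed trigger
# suffices. Thresholds are assumed example values, not from the patent.

DWELL_THRESHOLD_S = 1.0   # "first preset time" (value assumed)
IDLE_THRESHOLD_S = 3.0    # "second preset time" (value assumed)

def preset_condition_met(state: dict) -> bool:
    """True if at least one trigger condition from the text holds."""
    return (
        state["dwell_s"] >= DWELL_THRESHOLD_S                      # angle paused long enough
        or (state["screen_on"] and state["idle_s"] >= IDLE_THRESHOLD_S)  # lit but idle
        or state["face_detected"]                                  # face / preset user's face
        or state["gesture_detected"]                               # preset gesture
    )
```

The second and third preset conditions described later have the same shape, differing only in which screen, camera, and gesture they reference.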
  • Displaying the second user interface on the second screen includes: when it is detected that the electronic device is in the second preset posture and the electronic device meets a second preset condition, the electronic device starts the second camera to capture images and displays the second user interface on the second screen. The second preset posture includes the first included angle being within a second preset range, and the second preset condition includes one or more of the following: the dwell time of the first included angle at its current value reaches the first preset time; while the second screen is lit, the electronic device receives no input operation on the second screen within the second preset time; a human face, or the face of a preset user, is detected in the image captured by the camera corresponding to the second screen; a second preset gesture is detected in the image captured by the camera corresponding to the second screen.
  • After the user places the electronic device in the second preset posture and makes it meet the second preset condition, the user can start the second camera corresponding to the second preset posture to shoot while keeping both hands free, and preview in real time through the second screen corresponding to that posture, which avoids cumbersome user operations and effectively improves the user experience.
  • The split-screen display on the second screen and the third screen includes: when it is detected that the electronic device is in the third preset posture and the electronic device meets a third preset condition, the electronic device starts the second camera to capture images and displays them in split screens on the second screen and the third screen. The third preset posture includes the first included angle being within a third preset range, and the third preset condition includes one or more of the following: the dwell time of the first included angle at its current value reaches the first preset time; while the second screen and/or the third screen are lit, the electronic device receives no input operation on the second screen or the third screen within the second preset time; a human face, or the face of a preset user, is detected in the image captured by the camera corresponding to the second screen; a third preset gesture is detected in the image captured by the camera corresponding to the second screen.
  • After the user places the electronic device in the third preset posture and makes it meet the third preset condition, the user can start the second camera corresponding to the third preset posture to shoot while keeping both hands free, preview in real time through the second screen corresponding to that posture, and view the interface elements associated with the images captured by the second camera through the third screen, avoiding cumbersome user operations and effectively improving the user experience.
  • The first preset posture further includes one of the following: the second included angle, between the third screen and the horizontal plane, is within a fourth preset range, the angle between the orientation of the third screen and the Z-axis of the geographic coordinate system is less than 90°, and the first preset range is greater than 0° and less than 120°; or, the second included angle is within the fourth preset range, the angle between the orientation of the third screen and the Z-axis of the geographic coordinate system is greater than 90°, and the first preset range is greater than 180° and less than 300°; or, the difference between a third included angle and the second included angle is within a fifth preset range, where the third included angle is the angle between the second screen and the horizontal plane, the angle between the orientation of the third screen and the Z-axis of the geographic coordinate system is greater than 90°, and the first preset range is greater than 0° and less than 180°.
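The three variants above reduce to a disjunction over the fold angle, the third screen's tilt, and its facing direction. The sketch below assumes this reading; since the text gives no numeric bounds for the fourth and fifth preset ranges, membership in them is passed in as booleans.

```python
# Sketch of the three first-preset-posture variants. "Faces up" stands for
# the angle between the third screen's orientation and the geographic Z-axis
# being less than 90 deg. Range memberships are supplied by the caller.

def in_first_preset_posture(first_angle: float,
                            second_angle_in_range: bool,
                            angle_diff_in_range: bool,
                            third_screen_faces_up: bool) -> bool:
    variant1 = (second_angle_in_range and third_screen_faces_up
                and 0 < first_angle < 120)
    variant2 = (second_angle_in_range and not third_screen_faces_up
                and 180 < first_angle < 300)
    variant3 = (angle_diff_in_range and not third_screen_faces_up
                and 0 < first_angle < 180)
    return variant1 or variant2 or variant3
```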
  • The first preset posture further includes: the second included angle between the third screen and the horizontal plane is within the fourth preset range, the angle between the orientation of the third screen and the Z-axis of the geographic coordinate system is less than 90°, and the first preset range is less than 120°. Displaying the first user interface on the first screen then includes displaying the first user interface rotated by 180° on the first screen.
  • The first preset posture includes a first stand state and a fifth stand state. Starting the first camera and displaying the first user interface when the first preset posture and first preset condition are detected includes: when it is detected that the electronic device is in the first stand state and meets the first preset condition, the electronic device starts the first camera to capture images and displays the first user interface on the first screen. After the first user interface is displayed on the first screen, the method further includes: when it is detected that the electronic device switches from the first stand state to the fifth stand state, the electronic device displays the first user interface rotated by 180° on the first screen.
  • In this way, the display direction of the user interface on the first screen can be adaptively adjusted according to the physical posture of the electronic device, which is convenient for the user to view and effectively improves the user experience.
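The stand-state-dependent rotation above amounts to a single mapping; a minimal sketch, assuming the stand-state names and a 0° default:

```python
# Illustrative rotation rule: the first user interface is shown rotated
# 180 deg in the fifth stand state, unrotated otherwise. Names are assumed.

def first_screen_rotation(stand_state: str) -> int:
    """Rotation in degrees applied to the first user interface."""
    return 180 if stand_state == "fifth_stand" else 0
```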
  • The method further includes: the electronic device identifies a first area in which preset local features are located in the image captured by the first camera, and the first preview display area is used to display a magnified image of the image in the first area.
  • After the user places the electronic device in the first preset posture and makes it meet the first preset condition, the user can start the first camera corresponding to the first preset posture to shoot while keeping both hands free, and preview in real time, through the first screen, the details of the preset local features in the image captured by the first camera.
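One way to realize the magnified preview above is to crop the identified first area and scale it up to fill the preview display area. The aspect-preserving uniform fit below is an assumption; the text only requires that a magnified image of the first area be shown.

```python
# Sketch: largest uniform scale at which the cropped feature region still
# fits inside the preview display area (aspect-preserving fit assumed).

def magnification_factor(feature_w: int, feature_h: int,
                         preview_w: int, preview_h: int) -> float:
    return min(preview_w / feature_w, preview_h / feature_h)
```

Cropping the first area and scaling it by this factor yields the enlarged image shown in the first preview display area.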
  • The method further includes: receiving a first input operation from the user; in response to the first input operation, the electronic device displays one or more of the following interface elements on the first user interface: a makeup test control, a shooting control, a camera switching control, a fill light control, a photo album control, and a display frame. Among them, the makeup test control is used to add a preset makeup effect to the face in the image displayed in the first preview display area
  • the shooting control is used to trigger the electronic device to save the image displayed in the first preview display area
  • the camera switching control is used to switch the camera for capturing images
  • the fill light control is used to supplement the ambient light
  • the photo album control is used to trigger the electronic device to display the user interface of the photo album application
  • the display frame is used to display the enlarged image of the preset local features in the image captured by the first camera.
  • After the user places the electronic device in the first preset posture and makes it meet the first preset condition, the user can start the first camera corresponding to the first preset posture to shoot while keeping both hands free, with only the image captured by the first camera displayed on the first screen. Then, after receiving the user's first input operation, other related interface elements, such as the shooting control, are displayed on the first screen.
  • The first camera is an ultraviolet camera, and the images it captures are used to highlight areas where sunscreen has been applied.
  • After the user places the electronic device in the first preset posture and makes it meet the first preset condition, the user can start the first camera corresponding to the first preset posture to shoot while keeping both hands free, and preview the application of sunscreen in real time on the first screen.
  • The method further includes: the electronic device controls the second screen and the third screen to turn off.
  • When the electronic device controls the first screen to light up, it controls the second screen and the third screen to turn off; in this way, energy consumption can be effectively saved.
  • The second preset posture further includes one of the following: the second included angle between the third screen and the horizontal plane is within the fourth preset range, the angle between the orientation of the third screen and the Z-axis of the geographic coordinate system is less than 90°, and the second preset range is greater than 60° and less than 180°; or, the second included angle is within the fourth preset range, the angle between the orientation of the third screen and the Z-axis of the geographic coordinate system is greater than 90°, and the second preset range is greater than 240° and less than 360°; or, the difference between the third included angle and the second included angle is within the fifth preset range, where the third included angle is the angle between the second screen and the horizontal plane, the angle between the orientation of the third screen and the Z-axis of the geographic coordinate system is greater than 90°, and the second preset range is greater than 180° and less than 360°.
  • The first included angle is neither 0° nor 180°.
  • the method further includes: the electronic device controls the first screen to turn off.
  • When the electronic device controls the second screen and the third screen to light up, it also controls the first screen to turn off. In this way, energy consumption can be effectively saved.
  • the third preset posture further includes: the second included angle between the third screen and the horizontal plane is within a fourth preset range, the included angle between the orientation of the third screen and the Z axis of the geographic coordinate system is less than 90°, and the angle within the third preset range is greater than 60° and less than 180°.
  • the display content on the third screen includes an enlarged image of a preset local feature in the image captured by the second camera.
  • When the user previews the image captured by the second camera in real time on the second screen, the user can also preview the details of the preset local features in the image captured by the second camera in real time on the third screen.
  • the split-screen display on the second screen and the third screen includes: displaying the third user interface in split screens on the second screen and the third screen; the display content of the second screen includes the third preview display area of the third user interface, and zero, one or more interface elements of the third user interface other than the third preview display area; the display content of the third screen includes one or more interface elements of the third user interface other than the display content of the second screen; the third user interface further includes one or more of the following interface elements: makeup test control, shooting control, camera switching control, fill light control, photo album control, and display frame; among them, the makeup test control is used to add a preset makeup effect to the face in the image in the preview display area of the third user interface; the shooting control is used to trigger the electronic device to save the image in the preview display area of the third user interface; the camera switching control is used to switch the camera for capturing images; the fill light control is used to supplement ambient light; the photo album control is used to trigger the electronic device to display the user interface of the photo album application; and the display frame is used to display the enlarged image of the preset local features in the image captured by the second camera.
  • After the user places the electronic device in the third preset posture and makes the electronic device meet the third preset condition, the user can, with both hands free, start the second camera corresponding to the third preset posture to shoot, perform a real-time preview on the second screen corresponding to the third preset posture, and control the shooting parameters of the second camera and the display effect of the preview image on the second screen through the interface elements displayed on the third screen, which effectively improves the user experience.
  • the present application provides a display method, which is applied to an electronic device.
  • the electronic device includes a first screen, a folding screen and a first camera.
  • the folding screen can be folded along the folding edge to form a second screen and a third screen.
  • the orientations of the first screen and the second screen are opposite, and the orientation of the first screen is consistent with the shooting direction of the first camera.
  • the method includes:
  • When it is detected that the electronic device is in the first preset posture and the electronic device meets the first preset condition, the electronic device starts the first camera to collect images and displays the first user interface on the first screen; the first preview display area of the first user interface is used to display images collected by the first camera. The first preset posture includes that the first included angle between the second screen and the third screen is within a first preset range, and the first preset condition includes one or more of the following: the pause time of the first included angle at the current included angle value reaches a first preset time; when the first screen is on, the electronic device does not receive any input operation on the first screen within a second preset time; in the image captured by the camera corresponding to the first screen, a human face or the face of a preset user is detected; in the image captured by the camera corresponding to the first screen, a first preset gesture is detected.
  • When the electronic device is in the first preset posture, it is convenient for the user to view the display content on the first screen.
  • After the user places the electronic device in the first preset posture and makes the electronic device meet the first preset condition, the user can, with both hands free, activate the first camera corresponding to the first preset posture to shoot, and perform a real-time preview on the first screen corresponding to the first preset posture, which avoids tedious user operations and effectively improves the user experience.
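As an illustration only, the posture-plus-condition gate described above can be sketched as follows. This is a minimal sketch, not the patent's implementation; all names, thresholds, and ranges (FIRST_PRESET_RANGE, the pause and idle times) are assumptions chosen for the example.

```python
FIRST_PRESET_RANGE = (0.0, 120.0)   # assumed first preset range for the hinge angle
FIRST_PRESET_TIME = 1.0             # assumed pause time (seconds) at the current angle
SECOND_PRESET_TIME = 3.0            # assumed idle time (seconds) without touch input

class PostureMonitor:
    """Tracks the first included angle (between the second and third screens)
    and decides when to start the first camera."""

    def __init__(self):
        self._stable_since = 0.0
        self._last_angle = None

    def angle_in_first_range(self, hinge_angle):
        lo, hi = FIRST_PRESET_RANGE
        return lo < hinge_angle < hi

    def angle_held(self, hinge_angle, now):
        # Restart the pause timer whenever the angle moves noticeably.
        if self._last_angle is None or abs(hinge_angle - self._last_angle) > 1.0:
            self._stable_since = now
            self._last_angle = hinge_angle
        return now - self._stable_since >= FIRST_PRESET_TIME

    def should_start_first_camera(self, hinge_angle, now, idle_seconds,
                                  face_detected, gesture_detected):
        # Any one of the first preset conditions suffices, but the device
        # must also be in the first preset posture (angle within range).
        condition = (self.angle_held(hinge_angle, now)
                     or idle_seconds >= SECOND_PRESET_TIME
                     or face_detected
                     or gesture_detected)
        return self.angle_in_first_range(hinge_angle) and condition
```

The same gate shape applies to the second and third preset postures, with their own ranges and corresponding cameras.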
  • the camera corresponding to the above-mentioned first screen and the first camera may be the same camera, or may be different cameras.
  • the above-mentioned camera corresponding to the first screen is a low-power camera.
  • the electronic device further includes a second camera, and the orientation of the second screen is consistent with the shooting direction of the second camera.
  • the folding screen is folded along the folding edge to form the second screen and the third screen, and the third screen and the second camera are located on different sides of the folding edge.
  • the method further includes: when it is detected that the electronic device is in a second preset posture and the electronic device satisfies a second preset condition, the electronic device starts the second camera to collect images and displays a second user interface on the second screen; the second preview display area of the second user interface is used to display images collected by the second camera. The second preset posture includes that the first included angle is within a second preset range, and the second preset condition includes one or more of the following: the pause time of the first included angle at the current included angle value reaches the first preset time; when the second screen is on, the electronic device does not receive any input operation on the second screen within the second preset time; in the image captured by the camera corresponding to the second screen, a human face or the face of a preset user is detected; in the image captured by the camera corresponding to the second screen, a second preset gesture is detected.
  • the electronic device is in the second preset posture, which can facilitate the user to view the display content on the second screen.
  • After the user places the electronic device in the second preset posture and makes the electronic device meet the second preset condition, the user can, with both hands free, start the second camera corresponding to the second preset posture to shoot and perform a real-time preview on the second screen corresponding to the second preset posture, which avoids cumbersome user operations and effectively improves the user experience.
  • the camera corresponding to the second screen and the second camera may be the same camera or different cameras.
  • the above-mentioned camera corresponding to the second screen is a low-power camera.
  • the electronic device further includes a second camera, and the orientation of the second screen is consistent with the shooting direction of the second camera.
  • the folding screen is folded along the folding edge to form the second screen and the third screen, and the third screen and the second camera are located on different sides of the folding edge.
  • the method further includes: when it is detected that the electronic device is in a third preset posture and the electronic device satisfies a third preset condition, the electronic device starts the second camera to collect images and performs split-screen display on the second screen and the third screen; the display content of the second screen includes a third preview display area, and the third preview display area is used to display images collected by the second camera. The third preset posture includes that the first included angle is within a third preset range, and the third preset condition includes one or more of the following: the pause time of the first included angle at the current included angle value reaches the first preset time; when the second screen and/or the third screen are on, the electronic device does not receive any input operation on the second screen or the third screen within the second preset time; in the image collected by the camera corresponding to the second screen, a human face or the face of a preset user is detected; in the image collected by the camera corresponding to the second screen, a third preset gesture is detected.
  • the electronic device is in the third preset posture, which can facilitate the user to check the display content of the second screen and the third screen.
  • After the user places the electronic device in the third preset posture and makes the electronic device meet the third preset condition, the user can, with both hands free, start the second camera corresponding to the third preset posture to shoot, perform a real-time preview on the second screen corresponding to the third preset posture, and view the interface elements associated with the image collected by the second camera on the third screen, avoiding cumbersome user operations and effectively improving the user experience.
  • the first preset posture further includes: the second included angle between the third screen and the horizontal plane is within a fourth preset range, the included angle between the orientation of the third screen and the Z axis of the geographic coordinate system is less than 90°, and the angle within the first preset range is greater than 0° and less than 120°; or, the first preset posture further includes: the second included angle is within the fourth preset range, the included angle between the orientation of the third screen and the Z axis of the geographic coordinate system is greater than 90°, and the angle within the first preset range is greater than 180° and less than 300°; or, the first preset posture further includes: the difference between the third included angle and the second included angle is within a fifth preset range, where the third included angle is the angle between the second screen and the horizontal plane, the included angle between the orientation of the third screen and the Z axis of the geographic coordinate system is greater than 90°, and the angle within the first preset range is greater than 0° and less than 180°.
  • the first preset posture further includes: the second angle between the third screen and the horizontal plane is within the fourth preset range, and the angle between the orientation of the third screen and the Z axis of the geographic coordinate system is less than 90° , the angle within the first preset range is less than 120°; the above displaying the first user interface on the first screen includes: displaying the first user interface rotated by 180° on the first screen.
  • the display direction of the user interface corresponding to the first screen can be adaptively adjusted according to the physical posture of the electronic device, so as to facilitate the user's viewing and effectively improve the user experience.
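The adaptive display direction described above can be illustrated with a small decision function. All range values are placeholders, and `third_faces_up` stands in for "the included angle between the third screen's orientation and the Z axis of the geographic coordinate system is less than 90°"; none of these names come from the patent.

```python
def first_screen_rotation(second_included_angle, third_faces_up, first_included_angle):
    """Return the rotation (in degrees) to apply to the first user interface.

    second_included_angle: angle between the third screen and the horizontal plane.
    third_faces_up: True when the third screen's orientation makes an angle of
    less than 90 degrees with the Z axis of the geographic coordinate system.
    first_included_angle: hinge angle between the second and third screens.
    """
    FOURTH_PRESET_RANGE = (-10.0, 10.0)  # assumed: third screen near-horizontal
    lo, hi = FOURTH_PRESET_RANGE
    if lo <= second_included_angle <= hi and third_faces_up and first_included_angle < 120.0:
        # The first screen hangs upside-down relative to the viewer in this
        # posture, so the UI is rotated 180 degrees for readability.
        return 180
    return 0
```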
  • the method further includes: the electronic device identifies the first area where the preset local features are located in the image captured by the first camera, and the first preview display area is used to display a magnified image of the image in the first area.
  • After the user places the electronic device in the first preset posture and makes the electronic device meet the first preset condition, the user can, with both hands free, activate the first camera corresponding to the first preset posture to shoot, and preview the details of the preset local features in the image captured by the first camera in real time on the first screen.
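Magnifying the first area for preview can be illustrated with a plain nearest-neighbour crop-and-scale. This is a sketch only: the feature detector is out of scope, so a bounding box is assumed as input, and a 2-D list stands in for a camera frame.

```python
def magnify_region(frame, box, out_w, out_h):
    """Crop box = (x, y, w, h) out of frame (a 2-D list of pixel values)
    and scale it to out_w x out_h using nearest-neighbour sampling."""
    x, y, w, h = box
    return [[frame[y + (j * h) // out_h][x + (i * w) // out_w]
             for i in range(out_w)]
            for j in range(out_h)]

# Example: magnify the central 2x2 patch of a 4x4 frame to 4x4.
frame = [[r * 10 + c for c in range(4)] for r in range(4)]
zoomed = magnify_region(frame, (1, 1, 2, 2), 4, 4)
```

A production path would instead crop the camera buffer and let the display pipeline scale it, but the indexing logic is the same.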
  • the method further includes: receiving a user's first input operation; in response to the first input operation, the electronic device displays one or more of the following interface elements in the first user interface: makeup test control, shooting control, camera switching control, fill light control, photo album control, and display frame; among them, the makeup test control is used to add preset makeup effects to the face in the image displayed in the first preview display area; the shooting control is used to trigger the electronic device to save the image displayed in the first preview display area; the camera switching control is used to switch the camera for capturing images; the fill light control is used to supplement the ambient light; the photo album control is used to trigger the electronic device to display the user interface of the photo album application; and the display frame is used to display the enlarged image of the preset local features in the image captured by the first camera.
  • After the user places the electronic device in the first preset posture and makes the electronic device meet the first preset condition, the user can, with both hands free, activate the first camera corresponding to the first preset posture to shoot, and display only the image captured by the first camera on the first screen. Then, after receiving the user's first input operation, other related interface elements, such as shooting controls, are displayed on the first screen.
  • the first camera is an ultraviolet camera, and the images collected by the first camera are used to highlight areas where sunscreen is applied.
  • After the user places the electronic device in the first preset posture and makes the electronic device meet the first preset condition, the user can, with both hands free, activate the first camera corresponding to the first preset posture to shoot, and preview the application of the sunscreen in real time on the first screen.
  • when it is detected that the electronic device is in the first preset posture and the electronic device meets the first preset condition, the method further includes: the electronic device controls the second screen and the third screen to turn off.
  • When the electronic device controls the first screen to light up, it also controls the second screen and the third screen to turn off. In this way, energy consumption can be effectively saved.
  • the second preset posture further includes: the second included angle between the third screen and the horizontal plane is within a fourth preset range, the included angle between the orientation of the third screen and the Z axis of the geographic coordinate system is less than 90°, and the angle within the second preset range is greater than 60° and less than 180°; or, the second preset posture further includes: the second included angle is within the fourth preset range, the included angle between the orientation of the third screen and the Z axis of the geographic coordinate system is greater than 90°, and the angle within the second preset range is greater than 240° and less than 360°; or, the second preset posture further includes: the difference between the third included angle and the second included angle is within a fifth preset range, where the third included angle is the angle between the second screen and the horizontal plane, the included angle between the orientation of the third screen and the Z axis of the geographic coordinate system is greater than 90°, and the angle within the second preset range is greater than 180° and less than 360°.
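The three geometric variants of the second preset posture enumerated above can be expressed as a single predicate. This is an illustrative sketch: the fourth and fifth preset ranges are placeholders, and `third_faces_up` abbreviates "the included angle between the third screen's orientation and the Z axis is less than 90°".

```python
def matches_second_posture(first_angle, second_angle, third_angle, third_faces_up,
                           fourth_range=(-10.0, 10.0), fifth_range=(-10.0, 10.0)):
    """second_angle: third screen vs. horizontal plane (the 'second included angle');
    third_angle: second screen vs. horizontal plane (the 'third included angle')."""
    lo4, hi4 = fourth_range
    lo5, hi5 = fifth_range
    # Variant 1: third screen near-horizontal and facing up.
    if lo4 <= second_angle <= hi4 and third_faces_up and 60 < first_angle < 180:
        return True
    # Variant 2: third screen near-horizontal and facing down.
    if lo4 <= second_angle <= hi4 and not third_faces_up and 240 < first_angle < 360:
        return True
    # Variant 3: second and third screens near-parallel, third facing down.
    if lo5 <= (third_angle - second_angle) <= hi5 and not third_faces_up \
            and 180 < first_angle < 360:
        return True
    return False
```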
  • the first included angle does not include 0° and 180°.
  • when it is detected that the electronic device is in the second preset posture and the electronic device satisfies the second preset condition, the method further includes: the electronic device controls the first screen to turn off.
  • When the electronic device controls the second screen and the third screen to light up, it also controls the first screen to turn off. In this way, energy consumption can be effectively saved.
  • the third preset posture further includes: the second included angle between the third screen and the horizontal plane is within a fourth preset range, the included angle between the orientation of the third screen and the Z axis of the geographic coordinate system is less than 90°, and the angle within the third preset range is greater than 60° and less than 180°.
  • the display content on the third screen includes an enlarged image of a preset local feature in the image captured by the second camera.
  • When the user previews the image captured by the second camera in real time on the second screen, the user can also preview the details of the preset local features in the image captured by the second camera in real time on the third screen.
  • the split-screen display on the second screen and the third screen includes: displaying the third user interface in split screens on the second screen and the third screen; the display content of the second screen includes the third preview display area of the third user interface, and zero, one or more interface elements of the third user interface other than the third preview display area; the display content of the third screen includes one or more interface elements of the third user interface other than the display content of the second screen; the third user interface further includes one or more of the following interface elements: makeup test control, shooting control, camera switching control, fill light control, photo album control, and display frame; among them, the makeup test control is used to add a preset makeup effect to the face in the image in the preview display area of the third user interface; the shooting control is used to trigger the electronic device to save the image in the preview display area of the third user interface; the camera switching control is used to switch the camera for capturing images; the fill light control is used to supplement ambient light; the photo album control is used to trigger the electronic device to display the user interface of the photo album application; and the display frame is used to display the enlarged image of the preset local features in the image captured by the second camera.
  • After the user places the electronic device in the third preset posture and makes the electronic device meet the third preset condition, the user can, with both hands free, start the second camera corresponding to the third preset posture to shoot, perform a real-time preview on the second screen corresponding to the third preset posture, and control the shooting parameters of the second camera and the display effect of the preview image on the second screen through the interface elements displayed on the third screen, which effectively improves the user experience.
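The partition of the third user interface across the two screens can be sketched as follows. Element names are illustrative stand-ins for the controls listed above, not identifiers from the patent.

```python
# All interface elements of the (hypothetical) third user interface, in layout order.
ALL_ELEMENTS = ["preview_area", "makeup_test", "shutter", "camera_switch",
                "fill_light", "album", "display_frame"]

def split_ui(on_second=("preview_area",)):
    """The preview area (and optionally a few more elements) goes to the
    second screen; every remaining element goes to the third screen."""
    second = [e for e in ALL_ELEMENTS if e in on_second]
    third = [e for e in ALL_ELEMENTS if e not in on_second]
    return second, third
```

With the default argument, only the live preview occupies the second screen while all controls land on the third screen, matching the hands-free arrangement the passage describes.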
  • the present application provides an electronic device.
  • the electronic device includes a first screen, a folding screen and a first camera.
  • the folding screen can be folded along the folding edge to form a second screen and a third screen.
  • the orientations of the first screen and the second screen are opposite, the orientation of the first screen is consistent with the shooting direction of the first camera, and the orientation of the second screen is consistent with the shooting direction of the second camera.
  • the electronic device may include a plurality of functional modules or units for correspondingly executing the display method provided in the first aspect.
  • For example, a detection unit and a display unit.
  • a detection unit configured to determine that the electronic device is in a first preset posture based on the detected first angle between the second screen and the third screen;
  • the display unit is configured to display the first user interface on the first screen based on the first preset posture of the electronic device; the first preview display area of the first user interface is used to display images collected by the first camera, and in the first preset posture, the first included angle does not include 0° and 180°;
  • the detection unit is further configured to determine that the electronic device is in a second preset posture based on the detected first included angle
  • the display unit is further configured to display a second user interface on the second screen based on the second preset posture of the electronic device; the second preview display area of the second user interface is used to display images captured by the second camera, and in the second preset posture, the first included angle does not include 0° and 180°.
  • the detection unit is further configured to determine that the electronic device is in a third preset posture based on the detected first included angle; the display unit is also configured to determine, based on the third preset posture of the electronic device, at The second screen and the third screen are displayed in split screens, the display content of the second screen includes a third preview display area, and the third preview display area is used to display images collected by the second camera.
  • in the third preset posture, the first included angle does not include 0° and 180°.
  • displaying the first user interface on the first screen based on the first preset posture of the electronic device includes: when it is detected that the electronic device is in the first preset posture and the electronic device meets the first preset condition, starting the first camera to collect images, and displaying the first user interface on the first screen; wherein, the first preset posture includes that the first included angle is within the first preset range, and the first preset condition includes one or more of the following: the pause time of the first included angle at the current included angle value reaches the first preset time; when the first screen is on, the electronic device does not receive any input operation on the first screen within the second preset time; in the image captured by the camera corresponding to the first screen, a human face or the face of a preset user is detected; in the image captured by the camera corresponding to the first screen, a first preset gesture is detected.
  • displaying the second user interface on the second screen based on the second preset posture of the electronic device includes: when it is detected that the electronic device is in the second preset posture and the electronic device meets the second preset condition, starting the second camera to collect images, and displaying the second user interface on the second screen; wherein, the second preset posture includes that the first included angle is within the second preset range, and the second preset condition includes one or more of the following: the pause time of the first included angle at the current included angle value reaches the first preset time; when the second screen is on, the electronic device does not receive any input operation on the second screen within the second preset time; in the image captured by the camera corresponding to the second screen, a human face or the face of a preset user is detected; in the image captured by the camera corresponding to the second screen, a second preset gesture is detected.
  • the above-mentioned split-screen display on the second screen and the third screen based on the third preset posture of the electronic device includes: when it is detected that the electronic device is in the third preset posture and the electronic device satisfies the third preset condition, starting the second camera to collect images, and performing split-screen display on the second screen and the third screen; wherein, the third preset posture includes that the first included angle is within the third preset range, and the third preset condition includes one or more of the following: the pause time of the first included angle at the current included angle value reaches the first preset time; when the second screen and/or the third screen are on, the electronic device does not receive any input operation on the second screen or the third screen within the second preset time; in the image collected by the camera corresponding to the second screen, a human face or the face of the preset user is detected; in the image collected by the camera corresponding to the second screen, a third preset gesture is detected.
  • the present application provides an electronic device, the electronic device includes a first screen, a folding screen and a first camera, the folding screen can be folded along the folding edge to form a second screen and a third screen, the orientations of the first screen and the second screen are opposite, and the orientation of the first screen is consistent with the shooting direction of the first camera.
  • the electronic device may include multiple functional modules or units for correspondingly executing the display method provided by the second aspect.
  • For example, a display unit.
  • the display unit is configured to, when it is detected that the electronic device is in the first preset posture and the electronic device satisfies the first preset condition, start the first camera to collect images and display the first user interface on the first screen; the first preview display area of the first user interface is used to display the image captured by the first camera. The first preset posture includes that the first included angle between the second screen and the third screen is within the first preset range, and the first preset condition includes one or more of the following: the pause time of the first included angle at the current included angle value reaches the first preset time; when the first screen is on, the electronic device does not receive any input operation on the first screen within the second preset time; in the image captured by the camera corresponding to the first screen, a human face or the face of the preset user is detected; in the image captured by the camera corresponding to the first screen, a first preset gesture is detected.
  • the camera corresponding to the above-mentioned first screen and the first camera may be the same camera, or may be different cameras.
  • the above-mentioned camera corresponding to the first screen is a low-power camera.
  • the electronic device further includes a second camera, and the orientation of the second screen is consistent with the shooting direction of the second camera.
  • the folding screen is folded along the folding edge to form the second screen and the third screen, and the third screen and the second camera are located on different sides of the folding edge.
  • the display unit is also used to, when it is detected that the electronic device is in the second preset posture and the electronic device meets the second preset condition, start the second camera to collect images and display the second user interface on the second screen; the second preview display area of the second user interface is used to display the image collected by the second camera. The second preset posture includes that the first included angle is within a second preset range, and the second preset condition includes one or more of the following: the pause time of the first included angle at the current included angle value reaches the first preset time; when the second screen is on, the electronic device does not receive any input operation on the second screen within the second preset time; in the image captured by the camera corresponding to the second screen, a human face or the face of a preset user is detected; in the image captured by the camera corresponding to the second screen, a second preset gesture is detected.
  • the camera corresponding to the second screen and the second camera may be the same camera or different cameras.
  • the above-mentioned camera corresponding to the second screen is a low-power camera.
  • the electronic device further includes a second camera, and the orientation of the second screen is consistent with the shooting direction of the second camera.
  • the folding screen is folded along the folding edge to form the second screen and the third screen, and the third screen and the second camera are located on different sides of the folding edge.
  • the display unit is further configured to, when it is detected that the electronic device is in a third preset posture and the electronic device satisfies the third preset condition, start the second camera to collect images and perform split-screen display on the second screen and the third screen; the display content of the second screen includes a third preview display area, and the third preview display area is used to display the image collected by the second camera. The third preset posture includes that the first included angle is within a third preset range, and the third preset condition includes one or more of the following: the pause time of the first included angle at the current included angle value reaches the first preset time; when the second screen and/or the third screen are on, the electronic device does not receive any input operation on the second screen or the third screen within the second preset time; in the image collected by the camera corresponding to the second screen, a human face or the preset user's face is detected; in the image collected by the camera corresponding to the second screen, a third preset gesture is detected.
  • the first preset posture further includes: the second included angle between the third screen and the horizontal plane is within a fourth preset range, the included angle between the orientation of the third screen and the Z axis of the geographic coordinate system is less than 90°, and the angle within the first preset range is greater than 0° and less than 120°; or, the first preset posture further includes: the second included angle is within the fourth preset range, the included angle between the orientation of the third screen and the Z axis of the geographic coordinate system is greater than 90°, and the angle within the first preset range is greater than 180° and less than 300°; or, the first preset posture further includes: the difference between the third included angle and the second included angle is within a fifth preset range, where the third included angle is the angle between the second screen and the horizontal plane, the included angle between the orientation of the third screen and the Z axis of the geographic coordinate system is greater than 90°, and the angle within the first preset range is greater than 0° and less than 180°.
  • the first preset posture further includes: the second angle between the third screen and the horizontal plane is within the fourth preset range, and the angle between the orientation of the third screen and the Z axis of the geographic coordinate system is less than 90° , the angle within the first preset range is less than 120°; the display unit displays the first user interface on the first screen, comprising: the display unit displays the first user interface rotated by 180° on the first screen.
  • the electronic device further includes an identification unit.
  • the identification unit is used to identify the first area where the preset local feature is located in the image captured by the first camera, and the first preview display area is used to display the enlarged image of the image in the first area.
  • the electronic device further includes a receiving unit.
  • the receiving unit is used to receive the user's first input operation; the display unit is also used to, in response to the first input operation, display one or more of the following interface elements in the first user interface: makeup test control, shooting control, camera switching control, fill light control, photo album control, and display frame; wherein, the makeup test control is used to add preset makeup effects to the face in the image displayed in the first preview display area; the shooting control is used to trigger the electronic device to save the image displayed in the first preview display area; the camera switching control is used to switch the camera that captures the image; the fill light control is used to supplement the ambient light; the photo album control is used to trigger the electronic device to display the user interface of the album application; and the display frame is used to display the enlarged image of the preset local features in the image captured by the first camera.
  • the first camera is an ultraviolet camera, and the images collected by the first camera are used to highlight areas where sunscreen is applied.
  • the display unit is further configured to control the second screen and the third screen to turn off when it is detected that the electronic device is in the first preset posture and the electronic device meets the first preset condition.
  • the second preset posture further includes: the second included angle between the third screen and the horizontal plane is within the fourth preset range, the angle between the orientation of the third screen and the Z axis of the geographic coordinate system is less than 90°, and the angle within the second preset range is greater than 60° and less than 180°; or, the second preset posture further includes: the second included angle is within the fourth preset range, the angle between the orientation of the third screen and the Z axis of the geographic coordinate system is greater than 90°, and the angle within the second preset range is greater than 240° and less than 360°; or, the second preset posture further includes: the difference between the third included angle and the second included angle is within the fifth preset range, where the third included angle is the angle between the second screen and the horizontal plane, the angle between the orientation of the third screen and the Z axis of the geographic coordinate system is greater than 90°, and the angle within the second preset range is greater than 180° and less than 360°.
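The three alternative branches above can be read as a single predicate. This is only an illustrative sketch of the claim language: the function name, parameter names, and the placeholder values for the fourth and fifth preset ranges are assumptions, since the text does not fix those ranges here.

```python
def in_second_preset_posture(first_angle, second_angle, third_angle,
                             orient_z_angle,
                             fourth_range=(0.0, 40.0),
                             fifth_range=(-10.0, 10.0)):
    """Illustrative check of the three alternative branches of the
    'second preset posture'.

    first_angle    -- angle between the A (second) and B (third) screens
    second_angle   -- angle between the third screen and the horizontal plane
    third_angle    -- angle between the second screen and the horizontal plane
    orient_z_angle -- angle between the third screen's orientation and the
                      Z axis of the geographic coordinate system
    fourth_range, fifth_range -- placeholder preset ranges (assumed values)
    """
    branch1 = (fourth_range[0] <= second_angle <= fourth_range[1]
               and orient_z_angle < 90.0
               and 60.0 < first_angle < 180.0)
    branch2 = (fourth_range[0] <= second_angle <= fourth_range[1]
               and orient_z_angle > 90.0
               and 240.0 < first_angle < 360.0)
    branch3 = (fifth_range[0] <= third_angle - second_angle <= fifth_range[1]
               and orient_z_angle > 90.0
               and 180.0 < first_angle < 360.0)
    return branch1 or branch2 or branch3
```

For example, a fold angle of 90° with the third screen nearly flat and facing upward satisfies the first branch, while a fold angle of 30° satisfies none of them.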
  • the first included angle does not include 0° and 180°.
  • the display unit is further configured to control the first screen to turn off when it is detected that the electronic device is in a second preset posture and the electronic device meets a second preset condition.
  • the third preset posture further includes: the second included angle between the third screen and the horizontal plane is within the fourth preset range, the angle between the orientation of the third screen and the Z axis of the geographic coordinate system is less than 90°, and the angle within the third preset range is greater than 60° and less than 180°.
  • the display content on the third screen includes an enlarged image of a preset local feature in the image captured by the second camera.
  • the display unit performing split-screen display on the second screen and the third screen includes: the display unit performing split-screen display of the third user interface on the second screen and the third screen;
  • the display content of the second screen includes the third preview display area of the third user interface, and zero, one or more interface elements in the third user interface other than the third preview display area, and the display content of the third screen includes the third user interface;
  • the third user interface also includes one or more of the following interface elements: a makeup test control, a shooting control, a camera switching control, a fill light control, a photo album control, and a display frame; wherein the makeup test control is used to add a preset makeup effect to the face in the image in the preview display area of the third user interface; the shooting control is used to trigger the electronic device to save the image in the preview display area of the third user interface; the camera switching control is used to switch the camera that captures the image; the fill light control is used to supplement the ambient light; and the photo album control is used to trigger the electronic device to display the user interface of the photo album application.
  • the present application provides an electronic device, including one or more processors and one or more memories.
  • the one or more memories are coupled with the one or more processors and are used to store computer program code; the computer program code includes computer instructions, and when the one or more processors execute the computer instructions, the electronic device performs the display method in any possible implementation manner of any one of the above aspects.
  • an embodiment of the present application provides a computer storage medium, including computer instructions, which, when the computer instructions are run on the electronic device, cause the electronic device to execute the display method in any possible implementation of any one of the above aspects.
  • an embodiment of the present application provides a computer program product, which, when running on a computer, causes the computer to execute the display method in any possible implementation manner of any one of the above aspects.
  • FIG. 1A to FIG. 1F are schematic diagrams of the product form of the vertically folding electronic device provided by the embodiment of the present application.
  • FIG. 2A to FIG. 2F are schematic diagrams of the product form of the horizontally folding electronic device provided by the embodiment of the present application.
  • FIG. 3A to FIG. 3C are schematic diagrams of the outer screen of the vertically folding electronic device provided by the embodiment of the present application.
  • FIG. 4A to FIG. 4C are schematic diagrams of the outer screen of the horizontally folding electronic device provided by the embodiment of the present application.
  • FIG. 5 is a schematic structural diagram of an electronic device provided by an embodiment of the present application.
  • FIG. 6A is a schematic diagram of a geographic coordinate system provided by the embodiment of the present application.
  • FIG. 6B is a schematic diagram of calculating the angle between the A screen and the B screen provided by the embodiment of the present application.
  • FIG. 7A is a schematic diagram of calculating the angle between the A screen and the horizontal plane provided by the embodiment of the present application.
  • FIG. 7B is a schematic diagram of calculating the angle between the B screen and the horizontal plane provided by the embodiment of the present application.
  • FIG. 8A is a schematic diagram of a coordinate system of an electronic device provided by an embodiment of the present application.
  • FIG. 8B is a schematic diagram of a coordinate system of another electronic device provided by an embodiment of the present application.
  • FIG. 9A to FIG. 9F are schematic diagrams of the six stand states of the electronic device provided by the embodiment of the present application.
  • FIG. 10A to FIG. 10C are display interfaces of the C screen in a specific stand state provided by the embodiment of the present application.
  • FIG. 11A to FIG. 11F are schematic diagrams of a set of user interfaces provided by the embodiment of the present application.
  • FIG. 12A to FIG. 12N are schematic diagrams of another set of user interfaces provided by the embodiment of the present application.
  • FIG. 13A to FIG. 13F are display interfaces of the A screen in a specific stand state provided by the embodiment of the present application.
  • FIG. 14A to FIG. 14B are display interfaces of the inner screen provided by the embodiment of the present application.
  • FIG. 15A to FIG. 15F are display interfaces of the inner screen in a specific stand state provided by the embodiment of the present application.
  • FIG. 16 is a schematic diagram of a software architecture of an electronic device provided by an embodiment of the present application.
  • FIG. 17 is a schematic structural diagram of another electronic device provided by an embodiment of the present application.
  • the terms "first" and "second" are used for descriptive purposes only, and cannot be understood as indicating or implying relative importance or implicitly specifying the quantity of the indicated technical features. Therefore, a feature defined as "first" or "second" may explicitly or implicitly include one or more of these features. In the description of the embodiments of the present application, unless otherwise specified, "multiple" means two or more.
  • the term "user interface (UI)" in the following embodiments of this application is a medium interface for interaction and information exchange between an application program or an operating system and a user, and it realizes the conversion between the internal form of information and a form acceptable to the user.
  • the user interface is source code written in a specific computer language such as Java or extensible markup language (XML); the source code of the interface is parsed and rendered on the electronic device, and is finally presented as content that the user can recognize.
  • the commonly used form of the user interface is the graphical user interface (GUI), which refers to a user interface related to computer operation that is displayed in a graphical way. It may include text, icons, buttons, menus, tabs, text boxes, dialog boxes, status bars, navigation bars, widgets, and other visible interface elements displayed on the display screen of the electronic device.
  • An embodiment of the present application provides a display method for a folding screen, which can be applied to an electronic device 100 with a vertical folding screen or a horizontal folding screen.
  • the folding screen of the electronic device 100 can be folded along the folding edge to form at least two screens, such as an A screen and a B screen.
  • the folding screen can take on various forms.
  • the folding screen of the electronic device 100 may be in an unfolded state, a forward half-folded state, or a forward folded state.
  • the folding screen of the electronic device 100 may also present a reverse half-folded form or a reversed folded form.
  • FIG. 1A to FIG. 1F show a schematic view of a product form of an electronic device 100 with a vertical folding screen according to an embodiment of the present application.
  • the folding edge of the vertical folding screen is perpendicular to the top edge line of the electronic device 100 (for the convenience of description, the top edge line is simply referred to as the top edge) and the bottom edge line (for the convenience of description, the bottom edge line is simply referred to as the bottom edge).
  • FIG. 1A is a schematic diagram of an unfolded state of a vertically folding screen.
  • the vertical folding screen shown in FIG. 1A can be folded inward along the folding edge in the directions 11a and/or 11b shown in FIG. 1A to form the A screen (i.e., the second screen) and the B screen (i.e., the third screen) in the forward half-folded form shown in FIG. 1B and FIG. 1C.
  • the A screen and the front camera on the electronic device 100 may be on the same side of the folding edge.
  • the vertical folding screen shown in FIG. 1C can continue to be folded inward along the folding edge in the directions 11a and 11b shown in FIG. 1C to form the vertical folding screen in the forward folded form shown in FIG. 1D.
  • as shown in FIG. 1D, after the vertical folding screen of the electronic device 100 is fully folded forward, the A screen and the B screen face each other and are invisible to the user.
  • the vertical folding screen shown in FIG. 1A can also be folded outward along the folding edge to form the A screen and the B screen in the reverse half-folded form shown in FIG. 1E.
  • the vertical folding screen shown in FIG. 1E can continue to be folded outward along the folding edge in the directions 22a and 22b shown in FIG. 1E to form the vertical folding screen in the reverse folded form shown in FIG. 1F.
  • as shown in FIG. 1F, after the vertical folding screen of the electronic device 100 is fully folded in reverse, the A screen and the B screen face in opposite directions, and the back of the electronic device 100 (i.e., the back of the A screen and the back of the B screen) is invisible to the user.
  • FIG. 2A to FIG. 2F show a schematic view of a product form of an electronic device 100 with a horizontal folding screen according to an embodiment of the present application, and the folding edge of the horizontal folding screen is parallel to the top edge and the bottom edge of the electronic device 100 .
  • FIG. 2A is a schematic diagram of an unfolded form of the horizontal folding screen.
  • the horizontal folding screen shown in FIG. 2A can be folded inward along the folded edge according to the directions 33a and/or 33b shown in FIG. 2A to form A-screen and B-screen in the forward half-folding configuration shown in FIG. 2B and FIG. 2C .
  • the A screen and the front camera on the electronic device 100 may be on the same side of the folding edge.
  • the horizontal folding screen shown in FIG. 2C can continue to be folded inward along the folding edge according to the directions 33a and 33b shown in FIG. 2C to form the horizontal folding screen in the forward folding form shown in FIG. 2D .
  • as shown in FIG. 2D, after the horizontal folding screen of the electronic device 100 is fully folded forward, the A screen and the B screen face each other and are invisible to the user.
  • the horizontal folding screen shown in FIG. 2A can also be folded outward along the folding edge to form the A screen and the B screen in the reverse half-folded form shown in FIG. 2E.
  • the horizontal folding screen shown in FIG. 2E can continue to be folded outward along the folded edge in the directions 44a and 44b shown in FIG. 2E to form a reverse folding horizontal folding screen shown in FIG. 2F .
  • as shown in FIG. 2F, after the horizontal folding screen of the electronic device 100 is fully folded in reverse, the A screen and the B screen face in opposite directions, and the back of the electronic device 100 (i.e., the back of the A screen and the back of the B screen) is invisible to the user.
  • a display screen may also be provided on the back of the A screen and/or the B screen of the folding screen (vertical folding screen or horizontal folding screen) provided in the embodiment of the present application.
  • the folding screen composed of the A screen and the B screen is the inner screen of the electronic device 100, and the A screen, the B screen and the front camera are located on the front of the electronic device 100; the C screen is the outer screen of the electronic device 100, and the C screen and the rear camera are located on the back of the electronic device 100.
  • the C screen may be called the first screen, the A screen may be called the second screen, and the B screen may be called the third screen; the C screen is set on the back of the A screen, and the orientations of the C screen and the A screen are opposite.
  • the rear camera corresponding to the C screen may be called a first camera
  • the front camera corresponding to the A screen may be called a second camera. It can be understood that the orientation of the C screen is consistent with the shooting direction of the rear camera, and the orientation of the A screen is consistent with the shooting direction of the front camera.
  • FIG. 3A to FIG. 3C show schematic diagrams of the outer screen of the electronic device 100 whose inner screen is configured as a vertical folding screen.
  • FIG. 4A to FIG. 4C show schematic diagrams of the outer screen of the electronic device 100 whose inner screen is configured as a horizontal folding screen.
  • a C-screen may be provided on the back of the A-screen in the vertical folding screen of the electronic device 100 .
  • a C-screen may be provided on the back of the A-screen in the horizontal folding screen of the electronic device 100 .
  • the C screen is located on the back of the A screen, and the C screen is visible to the user.
  • the C-screen may be on the same side of the folding edge as the rear camera of the electronic device 100 .
  • a vertically foldable C-screen can be arranged on the back of the A-screen and B-screen of the inner screen.
  • a horizontally foldable C-screen can be provided on the back of the A-screen and B-screen of the inner screen.
  • for an electronic device 100 with a C screen, when the inner screen (that is, the folding screen composed of the A screen and the B screen) is in the folded state, the electronic device 100 can display a user interface on the C screen; when the inner screen is in the half-folded state or the unfolded state, the electronic device 100 may display a user interface on the A screen, the B screen and/or the C screen.
  • the folding screen of the electronic device 100 may surround the electronic device 100, and the above screens A, B and C may all be part of the folding screen.
  • the electronic device 100 may determine the form of the inner folding screen based on the detected included angle α (the first included angle) between the A screen and the B screen.
  • the included angle α between the A screen and the B screen of the folding screen (vertical folding screen or horizontal folding screen) of the electronic device 100 may range over [0°, 180°]; in this case, the electronic device 100 cannot be folded in reverse.
  • alternatively, the included angle α between the A screen and the B screen may range over [0°, 360°]; in this case, the electronic device 100 can be folded both forward and in reverse.
  • the included angle ⁇ between the A screen and the B screen may also be referred to as a first included angle.
  • the value range of α is [0°, 360°].
  • when the included angle α ∈ [0°, P1), the electronic device 100 can determine that the folding screen is in the forward folded state; when α ∈ [P1, P2), the electronic device 100 can determine that the folding screen is in the forward half-folded state; when α ∈ [P2, P3), the electronic device 100 can determine that the folding screen is in the unfolded state; when α ∈ [P3, P4), the electronic device 100 can determine that the folding screen is in the reverse half-folded state; and when α ∈ [P4, 360°], the electronic device 100 can determine that the folding screen is in the reverse folded state.
  • P1, P2, P3 and P4 are preset angle thresholds.
  • P1, P2, P3 and P4 may be set by the user in the electronic device 100, or set by the electronic device 100 by default.
  • the difference between P1 and 0°, the difference between P2 and 180°, the difference between P3 and 180°, and the difference between P4 and 360° are preset error values set by the electronic device 100 or the user. For example, if the preset error value is 2°, then P1, P2, P3 and P4 are 2°, 178°, 182° and 358°, respectively.
  • P1, P2, P3 and P4 may be determined according to users' habits when using the folding screen. For example, according to the usage habits of most users: when the included angle α between the A screen and the B screen is less than 50°, the user most likely intends not to use the A screen or the B screen; when α is greater than 50° and less than or equal to 160° (or greater than 190° and less than or equal to 280°), the user most likely intends to use the A screen and the B screen to display different content; when α is greater than 160° and less than or equal to 190°, the user most likely intends to use the A screen and the B screen as a whole (that is, as one complete display screen); and when α is greater than 280° and less than or equal to 360°, the user most likely intends to use the A screen or the B screen alone.
  • the value range of P1 can be (0, 40°), the value range of P2 can be [160°, 180°), the value range of P3 can be [180°, 190°), P4 The value range of can be [280°, 360°).
  • for example, P1 is 30°, P2 is 170°, P3 is 185°, and P4 is 300°.
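The state classification described above, comparing the first included angle α against the thresholds P1 to P4, can be sketched as follows. The function name is illustrative, and the default thresholds simply reuse the example values (30°, 170°, 185°, 300°) given above:

```python
def fold_form(alpha, p1=30.0, p2=170.0, p3=185.0, p4=300.0):
    """Classify the folding-screen form from the included angle alpha
    (degrees) between the A screen and the B screen.

    The thresholds p1..p4 default to the example values above; in
    practice they may be user-configured or device defaults.
    """
    if not 0.0 <= alpha <= 360.0:
        raise ValueError("alpha must be in [0, 360]")
    if alpha < p1:
        return "forward folded"       # alpha in [0, P1)
    if alpha < p2:
        return "forward half-folded"  # alpha in [P1, P2)
    if alpha < p3:
        return "unfolded"             # alpha in [P2, P3)
    if alpha < p4:
        return "reverse half-folded"  # alpha in [P3, P4)
    return "reverse folded"           # alpha in [P4, 360]
```

With the example thresholds, an angle of 180° is classified as unfolded and an angle of 350° as reverse folded.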
  • the at least two display screens formed by folding the folding screen in the embodiment of the present application may be multiple independent display screens, or a complete display screen of an integrated structure that is merely folded into at least two parts.
  • the folding screen may be a flexible folding screen, and the flexible folding screen includes folding edges made of flexible materials. Part or all of the flexible folding screen is made of flexible materials.
  • the at least two screens formed after the flexible folding screen is folded are one complete screen of an integral structure that has merely been folded into at least two parts.
  • the above folding screen may be a multi-screen folding screen.
  • the multi-screen folding screen may include multiple (two or more) display screens.
  • the multiple displays are separate displays.
  • the plurality of display screens can be sequentially connected via folding shafts. Each screen can rotate around the folding shaft connected to it to realize the folding of multi-screen folding screens.
  • the display method provided in the embodiments of the present application will be described by taking the folding screen as an example of a flexible folding screen that can be folded horizontally.
  • the electronic device 100 can be a terminal device running iOS, Android, Microsoft or another operating system. For example, the electronic device 100 can be a mobile phone, a tablet computer, a desktop computer, a laptop computer, a handheld computer, a notebook computer, an ultra-mobile personal computer (UMPC), a netbook, a cellular phone, a personal digital assistant (PDA), or an augmented reality (AR)/virtual reality (VR) device that includes the above-mentioned folding screen.
  • the specific type of the electronic device 100 is not specifically limited in the embodiment of the present application.
  • FIG. 5 shows a schematic structural diagram of the electronic device 100 .
  • the electronic device 100 may include: a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (universal serial bus, USB) interface 130, a charging management module 140, a power management module 141, a battery 142, antenna 1, antenna 2, mobile communication module 150, wireless communication module 160, audio module 170, speaker 170A, receiver 170B, microphone 170C, earphone interface 170D, sensor module 180, button 190, motor 191, indicator 192, camera 193, a display screen 194, and a subscriber identification module (subscriber identification module, SIM) card interface 195, etc.
  • the sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, bone conduction sensor 180M, etc.
  • the structure illustrated in the embodiment of the present invention does not constitute a specific limitation on the electronic device 100 .
  • the electronic device 100 may include more or fewer components than shown in the figure, or combine certain components, or separate certain components, or arrange different components.
  • the illustrated components can be realized in hardware, software or a combination of software and hardware.
  • the processor 110 may include one or more processing units. For example, the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a memory, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc. Different processing units may be independent devices, or may be integrated in one or more processors.
  • the controller may be the nerve center and command center of the electronic device 100 .
  • the controller can generate an operation control signal according to the instruction opcode and timing signal, and complete the control of fetching and executing the instruction.
  • a memory may also be provided in the processor 110 for storing instructions and data.
  • the memory in processor 110 is a cache memory.
  • the memory may hold instructions or data that the processor 110 has just used or used cyclically. If the processor 110 needs to use the instructions or data again, it can call them directly from the memory, which avoids repeated access and reduces the waiting time of the processor 110, thereby improving the efficiency of the system.
  • processor 110 may include one or more interfaces.
  • the interface may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface, etc.
  • the I2C interface is a bidirectional synchronous serial bus, including a serial data line (SDA) and a serial clock line (SCL).
  • processor 110 may include multiple sets of I2C buses.
  • the processor 110 can be respectively coupled to the touch sensor 180K, the charger, the flashlight, the camera 193 and the like through different I2C bus interfaces.
  • the processor 110 may be coupled to the touch sensor 180K through the I2C interface, so that the processor 110 and the touch sensor 180K communicate through the I2C bus interface to realize the touch function of the electronic device 100 .
  • the I2S interface can be used for audio communication.
  • processor 110 may include multiple sets of I2S buses.
  • the processor 110 may be coupled to the audio module 170 through an I2S bus to implement communication between the processor 110 and the audio module 170 .
  • the audio module 170 can transmit audio signals to the wireless communication module 160 through the I2S interface, so as to realize the function of answering calls through the Bluetooth headset.
  • the PCM interface can also be used for audio communication, sampling, quantizing and encoding the analog signal.
  • the audio module 170 and the wireless communication module 160 may be coupled through a PCM bus interface.
  • the audio module 170 can also transmit audio signals to the wireless communication module 160 through the PCM interface, so as to realize the function of answering calls through the Bluetooth headset. Both the I2S interface and the PCM interface can be used for audio communication.
  • the UART interface is a universal serial data bus used for asynchronous communication.
  • the bus can be a bidirectional communication bus. It converts the data to be transmitted between serial communication and parallel communication.
  • a UART interface is generally used to connect the processor 110 and the wireless communication module 160 .
  • the processor 110 communicates with the Bluetooth module in the wireless communication module 160 through the UART interface to realize the Bluetooth function.
  • the audio module 170 can transmit audio signals to the wireless communication module 160 through the UART interface, so as to realize the function of playing music through the Bluetooth headset.
  • the MIPI interface can be used to connect the processor 110 with peripheral devices such as the display screen 194 and the camera 193 .
  • MIPI interface includes camera serial interface (camera serial interface, CSI), display serial interface (display serial interface, DSI), etc.
  • the processor 110 communicates with the camera 193 through the CSI interface to realize the shooting function of the electronic device 100 .
  • the processor 110 communicates with the display screen 194 through the DSI interface to realize the display function of the electronic device 100 .
  • the GPIO interface can be configured by software.
  • the GPIO interface can be configured as a control signal or as a data signal.
  • the GPIO interface can be used to connect the processor 110 with the camera 193 , the display screen 194 , the wireless communication module 160 , the audio module 170 , the sensor module 180 and so on.
  • the GPIO interface can also be configured as an I2C interface, I2S interface, UART interface, MIPI interface, etc.
  • the USB interface 130 is an interface conforming to the USB standard specification, specifically, it can be a Mini USB interface, a Micro USB interface, a USB Type C interface, and the like.
  • the USB interface 130 can be used to connect a charger to charge the electronic device 100 , and can also be used to transmit data between the electronic device 100 and peripheral devices. It can also be used to connect headphones and play audio through them. This interface can also be used to connect other electronic devices, such as AR devices.
  • the interface connection relationship between the modules shown in the embodiment of the present invention is only a schematic illustration, and does not constitute a structural limitation of the electronic device 100 .
  • the electronic device 100 may also adopt different interface connection manners in the foregoing embodiments, or a combination of multiple interface connection manners.
  • the charging management module 140 is configured to receive a charging input from a charger.
  • the charger may be a wireless charger or a wired charger.
  • the charging management module 140 can receive charging input from the wired charger through the USB interface 130 .
  • the charging management module 140 may receive a wireless charging input through a wireless charging coil of the electronic device 100 . While the charging management module 140 is charging the battery 142 , it can also supply power to the electronic device 100 through the power management module 141 .
  • the power management module 141 is used for connecting the battery 142 , the charging management module 140 and the processor 110 .
  • the power management module 141 receives the input of the battery 142 and/or the charging management module 140, and supplies power for the processor 110, the internal memory 121, the external memory, the display screen 194, the camera 193, and the wireless communication module 160, etc.
  • the power management module 141 can also be used to monitor parameters such as battery capacity, battery cycle times, and battery health status (leakage, impedance).
  • the power management module 141 may also be disposed in the processor 110 .
  • the power management module 141 and the charging management module 140 may also be set in the same device.
  • the wireless communication function of the electronic device 100 can be realized by the antenna 1 , the antenna 2 , the mobile communication module 150 , the wireless communication module 160 , a modem processor, a baseband processor, and the like.
  • Antenna 1 and Antenna 2 are used to transmit and receive electromagnetic wave signals.
  • Each antenna in electronic device 100 may be used to cover single or multiple communication frequency bands. Different antennas can also be multiplexed to improve the utilization of the antennas.
  • Antenna 1 can be multiplexed as a diversity antenna of a wireless local area network.
  • the antenna may be used in conjunction with a tuning switch.
  • the mobile communication module 150 can provide wireless communication solutions including 2G/3G/4G/5G applied on the electronic device 100 .
  • the mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (low noise amplifier, LNA) and the like.
  • the mobile communication module 150 can receive electromagnetic waves through the antenna 1, filter and amplify the received electromagnetic waves, and send them to the modem processor for demodulation.
  • the mobile communication module 150 can also amplify the signals modulated by the modem processor, and convert them into electromagnetic waves through the antenna 1 for radiation.
  • at least part of the functional modules of the mobile communication module 150 may be set in the processor 110 .
  • at least part of the functional modules of the mobile communication module 150 and at least part of the modules of the processor 110 may be set in the same device.
  • a modem processor may include a modulator and a demodulator.
  • the modulator is used for modulating the low-frequency baseband signal to be transmitted into a medium-high frequency signal.
  • the demodulator is used to demodulate the received electromagnetic wave signal into a low frequency baseband signal. Then the demodulator sends the demodulated low-frequency baseband signal to the baseband processor for processing.
  • the low-frequency baseband signal is passed to the application processor after being processed by the baseband processor.
  • the application processor outputs sound signals through audio equipment (not limited to speaker 170A, receiver 170B, etc.), or displays images or videos through display screen 194 .
  • the modem processor may be a stand-alone device.
  • the modem processor may be independent from the processor 110, and be set in the same device as the mobile communication module 150 or other functional modules.
  • the wireless communication module 160 can provide wireless communication solutions applied on the electronic device 100, including wireless local area networks (WLAN) (such as a wireless fidelity (Wi-Fi) network), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR), and the like.
  • the wireless communication module 160 may be one or more devices integrating at least one communication processing module.
  • the wireless communication module 160 receives electromagnetic waves via the antenna 2 , frequency-modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 110 .
  • the wireless communication module 160 can also receive the signal to be sent from the processor 110 , frequency-modulate it, amplify it, and convert it into electromagnetic waves through the antenna 2 for radiation.
  • the antenna 1 of the electronic device 100 is coupled to the mobile communication module 150, and the antenna 2 is coupled to the wireless communication module 160, so that the electronic device 100 can communicate with the network and other devices through wireless communication technology.
  • the wireless communication technology may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technologies, etc.
  • the GNSS may include a global positioning system (GPS), a global navigation satellite system (GLONASS), a BeiDou navigation satellite system (BDS), a quasi-zenith satellite system (QZSS), and/or satellite based augmentation systems (SBAS).
  • the electronic device 100 realizes the display function through the GPU, the display screen 194 , and the application processor.
  • the GPU is a microprocessor for image processing, and is connected to the display screen 194 and the application processor. GPUs are used to perform mathematical and geometric calculations for graphics rendering.
  • Processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
  • the display screen 194 is used to display images, videos and the like.
  • the display screen 194 includes a display panel.
  • the display panel may be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, quantum dot light-emitting diodes (QLED), etc.
  • the electronic device 100 may include 1 or N display screens 194 , where N is a positive integer greater than 1.
  • the electronic device 100 can realize the shooting function through the ISP, the camera 193 , the video codec, the GPU, the display screen 194 and the application processor.
  • the ISP is used for processing the data fed back by the camera 193 .
  • light is transmitted through the lens to the photosensitive element of the camera, where the optical signal is converted into an electrical signal; the photosensitive element of the camera transmits the electrical signal to the ISP for processing, and the ISP converts it into an image visible to the naked eye.
  • ISP can also perform algorithm optimization on image noise, brightness, and skin color.
  • ISP can also optimize the exposure, color temperature and other parameters of the shooting scene.
  • the ISP may be located in the camera 193 .
  • Camera 193 is used to capture still images or video.
  • the object generates an optical image through the lens and projects it to the photosensitive element.
  • the photosensitive element may be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor.
  • the photosensitive element converts the light signal into an electrical signal, and then transmits the electrical signal to the ISP to convert it into a digital image signal.
  • the ISP outputs the digital image signal to the DSP for processing.
  • DSP converts digital image signals into standard RGB, YUV and other image signals.
  • the electronic device 100 may include 1 or N cameras 193 , where N is a positive integer greater than 1.
  • Digital signal processors are used to process digital signals. In addition to digital image signals, they can also process other digital signals. For example, when the electronic device 100 selects a frequency point, the digital signal processor is used to perform Fourier transform on the energy of the frequency point.
  • Video codecs are used to compress or decompress digital video.
  • the electronic device 100 may support one or more video codecs.
  • the electronic device 100 can play or record videos in various encoding formats, for example: moving picture experts group (MPEG) 1, MPEG2, MPEG3, MPEG4, and so on.
  • the NPU is a neural-network (NN) computing processor.
  • Applications such as intelligent cognition of the electronic device 100 can be realized through the NPU, such as image recognition, face recognition, speech recognition, text understanding, and the like.
  • the internal memory 121 may include one or more random access memories (RAM) and one or more non-volatile memories (NVM).
  • random access memory may include static random-access memory (SRAM), dynamic random access memory (DRAM), synchronous dynamic random access memory (SDRAM), double data rate synchronous dynamic random access memory (DDR SDRAM; for example, fifth-generation DDR SDRAM is generally called DDR5 SDRAM), etc.
  • non-volatile memory may include disk storage devices and flash memory.
  • flash memory can include NOR FLASH, NAND FLASH, 3D NAND FLASH, etc.
  • by storage cell level, flash memory can include single-level cell (SLC), multi-level cell (MLC), triple-level cell (TLC), quad-level cell (QLC), etc.
  • by storage specification, flash memory can include universal flash storage (UFS), embedded multimedia card (eMMC), etc.
  • the random access memory can be directly read and written by the processor 110, and can be used to store executable programs (such as machine instructions) of an operating system or other running programs, and can also be used to store data of users and application programs.
  • the non-volatile memory can also store executable programs and data of users and application programs, etc., and can be loaded into the random access memory in advance for the processor 110 to directly read and write.
  • the external memory interface 120 can be used to connect an external non-volatile memory, so as to expand the storage capacity of the electronic device 100 .
  • the external non-volatile memory communicates with the processor 110 through the external memory interface 120 to implement a data storage function. For example, files such as music and video are stored in an external non-volatile memory.
  • the electronic device 100 can implement audio functions through the audio module 170 , the speaker 170A, the receiver 170B, the microphone 170C, the earphone interface 170D, and the application processor. Such as music playback, recording, etc.
  • the audio module 170 is used to convert digital audio information into analog audio signal output, and is also used to convert analog audio input into digital audio signal.
  • the audio module 170 may also be used to encode and decode audio signals.
  • the audio module 170 may be set in the processor 110 , or some functional modules of the audio module 170 may be set in the processor 110 .
  • the speaker 170A, also referred to as a "loudspeaker", is used to convert audio electrical signals into sound signals.
  • the receiver 170B, also called an "earpiece", is used to convert audio electrical signals into sound signals.
  • the microphone 170C, also called a "mic" or "mike", is used to convert sound signals into electrical signals.
  • the earphone interface 170D is used for connecting wired earphones.
  • the pressure sensor 180A is used to sense the pressure signal and convert the pressure signal into an electrical signal.
  • pressure sensor 180A may be disposed on display screen 194 .
  • the gyro sensor 180B can be used to determine the motion posture of the electronic device 100 .
  • the angular velocity of the electronic device 100 around three axes may be determined by the gyro sensor 180B.
  • the gyro sensor 180B can be used for image stabilization. Exemplarily, when the shutter is pressed, the gyro sensor 180B detects the shaking angle of the electronic device 100, calculates the distance that the lens module needs to compensate according to the angle, and allows the lens to counteract the shaking of the electronic device 100 through reverse movement to achieve anti-shake.
  • the gyro sensor 180B can also be used for navigation and somatosensory game scenes.
  • the coordinate system of the gyro sensor is a geographic coordinate system.
  • the origin O of the geographic coordinate system is located at the point where the carrier (i.e., the device containing the gyro sensor, such as the electronic device 100) is located; the X axis points east (E) along the local line of latitude, the Y axis points north (N) along the local meridian, and the Z axis points upward along the local geographic vertical, forming a right-handed Cartesian coordinate system with the X and Y axes.
  • the plane formed by the X axis and the Y axis is the local horizontal plane, and the plane formed by the Y axis and the Z axis is the local meridian plane.
  • therefore, the coordinate system of the gyro sensor can be understood as: with the gyro sensor as the origin O, the X axis points east along the local line of latitude, the Y axis points north along the local meridian, and the Z axis points upward along the local geographic vertical (i.e., the direction of the geographic vertical).
  • the display screen 194 of the electronic device 100 can be folded to form multiple display screens.
  • a gyro sensor 180B may be provided in each screen for measuring the orientation of the display screen (ie, a direction vector perpendicular to the display screen and pointing from the inside of the electronic device 100 to the outside).
  • the electronic device 100 may determine the angle between adjacent screens according to the orientation change of each display screen measured by the gyro sensor 180B.
  • the display screen 194 of the electronic device 100 can be folded to form adjacent screens A and B.
  • the screen A is provided with a gyro sensor A, and the electronic device 100 can measure the orientation of the screen A through the gyro sensor A;
  • the screen B is provided with a gyro sensor B, and the electronic device 100 can measure the orientation of the screen B through the gyro sensor B.
  • the electronic device 100 may determine the angle α between the A screen and the B screen according to the measured orientation changes of the A screen and the B screen. The principle of obtaining the included angle α will be described in detail below.
  • FIG. 6B shows a schematic diagram of the angle α between the A screen and the B screen.
  • the electronic device 100 uses the gyro sensor A to measure the orientation of the A screen as a vector z1, and uses the gyro sensor B to measure the orientation of the B screen as a vector z2.
  • the vector z1 is perpendicular to the A screen, and the vector z2 is perpendicular to the B screen.
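The geometric principle behind this measurement can be sketched as follows. This is an illustrative assumption (the patent's formula for the included angle is not reproduced in this excerpt): since z1 and z2 are normals to the two screens, the angle α between the screens is 180° minus the angle between the normals.

```python
import math

def screen_angle(z1, z2):
    """Illustrative sketch: angle between the A screen and the B screen,
    computed from their orientation (normal) vectors z1 and z2.
    When the device is unfolded flat, z1 and z2 point the same way, so the
    angle between the normals is 0 and the screen angle is 180 degrees."""
    dot = sum(a * b for a, b in zip(z1, z2))
    norm1 = math.sqrt(sum(a * a for a in z1))
    norm2 = math.sqrt(sum(b * b for b in z2))
    theta = math.degrees(math.acos(dot / (norm1 * norm2)))  # angle between normals
    return 180.0 - theta

print(screen_angle((0, 0, 1), (0, 0, 1)))  # unfolded flat -> 180.0
print(screen_angle((0, 0, 1), (1, 0, 0)))  # folded at a right angle -> ~90
```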
  • the electronic device 100 uses one or more acceleration sensors to measure the angle between adjacent screens of the folding screen, for example, the angle α between the A screen and the B screen.
  • an acceleration sensor can be set in each display screen of the folding screen.
  • the electronic device 100 (such as the processor 110) can use the acceleration sensor to measure the motion acceleration of each display screen when it is rotated, and then calculate, according to the measured motion acceleration, the angle through which one display screen has rotated relative to another display screen, for example, the angle α between the A screen and the B screen.
  • the above-mentioned gyro sensor may be a virtual gyro sensor formed by the cooperation of other sensors, and the virtual gyro sensor can be used to calculate the angle between adjacent screens of the folding screen, such as the angle α between the A screen and the B screen.
  • an angle sensor may be installed on the folding part of the electronic device 100 (for example, on the rotating shaft), and the electronic device 100 can use the angle sensor to measure the angle between adjacent screens of the folding screen, such as the angle α between the A screen and the B screen.
  • the electronic device 100 may also use the gyro sensor A to measure the angle β1 between the A screen and the horizontal plane, and use the gyro sensor B to measure the angle β2 between the B screen and the horizontal plane.
  • FIG. 7A shows the coordinate system of the gyro sensor A of the screen A of the electronic device 100 .
  • the XOY plane formed by the X axis and the Y axis is the local horizontal plane
  • the plane formed by the Y axis and the Z axis is the local meridian plane.
  • the orientation of the A screen of the electronic device 100 in the coordinate system of the gyro sensor A is a vector z1.
  • FIG. 7B shows the coordinate system of the gyro sensor B of the upper screen B of the electronic device 100 .
  • the XOY plane formed by the X axis and the Y axis is the local horizontal plane
  • the plane formed by the Y axis and the Z axis is the local meridian plane.
  • the orientation of the B screen of the electronic device 100 in the coordinate system of the gyro sensor B is a vector z2.
  • the electronic device 100 uses the gyro sensor A to measure the vector z1 corresponding to the orientation of the A screen, and then the angle β1 between the A screen and the horizontal plane can be determined by using the above formula (2). Similarly, the electronic device 100 uses the gyro sensor B to measure the vector z2 corresponding to the orientation of the B screen, and then the angle β2 between the B screen and the horizontal plane can be determined by using the above formula (3).
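Formulas (2) and (3) are not reproduced in this excerpt, so the following is only an assumed sketch of the underlying geometry: a screen's angle with the horizontal plane equals the angle between its orientation (normal) vector and the local vertical (the Z axis of the geographic coordinate system), folded into [0°, 90°].

```python
import math

def angle_with_horizontal(z):
    """Assumed sketch: angle between a screen and the horizontal plane,
    from the screen's orientation vector z in the geographic coordinate
    system (Z axis = local vertical).  A horizontal screen has a vertical
    normal (angle 0); an upright screen has a horizontal normal (angle 90)."""
    norm = math.sqrt(sum(c * c for c in z))
    phi = math.degrees(math.acos(z[2] / norm))  # normal vs. the vertical
    return min(phi, 180.0 - phi)                # fold into [0, 90]

print(angle_with_horizontal((0, 0, 1)))  # screen lying flat -> 0.0
print(angle_with_horizontal((1, 0, 0)))  # screen standing upright -> ~90
```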
  • the air pressure sensor 180C is used to measure air pressure.
  • the electronic device 100 calculates the altitude based on the air pressure value measured by the air pressure sensor 180C to assist positioning and navigation.
  • the acceleration sensor 180E can detect the acceleration of the electronic device 100 in various directions (generally three axes). When the electronic device 100 is stationary, the magnitude and direction of gravity can be detected. It can also be used to recognize the posture of the electronic device 100, and be applied to applications such as horizontal and vertical screen switching, pedometer, etc.
  • the distance sensor 180F is used to measure the distance.
  • the electronic device 100 may measure the distance by infrared or laser. In some embodiments, when shooting a scene, the electronic device 100 may use the distance sensor 180F for distance measurement to achieve fast focusing.
  • Proximity light sensor 180G may include, for example, light emitting diodes (LEDs) and light detectors, such as photodiodes.
  • the light emitting diodes may be infrared light emitting diodes.
  • the electronic device 100 emits infrared light through the light emitting diode.
  • the ambient light sensor 180L is used for sensing ambient light brightness.
  • the electronic device 100 can adaptively adjust the brightness of the display screen 194 according to the perceived ambient light brightness.
  • the fingerprint sensor 180H is used to collect fingerprints.
  • the electronic device 100 can use the collected fingerprint characteristics to implement fingerprint unlocking, access to application locks, take pictures with fingerprints, answer incoming calls with fingerprints, and the like.
  • the temperature sensor 180J is used to detect temperature.
  • the electronic device 100 uses the temperature detected by the temperature sensor 180J to implement a temperature treatment strategy.
  • the touch sensor 180K is also called “touch device”.
  • the touch sensor 180K can be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen, also called a “touch screen”.
  • the touch sensor 180K is used to detect a touch operation on or near it.
  • the touch sensor can pass the detected touch operation to the application processor to determine the type of touch event.
  • Visual output related to the touch operation can be provided through the display screen 194 .
  • the touch sensor 180K may also be disposed on the surface of the electronic device 100 , which is different from the position of the display screen 194 .
  • the bone conduction sensor 180M can acquire vibration signals.
  • the keys 190 include a power key, a volume key and the like.
  • the key 190 may be a mechanical key. It can also be a touch button.
  • the motor 191 can generate a vibrating reminder.
  • the indicator 192 can be an indicator light, and can be used to indicate charging status, power change, messages, notifications and the like.
  • the SIM card interface 195 is used for connecting a SIM card.
  • the display direction of the C-screen of the electronic device 100 involved in the embodiment of the present application will be introduced below.
  • the electronic device 100 may determine the display direction of the user interface 1 corresponding to the C-screen according to the detected physical posture of the C-screen.
  • the four sides of the C-screen include a first side, a second side, a third side and a fourth side; the first side and the second side of the C-screen are parallel to the folding edge of the electronic device 100.
  • the display direction of the user interface 1 corresponding to the C screen includes one or more of display direction 1, display direction 2, display direction 3, and display direction 4. Wherein:
  • Display direction 1 means: the top side and the bottom side of the user interface 1 are parallel to the first side, and the top side of the user interface 1 is closer to the first side than the bottom side of the user interface 1 .
  • Display direction 2 means: the top side and the bottom side of the user interface 1 are parallel to the first side, and the bottom side of the user interface 1 is closer to the first side than the top side of the user interface 1 .
  • Display direction 3 means: the two sides (left side and right side) of user interface 1 are parallel to the first side, and compared with the left side of user interface 1, the right side of user interface 1 is closer to the first side.
  • Display direction 4 means: the two sides (left side and right side) of user interface 1 are parallel to the first side, and compared with the right side of user interface 1, the left side of user interface 1 is closer to the first side.
  • the physical posture of the electronic device 100 may include: a first physical posture, a second physical posture, a third physical posture, and a fourth physical posture.
  • when detecting that the electronic device 100 is in the first physical posture, the electronic device 100 controls the display direction of the user interface 1 corresponding to the C screen to be display direction 1; when detecting that the electronic device 100 is in the second physical posture, it controls the display direction of the user interface 1 corresponding to the C screen to be display direction 2; when detecting that the electronic device 100 is in the third physical posture, it controls the display direction of the user interface 1 corresponding to the C screen to be display direction 3; when detecting that the electronic device 100 is in the fourth physical posture, it controls the display direction of the user interface 1 corresponding to the C screen to be display direction 4.
  • the default display direction of the C-screen of the electronic device 100 is display direction 1 .
  • the electronic device 100 displays the user interface 1 corresponding to the C screen according to the default display direction of the C screen.
  • the electronic device 100 rotates the user interface corresponding to the C screen by 180° before displaying it.
  • the default display orientation of the C-screen of the electronic device 100 is display orientation 2 .
  • the electronic device 100 displays the user interface of the C screen according to the default display direction of the C screen; when the electronic device 100 is in the first physical posture, the electronic device 100 displays the user interface of the C screen Display after rotating 180°.
  • the default display orientation of the C-screen of the electronic device 100 is display orientation 3 .
  • the electronic device 100 displays the user interface 1 corresponding to the C screen according to the default display direction of the C screen.
  • the electronic device 100 rotates the user interface corresponding to the C screen by 90° before displaying it.
  • the electronic device 100 rotates the user interface corresponding to the C screen by 180° before displaying it.
  • the electronic device 100 rotates the user interface corresponding to the C screen by 270° (or -90°) before displaying it.
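The rotation cases above can be generalized by a small lookup, sketched below under the assumption that display directions 1, 2, 3, 4 correspond to UI rotations of 0°, 180°, 90°, and 270° respectively; this mapping is illustrative, since the excerpt only fixes individual cases (e.g. rotating 180° when the default direction is 1 but direction 2 is required).

```python
# Assumed rotation offsets for display directions 1..4 (illustrative).
ROTATION_OF_DIRECTION = {1: 0, 2: 180, 3: 90, 4: 270}

def ui_rotation(default_direction, target_direction):
    """How many degrees to rotate the user interface corresponding to the
    C screen before displaying it, given the screen's default display
    direction and the display direction required by the current posture."""
    return (ROTATION_OF_DIRECTION[target_direction]
            - ROTATION_OF_DIRECTION[default_direction]) % 360

print(ui_rotation(1, 1))  # default already matches -> 0
print(ui_rotation(1, 2))  # default direction 1, need direction 2 -> 180
print(ui_rotation(2, 1))  # default direction 2, need direction 1 -> 180
```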
  • the display direction of the user interface corresponding to the C screen can be adaptively adjusted according to the physical posture of the electronic device 100, so as to facilitate the user's viewing and effectively improve the user experience.
  • the display direction of the user interface corresponding to the A screen, the B screen, and the inner screen may also be adaptively adjusted according to the physical posture of the electronic device 100 .
  • the default display orientations of screen A, screen B, and the inner screen are usually: the top edge of the user interface is parallel to the top edge of the electronic device 100, and compared with the bottom edge of the user interface, the top edge of the user interface is closer to the top edge of the electronic device 100.
  • the C screen, the A screen and the gyro sensor A are arranged on the same side of the folding edge, and the gyro sensor A can detect the physical posture of the A screen, and can also be used to detect the physical posture of the C screen.
  • screen A and screen C may or may not share one coordinate system of the electronic device 100. The following description takes the case where screen A and screen C share one coordinate system of the electronic device 100 as an example.
  • FIG. 8A and FIG. 8B show the three-axis coordinate system of screen A (or screen C) of the electronic device 100 .
  • the x1 axis of the three-axis coordinate system of screen A is perpendicular to the left side of screen A, and points from the left side of screen A to the right side of screen A;
  • the y1 axis of the three-axis coordinate system of screen A is perpendicular to the bottom edge of screen A, and points from the bottom edge of screen A to the top edge of screen A;
  • the z1 axis of the three-axis coordinate system of screen A is used to indicate the orientation of the above-mentioned screen A, and the z1 axis is perpendicular to screen A.
  • the gyro sensor A can detect the vector corresponding to the y1 axis
  • the electronic device 100 determines that the C-screen is in the first physical posture.
  • the electronic device 100 determines that the C-screen is in the third physical posture.
  • the electronic device 100 determines that the C-screen is in the second physical posture.
  • preset range 11 is [-45°, 45°)
  • preset range 12 is [45°, 135°)
  • preset range 13 is [135°, 225°)
  • preset range 14 is [-135°, -45°).
  • the gyro sensor A can detect the vector corresponding to the y1 axis
  • the electronic device 100 determines that the C-screen is in the third physical posture.
  • the electronic device 100 determines that the C-screen is in the second physical posture.
  • the electronic device 100 determines that the C-screen is in the fourth physical posture.
  • the electronic device 100 determines that the C-screen is in the first physical posture.
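The angle ranges above can be implemented as a simple classifier. The sketch below is illustrative: preset ranges 11 to 14 are taken from the text, but the assignment of each range to a particular physical posture is an assumption, since the excerpt does not pair them explicitly.

```python
def classify_posture(rotation_deg):
    """Illustrative: map a measured rotation angle of the C screen to a
    physical posture using preset ranges 11..14 from the text.  The
    posture assigned to each range here is an assumption."""
    a = rotation_deg % 360.0          # normalize to [0, 360)
    if a >= 225.0:                    # fold [225, 360) down to [-135, -45)
        a -= 360.0
    if -45.0 <= a < 45.0:             # preset range 11: [-45, 45)
        return "first physical posture"
    if 45.0 <= a < 135.0:             # preset range 12: [45, 135)
        return "third physical posture"
    if 135.0 <= a < 225.0:            # preset range 13: [135, 225)
        return "second physical posture"
    return "fourth physical posture"  # preset range 14: [-135, -45)

print(classify_posture(0))    # first physical posture
print(classify_posture(180))  # second physical posture
```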
  • the embodiment of the present application may also determine which physical posture the C-screen is in through other implementation manners, which is not specifically limited here.
  • the electronic device 100 uses the gyro sensor and the acceleration sensor to detect the pitch angle of the C screen (i.e., the angle at which the C screen rotates around the x1 axis), the roll angle (i.e., the angle at which the C screen rotates around the y1 axis), and the yaw angle (i.e., the angle at which the C screen rotates around the z1 axis), and can then determine the physical posture of the C screen according to the pitch angle, roll angle, and yaw angle of the C screen.
  • the default display orientation of the electronic device 100 in this form is usually display orientation 3 .
  • the default display orientation of the electronic device 100 in this form is generally display orientation 2 .
  • the display method provided in the subsequent embodiments is described by taking the horizontally folded electronic device 100 whose default display direction of the C screen is display direction 2 as an example.
  • it can be understood that the display method provided in the embodiment of the present application is also applicable to the horizontally folded electronic device 100 and the vertically folded electronic device 100 with other default display directions.
  • FIG. 9A shows a schematic diagram of a first support state of the electronic device 100 .
  • FIG. 9C shows schematic diagrams of two second support states of the electronic device 100.
  • when the electronic device 100 detects that the angle β2 between the B screen and the horizontal plane is within the preset range 15, the angle θ5 between the vector z2 corresponding to the B screen and the Z axis of the geographic coordinate system is greater than 90 degrees, and the angle α between the A screen and the B screen is within the preset range 18 (i.e., [f6, f7]), it determines that the electronic device 100 is in the third support state.
  • FIG. 9D shows a schematic diagram of a third support state of the electronic device 100 .
  • when the electronic device 100 detects that the angle β2 between the B screen and the horizontal plane is within the preset range 15, the angle θ5 between the vector z2 corresponding to the B screen and the Z axis of the geographic coordinate system is greater than 90 degrees, and the angle α between the A screen and the B screen is within the preset range 19 (i.e., [f8, f9]), it determines that the electronic device 100 is in the fourth support state.
  • FIG. 9B shows a schematic diagram of a fourth support state of the electronic device 100.
  • the screen B is placed horizontally (or nearly horizontally) downwards, and the electronic device 100 is in a reverse half-folded state.
  • the C screen is in the first physical posture, and the display direction of the user interface corresponding to the C screen is display direction 1.
  • FIG. 9E shows a schematic diagram of a fifth support state of the electronic device 100 . It can be understood that, in the fifth support state, the plane formed by the top edge and the bottom edge of the electronic device 100 is a horizontal plane or close to a horizontal plane, and the electronic device 100 is in a forward half-folded state.
  • FIG. 9F shows a schematic diagram of a sixth stand state of the electronic device 100 . It can be understood that, in the sixth support state, the plane formed by the top edge and the bottom edge of the electronic device 100 is a horizontal plane or close to a horizontal plane, and the electronic device 100 is in a reverse half-folded state.
  • the C screen is in the second physical posture, and the display direction of the user interface corresponding to the C screen is display direction 2 .
• Placing the electronic device 100 in the first stand state allows the user to view the display content of the C screen without using both hands; placing it in the second stand state allows the user to view the display content of screen A and screen B without using both hands; placing it in the third stand state allows the user to view the display content of the C screen without using both hands.
• Placing the electronic device 100 in the fourth stand state allows the user to view the display content of the A screen without using both hands.
• Placing the electronic device 100 in the fifth stand state allows the user to view the display content of the C screen without using both hands.
• Placing the electronic device 100 in the sixth stand state allows the user to view the display content of the A screen or the B screen without using both hands.
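Taken together, the stand states above amount to a classification over the hinge angle α and the B-screen orientation angles β2 and β5. A minimal sketch of such a classifier is given below; the function name and all numeric thresholds are illustrative assumptions standing in for the patent's preset ranges (preset range 15, [f6, f7], [f8, f9]), which are not given concrete values in the text.

```python
# Hypothetical sketch of stand-state classification based on the hinge angle
# (alpha) and the B-screen orientation angles (beta2, beta5) described above.
# All numeric thresholds are placeholders, not values from the source.

def classify_stand_state(alpha, beta2, beta5,
                         beta2_range=(0.0, 10.0),      # preset range 15 (assumed)
                         range_18=(200.0, 280.0),      # [f6, f7] (assumed)
                         range_19=(290.0, 350.0)):     # [f8, f9] (assumed)
    """Return the stand-state number, or None if no state matches."""
    b_near_horizontal = beta2_range[0] <= beta2 <= beta2_range[1]
    b_facing_down = beta5 > 90.0   # B-screen vector points below the horizon
    if b_near_horizontal and b_facing_down:
        if range_18[0] <= alpha <= range_18[1]:
            return 3               # third stand state
        if range_19[0] <= alpha <= range_19[1]:
            return 4               # fourth stand state
    return None
```

A real implementation would derive β2 and β5 from the device's orientation sensors and α from the hinge-angle sensor, and would cover the remaining stand states in the same way.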
• When the electronic device 100 detects that it is in a preset posture and meets the preset conditions, it starts the camera 1 corresponding to the display screen 1 associated with that preset posture to collect images, and displays the collected images on the display screen 1.
• The preset posture of the electronic device 100 facilitates the user's viewing of the display content of display screen 1 (that is, at least one of screen A, screen B, and screen C) corresponding to that preset posture.
  • the display screen 1 includes screen C (ie, the first screen), and the screen C corresponds to the rear camera (ie, the first camera).
  • the display screen 1 includes a screen A (ie, the second screen), and the screen A corresponds to the front camera (ie, the second camera).
• After the user places the electronic device 100 in a preset posture, the user can start the camera corresponding to the preset posture to take a selfie without using both hands, and perform a real-time preview through the display screen corresponding to the preset posture, avoiding cumbersome user operations and effectively improving the user experience.
  • the display method provided by the embodiment of the present application will be introduced by taking the display screen 1 as the C screen (ie, the first screen) as an example.
• When it is detected that the electronic device 100 is in the first preset posture and satisfies the first preset condition, the electronic device 100 displays the user interface 11 on the C screen.
• The first preset posture includes: the angle α between the A screen and the B screen is within the first preset range.
  • the electronic device 100 also controls the A screen and/or the B screen to be turned off.
  • controlling screen A (or screen B) to turn off means: if screen A is off, keep it off; if screen A is on, control screen A to switch to off.
• Displaying the user interface 11 on the C screen means: if the C screen is off, turning on the C screen and displaying the user interface 11; if the C screen is displaying a user interface 12, switching the display content of the C screen to the user interface 11.
• When the electronic device 100 controls the external screen (that is, the C screen) to light up, it also controls the internal screen (that is, the A screen and the B screen) to turn off; when it controls the internal screen to light up, it also controls the external screen to turn off. In this way, energy consumption can be effectively saved.
• After the electronic device 100 controls the C screen to display the user interface 11, it automatically starts the gesture detection service of the rear low-power camera corresponding to the C screen, and uses the above low-power camera to detect the user's air gesture operations in real time. In response to a detected air gesture operation, the electronic device 100 may execute the response event corresponding to that operation. In this way, after the electronic device 100 controls the C screen to display the user interface 11, the user's hands are freed, and contactless interaction can further be realized.
• In some embodiments, the user interface 11 may be the last user interface displayed before the C screen was turned off. In some embodiments, the user interface 11 is the initial interface corresponding to the C screen. In some embodiments, the user interface 11 is the user interface most recently displayed on the inner screen (ie, the display screen constituted by screen A and screen B), so that display continuity from the inner screen to the outer screen can be realized.
• The preview display area of the user interface 11 is used to display images collected in real time by the rear camera (ie, the first camera); that is, the real-time image collected by the rear camera serves as the preview image in the user interface 11.
  • the user interface 11 includes: images collected by the electronic device 100 through one or more of the multiple rear cameras.
• The user interface 11 displayed on the C screen may be referred to as the first user interface, and the preview display area of the user interface 11 displayed on the C screen may be referred to as the first preview display area.
  • the rear low-power camera and the first camera corresponding to the C screen may be the same camera or different cameras, which are not specifically limited here.
  • the first preset condition includes that the dwell time of the electronic device 100 at the current angle value reaches the first preset time.
  • the first preset time is 3s.
• When the C screen is in the on state, the above first preset condition further includes that the electronic device 100 does not receive a user's input operation on the C screen within a second preset time.
  • the second preset time is 2s.
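The two timing conditions above (the dwell time at the current angle value and the input-free period on the C screen) can be sketched as a small checker. The class and method names below are illustrative; only the 3 s and 2 s example values come from the text.

```python
# Minimal sketch of the first preset condition: the hinge angle must dwell at
# its current value for FIRST_PRESET seconds, and (when the C screen is on)
# no touch input may arrive within SECOND_PRESET seconds.

FIRST_PRESET = 3.0   # dwell time, seconds (example value from the text)
SECOND_PRESET = 2.0  # input-free time, seconds (example value from the text)

class PresetConditionChecker:
    def __init__(self, now=0.0):
        self.angle_since = now     # when the current angle value was reached
        self.last_input = -1e9     # timestamp of the last touch on the C screen

    def on_angle_change(self, now):
        self.angle_since = now     # angle moved: restart the dwell timer

    def on_touch(self, now):
        self.last_input = now      # user touched the C screen

    def condition_met(self, now, c_screen_on):
        dwelled = (now - self.angle_since) >= FIRST_PRESET
        idle = (now - self.last_input) >= SECOND_PRESET
        return dwelled and (idle or not c_screen_on)
```

The idle check only applies while the C screen is on, matching the condition stated above.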
• The above first preset condition may further include that the electronic device 100 detects a human face (or a preset user's face) through the rear camera corresponding to the C screen. Specifically, the electronic device 100 starts the face detection service of the low-power camera corresponding to the C screen, and uses a face recognition algorithm to detect whether the image collected by the low-power camera includes a human face (or a preset user's face).
• In some embodiments, the electronic device 100 starts this face detection service when the electronic device 100 is in the first preset posture; in other embodiments, it starts the service when the electronic device 100 is in the first preset posture and meets the other conditions included in the first preset condition.
  • the above-mentioned first preset condition further includes that the electronic device 100 detects a first preset gesture through a rear camera corresponding to the C-screen.
  • the first preset gesture is used to trigger the C screen to display the user interface 11 when the electronic device 100 is in the first preset posture.
• Specifically, the electronic device 100 starts the gesture detection service of the low-power camera corresponding to the C screen, and uses a gesture recognition algorithm to detect whether the images collected by the low-power camera include preset gestures.
• In some embodiments, the electronic device 100 starts this gesture detection service when the electronic device 100 is in the first preset posture; in other embodiments, it starts the service when the electronic device 100 is in the first preset posture and meets the other conditions included in the first preset condition.
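The two alternatives above (starting the low-power camera's face- or gesture-detection service as soon as the first preset posture is entered, or only once the other preset conditions also hold) can be sketched as follows. The class is a stand-in; a real service would drive the camera pipeline and recognition algorithms.

```python
# Sketch of gating the low-power camera's detection services on the posture.
# Service state is modelled as simple flags; names are illustrative.

class LowPowerDetection:
    def __init__(self, require_other_conditions=False):
        # Whether the service waits for the remaining preset conditions too.
        self.require_other = require_other_conditions
        self.face_service_on = False
        self.gesture_service_on = False

    def update(self, in_preset_posture, other_conditions_met):
        enable = in_preset_posture and (
            other_conditions_met or not self.require_other)
        self.face_service_on = enable
        self.gesture_service_on = enable
```

The first variant corresponds to `require_other_conditions=False`, the second to `True`.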
• The first preset range includes at least one of: preset range 16 (ie [f2, f3]), preset range 18 (ie [f6, f7]), and preset range 21 (ie [f11, f12]).
• In some embodiments, the first preset posture specifically includes: the angle α between the A screen and the B screen decreases to (and/or increases to) α1, where α1 is within the first preset range.
  • the first preset range does not include 0° and 180°.
  • the first preset posture also includes that the electronic device 100 is in the first stand state.
• For example, the first preset range is [d1, d2], and [d1, d2] is within the preset range 16 (ie [f2, f3]), that is, f2 ≤ d1 ≤ d2 ≤ f3. When the angle α between screen A and screen B is within [d1, d2], the electronic device 100 controls the internal screen to turn off and displays the user interface 11 rotated by 180° on the C screen; the user interface 11 includes images captured by the rear camera. Optionally, the angles within the first preset range are greater than 0° and less than 120°.
• In some embodiments, the first preset posture further includes that the electronic device 100 is in the third stand state. For example, the first preset range is [d3, d4], and [d3, d4] is within the preset range 18 (ie [f6, f7]), that is, f6 ≤ d3 ≤ d4 ≤ f7. When the angle α between screen A and screen B is within [d3, d4], the electronic device 100 controls the internal screen to turn off and displays the user interface 11 rotated by 180° on the C screen; the user interface 11 includes images captured by the rear camera. Optionally, the angles within the first preset range are greater than 180° and less than 300°.
• In some embodiments, the first preset posture further includes that the electronic device 100 is in the fifth stand state. For example, the first preset range is [d5, d6], and [d5, d6] is within the preset range 21 (ie [f11, f12]), that is, f11 ≤ d5 ≤ d6 ≤ f12. When the angle α between screen A and screen B is within [d5, d6], the electronic device 100 controls the internal screen to turn off and displays the user interface 11 in the default display direction on the C screen; the user interface 11 includes images captured by the rear camera. Optionally, the angles within the first preset range are greater than 0° and less than 180°.
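The three cases above share one pattern: a stand state plus an angle range selects whether user interface 11 is drawn on the C screen rotated by 180° or in the default direction. A sketch follows; the placeholder values for [d1, d2], [d3, d4] and [d5, d6] are assumptions chosen inside the optional ranges stated above.

```python
# Illustrative mapping from the first preset posture to the C-screen display
# action: stand states 1 and 3 rotate user interface 11 by 180 degrees,
# stand state 5 uses the default display direction. All ranges are assumed.

RANGES = {
    1: ((30.0, 100.0), 180),   # first stand state: [d1, d2], rotate 180°
    3: ((210.0, 290.0), 180),  # third stand state: [d3, d4], rotate 180°
    5: ((60.0, 150.0), 0),     # fifth stand state: [d5, d6], default direction
}

def c_screen_rotation(stand_state, alpha):
    """Return the rotation (degrees) to apply to user interface 11 on the
    C screen, or None if the posture does not trigger the display."""
    entry = RANGES.get(stand_state)
    if entry is None:
        return None
    (lo, hi), rotation = entry
    return rotation if lo <= alpha <= hi else None
```

Note that the stand state, not the angle alone, disambiguates overlapping angle ranges (for example, the first and fifth stand states both admit angles below 120°).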
• After the user interface 11 is displayed on the C screen of the electronic device 100 that meets the first preset condition in the first stand state, when the electronic device 100 detects that the angle α between screen A and screen B exceeds [d1, d2], or detects that the electronic device 100 leaves the first stand state, or detects that the electronic device 100 is folded into the folded state or unfolded into the unfolded state, the electronic device 100 stops displaying the user interface 11 on the C screen.
• For example, if the user continues to fold the electronic device 100 in the first stand state so that the angle α becomes less than 5°, the electronic device 100 stops displaying the user interface 11 on the C screen.
• For example, if the user continues to unfold the electronic device 100 in the first stand state so that the angle α becomes greater than 90°, the electronic device 100 stops displaying the user interface 11 on the C screen.
• Similarly, after the user interface 11 is displayed on the C screen of the electronic device 100 that meets the first preset condition in the third stand state, when the electronic device 100 detects that the angle α between screen A and screen B exceeds [d3, d4], or detects that the electronic device 100 leaves the third stand state, or detects that the electronic device 100 is folded into the folded state or unfolded into the unfolded state, the electronic device 100 stops displaying the user interface 11 on the C screen.
• Similarly, after the user interface 11 is displayed on the C screen of the electronic device 100 that meets the first preset condition in the fifth stand state, when the electronic device 100 detects that the angle α between screen A and screen B exceeds [d5, d6], or detects that the electronic device 100 leaves the fifth stand state, or detects that the electronic device 100 is folded into the folded state or unfolded into the unfolded state, the electronic device 100 stops displaying the user interface 11 on the C screen.
• In some embodiments, the first preset condition includes detecting a human face (or a preset user's face) through the rear low-power camera, and the C screen of the electronic device 100 that meets the first preset condition is controlled to display the user interface 11; in this case, when the electronic device 100 stops displaying the user interface 11 on the C screen, it also turns off the face detection service of the low-power camera.
  • the electronic device 100 uses the low-power camera corresponding to the C-screen to detect the user's preset gesture 1, and in response to the preset gesture 1, the electronic device 100 stops displaying the user interface 11 on the C-screen.
• Stopping displaying the user interface 11 on the C screen specifically includes: the electronic device 100 controls the C screen to turn off; or the electronic device 100 controls the C screen to display another preset interface, such as the initial interface corresponding to the C screen, or the user interface most recently displayed on the C screen before the user interface 11 was displayed.
• The preview display area of the user interface 11 includes images captured by the rear camera in real time. The user interface 11 will be described in detail below.
• Specifically, the application program 1 is started, the rear camera corresponding to the C screen is called through the application program 1 to collect images, and the collected images are displayed on the C screen in the user interface 11 corresponding to the application program 1.
  • the preview display area of the user interface 11 is used to display the image captured by the rear camera corresponding to the C screen.
  • the user interface 11 shown in FIG. 11A is a user interface of a mirror application.
• The C screen shown in FIG. 11A displays the image captured by the above rear camera in full screen, and the preview display area of the user interface 11 occupies the entire C screen; alternatively, the preview display area may occupy only part of the C screen, which is not specifically limited here.
• In some embodiments, the rear camera corresponding to the above C screen includes an ultraviolet (Ultraviolet, UV) camera, and the user interface 11 includes images collected by the UV camera. The UV camera uses ultraviolet light as a light source for shooting, and the image collected by the UV camera can highlight the areas where sunscreen has been applied. In this way, through the user interface 11 displayed on the C screen, the user can check the application of sunscreen in real time.
  • the user interface 11 may further include a makeup test control 201 .
• The makeup test control 201 may receive a user's input operation (such as a touch operation); in response to the above input operation, the electronic device 100 displays at least one makeup test option 202 shown in FIG. 11D, such as a lipstick option 202A, a blush option 202B, a contouring option, an eyebrow option, an eyelash option, an eye shadow option, an eyeliner option, and the like.
• A makeup trial option 1 among the at least one makeup trial option 202 may receive a user's input operation (such as a touch operation); in response to the above input operation, the electronic device 100 displays at least one makeup trial style corresponding to the makeup trial option 1. After the user selects a makeup trial style 1 among the above at least one makeup trial style, the electronic device 100 can add the makeup trial effect corresponding to that style to the face captured by the camera and display it in the user interface 11. In this way, through the user interface 11 displayed on the C screen, the user can preview the above makeup trial effect in real time.
  • the application program 1 corresponding to the user interface 11 may be a makeup application.
  • the lipstick option 202A corresponds to a lipstick color number.
• In response to the above input operation, the electronic device 100 displays at least one lipstick color number corresponding to the lipstick option 202A shown in FIG. 11E, such as a color number 203A and a color number 203B.
• The color number 203A may receive a user's input operation (such as a touch operation); in response to the above input operation, the electronic device 100 changes the lip color of the face captured by the camera in real time and displayed on the user interface 11 to color number 203A, as shown in FIG. 11F.
  • the user can also select a corresponding makeup trial style through other makeup trial options, and preview various makeup trial effects in real time.
• The makeup trial style corresponding to the blush option 202B may indicate the blush color number, blush position and/or blush shape.
  • the user interface 11 may further include a capture control 204 .
  • the application program 1 corresponding to the user interface 11 may be a camera application.
  • the electronic device 100 may store the image currently displayed on the C screen with the makeup trial effect added.
  • the user interface 11 may include a camera switching control 205, and the camera switching control 205 may switch the camera used to capture images displayed on the C screen to other cameras.
  • the electronic device 100 may include multiple cameras.
• The camera switching control 205 may receive a user's input operation; in response to the above operation, the electronic device 100 directly switches the current camera to another preset camera, or displays at least one camera option shown in FIG. 12C, for example, a front camera 205A, a rear telephoto camera 205B, and a rear UV camera 205C.
  • the user may select a target camera from at least one of the above camera options.
• If the user chooses to switch the rear camera corresponding to the C screen used to capture images to the front camera, then in response to the user's input operation, the electronic device 100 also displays prompt information, which prompts the user to turn over the electronic device 100 so that the electronic device 100 can collect images through the front camera. For example, the user is prompted to fold the electronic device 100 into the unfolded state (or the second stand state, the fourth stand state, or the sixth stand state).
  • the user interface 11 may further include a fill light control 206 .
• In response to an input operation on the fill light control 206, the electronic device 100 may display at least one fill light option shown in FIG. , such as the flashlight on control 206C and the display screen fill light control 206D.
  • the electronic device 100 may determine whether to turn on the flashlight based on the brightness of the ambient light.
• In response to an input operation (such as a touch operation) on the display screen fill light control 206D, the electronic device 100 brightens the preset fill light area of the C screen.
  • the position, shape and brightness of the preset supplementary light area may be set by the user, or may be set by default by the electronic device 100 , which is not specifically limited here.
• The user interface 11 may also include at least one shooting mode 207 corresponding to the shooting control, such as a night scene mode, a portrait mode, a large aperture mode, a photo mode, a video recording mode, and a professional mode. After the user selects a shooting mode, the electronic device 100 controls the camera to collect images in this shooting mode and displays them on the user interface 11 corresponding to the C screen.
  • User interface 11 may also include an album control 208 .
  • the photo album control 208 may receive a user's input operation, and in response to the above input operation, the electronic device 100 may display a user interface of the photo album application.
  • the user interface 11 may further include a local feature display frame 210 .
• The electronic device 100 recognizes the area 1 (ie, the first area) where a preset local feature (such as a human face) is located in the image 1 captured by the camera, and displays the image in area 1 in the display frame 210. It can be understood that, in this implementation, the electronic device 100 can continuously track the preset local feature (such as a human face) within the shooting range of the camera and display the local feature in the display frame 210 in real time. Optionally, the electronic device 100 enlarges the image of the preset local feature in area 1 and then displays it in the display frame 210, so that the user can preview the details of the preset local feature.
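The display frame 210 behaviour can be sketched as cropping the detected area 1 out of each frame and upscaling it for preview. The helpers below use plain nested lists as stand-in frames; the actual face detection and camera pipeline are outside the text and are not shown.

```python
# Sketch of the local-feature display frame 210: crop the detected region
# (area 1) from a frame and enlarge it for the preview. Frames are modelled
# as 2-D lists of pixels; names and formats are illustrative.

def crop_region(frame, box):
    """frame: 2-D list of pixels; box: (top, left, height, width)."""
    top, left, h, w = box
    return [row[left:left + w] for row in frame[top:top + h]]

def enlarge(region, factor):
    """Nearest-neighbour upscale, standing in for the zoomed preview."""
    out = []
    for row in region:
        scaled = [px for px in row for _ in range(factor)]
        out.extend([scaled] * factor)
    return out
```

On each camera frame the detector would yield a new box for area 1, and the cropped, enlarged result would be drawn into the display frame 210.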
  • the display frame 210 shown in FIG. 12H can receive user input operations.
• In response to the above input operation, the electronic device 100 can enlarge the image of the preset local feature in area 1 and display it in full screen on the C screen. The C screen shown in FIG. 12J can then also receive a user's input operation (for example, a double-click on the C screen); in response to the above input operation, the electronic device 100 can shrink the image of the preset local feature back into the display frame 210, that is, display the user interface 11 shown in FIG. 12H again.
• When it is detected that the electronic device 100 is in the first preset posture and satisfies the first preset condition, the electronic device 100 displays a partially enlarged image of the image captured by the camera in the user interface 11 corresponding to the C screen.
• The electronic device 100 identifies the area 1 where the preset local feature (such as a human face) of the image 1 captured by the camera is located, and the above partially enlarged image is an enlarged image of the preset local feature in area 1 shown in FIG. 12J.
• Alternatively, the above partially enlarged image is an enlarged image of the central area of the image 1 captured by the camera.
  • the user interface 11 shown in FIG. 12J may also include other interface elements of the user interface 11 shown in FIG. 12G (such as the makeup test control 201 ).
• In some embodiments, when it is detected that the electronic device 100 is in the first preset posture and satisfies the first preset condition, the user interface 11 displayed on the C screen of the electronic device 100 only includes the image captured by the camera corresponding to the C screen (or a partially enlarged image of the above image); the electronic device 100 can then display other interface elements of the user interface 11 in response to a first input operation received from the user, such as one or more of the makeup test control 201, the shooting control 204, the camera switching control 205, the fill light control 206, the shooting mode 207, the album control 208, the setting control 209, and the preview frame 210.
  • the above-mentioned first input operation may be a touch operation acting on the C-screen (for example, a user's finger touches the display screen of the electronic device 100 ), or may be a preset air gesture, which is not specifically limited here.
  • the image displayed on the user interface 11 is a partially enlarged image of the image captured by the camera, and the user can drag the image displayed on the user interface 11 to view other areas of the original image captured by the camera.
• The electronic device 100 shown in FIG. 12J displays the image in area 1 of the image captured by the camera in real time on the user interface 11. The image shown in FIG. 12J can receive a user's sliding operation (for example, an upward sliding operation); referring to FIG. 12K, in response to the above sliding operation, the electronic device 100 moves the position of area 1 in the direction opposite to the sliding direction; referring to FIG. 12L, the electronic device 100 displays an enlarged image of the image in the moved area 1 on the user interface 11. Visually, the image displayed on the user interface 11 moves along the user's sliding direction.
  • the above-mentioned sliding operation may be a touch operation, or an air gesture, which is not specifically limited here.
  • the user can zoom in and zoom out on the image displayed by the user interface 11 .
• The electronic device 100 shown in FIG. 12J displays the image in area 1 of the image captured by the camera in real time on the user interface 11; the image shown in FIG. 12J can receive a user's zoom operation (such as a zoom-out operation); referring to FIG. 12M, in response to the above zoom operation, the electronic device 100 enlarges the size of area 1.
• That is, the electronic device 100 can enlarge the size of area 1 based on the user's zoom-out operation; visually, in response to the user's zoom-out operation, the image displayed on the user interface 11 is reduced.
• Similarly, the electronic device 100 can reduce the size of area 1 based on the user's zoom-in operation; visually, in response to the user's zoom-in operation, the image displayed on the user interface 11 is enlarged.
  • the above-mentioned zooming operation may be a touch operation, or an air gesture, which is not specifically limited here.
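The sliding and zooming behaviour above amounts to moving and resizing area 1 within the captured frame: a slide moves area 1 opposite to the swipe direction, a zoom-out operation enlarges area 1 (the image appears reduced), and a zoom-in operation shrinks it (the image appears magnified). A sketch with illustrative coordinates, representing area 1 as (x, y, w, h) clamped to the frame:

```python
# Sketch of the pan/zoom handling of area 1. All coordinates and the
# clamping scheme are illustrative assumptions.

def pan(area, dx, dy, frame_w, frame_h):
    """Move area 1 opposite to the swipe (dx, dy) and clamp to the frame."""
    x, y, w, h = area
    x = min(max(x - dx, 0), frame_w - w)
    y = min(max(y - dy, 0), frame_h - h)
    return (x, y, w, h)

def zoom(area, factor, frame_w, frame_h):
    """factor > 1 models a zoom-out operation (area 1 grows, image shrinks);
    factor < 1 models a zoom-in operation (area 1 shrinks, image magnifies).
    The centre of area 1 is kept fixed where possible."""
    x, y, w, h = area
    cx, cy = x + w / 2, y + h / 2
    w = min(max(int(w * factor), 1), frame_w)
    h = min(max(int(h * factor), 1), frame_h)
    x = min(max(int(cx - w / 2), 0), frame_w - w)
    y = min(max(int(cy - h / 2), 0), frame_h - h)
    return (x, y, w, h)
```

The user interface 11 would then display the (enlarged) contents of the returned area on each frame.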
• When it is detected that the electronic device 100 is in the first preset posture and satisfies the first preset condition, the electronic device 100 displays images captured by the macro camera in the user interface 11 corresponding to the C screen.
  • the display method provided by the embodiment of the present application will be introduced by taking the display screen 1 as screen A (ie, the second screen) as an example.
• When it is detected that the electronic device 100 is in the second preset posture and satisfies the second preset condition, the electronic device 100 displays the user interface 11 on screen A.
• The second preset posture includes: the angle α between the A screen and the B screen is within the second preset range.
  • the electronic device 100 also controls the C screen and/or the B screen to be turned off.
• After the electronic device 100 controls screen A to display the user interface 11, it can use the gesture detection service of the front low-power camera corresponding to screen A to detect the user's air gesture operations in real time. In response to a detected air gesture operation, the electronic device 100 may execute the response event corresponding to that operation. In this way, after the electronic device 100 controls screen A to display the user interface 11, the user's hands are freed, and touch-free air interaction can further be realized.
• The user interface 11 may be the last user interface displayed before screen A (or the inner screen) was turned off. Specifically, if the A screen and the B screen were displayed independently in split screens before the A screen was turned off, the user interface 11 is the user interface most recently displayed on the A screen before it was turned off; if the inner screen was displayed in full screen, the user interface 11 is the user interface most recently displayed in full screen before the inner screen was turned off.
  • the user interface 11 is the main interface corresponding to the inner screen.
• The preview display area of the user interface 11 is used to display images collected in real time by the front camera (ie, the second camera). In some embodiments, the user interface 11 includes images collected by the electronic device 100 through one or more of the multiple front cameras.
• The user interface 11 displayed on screen A may be called a second user interface, and the preview display area of the user interface 11 displayed on screen A may be called a second preview display area.
  • the low-power camera and the second camera corresponding to the above-mentioned screen A may be the same camera or different cameras, which are not specifically limited here.
  • the second preset condition includes that the dwell time of the electronic device 100 at the current angle value reaches the first preset time.
  • the first preset time is 3s.
  • the above-mentioned second preset condition further includes that the electronic device 100 does not receive any input operation from the user on the screen A within the second preset time.
  • the second preset time is 2s.
  • the above-mentioned second preset condition further includes that the electronic device 100 detects a human face (or a preset user's human face) through a front low-power camera corresponding to screen A.
  • the above-mentioned second preset condition further includes that the electronic device 100 detects a second preset gesture through the front low-power camera corresponding to screen A.
  • the second preset gesture is used to trigger the A screen to display the user interface 11 when the electronic device 100 is in the second preset posture.
• The second preset range includes at least one of: preset range 17 (ie [f4, f5]), preset range 19 (ie [f8, f9]), and preset range 22 (ie [f13, f14]).
• In some embodiments, the second preset posture specifically includes: the angle α between the A screen and the B screen decreases to (and/or increases to) α2, where α2 is within the second preset range.
  • the second preset posture also includes that the electronic device 100 is in the second stand state.
• For example, the second preset range is [d7, d8], and [d7, d8] is within the preset range 17 (ie [f4, f5]), that is, f4 ≤ d7 ≤ d8 ≤ f5. When the angle α between screen A and screen B is within [d7, d8], the electronic device 100 controls screen C to be off and screen B to be blank or off, and screen A displays the user interface 11 in the default display direction of screen A; the user interface 11 includes images collected by the front camera. Optionally, the angles within the second preset range are greater than 60° and less than 180°.
• In some embodiments, the second preset posture further includes that the electronic device 100 is in the fourth stand state. For example, the second preset range is [d9, d10], and [d9, d10] is within the preset range 19 (ie [f8, f9]), that is, f8 ≤ d9 ≤ d10 ≤ f9. When the angle α between screen A and screen B is within [d9, d10], the electronic device 100 controls screen C and screen B to turn off, and screen A displays the user interface 11 in the default display direction of screen A; the user interface 11 includes images collected by the front camera. Optionally, the angles within the second preset range are greater than 240° and less than 360°.
  • the second preset attitude further includes that the electronic device 100 is in the sixth shelf state.
  • the second preset range is [d11, d12]
  • [d11, d12] is within the preset range 22 (ie [f13, f14]), that is, f13 ⁇ d11 ⁇ d12 ⁇ f14.
  • the angle θ between screen A and screen B is within [d11, d12]
  • the electronic device 100 controls the inner screen to be turned off, and displays the user interface 11 rotated by 180° on the A screen
  • the user interface 11 includes images captured by the front camera.
  • the angle within the second preset range is greater than 180° and less than 360°.
  • the second preset range does not include 0° and 180°.
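  The nested-range relationships above (for example, a second preset range [d7, d8] that must lie strictly inside preset range 17, ie [f4, f5]) can be sketched as simple interval checks. The concrete bounds below are illustrative placeholder values, not values defined by this embodiment:

```python
# Sketch of the nested-range check described above. The concrete bounds
# (F4, F5, D7, D8) are placeholder assumptions, not embodiment values.

def in_closed_range(angle, lo, hi):
    """True if angle lies within the closed interval [lo, hi]."""
    return lo <= angle <= hi

# Preset range 17 is stated to cover angles greater than 60 deg and less
# than 180 deg; [d7, d8] must satisfy f4 < d7 <= d8 < f5.
F4, F5 = 60.0, 180.0
D7, D8 = 80.0, 160.0  # illustrative second preset range

def in_second_preset_range(theta):
    """True when the included angle theta between screen A and screen B
    is within the second preset range (which excludes 0 and 180 deg)."""
    return in_closed_range(theta, D7, D8)

# Sanity check: the nesting constraint holds for the chosen placeholders.
assert F4 < D7 <= D8 < F5
```

  Because [d7, d8] is strictly inside the open endpoints of preset range 17, the check automatically excludes 0° and 180°, matching the constraint stated above.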
  • when the electronic device 100 is in the sixth stand state, the display direction corresponding to screen A is opposite to the default display direction of screen A; therefore, the electronic device 100 shown in FIG. 13E and FIG. 13F rotates the user interface 11 by 180° before displaying it on screen A.
  • after the user interface 11 is displayed on screen A of the electronic device 100 that meets the second preset condition in the second stand state, when the electronic device 100 detects that the included angle θ between screen A and screen B exceeds [d7, d8], or detects that the electronic device 100 has left the second stand state, the electronic device 100 stops displaying the user interface 11 on screen A.
  • after the user interface 11 is displayed on screen A of the electronic device 100 that meets the second preset condition in the fourth stand state, when the electronic device 100 detects that the included angle θ between screen A and screen B exceeds [d9, d10], or detects that the electronic device 100 has left the fourth stand state, the electronic device 100 stops displaying the user interface 11 on screen A.
  • the electronic device 100 stops displaying the user interface 11 on the screen A.
  • when the second preset condition includes detecting a human face (or a preset user's face) through the front low-power camera, after screen A of the electronic device 100 that meets the second preset condition displays the user interface 11, if no face is detected by the front low-power camera within the third preset time, the electronic device 100 stops displaying the user interface 11 on screen A and turns off the face detection service of the front low-power camera.
  • the electronic device 100 uses the low-power camera corresponding to the screen A to detect the user's preset gesture 2, and in response to the preset gesture 2, the electronic device 100 stops displaying the user interface 11 on the screen A.
  • the electronic device 100 stops displaying the user interface 11 on screen A, which specifically includes: the electronic device 100 controls screen A to turn off; or, the electronic device 100 controls screen A to display another preset interface, such as part or all of the main interface corresponding to the inner screen, or, for example, the user interface most recently displayed on screen A before the user interface 11 was displayed.
  • the electronic device 100 controls the inner screen (screen A and screen B) to display a preset interface in full screen, for example, displaying the user interface 11 shown in FIG. 14A or the main interface 12 shown in FIG. 14B in full screen.
  • the user interface 11 displayed on the aforementioned screen A may be the user interface 11 described in the related embodiments of the aforementioned FIGS. 11A to 12N .
  • the difference is that the images in the preview display area of the user interface 11 displayed on the A screen are collected by the electronic device 100 through the front camera corresponding to the A screen.
  • the sizes of screen A, screen C, and the inner screen may differ; the same user interface (such as user interface 11) displayed on screen A, screen C, and the inner screen contains the same interface elements, but the size of the displayed user interface 11 may differ, and the layout (ie position and size) of each interface element of the user interface 11 displayed on the three display screens may differ.
  • the layout of each interface element of the user interface 11 displayed on each display screen (for example, C screen, A screen, inner screen) is associated with the size of the display screen.
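  One common way to realize the size-dependent layout described above is to store each interface element's position and size as fractions of the screen, so that the same set of elements can be resolved against screen A, screen C, or the inner screen. This is a generic illustration; the element names and fractions are assumptions for the sketch, not the layout actually stored by the electronic device 100:

```python
# Illustrative fractional layout: each element is (x, y, w, h) expressed as
# fractions of the screen width/height. Names and values are assumptions.
LAYOUT_FRACTIONS = {
    "preview_display_area": (0.0, 0.0, 1.0, 0.8),
    "shooting_control":     (0.4, 0.85, 0.2, 0.1),
}

def layout_for_screen(width_px, height_px):
    """Resolve the fractional layout to pixel rectangles for one screen,
    so the same elements adapt to displays of different sizes."""
    return {
        name: (round(x * width_px), round(y * height_px),
               round(w * width_px), round(h * height_px))
        for name, (x, y, w, h) in LAYOUT_FRACTIONS.items()
    }
```

  Resolving the same fractional layout against two screens of different sizes yields different pixel positions and sizes for each element, consistent with the association between element layout and screen size described above.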
  • in response to the user's input operation, the electronic device 100 also displays prompt information, which is used to prompt the user to turn the electronic device 100 over so that the electronic device 100 can collect images through the rear camera; for example, the user is prompted to place the electronic device 100 in the folded or unfolded state (or the first stand state, the third stand state, or the fifth stand state).
  • the display method provided by the embodiment of the present application will be introduced below by taking display screen 1 being screen A (ie, the second screen) and screen B (ie, the third screen) as an example.
  • when the electronic device 100 detects that the electronic device 100 is in a third preset posture and meets the third preset condition, the electronic device 100 starts the front camera corresponding to screen A (that is, the second camera) to capture an image, displays the user interface 11 on screen A, and displays a partially enlarged image of the image collected by the front camera (that is, the second camera) on screen B; the preview display area of the user interface 11 is used to display the images captured by the front camera.
  • the above partial enlarged image is an enlarged image of preset local features in the image captured by the front camera, for example, the enlarged image of a human face as shown in FIG. 15A .
  • when the electronic device 100 detects that the electronic device 100 is in the third preset posture and meets the third preset condition, the electronic device 100 starts the front camera corresponding to screen A (that is, the second camera) to collect images, displays the user interface 11 on screen B, and displays the partially enlarged image of the image collected by the front camera (that is, the second camera) on screen A; the preview display area of the user interface 11 includes the images captured by the front camera.
  • the above partial enlarged image is an enlarged image of preset local features in the image collected by the front camera.
  • in the second stand state, the user can preview the images captured by the camera in real time on one screen of the inner screen (such as screen A), and at the same time view, on the other screen of the inner screen (such as screen B), the enlarged image of preset local features in the captured image; this allows the user to see the details of the face while previewing the overall shooting effect, effectively improving the user experience.
  • when the electronic device 100 detects that the electronic device 100 is in the third preset posture and meets the third preset condition, the electronic device 100 starts the front camera corresponding to screen A (that is, the second camera) to collect images at shooting angle 1 and shooting angle 2 respectively, displays the image collected at shooting angle 1 on screen A, and displays the image collected at shooting angle 2 on screen B.
  • screen A and/or screen B may also display interface elements other than the image captured by the camera shown in FIG. 12G. In this way, the user can view the shooting effects at different shooting angles at the same time.
  • when the electronic device 100 detects that the electronic device 100 is in the third preset posture and meets the third preset condition, it starts the front camera corresponding to screen A to capture an image, and displays the user interface 11 in split screen on screen A and screen B; that is, the first part of the user interface 11 is displayed on screen A and the second part of the user interface 11 is displayed on screen B, where the first part of the user interface 11 includes the preview display area of the user interface 11, which displays the images captured by the front camera in real time, and the second part of the user interface 11 includes one or more interface elements of the user interface 11 other than the content displayed on screen A.
  • the third preset posture includes: the angle θ between screen A and screen B is within the third preset range.
  • the electronic device 100 also controls the C screen to turn off.
  • the user interface 11 displayed on screen A and screen B may be referred to as the third user interface
  • the preview display area of the third user interface may be referred to as the third preview display area
  • after the electronic device 100 controls the split-screen display of screen A and screen B, it can use the gesture detection service of the low-power camera corresponding to screen A to detect the user's air gesture operations in real time. In response to a detected air gesture operation, the electronic device 100 may execute the response event corresponding to that air gesture operation. In this way, after the electronic device 100 controls the split-screen display of screen A and screen B, the user's hands remain free, further realizing touch-free interaction.
  • the third preset condition includes that the dwell time of the electronic device 100 at the current angle value reaches the first preset time.
  • the first preset time is 3s.
  • when screen A and screen B are on, the above-mentioned third preset condition also includes that the electronic device 100 does not receive user input operations on screen A or screen B within the second preset time.
  • the second preset time is 2s.
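  The two timing sub-conditions above (the included angle dwelling at its current value for the first preset time, and no input arriving within the second preset time) can be sketched as monotonic-clock comparisons. The 3 s and 2 s values are the examples given in the text; the function names are illustrative:

```python
import time

# Sketch of the timing sub-conditions of the third preset condition.
# The constants use the example values stated in the text.
FIRST_PRESET_TIME = 3.0   # seconds the included angle must dwell at its value
SECOND_PRESET_TIME = 2.0  # seconds without user input on the lit screens

def dwell_condition_met(angle_stable_since, now=None):
    """True once the included angle has stayed at its current value for at
    least the first preset time."""
    now = time.monotonic() if now is None else now
    return now - angle_stable_since >= FIRST_PRESET_TIME

def no_input_condition_met(last_input_at, now=None):
    """True once no user input has arrived on screen A or screen B for at
    least the second preset time."""
    now = time.monotonic() if now is None else now
    return now - last_input_at >= SECOND_PRESET_TIME
```

  In practice such checks would be re-evaluated whenever the hinge-angle sensor or the touch input pipeline reports a new event; the `now` parameter here just makes the sketch testable without waiting.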
  • the above-mentioned third preset condition further includes that the electronic device 100 detects a human face (or a preset user's human face) through a front low-power camera corresponding to screen A.
  • the above-mentioned third preset condition further includes that the electronic device 100 detects a third preset gesture through the front low-power camera corresponding to the A screen.
  • the third preset gesture is used to trigger screen A and screen B to perform split-screen display when the electronic device 100 is in the third preset posture.
  • the third preset posture specifically includes: the angle θ between screen A and screen B decreases to (and/or increases to) θ3, and θ3 is within the third preset range.
  • the third preset range does not include 0° and 180°.
  • the third preset posture further includes that the electronic device 100 is in the second stand state.
  • the third preset range is [d7, d8], and [d7, d8] is within the preset range 17, that is, f4 ⁇ d7 ⁇ d8 ⁇ f5.
  • the angle θ between screen A and screen B is within [d7, d8]
  • the electronic device 100 controls screen A to display images captured by the front camera in real time, and screen B to display other interface elements of the user interface 11 , and the C screen goes off.
  • the angle within the third preset range is greater than 60° and less than 180°.
  • the user interface 11 displayed on the screen A and the screen B in split screen may be the user interface 11 described in the above-mentioned related embodiments of FIG. 11A to FIG. 12N .
  • the difference is that the images in the preview display area of the user interface 11 displayed in split screens on screen A and screen B are collected by the electronic device 100 through the front camera corresponding to screen A.
  • the user interface 11 displayed in split screen on screen A and screen B is introduced below, taking the user interface 11 shown in FIG. 12G as an example.
  • the first part of the user interface 11 displayed on screen A includes only the images captured by the front camera in real time, and the second part of the user interface 11 displayed on screen B includes all interface elements of the user interface 11 other than those images; the layout of the interface elements displayed on screen B is associated with the interface layout of the user interface 11 shown in FIG. 12G, which the electronic device 100 stores.
  • the second part of the user interface 11 displayed on screen B includes all the above-mentioned other interface elements, as well as secondary interface elements of one or more of those interface elements.
  • for example, the second part of the user interface 11 displayed on screen B includes the makeup test control 201 and the fill light control 206 of the user interface 11, as well as the secondary interface elements corresponding to the makeup test control 201, namely the one or more makeup trial options 202, and the secondary interface elements corresponding to the fill light control 206, that is, one or more fill light options.
  • the electronic device 100 can rearrange the interface elements of the user interface 11 displayed on the B screen to adapt to the size of the B screen and improve user experience.
  • the first part of the user interface 11 displayed on the A screen includes images captured by the front camera in real time, and at least one other interface element of the user interface 11, and the second part of the user interface 11 displayed on the B screen includes the user interface Interface elements in 11 except the display content of screen A.
  • the display content on screen A includes images captured by the front camera in real time, as well as shooting control 204 , camera switching control 205 and photo album control 208 .
  • the display content of the B screen includes the makeup test control 201 and the secondary interface elements corresponding to the makeup test control 201 , the supplementary light control 206 and the secondary interface elements corresponding to the supplementary light control 206 , and the shooting mode 207 .
  • in the second stand state, the user can preview the images captured by the camera in real time on screen A, and control the shooting parameters of the camera and the display effect of the captured images on screen B.
  • for example, in the second stand state, the user can check the state of his or her face in real time on screen A, modify the fill light parameters through the fill light options displayed on screen B, and modify the makeup trial effect displayed on screen A through the makeup trial options displayed on screen B, effectively improving the user experience.
  • after controlling the inner screen of the electronic device 100 that meets the third preset condition in the second stand state to display the preset content in split screen, when the electronic device 100 detects that the included angle θ between screen A and screen B exceeds [d7, d8], or detects that the electronic device 100 has left the second stand state, the electronic device 100 stops displaying the above preset content on screen A and screen B.
  • when the third preset condition includes detecting a human face (or a preset user's face) through the front low-power camera, after the electronic device 100 that meets the third preset condition displays the preset content in split screen, if no face is detected by the front low-power camera within the third preset time, the electronic device 100 stops displaying the preset content on screen A and screen B, and turns off the face detection service of the front low-power camera.
  • the electronic device 100 uses the low-power camera corresponding to screen A to detect the user's preset gesture 3, and in response to preset gesture 3, the electronic device 100 stops displaying the above preset content.
  • the electronic device 100 stops displaying the above-mentioned preset content in split screen on screen A and screen B, which specifically includes: the electronic device 100 controls the inner screen to turn off; or, the electronic device 100 controls screen A and screen B to display another preset interface in split screen, such as the main interface corresponding to the inner screen.
  • the electronic device 100 controls the inner screen (screen A and screen B) to display a preset interface in full screen, for example, display the user interface 11 shown in FIG. 14A or the main interface 12 shown in FIG. 14B in full screen.
  • the software structure of the electronic device 100 is described as an example below.
  • the software system of the electronic device 100 may adopt a layered architecture, an event-driven architecture, a micro-kernel architecture, a micro-service architecture, or a cloud architecture.
  • the software structure of the electronic device 100 is exemplarily described by taking an Android system with a layered architecture as an example.
  • FIG. 16 is a block diagram of a software structure of an electronic device 100 provided by an embodiment of the present application.
  • the layered architecture divides the software into several layers, and each layer has a clear role and division of labor. Layers communicate through software interfaces.
  • the Android system is divided into four layers, which are, from top to bottom, the application layer, the application framework layer, the Android runtime (Android runtime) and system libraries, and the kernel layer.
  • the application (application, APP) layer may include a series of application packages. As shown in FIG. 16, the application packages may include applications such as camera, gallery, calendar, call, map, navigation, WLAN, Bluetooth, music, video, and short message.
  • the application framework layer provides an application programming interface (application programming interface, API) and a programming framework for applications in the application layer.
  • the application framework layer includes some predefined functions.
  • the framework layer may include a sensor management module (sensor manager) 707, a gesture recognition module (posture recognition) 708, a display management module (display manager) 709, and a window management module (window manager service, WMS) 710.
  • it may also include an activity manager (activity manager service, AMS), a content provider, a view system, a phone manager, a resource manager, a notification manager, etc. (not shown in the drawings).
  • the window manager is used to manage window programs.
  • the window manager can get the size of the display screen, determine whether there is a status bar, lock the screen, capture the screen, etc.
  • the hardware abstraction layer includes a sensor service module (sensor service) 706, which can be used to report the processing results of the sensor data processing module 705 in the kernel layer to the sensor management in the framework layer Module 707.
  • the hardware abstraction layer further includes a camera detection service module 713, which can be used to report the image processing results of the camera detection data processing module 712 in the kernel layer to the face recognition module 714 and the gesture recognition module 715 in the framework layer.
  • the kernel layer is the layer between hardware and software.
  • the sensor data processing module 705 may be included in the kernel layer. Wherein, the sensor data processing module 705 can be used to obtain the data reported by one or more sensors in the hardware layer (Hardware), perform data processing, and report the data processing result to the sensor service module 706.
  • the camera detection data processing module 712 can also be included in the kernel layer, and the camera detection data processing module 712 can be used to obtain the image reported by the camera 711 in the hardware layer, perform image processing, and report the image processing result to Camera detection service module 713.
  • the hardware layer may include an acceleration sensor 701, a gyro sensor 702, an acceleration sensor 703, a gyro sensor 704, and the like.
  • the acceleration sensor 701 and the gyroscope sensor 702 can be set on the A screen of the electronic device 100
  • the acceleration sensor 703 and the gyroscope sensor 704 can be set on the B screen of the electronic device 100 .
  • the acceleration sensor 701 can be used to measure the acceleration data of the A screen, and report to the sensor data processing module 705 .
  • the acceleration sensor 703 can be used to measure the acceleration data of the B screen and report it to the sensor data processing module 705 .
  • the gyroscope sensor 702 can be used to measure the gyroscope data of the A screen and report to the sensor data processing module 705 .
  • the gyroscope sensor 704 can be used to measure the gyroscope data of the B screen and report to the sensor data processing module 705 .
  • the acceleration sensor 701, the gyroscope sensor 702, the acceleration sensor 703, and the gyroscope sensor 704 in the hardware layer can report their respectively measured sensor data to the sensor data processing module 705 in the kernel layer.
  • the sensor data processing module 705 can calculate, according to the sensor data reported by the multiple sensors in the hardware layer, the vector corresponding to the orientation of screen A and the vector corresponding to the orientation of screen B, and then calculate the included angle θ between screen A and screen B.
  • the sensor data processing module 705 can report the orientation vector of screen A, the orientation vector of screen B, and the included angle θ between screen A and screen B to the sensor management module 707 in the framework layer through the sensor service module 706 in the hardware abstraction layer.
  • the sensor management module 707 can be used to pass the two orientation vectors and the included angle θ between them to the gesture recognition module 708.
  • the gesture recognition module 708 can determine the stand state of the electronic device 100 based on the orientation vectors and the included angle θ, then recognize the posture type of the electronic device 100 based on the included angle θ and the stand state, and send the posture type to the display management module 709.
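  The included-angle computation performed by the sensor data processing module 705 can be sketched from the two orientation vectors: the angle between them follows from the dot product. This is a generic formulation under the assumption that each screen's orientation is represented as a 3-D vector; depending on the sign convention of the two vectors, the hinge angle θ may be this value or 180° minus it:

```python
import math

# Generic dot-product angle computation between two screen orientation
# vectors; not the module's actual implementation.

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def norm(u):
    return math.sqrt(dot(u, u))

def angle_between_deg(orient_a, orient_b):
    """Angle in degrees between the orientation vectors of screen A and
    screen B."""
    cos_t = dot(orient_a, orient_b) / (norm(orient_a) * norm(orient_b))
    cos_t = max(-1.0, min(1.0, cos_t))  # clamp against rounding drift
    return math.degrees(math.acos(cos_t))
```

  For example, parallel orientation vectors give 0°, perpendicular ones give 90°, and opposite ones give 180°; the clamp guards `acos` against floating-point values marginally outside [-1, 1].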
  • the display management module 709 can determine the display state of each display screen (A screen, B screen and C screen) according to the gesture type, and display the user interface 11 on the display screen 1 .
  • the display state of the display screen includes the on state and the off state, and the display management module 709 may notify the window management module 710 to create a window corresponding to the user interface 11 and update window attributes (such as size and position).
  • the window management module 710 can refresh the window system, redraw the window, and notify the upper layer application to adjust the attributes (such as size and position) of the display elements in the window.
  • when the posture type is a preset posture, the electronic device 100 starts the face detection service of the camera 711 corresponding to the preset posture; the electronic device 100 uses the camera 711 to collect images and reports the collected images to the camera detection data processing module 712.
  • the camera detection data processing module 712 and the camera detection service module 713 perform image processing on the above images, the processed images are uploaded to the face recognition module 714 through the camera detection service module 713, and the face recognition module 714 recognizes whether the above images include a human face (or a preset user's face).
  • the display management module 709 can determine the display state and display content of each screen according to the gesture type and face recognition results. For example, the first preset pose corresponds to the rear camera, and the second preset pose corresponds to the front camera.
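  The decision step described above, in which the posture type and the face-recognition result together determine each screen's display state, can be sketched as a lookup table. The posture names, screen states, and table entries below are illustrative assumptions, not the mapping actually used by the display management module 709:

```python
# Illustrative decision table: (posture type, face detected) -> screen states.
DECISION_TABLE = {
    ("second_preset_posture", True):  {"A": "user_interface_11", "B": "off", "C": "off"},
    ("second_preset_posture", False): {"A": "off", "B": "off", "C": "off"},
    ("third_preset_posture",  True):  {"A": "camera_preview", "B": "controls", "C": "off"},
}

def decide_display(posture_type, face_detected):
    """Return per-screen display states for a recognized posture, leaving
    all screens unchanged for combinations not in the table."""
    default = {"A": "unchanged", "B": "unchanged", "C": "unchanged"}
    return DECISION_TABLE.get((posture_type, face_detected), default)
```

  Keeping the mapping in a table rather than nested conditionals makes it easy to add entries for the first preset posture (rear camera) or additional stand states without touching the dispatch logic.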
  • the gesture detection service of the camera 711 corresponding to display screen 1 is enabled.
  • the camera detection service module 713 uploads the processed image to the gesture recognition module 715; the gesture recognition module 715 recognizes the gesture type in the above image, and the display management module 709 can update the display state and display content of display screen 1 according to the above gesture type.
  • the present application also provides an electronic device 100.
  • the electronic device 100 includes a first screen, a folding screen and a first camera.
  • the folding screen can be folded along the folding edge to form a second screen and a third screen.
  • the orientation of the first screen is opposite to that of the second screen, the orientation of the first screen is consistent with the shooting direction of the first camera, and the orientation of the second screen is consistent with the shooting direction of the second camera.
  • FIG. 17 shows a schematic structural diagram of another electronic device provided by an embodiment of the present invention.
  • the electronic device 100 may include: a detection unit and a display unit.
  • a detection unit configured to determine that the electronic device is in a first preset posture based on the detected first angle between the second screen and the third screen;
  • the display unit is configured to display the first user interface on the first screen based on the first preset posture of the electronic device; the first preview display area of the first user interface is used to display images collected by the first camera, and in the first preset posture, the first included angle does not include 0° and 180°;
  • the detection unit is further configured to determine that the electronic device is in a second preset posture based on the detected first included angle
  • the display unit is further configured to display a second user interface on the second screen based on the second preset posture of the electronic device; the second preview display area of the second user interface is used to display images captured by the second camera, and the second preset In attitude, the first included angle does not include 0° and 180°.
  • the detection unit is further configured to determine that the electronic device is in a third preset posture based on the detected first included angle; the display unit is further configured to, based on the third preset posture of the electronic device, perform split-screen display on the second screen and the third screen, where the display content of the second screen includes a third preview display area used to display images collected by the second camera; in the third preset posture, the first included angle does not include 0° and 180°.
  • displaying the first user interface on the first screen based on the first preset posture of the electronic device includes: when it is detected that the electronic device is in the first preset posture and the electronic device meets the first preset condition, starting the first camera to collect images, and displaying the first user interface on the first screen; wherein the first preset posture includes that the first included angle is within the first preset range, and the first preset condition includes one or more of the following: the dwell time of the first included angle at the current included angle value reaches the first preset time; when the first screen is on, the electronic device does not receive any input operation on the first screen within the second preset time; a human face or the face of a preset user is detected in the image captured by the camera corresponding to the first screen; a first preset gesture is detected in the image captured by the camera corresponding to the first screen.
  • displaying the second user interface on the second screen based on the second preset posture of the electronic device includes: when it is detected that the electronic device is in the second preset posture and the electronic device meets the second preset condition, starting the second camera to collect images, and displaying the second user interface on the second screen; wherein the second preset posture includes that the first included angle is within the second preset range, and the second preset condition includes one or more of the following: the dwell time of the first included angle at the current included angle value reaches the first preset time; when the second screen is on, the electronic device does not receive any input operation on the second screen within the second preset time; a human face or the face of a preset user is detected in the image captured by the camera corresponding to the second screen; a second preset gesture is detected in the image captured by the camera corresponding to the second screen.
  • performing split-screen display on the second screen and the third screen includes: when it is detected that the electronic device is in the third preset posture and the electronic device meets the third preset condition, starting the second camera to collect images, and performing split-screen display on the second screen and the third screen; wherein the third preset posture includes that the first included angle is within the third preset range, and the third preset condition includes one or more of the following: the dwell time of the first included angle at the current included angle value reaches the first preset time; when the second screen and/or the third screen are on, the electronic device does not receive any input operation on the second screen or the third screen within the second preset time; a human face or the face of a preset user is detected in the image captured by the camera corresponding to the second screen; a third preset gesture is detected in the image captured by the camera corresponding to the second screen.
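  Each preset condition above is phrased as "one or more of the following", ie a disjunction of independent sub-conditions. A minimal sketch of that structure, with illustrative predicate names:

```python
# Sketch of the "one or more of the following" condition structure: the
# preset condition holds as soon as any sub-condition predicate is true.

def preset_condition_met(sub_conditions):
    """sub_conditions: iterable of zero-argument predicates."""
    return any(cond() for cond in sub_conditions)

# Illustrative sub-conditions for a preset condition (names assumed).
dwell_time_reached = lambda: True    # angle dwelled for the first preset time
no_recent_input    = lambda: False   # no input within the second preset time
face_detected      = lambda: False   # face seen by the low-power camera
gesture_detected   = lambda: False   # preset gesture recognized

met = preset_condition_met(
    [dwell_time_reached, no_recent_input, face_detected, gesture_detected]
)
```

  Modeling the sub-conditions as predicates keeps the disjunction lazy: `any` stops evaluating as soon as one sub-condition holds, which matters when some checks (such as running face detection) are more expensive than others.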
  • the present application also provides an electronic device 100, the electronic device 100 includes a first screen, a folding screen and a first camera, the folding screen can be folded along the folding edge to form a second screen and a third screen, the first screen and The orientation of the second screen is opposite, and the orientation of the first screen is consistent with the shooting direction of the first camera.
  • the electronic device may include a plurality of functional modules or units, for example, a display unit, wherein:
  • the display unit is configured to, when it is detected that the electronic device is in the first preset posture and the electronic device satisfies the first preset condition, start the first camera to collect images and display the first user interface on the first screen; the first preview display area of the first user interface is used to display the image captured by the first camera; wherein the first preset posture includes that the first included angle between the second screen and the third screen is within the first preset range, and the first preset condition includes one or more of the following: the dwell time of the first included angle at the current included angle value reaches the first preset time; when the first screen is on, the electronic device does not receive any input operation on the first screen within the second preset time; a human face or the face of a preset user is detected in the image captured by the camera corresponding to the first screen; a first preset gesture is detected in the image captured by the camera corresponding to the first screen.
  • the camera corresponding to the above-mentioned first screen and the first camera may be the same camera, or may be different cameras.
  • optionally, the camera corresponding to the first screen is a low-power camera.
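The trigger logic described above (first preset posture plus any one of several preset conditions) can be sketched as follows. This is a minimal illustration, not the patented implementation; the range and time thresholds (`FIRST_PRESET_RANGE`, `FIRST_PRESET_TIME`, `SECOND_PRESET_TIME`) and all function and parameter names are assumptions introduced for the sketch.

```python
# Hypothetical sketch of the "first preset condition" check described above.
# Thresholds and names are illustrative assumptions, not the patent's values.

FIRST_PRESET_RANGE = (0.0, 120.0)   # assumed bounds of the first included angle
FIRST_PRESET_TIME = 2.0             # seconds the angle must pause at its value
SECOND_PRESET_TIME = 5.0            # seconds without input while the screen is lit

def first_preset_condition_met(angle_deg, pause_s, screen_on, idle_s,
                               face_detected, preset_gesture_detected):
    """Return True when the device is in the first preset posture and at
    least one of the listed preset conditions holds."""
    lo, hi = FIRST_PRESET_RANGE
    in_posture = lo < angle_deg < hi
    conditions = [
        pause_s >= FIRST_PRESET_TIME,                # angle paused long enough
        screen_on and idle_s >= SECOND_PRESET_TIME,  # lit screen, no input
        face_detected,                               # a face in the camera image
        preset_gesture_detected,                     # the first preset gesture
    ]
    return in_posture and any(conditions)
```

On a real device the inputs would come from the hinge-angle sensor, the input subsystem, and the camera pipeline; here they are passed in directly to keep the sketch self-contained.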
  • the electronic device further includes a second camera, and the orientation of the second screen is consistent with the shooting direction of the second camera.
  • when the folding screen is folded along the folding edge to form the second screen and the third screen, the third screen and the second camera are located on different sides of the folding edge.
  • the display unit is further configured to, when it is detected that the electronic device is in the second preset posture and the electronic device satisfies the second preset condition, start the second camera to collect images and display the second user interface on the second screen; the second preview display area of the second user interface is used to display the image collected by the second camera; wherein the second preset posture includes the first included angle being within a second preset range, and the second preset condition includes one or more of the following: the pause time of the first included angle at the current angle value reaches the first preset time; when the second screen is on, the electronic device receives no input operation on the second screen within the second preset time; a human face, or the face of a preset user, is detected in the image captured by the camera corresponding to the second screen; a second preset gesture is detected in the image captured by the camera corresponding to the second screen.
  • the camera corresponding to the second screen and the second camera may be the same camera or different cameras.
  • optionally, the camera corresponding to the second screen is a low-power camera.
  • the electronic device further includes a second camera, and the orientation of the second screen is consistent with the shooting direction of the second camera.
  • when the folding screen is folded along the folding edge to form the second screen and the third screen, the third screen and the second camera are located on different sides of the folding edge.
  • the display unit is further configured to, when it is detected that the electronic device is in the third preset posture and the electronic device satisfies the third preset condition, start the second camera to collect images and perform split-screen display on the second screen and the third screen; the display content of the second screen includes a third preview display area, which is used to display the image collected by the second camera; the third preset posture includes the first included angle being within a third preset range, and the third preset condition includes one or more of the following: the pause time of the first included angle at the current angle value reaches the first preset time; when the second screen and/or the third screen is on, the electronic device receives no input operation on the second screen or the third screen within the second preset time; a human face, or the face of the preset user, is detected in the image collected by the camera corresponding to the second screen; a third preset gesture is detected in the image collected by the camera corresponding to the second screen.
  • the first preset attitude further includes: the second angle between the third screen and the horizontal plane is within the fourth preset range, the angle between the orientation of the third screen and the Z-axis of the geographic coordinate system is less than 90°, The angle within the first preset range is greater than 0° and less than 120°; or, the first preset posture also includes: the second included angle is within the fourth preset range, the orientation of the third screen is in line with the Z axis of the geographic coordinate system The included angle is greater than 90°, and the angle within the first preset range is greater than 180° and less than 300°; or, the first preset posture also includes: the difference between the third included angle and the second included angle is within the fifth preset Within the range, the third angle is the angle between the second screen and the horizontal plane, the angle between the orientation of the third screen and the Z axis of the geographic coordinate system is greater than 90°, and the angle within the first preset range is greater than 0° and less than 180° °.
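The posture tests above compare the included angle between a screen's facing direction and the Z axis of the geographic coordinate system. A hedged sketch of that geometry, assuming the screen's facing direction is available as a 3-D normal vector expressed in geographic coordinates (the vector representation and the helper names are illustrative, not from the application):

```python
import math

def angle_between(v1, v2):
    """Included angle, in degrees, between two non-zero 3-D vectors."""
    dot = sum(a * b for a, b in zip(v1, v2))
    n1 = math.sqrt(sum(a * a for a in v1))
    n2 = math.sqrt(sum(a * a for a in v2))
    # Clamp to [-1, 1] to guard against floating-point drift before acos.
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / (n1 * n2)))))

def third_screen_faces_up(screen_normal, z_axis=(0.0, 0.0, 1.0)):
    """True when the orientation of the third screen makes an included angle
    of less than 90 degrees with the geographic Z axis, i.e. faces upward."""
    return angle_between(screen_normal, z_axis) < 90.0
```

A device would typically derive such a normal from its accelerometer/gyroscope readings; the ">90°" cases in the text are simply the complementary test.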
  • the first preset attitude further includes: the second angle between the third screen and the horizontal plane is within the fourth preset range, the angle between the orientation of the third screen and the Z-axis of the geographic coordinate system is less than 90°, The angle within the first preset range is less than 120°; the display unit displays the first user interface on the first screen, including: the display unit displays the first user interface rotated by 180° on the first screen.
  • the electronic device further includes an identification unit. Before the display unit displays the first user interface on the first screen, the identification unit is configured to identify the first area where the preset local feature is located in the image captured by the first camera; the first preview display area is used to display an enlarged image of the image in the first area.
  • the electronic device further includes a receiving unit. The receiving unit is configured to receive the user's first input operation; the display unit is further configured to, in response to the first input operation, display one or more of the following interface elements in the first user interface: a makeup-test control, a shooting control, a camera-switching control, a fill-light control, a photo-album control, and a display frame; wherein the makeup-test control is used to add a preset makeup effect to the face in the image displayed in the first preview display area; the shooting control is used to trigger the electronic device to save the image displayed in the first preview display area; the camera-switching control is used to switch the camera that captures the image; the fill-light control is used to supplement the ambient light; the photo-album control is used to trigger the electronic device to display the user interface of the photo-album application; and the display frame is used to display an enlarged image of the preset local feature in the image captured by the first camera.
  • the first camera is an ultraviolet camera, and the images collected by the first camera are used to highlight areas where sunscreen is applied.
  • the display unit is further configured to control the second screen and the third screen to turn off when it is detected that the electronic device is in the first preset posture and the electronic device satisfies the first preset condition.
  • the second preset posture further includes: the second included angle between the third screen and the horizontal plane is within a fourth preset range, and the included angle between the orientation of the third screen and the Z-axis of the geographic coordinate system is less than 90° , the angle within the second preset range is greater than 60° and less than 180°; or, the second preset posture also includes: the second included angle is within the fourth preset range, the orientation of the third screen is in line with the Z of the geographic coordinate system The included angle of the axis is greater than 90°, and the angle within the second preset range is greater than 240° and less than 360°; or, the second preset posture also includes: the difference between the third included angle and the second included angle is within the fifth preset Within the set range, the third angle is the angle between the second screen and the horizontal plane, the angle between the orientation of the third screen and the Z-axis of the geographic coordinate system is greater than 90°, and the angle within the second preset range is greater than 180° and less than 360°.
  • the first included angle does not include 0° and 180°.
  • the display unit is further configured to control the first screen to turn off when it is detected that the electronic device is in the second preset posture and the electronic device meets the second preset condition.
  • the third preset posture further includes: the second included angle between the third screen and the horizontal plane is within a fourth preset range, and the included angle between the orientation of the third screen and the Z-axis of the geographic coordinate system is less than 90° , the angle within the third preset range is greater than 60° and less than 180°.
  • the displayed content on the third screen includes enlarged images of preset local features in the images captured by the second camera.
  • the above-mentioned display unit performing split-screen display on the second screen and the third screen includes: the display unit performs split-screen display of the third user interface on the second screen and the third screen; the display content of the second screen includes the third preview display area of the third user interface and zero, one, or more interface elements of the third user interface other than the third preview display area, and the display content of the third screen includes one or more interface elements of the third user interface other than the display content of the second screen; the third user interface further includes one or more of the following interface elements: a makeup-test control, a shooting control, a camera-switching control, a fill-light control, a photo-album control, and a display frame; wherein the makeup-test control is used to add a preset makeup effect to the face in the image in the preview display area of the third user interface; the shooting control is used to trigger the electronic device to save the image in the preview display area of the third user interface; the camera-switching control is used to switch the camera that captures the image; the fill-light control is used to supplement the ambient light; the photo-album control is used to trigger the electronic device to display the user interface of the photo-album application; and the display frame is used to display an enlarged image of the preset local feature in the image captured by the second camera.
  • all or part of the above embodiments may be implemented by software, hardware, firmware, or any combination thereof.
  • when implemented using software, they may be implemented in whole or in part in the form of a computer program product.
  • the computer program product includes one or more computer instructions. When the computer program instructions are loaded and executed on a computer, the processes or functions according to the present application are produced in whole or in part.
  • the computer can be a general purpose computer, a special purpose computer, a computer network, or other programmable devices.
  • the computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center by wired (e.g., coaxial cable, optical fiber, DSL) or wireless (e.g., infrared, radio, microwave) means.
  • the computer-readable storage medium may be any available medium that can be accessed by a computer, or a data storage device such as a server or a data center integrated with one or more available media.
  • the available medium may be a magnetic medium (such as a floppy disk, a hard disk, or a magnetic tape), an optical medium (such as a DVD), or a semiconductor medium (such as a solid state disk (solid state disk, SSD)), etc.
  • the processes may be completed by a computer program instructing related hardware.
  • the program may be stored in a computer-readable storage medium.
  • when the program is executed, the processes of the foregoing method embodiments may be included.
  • the aforementioned storage medium includes various media that can store program code, such as a ROM, a random access memory (RAM), a magnetic disk, or an optical disk.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • General Physics & Mathematics (AREA)
  • Signal Processing (AREA)
  • Mathematical Physics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • User Interface Of Digital Computer (AREA)
  • Multimedia (AREA)
  • Health & Medical Sciences (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Environmental & Geological Engineering (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Health & Medical Sciences (AREA)

Abstract

This application discloses a display method and a related apparatus. The method is applied to an electronic device that includes a first screen, a folding screen, and a first camera. The folding screen can be folded along a folding edge into a second screen and a third screen; the first screen and the second screen face in opposite directions; the orientation of the first screen is consistent with the shooting direction of the first camera, and the orientation of the second screen is consistent with the shooting direction of a second camera. The method includes: determining, based on a first included angle between the second screen and the third screen, that the electronic device is in a first preset posture, and displaying a first user interface on the first screen, the first user interface including the image captured by the first camera; and determining, based on the first included angle, that the electronic device is in a second preset posture, and displaying a second user interface on the second screen, the second user interface including the image captured by the second camera. In this way, in a specific posture of the electronic device, the corresponding camera and display screen are started automatically for real-time preview, avoiding tedious user operations.

Description

Display Method and Related Apparatus
This application claims priority to the Chinese patent application filed with the Chinese Patent Office on June 9, 2021, with application number 202110642450.0 and application title "Display Method and Foldable-Screen Terminal", the entire contents of which are incorporated herein by reference. This application also claims priority to the Chinese patent application filed with the Chinese Patent Office on August 11, 2021, with application number 202110920022.X and application title "Display Method and Related Apparatus", the entire contents of which are incorporated herein by reference.
Technical Field
This application relates to the field of electronic technology, and in particular to a display method and a related apparatus.
Background
With the development of electronic technology, the display screens configured on electronic devices are becoming larger and larger. A larger display screen can provide the user with richer information and thus a better use experience. For example, a large folding screen can be configured on the front of a mobile phone. When the folding screen is in the folded form, it can be folded into at least two screens, the electronic device can display on one of these screens, and the user can control the display content of that screen through touch operations or buttons. When the folding screen is in the unfolded form, the electronic device can display full-screen on the folding screen, and the user can control the display content of the entire folding screen through touch operations or buttons.
At present, the user's interaction with a folding screen is cumbersome, and the user experience is poor.
Summary of the Invention
This application provides a display method that, in a specific posture of a foldable electronic device, automatically starts the corresponding camera and display screen for real-time preview, avoiding tedious user operations and effectively improving the user experience.
In a first aspect, this application provides a display method applied to an electronic device. The electronic device includes a first screen, a folding screen, and a first camera. The folding screen can be folded along a folding edge to form a second screen and a third screen; the first screen and the second screen face in opposite directions; the orientation of the first screen is consistent with the shooting direction of the first camera, and the orientation of the second screen is consistent with the shooting direction of a second camera. The method includes:
determining, based on the detected first included angle between the second screen and the third screen, that the electronic device is in a first preset posture; based on the first preset posture of the electronic device, displaying, by the electronic device, a first user interface on the first screen, where a first preview display area of the first user interface is used to display the image captured by the first camera, and in the first preset posture the first included angle excludes 0° and 180°; determining, based on the detected first included angle, that the electronic device is in a second preset posture; and based on the second preset posture of the electronic device, displaying, by the electronic device, a second user interface on the second screen, where a second preview display area of the second user interface is used to display the image captured by the second camera, and in the second preset posture the first included angle excludes 0° and 180°.
In the embodiments of this application, when the electronic device detects that it is in the first preset posture, it can start the first camera corresponding to the first preset posture to capture images and display the captured images on the first screen corresponding to that posture. When the electronic device detects that it is in the second preset posture, it can start the second camera corresponding to the second preset posture to capture images and display the captured images on the second screen corresponding to that posture. It should be noted that when the electronic device is in the first preset posture, it is convenient for the user to view the display content of the first screen; when the electronic device is in the second preset posture, it is convenient for the user to view the display content of the second screen. In this way, after placing the electronic device in a preset posture, the user can, with both hands free, start the camera corresponding to that posture to shoot and preview in real time on the display screen corresponding to that posture, avoiding tedious user operations and effectively improving the user experience.
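The posture-to-camera/screen mapping described in this aspect can be sketched as a simple dispatch. The posture names, the angle ranges, and the returned identifiers below are illustrative assumptions for the sketch, not values from the application:

```python
# Illustrative dispatch from a detected posture to the camera/screen pair
# that the method activates for live preview. Ranges are assumptions.

def select_camera_and_screen(first_included_angle, posture_ranges):
    """Map the first included angle (between the second and third screens)
    to the (camera, screen) pair used for the live preview, or None when
    no preset posture matches."""
    targets = {
        "first":  ("first_camera", "first_screen"),
        "second": ("second_camera", "second_screen"),
        "third":  ("second_camera", "second_and_third_screens"),
    }
    for posture, (lo, hi) in posture_ranges.items():
        if lo < first_included_angle < hi:
            return targets[posture]
    return None

# Hypothetical, non-overlapping-in-priority ranges; checked in order.
ranges = {"first": (0, 120), "second": (240, 360), "third": (60, 180)}
```

In the real method each posture also depends on the screen/horizontal-plane angles and the preset conditions; only the angle dimension is shown here.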
在一种实现方式中,所述方法还包括:基于检测到的第一夹角,确定电子设备处于第三预设姿态;基于电子设备的第三预设姿态,电子设备在第二屏和第三屏进行分屏显示,第二屏的显示内容包括第三预览显示区,第三预览显示区用于显示第二摄像头采集的图像,第三预设姿态下,第一夹角不包括0°和180°。
实施本申请实施例中,电子设备检测到电子设备处于第三预设姿态时,电子设备可以启动上述第三预设姿态对应的第二摄像头采集图像,并将采集到的图像显示在第三预设姿态对应的第二屏,还可以通过第三屏显示与上述第二摄像头采集的图像相关联的界面元素。需要说明的是,电子设备处于第三预设姿态,可以便于用户查看第二屏和第三屏的显示内容。这样,用户将电子设备放置于预设姿态后,用户可以在解放双手的情况下启动上述预设姿态对应的第二摄像头进行拍摄,并通过上述预设姿态对应的第二屏进行实时预览,以及通过第三屏查看与第二摄像头采集的图像相关联的界面元素,避免了繁琐的用户操作,有效提高了用户的使用体验。
在一种实现方式中,上述基于电子设备的第一预设姿态,电子设备在第一屏显示第一用户界面,包括:当检测到电子设备处于第一预设姿态,且电子设备满足第一预设条件时,电子设备启动第一摄像头采集图像,并在第一屏显示第一用户界面;其中,第一预设姿态包括第一夹角在第一预设范围内,第一预设条件包括以下一项或多项:第一夹角在当前夹角值的停顿时间达到第一预设时间;当第一屏处于点亮状态时,电子设备在第二预设时间内未接收到作用于第一屏的输入操作;在第一屏对应的摄像头采集的图像中,检测到人脸或预设用户的人脸;在第一屏对应的摄像头采集的图像中,检测到第一预设手势。
实施本申请实施例中,用户将电子设备放置于第一预设姿态,且令电子设备满足第一预设条件后,用户可以在解放双手的情况下启动第一预设姿态对应的第一摄像头进行拍摄,并通过第一预设姿态对应的第一屏进行实时预览,避免了繁琐的用户操作,有效提高了用户的使用体验。
在一种实现方式中,上述基于电子设备的第二预设姿态,电子设备在第二屏显示第二用户界面,包括:当检测到电子设备处于第二预设姿态,且电子设备满足第二预设条件时,电子设备启动第二摄像头采集图像,并在第二屏显示第二用户界面;其中,第二预设姿态包括第一夹角在第二预设范围内,第二预设条件包括以下一项或多项:第一夹角在当前夹角值的停顿时间达到第一预设时间;当第二屏处于点亮状态时,电子设备在第二预设时间内未接收到作用于第二屏的输入操作;在第二屏对应的摄像头采集的图像中,检测到人脸或预设用户的人脸;在第二屏对应的摄像头采集的图像中,检测到第二预设手势。
实施本申请实施例中,用户将电子设备放置于第二预设姿态,且令电子设备满足第二预设条件后,用户可以在解放双手的情况下启动第二预设姿态对应的第二摄像头进行拍摄,并通过第二预设姿态对应的第二屏进行实时预览,避免了繁琐的用户操作,有效提高了用户的使用体验。
在一种实现方式中,上述基于电子设备的第三预设姿态,电子设备在第二屏和第三屏进行分屏显示,包括:当检测到电子设备处于第三预设姿态,且电子设备满足第三预设条件时,电子设备启动第二摄像头采集图像,并在第二屏和第三屏进行分屏显示;其中,第三预设姿态包括第一夹角在第三预设范围内,第三预设条件包括以下一项或多项:第一夹角在当前夹角值的停顿时间达到第一预设时间;当第二屏和/或第三屏处于点亮状态时,电子设备在第二 预设时间内未接收到作用于第二屏或第三屏的输入操作;在第二屏对应的摄像头采集的图像中,检测到人脸或预设用户的人脸;在第二屏对应的摄像头采集的图像中,检测到第三预设手势。
实施本申请实施例中,用户将电子设备放置于第三预设姿态,且令电子设备满足第三预设条件后,用户可以在解放双手的情况下启动第三预设姿态对应的第二摄像头进行拍摄,并通过第三预设姿态对应的第二屏进行实时预览,以及通过第三屏查看与第二摄像头采集的图像相关联的界面元素,避免了繁琐的用户操作,有效提高了用户的使用体验。
在一种实现方式中,第一预设姿态还包括:第三屏与水平面的第二夹角在第四预设范围内,第三屏朝向与地理坐标系的Z轴的夹角小于90°,第一预设范围内的角度大于0°且小于120°;或者,第一预设姿态还包括:第二夹角在第四预设范围内,第三屏的朝向与地理坐标系的Z轴的夹角大于90°,第一预设范围内的角度大于180°且小于300°;或者,第一预设姿态还包括:第三夹角与第二夹角的差值在第五预设范围内,第三夹角为第二屏与水平面的夹角,第三屏的朝向与地理坐标系的Z轴的夹角大于90°,第一预设范围内的角度大于0°且小于180°。
实施本申请实施例中,通过有效设置第一预设姿态,无需为电子设备安装额外的支架装置,也无需用户手持电子设备,用户既可以更方便的查看第一屏的显示内容。
在一种实现方式中,第一预设姿态还包括:第三屏与水平面的第二夹角在第四预设范围内,第三屏朝向与地理坐标系的Z轴的夹角小于90°,第一预设范围内的角度小于120°;上述在第一屏显示第一用户界面,包括:在第一屏显示旋转180°后的第一用户界面。
在一种实现方式中,第一预设姿态包括第一支架状态和第五支架状态,上述当检测到电子设备处于第一预设姿态,且电子设备满足第一预设条件时,电子设备启动第一摄像头采集图像,并在第一屏显示第一用户界面,包括:当检测到电子设备处于第一支架状态,且电子设备满足第一预设条件时,电子设备启动第一摄像头采集图像,并在第一屏显示第一用户界面;上述在第一屏显示第一用户界面之后,还包括:当检测到电子设备由第一支架状态切换为第五支架状态时,电子设备在第一屏显示旋转180°后的第一用户界面。
实施本申请实施例中,第一屏对应的用户界面的显示方向可以自适应地随着电子设备的物理姿态进行调整,以便于用户查看,有效提升了用户体验。
在一种实现方式中,上述在第一屏显示第一用户界面之前,还包括:电子设备识别第一摄像头采集的图像中预设局部特征所在的第一区域,第一预览显示区用于显示第一区域内的图像的放大图像。
实施本申请实施例中,用户将电子设备放置于第一预设姿态,且令电子设备满足第一预设条件后,用户可以在解放双手的情况下启动第一预设姿态对应的第一摄像头进行拍摄,并通过第一屏实时预览第一摄像头采集的图像中的预设局部特征的细节。
在一种实现方式中,上述在第一屏显示第一用户界面之后,还包括:接收用户的第一输入操作;响应于第一输入操作,电子设备在第一用户界面中显示以下一个或多个界面元素:试妆控件、拍摄控件、摄像头切换控件、补光控件、相册控件、显示框;其中,试妆控件用于为第一预览显示区显示的图像中的人脸添加预设化妆效果;拍摄控件用于触发电子设备保存第一预览显示区显示的图像;摄像头切换控件用于切换采集图像的摄像头;补光控件用于补充环境光;相册控件用于触发电子设备显示相册应用的用户界面;显示框用于显示第一摄像头采集的图像中的预设局部特征的放大图像。
实施本申请实施例中,用户将电子设备放置于第一预设姿态,且令电子设备满足第一预设条件后,用户可以在解放双手的情况下启动第一预设姿态对应的第一摄像头进行拍摄,并在第一屏仅显示第一摄像头采集的图像。然后,在接收到用户的第一输入操作后,才在第一屏显示其他相关的界面元素,例如拍摄控件。
在一种实现方式中,第一摄像头为紫外线摄像头,第一摄像头采集的图像用于凸显涂抹防晒霜的区域。
实施本申请实施例中,用户将电子设备放置于第一预设姿态,且令电子设备满足第一预设条件后,用户可以在解放双手的情况下启动第一预设姿态对应的第一摄像头进行拍摄,并通过第一屏实时预览防晒霜的涂抹情况。
在一种实现方式中,基于电子设备的第一预设姿态,电子设备在第一屏显示第一用户界面时,所述方法还包括:电子设备控制第二屏和第三屏熄灭。
实施本申请实施例中,电子设备控制第一屏点亮时,控制第二屏和第三屏熄灭。这样,可以有效节省能耗。
在一种实现方式中,第二预设姿态还包括:第三屏与水平面的第二夹角在第四预设范围内,第三屏的朝向与地理坐标系的Z轴的夹角小于90°,第二预设范围内的角度大于60°且小于180°;或者,第二预设姿态还包括:第二夹角在第四预设范围内,第三屏的朝向与地理坐标系的Z轴的夹角大于90°,第二预设范围内的角度大于240°且小于360°;或者,第二预设姿态还包括:第三夹角与第二夹角的差值在第五预设范围内,第三夹角为第二屏与水平面的夹角,第三屏的朝向与地理坐标系的Z轴的夹角大于90°,第二预设范围内的角度大于180°且小于360°。
实施本申请实施例中,通过有效设置第二预设姿态,无需为电子设备安装额外的支架装置,也无需用户手持电子设备,用户既可以更方便的查看第二屏的显示内容。
在一种实现方式中,第一预设姿态和第二预设姿态下,第一夹角不包括0°和180°。
在一种实现方式中,基于电子设备的第二预设姿态,电子设备在第二屏显示第二用户界面时,所述方法还包括:电子设备控制第一屏熄灭。
实施本申请实施例中,电子设备控制第二屏和第三屏点亮时,控制第一屏熄灭。这样,可以有效节省能耗。
在一种实现方式中,第三预设姿态还包括:第三屏与水平面的第二夹角在第四预设范围内,第三屏的朝向与地理坐标系的Z轴的夹角小于90°,第三预设范围内的角度大于60°且小于180°。
实施本申请实施例中,通过有效设置第三预设姿态,无需为电子设备安装额外的支架装置,也无需用户手持电子设备,用户既可以更方便的查看第二屏和第三屏的显示内容。
在一种实现方式中,第三屏的显示内容包括第二摄像头采集的图像中的预设局部特征的放大图像。
实施本申请实施例中,用户在第二屏实时预览第二摄像头采集的图像时,还可以通过第三屏实时预览第二摄像头采集的图像中的预设局部特征的细节。
在一种实现方式中,上述在第二屏和第三屏进行分屏显示,包括:在第二屏和第三显示屏对第三用户界面进行分屏显示;第二屏的显示内容包括第三用户界面的第三预览显示区,以及第三用户界面中除第三预览显示区之外的零个、一个或多个界面元素,第三屏的显示内容包括第三用户界面中除第二屏的显示内容之外的一个或多个界面元素;第三用户界面还包 括以下一个或多个界面元素:试妆控件、拍摄控件、摄像头切换控件、补光控件、相册控件、显示框;其中,试妆控件用于为第三用户界面的预览显示区内的图像中的人脸添加预设化妆效果;拍摄控件用于触发电子设备保存第三用户界面的预览显示区内的图像;摄像头切换控件用于切换采集图像的摄像头;补光控件用于补充环境光;相册控件用于触发电子设备显示相册应用的用户界面;显示框用于显示第二摄像头采集的图像中的预设局部特征的放大图像。
实施本申请实施例中,用户将电子设备放置于第三预设姿态,且令电子设备满足第三预设条件后,用户可以在解放双手的情况下启动第三预设姿态对应的第二摄像头进行拍摄,并通过第三预设姿态对应的第二屏进行实时预览,以及通过第三屏显示的界面元素控制第二摄像头的拍摄参数以及第二屏中预览图像的显示效果,有效提高了用户的使用体验。
第二方面,本申请提供了显示方法,应用于电子设备,电子设备包括第一屏、折叠屏和第一摄像头,折叠屏沿折叠边可折叠形成第二屏和第三屏,第一屏和第二屏的朝向相反,第一屏的朝向和第一摄像头的拍摄方向一致,所述方法包括:
当检测到电子设备处于第一预设姿态,且电子设备满足第一预设条件时,电子设备启动第一摄像头采集图像,并在第一屏显示第一用户界面,第一用户界面的第一预览显示区用于显示第一摄像头采集的图像;其中,第一预设姿态包括第二屏和第三屏的第一夹角在第一预设范围内,第一预设条件包括以下一项或多项:第一夹角在当前夹角值的停顿时间达到第一预设时间;当第一屏处于点亮状态时,电子设备在第二预设时间内未接收到作用于第一屏的输入操作;在第一屏对应的摄像头采集的图像中,检测到人脸或预设用户的人脸;在第一屏对应的摄像头采集的图像中,检测到第一预设手势。
需要说明的是,电子设备处于第一预设姿态,可以便于用户查看第一屏的显示内容。实施本申请实施例中,用户将电子设备放置于第一预设姿态,且令电子设备满足第一预设条件后,用户可以在解放双手的情况下启动第一预设姿态对应的第一摄像头进行拍摄,并通过第一预设姿态对应的第一屏进行实时预览,避免了繁琐的用户操作,有效提高了用户的使用体验。
本申请实施例中,上述第一屏对应的摄像头与第一摄像头可以为同一摄像头,也可以为不同摄像头。可选的,上述第一屏对应的摄像头为低功耗摄像头。
在一种实现方式中,电子设备还包括第二摄像头,第二屏的朝向和第二摄像头的拍摄方向一致,当折叠屏沿折叠边折叠形成第二屏和第三屏时,第三屏和第二摄像头位于折叠边的不同侧,所述方法还包括:当检测到电子设备处于第二预设姿态,且电子设备满足第二预设条件时,电子设备启动第二摄像头采集图像,并在第二屏显示第二用户界面,第二用户界面的第二预览显示区用于显示第二摄像头采集的图像;其中,第二预设姿态包括第一夹角在第二预设范围内,第二预设条件包括以下一项或多项:第一夹角在当前夹角值的停顿时间达到第一预设时间;当第二屏处于点亮状态时,电子设备在第二预设时间内未接收到作用于第二屏的输入操作;在第二屏对应的摄像头采集的图像中,检测到人脸或预设用户的人脸;在第二屏对应的摄像头采集的图像中,检测到第二预设手势。
需要说明的是,电子设备处于第二预设姿态,可以便于用户查看第二屏的显示内容。实施本申请实施例中,用户将电子设备放置于第二预设姿态,且令电子设备满足第二预设条件后,用户可以在解放双手的情况下启动第二预设姿态对应的第二摄像头进行拍摄,并通过第二预设姿态对应的第二屏进行实时预览,避免了繁琐的用户操作,有效提高了用户的使用体验。
本申请实施例中,上述第二屏对应的摄像头与第二摄像头可以为同一摄像头,也可以为不同摄像头。可选的,上述第二屏对应的摄像头为低功耗摄像头。
在一种实现方式中,电子设备还包括第二摄像头,第二屏的朝向和第二摄像头的拍摄方向一致,当折叠屏沿折叠边折叠形成第二屏和第三屏时,第三屏和第二摄像头位于折叠边的不同侧,所述方法还包括:当检测到电子设备处于第三预设姿态,且电子设备满足第三预设条件时,电子设备启动第二摄像头采集图像,并在第二屏和第三屏进行分屏显示,第二屏的显示内容包括第三预览显示区,第三预览显示区用于显示第二摄像头采集的图像;其中,第三预设姿态包括第一夹角在第三预设范围内,第三预设条件包括以下一项或多项:第一夹角在当前夹角值的停顿时间达到第一预设时间;当第二屏和/或第三屏处于点亮状态时,电子设备在第二预设时间内未接收到作用于第二屏或第三屏的输入操作;在第二屏对应的摄像头采集的图像中,检测到人脸或预设用户的人脸;在第二屏对应的摄像头采集的图像中,检测到第三预设手势。
需要说明的是,电子设备处于第三预设姿态,可以便于用户查看第二屏和第三屏的显示内容。实施本申请实施例中,用户将电子设备放置于第三预设姿态,且令电子设备满足第三预设条件后,用户可以在解放双手的情况下启动第三预设姿态对应的第二摄像头进行拍摄,并通过第三预设姿态对应的第二屏进行实时预览,以及通过第三屏查看与第二摄像头采集的图像相关联的界面元素,避免了繁琐的用户操作,有效提高了用户的使用体验。
在一种实现方式中,第一预设姿态还包括:第三屏与水平面的第二夹角在第四预设范围内,第三屏朝向与地理坐标系的Z轴的夹角小于90°,第一预设范围内的角度大于0°且小于120°;或者,第一预设姿态还包括:第二夹角在第四预设范围内,第三屏的朝向与地理坐标系的Z轴的夹角大于90°,第一预设范围内的角度大于180°且小于300°;或者,第一预设姿态还包括:第三夹角与第二夹角的差值在第五预设范围内,第三夹角为第二屏与水平面的夹角,第三屏的朝向与地理坐标系的Z轴的夹角大于90°,第一预设范围内的角度大于0°且小于180°。
实施本申请实施例中,通过有效设置第一预设姿态,无需为电子设备安装额外的支架装置,也无需用户手持电子设备,用户既可以更方便的查看第一屏的显示内容。
在一种实现方式中,第一预设姿态还包括:第三屏与水平面的第二夹角在第四预设范围内,第三屏朝向与地理坐标系的Z轴的夹角小于90°,第一预设范围内的角度小于120°;上述在第一屏显示第一用户界面,包括:在第一屏显示旋转180°后的第一用户界面。
实施本申请实施例中,第一屏对应的用户界面的显示方向可以自适应地随着电子设备的物理姿态进行调整,以便于用户查看,有效提升了用户体验。
在一种实现方式中,上述在第一屏显示第一用户界面之前,还包括:电子设备识别第一摄像头采集的图像中预设局部特征所在的第一区域,第一预览显示区用于显示第一区域内的图像的放大图像。
实施本申请实施例中,用户将电子设备放置于第一预设姿态,且令电子设备满足第一预设条件后,用户可以在解放双手的情况下启动第一预设姿态对应的第一摄像头进行拍摄,并通过第一屏实时预览第一摄像头采集的图像中的预设局部特征的细节。
在一种实现方式中,上述在第一屏显示第一用户界面之后,还包括:接收用户的第一输入操作;响应于第一输入操作,电子设备在第一用户界面中显示以下一个或多个界面元素:试妆控件、拍摄控件、摄像头切换控件、补光控件、相册控件、显示框;其中,试妆控件用 于为第一预览显示区显示的图像中的人脸添加预设化妆效果;拍摄控件用于触发电子设备保存第一预览显示区显示的图像;摄像头切换控件用于切换采集图像的摄像头;补光控件用于补充环境光;相册控件用于触发电子设备显示相册应用的用户界面;显示框用于显示第一摄像头采集的图像中的预设局部特征的放大图像。
实施本申请实施例中,用户将电子设备放置于第一预设姿态,且令电子设备满足第一预设条件后,用户可以在解放双手的情况下启动第一预设姿态对应的第一摄像头进行拍摄,并在第一屏仅显示第一摄像头采集的图像。然后,在接收到用户的第一输入操作后,才在第一屏显示其他相关的界面元素,例如拍摄控件。
在一种实现方式中,第一摄像头为紫外线摄像头,第一摄像头采集的图像用于凸显涂抹防晒霜的区域。
实施本申请实施例中,用户将电子设备放置于第一预设姿态,且令电子设备满足第一预设条件后,用户可以在解放双手的情况下启动第一预设姿态对应的第一摄像头进行拍摄,并通过第一屏实时预览防晒霜的涂抹情况。
在一种实现方式中,当检测到电子设备处于第一预设姿态,且电子设备满足第一预设条件时,所述方法还包括:电子设备控制第二屏和第三屏熄灭。
实施本申请实施例中,电子设备控制第一屏点亮时,控制第二屏和第三屏熄灭。这样,可以有效节省能耗。
在一种实现方式中,第二预设姿态还包括:第三屏与水平面的第二夹角在第四预设范围内,第三屏的朝向与地理坐标系的Z轴的夹角小于90°,第二预设范围内的角度大于60°且小于180°;或者,第二预设姿态还包括:第二夹角在第四预设范围内,第三屏的朝向与地理坐标系的Z轴的夹角大于90°,第二预设范围内的角度大于240°且小于360°;或者,第二预设姿态还包括:第三夹角与第二夹角的差值在第五预设范围内,第三夹角为第二屏与水平面的夹角,第三屏的朝向与地理坐标系的Z轴的夹角大于90°,第二预设范围内的角度大于180°且小于360°。
实施本申请实施例中,通过有效设置第二预设姿态,无需为电子设备安装额外的支架装置,也无需用户手持电子设备,用户既可以更方便的查看第二屏的显示内容。
在一种实现方式中,第一预设姿态和第二预设姿态下,第一夹角不包括0°和180°。
在一种实现方式中,当检测到电子设备处于第二预设姿态,且电子设备满足第二预设条件时,所述方法还包括:电子设备控制第一屏熄灭。
实施本申请实施例中,电子设备控制第二屏和第三屏点亮时,控制第一屏熄灭。这样,可以有效节省能耗。
在一种实现方式中,第三预设姿态还包括:第三屏与水平面的第二夹角在第四预设范围内,第三屏的朝向与地理坐标系的Z轴的夹角小于90°,第三预设范围内的角度大于60°且小于180°。
实施本申请实施例中,通过有效设置第三预设姿态,无需为电子设备安装额外的支架装置,也无需用户手持电子设备,用户既可以更方便的查看第二屏和第三屏的显示内容。
在一种实现方式中,第三屏的显示内容包括第二摄像头采集的图像中的预设局部特征的放大图像。
实施本申请实施例中,用户在第二屏实时预览第二摄像头采集的图像时,还可以通过第三屏实时预览第二摄像头采集的图像中的预设局部特征的细节。
在一种实现方式中,上述在第二屏和第三屏进行分屏显示,包括:在第二屏和第三显示屏对第三用户界面进行分屏显示;第二屏的显示内容包括第三用户界面的第三预览显示区,以及第三用户界面中除第三预览显示区之外的零个、一个或多个界面元素,第三屏的显示内容包括第三用户界面中除第二屏的显示内容之外的一个或多个界面元素;第三用户界面还包括以下一个或多个界面元素:试妆控件、拍摄控件、摄像头切换控件、补光控件、相册控件、显示框;其中,试妆控件用于为第三用户界面的预览显示区内的图像中的人脸添加预设化妆效果;拍摄控件用于触发电子设备保存第三用户界面的预览显示区内的图像;摄像头切换控件用于切换采集图像的摄像头;补光控件用于补充环境光;相册控件用于触发电子设备显示相册应用的用户界面;显示框用于显示第二摄像头采集的图像中的预设局部特征的放大图像。
实施本申请实施例中,用户将电子设备放置于第三预设姿态,且令电子设备满足第三预设条件后,用户可以在解放双手的情况下启动第三预设姿态对应的第二摄像头进行拍摄,并通过第三预设姿态对应的第二屏进行实时预览,以及通过第三屏显示的界面元素控制第二摄像头的拍摄参数以及第二屏中预览图像的显示效果,有效提高了用户的使用体验。
第三方面,本申请提供了一种电子设备,该电子设备包括第一屏、折叠屏和第一摄像头,上述折叠屏沿折叠边可折叠形成第二屏和第三屏,第一屏和第二屏的朝向相反,第一屏的朝向和第一摄像头的拍摄方向一致,第二屏的朝向和第二摄像头的拍摄方向一致。该电子设备可包括多个功能模块或单元,用于相应的执行第一方面所提供的显示方法。
例如,检测单元和显示单元。
检测单元,用于基于检测到的第二屏和第三屏的第一夹角,确定电子设备处于第一预设姿态;
显示单元,用于基于电子设备的第一预设姿态,在第一屏显示第一用户界面;第一用户界面的第一预览显示区用于显示第一摄像头采集的图像,第一预设姿态下,第一夹角不包括0°和180°;
检测单元,还用于基于检测到的第一夹角,确定电子设备处于第二预设姿态;
显示单元,还用于基于电子设备的第二预设姿态,在第二屏显示第二用户界面;第二用户界面的第二预览显示区用于显示第二摄像头采集的图像,第二预设姿态下,第一夹角不包括0°和180°。
在一种实现方式中,检测单元,还用于基于检测到的第一夹角,确定电子设备处于第三预设姿态;显示单元,还用于基于电子设备的第三预设姿态,在第二屏和第三屏进行分屏显示,第二屏的显示内容包括第三预览显示区,第三预览显示区用于显示第二摄像头采集的图像,第三预设姿态下,第一夹角不包括0°和180°。
在一种实现方式中,上述基于电子设备的第一预设姿态,在第一屏显示第一用户界面,包括:当检测到电子设备处于第一预设姿态,且电子设备满足第一预设条件时,启动第一摄像头采集图像,并在第一屏显示第一用户界面;其中,第一预设姿态包括第一夹角在第一预设范围内,第一预设条件包括以下一项或多项:第一夹角在当前夹角值的停顿时间达到第一预设时间;当第一屏处于点亮状态时,电子设备在第二预设时间内未接收到作用于第一屏的输入操作;在第一屏对应的摄像头采集的图像中,检测到人脸或预设用户的人脸;在第一屏对应的摄像头采集的图像中,检测到第一预设手势。
在一种实现方式中,上述基于电子设备的第二预设姿态,在第二屏显示第二用户界面,包括:当检测到电子设备处于第二预设姿态,且电子设备满足第二预设条件时,启动第二摄 像头采集图像,并在第二屏显示第二用户界面;其中,第二预设姿态包括第一夹角在第二预设范围内,第二预设条件包括以下一项或多项:第一夹角在当前夹角值的停顿时间达到第一预设时间;当第二屏处于点亮状态时,电子设备在第二预设时间内未接收到作用于第二屏的输入操作;在第二屏对应的摄像头采集的图像中,检测到人脸或预设用户的人脸;在第二屏对应的摄像头采集的图像中,检测到第二预设手势。
在一种实现方式中,上述基于电子设备的第三预设姿态,在第二屏和第三屏进行分屏显示,包括:当检测到电子设备处于第三预设姿态,且电子设备满足第三预设条件时,启动第二摄像头采集图像,并在第二屏和第三屏进行分屏显示;其中,第三预设姿态包括第一夹角在第三预设范围内,第三预设条件包括以下一项或多项:第一夹角在当前夹角值的停顿时间达到第一预设时间;当第二屏和/或第三屏处于点亮状态时,电子设备在第二预设时间内未接收到作用于第二屏或第三屏的输入操作;在第二屏对应的摄像头采集的图像中,检测到人脸或预设用户的人脸;在第二屏对应的摄像头采集的图像中,检测到第三预设手势。
第四方面,本申请提供了一种电子设备,该电子设备包括第一屏、折叠屏和第一摄像头,折叠屏沿折叠边可折叠形成第二屏和第三屏,第一屏和第二屏的朝向相反,第一屏的朝向和第一摄像头的拍摄方向一致。该电子设备可包括多个功能模块或单元,用于相应的执行第二方面所提供的显示方法。
例如,显示单元。
显示单元,用于当检测到电子设备处于第一预设姿态,且电子设备满足第一预设条件时,启动第一摄像头采集图像,并在第一屏显示第一用户界面,第一用户界面的第一预览显示区用于显示第一摄像头采集的图像;其中,第一预设姿态包括第二屏和第三屏的第一夹角在第一预设范围内,第一预设条件包括以下一项或多项:第一夹角在当前夹角值的停顿时间达到第一预设时间;当第一屏处于点亮状态时,电子设备在第二预设时间内未接收到作用于第一屏的输入操作;在第一屏对应的摄像头采集的图像中,检测到人脸或预设用户的人脸;在第一屏对应的摄像头采集的图像中,检测到第一预设手势。
本申请实施例中,上述第一屏对应的摄像头与第一摄像头可以为同一摄像头,也可以为不同摄像头。可选的,上述第一屏对应的摄像头为低功耗摄像头。
在一种实现方式中,电子设备还包括第二摄像头,第二屏的朝向和第二摄像头的拍摄方向一致,当折叠屏沿折叠边折叠形成第二屏和第三屏时,第三屏和第二摄像头位于折叠边的不同侧。显示单元还用于,当检测到电子设备处于第二预设姿态,且电子设备满足第二预设条件时,启动第二摄像头采集图像,并在第二屏显示第二用户界面,第二用户界面的第二预览显示区用于显示第二摄像头采集的图像;其中,第二预设姿态包括第一夹角在第二预设范围内,第二预设条件包括以下一项或多项:第一夹角在当前夹角值的停顿时间达到第一预设时间;当第二屏处于点亮状态时,电子设备在第二预设时间内未接收到作用于第二屏的输入操作;在第二屏对应的摄像头采集的图像中,检测到人脸或预设用户的人脸;在第二屏对应的摄像头采集的图像中,检测到第二预设手势。
本申请实施例中,上述第二屏对应的摄像头与第二摄像头可以为同一摄像头,也可以为不同摄像头。可选的,上述第二屏对应的摄像头为低功耗摄像头。
在一种实现方式中,电子设备还包括第二摄像头,第二屏的朝向和第二摄像头的拍摄方向一致,当折叠屏沿折叠边折叠形成第二屏和第三屏时,第三屏和第二摄像头位于折叠边的不同侧。显示单元还用于,当检测到电子设备处于第三预设姿态,且电子设备满足第三预设 条件时,启动第二摄像头采集图像,并在第二屏和第三屏进行分屏显示,第二屏的显示内容包括第三预览显示区,第三预览显示区用于显示第二摄像头采集的图像;其中,第三预设姿态包括第一夹角在第三预设范围内,第三预设条件包括以下一项或多项:第一夹角在当前夹角值的停顿时间达到第一预设时间;当第二屏和/或第三屏处于点亮状态时,电子设备在第二预设时间内未接收到作用于第二屏或第三屏的输入操作;在第二屏对应的摄像头采集的图像中,检测到人脸或预设用户的人脸;在第二屏对应的摄像头采集的图像中,检测到第三预设手势。
在一种实现方式中,第一预设姿态还包括:第三屏与水平面的第二夹角在第四预设范围内,第三屏朝向与地理坐标系的Z轴的夹角小于90°,第一预设范围内的角度大于0°且小于120°;或者,第一预设姿态还包括:第二夹角在第四预设范围内,第三屏的朝向与地理坐标系的Z轴的夹角大于90°,第一预设范围内的角度大于180°且小于300°;或者,第一预设姿态还包括:第三夹角与第二夹角的差值在第五预设范围内,第三夹角为第二屏与水平面的夹角,第三屏的朝向与地理坐标系的Z轴的夹角大于90°,第一预设范围内的角度大于0°且小于180°。
在一种实现方式中,第一预设姿态还包括:第三屏与水平面的第二夹角在第四预设范围内,第三屏朝向与地理坐标系的Z轴的夹角小于90°,第一预设范围内的角度小于120°;上述显示单元在第一屏显示第一用户界面,包括:显示单元在第一屏显示旋转180°后的第一用户界面。
在一种实现方式中,电子设备还包括识别单元,上述显示单元在第一屏显示第一用户界面之前,识别单元用于识别第一摄像头采集的图像中预设局部特征所在的第一区域,第一预览显示区用于显示第一区域内的图像的放大图像。
在一种实现方式中,电子设备还包括接收单元,上述显示单元在第一屏显示第一用户界面之后,接收单元用于接收用户的第一输入操作;显示单元还用于响应于第一输入操作,在第一用户界面中显示以下一个或多个界面元素:试妆控件、拍摄控件、摄像头切换控件、补光控件、相册控件、显示框;其中,试妆控件用于为第一预览显示区显示的图像中的人脸添加预设化妆效果;拍摄控件用于触发电子设备保存第一预览显示区显示的图像;摄像头切换控件用于切换采集图像的摄像头;补光控件用于补充环境光;相册控件用于触发电子设备显示相册应用的用户界面;显示框用于显示第一摄像头采集的图像中的预设局部特征的放大图像。
在一种实现方式中,第一摄像头为紫外线摄像头,第一摄像头采集的图像用于凸显涂抹防晒霜的区域。
在一种实现方式中,显示单元还用于,当检测到电子设备处于第一预设姿态,且电子设备满足第一预设条件时,控制第二屏和第三屏熄灭。
在一种实现方式中,第二预设姿态还包括:第三屏与水平面的第二夹角在第四预设范围内,第三屏的朝向与地理坐标系的Z轴的夹角小于90°,第二预设范围内的角度大于60°且小于180°;或者,第二预设姿态还包括:第二夹角在第四预设范围内,第三屏的朝向与地理坐标系的Z轴的夹角大于90°,第二预设范围内的角度大于240°且小于360°;或者,第二预设姿态还包括:第三夹角与第二夹角的差值在第五预设范围内,第三夹角为第二屏与水平面的夹角,第三屏的朝向与地理坐标系的Z轴的夹角大于90°,第二预设范围内的角度大于180°且小于360°。
在一种实现方式中,第一预设姿态和第二预设姿态下,第一夹角不包括0°和180°。
在一种实现方式中,显示单元还用于,当检测到电子设备处于第二预设姿态,且电子设备满足第二预设条件时,控制第一屏熄灭。
在一种实现方式中,第三预设姿态还包括:第三屏与水平面的第二夹角在第四预设范围内,第三屏的朝向与地理坐标系的Z轴的夹角小于90°,第三预设范围内的角度大于60°且小于180°。
在一种实现方式中,第三屏的显示内容包括第二摄像头采集的图像中的预设局部特征的放大图像。
在一种实现方式中,上述显示单元在第二屏和第三屏进行分屏显示,包括:显示单元在第二屏和第三显示屏对第三用户界面进行分屏显示;第二屏的显示内容包括第三用户界面的第三预览显示区,以及第三用户界面中除第三预览显示区之外的零个、一个或多个界面元素,第三屏的显示内容包括第三用户界面中除第二屏的显示内容之外的一个或多个界面元素;第三用户界面还包括以下一个或多个界面元素:试妆控件、拍摄控件、摄像头切换控件、补光控件、相册控件、显示框;其中,试妆控件用于为第三用户界面的预览显示区内的图像中的人脸添加预设化妆效果;拍摄控件用于触发电子设备保存第三用户界面的预览显示区内的图像;摄像头切换控件用于切换采集图像的摄像头;补光控件用于补充环境光;相册控件用于触发电子设备显示相册应用的用户界面;显示框用于显示第二摄像头采集的图像中的预设局部特征的放大图像。
第五方面,本申请提供了一种电子设备,包括一个或多个处理器和一个或多个存储器。该一个或多个存储器与一个或多个处理器耦合,一个或多个存储器用于存储计算机程序代码,计算机程序代码包括计算机指令,当一个或多个处理器执行计算机指令时,使得电子设备执行上述任一方面任一项可能的实现方式中的显示方法。
第六方面,本申请实施例提供了一种计算机存储介质,包括计算机指令,当计算机指令在电子设备上运行时,使得电子设备执行上述任一方面任一项可能的实现方式中的显示方法。
第七方面,本申请实施例提供了一种计算机程序产品,当计算机程序产品在计算机上运行时,使得计算机执行上述任一方面任一项可能的实现方式中的显示方法。
Brief Description of the Drawings
Fig. 1A to Fig. 1F are schematic diagrams of the product forms of a vertically folding electronic device provided by embodiments of this application;
Fig. 2A to Fig. 2F are schematic diagrams of the product forms of a horizontally folding electronic device provided by embodiments of this application;
Fig. 3A to Fig. 3C are schematic diagrams of the outer screen of a vertically folding electronic device provided by embodiments of this application;
Fig. 4A to Fig. 4C are schematic diagrams of the outer screen of a horizontally folding electronic device provided by embodiments of this application;
Fig. 5 is a schematic structural diagram of an electronic device provided by an embodiment of this application;
Fig. 6A is a schematic diagram of a geographic coordinate system provided by an embodiment of this application;
Fig. 6B is a schematic diagram of computing the included angle between screen A and screen B provided by an embodiment of this application;
Fig. 7A is a schematic diagram of computing the included angle between screen A and the horizontal plane provided by an embodiment of this application;
Fig. 7B is a schematic diagram of computing the included angle between screen B and the horizontal plane provided by an embodiment of this application;
Fig. 8A is a schematic diagram of a coordinate system of an electronic device provided by an embodiment of this application;
Fig. 8B is a schematic diagram of another coordinate system of an electronic device provided by an embodiment of this application;
Fig. 9A to Fig. 9F are schematic diagrams of six stand states of an electronic device provided by embodiments of this application;
Fig. 10A to Fig. 10C show display interfaces of screen C in a specific stand state provided by embodiments of this application;
Fig. 11A to Fig. 11F are a set of schematic user interfaces provided by embodiments of this application;
Fig. 12A to Fig. 12N are another set of schematic user interfaces provided by embodiments of this application;
Fig. 13A to Fig. 13F show display interfaces of screen A in a specific stand state provided by embodiments of this application;
Fig. 14A to Fig. 14B show display interfaces of the inner screen provided by embodiments of this application;
Fig. 15A to Fig. 15F show display interfaces of the inner screen in a specific stand state provided by embodiments of this application;
Fig. 16 is a schematic diagram of a software architecture of an electronic device provided by an embodiment of this application;
Fig. 17 is another schematic structural diagram of an electronic device provided by an embodiment of this application.
Detailed Description of the Embodiments
The technical solutions in the embodiments of this application are described clearly and thoroughly below with reference to the accompanying drawings. In the description of the embodiments of this application, unless otherwise stated, "/" means "or"; for example, A/B can mean A or B. "And/or" in the text merely describes an association relationship between associated objects and indicates that three relationships may exist; for example, "A and/or B" can mean: A alone, both A and B, or B alone. In addition, in the description of the embodiments of this application, "a plurality of" means two or more than two.
Hereinafter, the terms "first" and "second" are used for descriptive purposes only and shall not be understood as implying relative importance or implicitly indicating the number of the indicated technical features. Therefore, a feature qualified by "first" or "second" may explicitly or implicitly include one or more of that feature. In the description of the embodiments of this application, unless otherwise stated, "a plurality of" means two or more.
The term "user interface (UI)" in the following embodiments of this application is a medium interface for interaction and information exchange between an application or the operating system and the user; it converts between the internal form of information and a form the user can accept. A user interface is source code written in a specific computer language such as Java or the extensible markup language (XML); the interface source code is parsed and rendered on the electronic device and finally presented as content the user can recognize. A common form of user interface is the graphical user interface (GUI), which refers to a user interface, displayed graphically, that is related to computer operations. It may consist of visual interface elements such as text, icons, buttons, menus, tabs, text boxes, dialog boxes, status bars, navigation bars, and widgets displayed on the display screen of the electronic device.
The embodiments of this application provide a display method for a folding screen, which can be applied to an electronic device 100 with a vertically or horizontally folding screen. The folding screen of the electronic device 100 can be folded along a folding edge into at least two screens, for example screen A and screen B. Depending on the degree of folding, the folding screen can present multiple forms. In the embodiments of this application, the folding screen of the electronic device 100 can present an unfolded form, a forward half-folded form, and a forward folded form. Optionally, it can also present a reverse half-folded form and a reverse folded form.
下面对配置纵向折叠屏的电子设备100的各种形态进行介绍。
示例性的,图1A至图1F示出了本申请实施例提供的一种具有纵向折叠屏的电子设备100的产品形态示意图,纵向折叠屏的折叠边垂直于电子设备100的顶部边缘线(为便于描述,将顶部边缘线简称为顶边)和底部边缘线(为便于描述,将底部边缘线简称为底边)。
其中,图1A是纵向折叠屏的展开形态的示意图。图1A所示的纵向折叠屏可沿折叠边,按照图1A所示的方向11a和/或11b向内翻折,形成图1B和图1C所示的正向半折叠形态的A屏(即第二屏)和B屏(即第三屏)。其中,当纵向折叠屏被折叠分成A屏和B屏后,A屏可以与电子设备100上的前置摄像头在折叠边的同一侧。图1C所示的纵向折叠屏可沿折叠边,按照图1C所示的方向11a和11b继续向内翻折,可形成图1D所示的正向折叠形态的纵向折叠屏。如图1D所示,电子设备100的纵向折叠屏被完全正向折叠后,A屏和B屏相 对,对用户不可见。
在一些实施例中,图1A所示的纵向折叠屏还可沿折叠边向外翻折,形成图1E所示的反向半折叠形态的A屏和B屏。图1E所示的纵向折叠屏可沿折叠边,按照图1E所示的方向22a和22b继续向外翻折,形成图1F所示的反向折叠形态的纵向折叠屏。如图1F所示,电子设备100的纵向折叠屏被完全反向折叠后,A屏和B屏相背,电子设备100的背面(即A屏的背面和B屏背面)对用户不可见。
下面对配置横向折叠屏的电子设备100的各种形态进行介绍。
示例性的,图2A至图2F示出了本申请实施例提供的一种具有横向折叠屏的电子设备100的产品形态示意图,横向折叠屏的折叠边平行于电子设备100的顶边和底边。
其中,图2A是横向折叠屏的展开形态的示意图。图2A所示的横向折叠屏可沿折叠边,按照图2A所示的方向33a和/或33b向内翻折,形成图2B和图2C所示的正向半折叠形态的A屏和B屏。其中,当横向折叠屏被折叠分成A屏和B屏后,A屏可以与电子设备100上的前置摄像头在折叠边的同一侧。图2C所示的横向折叠屏可沿折叠边,按照图2C所示的方向33a和33b继续向内翻折,形成图2D所示的正向折叠形态的横向折叠屏。如图2D所示,电子设备100的横向折叠屏被完全正向折叠后,A屏和B屏相对,对用户不可见。
在一些实施例中,图2A所示的横向折叠屏还可沿折叠边向外翻折,形成图2E所示的反向半折叠形态的A屏和B屏。图2E所示的横向折叠屏可沿折叠边,按照图2E所示的方向44a和44b继续向外翻折,形成图2F所示的反向折叠形态的横向折叠屏。如图2F所示,电子设备100的横向折叠屏被完全反向折叠后,A屏和B屏相背,电子设备100的背面(即A屏的背面和B屏背面)对用户不可见。
本申请实施例提供的折叠屏(纵向折叠屏或者横向折叠屏)的A屏和/或B屏的背面还可以设置一个显示屏(C屏)。
其中,A屏和B屏组成的折叠屏为电子设备100的内屏,A屏、B屏和前置摄像头位于电子设备100的正面;C屏(即第二屏)为电子设备100的外屏,C屏和后置摄像头位于电子设备100的背面。
在本申请实施例中,C屏可以被称为第一屏,A屏可以被称为第二屏,B屏可以被称为第三屏;C屏设置于A屏的背面,C屏和A屏的朝向相反。C屏对应的后置摄像头可以被称为第一摄像头,A屏对应的前置摄像头可以被称为第二摄像头。可以理解,C屏的朝向与后置摄像头的拍摄方向一致,A屏的朝向与前置摄像头的拍摄方向一致。当电子设备的内屏配置的折叠屏沿折叠边折叠形成A屏和B屏时,B屏和A屏对应的前置摄像头(即第二摄像头)位于折叠边的不同侧。
示例性的,图3A至图3C示出了内屏配置为纵向折叠屏的电子设备100的外屏示意图。图4A至图4C示出了内屏配置为横向折叠屏的电子设备100的外屏示意图。
示例性的,如图3A和图3B所示,电子设备100的纵向折叠屏中的A屏的背面可设置一个C屏。如图4A和图4B所示,电子设备100的横向折叠屏中的A屏的背面可设置一个C屏。如图3B和图4B所示,内屏对应的折叠屏被完全正向折叠后,A屏和B屏对用户不可见,C屏位于A屏的背面,C屏对用户可见。其中,C屏可以与电子设备100的后置摄像头在折叠边的同一侧。
示例性的,如图3C所示,内屏的A屏和B屏的背面可设置一个可纵向折叠的C屏。示例性的,如图4C所示,内屏的A屏和B屏的背面可以设置一个可横向折叠的C屏。由图 3C和图4C可知,电子设备100的内屏(即A屏和B屏组成的折叠屏)处于展开形态时,外屏(即C屏)也处于展开形态;电子设备100的内屏被翻折时,外屏也随之被翻折;电子设备100的内屏处于折叠形态时,外屏也处于折叠形态。
可以理解,对于具有C屏的电子设备100而言,当内屏(即A屏和B屏组成的折叠屏)处于折叠形态时,电子设备100可以在C屏显示用户界面;当内屏处于半折叠形态和展开形态时,电子设备100可以在A屏、B屏和/或C屏显示用户界面。
在一些实施例中,电子设备100的折叠屏可以环绕电子设备100的四周,上述A屏、B屏、C屏可以都是该折叠屏的一部分。
本申请实施例中,电子设备100可以基于检测到的A屏和B屏的夹角α(第一夹角),确定内屏配置的折叠屏所处的形态。在一些实施例中,电子设备100的折叠屏(纵向折叠屏或横向折叠屏)的A屏和B屏的夹角α取值范围为[0°,180°],电子设备100不能进行反向折叠。在一些实施例中,电子设备100的折叠屏(纵向折叠屏或横向折叠屏)的A屏和B屏的夹角α取值范围为[0°,360°],电子设备100既能进行正向折叠,又能进行反向折叠。
需要说明的是,本申请实施例中,A屏和B屏的夹角α也可以被称为第一夹角。
示例性的,α取值范围为[0°,360°]。当夹角α∈[0°,P1),电子设备100可以确定折叠屏处于正向折叠形态;当夹角α∈[P1,P2),电子设备100可以确定折叠屏处于正向半折叠形态;当夹角α∈[P2,P3),电子设备100可以确定折叠屏处于展开形态;当α∈[P3,P4),电子设备100可以确定折叠屏处于反向半折叠形态;当夹角α∈[P4,360),电子设备100可以确定折叠屏处于反向折叠形态。其中,0°<P1<P2<180°<P3<P4<360°。P1、P2、P3和P4是预设角度阈值。P1、P2、P3和P4可以是用户在电子设备100中设定的,或者电子设备100默认设定的。
在一些实施例中,P1与0°的差值,P2与180°的差值,P3与180°的差值,以及P4与360°的差值,均为电子设备100或用户设定的预设误差值。例如,如预设误差值等于2°,P1、P2、P3和P4,分别为2°、178°、P3为172°、P4为358°。
在一些实施例中,P1、P2、P3和P4可以是根据用户对折叠屏的使用习惯确定的。示例性的,按照大多数用户的使用习惯,当A屏和B屏的夹角α小于50度时,用户意图不使用A屏或者B屏的可能性较高;当A屏和B屏的夹角α大于50°小于等于160°(或者α大于190°小于等于280°)时,用户意图使用A屏和B屏显示不同显示内容的可能性较高;当A屏和B屏的夹角α大于160°小于等于190°时,用户意图将A屏和B屏作为整体(即作为一个完整的显示屏)使用的可能性较高;当A屏和B屏的夹角大于280°小于等于360°,用户意图单独使用A屏或B屏的可能性较高。基于上述使用习惯,P1的取值范围可以为(0,40°],P2的取值范围可以为[160°,180°),P3的取值范围可以为[180°,190°),P4的取值范围可以为[280°,360°)。例如,P1为30°,P2为170°,P3为185°,P4为300°。
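上述基于阈值P1~P4划分折叠形态的判断逻辑,可以用如下Python草图示意(仅为示意性草图,并非本申请的实际实现;阈值取上文示例值30°、170°、185°、300°):

```python
# 示意性草图:根据A屏和B屏的夹角α(单位:度)与预设角度阈值P1~P4,
# 判断折叠屏所处的形态。阈值取值为上文示例值,实际可由用户或设备默认设定。
P1, P2, P3, P4 = 30.0, 170.0, 185.0, 300.0

def fold_state(alpha: float) -> str:
    """α取值范围为[0°, 360°],返回折叠屏形态。"""
    if 0 <= alpha < P1:
        return "正向折叠形态"
    if alpha < P2:
        return "正向半折叠形态"
    if alpha < P3:
        return "展开形态"
    if alpha < P4:
        return "反向半折叠形态"
    return "反向折叠形态"
```

例如,α=90°落在[P1,P2)内,判断为正向半折叠形态;α=350°落在[P4,360°]内,判断为反向折叠形态。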
需要说明的是,本申请实施例中的折叠屏被折叠后形成的至少两个显示屏,可以为独立存在的多个显示屏,也可以为一体结构的一个完整显示屏,只是被折叠形成了至少两部分。
例如,折叠屏可以是柔性折叠屏,柔性折叠屏包括采用柔性材质制作的折叠边。该柔性折叠屏的部分或全部采用柔性材质制作。柔性折叠屏被折叠后形成的至少两个屏是一体结构的一个完整屏,只是被折叠形成了至少两部分。
又例如,上述折叠屏可以为多屏折叠屏。该多屏折叠屏可包括多个(两个或两个以上)显示屏。这多个显示屏是多个单独的显示屏。这多个显示屏可依次通过折叠轴连接。每个屏 可以绕与其连接的折叠轴转动,实现多屏折叠屏的折叠。
本申请后续实施例中将以折叠屏是可以横向折叠的柔性折叠屏为例,对本申请实施例提供的显示方法进行说明。
下面结合附图对本申请实施例提供的电子设备100进行说明。
电子设备100可以是搭载iOS、Android、Microsoft或者其它操作系统的终端设备,例如,电子设备100可以是手机、平板电脑、桌面型计算机、膝上型计算机、手持计算机、笔记本电脑、超级移动个人计算机(ultra-mobile personal computer,UMPC)、上网本,以及蜂窝电话、个人数字助理(personal digital assistant,PDA)、增强现实(Augmented reality,AR)\虚拟现实(virtual reality,VR)设备等包括上述折叠屏的设备,本申请实施例对该电子设备100的具体类型不作特殊限制。
示例性的,图5示出了电子设备100的结构示意图。如图5所示,电子设备100可以包括:处理器110,外部存储器接口120,内部存储器121,通用串行总线(universal serial bus,USB)接口130,充电管理模块140,电源管理模块141,电池142,天线1,天线2,移动通信模块150,无线通信模块160,音频模块170,扬声器170A,受话器170B,麦克风170C,耳机接口170D,传感器模块180,按键190,马达191,指示器192,摄像头193,显示屏194,以及用户标识模块(subscriber identification module,SIM)卡接口195等。其中传感器模块180可以包括压力传感器180A,陀螺仪传感器180B,气压传感器180C,磁传感器180D,加速度传感器180E,距离传感器180F,接近光传感器180G,指纹传感器180H,温度传感器180J,触摸传感器180K,环境光传感器180L,骨传导传感器180M等。
可以理解的是,本发明实施例示意的结构并不构成对电子设备100的具体限定。在本申请另一些实施例中,电子设备100可以包括比图示更多或更少的部件,或者组合某些部件,或者拆分某些部件,或者不同的部件布置。图示的部件可以以硬件,软件或软件和硬件的组合实现。
处理器110可以包括一个或多个处理单元,例如:处理器110可以包括应用处理器(application processor,AP),调制解调处理器,图形处理器(graphics processing unit,GPU),图像信号处理器(image signal processor,ISP),控制器,存储器,视频编解码器,数字信号处理器(digital signal processor,DSP),基带处理器,和/或神经网络处理器(neural-network processing unit,NPU)等。其中,不同的处理单元可以是独立的器件,也可以集成在一个或多个处理器中。
其中,控制器可以是电子设备100的神经中枢和指挥中心。控制器可以根据指令操作码和时序信号,产生操作控制信号,完成取指令和执行指令的控制。
处理器110中还可以设置存储器,用于存储指令和数据。在一些实施例中,处理器110中的存储器为高速缓冲存储器。该存储器可以保存处理器110刚用过或循环使用的指令或数据。如果处理器110需要再次使用该指令或数据,可从所述存储器中直接调用。避免了重复存取,减少了处理器110的等待时间,因而提高了系统的效率。
在一些实施例中,处理器110可以包括一个或多个接口。接口可以包括集成电路(inter-integrated circuit,I2C)接口,集成电路内置音频(inter-integrated circuit sound,I2S)接口,脉冲编码调制(pulse code modulation,PCM)接口,通用异步收发传输器(universal asynchronous receiver/transmitter,UART)接口,移动产业处理器接口(mobile industry processor interface,MIPI),通用输入输出(general-purpose input/output,GPIO)接口,用户标识模块(subscriber  identity module,SIM)接口,和/或通用串行总线(universal serial bus,USB)接口等。
I2C接口是一种双向同步串行总线,包括一根串行数据线(serial data line,SDA)和一根串行时钟线(serial clock line,SCL)。在一些实施例中,处理器110可以包含多组I2C总线。处理器110可以通过不同的I2C总线接口分别耦合触摸传感器180K,充电器,闪光灯,摄像头193等。例如:处理器110可以通过I2C接口耦合触摸传感器180K,使处理器110与触摸传感器180K通过I2C总线接口通信,实现电子设备100的触摸功能。
I2S接口可以用于音频通信。在一些实施例中,处理器110可以包含多组I2S总线。处理器110可以通过I2S总线与音频模块170耦合,实现处理器110与音频模块170之间的通信。在一些实施例中,音频模块170可以通过I2S接口向无线通信模块160传递音频信号,实现通过蓝牙耳机接听电话的功能。
PCM接口也可以用于音频通信,将模拟信号抽样,量化和编码。在一些实施例中,音频模块170与无线通信模块160可以通过PCM总线接口耦合。在一些实施例中,音频模块170也可以通过PCM接口向无线通信模块160传递音频信号,实现通过蓝牙耳机接听电话的功能。所述I2S接口和所述PCM接口都可以用于音频通信。
UART接口是一种通用串行数据总线,用于异步通信。该总线可以为双向通信总线。它将要传输的数据在串行通信与并行通信之间转换。在一些实施例中,UART接口通常被用于连接处理器110与无线通信模块160。例如:处理器110通过UART接口与无线通信模块160中的蓝牙模块通信,实现蓝牙功能。在一些实施例中,音频模块170可以通过UART接口向无线通信模块160传递音频信号,实现通过蓝牙耳机播放音乐的功能。
MIPI接口可以被用于连接处理器110与显示屏194,摄像头193等外围器件。MIPI接口包括摄像头串行接口(camera serial interface,CSI),显示屏串行接口(display serial interface,DSI)等。在一些实施例中,处理器110和摄像头193通过CSI接口通信,实现电子设备100的拍摄功能。处理器110和显示屏194通过DSI接口通信,实现电子设备100的显示功能。
GPIO接口可以通过软件配置。GPIO接口可以被配置为控制信号,也可被配置为数据信号。在一些实施例中,GPIO接口可以用于连接处理器110与摄像头193,显示屏194,无线通信模块160,音频模块170,传感器模块180等。GPIO接口还可以被配置为I2C接口,I2S接口,UART接口,MIPI接口等。
USB接口130是符合USB标准规范的接口,具体可以是Mini USB接口,Micro USB接口,USB Type C接口等。USB接口130可以用于连接充电器为电子设备100充电,也可以用于电子设备100与外围设备之间传输数据。也可以用于连接耳机,通过耳机播放音频。该接口还可以用于连接其他电子设备,例如AR设备等。
可以理解的是,本发明实施例示意的各模块间的接口连接关系,只是示意性说明,并不构成对电子设备100的结构限定。在本申请另一些实施例中,电子设备100也可以采用上述实施例中不同的接口连接方式,或多种接口连接方式的组合。
充电管理模块140用于从充电器接收充电输入。其中,充电器可以是无线充电器,也可以是有线充电器。在一些有线充电的实施例中,充电管理模块140可以通过USB接口130接收有线充电器的充电输入。在一些无线充电的实施例中,充电管理模块140可以通过电子设备100的无线充电线圈接收无线充电输入。充电管理模块140为电池142充电的同时,还可以通过电源管理模块141为电子设备100供电。
电源管理模块141用于连接电池142,充电管理模块140与处理器110。电源管理模块 141接收电池142和/或充电管理模块140的输入,为处理器110,内部存储器121,外部存储器,显示屏194,摄像头193,和无线通信模块160等供电。电源管理模块141还可以用于监测电池容量,电池循环次数,电池健康状态(漏电,阻抗)等参数。在其他一些实施例中,电源管理模块141也可以设置于处理器110中。在另一些实施例中,电源管理模块141和充电管理模块140也可以设置于同一个器件中。
电子设备100的无线通信功能可以通过天线1,天线2,移动通信模块150,无线通信模块160,调制解调处理器以及基带处理器等实现。
天线1和天线2用于发射和接收电磁波信号。电子设备100中的每个天线可用于覆盖单个或多个通信频带。不同的天线还可以复用,以提高天线的利用率。例如:可以将天线1复用为无线局域网的分集天线。在另外一些实施例中,天线可以和调谐开关结合使用。
移动通信模块150可以提供应用在电子设备100上的包括2G/3G/4G/5G等无线通信的解决方案。移动通信模块150可以包括至少一个滤波器,开关,功率放大器,低噪声放大器(low noise amplifier,LNA)等。移动通信模块150可以由天线1接收电磁波,并对接收的电磁波进行滤波,放大等处理,传送至调制解调处理器进行解调。移动通信模块150还可以对经调制解调处理器调制后的信号放大,经天线1转为电磁波辐射出去。在一些实施例中,移动通信模块150的至少部分功能模块可以被设置于处理器110中。在一些实施例中,移动通信模块150的至少部分功能模块可以与处理器110的至少部分模块被设置在同一个器件中。
调制解调处理器可以包括调制器和解调器。其中,调制器用于将待发送的低频基带信号调制成中高频信号。解调器用于将接收的电磁波信号解调为低频基带信号。随后解调器将解调得到的低频基带信号传送至基带处理器处理。低频基带信号经基带处理器处理后,被传递给应用处理器。应用处理器通过音频设备(不限于扬声器170A,受话器170B等)输出声音信号,或通过显示屏194显示图像或视频。在一些实施例中,调制解调处理器可以是独立的器件。在另一些实施例中,调制解调处理器可以独立于处理器110,与移动通信模块150或其他功能模块设置在同一个器件中。
无线通信模块160可以提供应用在电子设备100上的包括无线局域网(wireless local area networks,WLAN)(如无线保真(wireless fidelity,Wi-Fi)网络),蓝牙(bluetooth,BT),全球导航卫星系统(global navigation satellite system,GNSS),调频(frequency modulation,FM),近距离无线通信技术(near field communication,NFC),红外技术(infrared,IR)等无线通信的解决方案。无线通信模块160可以是集成至少一个通信处理模块的一个或多个器件。无线通信模块160经由天线2接收电磁波,将电磁波信号调频以及滤波处理,将处理后的信号发送到处理器110。无线通信模块160还可以从处理器110接收待发送的信号,对其进行调频,放大,经天线2转为电磁波辐射出去。
在一些实施例中,电子设备100的天线1和移动通信模块150耦合,天线2和无线通信模块160耦合,使得电子设备100可以通过无线通信技术与网络以及其他设备通信。所述无线通信技术可以包括全球移动通讯系统(global system for mobile communications,GSM),通用分组无线服务(general packet radio service,GPRS),码分多址接入(code division multiple access,CDMA),宽带码分多址(wideband code division multiple access,WCDMA),时分码分多址(time-division code division multiple access,TD-SCDMA),长期演进(long term evolution,LTE),BT,GNSS,WLAN,NFC,FM,和/或IR技术等。所述GNSS可以包括全球卫星定位系统(global positioning system,GPS),全球导航卫星系统(global navigation satellite system, GLONASS),北斗卫星导航系统(beidou navigation satellite system,BDS),准天顶卫星系统(quasi-zenith satellite system,QZSS)和/或星基增强系统(satellite based augmentation systems,SBAS)。
电子设备100通过GPU,显示屏194,以及应用处理器等实现显示功能。GPU为图像处理的微处理器,连接显示屏194和应用处理器。GPU用于执行数学和几何计算,用于图形渲染。处理器110可包括一个或多个GPU,其执行程序指令以生成或改变显示信息。
显示屏194用于显示图像,视频等。显示屏194包括显示面板。显示面板可以采用液晶显示屏(liquid crystal display,LCD),有机发光二极管(organic light-emitting diode,OLED),有源矩阵有机发光二极体或主动矩阵有机发光二极体(active-matrix organic light emitting diode,AMOLED),柔性发光二极管(flex light-emitting diode,FLED),MiniLED,MicroLED,Micro-OLED,量子点发光二极管(quantum dot light emitting diodes,QLED)等。在一些实施例中,电子设备100可以包括1个或N个显示屏194,N为大于1的正整数。
电子设备100可以通过ISP,摄像头193,视频编解码器,GPU,显示屏194以及应用处理器等实现拍摄功能。
ISP用于处理摄像头193反馈的数据。例如,拍照时,打开快门,光线通过镜头被传递到摄像头感光元件上,光信号转换为电信号,摄像头感光元件将所述电信号传递给ISP处理,转化为肉眼可见的图像。ISP还可以对图像的噪点,亮度,肤色进行算法优化。ISP还可以对拍摄场景的曝光,色温等参数优化。在一些实施例中,ISP可以设置在摄像头193中。
摄像头193用于捕获静态图像或视频。物体通过镜头生成光学图像投射到感光元件。感光元件可以是电荷耦合器件(charge coupled device,CCD)或互补金属氧化物半导体(complementary metal-oxide-semiconductor,CMOS)光电晶体管。感光元件把光信号转换成电信号,之后将电信号传递给ISP转换成数字图像信号。ISP将数字图像信号输出到DSP加工处理。DSP将数字图像信号转换成标准的RGB,YUV等格式的图像信号。在一些实施例中,电子设备100可以包括1个或N个摄像头193,N为大于1的正整数。
数字信号处理器用于处理数字信号,除了可以处理数字图像信号,还可以处理其他数字信号。例如,当电子设备100在频点选择时,数字信号处理器用于对频点能量进行傅里叶变换等。
视频编解码器用于对数字视频压缩或解压缩。电子设备100可以支持一种或多种视频编解码器。这样,电子设备100可以播放或录制多种编码格式的视频,例如:动态图像专家组(moving picture experts group,MPEG)1,MPEG2,MPEG3,MPEG4等。
NPU为神经网络(neural-network,NN)计算处理器,通过借鉴生物神经网络结构,例如借鉴人脑神经元之间传递模式,对输入信息快速处理,还可以不断的自学习。通过NPU可以实现电子设备100的智能认知等应用,例如:图像识别,人脸识别,语音识别,文本理解等。
内部存储器121可以包括一个或多个随机存取存储器(random access memory,RAM)和一个或多个非易失性存储器(non-volatile memory,NVM)。
随机存取存储器可以包括静态随机存储器(static random-access memory,SRAM)、动态随机存储器(dynamic random access memory,DRAM)、同步动态随机存储器(synchronous dynamic random access memory,SDRAM)、双倍资料率同步动态随机存取存储器(double data rate synchronous dynamic random access memory,DDR SDRAM,例如第五代DDR SDRAM一般称为DDR5 SDRAM)等;非易失性存储器可以包括磁盘存储器件、快闪存储器(flash  memory)。
快闪存储器按照运作原理划分可以包括NOR FLASH、NAND FLASH、3D NAND FLASH等,按照存储单元电位阶数划分可以包括单阶存储单元(single-level cell,SLC)、多阶存储单元(multi-level cell,MLC)、三阶储存单元(triple-level cell,TLC)、四阶储存单元(quad-level cell,QLC)等,按照存储规范划分可以包括通用闪存存储(英文:universal flash storage,UFS)、嵌入式多媒体存储卡(embedded multi media Card,eMMC)等。
随机存取存储器可以由处理器110直接进行读写,可以用于存储操作系统或其他正在运行中的程序的可执行程序(例如机器指令),还可以用于存储用户及应用程序的数据等。
非易失性存储器也可以存储可执行程序和存储用户及应用程序的数据等,可以提前加载到随机存取存储器中,用于处理器110直接进行读写。
外部存储器接口120可以用于连接外部的非易失性存储器,实现扩展电子设备100的存储能力。外部的非易失性存储器通过外部存储器接口120与处理器110通信,实现数据存储功能。例如将音乐,视频等文件保存在外部的非易失性存储器中。
电子设备100可以通过音频模块170,扬声器170A,受话器170B,麦克风170C,耳机接口170D,以及应用处理器等实现音频功能。例如音乐播放,录音等。
音频模块170用于将数字音频信息转换成模拟音频信号输出,也用于将模拟音频输入转换为数字音频信号。音频模块170还可以用于对音频信号编码和解码。在一些实施例中,音频模块170可以设置于处理器110中,或将音频模块170的部分功能模块设置于处理器110中。
扬声器170A,也称“喇叭”,用于将音频电信号转换为声音信号。
受话器170B,也称“听筒”,用于将音频电信号转换成声音信号。
麦克风170C,也称“话筒”,“传声器”,用于将声音信号转换为电信号。
耳机接口170D用于连接有线耳机。
压力传感器180A用于感受压力信号,可以将压力信号转换成电信号。在一些实施例中,压力传感器180A可以设置于显示屏194。压力传感器180A的种类很多,如电阻式压力传感器,电感式压力传感器,电容式压力传感器等。
陀螺仪传感器180B可以用于确定电子设备100的运动姿态。在一些实施例中,可以通过陀螺仪传感器180B确定电子设备100围绕三个轴(即,x,y和z轴)的角速度。陀螺仪传感器180B可以用于拍摄防抖。示例性的,当按下快门,陀螺仪传感器180B检测电子设备100抖动的角度,根据角度计算出镜头模组需要补偿的距离,让镜头通过反向运动抵消电子设备100的抖动,实现防抖。陀螺仪传感器180B还可以用于导航,体感游戏场景。
需要说明的是,陀螺仪传感器的坐标系是地理坐标系。如图6A所示,地理坐标系的原点O位于运载体(即包含陀螺仪传感器的设备,如电子设备100)所在的点,X轴沿着当地纬线指向东(E),Y轴沿当地子午线指向北(N),Z轴沿当地地理垂线指向上,并与X轴和Y轴构成右手直角坐标系。其中,X轴与Y轴构成的平面即为当地水平面,Y轴与Z轴构成的平面即为当地子午面。因此,可以理解的是,陀螺仪传感器的坐标系是:以陀螺仪传感器为原点O,沿当地纬线指向东为X轴,沿当地子午线指向北为Y轴,沿当地地理垂线指向向上(即地理垂线的方向)为Z轴。
在本申请实施例中,电子设备100的显示屏194可折叠形成多个显示屏。每个屏中可设置有陀螺仪传感器180B,用于测量该显示屏的朝向(即垂直于该显示屏且从电子设备100的 内部指向外部的方向向量)。电子设备100可以根据陀螺仪传感器180B测量得到的每个显示屏的朝向变化,确定出相邻屏的夹角。
参考图1A至图2F,电子设备100的显示屏194经折叠可形成相邻的A屏和B屏。A屏设置有陀螺仪传感器A,电子设备100可以通过陀螺仪传感器A测量A屏的朝向;B屏设置有陀螺仪传感器B,电子设备100可以通过陀螺仪传感器B测量B屏的朝向。电子设备100根据测量得到的A屏和B屏的朝向变化,可以确定A屏和B屏的夹角α。下面对夹角α的获取原理进行具体说明。
示例性的,图6B示出了A屏和B屏的夹角α的示意图。如图6B所示,电子设备100利用陀螺仪传感器A测得A屏的朝向为向量z1,利用陀螺仪传感器B测得B屏的朝向为向量z2。其中,向量z1与A屏垂直,向量z2与B屏垂直。电子设备100利用如下公式(1),便可计算出向量z1与向量z2的夹角θ,进而电子设备100可以确定A屏与B屏的夹角α=180°-θ。

θ=arccos(z1·z2/(|z1||z2|))    公式(1)
需要说明的是,虽然A屏中的陀螺仪传感器A和B屏中的陀螺仪传感器B的位置并不重叠,即两个陀螺仪传感器的坐标系的原点并不重叠,但是,两个坐标系的两个X轴是平行的,两个Y轴是平行的,两个Z轴也是平行的。这样,虽然向量z1和向量z2是通过不同的陀螺仪传感器在不同坐标系下测量的,但是由于两个陀螺仪传感器的坐标系的各轴平行,电子设备100可通过上述公式(1)计算向量z1与向量z2的夹角θ。
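上述公式(1)以及α=180°-θ的计算过程,可以用如下Python草图示意(仅为示意性草图,并非本申请的实际实现;向量以三元组表示):

```python
import math

def angle_between(z1, z2):
    """公式(1):由两个向量的点积与模长计算夹角θ(单位:度)。"""
    dot = sum(a * b for a, b in zip(z1, z2))
    norm1 = math.sqrt(sum(a * a for a in z1))
    norm2 = math.sqrt(sum(b * b for b in z2))
    # 钳制余弦值,避免浮点误差导致acos定义域越界
    cos_theta = max(-1.0, min(1.0, dot / (norm1 * norm2)))
    return math.degrees(math.acos(cos_theta))

def hinge_angle(z1, z2):
    """由A屏朝向向量z1和B屏朝向向量z2计算A屏与B屏的夹角α=180°-θ。"""
    return 180.0 - angle_between(z1, z2)
```

例如,两屏朝向相同(展开形态)时θ=0°、α=180°;两屏朝向相对(完全正向折叠)时θ=180°、α=0°。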
在一些实施例中,电子设备100通过一或多个加速度传感器测量折叠屏的相邻屏的夹角,例如A屏与B屏的夹角α。例如,折叠屏的每个显示屏中均可设置一个加速度传感器。电子设备100(如处理器110)可利用加速度传感器测量每个显示屏被转动时的运动加速度;然后根据测量得到的运动加速度计算一个显示屏相对于另一个显示屏转动的角度,例如A屏与B屏的夹角α。
在一些实施例中,上述陀螺仪传感器可以是由其他多个传感器配合形成的虚拟陀螺仪传感器,该虚拟陀螺仪传感器可用于计算折叠屏的相邻屏的夹角,例如A屏与B屏的夹角α。
在另一些实施例中,电子设备100的折叠部位(例如转轴上)安装有角度传感器,电子设备100可以通过该角度传感器测量折叠屏的相邻屏的夹角,例如A屏和B屏所成夹角α。
本申请实施例中,电子设备100还可以通过上述陀螺仪传感器A测量A屏与水平面的夹角β1,通过上述陀螺仪传感器B测量B屏与水平面的夹角β2。
示例性的,图7A示出了电子设备100的A屏的陀螺仪传感器A的坐标系。其中,X轴和Y轴构成的XOY平面即为当地水平面,Y轴和Z轴构成的平面即为当地子午面。电子设备100的A屏在陀螺仪传感器A的坐标系中的朝向为向量z1。设向量z1在该坐标系中的Z轴分量为z1z,则向量z1和XOY平面(即水平面)的夹角γ1存在以下关系:sinγ1=z1z/|z1|。其中,电子设备100的A屏与向量z1垂直,电子设备100的A屏所在平面与水平面的夹角β1和夹角γ1互余,即β1+γ1=90°。由此可知,夹角β1与向量z1存在以下关系:

β1=90°-arcsin(z1z/|z1|)    公式(2)

其中,|z1|为向量z1的模。
示例性的,图7B示出了电子设备100的B屏的陀螺仪传感器B的坐标系。其中,X轴和Y轴构成的XOY平面即为当地水平面,Y轴和Z轴构成的平面即为当地子午面。电子设备100的B屏在陀螺仪传感器B的坐标系中的朝向为向量z2。设向量z2在该坐标系中的Z轴分量为z2z,则向量z2和XOY平面(即水平面)的夹角γ2存在以下关系:sinγ2=z2z/|z2|。其中,电子设备100的B屏与向量z2垂直,电子设备100的B屏所在平面与水平面的夹角β2和夹角γ2互余,即β2+γ2=90°。由此可知,夹角β2与向量z2存在以下关系:

β2=90°-arcsin(z2z/|z2|)    公式(3)

其中,|z2|为向量z2的模。
综上所述,电子设备100利用陀螺仪传感器A测量得到A屏的朝向对应的向量z1后,利用上述公式(2)即可确定A屏与水平面的夹角β1;利用陀螺仪传感器B测量得到B屏的朝向对应的向量z2后,利用上述公式(3)即可确定B屏与水平面的夹角β2。
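上述由屏幕朝向向量求屏幕与水平面夹角β的计算,可以用如下Python草图示意(仅为示意性草图,并非本申请的实际实现;向量以陀螺仪坐标系下的三元组(x, y, z)表示):

```python
import math

def screen_horizontal_angle(v):
    """由屏幕朝向向量v=(x, y, z)计算屏幕与水平面的夹角β(单位:度)。
    向量与XOY水平面的夹角γ满足 sinγ = z分量/|v|,而β与γ互余,β = 90° - γ。"""
    x, y, z = v
    norm = math.sqrt(x * x + y * y + z * z)
    gamma = math.degrees(math.asin(z / norm))
    return 90.0 - gamma
```

例如,屏幕水平放置且朝向竖直向上时v=(0,0,1),γ=90°、β=0°;屏幕竖直放置时z分量为0,γ=0°、β=90°。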
气压传感器180C用于测量气压。在一些实施例中,电子设备100通过气压传感器180C测得的气压值计算海拔高度,辅助定位和导航。
加速度传感器180E可检测电子设备100在各个方向上(一般为三轴)加速度的大小。当电子设备100静止时可检测出重力的大小及方向。还可以用于识别电子设备100的姿态,应用于横竖屏切换,计步器等应用。
距离传感器180F,用于测量距离。电子设备100可以通过红外或激光测量距离。在一些实施例中,拍摄场景,电子设备100可以利用距离传感器180F测距以实现快速对焦。
接近光传感器180G可以包括例如发光二极管(LED)和光检测器,例如光电二极管。发光二极管可以是红外发光二极管。电子设备100通过发光二极管向外发射红外光。
环境光传感器180L用于感知环境光亮度。电子设备100可以根据感知的环境光亮度自适应调节显示屏194亮度。
指纹传感器180H用于采集指纹。电子设备100可以利用采集的指纹特性实现指纹解锁,访问应用锁,指纹拍照,指纹接听来电等。
温度传感器180J用于检测温度。在一些实施例中,电子设备100利用温度传感器180J检测的温度,执行温度处理策略。
触摸传感器180K,也称“触控器件”。触摸传感器180K可以设置于显示屏194,由触摸传感器180K与显示屏194组成触摸屏,也称“触控屏”。触摸传感器180K用于检测作用于其上或附近的触摸操作。触摸传感器可以将检测到的触摸操作传递给应用处理器,以确定触摸事件类型。可以通过显示屏194提供与触摸操作相关的视觉输出。在另一些实施例中,触摸传感器180K也可以设置于电子设备100的表面,与显示屏194所处的位置不同。
骨传导传感器180M可以获取振动信号。
按键190包括开机键,音量键等。按键190可以是机械按键。也可以是触摸式按键。
马达191可以产生振动提示。
指示器192可以是指示灯,可以用于指示充电状态,电量变化,消息,通知等。
SIM卡接口195用于连接SIM卡。
下面对本申请实施例涉及的电子设备100的C屏的显示方向进行介绍。
本申请实施例中,电子设备100可以根据检测到的C屏的物理姿态,确定C屏对应的用户界面1的显示方向。
示例性的,参考图8A所示的横向折叠的电子设备100和图8B所示的纵向折叠的电子设备100,C屏的4个侧边包括第一边、第二边、第三边和第四边,C屏的第一边和第二边平行于电子设备100的折叠边。C屏对应的用户界面1的显示方向包括显示方向1、显示方向2、显示方向3、显示方向4中的一个或多个。其中,
显示方向1指:用户界面1的顶边和底边平行于第一边,且相比于用户界面1的底边,用户界面1的顶边更靠近第一边。显示方向2指:用户界面1的顶边和底边平行于第一边,且相比于用户界面1的顶边,用户界面1的底边更靠近第一边。显示方向3指:用户界面1的两个侧边(左侧边和右侧边)平行于第一边,且相比于用户界面1的左侧边,用户界面1的右侧边更靠近第一边。显示方向4指:用户界面1的两个侧边(左侧边和右侧边)平行于第一边,且相比于用户界面1的右侧边,用户界面1的左侧边更靠近第一边。
本申请实施例中,电子设备100的物理姿态可以包括:第一物理姿态、第二物理姿态、第三物理姿态和第四物理姿态。
在一些实施例中,当检测电子设备100处于第一物理姿态时,控制C屏对应的用户界面1的显示方向为显示方向1;当检测电子设备100处于第二物理姿态时,控制C屏对应的用户界面1的显示方向为显示方向2;当检测电子设备100处于第三物理姿态时,控制C屏对应的用户界面1的显示方向为显示方向3;当检测电子设备100处于第四物理姿态时,控制C屏对应的用户界面1的显示方向为显示方向4。
在一些实施例中,参考图8A所示的横向折叠的电子设备100,电子设备100的C屏的默认显示方向为显示方向1。当电子设备100处于第一物理姿态时,电子设备100将C屏对应的用户界面1按C屏的默认显示方向进行显示。当电子设备100处于第二物理姿态时,电子设备100将C屏对应的用户界面旋转180°后再显示。
在一些实施例中,参考图8A所示的横向折叠的电子设备100,电子设备100的C屏的默认显示方向为显示方向2。当电子设备100处于第二物理姿态时,电子设备100将C屏的用户界面按C屏的默认显示方向进行显示;当电子设备100处于第一物理姿态时,电子设备100将C屏的用户界面旋转180°后再显示。
在一些实施例中,参考图8B所示的纵向折叠的电子设备100,电子设备100的C屏的默认显示方向为显示方向3。当电子设备100处于第三物理姿态时,电子设备100将C屏对应的用户界面1按C屏的默认显示方向进行显示。当电子设备100处于第一物理姿态时,电子设备100将C屏对应的用户界面旋转90°后再显示。当电子设备100处于第四物理姿态时,电子设备100将C屏对应的用户界面旋转180°后再显示。当电子设备100处于第二物理姿态时,电子设备100将C屏对应的用户界面旋转270°(或-90°)后再显示。
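以默认显示方向为显示方向3的C屏(纵向折叠的电子设备100)为例,上述物理姿态与用户界面旋转角度的对应关系可以用如下Python草图示意(仅为示意性草图,并非本申请的实际实现):

```python
# 示意性草图:C屏默认显示方向为显示方向3时,
# 检测到的物理姿态与用户界面需要旋转的角度(度)的映射关系。
ROTATION_FOR_POSTURE = {
    "第三物理姿态": 0,    # 按默认显示方向显示
    "第一物理姿态": 90,
    "第四物理姿态": 180,
    "第二物理姿态": 270,  # 即-90°
}

def ui_rotation(posture: str) -> int:
    """返回指定物理姿态下,C屏用户界面相对默认显示方向的旋转角度。"""
    return ROTATION_FOR_POSTURE[posture]
```

对于其他默认显示方向(例如显示方向1或显示方向2),仅需按照上文各实施例替换映射表中的旋转角度即可。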
这样,通过上述实施例,C屏对应的用户界面的显示方向可以自适应地随着电子设备100的物理姿态进行调整,以便于用户查看,有效提升用户体验。类似的,本申请实施例中,A屏、B屏和内屏对应的用户界面的显示方向也可以自适应地随着电子设备100的物理姿态进行调整。可以理解,A屏、B屏和内屏对应的默认显示方向通常均为:用户界面的顶边平行于电子设备100的顶边,且相对于用户界面的底边,用户界面的顶边更靠近电子设备100的顶边。
下面介绍如何确定C屏处于哪个物理姿态。
本申请实施例中,C屏、A屏和陀螺仪传感器A设置于折叠边的同一侧,陀螺仪传感器A可以检测A屏的物理姿态,也可以用于检测C屏的物理姿态。A屏和C屏可以共用一套电子设备100的坐标系,也可以不共用一套电子设备100的坐标系,下面以A屏和C屏共用一套电子设备100的坐标系为例进行说明。
示例性的,图8A和图8B示出了电子设备100的A屏(或C屏)的三轴坐标系。如图 8A和图8B所示的,A屏面向用户时,A屏的三轴坐标系的x1轴垂直于A屏的左侧边,且从A屏的左侧边指向A屏的右侧边;A屏的三轴坐标系的y1轴垂直于A屏的底边,且从A屏的底边指向A屏的顶边;A屏的三轴坐标系的z1轴用于指示前述A屏的朝向,z1轴垂直于A屏。
在一些实施例中,参考图8A所示的横向折叠的电子设备100,陀螺仪传感器A可以检测到y1轴对应的向量y1。当y1轴在当地子午面(即地理坐标系的YOZ平面)的投影与地理坐标系的Z轴的夹角β4在预设范围11内时,电子设备100确定C屏处于第一物理姿态。当夹角β4在预设范围12内时,电子设备100确定C屏处于第三物理姿态。当夹角β4在预设范围13内时,电子设备100确定C屏处于第二物理姿态。当夹角β4在预设范围14内时,电子设备100确定C屏处于第四物理姿态。例如,预设范围11为[-45°,45°),预设范围12为[45°,135°),预设范围13为[135°,225°),预设范围14为[-135°,-45°)。
在一些实施例中,参考图8B所示的纵向折叠的电子设备100,陀螺仪传感器A可以检测到y1轴对应的向量y1。当y1轴在当地子午面(即地理坐标系的YOZ平面)的投影与地理坐标系的Z轴的夹角β4在预设范围11内时,电子设备100确定C屏处于第三物理姿态。当夹角β4在预设范围12内时,电子设备100确定C屏处于第二物理姿态。当夹角β4在预设范围13内时,电子设备100确定C屏处于第四物理姿态。当夹角β4在预设范围14内时,电子设备100确定C屏处于第一物理姿态。
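以横向折叠的电子设备100为例,根据夹角β4判断C屏物理姿态的逻辑可以用如下Python草图示意(仅为示意性草图,并非本申请的实际实现;预设范围11~14取上文示例值):

```python
# 示意性草图:横向折叠时,根据y1轴在当地子午面的投影与地理坐标系Z轴的
# 夹角β4(度)判断C屏所处的物理姿态。
# 预设范围11为[-45°,45°),12为[45°,135°),13为[135°,225°),14为[-135°,-45°)。
def posture_from_beta4(beta4: float) -> str:
    if -45 <= beta4 < 45:
        return "第一物理姿态"
    if 45 <= beta4 < 135:
        return "第三物理姿态"
    if 135 <= beta4 < 225:
        return "第二物理姿态"
    if -135 <= beta4 < -45:
        return "第四物理姿态"
    raise ValueError("beta4超出预设范围")
```

对于纵向折叠的电子设备100,仅需按照上文对应实施例替换各预设范围所映射的物理姿态即可。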
不限于上述确定电子设备100的物理姿态的实现方式,本申请实施例还可以通过其他实现方式确定C屏处于哪个物理姿态,此处不做具体限定。例如电子设备100利用陀螺仪传感器和加速度传感器检测到C屏的俯仰角(即C屏绕x1轴旋转的角度)、滚转角(即C屏绕y1轴旋转的角度)和偏航角(即C屏绕z1轴旋转的角度),进而可以根据C屏的俯仰角、滚转角和偏航角确定C屏处于哪个物理姿态。
参考图8B所示的纵向折叠的电子设备100,按照用户的使用习惯,通常该形态的电子设备100的默认显示方向为显示方向3。参考图8A所示的横向折叠的电子设备100,按照用户的使用习惯,通常该形态的电子设备100的默认显示方向为显示方向2。需要说明的是,后续实施例提供的显示方法,以C屏的默认显示方向为显示方向2的横向折叠的电子设备100为例进行说明,其他默认显示方向的横向折叠的电子设备100以及纵向折叠的电子设备100同样也适用于本申请实施例提供的显示方法。
参见图9A至图9F,下面对本申请实施例中的电子设备100涉及的几种支架状态进行介绍。
在一些实施例中,当电子设备100检测到B屏与水平面的夹角β2在预设范围15(即[0,f1],例如f1=10°)内,B屏对应的向量z2与地理坐标系的Z轴的夹角β5小于90度,且A屏和B屏的夹角α在预设范围16(即[f2,f3])内时,确定电子设备100处于第一支架状态。其中,0°<f2<f3≤90°,例如f2=20°,f3=90°。示例性的,图9A示出了电子设备100的一种第一支架状态的示意图。
在一些实施例中,当电子设备100检测到B屏与水平面的夹角β2在预设范围15(即[0,f1],例如f1=10°)内,B屏对应的向量z2与地理坐标系的Z轴的夹角β5小于90度,且A屏和B屏的夹角α在预设范围17(即[f4,f5])内时,确定电子设备100处于第二支架状态。其中,90°≤f4<f5<180°,例如f4=90°,f5=160°。示例性的,图9C示出了电子设备100的一种第二支架状态的示意图。
可以理解,参见图9A和图9C,第一支架状态和第二支架状态下,电子设备100的B屏水平(或接近水平)向上放置,且电子设备100处于正向半折叠状态。
在一些实施例中,当电子设备100检测到B屏与水平面的夹角β2在预设范围15内,B屏对应的向量z2与地理坐标系的Z轴的夹角β5大于90度,且A屏和B屏的夹角α在预设范围18(即[f6,f7])内时,确定电子设备100处于第三支架状态。其中,180°<f6<f7≤270°,例如,f6=200°,f7=270°。示例性的,图9D示出了电子设备100的一种第三支架状态的示意图。
在一些实施例中,当电子设备100检测到B屏与水平面的夹角β2在预设范围15内,B屏对应的向量z2与地理坐标系的Z轴的夹角β5大于90度,且A屏和B屏的夹角α在预设范围19(即[f8,f9])内时,确定电子设备100处于第四支架状态。其中,270°≤f8<f9<360°,例如,f8=270°,f9=340°。示例性的,图9B示出了电子设备100的一种第四支架状态的示意图。
可以理解,参见图9D和图9B,第三支架状态下和第四支架状态下,B屏水平(或接近水平)向下放置,且电子设备100处于反向半折叠状态。
需要说明的是,第一支架状态、第二支架状态、第三支架状态和第四支架状态下,C屏处于第一物理姿态,C屏对应的用户界面的显示方向为显示方向1。
在一些实施例中,当电子设备100检测到A屏与水平面的夹角β1与夹角β2的差值在预设范围20(即[0°,f10],例如f10=5°)内,B屏对应的向量z2与地理坐标系的Z轴的夹角β5大于90度,且A屏和B屏的夹角α在预设范围21(即[f11,f12])内时,电子设备100处于第五支架状态。其中,0°<f11<f12<180°,例如f11=30°,f12=150°。示例性的,图9E示出了电子设备100的一种第五支架状态的示意图。可以理解,第五支架状态下,电子设备100的顶边和底边组成的平面为水平面或接近水平面,电子设备100处于正向半折叠状态。
在一些实施例中,当电子设备100检测到A屏与水平面的夹角β1与夹角β2的差值在预设范围20内,B屏对应的向量z2与地理坐标系的Z轴的夹角β5小于90度,且A屏和B屏的夹角α在预设范围22(即[f13,f14])内时,电子设备100处于第六支架状态。其中,180°<f13<f14<360°,例如f13=210°,f14=330°。示例性的,图9F示出了电子设备100的一种第六支架状态的示意图。可以理解,第六支架状态下,电子设备100的顶边和底边组成的平面为水平面或接近水平面,电子设备100处于反向半折叠状态。
需要说明的是,第五支架状态和第六支架状态下,C屏处于第二物理姿态,C屏对应的用户界面的显示方向为显示方向2。
上述六种支架状态下,无需为电子设备100安装额外的支架装置(例如手机支架),在解放双手的情况下,用户即可更方便地查看显示屏上显示的内容。
示例性的,将电子设备100放置于第一支架状态,可以便于用户在解放双手的情况下查看C屏的显示内容;将电子设备100放置于第二支架状态下,可以便于用户在解放双手的情况下查看A屏和B屏的显示内容;将电子设备100放置于第三支架状态下,可以便于用户在解放双手的情况下查看C屏的显示内容。将电子设备100放置于第四支架状态下,可以便于用户在解放双手的情况下查看A屏的显示内容。将电子设备100放置于第五支架状态下,可以便于用户在解放双手的情况下查看C屏的显示内容。将电子设备100放置于第六支架状态下,可以便于用户在解放双手的情况下查看A屏或B屏的显示内容。
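上述六种支架状态的判断逻辑,可以用如下Python草图示意(仅为示意性草图,并非本申请的实际实现;各阈值均取上文示例值,α区间边界的归属可按实施例调整):

```python
# 示意性草图:根据α(A屏与B屏的夹角)、β1(A屏与水平面的夹角)、
# β2(B屏与水平面的夹角)、β5(B屏朝向向量z2与地理坐标系Z轴的夹角)
# 判断电子设备100所处的支架状态。角度单位均为度。
def stand_state(alpha, beta1, beta2, beta5):
    b_flat = 0 <= beta2 <= 10             # B屏水平或接近水平(预设范围15,f1=10°)
    edges_flat = abs(beta1 - beta2) <= 5  # 顶边和底边组成的平面接近水平(预设范围20,f10=5°)
    if b_flat and beta5 < 90:             # B屏水平向上
        if 20 <= alpha <= 90:
            return "第一支架状态"
        if 90 < alpha <= 160:
            return "第二支架状态"
    if b_flat and beta5 > 90:             # B屏水平向下
        if 200 <= alpha <= 270:
            return "第三支架状态"
        if 270 < alpha <= 340:
            return "第四支架状态"
    if edges_flat and beta5 > 90 and 30 <= alpha <= 150:
        return "第五支架状态"
    if edges_flat and beta5 < 90 and 210 <= alpha <= 330:
        return "第六支架状态"
    return "非支架状态"
```

例如,B屏水平向上(β2=5°、β5=45°)且α=60°时判断为第一支架状态;两屏对称立起(β1=β2=70°)、β5=120°且α=90°时判断为第五支架状态。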
本申请实施例提供的显示方法中,电子设备100检测到电子设备100处于预设姿态,且满足预设条件时,电子设备100启动上述预设姿态对应的显示屏1所对应的摄像头1采集图像,并将采集到的图像显示在显示屏1。可以理解,电子设备100的预设姿态,可以便于用户查看与上述预设姿态对应的显示屏1(即A屏、B屏和C屏中的至少一个显示屏)的显示内容。例如,显示屏1包括C屏(即第一屏),C屏对应后置摄像头(即第一摄像头)。例如,显示屏1包括A屏(即第二屏),A屏对应前置摄像头(即第二摄像头)。实施本申请实施例,用户将电子设备100放置于预设姿态后,用户可以在解放双手的情况下启动上述预设姿态对应的摄像头进行自拍,并通过上述预设姿态对应的显示屏进行实时预览,避免了繁琐的用户操作,有效提高了用户体验。
基于前述实施例对电子设备100的硬件结构、显示方向和支架状态的介绍,下面结合附图对本申请实施例提供的显示方法进行详细介绍。
下面以显示屏1为C屏(即第一屏)为例,对本申请实施例提供的显示方法进行介绍。
在一些实施例中,当检测到电子设备100处于第一预设姿态,且电子设备100满足第一预设条件时,电子设备100在C屏显示用户界面11。其中,第一预设姿态包括:A屏和B屏的夹角α在第一预设范围内。可选的,电子设备100还控制A屏和/或B屏熄灭。
需要说明的是,控制A屏(或B屏)熄灭指:若A屏处于熄灭状态,则保持熄灭状态;若A屏处于点亮状态,则控制A屏切换至熄灭状态。在C屏显示用户界面11指:若C屏处于熄灭状态,则点亮C屏并显示用户界面11;若C屏正在显示用户界面12,则切换C屏的显示内容为用户界面11。在一种实现方式中,电子设备100控制外屏(即C屏)点亮时也控制内屏(即A屏和B屏)熄灭,电子设备100控制内屏点亮时也控制外屏熄灭,这样,可以有效节省能耗。
在一些实施例中,电子设备100控制C屏显示用户界面11后,自动启动C屏对应的后置的低功耗摄像头的手势检测服务,利用上述低功耗摄像头实时检测用户的隔空手势操作。响应于检测到的隔空手势操作,电子设备100可以执行上述隔空手势操作对应的响应事件。这样,电子设备100控制C屏显示用户界面11后,可以继续解放用户双手,进一步实现无接触的隔空交互。
在一些实施例中,若当前C屏处于熄灭状态,用户界面11可以为C屏熄灭前最近显示的用户界面。在一些实施例中,用户界面11为C屏对应的初始界面。在一些实施例中,用户界面11为内屏(即A屏和B屏组成显示屏)最近显示的用户界面,这样,可以实现外屏对内屏的接续显示。
在一些实施例中,电子设备100显示用户界面11前,启动C屏对应的后置摄像头(即第一摄像头)采集图像,用户界面11的预览显示区用于显示后置摄像头(即第一摄像头)实时采集的图像。可以理解,后置摄像头实时采集的图像为用户界面11中的预览图像。需要说明的是,C屏对应多个后置摄像头时,用户界面11包括:电子设备100通过上述多个后置摄像头中的一个或多个摄像头采集的图像。可选的,通过上述多个后置摄像头中的像素数最高的摄像头采集的图像。可选的,通过上述多个后置摄像头中的微距摄像头采集的图像。
需要说明的是,本申请实施例中,C屏显示的用户界面11可以被称为第一用户界面,C屏显示的用户界面11的预览显示区可以被称为第一预览显示区。
需要说明的是,本申请实施例中,C屏对应的后置的低功耗摄像头和第一摄像头可以为 同一摄像头,也可以为不同的摄像头,此处不做具体限定。
下面对上述第一预设条件进行介绍。
在一些实施例中,第一预设条件包括电子设备100在当前夹角值的停顿时间达到第一预设时间。例如第一预设时间为3s。
在一些实施例中,当C屏处于点亮状态时,上述第一预设条件还包括电子设备100在第二预设时间内未接收到用户作用于C屏的输入操作。例如,第二预设时间为2s。
在一些实施例中,上述第一预设条件还包括电子设备100通过C屏对应的后置摄像头检测到人脸(或预设用户的人脸)。具体的,电子设备100开启C屏对应的低功耗摄像头的人脸检测服务,利用人脸识别算法检测上述低功耗摄像头采集的图像中是否包括人脸(或预设用户的人脸)。可选的,电子设备100开机后,电子设备100即开启C屏对应的低功耗摄像头的人脸检测服务。可选的,当电子设备100处于第一预设姿态时,电子设备100才开启C屏对应的低功耗摄像头的人脸检测服务。可选的,当电子设备100处于第一预设姿态且满足第一预设条件内包含的其他条件时,电子设备100才开启C屏对应的低功耗摄像头的人脸检测服务。
在一些实施例中,上述第一预设条件还包括电子设备100通过C屏对应的后置摄像头检测到第一预设手势。第一预设手势用于在电子设备100处于第一预设姿态时,触发C屏显示用户界面11。具体的,电子设备100开启C屏对应的低功耗摄像头的手势检测服务,利用手势识别算法检测上述低功耗摄像头采集的图像中是否包括预设手势。可选的,电子设备100开机后,电子设备100即开启C屏对应的低功耗摄像头的手势检测服务。可选的,当电子设备100处于第一预设姿态时,电子设备100才开启C屏对应的低功耗摄像头的手势检测服务。可选的,当电子设备100处于第一预设姿态且满足第一预设条件内包含的其他条件时,电子设备100才开启C屏对应的低功耗摄像头的手势检测服务。
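上述第一预设条件所包含的各项判断(停顿时长、无输入时长、人脸检测、预设手势检测)的组合逻辑,可以用如下Python草图示意(仅为示意性草图,并非本申请的实际实现;时间阈值取上文示例值3s和2s,具体条件组合可按实施例配置):

```python
# 示意性草图:判断电子设备100是否满足第一预设条件。
# dwell_s:当前夹角值的停顿时长;screen_on:C屏是否处于点亮状态;
# idle_s:C屏点亮时距上次用户输入操作的时长;
# face_detected / gesture_detected:低功耗摄像头的检测结果(可选条件)。
def meets_first_condition(dwell_s, screen_on, idle_s,
                          face_detected=True, gesture_detected=True):
    if dwell_s < 3.0:               # 第一预设时间(示例值3s)
        return False
    if screen_on and idle_s < 2.0:  # 第二预设时间(示例值2s)内有输入则不触发
        return False
    return face_detected and gesture_detected
```

例如,夹角停顿3.5s且C屏熄灭时条件满足;C屏点亮但1s前刚有触控输入时条件不满足。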
在一些实施例中,第一预设范围包括:预设范围16(即[f2,f3])、预设范围18(即[f6,f7])和预设范围21(即[f11,f12])中的至少一个。
下面对上述第一预设姿态进行介绍。
在一些实施例中,第一预设姿态具体包括:A屏和B屏的夹角α减小至(和/或增大至)α1,α1在第一预设范围内。
在一些实施例中,第一预设姿态下,第一预设范围不包括0°和180°。
在一些实施例中,第一预设姿态还包括电子设备100处于第一支架状态。该第一预设姿态下,第一预设范围为[d1,d2],[d1,d2]在预设范围16(即[f2,f3])内,即f2≤d1≤d2≤f3。示例性的,参考图10A,A屏和B屏的夹角α在[d1,d2]内,电子设备100控制内屏熄灭,并在C屏显示旋转180°后的用户界面11,用户界面11包括后置摄像头采集的图像。可选的,第一预设范围内的角度大于0°且小于120°。
在一些实施例中,第一预设姿态还包括电子设备100处于第三支架状态。该第一预设姿态下,第一预设范围为[d3,d4],[d3,d4]在预设范围18(即[f6,f7])内,即f6≤d3≤d4≤f7。示例性的,参考图10B,A屏和B屏的夹角α在[d3,d4]内,电子设备100控制内屏熄灭,并在C屏显示旋转180°后的用户界面11,用户界面11包括后置摄像头采集的图像。可选的,第一预设范围内的角度大于180°且小于300°。
在一些实施例中,第一预设姿态还包括电子设备100处于第五支架状态。该第一预设姿态下,第一预设范围为[d5,d6],[d5,d6]在预设范围21(即[f11,f12])内,即f11≤d5≤d6≤f12。 示例性的,参考图10C,A屏和B屏的夹角α在[d5,d6]内,电子设备100控制内屏熄灭,并在C屏按默认显示方向显示用户界面11,用户界面11包括后置摄像头采集的图像。可选的,第一预设范围内的角度大于0°且小于180°。
需要说明的是,由于C屏的默认显示方向为显示方向2,电子设备100处于第一支架状态和第三支架状态时,C屏对应的显示方向为显示方向1,因此,图10A和图10B所示的电子设备100将用户界面11旋转180°后再显示在C屏。
下面介绍C屏显示用户界面11后,C屏停止显示用户界面11的几种情况。
在一些实施例中,参见图10A,控制第一支架状态下满足第一预设条件的电子设备100的C屏显示用户界面11后,当电子设备100检测到A屏和B屏的夹角α超出[d1,d2]时,或检测到电子设备100脱离第一支架状态时,或检测到电子设备100翻折为展开形态时,电子设备100停止在C屏显示用户界面11。例如,用户将第一支架状态下的电子设备100继续折叠,使得夹角α小于5°时,电子设备100停止在C屏显示用户界面11。例如,用户将第一支架状态下的电子设备100继续展开,使得夹角α大于90°时,电子设备100停止在C屏显示用户界面11。
在一些实施例中,参见图10B,控制第三支架状态下满足第一预设条件的电子设备100的C屏显示用户界面11后,当电子设备100检测到A屏和B屏的夹角α超出[d3,d4]时,或检测到电子设备100脱离第三支架状态时,或检测到电子设备100翻折为展开形态时,电子设备100停止在C屏显示用户界面11。
在一些实施例中,参见图10C,控制第五支架状态下满足第一预设条件的电子设备100的C屏显示用户界面11后,当电子设备100检测到A屏和B屏的夹角α超出[d5,d6]时,或检测到电子设备100脱离第五支架状态时,或检测到电子设备100翻折为展开形态时,电子设备100停止在C屏显示用户界面11。
在一些实施例中,第一预设条件包括通过后置的低功耗摄像头检测到人脸(或预设用户的人脸),控制满足第一预设条件的电子设备100的C屏显示用户界面11后,当通过上述低功耗摄像头在第三预设时间内未检测到人脸(或预设用户的人脸)时,电子设备100停止在C屏显示用户界面11,并关闭低功耗摄像头的人脸检测服务。
在一些实施例中,电子设备100利用C屏对应的低功耗摄像头检测到用户的预设手势1,响应于上述预设手势1,电子设备100停止在C屏显示用户界面11。
在一些实施例中,电子设备100停止在C屏显示用户界面11,具体包括:电子设备100控制C屏熄灭;或者,电子设备100控制C屏显示其他预设界面,例如C屏对应的初始界面,例如在显示用户界面11前C屏最近显示的用户界面。
针对“用户界面11的预览显示区包括后置摄像头实时采集的图像”,下面对用户界面11进行具体说明。
在一些实施例中,电子设备100确定满足第一预设条件后,启动应用程序1,通过应用程序1调用C屏对应的后置摄像头采集图像,并将采集的图像通过应用程序1对应的用户界面11显示在C屏。
示例性的,参见图11A,用户界面11的预览显示区用于显示C屏对应的后置摄像头采集的图像。可选的,图11A所示的用户界面11为镜子应用的用户界面。
需要说明的是,图11A所示的C屏全屏显示上述后置摄像头采集的图像,用户界面11的预览显示区占据整个C屏;本申请实施例中,用户界面11的预览显示区也可以仅占据C屏的部分区域,此处不做具体限定。
在一些实施例中,上述C屏对应的后置摄像头包括紫外线(Ultraviolet,UV)摄像头,用户界面11包括UV摄像头采集的图像,UV摄像头利用紫外线为光源进行拍摄。示例性的,参见图11B所示的用户界面11,由于防晒霜可以阻隔紫外线,因此,UV摄像头采集的图像可以凸显涂抹防晒霜的区域。这样,通过C屏显示的用户界面11,用户可以实时查看防晒霜的涂抹情况。
在一些实施例中,参见图11C,用户界面11还可以包括试妆控件201。示例性的,试妆控件201可以接收用户的输入操作(例如触摸操作),响应于上述输入操作,电子设备100显示图11D所示的至少一个试妆选项202,例如口红选项202A、腮红选项202B、修容选项、眉毛选项、睫毛选项、眼影选项、眼线选项等。
在一些实施例中,上述至少一个试妆选项202中的试妆选项1可以接收用户的输入操作(例如触摸操作),响应于上述输入操作,电子设备100显示该试妆选项1对应的至少一个试妆样式。用户选择上述至少一个试妆样式中的试妆样式1后,电子设备100可以在摄像头采集的人脸上添加该试妆样式对应的试妆效果,并显示在用户界面11中。这样,通过C屏显示的用户界面11,用户可以通过C屏实时预览上述试妆效果。用户界面11对应的应用程序1可以是试妆应用。
示例性的,以试妆选项1为口红选项202A为例,口红选项202A对应的试妆样式为口红色号。响应于针对口红选项202A的输入操作,电子设备100显示图11E所示的口红选项202A对应的至少一个口红色号,例如色号203A和色号203B。示例性的,色号203A可以接收用户的输入操作(例如触摸操作),响应于上述输入操作,电子设备100将用户界面11显示的摄像头实时采集的人脸的嘴唇颜色变换为图11F所示的色号203A。
类似的,参考口红选项202A,用户也可以通过其他试妆选项选择对应的试妆样式,实时预览各种试妆效果。例如,腮红选项202B对应的试妆样式可以指示腮红色号、腮红位置和/或腮红形状。
在一些实施例中,参考图12A,用户界面11还可以包括拍摄控件204。用户界面11对应的应用程序1可以是相机应用。可选的,用户选择试妆样式(例如色号203A)后,响应于针对拍摄控件204的输入操作,电子设备100可以存储C屏当前显示的添加试妆效果后的图像。
在一些实施例中,参考图12B,用户界面11可以包括摄像头切换控件205,摄像头切换控件205可以将用于采集C屏显示的图像的摄像头切换至其他摄像头。可以理解,电子设备100可以包括多个摄像头。示例性的,摄像头切换控件205可以接收用户的输入操作,响应于上述操作,电子设备100直接将当前的摄像头切换至其他预设摄像头,或者显示图12C所示的至少一个摄像头选项,例如,前置摄像头205A、后置长焦摄像头205B和后置UV摄像头205C。用户可以从上述至少一个摄像头选项中选择目标摄像头。
在一些实施例中,若用户选择将用于采集图像的C屏对应的后置摄像头切换至前置摄像头,则响应于用户的输入操作,电子设备100还显示提示信息,上述提示信息用于提示用户翻折电子设备100,以便于电子设备100能够通过前置摄像头采集图像。例如提示用户将电子设备100翻折为展开形态(或第二支架状态、第四支架状态、第六支架状态)。
在一些实施例中,参考图12D,用户界面11还可以包括补光控件206。可选的,响应于 针对补光控件206的输入操作(例如触摸操作),电子设备100可以显示图12E所示的至少一个补光选项,例如,闪光灯自动补光控件206A、闪光灯关闭控件206B、闪光灯开启控件206C、显示屏补光控件206D。其中,用户选择闪光灯自动补光控件206A后,电子设备100可以基于环境光亮度确定是否开启闪光灯。
示例性的,参考图12F,响应于针对显示屏补光控件206D的输入操作(例如触摸操作),电子设备100调亮C屏的预设补光区域的亮度。本申请实施例中,预设补光区域的位置、形状和亮度可以是用户设置的,也可以是电子设备100默认设置的,此处不作具体限定。
在一些实施例中,参考图12G,用户界面11还可以包括拍摄控件对应的至少一种拍摄模式207,例如夜景模式、人像模式、大光圈模式、拍照模式、录像模式、专业模式等,用户选择一种拍摄模式后,电子设备100控制摄像头在该拍摄模式下采集图像,并显示在C屏对应的用户界面11中。用户界面11还可以包括相册控件208。相册控件208可以接收用户的输入操作,响应于上述输入操作,电子设备100可以显示相册应用的用户界面。
在一些实施例中,参考图12H,用户界面11还可以包括局部特征显示框210。示例性的,参见图12I,电子设备100识别摄像头采集的图像1中的预设局部特征(例如人脸)的所在区域1(即第一区域),并将区域1内的图像显示在显示框210中。可以理解,该实现方式中,电子设备100可以持续追踪摄像头的拍摄范围内的预设局部特征(例如人脸),并在显示框210实时显示该局部特征。可选的,电子设备100将区域1内的预设局部特征的图像放大后再显示在显示框210,以便于用户预览预设局部特征的细节。
需要说明的是,本申请实施例中,区域1也可以被称为第一区域。
在一些实施例中,图12H所示的显示框210可以接收用户的输入操作,参见图12J,响应于上述输入操作,电子设备100可以将区域1内预设局部特征的图像放大后,全屏显示在C屏。然后,图12J所示的C屏也可以接收用户的输入操作(例如双击C屏),响应于上述输入操作,电子设备100可以将预设局部特征的图像缩小回显示框210,即再次显示图12H所示的用户界面11。
在一些实施例中,当检测到电子设备100处于第一预设姿态,且电子设备100满足第一预设条件时,电子设备100在C屏对应的用户界面11中显示摄像头采集的图像的局部放大图像。可选的,参见图12I,电子设备100识别摄像头采集的图像1的预设局部特征(例如人脸)的所在区域1,上述局部放大图像即为图12J所示的区域1内预设局部特征的放大图像。可选的,上述局部放大图像为摄像头采集的图像1的中心区域的放大图像。需要说明的是,图12J所示的用户界面11也可以包括图12G所示的用户界面11的其他界面元素(例如试妆控件201)。
在一些实施例中,参见图12A或图12J,当检测到电子设备100处于第一预设姿态,且电子设备100满足第一预设条件时,电子设备100在C屏显示的用户界面11仅包括C屏对应的摄像头采集的图像(或上述图像的局部放大图像);然后,电子设备100响应于接收到的用户的第一输入操作,电子设备100才能显示用户界面11的其他界面元素,例如试妆控件201、拍摄控件204、摄像头切换控件205、补光控件206、拍摄模式207、相册控件208、设置控件209和预览框210中的一或多个。上述第一输入操作可以是作用于C屏的触控操作(例如用户手指触碰电子设备100的显示屏),也可以是预设隔空手势,此处不做具体限定。
在一些实施例中,用户界面11显示的图像是摄像头采集的图像的局部放大图像,用户可以拖动用户界面11显示的图像,以查看摄像头采集的原图像的其他区域。示例性的,图12J 所示的电子设备100在用户界面11实时显示摄像头采集的图像的区域1内的图像;图12J所示的图像可以接收用户的滑动操作(例如向上滑动操作);参见图12K,响应于上述滑动操作,电子设备100沿上述滑动操作的滑动方向的反方向移动区域1的位置,参见图12L,电子设备100在用户界面11显示移动后的区域1内的图像的放大图像。
需要说明的是,视觉上,响应于用户的滑动操作,用户界面11显示的图像沿用户的滑动方向移动。其中,上述滑动操作可以是触控操作,也可以是隔空手势,此处不做具体限定。
在一些实施例中,用户可以放大和缩小用户界面11显示的图像。示例性的,图12J所示的电子设备100在用户界面11实时显示摄像头采集的图像的区域1内的图像;图12J所示的图像可以接收用户的缩放操作(例如缩小操作);参见图12M,响应于上述缩放操作,电子设备100缩放区域1的尺寸,参见图12N,电子设备100在用户界面11显示缩放后的区域1内的图像的放大图像。
需要说明的是,电子设备100基于用户的缩小操作,可以放大区域1的尺寸;视觉上,响应于用户的缩小操作,用户界面11显示的图像被缩小。电子设备100基于用户的放大操作,可以缩小区域1的尺寸;视觉上,响应于用户的放大操作,用户界面11显示的图像被放大。其中,上述缩放操作可以是触控操作,也可以是隔空手势,此处不做具体限定。
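上述对区域1的平移与缩放逻辑,可以用如下Python草图示意(仅为示意性草图,并非本申请的实际实现;函数名与区域表示方式均为假设):

```python
# 示意性草图:区域1在摄像头采集的原图中的平移与缩放。
# region=(x, y, w, h)表示区域1的左上角坐标与宽高;img_w、img_h为原图尺寸。
def pan_region(region, dx, dy, img_w, img_h):
    """用户向(dx, dy)方向滑动时,区域1沿反方向移动,并保持在原图范围内。"""
    x, y, w, h = region
    x = min(max(x - dx, 0), img_w - w)
    y = min(max(y - dy, 0), img_h - h)
    return (x, y, w, h)

def zoom_region(region, scale, img_w, img_h):
    """scale>1对应用户的放大操作:区域1尺寸缩小为1/scale,画面被放大;
    scale<1对应用户的缩小操作:区域1尺寸放大,画面被缩小。缩放以区域中心为基准。"""
    x, y, w, h = region
    cx, cy = x + w / 2, y + h / 2
    w = min(max(w / scale, 1), img_w)
    h = min(max(h / scale, 1), img_h)
    x = min(max(cx - w / 2, 0), img_w - w)
    y = min(max(cy - h / 2, 0), img_h - h)
    return (x, y, w, h)
```

例如,用户向右滑动(dx>0)时区域1左移,视觉上用户界面11显示的图像随手指向右移动;用户做放大操作(scale=2)时区域1宽高各减半,区域内图像被放大显示。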
在一些实施例中,参见图12J,当检测到电子设备100处于第一预设姿态,且电子设备100满足第一预设条件时,电子设备100在C屏对应的用户界面11中显示微距摄像头采集的图像。
下面以显示屏1为A屏(即第二屏)为例,对本申请实施例提供的显示方法进行介绍。
在一些实施例中,当检测到电子设备100处于第二预设姿态,且电子设备100满足第二预设条件时,电子设备100在A屏显示用户界面11。其中,第二预设姿态包括:A屏和B屏的夹角α在第二预设范围内。可选的,电子设备100还控制C屏和/或B屏熄灭。
在一些实施例中,参见图13A至图13F,电子设备100控制A屏显示用户界面11后,可以利用A屏对应的前置的低功耗摄像头的手势检测服务,实时检测用户的隔空手势操作。响应于检测到的隔空手势操作,电子设备100可以执行上述隔空手势操作对应的响应事件。这样,电子设备100控制A屏显示用户界面11后,可以继续解放用户双手,进一步实现无触摸的隔空交互。
在一些实施例中,若当前A屏处于熄灭状态,用户界面11可以为A屏(或内屏)熄灭前最近显示的用户界面。具体的,若A屏熄灭前,A屏和B屏进行独立的分屏显示,用户界面11为A屏熄灭前最近显示的用户界面;若A屏熄灭前,A屏和B屏组成的内屏进行全屏显示,用户界面11为内屏熄灭前最近的全屏显示的用户界面。
在一些实施例中,用户界面11为内屏对应的主界面。
在一些实施例中,电子设备100显示用户界面11前,启动A屏对应的前置摄像头(即第二摄像头)采集图像,用户界面11的预览显示区用于显示前置摄像头(即第二摄像头)实时采集的图像。需要说明的是,A屏对应多个前置摄像头时,用户界面11包括:电子设备100通过上述多个前置摄像头中的一个或多个摄像头采集的图像。
需要说明的是,本申请实施例中,A屏显示的用户界面11可以被称为第二用户界面,A屏显示的用户界面11的预览显示区可以被称为第二预览显示区。
需要说明的是,本申请实施例中,上述A屏对应的低功耗摄像头和第二摄像头可以为同 一摄像头,也可以为不同的摄像头,此处不做具体限定。
下面对上述第二预设条件进行介绍。
在一些实施例中,第二预设条件包括电子设备100在当前夹角值的停顿时间达到第一预设时间。例如第一预设时间为3s。
在一些实施例中,当A屏处于点亮状态时,上述第二预设条件还包括电子设备100在第二预设时间内未接收到用户作用于A屏的输入操作。例如,第二预设时间为2s。
在一些实施例中,上述第二预设条件还包括电子设备100通过A屏对应的前置的低功耗摄像头检测到人脸(或预设用户的人脸)。
在一些实施例中,上述第二预设条件还包括电子设备100通过A屏对应的前置的低功耗摄像头检测到第二预设手势。第二预设手势用于在电子设备100处于第二预设姿态时,触发A屏显示用户界面11。
在一些实施例中,第二预设范围包括:预设范围17(即[f4,f5])、预设范围19(即[f8,f9])和预设范围22(即[f13,f14])中的至少一个。
下面对上述第二预设姿态进行介绍。
在一些实施例中,第二预设姿态具体包括:A屏和B屏的夹角α减小至(和/或增大至)α2,α2在第二预设范围内。
在一些实施例中,第二预设姿态还包括电子设备100处于第二支架状态。该第二预设姿态下,第二预设范围为[d7,d8],[d7,d8]在预设范围17(即[f4,f5])内,即f4≤d7≤d8≤f5。示例性的,参考图13A和图13B,A屏和B屏的夹角α在[d7,d8]内,电子设备100控制C屏熄灭,B屏白屏或熄灭,并在A屏按照A屏的默认显示方向显示用户界面11,用户界面11包括前置摄像头采集的图像。可选的,第二预设范围内的角度大于60°且小于180°。
在一些实施例中,第二预设姿态还包括电子设备100处于第四支架状态。该第二预设姿态下,第二预设范围为[d9,d10],[d9,d10]在预设范围19(即[f8,f9])内,即f8≤d9≤d10≤f9。示例性的,参考图13C和图13D,A屏和B屏的夹角α在[d9,d10]内,电子设备100控制C屏和B屏熄灭,并在A屏按照A屏的默认显示方向显示用户界面11,用户界面11包括前置摄像头采集的图像。可选的,所述第二预设范围内的角度大于240°且小于360°。
在一些实施例中,第二预设姿态还包括电子设备100处于第六支架状态。该第二预设姿态下,第二预设范围为[d11,d12],[d11,d12]在预设范围22(即[f13,f14])内,即f13≤d11≤d12≤f14。示例性的,参考图13E和图13F,A屏和B屏的夹角α在[d11,d12]内,电子设备100控制C屏和B屏熄灭,并在A屏显示旋转180°后的用户界面11,用户界面11包括前置摄像头采集的图像。可选的,所述第二预设范围内的角度大于180°且小于360°。
在一些实施例中,第二预设姿态下,第二预设范围不包括0°和180°。
需要说明的是,电子设备100处于第六支架状态时,A屏对应的显示方向与A屏的默认显示方向相反,因此,图13E和图13F所示的电子设备100将用户界面11旋转180°后再显示在A屏。
下面介绍A屏显示用户界面11后,A屏停止显示用户界面11的几种情况。
在一些实施例中,参见图13A和图13B,控制第二支架状态下满足第二预设条件的电子设备100的A屏显示用户界面11后,当电子设备100检测到A屏和B屏的夹角α超出[d7,d8]时,或检测到电子设备100脱离第二支架状态时,电子设备100停止在A屏显示用户界面11。
在一些实施例中,参见图13C和图13D,控制第四支架状态下满足第二预设条件的电子 设备100的A屏显示用户界面11后,当电子设备100检测到A屏和B屏的夹角α超出[d9,d10]时,或检测到电子设备100脱离第四支架状态时,电子设备100停止在A屏显示用户界面11。
在一些实施例中,参考图13E和图13F,控制第六支架状态下满足第二预设条件的电子设备100的A屏显示用户界面11后,当电子设备100检测到A屏和B屏的夹角α超出[d11,d12]时,或检测到电子设备100脱离第六支架状态时,电子设备100停止在A屏显示用户界面11。
在一些实施例中,第二预设条件包括通过前置的低功耗摄像头检测到人脸(或预设用户的人脸),控制满足第二预设条件的电子设备100的A屏显示用户界面11后,当通过前置的低功耗摄像头在第三预设时间内未检测到人脸时,电子设备100停止在A屏显示用户界面11,并关闭前置的低功耗摄像头的人脸检测服务。
在一些实施例中,电子设备100利用A屏对应的低功耗摄像头检测到用户的预设手势2,响应于上述预设手势2,电子设备100停止在A屏显示用户界面11。
在一些实施例中,电子设备100停止在A屏显示用户界面11,具体包括:电子设备100控制A屏熄灭;或者,电子设备100控制A屏显示其他预设界面,例如内屏对应的主界面的部分或全部,例如在显示用户界面11前A屏最近显示的用户界面。
在一些实施例中,参考图13A至图13F,满足第二预设条件的电子设备100的A屏显示用户界面11后,当检测到电子设备100被直接翻折为展开形态时,电子设备100控制内屏(A屏和B屏)全屏显示预设界面,例如全屏显示图14A所示的用户界面11或图14B所示的主界面12。
需要说明的是,前述A屏显示的用户界面11可以是前述图11A至图12N的相关实施例中所描述的用户界面11。不同的是,A屏显示的用户界面11中的预览显示区内的图像,是电子设备100通过A屏对应的前置摄像头采集的。具体的,可以参考前述实施例的相关描述,此处不再赘述。此外,本申请实施例中,A屏、C屏和内屏的尺寸可能不同,A屏、C屏和内屏显示的同一用户界面(例如用户界面11)包含的界面元素相同,三种显示屏显示的用户界面11的尺寸可以不同,三种显示屏显示的用户界面11的各界面元素的布局(即位置和尺寸)可以不同。各显示屏(例如C屏、A屏、内屏)显示的用户界面11的各界面元素的布局与该显示屏的尺寸相关联。
在一些实施例中,参见图12B至图12H所示的用户界面11,若用户选择将用于采集图像的A屏对应的前置摄像头(即第二摄像头)切换为后置摄像头(即第一摄像头),则响应于用户的输入操作,电子设备100还显示提示信息,上述提示信息用于提示用户翻折电子设备100,以便于电子设备100能够通过后置摄像头采集图像。例如提示用户将电子设备100翻折为展开形态(或第一支架状态、第三支架状态、第五支架状态)。
下面以显示屏1为A屏(即第二屏)和B屏(即第三屏)为例,对本申请实施例提供的显示方法进行介绍。
在一些实施例中,参见图15A,当电子设备100检测到电子设备100处于第三预设姿态,且电子设备100满足第三预设条件时,电子设备100启动A屏对应的前置摄像头(即第二摄像头)采集图像,并在A屏显示用户界面11,在B屏显示上述前置摄像头(即第二摄像头)采集的图像的局部放大图像,用户界面11的预览显示区用于显示上述前置摄像头采集的图像。上述局部放大图像为前置摄像头采集的图像中的预设局部特征的放大图像,例如图15A所示的人脸的放大图像。
在一些实施例中,参见图15B,当电子设备100检测到电子设备100处于第三预设姿态,且电子设备100满足第三预设条件时,电子设备100启动A屏对应的前置摄像头(即第二摄像头)采集图像,并在B屏显示用户界面11,在A屏显示上述前置摄像头(即第二摄像头)采集的图像的局部放大图像,用户界面11的预览显示区包括上述前置摄像头采集的图像。上述局部放大图像为前置摄像头采集的图像中的预设局部特征的放大图像。
参见图15A和图15B,第二支架状态下,用户可以在内屏的一个屏(例如A屏)实时预览摄像头采集的图像,同时在内屏的另一个屏(例如B屏)查看摄像头采集的图像内的预设局部特征的放大图像(例如脸部放大图),以便于用户在预览整体拍摄效果的同时看清面部细节,有效提升了用户的使用体验。
在一些实施例中,参见图15B,当电子设备100检测到电子设备100处于第三预设姿态,且电子设备100满足第三预设条件时,电子设备100启动A屏对应的前置摄像头(即第二摄像头)分别以拍摄角度1和拍摄角度2采集图像,并在A屏显示以拍摄角度1采集的图像,在B屏显示以拍摄角度2采集的图像。可选的,A屏和/或B屏也可以显示图12G所示的摄像头采集的图像之外的其他界面元素。这样,用户可以同时查看不同拍摄角度下的拍摄效果。
在一些实施例中,参见图15C,当电子设备100检测到电子设备100处于第三预设姿态,且电子设备100满足第三预设条件时,启动A屏对应的前置摄像头采集图像,并在A屏和B屏分屏显示用户界面11,即在A屏显示用户界面11的第一部分,在B屏显示用户界面11的第二部分,用户界面11的第一部分包括用户界面11的预览显示区,预览显示区用于显示前置摄像头实时采集的图像,用户界面11的第二部分包括用户界面11中除A屏的显示内容外的一个或多个界面元素。其中,第三预设姿态包括:A屏和B屏的夹角α在第三预设范围内。可选的,电子设备100还控制C屏熄灭。
需要说明的是,本申请实施例中,A屏和B屏分屏显示的用户界面11可以被称为第三用户界面,第三用户界面的预览显示区可以被称为第三预览显示区。
在一些实施例中,参见图15A至图15C,电子设备100控制A屏和B屏分屏显示后,可以利用A屏对应的低功耗摄像头的手势检测服务,实时检测用户的隔空手势操作。响应于检测到的隔空手势操作,电子设备100可以执行上述隔空手势操作对应的响应事件。这样,电子设备100控制A屏和B屏分屏显示后,可以继续解放用户双手,进一步实现无触摸的隔空交互。
下面对上述第三预设条件进行介绍。
在一些实施例中,第三预设条件包括电子设备100在当前夹角值的停顿时间达到第一预设时间。例如第一预设时间为3s。
在一些实施例中,当A屏和B屏处于点亮状态时,上述第三预设条件还包括电子设备100在第二预设时间内未接收到用户作用于A屏和B屏的输入操作。例如,第二预设时间为2s。
在一些实施例中,上述第三预设条件还包括电子设备100通过A屏对应的前置的低功耗摄像头检测到人脸(或预设用户的人脸)。
在一些实施例中,上述第三预设条件还包括电子设备100通过A屏对应的前置的低功耗摄像头检测到第三预设手势。第三预设手势用于在电子设备100处于第三预设姿态时,触发A屏和B屏进行分屏显示。
下面对上述第三预设姿态进行介绍。
在一些实施例中,第三预设姿态具体包括:A屏和B屏的夹角α减小至(和/或增大至) α3,α3在第三预设范围内。
在一些实施例中,第三预设姿态下,第三预设范围不包括0°和180°。
在一些实施例中,第三预设姿态还包括电子设备100处于第二支架状态。该第三预设姿态下,第三预设范围为[d7,d8],[d7,d8]在预设范围17内,即f4≤d7≤d8≤f5。示例性的,参考图15C,A屏和B屏的夹角α在[d7,d8]内,电子设备100控制A屏显示前置摄像头实时采集的图像,B屏显示用户界面11的其他界面元素,C屏熄灭。可选的,第三预设范围内的角度大于60°且小于180°。
需要说明的是,前述A屏和B屏分屏显示的用户界面11可以是前述图11A至图12N的相关实施例中所描述的用户界面11。不同的是,A屏和B屏分屏显示的用户界面11中的预览显示区内的图像,是电子设备100通过A屏对应的前置摄像头采集的。具体的,可以参考前述实施例的相关描述,此处不再赘述。下面以图12G所示的用户界面11为例进行说明。
下面对A屏和B屏分屏显示的用户界面11进行介绍。
在一些实施例中,参见图15D,A屏显示的用户界面11的第一部分仅包括前置摄像头实时采集的图像,B屏显示的用户界面11的第二部分包括用户界面11中除上述图像外的其他所有界面元素,且B屏显示的各界面元素的布局与图12G所示的用户界面11的界面布局相关联,电子设备100存储有图12G所示的用户界面11的界面布局。
在一些实施例中,B屏显示的用户界面11的第二部分包括上述其他所有界面元素,以及上述其他所有界面元素中的一或多个界面元素的二级界面元素。示例性的,如图15E所示,B屏显示的用户界面11的第二部分包括用户界面11的试妆控件201和补光控件206,还包括试妆控件201对应的二级界面元素,即一或多个试妆选项202,还包括补光控件206对应的二级界面元素,即一或多个补光选项。参见图15E,不同于图12G所示的用户界面11的界面布局,电子设备100可以对B屏显示的用户界面11的界面元素进行重新布局,以适应B屏的尺寸和提升用户的使用体验。
在一些实施例中,A屏显示的用户界面11的第一部分包括前置摄像头实时采集的图像,以及至少一个用户界面11的其他界面元素,B屏显示的用户界面11的第二部分包括用户界面11中除A屏的显示内容外的界面元素。示例性的,如图15F所示,A屏的显示内容包括前置摄像头实时采集的图像,以及拍摄控件204、摄像头切换控件205和相册控件208。B屏的显示内容包括试妆控件201和试妆控件201对应的二级界面元素,补光控件206和补光控件206对应的二级界面元素,以及拍摄模式207。
参见图15D至图15F,第二支架状态下,用户可以在A屏实时预览摄像头采集的图像,在B屏操控摄像头的拍摄参数,以及操控摄像头采集的图像的显示效果。例如,第二支架状态下,用户可以通过A屏实时查看自己的面部状态,通过B屏显示的补光选项修改补光参数,通过B屏显示的试妆选项修改A屏显示的人脸的试妆效果,有效提升了用户的使用体验。
下面介绍A屏和B屏分屏显示预设内容(例如用户界面11)后,A屏和B屏停止分屏显示上述预设内容的几种情况。
在一些实施例中,参见图15A至图15F,控制第二支架状态下满足第三预设条件的电子设备100的内屏分屏显示预设内容后,当电子设备100检测到A屏和B屏的夹角α超出[d7,d8]时,或检测到电子设备100脱离第二支架状态时,电子设备100停止在A屏和B屏分屏显示上述预设内容。
在一些实施例中,第三预设条件包括通过前置的低功耗摄像头检测到人脸(或预设用户的人脸),控制满足第三预设条件的电子设备100分屏显示预设内容后,当通过前置的低功耗摄像头在第三预设时间内未检测到人脸时,电子设备100停止在A屏和B屏分屏显示上述预设内容,并关闭前置的低功耗摄像头的人脸检测服务。
在一些实施例中,电子设备100利用A屏对应的低功耗摄像头检测到用户的预设手势3,响应于上述预设手势3,电子设备100停止在A屏和B屏分屏显示上述预设内容。
在一些实施例中,电子设备100停止在A屏和B屏分屏显示上述预设内容,具体包括:电子设备100控制内屏熄灭;或者,电子设备100控制A屏和B屏分屏显示其他预设界面,例如内屏对应的主界面。
在一些实施例中,参见图15A至图15F,满足第三预设条件的电子设备100的A屏和B屏分屏显示预设内容(例如用户界面11)后,当检测到电子设备100被直接翻折为展开形态时,电子设备100控制内屏(A屏和B屏)全屏显示预设界面,例如全屏显示图14A所示的用户界面11或图14B所示的主界面12。
下面对电子设备100的软件结构进行示例性说明。
电子设备100的软件系统可以采用分层架构,事件驱动架构,微核架构,微服务架构,或云架构。本发明实施例以分层架构的Android系统为例,示例性说明电子设备100的软件结构。
图16是本申请实施例提供的一种电子设备100的软件结构框图。
分层架构将软件分成若干个层,每一层都有清晰的角色和分工。层与层之间通过软件接口通信。在一些实施例中,将Android系统分为四层,从上至下分别为应用程序层,应用程序框架层,安卓运行时(Android runtime)和系统库,以及内核层。
应用程序(application,APP)层可以包括一系列应用程序包。如图16所示,应用程序包可以包括相机,图库,日历,通话,地图,导航,WLAN,蓝牙,音乐,视频,短信息等应用程序。
应用程序框架层(Framework)为应用程序层的应用程序提供应用编程接口(application programming interface,API)和编程框架。应用程序框架层包括一些预先定义的函数。如图16所示,框架层可以包括传感器管理模块(sensor manager)707、姿态识别模块(posture recognition)708、显示管理模块(display manager)709、窗口管理模块(window manager service,WMS)710。可选的,还可以包括活动管理器(activity manager service,AMS)、内容提供器,视图系统,电话管理器,资源管理器,通知管理器等(附图未示出)。
其中,窗口管理器用于管理窗口程序。窗口管理器可以获取显示屏大小,判断是否有状态栏,锁定屏幕,截取屏幕等。
硬件抽象层(hardware abstraction layer,HAL)中包括有传感器服务模块(sensor service)706,该传感器服务模块706,可用于将内核层中传感器数据处理模块705的处理结果上报给框架层中的传感器管理模块707。在一些实施例中,硬件抽象层还包括摄像头检测服务模块713,该摄像头检测服务模块713,可用于将内核层中摄像头检测数据处理模块712的图像处理结果上报给框架层中的人脸识别模块714和手势识别模块715。
内核层(Kernel)是硬件和软件之间的层。内核层中可以包括传感器数据处理模块705。其中,该传感器数据处理模块705可用于获取硬件层(Hardware)中一个或多个传感器上报的数据,并进行数据处理,以及将数据处理结果上报给传感器服务模块706。在一些实施例中,内核层中还可以包括摄像头检测数据处理模块712,该摄像头检测数据处理模块712可用于获取硬件层中摄像头711上报的图像,并进行图像处理,以及将图像处理结果上报给摄像头检测服务模块713。
硬件层中可以包括加速度传感器701、陀螺仪传感器702、加速度传感器703、陀螺仪传感器704等等。其中,加速度传感器701和陀螺仪传感器702可以设置于电子设备100的A屏中,加速度传感器703和陀螺仪传感器704可以设置于电子设备100的B屏中。其中,加速度传感器701可用于测量A屏的加速度数据,并上报给传感器数据处理模块705。加速度传感器703可用于测量B屏的加速度数据,并上报给传感器数据处理模块705。陀螺仪传感器702可用于测量A屏的陀螺仪数据,并上报给传感器数据处理模块705。陀螺仪传感器704可用于测量B屏的陀螺仪数据,并上报给传感器数据处理模块705。
其中,用户对电子设备100进行翻折时,硬件层中的加速度传感器701、陀螺仪传感器702、加速度传感器703和陀螺仪传感器704可以将各自测量到的传感器数据上报给内核层中的传感器数据处理模块705。其中,传感器数据处理模块705可以根据硬件层中多个传感器上报的传感器数据,计算出A屏朝向对应的向量nA和B屏朝向对应的向量nB,进而计算出A屏与B屏的夹角α。然后,传感器数据处理模块705可以通过硬件抽象层中的传感器服务模块706,将A屏朝向的方向向量nA、B屏朝向的方向向量nB和A屏与B屏的夹角α上报给框架层中的传感器管理模块707。传感器管理模块707可用于将向量nA、向量nB和夹角α给到姿态识别模块708。姿态识别模块708可以根据向量nA、向量nB和夹角α,确定电子设备100的支架状态,进而基于夹角α和电子设备100的支架状态,识别出电子设备100的姿态类型,并将姿态类型发给显示管理模块709。显示管理模块709可以根据姿态类型确定每个显示屏(A屏、B屏和C屏)的显示状态,以及在显示屏1显示用户界面11。显示屏的显示状态包括点亮状态和熄灭状态。显示管理模块709可以通知窗口管理模块710创建用户界面11对应的窗口,更新窗口属性(例如大小、位置)。窗口管理模块710可以刷新窗口系统,重新绘制窗口,并通知上层应用调整窗口中显示元素的属性(例如大小、位置)。
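上述由两个屏幕朝向向量求夹角α的过程,可以用点积公式示意。下面是一个仅作说明的Python草图(向量取值以及"朝向向量取屏幕外法线方向、展开形态下两向量平行"的换算约定,均为本文档之外的假设):

```python
import math

def vector_angle_deg(u, v):
    """由点积公式 cosθ = u·v / (|u||v|) 求两向量夹角,范围0°~180°。"""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    cos_t = max(-1.0, min(1.0, dot / (norm_u * norm_v)))  # 防浮点越界
    return math.degrees(math.acos(cos_t))

def screen_angle_deg(n_a, n_b):
    """假设nA、nB均取屏幕外法线方向:展开形态(α=180°)下两向量平行,
    故A屏与B屏的夹角α可按 180° - θ 换算(该换算约定为示意性假设)。"""
    return 180.0 - vector_angle_deg(n_a, n_b)

# 展开形态:两屏外法线同向,α = 180°
alpha_flat = screen_angle_deg((0.0, 0.0, 1.0), (0.0, 0.0, 1.0))
# 完全合拢:两屏外法线反向,α = 0°
alpha_closed = screen_angle_deg((0.0, 0.0, 1.0), (0.0, 0.0, -1.0))
```

实际实现中,朝向向量需由加速度与陀螺仪数据融合得到,此处仅演示夹角的几何计算。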
在一些实施例中,当姿态类型为预设姿态时,电子设备100开启上述预设姿态对应的摄像头711的人脸检测服务,电子设备100利用摄像头711采集图像,并将采集的图像上报给摄像头检测数据处理模块712,摄像头检测数据处理模块712对上述图像进行图像处理,并将图像处理后的图像经摄像头检测服务模块713上传给人脸识别模块714,人脸识别模块714识别上述图像是否包括人脸(或预设用户的人脸),显示管理模块709可以根据姿态类型和人脸识别结果确定每个屏的显示状态和显示内容等等。例如,第一预设姿态对应后置摄像头,第二预设姿态对应前置摄像头。
在一些实施例中,当显示管理模块709根据姿态类型确定显示屏1(例如C屏)显示用户界面11后,开启显示屏1对应的摄像头711的手势检测服务。摄像头检测数据处理模块712将图像处理后的图像经摄像头检测服务模块713上传给手势识别模块715,手势识别模块715识别上述图像中的手势类型,显示管理模块709可以根据上述手势类型更新显示屏1的显示状态和显示内容等等。
本申请还提供了一种电子设备100,该电子设备100包括第一屏、折叠屏、第一摄像头和第二摄像头,上述折叠屏沿折叠边可折叠形成第二屏和第三屏,第一屏和第二屏的朝向相反,第一屏的朝向和第一摄像头的拍摄方向一致,第二屏的朝向和第二摄像头的拍摄方向一致。
参见图17,图17示出了本发明实施例提供的另一种电子设备的结构示意图。如图17所示,电子设备100可包括:检测单元和显示单元。
检测单元,用于基于检测到的第二屏和第三屏的第一夹角,确定电子设备处于第一预设姿态;
显示单元,用于基于电子设备的第一预设姿态,在第一屏显示第一用户界面;第一用户界面的第一预览显示区用于显示第一摄像头采集的图像,第一预设姿态下,第一夹角不包括0°和180°;
检测单元,还用于基于检测到的第一夹角,确定电子设备处于第二预设姿态;
显示单元,还用于基于电子设备的第二预设姿态,在第二屏显示第二用户界面;第二用户界面的第二预览显示区用于显示第二摄像头采集的图像,第二预设姿态下,第一夹角不包括0°和180°。
在一些实施例中,检测单元,还用于基于检测到的第一夹角,确定电子设备处于第三预设姿态;显示单元,还用于基于电子设备的第三预设姿态,在第二屏和第三屏进行分屏显示,第二屏的显示内容包括第三预览显示区,第三预览显示区用于显示第二摄像头采集的图像,第三预设姿态下,第一夹角不包括0°和180°。
在一些实施例中,上述基于电子设备的第一预设姿态,在第一屏显示第一用户界面,包括:当检测到电子设备处于第一预设姿态,且电子设备满足第一预设条件时,启动第一摄像头采集图像,并在第一屏显示第一用户界面;其中,第一预设姿态包括第一夹角在第一预设范围内,第一预设条件包括以下一项或多项:第一夹角在当前夹角值的停顿时间达到第一预设时间;当第一屏处于点亮状态时,电子设备在第二预设时间内未接收到作用于第一屏的输入操作;在第一屏对应的摄像头采集的图像中,检测到人脸或预设用户的人脸;在第一屏对应的摄像头采集的图像中,检测到第一预设手势。
在一些实施例中,上述基于电子设备的第二预设姿态,在第二屏显示第二用户界面,包括:当检测到电子设备处于第二预设姿态,且电子设备满足第二预设条件时,启动第二摄像头采集图像,并在第二屏显示第二用户界面;其中,第二预设姿态包括第一夹角在第二预设范围内,第二预设条件包括以下一项或多项:第一夹角在当前夹角值的停顿时间达到第一预设时间;当第二屏处于点亮状态时,电子设备在第二预设时间内未接收到作用于第二屏的输入操作;在第二屏对应的摄像头采集的图像中,检测到人脸或预设用户的人脸;在第二屏对应的摄像头采集的图像中,检测到第二预设手势。
在一些实施例中,上述基于电子设备的第三预设姿态,在第二屏和第三屏进行分屏显示,包括:当检测到电子设备处于第三预设姿态,且电子设备满足第三预设条件时,启动第二摄像头采集图像,并在第二屏和第三屏进行分屏显示;其中,第三预设姿态包括第一夹角在第三预设范围内,第三预设条件包括以下一项或多项:第一夹角在当前夹角值的停顿时间达到第一预设时间;当第二屏和/或第三屏处于点亮状态时,电子设备在第二预设时间内未接收到作用于第二屏或第三屏的输入操作;在第二屏对应的摄像头采集的图像中,检测到人脸或预设用户的人脸;在第二屏对应的摄像头采集的图像中,检测到第三预设手势。
本申请还提供了一种电子设备100,该电子设备100包括第一屏、折叠屏和第一摄像头,折叠屏沿折叠边可折叠形成第二屏和第三屏,第一屏和第二屏的朝向相反,第一屏的朝向和第一摄像头的拍摄方向一致。该电子设备可包括多个功能模块或单元,例如,显示单元。其中,
显示单元,用于当检测到电子设备处于第一预设姿态,且电子设备满足第一预设条件时,启动第一摄像头采集图像,并在第一屏显示第一用户界面,第一用户界面的第一预览显示区用于显示第一摄像头采集的图像;其中,第一预设姿态包括第二屏和第三屏的第一夹角在第一预设范围内,第一预设条件包括以下一项或多项:第一夹角在当前夹角值的停顿时间达到第一预设时间;当第一屏处于点亮状态时,电子设备在第二预设时间内未接收到作用于第一屏的输入操作;在第一屏对应的摄像头采集的图像中,检测到人脸或预设用户的人脸;在第一屏对应的摄像头采集的图像中,检测到第一预设手势。
本申请实施例中,上述第一屏对应的摄像头与第一摄像头可以为同一摄像头,也可以为不同摄像头。可选的,上述第一屏对应的摄像头为低功耗摄像头。
在一些实施例中,电子设备还包括第二摄像头,第二屏的朝向和第二摄像头的拍摄方向一致,当折叠屏沿折叠边折叠形成第二屏和第三屏时,第三屏和第二摄像头位于折叠边的不同侧。显示单元还用于,当检测到电子设备处于第二预设姿态,且电子设备满足第二预设条件时,启动第二摄像头采集图像,并在第二屏显示第二用户界面,第二用户界面的第二预览显示区用于显示第二摄像头采集的图像;其中,第二预设姿态包括第一夹角在第二预设范围内,第二预设条件包括以下一项或多项:第一夹角在当前夹角值的停顿时间达到第一预设时间;当第二屏处于点亮状态时,电子设备在第二预设时间内未接收到作用于第二屏的输入操作;在第二屏对应的摄像头采集的图像中,检测到人脸或预设用户的人脸;在第二屏对应的摄像头采集的图像中,检测到第二预设手势。
本申请实施例中,上述第二屏对应的摄像头与第二摄像头可以为同一摄像头,也可以为不同摄像头。可选的,上述第二屏对应的摄像头为低功耗摄像头。
在一些实施例中,电子设备还包括第二摄像头,第二屏的朝向和第二摄像头的拍摄方向一致,当折叠屏沿折叠边折叠形成第二屏和第三屏时,第三屏和第二摄像头位于折叠边的不同侧。显示单元还用于,当检测到电子设备处于第三预设姿态,且电子设备满足第三预设条件时,启动第二摄像头采集图像,并在第二屏和第三屏进行分屏显示,第二屏的显示内容包括第三预览显示区,第三预览显示区用于显示第二摄像头采集的图像;其中,第三预设姿态包括第一夹角在第三预设范围内,第三预设条件包括以下一项或多项:第一夹角在当前夹角值的停顿时间达到第一预设时间;当第二屏和/或第三屏处于点亮状态时,电子设备在第二预设时间内未接收到作用于第二屏或第三屏的输入操作;在第二屏对应的摄像头采集的图像中,检测到人脸或预设用户的人脸;在第二屏对应的摄像头采集的图像中,检测到第三预设手势。
在一些实施例中,第一预设姿态还包括:第三屏与水平面的第二夹角在第四预设范围内,第三屏朝向与地理坐标系的Z轴的夹角小于90°,第一预设范围内的角度大于0°且小于120°;或者,第一预设姿态还包括:第二夹角在第四预设范围内,第三屏的朝向与地理坐标系的Z轴的夹角大于90°,第一预设范围内的角度大于180°且小于300°;或者,第一预设姿态还包括:第三夹角与第二夹角的差值在第五预设范围内,第三夹角为第二屏与水平面的夹角,第三屏的朝向与地理坐标系的Z轴的夹角大于90°,第一预设范围内的角度大于0°且小于180°。
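上述"第三屏朝向与地理坐标系Z轴的夹角"和"第三屏与水平面的第二夹角"的几何关系,可以用如下Python草图示意(向量取值与判断方式均为本文档之外的假设,仅说明几何含义):

```python
import math

Z_AXIS = (0.0, 0.0, 1.0)  # 地理坐标系Z轴,竖直向上

def angle_deg(u, v):
    """两向量夹角(0°~180°)。"""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / (norm_u * norm_v)))))

def screen_pose(normal):
    """由屏幕朝向向量求(屏幕与水平面的夹角, 朝向是否向上)。
    两平面的夹角取其法线夹角中不大于90°的一侧。"""
    theta = angle_deg(normal, Z_AXIS)   # 屏幕朝向与Z轴的夹角
    tilt = min(theta, 180.0 - theta)    # 屏幕平面与水平面的第二夹角
    return tilt, theta < 90.0

# 第三屏水平放置且朝上:第二夹角为0°,与Z轴夹角小于90°
tilt_up, facing_up = screen_pose((0.0, 0.0, 1.0))
# 第三屏水平放置但朝下:第二夹角仍为0°,与Z轴夹角大于90°
tilt_down, still_up = screen_pose((0.0, 0.0, -1.0))
```

由此可见,仅凭第二夹角无法区分"朝上放置"与"朝下放置",因此上述姿态定义需同时使用与Z轴的夹角作为判据。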
在一些实施例中,第一预设姿态还包括:第三屏与水平面的第二夹角在第四预设范围内,第三屏朝向与地理坐标系的Z轴的夹角小于90°,第一预设范围内的角度小于120°;上述显示单元在第一屏显示第一用户界面,包括:显示单元在第一屏显示旋转180°后的第一用户界面。
在一些实施例中,电子设备还包括识别单元,上述显示单元在第一屏显示第一用户界面之前,识别单元用于识别第一摄像头采集的图像中预设局部特征所在的第一区域,第一预览显示区用于显示第一区域内的图像的放大图像。
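上述"识别预设局部特征所在的第一区域,并在预览显示区显示其放大图像"的处理,可以用如下Python草图示意(图像以二维数组表示,区域坐标与放大倍数均为本文档之外的假设):

```python
# 示意性草图:裁剪预设局部特征所在的第一区域,并以最近邻方式放大
def crop(image, x0, y0, x1, y1):
    """从二维图像中裁剪出 [y0:y1, x0:x1] 的区域。"""
    return [row[x0:x1] for row in image[y0:y1]]

def magnify(region, factor):
    """最近邻放大factor倍,模拟预览显示区显示的放大图像。"""
    out = []
    for row in region:
        scaled_row = [p for p in row for _ in range(factor)]
        out.extend([scaled_row] * factor)  # 各复制行共享引用,仅用于只读示意
    return out

image = [[0, 1, 2],
         [3, 4, 5],
         [6, 7, 8]]
region = crop(image, 1, 1, 3, 3)  # 假设识别单元定位到右下角2x2的第一区域
zoomed = magnify(region, 2)       # 预览显示区显示该区域的2倍放大图像
```

实际实现中,第一区域通常由人脸/局部特征检测算法给出,放大亦可采用插值等更高质量的缩放方式,此处仅演示"裁剪—放大—显示"的数据流。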
在一些实施例中,电子设备还包括接收单元,上述显示单元在第一屏显示第一用户界面之后,接收单元用于接收用户的第一输入操作;显示单元还用于响应于第一输入操作,在第一用户界面中显示以下一个或多个界面元素:试妆控件、拍摄控件、摄像头切换控件、补光控件、相册控件、显示框;其中,试妆控件用于为第一预览显示区显示的图像中的人脸添加预设化妆效果;拍摄控件用于触发电子设备保存第一预览显示区显示的图像;摄像头切换控件用于切换采集图像的摄像头;补光控件用于补充环境光;相册控件用于触发电子设备显示相册应用的用户界面;显示框用于显示第一摄像头采集的图像中的预设局部特征的放大图像。
在一些实施例中,第一摄像头为紫外线摄像头,第一摄像头采集的图像用于凸显涂抹防晒霜的区域。
在一些实施例中,显示单元还用于,当检测到电子设备处于第一预设姿态,且电子设备满足第一预设条件时,控制第二屏和第三屏熄灭。
在一些实施例中,第二预设姿态还包括:第三屏与水平面的第二夹角在第四预设范围内,第三屏的朝向与地理坐标系的Z轴的夹角小于90°,第二预设范围内的角度大于60°且小于180°;或者,第二预设姿态还包括:第二夹角在第四预设范围内,第三屏的朝向与地理坐标系的Z轴的夹角大于90°,第二预设范围内的角度大于240°且小于360°;或者,第二预设姿态还包括:第三夹角与第二夹角的差值在第五预设范围内,第三夹角为第二屏与水平面的夹角,第三屏的朝向与地理坐标系的Z轴的夹角大于90°,第二预设范围内的角度大于180°且小于360°。
在一些实施例中,第一预设姿态和第二预设姿态下,第一夹角不包括0°和180°。
在一些实施例中,显示单元还用于,当检测到电子设备处于第二预设姿态,且电子设备满足第二预设条件时,控制第一屏熄灭。
在一些实施例中,第三预设姿态还包括:第三屏与水平面的第二夹角在第四预设范围内,第三屏的朝向与地理坐标系的Z轴的夹角小于90°,第三预设范围内的角度大于60°且小于180°。
在一些实施例中,第三屏的显示内容包括第二摄像头采集的图像中的预设局部特征的放大图像。
在一些实施例中,上述显示单元在第二屏和第三屏进行分屏显示,包括:显示单元在第二屏和第三屏对第三用户界面进行分屏显示;第二屏的显示内容包括第三用户界面的第三预览显示区,以及第三用户界面中除第三预览显示区之外的零个、一个或多个界面元素,第三屏的显示内容包括第三用户界面中除第二屏的显示内容之外的一个或多个界面元素;第三用户界面还包括以下一个或多个界面元素:试妆控件、拍摄控件、摄像头切换控件、补光控件、相册控件、显示框;其中,试妆控件用于为第三用户界面的预览显示区内的图像中的人脸添加预设化妆效果;拍摄控件用于触发电子设备保存第三用户界面的预览显示区内的图像;摄像头切换控件用于切换采集图像的摄像头;补光控件用于补充环境光;相册控件用于触发电子设备显示相册应用的用户界面;显示框用于显示第二摄像头采集的图像中的预设局部特征的放大图像。
本申请的各实施方式可以任意进行组合,以实现不同的技术效果。
在上述实施例中,可以全部或部分地通过软件、硬件、固件或者其任意组合来实现。当使用软件实现时,可以全部或部分地以计算机程序产品的形式实现。所述计算机程序产品包括一个或多个计算机指令。在计算机上加载和执行所述计算机程序指令时,全部或部分地产生按照本申请所述的流程或功能。所述计算机可以是通用计算机、专用计算机、计算机网络、或者其他可编程装置。所述计算机指令可以存储在计算机可读存储介质中,或者从一个计算机可读存储介质向另一个计算机可读存储介质传输,例如,所述计算机指令可以从一个网站站点、计算机、服务器或数据中心通过有线(例如同轴电缆、光纤、数字用户线)或无线(例如红外、无线、微波等)方式向另一个网站站点、计算机、服务器或数据中心进行传输。所述计算机可读存储介质可以是计算机能够存取的任何可用介质或者是包含一个或多个可用介质集成的服务器、数据中心等数据存储设备。所述可用介质可以是磁性介质(例如,软盘、硬盘、磁带)、光介质(例如,DVD)、或者半导体介质(例如固态硬盘(solid state disk,SSD))等。
本领域普通技术人员可以理解实现上述实施例方法中的全部或部分流程,该流程可以由计算机程序来指令相关的硬件完成,该程序可存储于计算机可读取存储介质中,该程序在执行时,可包括如上述各方法实施例的流程。而前述的存储介质包括:ROM或随机存储记忆体RAM、磁碟或者光盘等各种可存储程序代码的介质。
总之,以上所述仅为本发明技术方案的实施例而已,并非用于限定本发明的保护范围。凡根据本发明的揭露,所作的任何修改、等同替换、改进等,均应包含在本发明的保护范围之内。

Claims (24)

  1. 一种显示方法,其特征在于,应用于电子设备,所述电子设备包括第一屏、折叠屏、第一摄像头和第二摄像头,所述折叠屏沿折叠边可折叠形成第二屏和第三屏,所述第一屏和所述第二屏的朝向相反,所述第一屏的朝向和所述第一摄像头的拍摄方向一致,所述第二屏的朝向和所述第二摄像头的拍摄方向一致,所述方法包括:
    基于检测到的所述第二屏和所述第三屏的第一夹角,确定所述电子设备处于第一预设姿态;基于所述电子设备的第一预设姿态,所述电子设备在所述第一屏显示第一用户界面;所述第一用户界面的第一预览显示区用于显示所述第一摄像头采集的图像,所述第一预设姿态下,所述第一夹角不包括0°和180°;
    基于检测到的所述第一夹角,确定所述电子设备处于第二预设姿态;基于所述电子设备的第二预设姿态,所述电子设备在所述第二屏显示第二用户界面;所述第二用户界面的第二预览显示区用于显示所述第二摄像头采集的图像,所述第二预设姿态下,所述第一夹角不包括0°和180°。
  2. 根据权利要求1所述的方法,其特征在于,所述方法还包括:
    基于检测到的所述第一夹角,确定所述电子设备处于第三预设姿态;基于所述电子设备的第三预设姿态,所述电子设备在所述第二屏和所述第三屏进行分屏显示,所述第二屏的显示内容包括第三预览显示区,所述第三预览显示区用于显示所述第二摄像头采集的图像,所述第三预设姿态下,所述第一夹角不包括0°和180°。
  3. 根据权利要求1或2所述的方法,其特征在于,所述基于所述电子设备的第一预设姿态,所述电子设备在所述第一屏显示第一用户界面,包括:
    当检测到所述电子设备处于所述第一预设姿态,且所述电子设备满足第一预设条件时,所述电子设备启动所述第一摄像头采集图像,并在所述第一屏显示所述第一用户界面;
    其中,所述第一预设姿态包括所述第一夹角在第一预设范围内,所述第一预设条件包括以下一项或多项:
    所述第一夹角在当前夹角值的停顿时间达到第一预设时间;当所述第一屏处于点亮状态时,所述电子设备在第二预设时间内未接收到作用于所述第一屏的输入操作;在所述第一屏对应的摄像头采集的图像中,检测到人脸或预设用户的人脸;在所述第一屏对应的摄像头采集的图像中,检测到第一预设手势。
  4. 根据权利要求1或2所述的方法,其特征在于,所述基于所述电子设备的第二预设姿态,所述电子设备在所述第二屏显示第二用户界面,包括:
    当检测到所述电子设备处于所述第二预设姿态,且所述电子设备满足第二预设条件时,所述电子设备启动所述第二摄像头采集图像,并在所述第二屏显示所述第二用户界面;
    其中,所述第二预设姿态包括所述第一夹角在第二预设范围内,所述第二预设条件包括以下一项或多项:
    所述第一夹角在当前夹角值的停顿时间达到第一预设时间;当所述第二屏处于点亮状态时,所述电子设备在第二预设时间内未接收到作用于所述第二屏的输入操作;在所述第二屏对应的摄像头采集的图像中,检测到人脸或预设用户的人脸;在所述第二屏对应的摄像头采集的图像中,检测到第二预设手势。
  5. 根据权利要求1或2所述的方法,其特征在于,所述基于所述电子设备的第三预设姿态,所述电子设备在所述第二屏和所述第三屏进行分屏显示,包括:
    当检测到所述电子设备处于所述第三预设姿态,且所述电子设备满足第三预设条件时,所述电子设备启动所述第二摄像头采集图像,并在所述第二屏和所述第三屏进行分屏显示;
    其中,所述第三预设姿态包括所述第一夹角在第三预设范围内,所述第三预设条件包括以下一项或多项:
    所述第一夹角在当前夹角值的停顿时间达到第一预设时间;当所述第二屏和/或所述第三屏处于点亮状态时,所述电子设备在第二预设时间内未接收到作用于所述第二屏或所述第三屏的输入操作;在所述第二屏对应的摄像头采集的图像中,检测到人脸或预设用户的人脸;在所述第二屏对应的摄像头采集的图像中,检测到第三预设手势。
  6. 一种显示方法,其特征在于,应用于电子设备,所述电子设备包括第一屏、折叠屏和第一摄像头,所述折叠屏沿折叠边可折叠形成第二屏和第三屏,所述第一屏和所述第二屏的朝向相反,所述第一屏的朝向和所述第一摄像头的拍摄方向一致,所述方法包括:
    当检测到所述电子设备处于第一预设姿态,且所述电子设备满足第一预设条件时,所述电子设备启动所述第一摄像头采集图像,并在所述第一屏显示第一用户界面,所述第一用户界面的第一预览显示区用于显示所述第一摄像头采集的图像;
    其中,所述第一预设姿态包括所述第二屏和所述第三屏的第一夹角在第一预设范围内,所述第一预设条件包括以下一项或多项:
    所述第一夹角在当前夹角值的停顿时间达到第一预设时间;当所述第一屏处于点亮状态时,所述电子设备在第二预设时间内未接收到作用于所述第一屏的输入操作;在所述第一屏对应的摄像头采集的图像中,检测到人脸或预设用户的人脸;在所述第一屏对应的摄像头采集的图像中,检测到第一预设手势。
  7. 根据权利要求6所述的方法,其特征在于,所述电子设备还包括第二摄像头,所述第二屏的朝向和所述第二摄像头的拍摄方向一致,当所述折叠屏沿所述折叠边折叠形成所述第二屏和所述第三屏时,所述第三屏和第二摄像头位于所述折叠边的不同侧,所述方法还包括:
    当检测到所述电子设备处于第二预设姿态,且所述电子设备满足第二预设条件时,所述电子设备启动所述第二摄像头采集图像,并在所述第二屏显示第二用户界面,所述第二用户界面的第二预览显示区用于显示所述第二摄像头采集的图像;
    其中,所述第二预设姿态包括所述第一夹角在第二预设范围内,所述第二预设条件包括以下一项或多项:
    所述第一夹角在当前夹角值的停顿时间达到第一预设时间;当所述第二屏处于点亮状态时,所述电子设备在第二预设时间内未接收到作用于所述第二屏的输入操作;在所述第二屏对应的摄像头采集的图像中,检测到人脸或预设用户的人脸;在所述第二屏对应的摄像头采集的图像中,检测到第二预设手势。
  8. 根据权利要求6所述的方法,其特征在于,所述电子设备还包括第二摄像头,所述第二屏的朝向和所述第二摄像头的拍摄方向一致,当所述折叠屏沿所述折叠边折叠形成所述第二屏和所述第三屏时,所述第三屏和第二摄像头位于所述折叠边的不同侧,所述方法还包括:
    当检测到所述电子设备处于第三预设姿态,且所述电子设备满足第三预设条件时,所述电子设备启动所述第二摄像头采集图像,并在所述第二屏和所述第三屏进行分屏显示,所述第二屏的显示内容包括第三预览显示区,所述第三预览显示区用于显示所述第二摄像头采集的图像;
    其中,所述第三预设姿态包括所述第一夹角在第三预设范围内,所述第三预设条件包括以下一项或多项:
    所述第一夹角在当前夹角值的停顿时间达到第一预设时间;当所述第二屏和/或所述第三屏处于点亮状态时,所述电子设备在第二预设时间内未接收到作用于所述第二屏或所述第三屏的输入操作;在所述第二屏对应的摄像头采集的图像中,检测到人脸或预设用户的人脸;在所述第二屏对应的摄像头采集的图像中,检测到第三预设手势。
  9. 根据权利要求6至8任一项所述的方法,其特征在于,所述第一预设姿态还包括:所述第三屏与水平面的第二夹角在第四预设范围内,所述第三屏朝向与地理坐标系的Z轴的夹角小于90°,所述第一预设范围内的角度大于0°且小于120°;
    或者,所述第一预设姿态还包括:所述第二夹角在所述第四预设范围内,所述第三屏的朝向与所述地理坐标系的Z轴的夹角大于90°,所述第一预设范围内的角度大于180°且小于300°;
    或者,所述第一预设姿态还包括:第三夹角与所述第二夹角的差值在第五预设范围内,所述第三夹角为所述第二屏与所述水平面的夹角,所述第三屏的朝向与所述地理坐标系的Z轴的夹角大于90°,所述第一预设范围内的角度大于0°且小于180°。
  10. 根据权利要求6至8任一项所述的方法,其特征在于,所述第一预设姿态还包括:所述第三屏与水平面的第二夹角在第四预设范围内,所述第三屏朝向与地理坐标系的Z轴的夹角小于90°,所述第一预设范围内的角度小于120°;所述在所述第一屏显示第一用户界面,包括:
    在所述第一屏显示旋转180°后的所述第一用户界面。
  11. 根据权利要求6至8任一项所述的方法,其特征在于,所述第一预设姿态包括第一支架状态和第五支架状态,所述当检测到所述电子设备处于第一预设姿态,且所述电子设备满足第一预设条件时,所述电子设备启动所述第一摄像头采集图像,并在所述第一屏显示第一用户界面,包括:
    当检测到所述电子设备处于所述第一支架状态,且所述电子设备满足所述第一预设条件时,所述电子设备启动所述第一摄像头采集图像,并在所述第一屏显示所述第一用户界面;
    所述在所述第一屏显示第一用户界面之后,还包括:
    当检测到所述电子设备由所述第一支架状态切换为所述第五支架状态时,所述电子设备在所述第一屏显示旋转180°后的所述第一用户界面。
  12. 根据权利要求6至8任一项所述的方法,其特征在于,所述在所述第一屏显示第一用户界面之前,还包括:
    所述电子设备识别所述第一摄像头采集的图像中预设局部特征所在的第一区域,所述第一预览显示区用于显示所述第一区域内的图像的放大图像。
  13. 根据权利要求6至8任一项所述的方法,其特征在于,所述在所述第一屏显示第一用户界面之后,还包括:
    接收用户的第一输入操作;
    响应于所述第一输入操作,所述电子设备在所述第一用户界面中显示以下一个或多个界面元素:试妆控件、拍摄控件、摄像头切换控件、补光控件、相册控件、显示框;
    其中,所述试妆控件用于为所述第一预览显示区显示的图像中的人脸添加预设化妆效果;所述拍摄控件用于触发所述电子设备保存所述第一预览显示区显示的图像;所述摄像头切换控件用于切换采集图像的摄像头;所述补光控件用于补充环境光;所述相册控件用于触发所述电子设备显示相册应用的用户界面;所述显示框用于显示所述第一摄像头采集的图像中的预设局部特征的放大图像。
  14. 根据权利要求6至8任一项所述的方法,其特征在于,所述第一摄像头为紫外线摄像头,所述第一摄像头采集的图像用于凸显涂抹防晒霜的区域。
  15. 根据权利要求6至8任一项所述的方法,其特征在于,当检测到所述电子设备处于第一预设姿态,且所述电子设备满足第一预设条件时,所述方法还包括:
    所述电子设备控制所述第二屏和所述第三屏熄灭。
  16. 根据权利要求7所述的方法,其特征在于,所述第二预设姿态还包括:所述第三屏与水平面的第二夹角在第四预设范围内,所述第三屏的朝向与地理坐标系的Z轴的夹角小于90°,所述第二预设范围内的角度大于60°且小于180°;
    或者,所述第二预设姿态还包括:所述第二夹角在所述第四预设范围内,所述第三屏的朝向与所述地理坐标系的Z轴的夹角大于90°,所述第二预设范围内的角度大于240°且小于360°;
    或者,所述第二预设姿态还包括:第三夹角与所述第二夹角的差值在第五预设范围内,所述第三夹角为所述第二屏与所述水平面的夹角,所述第三屏的朝向与所述地理坐标系的Z轴的夹角大于90°,所述第二预设范围内的角度大于180°且小于360°。
  17. 根据权利要求7所述的方法,其特征在于,所述第一预设姿态和所述第二预设姿态下,所述第一夹角不包括0°和180°。
  18. 根据权利要求7所述的方法,其特征在于,当检测到所述电子设备处于第二预设姿态,且所述电子设备满足第二预设条件时,所述方法还包括:
    所述电子设备控制所述第一屏熄灭。
  19. 根据权利要求8所述的方法,其特征在于,所述第三预设姿态还包括:所述第三屏与水平面的第二夹角在第四预设范围内,所述第三屏的朝向与地理坐标系的Z轴的夹角小于90°,所述第三预设范围内的角度大于60°且小于180°。
  20. 根据权利要求8所述的方法,其特征在于,所述第三屏的显示内容包括所述第二摄像头采集的图像中的预设局部特征的放大图像。
  21. 根据权利要求8所述的方法,其特征在于,所述在所述第二屏和所述第三屏进行分屏显示,包括:
    在所述第二屏和所述第三屏对第三用户界面进行分屏显示;所述第二屏的显示内容包括第三用户界面的所述第三预览显示区,以及所述第三用户界面中除所述第三预览显示区之外的零个、一个或多个界面元素,所述第三屏的显示内容包括所述第三用户界面中除所述第二屏的显示内容之外的一个或多个界面元素;
    所述第三用户界面还包括以下一个或多个界面元素:试妆控件、拍摄控件、摄像头切换控件、补光控件、相册控件、显示框;其中,所述试妆控件用于为所述第三用户界面的预览显示区内的图像中的人脸添加预设化妆效果;所述拍摄控件用于触发所述电子设备保存所述第三用户界面的预览显示区内的图像;所述摄像头切换控件用于切换采集图像的摄像头;所述补光控件用于补充环境光;所述相册控件用于触发所述电子设备显示相册应用的用户界面;所述显示框用于显示所述第二摄像头采集的图像中的预设局部特征的放大图像。
  22. 一种电子设备,包括存储器,一个或多个处理器,多个应用程序,以及一个或多个程序;其中,所述一个或多个程序被存储在所述存储器中;其特征在于,所述一个或多个处理器在执行所述一个或多个程序时,使得所述电子设备实现如权利要求1-21任一项所述的方法。
  23. 一种计算机可读存储介质,其特征在于,包括计算机指令,当所述计算机指令在电子设备上运行时,使得所述电子设备执行如权利要求1-21中任一项所述的显示方法。
  24. 一种计算机程序产品,其特征在于,当所述计算机程序产品在计算机上运行时,使得所述计算机执行如权利要求1-21任一项所述的显示方法。
PCT/CN2022/097177 2021-06-09 2022-06-06 显示方法及相关装置 WO2022257889A1 (zh)

Priority Applications (3)

Application Number Priority Date Filing Date Title
US18/041,547 US20240012451A1 (en) 2021-06-09 2022-06-06 Display method and related apparatus
EP22819487.4A EP4181494A4 (en) 2021-06-09 2022-06-06 DISPLAY METHOD AND ASSOCIATED APPARATUS
KR1020237005512A KR20230038290A (ko) 2021-06-09 2022-06-06 디스플레이 방법 및 관련 장치

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
CN202110642450.0 2021-06-09
CN202110642450 2021-06-09
CN202110920022.X 2021-08-11
CN202110920022.XA CN115460318B (zh) 2021-06-09 2021-08-11 显示方法及相关装置

Publications (1)

Publication Number Publication Date
WO2022257889A1 true WO2022257889A1 (zh) 2022-12-15

Family

ID=84294439

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/097177 WO2022257889A1 (zh) 2021-06-09 2022-06-06 显示方法及相关装置

Country Status (5)

Country Link
US (1) US20240012451A1 (zh)
EP (1) EP4181494A4 (zh)
KR (1) KR20230038290A (zh)
CN (1) CN115460318B (zh)
WO (1) WO2022257889A1 (zh)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102750094A (zh) * 2012-06-13 2012-10-24 胡锦云 图像采集方法
CN111124561A (zh) * 2019-11-08 2020-05-08 华为技术有限公司 应用于具有折叠屏的电子设备的显示方法及电子设备
CN111263005A (zh) * 2020-01-21 2020-06-09 华为技术有限公司 一种折叠屏的显示方法及相关装置
WO2020238451A1 (zh) * 2019-05-27 2020-12-03 维沃移动通信有限公司 终端控制方法和终端
WO2021063311A1 (zh) * 2019-09-30 2021-04-08 华为技术有限公司 具有折叠屏的电子设备的显示控制方法及电子设备

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9173570B2 (en) * 2012-04-12 2015-11-03 Thomas Nathan Millikan Viewing and processing multispectral images
US11409488B2 (en) * 2019-02-19 2022-08-09 Samsung Electronics Co., Ltd. Electronic device and display control method thereof
KR102685608B1 (ko) * 2019-08-07 2024-07-17 삼성전자주식회사 카메라 프리뷰 이미지를 제공하기 위한 전자 장치 및 그의 동작 방법
KR102692813B1 (ko) * 2019-08-20 2024-08-08 삼성전자 주식회사 전자 장치 및 전자 장치의 상태에 기반하여 동작 모드 제어 방법
KR102675255B1 (ko) * 2019-10-07 2024-06-14 삼성전자 주식회사 전자 장치에서 카메라의 조명을 제공하는 방법 및 장치

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP4181494A4

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117082165A (zh) * 2023-09-19 2023-11-17 荣耀终端有限公司 拍照操作方法、终端设备及存储介质
CN117082165B (zh) * 2023-09-19 2024-02-23 荣耀终端有限公司 拍照操作方法、终端设备及存储介质

Also Published As

Publication number Publication date
KR20230038290A (ko) 2023-03-17
CN115460318A (zh) 2022-12-09
CN115460318B (zh) 2024-09-06
EP4181494A4 (en) 2024-03-13
EP4181494A1 (en) 2023-05-17
US20240012451A1 (en) 2024-01-11

Similar Documents

Publication Publication Date Title
CN114787773B (zh) 应用于具有折叠屏的电子设备的显示方法及电子设备
WO2021213164A1 (zh) 应用界面交互方法、电子设备和计算机可读存储介质
WO2021027747A1 (zh) 一种界面显示方法及设备
WO2020168970A1 (zh) 一种控制屏幕显示的方法和电子设备
JP2023514962A (ja) 折り畳み可能画面のための表示方法および関連装置
WO2021000881A1 (zh) 一种分屏方法及电子设备
WO2021036770A1 (zh) 一种分屏处理方法及终端设备
CN112598594A (zh) 颜色一致性矫正方法及相关装置
CN110830645B (zh) 一种操作方法和电子设备及计算机存储介质
WO2021238370A1 (zh) 显示控制方法、电子设备和计算机可读存储介质
WO2021190524A1 (zh) 截屏处理的方法、图形用户接口及终端
CN114115769A (zh) 一种显示方法及电子设备
WO2023103951A1 (zh) 一种折叠屏的显示方法及相关装置
CN114579016A (zh) 一种共享输入设备的方法、电子设备及系统
WO2022156473A1 (zh) 一种播放视频的方法及电子设备
WO2023051511A1 (zh) 一种图标移动方法、相关图形界面及电子设备
WO2022143180A1 (zh) 协同显示方法、终端设备及计算机可读存储介质
WO2022257889A1 (zh) 显示方法及相关装置
CN115808997A (zh) 一种预览方法、电子设备及系统
WO2024067551A1 (zh) 界面显示方法及电子设备
CN116339569A (zh) 分屏显示的方法、折叠屏设备和计算机可读存储介质
WO2023045932A1 (zh) 设备控制方法及相关装置
CN115480849A (zh) 用户界面布局方法及相关设备
WO2024017332A1 (zh) 控制部件的方法及相关装置
WO2024001900A1 (zh) 一种折叠屏的显示方法以及相关设备

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22819487

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2022819487

Country of ref document: EP

Effective date: 20230207

WWE Wipo information: entry into national phase

Ref document number: 18041547

Country of ref document: US

ENP Entry into the national phase

Ref document number: 20237005512

Country of ref document: KR

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE