EP3279763B1 - Method for controlling a display and electronic device - Google Patents

Method for controlling a display and electronic device

Info

Publication number
EP3279763B1
Authority
EP
European Patent Office
Prior art keywords
electronic device
display
graphic element
image
area
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
EP17184551.4A
Other languages
English (en)
French (fr)
Other versions
EP3279763A1 (de)
Inventor
Yu-Sun CHEONG
Byung-Jin Kang
Yong-Jin Kwon
Gae-Youn Kim
Dae-myung Kim
Kwon-Ho SONG
Dong-Oh LEE
Suk-Jae Lee
Kwang-Hyun Cho
Byeng-Seok Choi
Ju-Yeong Lee
Hyun-Ju HONG
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Samsung Electronics Co Ltd
Original Assignee
Samsung Electronics Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Samsung Electronics Co Ltd filed Critical Samsung Electronics Co Ltd
Priority to EP22166971.6A (published as EP4043997A1)
Priority to EP19158122.2A (published as EP3521971B1)
Publication of EP3279763A1
Application granted
Publication of EP3279763B1
Status: Active

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00 Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16 Constructional details or arrangements
    • G06F1/1613 Constructional details or arrangements for portable computers
    • G06F1/1633 Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
    • G06F1/1637 Details related to the display arrangement, including those related to the mounting of the display in the housing
    • G06F1/1647 Details related to the display arrangement, including those related to the mounting of the display in the housing, including at least an additional display
    • G06F1/1652 Details related to the display arrangement, including those related to the mounting of the display in the housing, the display being flexible, e.g. mimicking a sheet of paper, or rollable
    • G06F1/1675 Miscellaneous details related to the relative movement between the different enclosures or enclosure parts
    • G06F1/1677 Miscellaneous details related to the relative movement between the different enclosures or enclosure parts for detecting open or closed state or particular intermediate positions assumed by movable parts of the enclosure, e.g. detection of display lid position with respect to main body in a laptop, detection of opening of the cover of battery compartment
    • G06F1/26 Power supply means, e.g. regulation thereof
    • G06F1/32 Means for saving power
    • G06F1/3203 Power management, i.e. event-based initiation of a power-saving mode
    • G06F1/3206 Monitoring of events, devices or parameters that trigger a change in power modality
    • G06F1/3215 Monitoring of peripheral devices
    • G06F1/3234 Power saving characterised by the action undertaken
    • G06F1/325 Power saving in peripheral device
    • G06F1/3265 Power saving in display device
    • G06F1/3278 Power saving in modem or I/O interface
    • G06F1/3287 Power saving characterised by the action undertaken by switching off individual functional units in the computer system
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0482 Interaction with lists of selectable items, e.g. menus
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04842 Selection of displayed objects or displayed text elements
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883 Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser for inputting data by handwriting, e.g. gesture or text
    • G06F2203/00 Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/041 Indexing scheme relating to G06F3/041 - G06F3/045
    • G06F2203/04102 Flexible digitiser, i.e. constructional details for allowing the whole digitising part of a device to be flexed or rolled like a sheet of paper
    • G06F2203/048 Indexing scheme relating to G06F3/048
    • G06F2203/04803 Split screen, i.e. subdividing the display area or the window area into separate subareas
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M1/00 Substation equipment, e.g. for use by subscribers
    • H04M1/02 Constructional features of telephone sets
    • H04M1/0202 Portable telephone sets, e.g. cordless phones, mobile phones or bar type handsets
    • H04M1/0206 Portable telephones comprising a plurality of mechanically joined movable body parts, e.g. hinged housings
    • H04M1/0208 Portable telephones comprising a plurality of mechanically joined movable body parts, characterized by the relative motions of the body parts
    • H04M1/0214 Foldable telephones, i.e. with body parts pivoting to an open position around an axis parallel to the plane they define in closed position
    • H04M1/0216 Foldable in one direction, i.e. using a one degree of freedom hinge
    • H04M1/026 Details of the structure or mounting of specific components
    • H04M1/0266 Details of the structure or mounting of specific components for a display module assembly
    • H04M1/0268 Details of the structure or mounting of specific components for a display module assembly including a flexible display panel
    • H04M1/72 Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724 User interfaces specially adapted for cordless or mobile telephones
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 Television systems
    • H04N7/14 Systems for two-way working
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W4/00 Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/02 Services making use of location information
    • H04W4/029 Location-based management or tracking services

Definitions

  • the present disclosure relates to a flexible electronic device and a display control method thereof.
  • a flexible electronic device may refer to an electronic device in which the form or shape of the device can be altered, similar to the manner in which paper is alterable.
  • the flexible electronic device may be deformed or otherwise altered by force applied by a user, for example by being folded.
  • WO-2016/052814-A1 discloses a mobile terminal having a flexible display unit which is bendable, foldable, and rollable in at least a portion thereof. The terminal includes a flexible display unit configured to display first screen information, a sensing unit configured to sense a bending operation of the flexible display unit, and a control unit configured to control the flexible display unit to display second screen information different from the first screen information together with the first screen information, based on a bending operation of the flexible display unit, wherein the control unit divides the flexible display unit into at least two regions and displays the first screen information and the second screen information in the regions separately.
  • US-2013/321264-A1 discloses a mobile terminal including a flexible display unit and a related control method.
  • the mobile terminal may include a flexible display unit configured to display first screen information that is flexible in response to an external physical force, a sensing unit configured to sense flexure of the flexible display unit and a controller configured to control the flexible display unit to output second screen information containing information associated with the first screen information on one region of the flexible display unit in response to the flexure.
  • US-2014/282222-A1 discloses a mobile terminal including a touch screen display configured to display a plurality of items included in a list; and a controller configured to: sense a predetermined-type first touch input with respect to the touch screen display, partition the touch screen display into at least first and second regions in response to the first touch input, display at least some of the items that have been displayed on the touch screen display prior to the sensing of the first touch input, as they are, to the first region, display at least one other item including a first item or a last item in the list to the second region, sense a predetermined-type second touch input different from the first touch input with respect to the touch screen display, and move together the items displayed on the first and second regions in response to the sensing of the second touch input.
  • US-2014/055429-A1 discloses a flexible display apparatus.
  • the flexible display apparatus includes a display unit, a sensor configured to sense a bending of the flexible display apparatus, and a controller configured to display first contents on a first screen of the display unit, and to reconfigure and display the first contents on a second screen generated on an area of the display unit based on the bending.
  • US-2016/198100-A1 discloses a display device for displaying an image and a method by which the display device operates to display an image.
  • the display device may include a display configured to output a screen image, an image sensor configured to acquire an image signal, a bending detection sensor configured to detect a bending motion or a bent state of the display device, and a control unit configured to control the display to display an image, which is generated based on the image signal, in a region according to a position at which the display is bent on the screen image if the bending detection sensor detects the bending motion or a bent state.
  • a folding-type flexible electronic device may be longer than a conventional electronic device, and may therefore provide a screen of increased length.
  • the folding-type flexible electronic device may implement methods that utilize such an elongated screen in more varied and efficient ways than typical electronic devices today.
  • an electronic device including a flexible display and a processor.
  • the processor is configured to: control, in response to detection of a user input, the flexible display to divide a display area into a first area and a second area; control the flexible display to display a first graphic element related to a camera application in the first area, and a second graphic element related to controlling the camera application in the second area, the camera application having setting values configurable to alter images captured by the camera application, and the second graphic element further includes a graphical user interface including an image related to a current location of the electronic device; and control the camera application responsive to detecting an input to the second graphic element; wherein: the detected input to the second graphic element includes selection of the image related to the current location, and wherein, as a consequence of the selection, the processor is further configured to alter at least one setting value of the camera application to match at least one setting value indicated by the selected image.
  • a method in an electronic device comprising: detecting a user input, in response to detecting the user input, controlling, by a processor, a flexible display to divide a display area into a first area and a second area, controlling the flexible display to display a first graphic element related to a camera application in the first area, and a second graphic element related to controlling the camera application in the second area, the camera application having setting values configurable to alter images captured by the camera application, and the second graphic element further includes a graphical user interface including an image related to a current location of the electronic device; and controlling the camera application responsive to detecting an input to the second graphic element; wherein the detected input to the second graphic element includes selection of the image related to the current location, the method further comprising altering at least one setting value of the camera application to match at least one setting value indicated by the selected image.
  • multiple display areas may be configured on the screen of the flexible display, and each of the multiple display areas may implement a different display scheme, allowing the elongated screen to be used diversely and efficiently.
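The claimed behavior can be sketched roughly as follows: on a user input the display area is divided, a camera preview occupies the first area, a control GUI with location-related sample images occupies the second, and selecting an image copies its setting values into the camera application. This is only an illustrative sketch of the claim language; every class, function, and setting name below is hypothetical and not taken from the patent.

```python
# Hypothetical sketch of the claimed display-division and setting-matching
# behavior. Names, split position, and settings are illustrative assumptions.

class CameraApp:
    def __init__(self):
        # Setting values configurable to alter images captured by the camera.
        self.settings = {"exposure": 0.0, "white_balance": "auto", "filter": None}

    def apply_settings(self, values):
        # Alter setting values to match those indicated by a selected image.
        self.settings.update(values)

class FlexibleDisplay:
    def __init__(self, height=2000):
        self.height = height
        self.areas = {}  # area name -> (top, bottom, content)

    def divide(self, split_at):
        # Divide the display area into a first area and a second area.
        self.areas = {"first": (0, split_at, None),
                      "second": (split_at, self.height, None)}

    def show(self, area, content):
        top, bottom, _ = self.areas[area]
        self.areas[area] = (top, bottom, content)

def on_user_input(display, camera, location_images, selected=None):
    # In response to a detected user input: divide, then populate both areas.
    display.divide(split_at=1400)
    display.show("first", "camera_preview")     # first graphic element
    display.show("second", location_images)     # second graphic element (GUI)
    if selected is not None:
        # Selection of an image related to the current location alters the
        # camera application's setting values to match that image.
        camera.apply_settings(location_images[selected]["settings"])
```

A usage sketch: with a sample image keyed `"namsan_tower"` (hypothetical) carrying `{"exposure": 0.7, "filter": "warm"}`, calling `on_user_input(display, camera, samples, selected="namsan_tower")` leaves the camera's exposure at 0.7 and its filter set to "warm".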
  • when an element (e.g., a first element) is referred to as being (operatively or communicatively) "connected" or "coupled" to another element (e.g., a second element), it may be connected or coupled directly to the other element, or any other element (e.g., a third element) may be interposed between them.
  • the expression "configured to" may be exchanged with, for example, "suitable for", "having the capacity to", "adapted to", "made to", "capable of", or "designed to" in terms of hardware or software, according to circumstances.
  • the expression “device configured to” may mean that the device, together with other devices or components, “is able to”.
  • the phrase “processor adapted (or configured) to perform A, B, and C” may mean a dedicated processor (e.g. embedded processor) for performing the corresponding operations or a general-purpose processor (e.g., Central Processing Unit (CPU) or Application Processor (AP)) that can perform the corresponding operations by executing one or more software programs stored in a memory device.
  • An electronic device may include at least one of, for example, a smart phone, a tablet Personal Computer (PC), a mobile phone, a video phone, an electronic book reader (e-book reader), a desktop PC, a laptop PC, a netbook computer, a workstation, a server, a Personal Digital Assistant (PDA), a Portable Multimedia Player (PMP), an MPEG-1 audio layer-3 (MP3) player, a medical device, a camera, and a wearable device.
  • the wearable device may include at least one of an accessory type (e.g., a watch, a ring, a bracelet, an anklet, a necklace, glasses, a contact lens, or a Head-Mounted Device (HMD)), a fabric- or clothing-integrated type (e.g., an electronic clothing), a body-mounted type (e.g., a skin pad or tattoo), and a bio-implantable type (e.g., an implantable circuit).
  • the electronic device may include at least one of, for example, a television, a Digital Video Disk (DVD) player, an audio, a refrigerator, an air conditioner, a vacuum cleaner, an oven, a microwave oven, a washing machine, an air cleaner, a set-top box, a home automation control panel, a security control panel, a media box (e.g., Samsung HomeSyncTM, Apple TVTM, or Google TVTM), a game console (e.g., XboxTM and PlayStationTM), an electronic dictionary, an electronic key, a camcorder, and an electronic photo frame.
  • the electronic device may include at least one of various medical devices (e.g., various portable medical measuring devices (a blood glucose monitoring device, a heart rate monitoring device, a blood pressure measuring device, a body temperature measuring device, and/or the like), a Magnetic Resonance Angiography (MRA), a Magnetic Resonance Imaging (MRI), a Computed Tomography (CT) machine, and an ultrasonic machine), a navigation device, a Global Navigation Satellite System (GNSS), an Event Data Recorder (EDR), a Flight Data Recorder (FDR), a vehicle infotainment device, electronic devices for a ship (e.g., a navigation device for a ship, a gyro-compass, and/or the like), avionics, security devices, an automotive head unit, a robot for home or industry, a drone, an Automatic Teller's Machine (ATM) in banks, Point Of Sales (POS) in a shop, and an Internet of Things device (e.g., a
  • the electronic device may include at least one of a part of a piece of furniture, a building/structure, or a motor vehicle, an electronic board, an electronic signature receiving device, a projector, and various kinds of measuring instruments (e.g., a water meter, an electric meter, a gas meter, a radio wave meter, and/or the like).
  • the electronic device may be flexible, or may be a combination of two or more of the aforementioned various devices.
  • the electronic device according to an embodiment of the present disclosure is not limited to the aforementioned devices.
  • the term "user" may indicate a person using an electronic device or a device (e.g. an artificial intelligence electronic device) using an electronic device.
  • FIGs. 1A to 1C are views illustrating an electronic device according to various embodiments of the present disclosure.
  • FIG. 1A is a view illustrating a partially unfolded front of the electronic device
  • FIG. 1B is a view illustrating an unfolded rear of the electronic device
  • FIG. 1C is a view illustrating a folded rear of the electronic device.
  • a display 160 may be disposed on a front surface 102 of the electronic device 101.
  • the display 160 may be formed so as to occupy most of the front surface 102 of the electronic device 101.
  • a main home screen may be displayed on the display 160.
  • the main home screen may be a first screen or Graphical User Interface (GUI) displayed on the display 160, and may facilitate user interaction when the electronic device 101 is powered on.
  • At least one of shortcut icons for executing frequently-used applications, a main menu switching key, time, and weather may be displayed on the main home screen.
  • a menu screen may be displayed on the display 160.
  • a status bar may be displayed that includes at least one state of the electronic device 101 such as the state of battery charging, the strength of a received signal, a current time point, and/or the like.
  • the status bar may be displayed at an upper end part of the display 160.
  • a home button, a menu button, and/or a back button may be formed at the edge (e.g., a "bezel” or a "case part around the display 160") of the front surface 102 of the electronic device 101, which is disposed beside/below the display 160.
  • the home button may be activated to display the main home screen on the display 160. Also, the home button may be activated to cause the display 160 to display recently-used applications or a task manager.
  • the menu button may be used to provide a connection menu capable of being displayed on the display 160.
  • the connection menu may include at least one of a widget addition menu, a background screen change menu, a search menu, an editing menu, and an environment setting menu.
  • the back button may be activated to display a screen executed just before a screen being currently executed, or may be used to terminate the most recently-used application.
  • a first camera 190 and/or at least one sensor may be disposed on the front surface of the electronic device 101.
  • the at least one sensor may include one or more of a distance sensor, an illuminance sensor, a proximity sensor, and/or the like.
  • the at least one sensor may be disposed at the edge of the front surface 102 of the electronic device 101, which is disposed beside/above the display 160.
  • a sub-display 162, a second camera 192, a flash, a speaker, and/or at least one sensor such as a distance sensor and/or the like may be disposed on a rear surface 103 of the electronic device 101.
  • when in a folded configuration (e.g., a state of being folded), the electronic device 101 may be configured to display graphic elements, user interface(s), and/or information of a preset condition or type (e.g., message reception/transmission information, status information of the electronic device, and/or the like) on the sub-display 162.
  • the sub-display 162 may be configured to detect pressure (e.g., indicating a tap and/or knock input of a user). For example, when the user 'knocks' (e.g., double knocking or tapping) the sub-display 162, the electronic device 101 may be configured to display graphic elements, user interface(s) and/or other designated information on the sub-display 162.
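The knock detection described above can be sketched as a double-tap check over timestamped pressure samples from the sub-display. The time window and pressure threshold below are illustrative assumptions, not values from the patent, and a real implementation would also debounce repeated samples from a single press.

```python
# Hypothetical sketch of 'knock' (double tap) detection on a pressure-sensing
# sub-display. Threshold values are assumed for illustration.

DOUBLE_TAP_WINDOW_S = 0.4   # max gap between two taps (assumed)
PRESSURE_THRESHOLD = 0.2    # minimum pressure counted as a tap (assumed)

def detect_knock(events):
    """events: list of (timestamp_seconds, pressure) samples.

    Returns True when two above-threshold taps occur within the
    double-tap window, which the device would treat as a knock that
    wakes the sub-display.
    """
    tap_times = [t for t, p in events if p >= PRESSURE_THRESHOLD]
    return any(later - earlier <= DOUBLE_TAP_WINDOW_S
               for earlier, later in zip(tap_times, tap_times[1:]))
```

For example, two firm taps 0.3 s apart register as a knock, while taps a full second apart, or a single light touch, do not.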
  • the sub-display 162 may be configured to detect fingerprint information.
  • one or more features may be disposed on a lateral surface 104, including a power/lock button, a volume button having a volume increase button and a volume reduction button, a terrestrial Digital Multimedia Broadcasting (DMB) antenna for receiving a broadcast signal, at least one microphone, and/or the like.
  • FIGs. 2A and 2B are views each illustrating an electronic device according to various embodiments of the present disclosure.
  • the electronic device 201a may include a second camera 292a, and may be configured such that a display 260a is not exposed to an exterior of the device (e.g., 'outside') with respect to a lengthwise end part of the electronic device 201a when the electronic device 201a is folded.
  • a folded state of the device may be characterized by the lengthwise end parts of the electronic device 201a contacting each other, or otherwise coming as physically close as possible to one another.
  • the electronic device 201b may include a second camera 292b, and may be alternatively configured such that a part 262b of a display 260b is exposed to an exterior of the device in order to function as a sub-display, even when the electronic device 201b is folded.
  • FIGs. 3A and 3B are views illustrating an electronic device according to various embodiments of the present disclosure.
  • FIG. 3A is a view illustrating an unfolded front of the electronic device
  • FIG. 3B is a view illustrating a cross section along a lengthwise direction of the electronic device.
  • the electronic device 301 may include a strain sensor 342 (e.g., a strain gauge), a first sensor 345, a second sensor 346, a display 360, a hinge 386, an upper Printed-Circuit Board (PCB) 382, a lower PCB 384, and a battery 395.
  • the strain sensor 342 may be disposed at a folded position 380 of the electronic device 301, and may output a "strain value" used to indirectly measure a folding angle of the electronic device 301.
  • An angle sensor may be disposed at the folded position 380 of the electronic device 301, and may directly measure the folding angle of the electronic device 301 or the hinge 386.
  • the first sensor 345 may be disposed at the edge of a front surface of the electronic device 301, which is located beside/above the display 360, and the second sensor 346 may be disposed at the edge of a front surface of the electronic device 301, which is located beside/below the display 360.
  • the first and second sensors 345 and 346 may provide information for calculating the folding angle of the electronic device 301.
  • each of the first and second sensors 345 and 346 may include at least one of a distance sensor and a gyroscope sensor.
  • the display 360 may include a flexible panel 361 for displaying an image and a third sensor 362 for detecting at least one of pressure, fingerprint, and/or the like.
  • the upper PCB 382 and the lower PCB 384 may be separated from each other with the hinge 386 interposed in between, and may be electrically connected to each other through a flexible connector.
  • the hinge 386 (e.g., a free stop hinge) may maintain a folded state of the electronic device 301 at various angles.
  • the electronic device 401 may include a bus 410, a processor 420, a memory 430, a sensor module 440, an input/output interface 450, a display 460 (e.g., the display 160), and a communication interface 470.
  • some of these elements may be omitted from the electronic device 401, or the electronic device 401 may additionally include other elements.
  • the electronic device 401 may further include at least one of a frame buffer 463, a Graphics Processing Unit (GPU) 464, and a Touch Screen Panel (TSP) 466 (or also referred to as a "touch panel").
  • the bus 410 may include a circuit that interconnects the elements 410 to 470 and delivers a communication (e.g., a control message or data) between the elements 410 to 470.
  • the processor 420 may include one or more of a CPU, an AP, and a Communication Processor (CP). The processor 420 may perform, for example, calculations or data processing related to control over and/or communication by at least one of the other elements of the electronic device 401.
  • the memory 430 may include a volatile memory and/or a non-volatile memory.
  • the memory 430 may store, for example, commands or data related to at least one of the other elements of the electronic device 401.
  • the memory 430 may store software and/or a program.
  • the program may include, for example, a kernel, middleware, an Application Programming Interface (API), and/or an application program (or an application).
  • the kernel may control or manage system resources (e.g., the bus 410, the processor 420, the memory 430, and/or the like) used to execute operations or functions implemented by the other programs (e.g., the middleware, the API, and the application program).
  • the kernel may provide an interface capable of controlling or managing the system resources by accessing the individual elements of the electronic device 401 by using the middleware, the API, or the application program.
  • the middleware may serve as an intermediary that enables the API or the application program to communicate with the kernel and to exchange data therewith.
  • the middleware may process one or more task requests received from the application program according to a priority.
  • the middleware may assign a priority, which enables the use of system resources (e.g., the bus 410, the processor 420, the memory 430, and/or the like) of the electronic device 401, to at least one of the application programs, and may process the one or more task requests according to the assigned priority.
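The priority-based handling of task requests described above can be sketched as follows; the class, the method names, and the lower-number-equals-higher-priority convention are illustrative assumptions, not details taken from the document:

```python
import heapq

class TaskScheduler:
    """Minimal sketch of middleware that serves task requests by assigned priority."""

    def __init__(self):
        self._queue = []   # min-heap of (priority, order, task)
        self._order = 0    # tie-breaker that keeps insertion order stable

    def submit(self, priority, task):
        # Lower numeric value = higher priority (served first).
        heapq.heappush(self._queue, (priority, self._order, task))
        self._order += 1

    def run_next(self):
        # Pop and run the highest-priority pending task request.
        if not self._queue:
            return None
        _, _, task = heapq.heappop(self._queue)
        return task()
```

For example, a request submitted with priority 0 is served before one submitted earlier with priority 2, mirroring how the middleware may let a foreground application's requests use system resources first.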
  • the API is an interface through which the application controls a function provided by the kernel or the middleware, and may include, for example, at least one interface or function (e.g., command) for file control, window control, image processing, character control, or the like.
  • the sensor module 440 may measure a physical quantity, or may detect an operation state of the electronic device 401, and may convert the measured physical quantity or the detected operation state into an electrical signal.
  • the sensor module 440 may include at least one of, for example, an angle sensor 442, a distance sensor 444, and a gyroscope sensor 446.
  • the sensor module 440 may include at least one of a GNSS sensor, a Global Positioning System (GPS) sensor, a gesture sensor, an atmospheric pressure sensor, a magnetic sensor, an acceleration sensor, a grip sensor, a proximity sensor, a color sensor (e.g., a Red- Green-Blue (RGB) sensor), a biometric sensor, a temperature/humidity sensor, an illuminance sensor, and an Ultraviolet (UV) sensor.
  • the sensor module 440 may include, for example, an e-nose sensor, an electromyography (EMG) sensor, an electroencephalogram (EEG) sensor, an electrocardiogram (ECG) sensor, an Infrared (IR) sensor, an iris sensor, and/or a fingerprint sensor.
  • the sensor module 440 may further include a control circuit for controlling one or more sensors included therein.
  • the electronic device 401 may further include a processor configured to control the sensor module 440, as a part of or separately from the processor 420, and this processor may control the sensor module 440 while the processor 420 is in a sleep state.
  • the input/output interface 450 may deliver a command or data, which is input from a user or another external device, to the element(s) other than the input/output interface 450 within the electronic device 401, or may output, to the user or another external device, commands or data received from the element(s) other than the input/output interface 450 within the electronic device 401.
  • the input/output interface 450 may include at least one of, for example, a speaker, a receiver, an earphone, and a microphone.
  • the display 460 may include at least one of a Liquid Crystal Display (LCD), a Light-Emitting Diode (LED) display, an Organic Light-Emitting Diode (OLED) display, a MicroElectroMechanical Systems (MEMS) display, an electronic paper display, and a projector.
  • the display 460 may display various pieces of content (e.g., text, images, videos, icons, symbols, and/or the like.) to the user.
  • the frame buffer 463 may store pixel values or pixel color values to be output to the display 460, and may be implemented by a memory, the GPU 464, a memory within the display 460 or a display controller, or a virtual device such as the Linux frame buffer device.
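As an illustration of the role described above, a frame buffer can be modeled as a flat array holding one color value per display pixel; the class and method names below are hypothetical:

```python
class FrameBuffer:
    """Sketch of a frame buffer: one RGB color value per display pixel."""

    def __init__(self, width, height):
        self.width, self.height = width, height
        # 3 bytes per pixel (R, G, B), row-major order.
        self.pixels = bytearray(width * height * 3)

    def set_pixel(self, x, y, rgb):
        # Write the color value that will be scanned out to the display.
        i = (y * self.width + x) * 3
        self.pixels[i:i + 3] = bytes(rgb)

    def get_pixel(self, x, y):
        i = (y * self.width + x) * 3
        return tuple(self.pixels[i:i + 3])
```

A display controller (or a virtual device such as the Linux frame buffer device) would repeatedly read such an array to drive the panel.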
  • the GPU 464 may generate a screen including various objects, such as an item, an image, text, and/or the like.
  • the GPU 464 may calculate at least one attribute value (e.g., a coordinate value, a form, a size, a color, and/or the like) with which each object is to be displayed according to the layout of a screen, and may generate a screen, which has various layouts and includes the objects, on the basis of the calculated attribute value.
  • a screen or an application screen may refer to the whole or part of an image shown on a surface (or a display area) of the display 460.
  • the application screen may be referred to as a "graphical interface", a "GUI", an "application window", an "application area", and/or the like.
  • the TSP 466 may receive a touch input, a gesture input, a proximity input, or a hovering input provided by an electronic pen or a body part of the user.
  • the TSP 466 may be included in the display 460.
  • the TSP 466 may detect a touch/hovering input by using at least one of, for example, a capacitive scheme, a resistive scheme, an infrared scheme, and a surface acoustic wave scheme.
  • the TSP 466 may further include a control circuit.
  • the TSP 466 may further include a tactile layer and may provide a tactile reaction to the user.
  • the communication interface 470 may establish, for example, communication between the electronic device 401 and an external device (e.g., an external electronic device 404 or a server 406).
  • the communication interface 470 may be connected to a network 402 through wireless or wired communication and may communicate with the external device (e.g., the second external electronic device 404 or the server 406).
  • the types of the wireless communication may include, for example, cellular communication which uses at least one of Long-Term Evolution (LTE), LTE-Advanced (LTE-A), Code Division Multiple Access (CDMA), Wideband CDMA (WCDMA), Universal Mobile Telecommunications System (UMTS), WiBro (Wireless Broadband), and Global System for Mobile Communications (GSM).
  • the types of the wireless communication may include at least one of, for example, WiFi, Bluetooth, Bluetooth Low Energy (BLE), Zigbee, Near Field Communication (NFC), magnetic secure transmission, Radio Frequency (RF), and Body Area Network (BAN).
  • the types of the wireless communication may include a GNSS.
  • the GNSS may be, for example, a Global Positioning System (GPS), a Global Navigation Satellite System (Glonass), a Beidou Navigation Satellite System (hereinafter "Beidou”), or a European Global Satellite-based Navigation System (Galileo).
  • the types of the wired communication may include at least one of, for example, a Universal Serial Bus (USB), a High Definition Multimedia Interface (HDMI), Recommended Standard 232 (RS-232), Power Line communication (PLC), and a Plain Old Telephone Service (POTS).
  • Examples of the network 402 may include at least one of telecommunication networks, such as a computer network (e.g., a Local Area Network (LAN), or a Wide Area Network (WAN)), the Internet, and a telephone network.
  • the external electronic device 404 may be of a type identical to or different from that of the electronic device 401. According to various embodiments of the present disclosure, all or some of the operations performed by the electronic device 401 may be performed by another electronic device or multiple electronic devices (e.g., the external electronic device 404 or the server 406). According to an embodiment of the present disclosure, when the electronic device 401 needs to perform some functions or services automatically or upon request, the electronic device 401 may, instead of or in addition to performing the functions or services by itself, send a request to another device (e.g., the external electronic device 404 or the server 406) to perform at least some functions related to the functions or services.
  • Another electronic device may execute the requested functions or the additional functions, and may deliver a result of the execution to the electronic device 401.
  • the electronic device 401 may process the received result without any change or additionally and may provide the requested functions or services.
  • use may be made of, for example, cloud computing technology, distributed computing technology, or client-server computing technology.
  • FIG. 4B is a block diagram illustrating a configuration of an electronic device 201 according to various embodiments of the present disclosure.
  • the electronic device 201 may include, for example, the whole or part of the electronic device 401 illustrated in FIG. 4A .
  • the electronic device 201 may include at least one processor (e.g., an AP) 210, a communication module 220, a subscriber identification module 224, a memory 230, a sensor module 240, an input apparatus 250, a display 260, an interface 270, an audio module 280, a camera module 291, a power management module 295, a battery 296, an indicator 297, and a motor 298.
  • the processor 210 may control multiple hardware or software elements connected to the processor 210 and may perform the processing of and arithmetic operations on various data, by running, for example, an OS or an application program.
  • the processor 210 may be implemented by, for example, a System on Chip (SoC).
  • the processor 210 may further include a Graphics Processing Unit (GPU) and/or an image signal processor.
  • the processor 210 may include at least some (e.g., a cellular module 221) of the elements illustrated in FIG. 4B .
  • the processor 210 may load, into a volatile memory, instructions or data received from at least one (e.g., a non-volatile memory) of the other elements and may process the loaded instructions or data, and may store the resulting data in a non-volatile memory.
  • the communication module 220 may include, for example, the cellular module 221, a Wi-Fi module 223, a Bluetooth (BT) module 225, a GNSS module 227, an NFC module 228, and an RF module 229.
  • the cellular module 221 may provide a voice call, a video call, a text message service, an Internet service, and/or the like through a communication network.
  • the cellular module 221 may identify or authenticate the electronic device 201 in the communication network by using the subscriber identification module (e.g., a Subscriber Identity Module (SIM) card) 224.
  • the cellular module 221 may perform at least some of the functions that the processor 210 may provide.
  • the cellular module 221 may include a CP.
  • at least some (e.g., two or more) of the cellular module 221, the Wi-Fi module 223, the BT module 225, the GNSS module 227, and the NFC module 228 may be included in one Integrated Chip (IC) or IC package.
  • the RF module 229 may transmit and receive, for example, communication signals (e.g., RF signals).
  • the RF module 229 may include, for example, a transceiver, a Power Amplifier Module (PAM), a frequency filter, a Low Noise Amplifier (LNA), and an antenna.
  • at least one of the cellular module 221, the Wi-Fi module 223, the BT module 225, the GNSS module 227, and the NFC module 228 may transmit and receive RF signals through a separate RF module.
  • the subscriber identification module 224 may include, for example, a card including a subscriber identity module or an embedded SIM, and may contain unique identification information (e.g., an Integrated Circuit Card Identifier (ICCID)) or subscriber information (e.g., an International Mobile Subscriber Identity (IMSI)).
  • the memory 230 may include, for example, an internal memory 232 or an external memory 234.
  • the internal memory 232 may include at least one of, for example, a volatile memory (e.g., a Dynamic Random Access Memory (DRAM), a Static RAM (SRAM), a Synchronous DRAM (SDRAM), and/or the like); and a non-volatile memory (e.g., a One Time Programmable Read-Only Memory (OTPROM), a Programmable ROM (PROM), an Erasable and Programmable ROM (EPROM), an Electrically Erasable and Programmable ROM (EEPROM), a mask ROM, a flash ROM, a flash memory, a hard drive, and a Solid State Drive (SSD)).
  • the external memory 234 may include a flash drive, for example, a Compact Flash (CF), a Secure Digital (SD), a Micro-Secure Digital (Micro-SD), a Mini-Secure Digital (Mini-SD), an extreme Digital (xD), a Multi-Media Card (MMC), a memory stick, or the like.
  • the external memory 234 may be functionally or physically connected to the electronic device 201 through various interfaces.
  • the sensor module 240 may measure a physical quantity or may detect an operation state of the electronic device 201, and may convert the measured physical quantity or the detected operation state into an electrical signal.
  • the sensor module 240 may include at least one of, for example, a gesture sensor 240A, a gyro sensor 240B, an atmospheric pressure sensor 240C, a magnetic sensor 240D, an acceleration sensor 240E, a grip sensor 240F, a proximity sensor 240G, a color sensor 240H (e.g., a Red-Green-Blue (RGB) sensor), a biometric sensor 2401, a temperature/humidity sensor 240J, an illuminance sensor 240K, and an Ultraviolet (UV) sensor 240M.
  • the sensor module 240 may include, for example, an E-nose sensor, an electromyography (EMG) sensor, an electroencephalogram (EEG) sensor, an electrocardiogram (ECG) sensor, an Infrared (IR) sensor, an iris sensor, and/or a fingerprint sensor.
  • the sensor module 240 may further include a control circuit for controlling one or more sensors included therein.
  • the electronic device 201 may further include a processor configured to control the sensor module 240, as a part of or separately from the processor 210, and this processor may control the sensor module 240 while the processor 210 is in a sleep state.
  • the input apparatus 250 may include, for example, a touch panel 252, a (digital) pen sensor 254, a key 256, and an ultrasonic input unit 258.
  • the touch panel 252 may use at least one of, for example, a capacitive scheme, a resistive scheme, an infrared scheme, and a surface acoustic wave scheme.
  • the touch panel 252 may further include a control circuit.
  • the touch panel 252 may further include a tactile layer and may provide a tactile reaction to the user.
  • the (digital) pen sensor 254 may include, for example, a recognition sheet that is a part of the touch panel or is separated from the touch panel.
  • the key 256 may be, for example, a physical button, an optical key, and a keypad.
  • the ultrasonic input unit 258 may sense an ultrasonic wave generated by an input through a microphone (e.g., a microphone 288), and may identify data corresponding to the sensed ultrasonic wave.
  • the display 260 may include a panel 262, a hologram unit 264, a projector 266, and/or a control circuit for controlling the same.
  • the panel 262 may be implemented to be, for example, flexible, transparent, or wearable.
  • the panel 262 together with the touch panel 252 may be implemented as one or more modules.
  • the panel 262 may include a pressure sensor (or a force sensor) capable of measuring the strength of pressure of a user's touch.
  • the pressure sensor and the touch panel 252 may be integrated into one unit, or the pressure sensor may be implemented by one or more sensors separated from the touch panel 252.
  • the hologram unit 264 may display a three-dimensional image in the air by using the interference of light.
  • the projector 266 may display an image by projecting light onto a screen.
  • the screen may be located, for example, inside or outside the electronic device 201.
  • the interface 270 may include, for example, a High-Definition Multimedia Interface (HDMI) 272, a Universal Serial Bus (USB) 274, an optical interface 276, and a D-subminiature (D-sub) 278.
  • the interface 270 may be included in, for example, the communication interface 470 illustrated in FIG. 4A .
  • the interface 270 may include, for example, a Mobile High-definition Link (MHL) interface, a Secure Digital (SD) card/Multi-Media Card (MMC) interface, or an Infrared Data Association (IrDA) standard interface.
  • the audio module 280 may bidirectionally convert between a sound and an electrical signal. At least some elements of the audio module 280 may be included in, for example, the input/output interface 450 illustrated in FIG. 4A .
  • the audio module 280 may process sound information which is input or output through, for example, a speaker 282, a receiver 284, an earphone 286, the microphone 288, or the like.
  • the camera module 291 is, for example, a device capable of capturing a still image and a moving image.
  • the camera module 291 may include one or more image sensors (e.g., a front sensor or a back sensor), a lens, an Image Signal Processor (ISP), and a flash (e.g., an LED, a xenon lamp, or the like).
  • the power management module 295 may manage, for example, power of the electronic device 201.
  • the power management module 295 may include a Power Management Integrated Circuit (PMIC), a charger IC, or a battery or fuel gauge.
  • the PMIC may use a wired and/or wireless charging method. Examples of the wireless charging method may include, for example, a magnetic resonance method, a magnetic induction method, an electromagnetic method, and/or the like.
  • the PMIC may further include additional circuits (e.g., a coil loop, a resonance circuit, a rectifier, and/or the like) for wireless charging.
  • the battery gauge may measure, for example, a residual quantity of the battery 296, and a voltage, a current, or a temperature during the charging.
  • the battery 296 may include, for example, a rechargeable battery and/or a solar battery.
  • the indicator 297 may display a particular state (e.g., a booting state, a message state, a charging state, or the like) of the electronic device 201 or a part (e.g., the processor 210) of the electronic device 201.
  • the motor 298 may convert an electrical signal into a mechanical vibration, and may generate a vibration, a haptic effect, or the like.
  • the electronic device 201 may include, for example, a mobile television (TV) support unit (e.g., a GPU) that may process media data according to a standard, such as Digital Multimedia Broadcasting (DMB), Digital Video Broadcasting (DVB), or mediaFLO™.
  • each of the above-described elements of hardware according to the present disclosure may include one or more components, and the names of the corresponding elements may vary based on the type of electronic device.
  • the electronic device 201 may omit some elements or may further include additional elements, or some of the elements of the electronic device may be combined into one entity, which may perform functions identical to those of the relevant elements before the combination.
  • FIGs. 5A to 5G are views illustrating a program module according to various embodiments of the present disclosure.
  • FIG. 5A is a block diagram illustrating a configuration of a program module according to various embodiments of the present disclosure.
  • the program module may include an OS for controlling resources related to the electronic device (e.g., the electronic device 101, 201, or 401) and/or various applications executed in the OS.
  • the OS may be, for example, Android™, iOS™, Windows™, Symbian™, Tizen™, Bada™, and/or the like.
  • At least some of the program module may be preloaded on the electronic device, or may be downloaded from an external electronic device (e.g., the electronic device 404, the server 406, and/or the like).
  • a folding event converter 520 may analyze raw data received from a sensor 510 (e.g., a strain sensor 512, a distance sensor 513, a gyroscope sensor 514, and/or the like), and may calculate a folding state.
  • folding may refer to "bending”
  • folding event may refer to a “folding gesture.”
  • a folding event dispatcher/handler 528 may deliver a folding state/event to a system 565 or an application 560.
  • a folding state manager 522 may deliver a folding state/event to a relevant module.
  • a state memory module 524 may store a folding state/event.
  • a notification/message manager 541 may provide a user with an event, such as an arrival message, an appointment, a proximity notification, and/or the like.
  • An event logger 542 may record events, and may display the events in chronological order.
  • a telephony module 543 may manage a voice/video call function of the electronic device.
  • a timer 544 may provide a timer function.
  • a location manager 545 may manage, for example, location information of the electronic device.
  • a power manager 546 may operate in conjunction with, for example, a power controller 548 and may manage the capacity of a battery or power, and may provide power information utilized for an operation of the electronic device.
  • a system event receiver 532 may receive an event from the notification/message manager 541, the event logger 542, the telephony module 543, the timer 544, or the power manager 546, and may deliver the received event to a context manager 534.
  • An input handler 530 may receive an input from a TSP 515, a mouse 516, or a key 517, and may deliver the received input to the context manager 534.
  • the context manager 534 may manage a system service on the basis of an input, a system event, a folding state, and/or the like.
  • a display controller 574 may control on/off of a screen.
  • a frame buffer 572 may store pixel values or pixel color values to be output to the display.
  • a graphic composer 570 may generate a screen including various objects, such as an item, an image, text, and/or the like.
  • a window manager 550 may manage GUI resources used for a screen.
  • the application 560 may include applications which provide, for example, a home, a dialer, an SMS/MMS, an Instant Message (IM), a browser, a camera, an alarm, a contact, a voice dial, an email, a calendar, a media player, an album, a watch, health care (e.g., which measures an exercise quantity, a blood sugar level, or the like), and environmental information (e.g., information on atmospheric pressure, humidity, or temperature).
  • At least part of the program module may be implemented (e.g., executed) in software, firmware, hardware (e.g., a processor), or at least two combinations thereof, and may include a module, a program, a routine, a set of instructions, or a process for performing one or more functions.
  • the folding event converter 520 may analyze raw data from the sensor 510 and may calculate a folding state. For example, the folding event converter 520 may calculate a folding angle of the electronic device 301 by comparing an angle formed between the axis of the first sensor 345 and the ground, which has been measured by the first sensor 345 including a gyroscope sensor, with an angle formed between the axis of the second sensor 346 and the ground which has been measured by the second sensor 346 including a gyroscope sensor.
  • the folding event converter 520 may determine that the electronic device 301 is folded and the first sensor 345 and the second sensor 346 face each other, and may calculate a folding angle of the electronic device 301, since the distance between the first sensor 345 and the second sensor 346 corresponds to (or is proportional to) the folding angle.
  • a folding state may be directly measured by an angle sensor 511, or may be determined through calculation performed by the folding event converter 520.
  • the folding state may be expressed by an angle itself, may be expressed by one state corresponding to a predetermined angle range, or may be expressed by the trend of increase/reduction in angle.
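The angle derivation and the state expressions described above can be sketched as follows; the function names, the tilt convention (degrees relative to the ground), and the state thresholds are illustrative assumptions only:

```python
def folding_angle_from_tilt(upper_tilt_deg, lower_tilt_deg):
    """Estimate the folding angle by comparing the angle to the ground
    measured by the gyroscope in each half of the device.

    When the device lies flat (unfolded), both halves report the same tilt
    and the folding angle is 180 degrees; when fully folded, the two sensor
    axes point in opposite directions and the folding angle is 0 degrees.
    """
    return 180.0 - abs(upper_tilt_deg - lower_tilt_deg)


def folding_state(angle_deg, folded_max=30.0, unfolded_min=150.0):
    """Map a measured angle onto one coarse state per a predetermined
    angle range (the ranges here are assumed values)."""
    if angle_deg <= folded_max:
        return "FOLDED"
    if angle_deg >= unfolded_min:
        return "UNFOLDED"
    return "PARTIALLY_FOLDED"
```

A trend expression (increasing/decreasing angle) could similarly be derived by comparing two successive angle estimates.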
  • the folding state manager 522 may deliver a folding state/event to the event handler 528, and the folding state/event may be processed by the registered event handler 528.
  • the event handler 528 may be registered by various modules, such as the system 565, the application 560, a background service, and/or the like, and the registered event handler 528 may process a folding state/event according to a priority.
  • a priority may be determined in order of the system 565, a visible (or activated) application, an application which is being executed but is not directly visible to the user, and a background service which does not have a visible UI.
  • when an event handler having a higher priority does not process a folding state/event, the folding state/event may be processed by the event handler having the next priority.
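The priority-ordered delivery of folding states/events can be sketched as follows; the registration API and the convention that a handler returns True to consume an event are assumptions made for illustration:

```python
class FoldingEventDispatcher:
    """Sketch: deliver a folding event to registered handlers in priority order.

    A handler returns True to consume the event; otherwise the event is
    passed on to the handler with the next priority.
    """

    def __init__(self):
        self._handlers = []  # list of (priority, handler); lower value = served first

    def register(self, priority, handler):
        self._handlers.append((priority, handler))
        self._handlers.sort(key=lambda entry: entry[0])

    def dispatch(self, event):
        for _, handler in self._handlers:
            if handler(event):
                return True   # consumed; stop the chain
        return False          # no registered handler consumed the event
```

For instance, a system-level handler registered with the highest priority may decline a folding event, letting a video application's handler with the next priority stop or start playback.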
  • the sequential processing of a folding state/event may have an effect described below.
  • the system 565 may display an alpha screen or a gadget.
  • the alpha screen may be implemented by a window that is capable of freely moving and displays at least one piece of content/information among multimedia content, user-customized information, and real-time information.
  • when the system 565 does not process a folding event and the event handler 528 having the next priority is registered by a video application, the video application may stop or start the reproduction of a video according to a folding/unfolding event.
  • the folding state manager 522 may directly deliver a folding state/event to the system 565 or the application 560 without passing through the event handler. For example, the folding state manager 522 may determine whether a folding state/event is to be delivered and/or a subject to which the folding state/event is to be delivered, based on context information of the electronic device which has been acquired from the context manager 534.
  • a screen of a display 580 may be divided into multiple areas. Among possible screen division schemes, a scheme may be used that does not virtually divide the display 580 into separate displays.
  • coordinates on the display 580 corresponding to an identical point (e.g., at a same relative position on both display partitions) in application windows 583 and 584 are different, according to locations of the application windows 583 and 584. Therefore, the window manager 550 may adjust the application windows 583 and 584 so as to correspond to the divided areas.
  • the window manager 550 may adjust the virtual displays 581 and 582 so as to correspond to the divided areas.
  • FIG. 6A is an illustrative view explaining utilization of a ratio for screen division in an electronic device, according to various embodiments of the present disclosure.
  • a default screen (or a full screen) 600 of a display may have a screen ratio of 24:9. It is understood that the ratio disclosed herein is exemplary, and a ratio of the default screen 600 may be changed with or without alterations in the physical dimensions of the screen. Also, it may be understood that a first window 605 occupying the entire area of the display utilizes the ratio of 24:9.
  • the default screen 600 is illustrated as the unfolded screen of the display, and may be switched to one of the screens 610, 620, and 630 having various division ratios in response to an event for screen division.
  • in a state of displaying the default screen 600, the default screen 600 may be switched to a first screen 610 on which a second window 607 having a ratio of 8:9 is displayed in an overlay form over the first window 605 having a ratio of 24:9, according to the type of event related to screen division.
  • the size of the first window 615 may be reduced to a ratio of 16:9, and a second screen 620, on which the second window 617 having a ratio of 8:9 shares a common boundary line with the first window 615, may be displayed.
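As a rough illustration of the division ratios discussed above, the window widths can be derived by splitting the display width proportionally; a 24:9 default screen divides into a 16:9 first window and an 8:9 second window that share a boundary line. The 2400x900 pixel panel below is an assumption for the example only:

```python
# Illustrative ratio arithmetic (the pixel dimensions are assumptions):
# split a display width proportionally to the width ratios of the
# windows; every window spans the full display height.

def divide_widths(display_w, ratios):
    """Return the pixel width of each window for the given ratios."""
    total = sum(ratios)
    return [display_w * r // total for r in ratios]

# A hypothetical 2400x900 (24:9) panel divided into 16:9 and 8:9 windows:
first_w, second_w = divide_widths(2400, [16, 8])
```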
  • the second window (e.g., an alpha screen) 607 displayed in an overlay form may be invoked in response to folding interaction, and the second window (e.g., an alpha screen) 607 may be hidden in response to unfolding interaction.
  • the fixed second window (e.g., an alpha screen) 617 may be displayed in response to a lining input, which traverses a screen, as in the case of the second screen 620.
  • the second window (e.g., an alpha screen) 617 may be cancelled in response to a return input, for example, a lining input which traverses a screen in an opposite direction, and the current state may be changed to the state of displaying the default screen 600.
  • the third screen 630, on which at least two windows 625 and 627 each having a ratio of 4:3 are disposed with a folded part of the display as a reference, may be displayed in response to the type of event related to screen division.
  • switching from the above-described default screen 600 to the first screen 610 or the second screen 620, and switching from the above-described default screen 600 to the third screen 630 may be achieved with an event for screen division as a reference.
  • the event for screen division may include a change of a folding state, an input (or a touch/hovering/swipe input) which traverses at least part in one direction on the display 460, an input which moves from a first end along a widthwise direction of the display 460 or a position close to the first end to a second end located on the opposite side of the first end or a position close to the second end, a touch input in the case where a position at which the touch input has occurred continues to be touched during a predetermined time period, a change in touch strength, a folding/bending/motion input for folding or bending the electronic device, a voice input, the selection of a software button (or a toggle button) displayed on the display 460, and/or the like.
  • At least two screen ratios which configure the display 460 may be set in response to the detection of the event. For example, whenever a toggle button displayed on the display 460 is selected, at least two screen ratios which configure the display 460 may be changed.
  • switching between the first screen 610 or the second screen 620 and the third screen 630 may be achieved in response to the detection of the event.
  • returning from any one of the first screen 610 to the third screen 630 back to the default screen 600 may be achieved in response to an event for screen return.
  • the event for screen return may include an input (or a touch/hovering/swipe input) which traverses at least part in a direction opposite to one direction on the display 460, an input which moves from a second end located on the opposite side of a first end along a widthwise direction of the display 460 to the first end or a position close to the first end, an unfolding/motion input for unfolding the electronic device, a voice input, the selection of a software button (or a toggle button) displayed on the display 460, and/or the like.
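One possible way to tell the screen-division event apart from the screen-return event, assuming a horizontal swipe that traverses at least part of the display, is sketched below; the threshold value and the coordinate convention are assumptions, not part of this disclosure:

```python
# A minimal sketch (assumed): a swipe from the first end toward the
# second end divides the screen, and the opposite motion returns to
# the default screen.

def classify_swipe(start_x, end_x, display_w, threshold=0.5):
    """Classify a horizontal swipe that traverses at least `threshold`
    of the display width; return 'divide', 'return', or None."""
    travel = end_x - start_x
    if abs(travel) < threshold * display_w:
        return None  # too short to count as traversing the display
    return "divide" if travel > 0 else "return"
```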
  • FIG. 6B is a view illustrating various states of an electronic device according to various embodiments of the present disclosure.
  • the flexible electronic device may facilitate implementation of various folding or unfolding states, as illustrated in FIG. 6B .
  • the folded state 600a illustrates the electronic device being completely folded, such that both end parts of the electronic device contact each other or come as close as possible to each other.
  • An unfolded state 600b illustrates the electronic device being completely unfolded.
  • a state 600c illustrates a particular state including displaying a window for an overlay scheme, when the electronic device is bent at a preset angle or more with one axis as a reference.
  • a compact standing state is illustrated at 600d, in which the electronic device stands partially upright, for example, when partially inwardly folded towards the display, roughly into one-half portions.
  • an arc standing state is shown in 600e in which the electronic device may be stood, when folded so as to facilitate presentation of a landscape orientation of the display.
  • a folding ratio may be implemented such that the electronic device is folded in half, with the middle axis of the electronic device used as a reference.
  • folding ratios may be implemented differently, in which other parts of the display may serve as the axis when the display is folded.
  • the flexible electronic device may be folded or bent with one axis as a reference.
  • one axis may be preset or 'optional' (e.g., user designated).
  • the preset axis may indicate or correlate to a device in which a particular area (e.g., a partial area including the axis) of the display of the flexible electronic device is implemented with the capacity to be folded or otherwise bent.
  • the 'optional' axis may indicate that the entire area (or length) of the display of the flexible electronic device has the capacity to be folded or bent.
  • in FIG. 6B, the electronic device is illustrated as being folded along an axis passing through the center of the electronic device. That said, it should be understood that the location of this folding axis is not limited to the example embodiments shown in FIG. 6B .
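A hypothetical mapping from a measured fold angle to states such as those of FIG. 6B might look as follows; the angle thresholds and state names are illustrative assumptions rather than values from this disclosure:

```python
# Hypothetical fold-angle classification (all thresholds assumed):
# the angle is measured in degrees between the two halves of the display.

def folding_state(angle):
    """0 degrees = completely folded (600a); 180 = unfolded (600b)."""
    if angle <= 10:
        return "folded"            # state 600a
    if angle >= 170:
        return "unfolded"          # state 600b
    if angle <= 60:
        return "compact_standing"  # e.g., state 600d
    return "bent"                  # e.g., state 600c, overlay window shown
```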
  • FIGs. 7A to 7E are views illustrating an operating method of an electronic device according to various embodiments of the present disclosure.
  • the electronic device may detect a preset user input while folded (e.g., in a folded state).
  • the user input may include knocking (or tapping) on the sub-display 162 once or a preset number of times, clicking the power/lock button once or a preset number of times, and/or a voice command.
  • the electronic device may detect a user input selecting a camera application, or initiating capturing of video.
  • the electronic device may initiate capturing of a video in response to the user input.
  • the electronic device may wait (e.g., listen) for a preset user input while simultaneously turning off a screen and/or stopping the capturing of a video.
  • the electronic device may initiate capturing of a video while folded. For example, in response to the user input, the electronic device may execute the camera application to capture the video, or may resume the capturing of video which was previously paused or otherwise stopped.
  • a user may therefore capture a video when the device is folded, utilizing the electronic device in a 'clipped' form as in FIG. 7D .
  • this allows the device to be clipped to the user (e.g., by a pocket or a strap, and/or the like), allowing the device to function as an 'action camera,' as illustrated in FIG. 7D .
  • when the electronic device detects the preset user input or a user input of another type, the electronic device may stop the capturing of the video.
  • FIG. 8 is a view illustrating a method for providing a screen ratio by an electronic device according to various embodiments of the present disclosure.
  • the electronic device may display an original image at a default screen ratio of 4:3 on the display (e.g., the display 460).
  • the electronic device may display an image, which is implemented by cropping the original image at a screen ratio of 21:9 on the display.
  • the electronic device may display an image, which is implemented by cropping the original image, at a screen ratio of 24:9 on the display.
  • the electronic device may display an image at the default screen ratio, and may facilitate a user's holding of the electronic device by leaving spaces on the left and right sides of the image in which no image is displayed.
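The cropping described above amounts to simple center-crop arithmetic: a 4:3 original is cropped to a wider target ratio by trimming height equally from top and bottom. The following sketch, with an assumed 4000x3000 source image, shows one way to compute the cropped height and vertical offset:

```python
# Center-crop arithmetic for the ratios above (the 4000x3000 source
# image is an assumption for illustration).

def center_crop_height(orig_w, orig_h, target_w_ratio, target_h_ratio):
    """Return (crop_h, top_offset) so orig_w x crop_h has the target ratio."""
    crop_h = orig_w * target_h_ratio // target_w_ratio
    if crop_h > orig_h:
        raise ValueError("target ratio is narrower than the original")
    return crop_h, (orig_h - crop_h) // 2

crop_h_21_9, top_21_9 = center_crop_height(4000, 3000, 21, 9)
crop_h_24_9, top_24_9 = center_crop_height(4000, 3000, 24, 9)
```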
  • FIGs. 9A to 9C are views illustrating a method for controlling a display by an electronic device according to various embodiments of the present disclosure.
  • a screen division may facilitate image editing.
  • a processor (e.g., the processor 210 or 420) of the electronic device 901 (e.g., the electronic device 101, 201, or 401) may automatically configure a first area 912 and a second area 914 in at least a partial area (or a full screen) of the display 906 of the electronic device 901, in response to an input (e.g., a swipe input which at least partially traverses a screen) requesting execution of screen division.
  • the processor may display a first graphic element 920 (e.g., a screen/graphical interface of a camera application), which includes a preview image, in the first area 912, and may display a second graphic element 930, which includes at least one preview image to which a special effect (e.g., a brightness effect, a monotone effect, a tone effect, a saturation effect, and/or the like) is applied, in the second area 914.
  • the processor may automatically display the first graphic element 920 in the first area 912 in response to an input (e.g., a swipe input which at least partially traverses a screen) requesting screen division, and may display a second graphic element 940, which includes captured images (or icons) stored in a memory of the device, in the second area 914.
  • the screen division may facilitate image or video capturing options.
  • the processor may automatically display the first graphic element 920 (e.g., a preview for image capture) in the first area 912 in response to an input (e.g., a swipe input which at least partially traverses a screen) requesting screen division, and may display a second graphic element 950, which includes multiple selectable objects for setting various configurable attributes of the camera application, in the second area 914, and receive inputs to these attributes altering how an image or video is to be captured.
  • FIGs. 10A to 10E are views each explaining a method for controlling a display by an electronic device according to various embodiments of the present disclosure.
  • a processor (e.g., the processor 210 or 420) of the electronic device 1001 (e.g., the electronic device 101, 201, or 401) may display a first graphic element 1020 (e.g., a screen/graphical interface of a message/Social Network Service (SNS) application), which is related to a first application (e.g., the message/SNS application), in at least a partial area (or a full screen) of a display 1006 of the electronic device 1001.
  • the processor may configure a first area 1012 and a second area 1014 in at least a partial area in response to a touch input 1050 traversing the display 1006 at least partially in a first direction (e.g., a "swipe" or "swiping" motion).
  • the processor may reduce and display the first graphic element 1020 in the first area 1012.
  • the processor may display a second graphic element 1030 (e.g., a screen/graphical interface of a camera application), which may be related to a second application (e.g., the camera application), in the second area 1014.
  • the processor may display a graphic element 1032, which includes the captured images (or icons) stored in a memory, in the second area 1014, in response to an input (e.g., the selection of an image-capturing button of the camera application) which selects image-capturing.
  • the processor may transmit at least one file, image, or document corresponding to or otherwise represented by the at least one object 1034, to an external electronic device through a communication interface (e.g., the communication interface 470) that is functionally connected to the electronic device 1001.
  • the processor may display the at least one object 1034 on the first graphic element 1020 as seen therein.
  • the processor may display an icon (e.g., a thumbnail or a reduced image), which corresponds to the transmitted image, in a chat window of a message/SNS application.
  • FIGs. 11A to 11E are views illustrating a method for controlling a display by an electronic device according to various embodiments of the present disclosure, particularly for recommending various information related to a currently detected location.
  • a processor (e.g., the processor 210 or 420) of the electronic device 1101 (e.g., the electronic device 101, 201, or 401) may display a first graphic element 1120 (e.g., a screen/graphical interface of a camera application), which is related to a first application (e.g., the camera application), in at least a partial area (or a full screen) of a display 1106 of the electronic device 1101.
  • the processor may search a memory (e.g., the memory 430) or a server (e.g., the server 406) for recommendation information related to a current location, and may display an indicator 1115, indicating the existence of location-based recommendation, according to a result of the search.
  • the processor may configure a first area 1111 and a second area 1114 in the at least partial area (or a full screen) in response to an input selecting the indicator 1115.
  • the processor may reduce and display the first graphic element 1120 in the first area 1111.
  • the processor may display a second graphic element 1130 (e.g., a list of images/icons), which includes multiple objects 1132 recommended in relation to the current location, in the second area 1114.
  • the processor may display, in the second area 1114, images (or icons of the images) which have been obtained by image-capturing the subject 1122 shown in a preview image included in the first graphic element 1120.
  • the processor may make a search on the basis of at least one piece of information among current location information and preview image information.
  • the processor may determine which images are to be recommended, or the priorities of the images, on the basis of at least one of: proximity of the image-capturing place to the current location, the number of references/recommendations, the accuracy of matching with the subject 1122 to be image-captured, and the existence of an image-capturing setting value.
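A possible scoring scheme reflecting the criteria above (nearness to the current location, reference/recommendation count, match accuracy with the subject, availability of a setting value) is sketched below; the weights, field names, and sample data are all assumptions for illustration:

```python
# Illustrative recommendation ranking; weights and field names are
# assumptions, not values from this disclosure.

def recommendation_score(img, max_distance_m=1000):
    nearness = max(0.0, 1.0 - img["distance_m"] / max_distance_m)
    has_settings = 1.0 if img.get("settings") else 0.0
    # weights are arbitrary illustration values
    return (2.0 * nearness
            + 1.0 * min(img["references"], 100) / 100
            + 2.0 * img["match_accuracy"]
            + 0.5 * has_settings)

def rank_recommendations(images):
    return sorted(images, key=recommendation_score, reverse=True)

candidates = [
    {"id": "far",  "distance_m": 900, "references": 5,
     "match_accuracy": 0.2, "settings": None},
    {"id": "near", "distance_m": 50,  "references": 80,
     "match_accuracy": 0.9, "settings": {"iso": 100}},
]
ranked = rank_recommendations(candidates)
```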
  • the processor may display a graphic element 1134 (e.g., at least one image-capturing setting value of a relevant image, and/or the like), corresponding to one object 1132 (e.g., an image or an icon) of the second graphic element 1130, in the second area 1114 in response to an input which selects the object 1132.
  • the graphic element 1134 may include at least one image-capturing setting value among an ISO sensitivity, an F-number, a shutter speed, white balance information, and a focus distance.
  • the processor may apply at least one setting value of the graphic element 1134 to the first application in response to an input 1162 (e.g., a swipe input directed to the first graphic element 1120 from the graphic element 1134) related to the graphic element 1134.
  • the processor may control a camera application to change at least one setting value of the camera application to at least one setting value of the selected graphic element 1134 or the selected image.
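Applying the selected image's setting values to the camera application could be sketched as a guarded copy of recognized keys. The `CameraApp` class and its setting keys below are hypothetical, not part of this disclosure:

```python
# Minimal sketch: copy only recognized image-capturing setting values
# (ISO, F-number, shutter speed, white balance, focus distance) from a
# selected image into the running camera application. All names are
# hypothetical.

ALLOWED_KEYS = {"iso", "f_number", "shutter_speed",
                "white_balance", "focus_distance"}

class CameraApp:
    def __init__(self):
        self.settings = {"iso": 100, "f_number": 1.8,
                         "shutter_speed": 1 / 60,
                         "white_balance": "auto",
                         "focus_distance": None}

    def apply_settings(self, values):
        """Copy only recognized setting values from the selected image."""
        for key, value in values.items():
            if key in ALLOWED_KEYS:
                self.settings[key] = value

app = CameraApp()
app.apply_settings({"iso": 400, "shutter_speed": 1 / 250, "unknown": 1})
```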
  • the processor may output information for guiding the user to an image-capturing place of the relevant image (e.g., of the selected graphic element 1134 or the selected image), or may output an alarm/notification when the current location coincides with or is close to the image-capturing place.
  • the processor may display a graphic element 1136, which includes an image-capturing place 1172 of the selected object 1132 or the relevant image, a current location 1174, and a path 1176 for moving to the image-capturing place, in the second area 1114.
  • the graphic element 1136 may include a map 1170 or a screen/graphical interface of a map application.
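The proximity-based alarm/notification described above reduces to a distance check between the current location and the image-capturing place. A sketch using the haversine great-circle distance, with an assumed notification radius, follows:

```python
# Sketch (assumed) of the proximity check behind the alarm/notification:
# notify when the current location is within some radius of the
# image-capturing place.

import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points."""
    r = 6371000.0  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = (math.sin(dp / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def should_notify(current, capture_place, radius_m=100.0):
    """True when the current location is close to the image-capturing place."""
    return haversine_m(*current, *capture_place) <= radius_m
```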
  • the processor may output an alarm/notification, or may control a first application so as to activate the first application.
  • FIGs. 12A, 12B , 13A and 13B are views illustrating image-capturing methods according to various states of an electronic device according to various embodiments of the present disclosure.
  • the electronic device (e.g., the electronic device 101, 201, or 401) may be switched from a state in which the electronic device is completely unfolded, as illustrated in FIG. 12A , to a compact standing state in which the electronic device is inwardly folded in half (e.g., folded so the screens approach one another), as illustrated in FIG. 12B , which may facilitate a user's capturing a 'selfie' image in a hands-free state. That is, the device may be set on a surface away from the user, but facing the user and any other subjects of the image.
  • the user may execute image-capturing using a timer image-capturing function (by, for example, selection of a displayed menu or item), and/or a user input utilizable from a distance (e.g., a voice command sensed by a microphone, the display of a preset gesture sensed by a camera, and/or the like).
  • the electronic device (e.g., the electronic device 101, 201, or 401) may be switched from a state in which the electronic device is completely unfolded, as illustrated in FIG. 13A , to an "arc" standing state in which the electronic device is folded into a landscape mode, as illustrated in FIG. 13B , and thereby may allow more persons to be image-captured.
  • FIG. 14 is a flowchart illustrating a method for controlling a display by an electronic device according to various embodiments of the present disclosure.
  • the display control method may include operations 1410 to 1440.
  • although the display control method is described as being performed by the electronic device (e.g., the electronic device 101, 201, or 401), the display control method may instead be performed by a processor (e.g., the processor 210 or 420) of the electronic device or by a controller of the electronic device.
  • the electronic device may configure a first area and a second area on a display (e.g., the display 460) that is functionally connected to the electronic device. For example, the electronic device may divide at least part of an entire display area (or a full screen) of the display into multiple areas (e.g., a first area and a second area).
  • the electronic device may detect at least one of an input (or a touch/hovering/swipe input) which at least partially traverses the display, an input which moves from a first end along a widthwise direction of the display or a position close to the first end to a second end located on the opposite side of the first end or a position close to the second end, a folding/bending/motion input for folding or bending the electronic device, and a voice input.
  • the electronic device may configure a first area and a second area on the display in response to the input. For example, at least part of an entire display area (or a full screen) of the display may be divided into multiple areas (e.g., a first area and a second area) in a lengthwise direction. For example, the electronic device may divide the entire display area (or the full screen) of the display into a first area and a second area in the lengthwise direction in response to the input.
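The lengthwise division into a first area and a second area can be illustrated as a rectangle split; the `(x, y, w, h)` representation and the split fraction below are assumptions for illustration:

```python
# Hedged sketch of configuring the first and second areas: the display
# area, assumed here to be an (x, y, w, h) rectangle, is divided in the
# lengthwise direction at a given fraction. All names are illustrative.

def configure_areas(display_rect, split=0.5):
    """Divide display_rect lengthwise into (first_area, second_area)."""
    x, y, w, h = display_rect
    first_w = round(w * split)
    first_area = (x, y, first_w, h)
    second_area = (x + first_w, y, w - first_w, h)
    return first_area, second_area

# e.g., a hypothetical 2400x900 display split two-thirds / one-third:
first_area, second_area = configure_areas((0, 0, 2400, 900), split=2 / 3)
```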
  • the electronic device may display a first graphic element, which is related to a first application, in the first area of the display.
  • the electronic device may display a screen generated by the first application or by the execution of the first application, a screen/graphical interface of the first application, or an image/document/text/moving image reproduced/displayed by the first application, as the first graphic element in the first area.
  • the electronic device may display a second graphic element, which is related to control over the first application, in the second area of the display.
  • the electronic device may display a screen generated by the first application or a second application or by the execution thereof, a screen/graphical interface of the first or second application, or an image/document/text/moving image reproduced/displayed by the first or second application, as the second graphic element in the second area.
  • the first and second graphic elements may be identical or different.
  • the electronic device may display a graphical interface, that includes multiple objects corresponding to multiple images, as the second graphic element in the second area.
  • the graphical interface may correspond to a list of icons.
  • the electronic device may display a graphical interface, that includes at least one selectable object for setting at least one attribute of the first graphic element, as the second graphic element in the second area.
  • the attribute may include at least one of the color, size, and shape/form/font of text/image.
  • the electronic device may control the first application in response to an input related to the second graphic element.
  • the electronic device may detect an input which selects one of multiple objects which correspond to multiple images and are included in a graphical interface corresponding to the second graphic element.
  • the electronic device may control the first application to display an image or a web page, which corresponds to the selected object, in the first area in response to the input.
  • the electronic device may detect an input which selects one of multiple selectable objects for setting multiple attributes of the first graphic element or first application which are included in a graphical interface corresponding to the second graphic element.
  • the electronic device may control the first application to change the attribute of the first graphic element or first application, which corresponds to the selected object, in response to the input.
  • the electronic device may detect an input for changing a part of a document corresponding to the second graphic element.
  • the electronic device may control the first application to display the changed part of the document in each of the first and second areas in response to the input.
  • the electronic device may detect an input which selects one of multiple objects included in the second graphic element.
  • the electronic device may control the first application to transmit a file or an image corresponding to the selected object to an external device through a communication interface (e.g., the communication interface 470), that is functionally connected to the electronic device, in response to the input.
  • the electronic device may detect an input related to a graphical interface of the second application corresponding to the second graphic element.
  • the electronic device may control the first application to transmit at least part of the graphical interface of the second application or the input (e.g., a handwriting input) (or at least part of the graphical interface of the second application including the handwriting input) to an external device through the communication interface in response to the input.
  • the electronic device may receive an input (e.g., a handwriting input) related to the graphical interface of the second application from an external device through the first application.
  • the electronic device may display the input on the graphical interface of the second application in response to the input.
  • the electronic device may detect an input which selects an image which is related to a current location and is included in the graphical interface corresponding to the second graphic element.
  • the electronic device may control the first application to change at least one setting value of the first application to at least one setting value of the selected image in response to the input.
  • the electronic device may control the first application to output information, which is related to an image-capturing place of the selected image, through an input/output interface (e.g., the input/output interface 450) or the display in response to the input.
  • FIG. 15 is a flowchart illustrating a method for controlling a display by an electronic device according to various embodiments of the present disclosure.
  • the display control method may include operations 1510 to 1550.
  • although the display control method is described as being performed by the electronic device (e.g., the electronic device 101, 201, or 401), the display control method may instead be performed by a processor (e.g., the processor 210 or 420) of the electronic device or by a controller of the electronic device.
  • Operations 1520 to 1550 are similar to operations 1410 to 1440 illustrated in FIG. 14 , respectively, and thus, a repeated description thereof will be omitted.
  • the electronic device may display one of a first graphic element related to a first application and a second graphic element related to control over the first application, in at least a partial area (or a full screen) of a display (e.g., the display 160 or 460) that is functionally connected to the electronic device.
  • the electronic device may configure a first area and a second area in at least a partial area of the display in response to a first input.
  • the electronic device may divide the at least partial area into a first area and a second area.
  • the first input may be at least one of a touch input, an input (or a touch/hovering/swipe input) which at least partially traverses the display, a folding/bending/motion input for folding or bending the electronic device, and a voice input.
  • the electronic device may display the first graphic element in the first area.
  • the electronic device may display the second graphic element in the second area.
  • the electronic device may control the first application in response to a second input related to the second graphic element.
  • the second input may be a touch input which selects an object included in the second graphic element.
  • the electronic device may control the first application to display a graphic element/image/document/page, which corresponds to the second input, in the first area.
  • the electronic device may control the first application to change an attribute of the first graphic element according to the second input.
  • FIGs. 16A to 16D are views illustrating a method for controlling a display by an electronic device according to various embodiments of the present disclosure.
  • the user may divide the screen of the display 1606 so as to distinguish an area shared with another person from an area (i.e., a non-shared area) for controlling the shared area, through an operation of folding the electronic device 1601 (e.g., the electronic device 101, 201, or 401).
  • a processor (e.g., the processor 210 or 420) of the electronic device 1601 may display a second graphic element 1630 (e.g., a file, a document or a page search window within the document, a list of images/icons, and/or the like) related to control of a first application (e.g., a word processing application or a presentation application), in at least a partial area (or a full screen) of the display 1606 of the electronic device 1601.
  • the processor may configure a first area 1612 and a second area 1614 in the at least partial area in response to an input corresponding to or including folding the electronic device 1601.
  • the processor may display, in the first area 1612, a first graphic element 1620 (e.g., a first document or a first page within the first document), which corresponds to a first object 1632 (e.g., a first icon) included in the second graphic element 1630, according to a user selection or an automatic/default configuration which is related to the first application.
  • the processor may reduce and display the second graphic element 1630 in the second area 1614.
  • the processor may display a graphic element 1622 (e.g., a second document or a second page within the second document), which corresponds to a second object 1634 (e.g., a second icon) of the second graphic element 1630, in the first area 1612 in response to a second input selecting the second object 1634.
  • the processor may display, in the second area 1614, a graphical interface 1640 including at least one selectable object for setting at least one attribute of the graphic element 1622 or at least one object 1624 included in the graphic element 1622.
  • the processor may change the color of the object 1624 included in the graphic element 1622.
  • the processor may display the graphical interface 1640 together with the first graphic element 1620 in place of the second graphic element 1630 illustrated in FIG. 16B .
  • the processor may project the graphic element 1622 to the outside of the electronic device 1601 through a projector functionally coupled with or integrated into the electronic device 1601.
  • the processor may detect an input for controlling the first application through the graphical interface 1640 while projecting the graphic element 1622.
  • the processor may control the first application in response to the input.
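The control relationship described above — selections in a control pane changing attributes of the element shown (or projected) from the content pane — can be modeled as a minimal Python sketch; all class and method names here are illustrative, not taken from the patent:

```python
class GraphicElement:
    """A displayable element (e.g., a document page) with mutable attributes."""
    def __init__(self, name):
        self.name = name
        self.attributes = {"color": "black"}

class SplitScreenController:
    """Models a display divided into a content area and a control area."""
    def __init__(self, element):
        self.first_area = element        # content being displayed or projected
        self.second_area = "interface"   # placeholder for the graphical interface

    def apply_control_input(self, attribute, value):
        # An input on the control interface changes the content element,
        # while the control interface itself stays in place.
        self.first_area.attributes[attribute] = value
        return self.first_area.attributes

page = GraphicElement("second page")
ui = SplitScreenController(page)
ui.apply_control_input("color", "red")
print(page.attributes["color"])  # -> red
```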
  • FIG. 17 is a flowchart illustrating a method for controlling a display by an electronic device according to various embodiments of the present disclosure.
  • the display control method may include operations 1710 to 1760.
  • although the display control method is described as being performed by the electronic device (e.g., the electronic device 101, 201, or 401), the display control method may instead be performed by a processor (e.g., the processor 210 or 420) of the electronic device or by a controller of the electronic device.
  • Operations 1710 to 1750 are similar to operations 1510 to 1550 illustrated in FIG. 15 , respectively, and thus, a repeated description thereof will be omitted.
  • the electronic device may display one of a first graphic element related to a first application and a second graphic element related to control over the first application in at least a partial area (or a full screen) of a display (e.g., the display 160 or 460) that is functionally connected to the electronic device.
  • the electronic device may configure a first area and a second area in at least a partial area of the display in response to a first input.
  • the electronic device may divide the at least partial area into a first area and a second area.
  • the first input may be at least one of a touch input, an input (or a touch/hovering/swipe input) which at least partially traverses the display in a first direction, a folding/bending/motion input for folding or bending the electronic device, and a voice input.
  • the electronic device may display the first graphic element in the first area.
  • the electronic device may display the second graphic element in the second area.
  • the electronic device may control the first application in response to a second input related to the second graphic element.
  • the second input may be a touch input which selects an object included in the second graphic element.
  • the electronic device may control the first application to display a graphic element/image/document/page, which corresponds to the object, in the first area while maintaining the second graphic element as it is.
  • the electronic device may control the first application to display a changed part of a document in each of the first graphic element and the second graphic element (or each of the first area and the second area) in response to an input for changing a part of the document.
  • the electronic device may control the first application to expand and display one of the first and second graphic elements in an area obtained by adding the first and second areas together or in at least a partial area (or a full screen) of the display in response to a third input.
  • the third input may be at least one of a touch input having a direction/pattern/trajectory opposite to the first input, an input (or a touch/hovering/swipe input) which at least partially traverses the display in a second direction opposite to the first direction, an unfolding/motion input for unfolding the electronic device, and a voice input.
  • the electronic device may receive a third input related to the second graphic element from an external electronic device through the first application.
  • the electronic device may display the third input on the second graphic element.
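Operations 1710 to 1760 of FIG. 17 can be summarized as a short sequence; the following Python sketch (with hypothetical input names) only illustrates the ordering of the operations:

```python
def display_control_flow(first_input, second_input):
    """Sketch of operations 1710-1760: show one element full-screen,
    split the area on a first input, then route a second input to the
    first application. Input names are illustrative assumptions."""
    log = []
    log.append("display first graphic element in full area")       # 1710-1720
    if first_input in ("touch", "swipe", "fold", "voice"):
        log.append("configure first and second areas")             # 1730
        log.append("display first graphic element in first area")  # 1740
        log.append("display second graphic element in second area")# 1750
    if second_input == "select object":
        log.append("control first application")                    # 1760
    return log

steps = display_control_flow("fold", "select object")
```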
  • FIGs. 18A to 18F are views illustrating a method for controlling a display by an electronic device according to various embodiments of the present disclosure.
  • a processor e.g., the processor 210 or 420 of the electronic device 1801 (e.g., the electronic device 101, 201, or 401) may display a first graphic element 1820 (e.g., a web document/web page), which is related to the first application (e.g., a web browser), in at least a partial area (or a full screen) of a display 1806 of the electronic device 1801.
  • the processor may configure a first area 1812 and a second area 1814 in the at least partial area in response to a touch input 1850 (depicted in FIG. 18A ) which at least partially traverses the display 1806 in a first direction.
  • the first and second areas 1812 and 1814 may be divided by a boundary line 1855, which can be moved upwards or downwards to adjust the screen space allotted to each area.
  • the processor may display a first graphic element 1820 and a second graphic element 1830, both corresponding to an identical first web document/web page, in the first and second areas 1812 and 1814, respectively.
  • the processor may display a graphic element 1822 (e.g., a second web document/web page), corresponding to the first object 1832 in the first area 1812, while simultaneously maintaining the second graphic element 1830 as it is.
  • the processor may display a graphic element 1824 (e.g., a third web document/web page), corresponding to the second object 1834, in the first area 1812 while maintaining the second graphic element 1830 as it is.
  • the processor may expand and display a currently activated graphic element 1824 in an area obtained by recombining the first and second areas 1812 and 1814, which may be at least a partial area (or a full screen) of the display 1806.
  • the first application may correspond to a document editing application, and the first and second graphic elements may both correspond to an identical editable document.
  • the processor may display a first graphic element 1820a and a second graphic element 1830a, which both correspond to an identical editable document, in the first area 1812 and the second area 1814, respectively.
  • the first and second areas 1812 and 1814 may be divided by a boundary line 1855 capable of moving upwards or downwards.
  • the processor may control the first application to display the changed part 1832a of the document on each of the first and second graphic elements 1820a and 1830a (or each of the first and second areas 1812 and 1814), in response to an input (e.g., a change of text attributes, such as an underscore, a color change, and/or the like) for changing a part of the document.
  • the processor may underline a "World Wide Web" part 1832a of the first graphic element 1820a and, simultaneously, may underline the identical part (i.e., "World Wide Web") of the second graphic element 1830a.
  • the processor may selectively scroll one of the first and second graphic elements 1820a and 1830a in response to a user input. For example, in response to a user input for the first graphic element 1820a, the processor may scroll the first graphic element 1820a upwards or downwards, while simultaneously maintaining the current state of the second graphic element 1830a as it is (e.g., without scrolling).
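The dual-view behavior of FIGs. 18E and 18F — one document rendered in two areas, with edits propagating to both views while scrolling is applied per view — resembles a simple observer arrangement. A minimal sketch with illustrative names, not taken from the patent:

```python
class Document:
    """One shared document that notifies all of its views on change."""
    def __init__(self, text):
        self.text = text
        self.views = []

    def underline(self, phrase):
        # An edit propagates to every view of the same document.
        self.text = self.text.replace(phrase, f"<u>{phrase}</u>")
        for v in self.views:
            v.refresh()

class View:
    """One display area rendering the document, with its own scroll state."""
    def __init__(self, doc):
        self.doc = doc
        self.scroll = 0
        self.rendered = doc.text
        doc.views.append(self)

    def refresh(self):
        self.rendered = self.doc.text

    def scroll_by(self, lines):
        # Scrolling is per-view: the other view keeps its position.
        self.scroll += lines

doc = Document("The World Wide Web changed publishing.")
top, bottom = View(doc), View(doc)
doc.underline("World Wide Web")   # the change appears in both views
top.scroll_by(3)                  # only the top view scrolls
```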
  • FIGs. 19A to 19G are views illustrating a method for controlling a display by an electronic device according to various embodiments of the present disclosure.
  • when a user is engaged in a conversation with another person (i.e., in the case of a video call), the electronic device 1901 (e.g., the electronic device 101, 201, or 401) may display a screen on the display 1906 so as to distinguish an area shared with the other person (i.e., transmitted to the other person) from an area displaying an image received from the other person. For example, the electronic device 1901 may display a captured image or video stream of the user in the shared area and, simultaneously, may transmit the captured image or video stream to the other person in real time.
  • a processor (e.g., the processor 210 or 420) of the electronic device 1901 may configure a first area 1912 and a second area 1914 that together occupy at least a partial area (or, as depicted, the full screen) of the display 1906 of the electronic device 1901.
  • the processor may display, in the first area 1912, a first graphic element 1920 (e.g., an image or video stream received from an external electronic device) related to a first application (e.g., a telephone/call application, a voice/video call application, a voice/image/video conference application, and/or the like), and may display a second graphic element 1930 related to the first application in the second area 1914.
  • the processor may display a graphic element 1940 (e.g., a file, a document or a page search window within the document, a list of images/icons, and/or the like) including multiple selectable objects in the second area 1914.
  • the processor may transmit at least one file/image/document corresponding to the at least one object 1942 to an external electronic device (e.g., the external electronic device 404 or the server 406) through a communication interface (e.g., the communication interface 470) that is functionally connected to the electronic device 1901.
  • the selected object 1942 is dragged to the first area 1912 to trigger transmission of information corresponding to the selected object 1942.
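The drag-to-share interaction above can be reduced to a small routine: transmission is triggered only when the object is dropped on the shared (first) area. A hedged sketch with hypothetical names:

```python
def handle_drag(obj, drop_area, shared_area, send):
    """Sketch: dragging an object onto the shared (call) area triggers
    transmission of the corresponding file/image/document; dropping it
    anywhere else does not."""
    if drop_area == shared_area:
        send(obj)
        return True
    return False

sent = []
ok = handle_drag("photo.jpg", drop_area="first", shared_area="first",
                 send=sent.append)
```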
  • the processor may display, in the first area 1912, the first graphic element 1920 (e.g., an image received from an external electronic device) related to the first application (e.g., a telephone/call application, a voice/video call application, a voice/image/video conference application, and/or the like), and may display a graphic element 1950 (e.g., a video and a screen/graphical interface of a video player application) related to a second application in the second area 1914.
  • the processor may control the first application to transmit at least one of at least part of the graphic element 1950, a video thereof, and video-related information (e.g., moving image download/streaming information/address, moving image reproduction information/position/time, and/or the like) thereof to an external electronic device through the communication interface in response to an input (e.g., a reproduction command, a moving image transmission command, and/or the like) related to the graphic element 1950.
  • the processor may display, in the first area 1912, the first graphic element 1920 (e.g., an image received from an external electronic device) related to the first application (e.g., a telephone/call application, a voice/video call application, a voice/image/video conference application, and/or the like), and may display a graphic element 1960 (e.g., a map, and a screen/graphical interface of a map application) related to a second application (e.g., a map application) in the second area 1914.
  • the processor may control the first application to transmit an input 1962 (e.g., a handwriting input) (or at least part of a graphical interface of the second application including the handwriting input, and map download information/address) to an external device through the communication interface in response to the input 1962 (e.g., a handwriting input) related to the second graphic element 1960.
  • an external electronic device 1971 may display, on a display 1976, a second graphic element 1980 (e.g., a map, a screen/graphical interface of a map application, and/or the like), which is acquired based on information received from the electronic device 1901, and the input 1962 (e.g., a handwriting input).
  • the external electronic device 1971 may transmit an input 1982 (e.g., a handwriting input) (or at least part of a graphical interface of a second application including the handwriting input) to the electronic device 1901 in response to the input 1982 (e.g., a handwriting input) related to the second graphic element 1960.
  • the processor may receive an input 1982 (or input data) related to the second graphic element 1960 from the external electronic device 1971 through the first application.
  • the processor may control the second application to display the input 1982 on the second graphic element 1960 or to display the input 1982, in response to the input 1982.
  • FIGs. 20A to 20D are views illustrating a folding gesture according to various embodiments of the present disclosure.
  • an electronic device 2001 may be folded by a manual operation of a user such that a first part 2012 and a second part 2014 (e.g., an upper part and a lower part) contact each other or come as close as possible to one another in a lengthwise direction of the electronic device 2001.
  • an end part in the lengthwise direction of the electronic device 2001 may or may not be exposed to the outside depending on the axis of folding.
  • a part of the display 2006 may be exposed to the outside as a sub-display for implementing additional functionality.
  • the electronic device 2001 may be returned to the unfolded state by a manual operation of the user.
  • Examples of a folding gesture may include a gesture which bends the electronic device 2001 at a preset angle or more as illustrated in FIG. 20B , a gesture which completely folds the electronic device 2001 such that the first and second parts 2012 and 2014 contact each other or come as close as possible to each other as illustrated in FIG. 20C , and a gesture which again unfolds the electronic device 2001 in the opposite direction/to the original state (or at a predetermined angle or less) after bending the electronic device 2001 at a preset angle or more.
  • the electronic device 2001 may detect an angle formed between an axis (e.g., x-axis) in the lengthwise direction of the electronic device 2001 and the first part 2012 (or the second part 2014).
  • a bending angle of the electronic device 2001 illustrated in FIG. 20A may be detected to be 0 degrees, whereas that of the electronic device 2001 illustrated in FIG. 20C may be detected to be 180 degrees.
  • examples of an unfolding gesture may include a gesture which unfolds the bent electronic device 2001 at a preset angle or less as illustrated in FIG. 20B or FIG. 20D .
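The folding, bending, and unfolding gestures of FIGs. 20A to 20D can be distinguished from the detected bending angle alone. The sketch below assumes a threshold of 30 degrees purely for illustration; the patent specifies only "a preset angle":

```python
def classify_fold(previous_angle, current_angle, threshold=30):
    """Sketch of gesture detection from a bending angle in degrees:
    0 = flat (FIG. 20A), 180 = fully folded (FIG. 20C). The threshold
    value is an assumption, not taken from the patent."""
    if current_angle >= 180:
        return "fold"     # first and second parts brought into contact
    if current_angle >= threshold > previous_angle:
        return "bend"     # bent past the preset angle
    if previous_angle >= threshold > current_angle:
        return "unfold"   # returned below the preset angle
    return "none"

print(classify_fold(0, 180))  # fold
```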
  • the first and second areas 1812 and 1814 have been described as being configured in response to the touch input 1850 which at least partially traverses the display 1806 in the first direction, but the first and second areas 1812 and 1814 may be configured by a folding/unfolding gesture.
  • the configuration of areas may be cancelled by a folding/unfolding gesture instead of the touch input 1852.
  • the first and second areas 1012 and 1014 have been described as being configured in response to the touch input 1050 which at least partially traverses the display 1006 in the first direction, but the first and second areas 1012 and 1014 may be configured by a folding/unfolding gesture.
  • FIGs. 21A and 21B are views illustrating a method for controlling a display by an electronic device according to various embodiments of the present disclosure.
  • a processor e.g., the processor 210 or 420 of the electronic device 2101 (e.g., the electronic device 101, 201, or 401) may configure a first area 2112 and a second area 2114 for display in at least a partial area (or a full screen) of a display 2106 of the electronic device 2101.
  • the processor may display a first graphic element 2120 (e.g., a screen/graphical interface of a message/SNS application) related to a first application (e.g., a message/SNS application) in a first area 2112.
  • the processor may display a second graphic element 2130 (e.g., a web document/web page) related to a second application (e.g., a web browser) in a second area 2114.
  • the processor may transmit an image/file/document generated by capturing an image (e.g., a screenshot) of the second area 2114 or the second graphic element 2130 to an external electronic device through a communication interface (e.g., the communication interface 470) that is functionally connected to the electronic device 2101.
  • the processor may display an object 2134 (or a reduced image), which corresponds to the transmitted image/file/document, on the first graphic element 2120.
  • the processor may display an icon (or a reduced image), which corresponds to the transmitted image, in a chat window of a message/SNS application.
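The capture-and-share flow of FIGs. 21A and 21B — capture the second area as an image, transmit it, then show a reduced-image object in the chat window — can be sketched as follows (all names hypothetical):

```python
def share_screenshot(capture, transmit, chat_log):
    """Sketch: capture the second area as an image, transmit it to an
    external device, and append a thumbnail object to the message thread."""
    image = capture()
    transmit(image)
    chat_log.append({"type": "thumbnail", "of": image})
    return chat_log

sent = []
log = share_screenshot(lambda: "web_page.png", sent.append, [])
```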
  • FIGs. 22A to 22H are views illustrating a method for controlling a display by an electronic device according to various embodiments of the present disclosure.
  • a processor e.g., the processor 210 or 420 of the electronic device 2201 (e.g., the electronic device 101, 201, or 401) may display a first graphic element 2220 (e.g., a web document/web page) related to a first application (e.g., a web browser) in at least a partial area (or a full screen) of a display 2206 of the electronic device 2201.
  • the processor may configure a first area 2212 and a second area 2214 for display in the at least partial area in response to an input (e.g., a folding/unfolding gesture) indicating or requesting screen division.
  • the processor may display the first graphic element 2220 and a second graphic element 2230, which both correspond to an identical first web document/web page, in the first and second areas 2212 and 2214, respectively.
  • the processor may detect an input selecting a first object 2232 (e.g., a first hyperlink) of the second graphic element 2230.
  • the processor may display a third graphic element 2234 (e.g., a second web document/web page) corresponding to the first object 2232, in the second area 2214 while maintaining the first graphic element 2220 as it is.
  • the processor may detect an input 2250 (e.g., a swipe input on the first area 2212, or a scroll input) for displaying another part of the first graphic element 2220 instead of a part 2222 of the first graphic element 2220 which is displayed in the first area 2212.
  • the processor may display another part 2224 of the first graphic element 2220 in the first area 2212 while maintaining the third graphic element 2234 as it is.
  • the term “module” may refer to a unit including hardware, software, or firmware, and may be used interchangeably with terms such as logic, logical block, component, or circuit.
  • the “module” may be an integrated component, or a minimum unit for performing one or more functions or a part thereof.
  • the “module” may be mechanically or electronically implemented, and may include, for example, an Application-Specific Integrated Circuit (ASIC) chip, a Field-Programmable Gate Array (FPGA), or a programmable-logic device which performs certain operations and has been known or is to be developed in the future.
  • At least part of the device (e.g., modules or functions thereof) or the method (e.g., operations) according to various embodiments of the present disclosure may be implemented by an instruction, which is stored in a computer-readable storage medium (e.g., the memory 430), in the form of a program module.
  • the processor may perform a function corresponding to the instruction.
  • the computer-readable recording medium may include magnetic media, such as a hard disk, a floppy disk, and a magnetic tape; optical media, such as a Compact Disc Read Only Memory (CD-ROM) and a Digital Versatile Disc (DVD); magneto-optical media, such as a floptical disk; an internal memory; and/or the like.
  • the instructions may include code produced by a compiler or code that can be executed by an interpreter.
  • the module or program module may include at least one of the aforementioned elements, may further include other elements, or some of the aforementioned elements may be omitted.
  • Operations executed by the module, program module, or other elements according to various embodiments of the present disclosure may be executed sequentially, in parallel, repeatedly, or in a heuristic manner. Alternatively, at least some operations may be executed in a different order or may be omitted, or other operations may be added.
  • the instructions are configured to cause at least one processor to perform at least one operation when executed by the at least one processor, and the at least one operation includes configuring a first area and a second area on the display; displaying a first graphic element related to a first application in the first area of the display; displaying a second graphic element related to control over the first application in the second area of the display; and controlling the first application in response to an input related to the second graphic element.
  • Example embodiments of the present disclosure are provided to describe technical contents of the present disclosure and to help the understanding of the present disclosure, and are not intended to limit the present disclosure. Therefore, it should be construed that all modifications and changes or various other embodiments which are based on the technical idea of the present disclosure fall within the present disclosure.
  • the control unit or processor may include a microprocessor or any suitable type of processing circuitry, such as one or more general-purpose processors (e.g., ARM-based processors), a Digital Signal Processor (DSP), a Programmable Logic Device (PLD), an Application-Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA), a Graphical Processing Unit (GPU), a video card controller, and/or the like.
  • any of the functions and steps provided in the Figures may be implemented in hardware, software or a combination of both and may be performed in whole or in part within the programmed instructions of a computer.
  • a "processor” or “microprocessor” may be hardware in the claimed disclosure.
  • a “unit” or “module” referred to herein is to be understood as comprising hardware, such as a processor or microprocessor configured for a certain desired functionality, or a non-transitory medium comprising machine-executable code, and does not constitute software per se.
  • a “processor” or “microprocessor” constitutes hardware in the claimed invention.
  • An activity performed automatically is performed in response to an executable instruction or device operation, without the user directly initiating the activity.

Claims (10)

  1. Elektronische Vorrichtung (1101), umfassend:
    eine flexible Anzeige (1106);
    und einen Prozessor (210, 420), der konfiguriert ist, um:
    in Reaktion auf die Detektion einer Nutzereingabe die flexible Anzeige zum Teilen eines Anzeigebereichs in einen ersten Bereich (1111) und einen zweiten Bereich (1114) zu steuern;
    die flexible Anzeige zum Anzeigen eines ersten graphischen Elements (1120), das sich auf eine Kameraanwendung im ersten Bereich bezieht, und eines zweiten graphischen Elements (1130), das sich auf die Steuerung der Kameraanwendung im zweiten Bereich bezieht, zu steuern, wobei die Kameraanwendung Einstellungswerte aufweist, die konfigurierbar sind, um von der Kameraanwendung erfasste Bilder zu ändern, und das zweite graphische Element (1130) ferner eine graphische Nutzerschnittstelle enthält, die ein Bild (1132) enthält, das sich auf den aktuellen Ort der elektronischen Vorrichtung bezieht;
    und die Kameranwendung in Reaktion auf das Detektieren einer Eingabe in das zweite graphische Element zu steuern,
    wobei:
    die detektierte Eingabe in das zweite graphische Element die Auswahl des Bilds (1132) enthält, das sich auf den aktuellen Ort bezieht, und wobei, infolge der Auswahl, der Prozessor (210, 420) ferner konfiguriert ist, um mindestens einen Einstellungswert der Kameraanwendung zu ändern, so dass er mit mindestens einem Einstellungswert übereinstimmt, der vom ausgewählten Bild (1132) angezeigt wird.
  2. Elektronische Vorrichtung nach Anspruch 1, wobei:
    das zweite graphische Element eine graphische Schnittstelle, enthaltend eine Anzeige mehrerer Objekte (1132), umfasst, die detektierte Eingabe eine Auswahl eines der mehreren Objekte (1132) umfasst und wobei der Prozessor (210, 420) ferner konfiguriert ist, um für das ausgewählte der mehreren Objekte in Reaktion auf die detektierte Eingabe, ein graphisches Element (1134) auszuwählen, das einen bilderfassenden Einstellungswert zur Anzeige im zweiten Bereich (1114) umfasst.
  3. Elektronische Vorrichtung nach Anspruch 1, wobei:
    der Prozessor (210, 420) ferner konfiguriert ist, um auf der Grundlage eines von der Kameraanwendung erfassten und im ersten graphischen Element (1120) enthaltenen Bilds eine Suche nach Bildern zur Anzeige im zweiten Bereich (1114) anzustellen.
  4. Elektronische Vorrichtung nach Anspruch 3, wobei:
    der Prozessor (210, 420) ferner konfiguriert ist, um zu empfehlende Bilder oder Prioritäten der Bilder auf der Grundlage mindestens eines einer Bilderfassung nahe des aktuellen Orts, der Anzahl von Referenzen oder Empfehlungen, der Genauigkeit der Übereinstimmung mit dem bildzuerfassenden Subjekt (1122) und der Existenz eines bilderfassenden Einstellungswerts zu bestimmen.
  5. Elektronische Vorrichtung nach Anspruch 1, wobei der Prozessor (210, 420) ferner konfiguriert ist, um Informationen entsprechend einem Ort, an dem das ausgewählte Bild in Reaktion auf die detektierte Eingabe erfasst wurde, über die Kameraanwendung auszugeben.
  6. Verfahren zum Steuern einer Kameraanwendung einer elektronischen Vorrichtung (1101), umfassend:
    Detektieren einer Nutzereingabe, in Reaktion auf das Detektieren der Nutzereingabe Steuern, durch einen Prozessor (210, 420) einer flexiblen Anzeige (1106), um einen Anzeigebereich in einen ersten Bereich (1111) und einen zweiten Bereich (1114) zu teilen;
    controlling the flexible display (1106) to display a first graphic element (1120) relating to the camera application in the first area and a second graphic element (1130) relating to controlling the camera application in the second area, wherein the camera application has setting values configurable to modify images captured by the camera application, and the second graphic element (1130) further includes a graphical user interface containing an image (1132) relating to the current location of the electronic device;
    and controlling the camera application in response to detecting an input to the second graphic element (1130);
    wherein the detected input to the second graphic element (1130) includes the selection of the image (1132) relating to the current location, and wherein the method comprises changing at least one setting value of the camera application so that it matches at least one setting value indicated by the selected image (1132).
  7. The method of claim 6, wherein the second graphic element comprises a graphical interface including a display of a plurality of objects (1132), the detected input comprises a selection of one of the plurality of objects (1132), and the method further comprises, in response to the detected input, displaying in the second area (1114), for the selected one of the plurality of objects, a graphic element (1134) comprising an image-capturing setting value.
  8. The method of claim 6, further comprising using the processor (210, 420) to perform a search, based on an image captured by the camera and contained in the first graphic element (1120), for images to be displayed in the second area (1114).
  9. The method of claim 8, further comprising using the processor (210, 420) to determine images to be recommended, or priorities of the images, based on at least one of: image capture near the current location, the number of references or recommendations, the accuracy of the match with a subject (1122) whose image is to be captured, and the existence of an image-capturing setting value.
  10. The method of claim 6, further comprising outputting information about the camera application corresponding to a location at which the selected image (1132) was captured, in response to the detected input.
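The settings-matching behavior of claims 6 and 7 can be sketched as follows. This is a minimal illustration, not the patent's implementation: all names (`CameraApp`, `RecommendedImage`, `on_image_selected`) and the specific setting keys are hypothetical, assumed only for the sketch.

```python
# Sketch of claims 6-7: selecting a recommended image shown in the second
# display area applies the setting values stored with that image to the
# camera application. Names and setting keys are illustrative only.
from dataclasses import dataclass, field


@dataclass
class RecommendedImage:
    """An image displayed in the second area, tagged with capture settings."""
    location: str
    settings: dict  # e.g. {"iso": 100, "white_balance": "daylight"}


@dataclass
class CameraApp:
    """Camera application whose setting values are configurable (claim 6)."""
    settings: dict = field(default_factory=dict)

    def on_image_selected(self, image: RecommendedImage) -> None:
        # Change at least one setting value so that it matches the
        # values indicated by the selected image.
        self.settings.update(image.settings)


camera = CameraApp(settings={"iso": 800, "white_balance": "auto"})
selected = RecommendedImage(location="current",
                            settings={"iso": 100, "white_balance": "daylight"})
camera.on_image_selected(selected)
print(camera.settings)  # camera settings now match the selected image
```

The claim itself only requires that at least one setting value be changed to match the selected image; a dictionary merge is one simple way to realize that.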
EP17184551.4A 2016-08-03 2017-08-02 Verfahren zur steuerung einer anzeige und elektronische vorrichtung Active EP3279763B1 (de)

Priority Applications (2)

Application Number Priority Date Filing Date Title
EP22166971.6A EP4043997A1 (de) 2016-08-03 2017-08-02 Verfahren zum steuern einer anzeige, speichermedium und elektronische vorrichtung
EP19158122.2A EP3521971B1 (de) 2016-08-03 2017-08-02 Verfahren zur steuerung einer anzeige, speichermedium und elektronische vorrichtung

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020160099173A KR102649254B1 (ko) 2016-08-03 2016-08-03 디스플레이 제어 방법, 저장 매체 및 전자 장치

Related Child Applications (2)

Application Number Title Priority Date Filing Date
EP19158122.2A Division EP3521971B1 (de) 2016-08-03 2017-08-02 Verfahren zur steuerung einer anzeige, speichermedium und elektronische vorrichtung
EP22166971.6A Division EP4043997A1 (de) 2016-08-03 2017-08-02 Verfahren zum steuern einer anzeige, speichermedium und elektronische vorrichtung

Publications (2)

Publication Number Publication Date
EP3279763A1 EP3279763A1 (de) 2018-02-07
EP3279763B1 true EP3279763B1 (de) 2019-02-20

Family

ID=59564097

Family Applications (3)

Application Number Title Priority Date Filing Date
EP22166971.6A Pending EP4043997A1 (de) 2016-08-03 2017-08-02 Verfahren zum steuern einer anzeige, speichermedium und elektronische vorrichtung
EP17184551.4A Active EP3279763B1 (de) 2016-08-03 2017-08-02 Verfahren zur steuerung einer anzeige und elektronische vorrichtung
EP19158122.2A Active EP3521971B1 (de) 2016-08-03 2017-08-02 Verfahren zur steuerung einer anzeige, speichermedium und elektronische vorrichtung

Family Applications Before (1)

Application Number Title Priority Date Filing Date
EP22166971.6A Pending EP4043997A1 (de) 2016-08-03 2017-08-02 Verfahren zum steuern einer anzeige, speichermedium und elektronische vorrichtung

Family Applications After (1)

Application Number Title Priority Date Filing Date
EP19158122.2A Active EP3521971B1 (de) 2016-08-03 2017-08-02 Verfahren zur steuerung einer anzeige, speichermedium und elektronische vorrichtung

Country Status (6)

Country Link
US (1) US11048379B2 (de)
EP (3) EP4043997A1 (de)
KR (1) KR102649254B1 (de)
CN (2) CN117762200A (de)
MY (1) MY193324A (de)
WO (1) WO2018026206A1 (de)

Families Citing this family (62)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
USD353539S (en) * 1990-07-20 1994-12-20 Norden Pac Development Ab Combined tube and cap
USD804443S1 (en) * 2015-10-02 2017-12-05 Samsung Electronics Co., Ltd. Mobile terminal
USD828321S1 (en) 2015-11-04 2018-09-11 Lenovo (Beijing) Co., Ltd. Flexible smart mobile phone
JP1589999S (de) * 2016-12-28 2017-11-06
JP1590263S (de) 2016-12-28 2017-11-06
CN107704167B (zh) * 2017-09-15 2019-08-13 珠海格力电器股份有限公司 一种数据分享方法、装置及电子设备
USD885381S1 (en) * 2017-10-11 2020-05-26 Samsung Display Co., Ltd. Display device
USD893475S1 (en) * 2017-10-11 2020-08-18 Samsung Display Co., Ltd. Display device
KR102255469B1 (ko) * 2018-03-06 2021-05-24 삼성전자주식회사 플렉서블한 디스플레이를 포함하는 전자 장치 및 그 동작 방법
CN108459805B (zh) * 2018-03-30 2021-11-16 努比亚技术有限公司 屏幕截图方法、移动终端及计算机可读存储介质
CN108646960B (zh) * 2018-04-25 2020-09-15 维沃移动通信有限公司 一种文件处理方法及柔性屏终端
CN108628515B (zh) * 2018-05-08 2020-06-16 维沃移动通信有限公司 一种多媒体内容的操作方法和移动终端
KR102485127B1 (ko) 2018-07-06 2023-01-05 삼성전자주식회사 카메라의 방향에 따라 프리뷰 이미지의 위치를 변경하기 위한 전자 장치 및 방법
CN110719347A (zh) * 2018-07-13 2020-01-21 Oppo广东移动通信有限公司 电子设备及电子设备的控制方法
KR102577051B1 (ko) * 2018-07-17 2023-09-11 삼성전자주식회사 분할 화면을 제공하기 위한 전자 장치 및 방법
USD902900S1 (en) * 2018-08-01 2020-11-24 Samsung Electronics Co., Ltd. Mobile phone
CN108962036B (zh) * 2018-09-07 2021-10-19 武汉天马微电子有限公司 一种可折叠显示装置
CN111081143B (zh) * 2018-10-22 2022-08-05 北京小米移动软件有限公司 显示控制方法、装置、电子设备和计算机可读存储介质
CN116088782A (zh) * 2018-10-29 2023-05-09 中兴通讯股份有限公司 一种多屏显示控制方法、装置、设备及可读存储介质
CN111124327A (zh) * 2018-10-31 2020-05-08 中兴通讯股份有限公司 屏幕控制方法、多屏终端及计算机可读存储介质
CN109542328B (zh) * 2018-11-30 2021-04-06 北京小米移动软件有限公司 用户界面显示方法、装置、终端及存储介质
USD920373S1 (en) * 2018-12-21 2021-05-25 Mitsubishi Electric Corporation Display screen with graphical user interface
USD926738S1 (en) * 2019-02-22 2021-08-03 Samsung Electronics Co., Ltd. Mobile phone
CN110058649A (zh) * 2019-03-28 2019-07-26 维沃移动通信有限公司 一种界面显示方法及终端设备
CN111752509A (zh) 2019-03-29 2020-10-09 北京小米移动软件有限公司 显示屏幕的显示控制方法、装置及存储介质
US11308618B2 (en) 2019-04-14 2022-04-19 Holovisions LLC Healthy-Selfie(TM): a portable phone-moving device for telemedicine imaging using a mobile phone
CN109951733B (zh) * 2019-04-18 2021-10-22 北京小米移动软件有限公司 视频播放方法、装置、设备及可读存储介质
CN110138933A (zh) * 2019-04-22 2019-08-16 珠海格力电器股份有限公司 基于折叠屏的拍照面板布局控制方法、系统及智能终端
CN110996034A (zh) * 2019-04-25 2020-04-10 华为技术有限公司 一种应用控制方法及电子装置
CN110401766B (zh) 2019-05-22 2021-12-21 华为技术有限公司 一种拍摄方法及终端
US11379104B2 (en) * 2019-06-07 2022-07-05 Microsoft Technology Licensing, Llc Sharing user interface customization across applications
CN110312073B (zh) * 2019-06-25 2021-03-16 维沃移动通信有限公司 一种拍摄参数的调节方法及移动终端
CN110381282B (zh) * 2019-07-30 2021-06-29 华为技术有限公司 一种应用于电子设备的视频通话的显示方法及相关装置
CN112416190B (zh) * 2019-08-23 2022-05-06 珠海金山办公软件有限公司 一种显示文档的方法及装置
CN110673697B (zh) * 2019-09-23 2021-08-27 Oppo广东移动通信有限公司 电子设备的控制方法、装置、电子设备及存储介质
KR20210035447A (ko) * 2019-09-24 2021-04-01 삼성전자주식회사 폴더블 전자 장치 및 이를 이용한 멀티 윈도우 운용 방법
USD953283S1 (en) * 2019-09-29 2022-05-31 Beijing Xiaomi Mobile Software Co., Ltd. Mobile phone
US11138912B2 (en) 2019-10-01 2021-10-05 Microsoft Technology Licensing, Llc Dynamic screen modes on a bendable computing device
USD973679S1 (en) * 2019-10-28 2022-12-27 Samsung Electronics Co., Ltd. Display screen or portion thereof with transitional graphical user interface
CN113010076A (zh) 2019-10-30 2021-06-22 华为技术有限公司 一种显示要素的显示方法和电子设备
USD959400S1 (en) * 2019-11-07 2022-08-02 Beijing Xiaomi Mobile Software Co., Ltd. Mobile phone
CN111078091A (zh) * 2019-11-29 2020-04-28 华为技术有限公司 分屏显示的处理方法、装置及电子设备
CN111262994B (zh) * 2020-01-09 2021-07-16 三星电子(中国)研发中心 一种折叠屏智能设备的任务选择方法和装置
USD963604S1 (en) * 2020-04-08 2022-09-13 Beijing Xiaomi Mobile Software Co., Ltd. Mobile phone
USD964306S1 (en) * 2020-04-09 2022-09-20 Beijing Xiaomi Mobile Software Co., Ltd. Mobile phone
KR20210144157A (ko) * 2020-05-21 2021-11-30 주식회사 라인어스 전자 가격 표시기 및 그 제어 방법
KR20210151605A (ko) * 2020-06-05 2021-12-14 삼성전자주식회사 이미지 캡쳐 방법 및 이를 위한 전자 장치
KR20220016362A (ko) * 2020-07-30 2022-02-09 삼성디스플레이 주식회사 표시 장치
KR20220017284A (ko) * 2020-08-04 2022-02-11 삼성전자주식회사 전자 장치 및 그의 화면을 제어하는 방법
CN114363462B (zh) * 2020-09-30 2023-01-06 华为技术有限公司 一种界面显示方法、电子设备及计算机可读介质
KR20220077516A (ko) * 2020-12-02 2022-06-09 삼성전자주식회사 멀티 이미지 표시 방법 및 전자 장치
USD976278S1 (en) * 2021-01-08 2023-01-24 Samsung Electronics Co., Ltd. Display screen or portion thereof with transitional graphical user interface
USD987661S1 (en) * 2021-01-13 2023-05-30 Samsung Electronics Co., Ltd. Foldable electronic device with transitional graphical user interface
USD987658S1 (en) * 2021-01-13 2023-05-30 Samsung Electronics Co., Ltd. Electronic device with transitional graphical user interface
USD999184S1 (en) * 2021-03-15 2023-09-19 Samsung Display Co., Ltd. Display device
USD1002568S1 (en) * 2021-03-30 2023-10-24 Samsung Display Co., Ltd. Display device
US11907023B2 (en) * 2021-04-23 2024-02-20 Ricoh Company, Ltd. Information processing system, information processing apparatus, terminal device, and display method
CN117561708A (zh) * 2021-05-17 2024-02-13 Oppo广东移动通信有限公司 电子设备、可折叠滑块设备和信息显示方法
CN114578898A (zh) * 2022-02-18 2022-06-03 维沃移动通信有限公司 显示方法、电子设备及可读存储介质
CN114780001B (zh) * 2022-04-19 2023-04-25 青岛海信智慧生活科技股份有限公司 一种多路开关设备的控制方法、终端设备及服务器
USD986914S1 (en) * 2022-08-22 2023-05-23 Hangzhou Ruisheng Software Co., Ltd. Display screen with graphical user interface
CN116700572B (zh) * 2022-11-22 2024-02-27 荣耀终端有限公司 设备互联交互方法、电子设备和存储介质

Family Cites Families (40)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5745711A (en) * 1991-10-23 1998-04-28 Hitachi, Ltd. Display control method and apparatus for an electronic conference
JP4367057B2 (ja) 2003-09-01 2009-11-18 ソニー株式会社 制作端末装置,コンピュータプログラム,および関連付け方法
US7953462B2 (en) 2008-08-04 2011-05-31 Vartanian Harry Apparatus and method for providing an adaptively responsive flexible display device
KR101524594B1 (ko) 2009-02-05 2015-06-03 삼성전자주식회사 프로젝터기능을 가지는 휴대단말에서 화면데이터 이원화 제어 방법 및 시스템
EP3734407A1 (de) * 2011-02-10 2020-11-04 Samsung Electronics Co., Ltd. Tragbare vorrichtung mit einer berührungsbildschirmanzeige und verfahren zur steuerung davon
US8787016B2 (en) 2011-07-06 2014-07-22 Apple Inc. Flexible display devices
US8929085B2 (en) 2011-09-30 2015-01-06 Apple Inc. Flexible electronic devices
KR101818114B1 (ko) * 2011-10-10 2018-01-12 엘지전자 주식회사 이동 단말기 및 그것의 사용자 인터페이스 제공 방법
KR102148717B1 (ko) * 2011-12-05 2020-08-28 삼성전자주식회사 휴대용 단말기의 디스플레이 제어 방법 및 장치
KR101302292B1 (ko) 2012-01-05 2013-09-03 (주)이스트소프트 렌더링엔진 자동변환을 위한 웹브라우저를 기록한 컴퓨터 판독가능한 기록매체 및 렌더링엔진 자동변환방법
KR20130080937A (ko) * 2012-01-06 2013-07-16 삼성전자주식회사 플랙서블 디스플레이를 구비하는 단말장치의 화면 표시장치 및 방법
KR101943357B1 (ko) * 2012-06-01 2019-01-29 엘지전자 주식회사 이동 단말기 및 그것의 제어방법
KR102104588B1 (ko) * 2012-07-11 2020-04-24 삼성전자주식회사 플렉서블 디스플레이 장치 및 그 동작 방법
KR101916416B1 (ko) 2012-07-30 2018-11-08 삼성전자주식회사 플렉서블 디스플레이 장치 및 그 디스플레이 방법
KR102043810B1 (ko) 2012-08-20 2019-11-12 삼성전자주식회사 플렉서블 디스플레이 장치 및 그 제어 방법
KR102004409B1 (ko) * 2012-08-23 2019-07-29 삼성전자주식회사 플렉서블 디스플레이 장치 및 그 제어 방법
KR101989016B1 (ko) * 2012-08-29 2019-06-13 삼성전자주식회사 전자장치에서 영상통화중 파일 전송 방법 및 장치
KR102049855B1 (ko) 2013-01-31 2019-11-28 엘지전자 주식회사 이동 단말기 및 이의 제어 방법
KR102089951B1 (ko) * 2013-03-14 2020-04-14 엘지전자 주식회사 이동 단말기 및 그것의 제어방법
KR20150007910A (ko) 2013-07-11 2015-01-21 삼성전자주식회사 사용자 인터렉션을 제공하는 사용자 단말 장치 및 그 방법
JP5815613B2 (ja) 2013-08-01 2015-11-17 株式会社オプティム ユーザ端末、画面共有方法、ユーザ端末用プログラム
KR102256677B1 (ko) 2013-12-02 2021-05-28 삼성디스플레이 주식회사 플렉서블 표시장치 및 이의 영상 표시방법
KR20150064621A (ko) 2013-12-03 2015-06-11 현대자동차주식회사 트렁크 개폐장치
US11054929B2 (en) * 2013-12-19 2021-07-06 Korea Electronics Technology Institute Electronic device and a control method thereof
KR20150072503A (ko) 2013-12-19 2015-06-30 전자부품연구원 전자기기 및 전자기기의 제어방법
KR101588294B1 (ko) * 2013-12-30 2016-01-28 삼성전자주식회사 사용자 인터렉션을 제공하는 사용자 단말 장치 및 그 방법
CN104765446A (zh) * 2014-01-07 2015-07-08 三星电子株式会社 电子设备和控制电子设备的方法
KR102331956B1 (ko) 2014-02-10 2021-11-29 삼성전자주식회사 사용자 단말 장치 및 이의 디스플레이 방법
KR20150135038A (ko) 2014-05-23 2015-12-02 삼성전자주식회사 폴더블 전자 장치 및 그 제어 방법
EP2965184A4 (de) * 2014-05-23 2016-08-31 Samsung Electronics Co Ltd Zusammenklappbare vorrichtung und verfahren zur steuerung davon
KR102276108B1 (ko) * 2014-05-26 2021-07-12 삼성전자 주식회사 폴더형 표시부를 가지는 전자 장치 및 이의 운영 방법
KR102210632B1 (ko) 2014-06-09 2021-02-02 엘지전자 주식회사 벤딩 오퍼레이션을 실행하는 디스플레이 디바이스 및 그 제어 방법
KR20160038510A (ko) * 2014-09-30 2016-04-07 엘지전자 주식회사 이동단말기 및 그 제어방법
KR102390647B1 (ko) * 2014-11-25 2022-04-26 삼성전자주식회사 전자장치 및 전자장치의 객체 제어 방법
KR102353498B1 (ko) * 2014-12-16 2022-01-20 삼성전자주식회사 기능 제공 방법 및 그 전자 장치
KR102358749B1 (ko) * 2014-12-26 2022-02-07 엘지전자 주식회사 디지털 디바이스 및 그 제어 방법
KR102137543B1 (ko) * 2015-01-07 2020-08-13 삼성전자주식회사 벤딩 가능한 사용자 단말 장치 및 이의 디스플레이 방법
EP3043345B1 (de) * 2015-01-07 2020-09-09 Samsung Electronics Co., Ltd. Anzeigevorrichtung und betriebsverfahren dafür
KR102296323B1 (ko) * 2015-01-14 2021-09-01 삼성전자주식회사 전자 장치 및 전자 장치에서의 정보 처리 방법
KR20160088764A (ko) 2015-01-16 2016-07-26 삼성전자주식회사 플렉서블 디바이스 및 그 동작 방법

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
None *

Also Published As

Publication number Publication date
CN107688370A (zh) 2018-02-13
US20180039387A1 (en) 2018-02-08
EP3279763A1 (de) 2018-02-07
CN107688370B (zh) 2023-11-28
EP3521971A1 (de) 2019-08-07
KR20180015532A (ko) 2018-02-13
CN117762200A (zh) 2024-03-26
US11048379B2 (en) 2021-06-29
MY193324A (en) 2022-10-04
EP3521971B1 (de) 2022-04-27
EP4043997A1 (de) 2022-08-17
WO2018026206A1 (en) 2018-02-08
KR102649254B1 (ko) 2024-03-20

Similar Documents

Publication Publication Date Title
EP3279763B1 (de) Verfahren zur steuerung einer anzeige und elektronische vorrichtung
US10534534B2 (en) Method for controlling display, storage medium, and electronic device
EP3483715B1 (de) Elektronische vorrichtung und verfahren zur anzeigesteuerung
KR102497195B1 (ko) 컨텐츠를 처리하는 방법 및 이를 위한 전자 장치 및 저장 매체
KR102391772B1 (ko) 터치 감응 디스플레이를 포함하는 전자 장치 및 이 전자 장치를 동작하는 방법
US20180242446A1 (en) Foldable electronic device and control method thereof
US11210050B2 (en) Display control method, storage medium and electronic device
EP3441844B1 (de) Flexible vorrichtung und betriebsverfahren dafür
US9912880B2 (en) Method and apparatus for adjusting color
US9952661B2 (en) Method for providing screen magnification and electronic device thereof
EP3141982B1 (de) Elektronische vorrichtung zur messung von eingangsdruck und verfahren zum betreiben der elektronischen vorrichtung
EP3082028A2 (de) Vorrichtung und verfahren zur bereitstellung von informationen über einen teil einer anzeige
US10691335B2 (en) Electronic device and method for processing input on view layers
KR102274944B1 (ko) 오브젝트를 식별하는 전자 장치 및 방법
KR102520398B1 (ko) 사용자 데이터를 저장하는 전자 장치 및 그 방법
US10845940B2 (en) Electronic device and display method of electronic device
KR102351317B1 (ko) 전자문서 표시 방법 및 전자 장치
KR102332674B1 (ko) 콘텐츠 변경 알림 방법 및 장치
EP4180930B1 (de) Verfahren zur auswahl von inhalt und elektronische vorrichtung dafür
KR102503942B1 (ko) 디스플레이의 일부분을 통하여 정보를 제공하는 방법 및 장치
KR20180020473A (ko) 전자 장치 및 전자 장치 제어 방법

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION HAS BEEN PUBLISHED

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20180321

RBV Designated contracting states (corrected)

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

RIC1 Information provided on ipc code assigned before grant

Ipc: H04W 4/02 20180101ALN20180713BHEP

Ipc: G06F 1/16 20060101AFI20180713BHEP

Ipc: G06F 3/0488 20130101ALI20180713BHEP

Ipc: G06F 3/0484 20130101ALI20180713BHEP

Ipc: G06F 3/0482 20130101ALI20180713BHEP

GRAP Despatch of communication of intention to grant a patent

Free format text: ORIGINAL CODE: EPIDOSNIGR1

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: GRANT OF PATENT IS INTENDED

RIC1 Information provided on ipc code assigned before grant

Ipc: H04W 4/02 20180101ALN20180820BHEP

Ipc: G06F 3/0484 20130101ALI20180820BHEP

Ipc: G06F 3/0482 20130101ALI20180820BHEP

Ipc: G06F 3/0488 20130101ALI20180820BHEP

Ipc: G06F 1/16 20060101AFI20180820BHEP

INTG Intention to grant announced

Effective date: 20180912

GRAS Grant fee paid

Free format text: ORIGINAL CODE: EPIDOSNIGR3

GRAA (expected) grant

Free format text: ORIGINAL CODE: 0009210

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE PATENT HAS BEEN GRANTED

AK Designated contracting states

Kind code of ref document: B1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

REG Reference to a national code

Ref country code: GB

Ref legal event code: FG4D

REG Reference to a national code

Ref country code: CH

Ref legal event code: EP

REG Reference to a national code

Ref country code: DE

Ref legal event code: R096

Ref document number: 602017002217

Country of ref document: DE

REG Reference to a national code

Ref country code: AT

Ref legal event code: REF

Ref document number: 1099050

Country of ref document: AT

Kind code of ref document: T

Effective date: 20190315

REG Reference to a national code

Ref country code: IE

Ref legal event code: FG4D

REG Reference to a national code

Ref country code: NL

Ref legal event code: FP

REG Reference to a national code

Ref country code: LT

Ref legal event code: MG4D

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: SE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190220

Ref country code: LT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190220

Ref country code: PT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190620

Ref country code: FI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190220

Ref country code: NO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190520

Ref country code: ES

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190220

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: RS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190220

Ref country code: LV

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190220

Ref country code: IS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190620

Ref country code: HR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190220

Ref country code: GR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190521

Ref country code: BG

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190520

REG Reference to a national code

Ref country code: AT

Ref legal event code: MK05

Ref document number: 1099050

Country of ref document: AT

Kind code of ref document: T

Effective date: 20190220

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: DK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190220

Ref country code: EE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190220

Ref country code: AL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190220

Ref country code: SK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190220

Ref country code: CZ

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190220

Ref country code: RO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190220

Ref country code: IT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190220

REG Reference to a national code

Ref country code: DE

Ref legal event code: R097

Ref document number: 602017002217

Country of ref document: DE

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: PL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190220

Ref country code: SM

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190220

PLBE No opposition filed within time limit

Free format text: ORIGINAL CODE: 0009261

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: AT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190220

26N No opposition filed

Effective date: 20191121

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: SI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190220

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: TR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190220

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: LU

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20190802

Ref country code: MC

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190220

REG Reference to a national code

Ref country code: BE

Ref legal event code: MM

Effective date: 20190831

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: IE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20190802

Ref country code: FR

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20190831

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: BE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20190831

REG Reference to a national code

Ref country code: CH

Ref legal event code: PL

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: CH

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20200831

Ref country code: LI

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20200831

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: CY

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190220

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: MT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190220

Ref country code: HU

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT; INVALID AB INITIO

Effective date: 20170802

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: MK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190220

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: NL

Payment date: 20230721

Year of fee payment: 7

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: GB

Payment date: 20230720

Year of fee payment: 7

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: DE

Payment date: 20230720

Year of fee payment: 7