WO2024053988A1 - Electronic device having a flexible display and method for providing a control object based on a grip state thereof

Electronic device having a flexible display and method for providing a control object based on a grip state thereof

Info

Publication number
WO2024053988A1
Authority
WO
WIPO (PCT)
Prior art keywords
control object
electronic device
area
display
state
Prior art date
Application number
PCT/KR2023/013229
Other languages
English (en)
Korean (ko)
Inventor
이윤선
김상헌
김주영
김창환
문현정
배성찬
임연욱
Original Assignee
삼성전자 주식회사
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from KR1020220133910A (published as KR20240033613A)
Application filed by 삼성전자 주식회사
Priority to US18/488,464 (published as US20240077956A1)
Publication of WO2024053988A1

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00 - Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16 - Constructional details or arrangements
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 - Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041 - Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 - Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04817 - Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance using icons
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 - Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 - Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04842 - Selection of displayed objects or displayed text elements
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04M - TELEPHONIC COMMUNICATION
    • H04M1/00 - Substation equipment, e.g. for use by subscribers
    • H04M1/02 - Constructional features of telephone sets
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04M - TELEPHONIC COMMUNICATION
    • H04M1/00 - Substation equipment, e.g. for use by subscribers
    • H04M1/72 - Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724 - User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72403 - User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04M - TELEPHONIC COMMUNICATION
    • H04M1/00 - Substation equipment, e.g. for use by subscribers
    • H04M1/72 - Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724 - User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72448 - User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions
    • H04M1/72454 - User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions according to context-related or environment-related conditions

Definitions

  • Various embodiments of the present disclosure provide an electronic device with a flexible display (or foldable display) and a method for providing a control object based on a user's grip state in the electronic device.
  • Electronic devices may have a limited size for portability, and as a result the size of the display is also limited. Accordingly, various types of electronic devices that provide expanded screens have recently been developed. For example, electronic devices are equipped with a plurality of displays and provide an expanded screen through multi-display. For example, electronic devices are gradually increasing in screen size beyond displays of limited size, and are designed to provide users with various services (or functions) through large screens.
  • electronic devices may have new form factors such as multi-display (e.g., dual display) devices (e.g., foldable devices).
  • Foldable devices are equipped with a foldable (or bendable) display (e.g., flexible display or foldable display) and can be used folded or unfolded.
  • the electronic device can provide a fixed user interface (UI) regardless of the user's grip state.
  • the electronic device may provide a user interface configured identically regardless of the user's grip state in the unfolded or folded state.
  • as a result, user accessibility to the user interface may be reduced.
  • for example, a situation may arise where it is difficult for the user to select (or touch) a control object provided in an area that is out of reach of the user's fingers.
  • provided are a method of providing a control object (or controller) related to an application running on an electronic device in an optimization area according to the state in which the user grips the electronic device, and an electronic device supporting the same.
  • An electronic device may include a display, a memory, and a processor operatively connected to the display and the memory.
  • the processor may control the display to display an application execution screen in a designated state of the electronic device.
  • the processor may operate to detect at least one control object in the execution screen.
  • the processor may operate to determine the user's grip state.
  • the processor may operate to identify a target control object from the at least one control object based on the specified state and the grip state.
  • the processor may operate to provide a duplicate control object corresponding to the target control object in an optimization area corresponding to the grip state.
  • a method of operating an electronic device may include displaying an execution screen of an application in a designated state of the electronic device.
  • the operating method may include detecting at least one control object in the execution screen.
  • the operating method may include determining the user's grip state.
  • the operating method may include identifying a target control object from the at least one control object based on the designated state and the grip state.
  • the operating method may include providing a duplicate control object corresponding to the target control object in an optimization area corresponding to the grip state.
  • various embodiments of the present disclosure may include a computer-readable recording medium on which a program for executing the method on a processor is recorded.
  • various embodiments may provide a non-transitory computer-readable storage medium (or computer program product) storing one or more programs.
  • the one or more programs, when executed by a processor of an electronic device, may include instructions that perform: an operation of displaying an execution screen of an application in a designated state of the electronic device; an operation of detecting at least one control object on the execution screen; an operation of determining a user's grip state; an operation of identifying a target control object from the at least one control object based on the designated state and the grip state; and an operation of providing a duplicate control object corresponding to the target control object in an optimization area matching the grip state.
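  • As a concrete illustration of the sequence just described (display an execution screen, detect control objects, determine the grip state, identify a target, and provide a duplicate in an optimization area), the following is a minimal Kotlin sketch. All type names and the distance-based selection heuristic are illustrative assumptions, not the implementation claimed by this disclosure.

    import kotlin.math.hypot

    // Hypothetical types; the disclosure does not prescribe these structures.
    enum class GripSide { LEFT, RIGHT }
    data class Point(val x: Float, val y: Float)
    data class ControlObject(val id: String, val center: Point)
    data class GripState(val side: GripSide, val thumbReach: Point)

    // Detect control objects in the execution screen (stubbed here).
    fun detectControlObjects(screen: List<ControlObject>): List<ControlObject> = screen

    // Assumed heuristic: the target is the control object farthest from the
    // area the gripping thumb can comfortably reach.
    fun identifyTarget(objects: List<ControlObject>, grip: GripState): ControlObject? =
        objects.maxByOrNull {
            hypot(it.center.x - grip.thumbReach.x, it.center.y - grip.thumbReach.y)
        }

    // Provide a duplicate control object in the optimization area near the grip.
    fun provideDuplicate(target: ControlObject, grip: GripState): ControlObject =
        target.copy(id = target.id + "_dup", center = grip.thumbReach)

    fun main() {
        val screen = listOf(
            ControlObject("back", Point(40f, 60f)),
            ControlObject("share", Point(1000f, 80f)),
        )
        val grip = GripState(GripSide.LEFT, thumbReach = Point(120f, 1800f))
        val target = identifyTarget(detectControlObjects(screen), grip)
        target?.let { println("Duplicate provided: ${provideDuplicate(it, grip)}") }
    }

  • In this sketch, the original control object remains in place and only a copy is re-anchored near the thumb, matching the duplicate-object behavior described above.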
  • according to various embodiments, a control object (or controller) related to an application running on the electronic device is provided in duplicate in an optimization area according to the state in which the user grips the electronic device, so that the user's finger movement distance can be minimized.
  • control objects in the area that is least accessible in the current grip state are provided in duplicate in the most optimized area, so that user accessibility to control objects that the user's fingers could not otherwise reach can be improved.
  • FIG. 1 is a block diagram of an electronic device in a network environment according to various embodiments.
  • Figure 2 is a block diagram of a display module according to an embodiment of the present disclosure.
  • FIG. 3 is a diagram illustrating an example of an electronic device according to an embodiment of the present disclosure.
  • FIG. 4 is a diagram illustrating an example of an electronic device according to an embodiment of the present disclosure.
  • FIG. 5 is a diagram illustrating an example of an electronic device according to an embodiment of the present disclosure.
  • FIG. 6 is a diagram illustrating an example of an electronic device according to an embodiment of the present disclosure.
  • FIGS. 7A and 7B are diagrams illustrating an example of an electronic device according to an embodiment of the present disclosure.
  • FIG. 8A is a diagram illustrating an unfolded state of an electronic device according to an embodiment of the present disclosure.
  • FIG. 8B is a diagram illustrating a folded state of an electronic device according to an embodiment of the present disclosure.
  • FIG. 8C is a diagram illustrating a partially folded state or an intermediate state of an electronic device according to an embodiment of the present disclosure.
  • FIG. 9 is a diagram schematically showing the configuration of an electronic device according to an embodiment of the present disclosure.
  • FIG. 10 is a flowchart illustrating a method of operating an electronic device according to an embodiment of the present disclosure.
  • FIG. 11 is a diagram illustrating an example of a designated state of an electronic device according to an embodiment of the present disclosure.
  • FIG. 12 is a flowchart illustrating a method of operating an electronic device according to an embodiment of the present disclosure.
  • FIG. 13 is a diagram illustrating an example of virtual division of a display area in an electronic device according to an embodiment of the present disclosure.
  • FIG. 14 is a diagram illustrating an example of providing duplicate control objects in an electronic device according to an embodiment of the present disclosure.
  • FIG. 15A is a diagram for explaining a control object and an example in which it is provided according to an embodiment of the present disclosure.
  • FIG. 15B is a diagram for explaining a control object and an example in which it is provided according to an embodiment of the present disclosure.
  • FIG. 16 is a flowchart illustrating a method of operating an electronic device according to an embodiment of the present disclosure.
  • FIG. 17 is a diagram illustrating an example of providing a target control object based on a grip position in an electronic device according to an embodiment of the present disclosure.
  • FIG. 18 is a diagram illustrating an example of providing a target control object based on a grip position in an electronic device according to an embodiment of the present disclosure.
  • FIG. 19A is a diagram illustrating an example of determining a target control object in an electronic device according to an embodiment of the present disclosure.
  • FIG. 19B is a diagram illustrating an example of determining a target control object in an electronic device according to an embodiment of the present disclosure.
  • FIG. 20 is a diagram illustrating an example of providing control objects in duplicate on an execution screen in an electronic device according to an embodiment of the present disclosure.
  • FIG. 21 is a diagram illustrating an example of providing control objects in duplicate on an execution screen in an electronic device according to an embodiment of the present disclosure.
  • FIG. 22 is a diagram illustrating an example of providing a target control object based on a grip position in an electronic device according to an embodiment of the present disclosure.
  • FIG. 23 is a diagram illustrating an example of providing a target control object based on a grip position in an electronic device according to an embodiment of the present disclosure.
  • FIG. 24 is a diagram illustrating an example of determining a target control object in an electronic device according to an embodiment of the present disclosure.
  • FIG. 25 is a diagram illustrating an example of providing control objects in duplicate on an execution screen in an electronic device according to an embodiment of the present disclosure.
  • FIG. 26 is a diagram illustrating an example of providing control objects in duplicate on an execution screen in an electronic device according to an embodiment of the present disclosure.
  • FIG. 27A is a diagram illustrating an example of selecting a target control object based on user interaction in an electronic device according to an embodiment of the present disclosure.
  • FIG. 27B is a diagram illustrating an example of selecting a target control object based on user interaction in an electronic device according to an embodiment of the present disclosure.
  • FIG. 28 is a diagram illustrating an example of selecting a target control object based on user interaction in an electronic device according to an embodiment of the present disclosure.
  • FIG. 29 is a diagram illustrating an example of selecting a target control object based on user interaction in an electronic device according to an embodiment of the present disclosure.
  • FIG. 30 is a diagram illustrating an example of providing a duplicate control object based on user interaction in an electronic device according to an embodiment of the present disclosure.
  • FIG. 31 is a diagram illustrating an example in which a duplicate control object is provided in an electronic device according to an embodiment of the present disclosure.
  • FIG. 32 is a diagram illustrating an example of detecting a holding state in an electronic device according to an embodiment of the present disclosure.
  • FIG. 33 is a diagram illustrating an example of detecting a holding state in an electronic device according to an embodiment of the present disclosure.
  • FIG. 34 is a diagram illustrating an example of detecting a holding state in an electronic device according to an embodiment of the present disclosure.
  • FIG. 35 is a diagram illustrating an example of detecting a holding state in an electronic device according to an embodiment of the present disclosure.
  • FIG. 36 is a diagram illustrating an example of detecting a holding state in an electronic device according to an embodiment of the present disclosure.
  • FIG. 37 is a diagram illustrating an example of detecting a holding state in an electronic device according to an embodiment of the present disclosure.
  • FIG. 1 is a block diagram of an electronic device 101 in a network environment 100 according to various embodiments.
  • Referring to FIG. 1, the electronic device 101 may communicate with the electronic device 102 through a first network 198 (e.g., a short-range wireless communication network), or with at least one of the electronic device 104 or the server 108 through a second network 199 (e.g., a long-range wireless communication network). According to one embodiment, the electronic device 101 may communicate with the electronic device 104 through the server 108.
  • the electronic device 101 may include a processor 120, a memory 130, an input module 150, a sound output module 155, a display module 160, an audio module 170, a sensor module 176, an interface 177, a connection terminal 178, a haptic module 179, a camera module 180, a power management module 188, a battery 189, a communication module 190, a subscriber identification module 196, or an antenna module 197.
  • in some embodiments, at least one of these components (e.g., the connection terminal 178) may be omitted, or one or more other components may be added to the electronic device 101.
  • in some embodiments, some of these components (e.g., the sensor module 176, the camera module 180, or the antenna module 197) may be integrated into one component (e.g., the display module 160).
  • the processor 120 may, for example, execute software (e.g., a program 140) to control at least one other component (e.g., a hardware or software component) of the electronic device 101 connected to the processor 120, and may perform various data processing or computations. According to one embodiment, as at least part of the data processing or computation, the processor 120 may store instructions or data received from another component (e.g., the sensor module 176 or the communication module 190) in a volatile memory 132, process the instructions or data stored in the volatile memory 132, and store the resulting data in a non-volatile memory 134.
  • the processor 120 may include a main processor 121 (e.g., a central processing unit (CPU) or an application processor (AP)), or an auxiliary processor 123 (e.g., a graphics processing unit (GPU), a neural processing unit (NPU), an image signal processor (ISP), a sensor hub processor, or a communication processor (CP)) operable independently from, or together with, the main processor 121.
  • for example, when the electronic device 101 includes the main processor 121 and the auxiliary processor 123, the auxiliary processor 123 may be set to use less power than the main processor 121 or to be specialized for a designated function.
  • the auxiliary processor 123 may be implemented separately from the main processor 121 or as part of it.
  • the auxiliary processor 123 may, for example, control at least some of the functions or states related to at least one of the components of the electronic device 101 (e.g., the display module 160, the sensor module 176, or the communication module 190), instead of the main processor 121 while the main processor 121 is in an inactive (e.g., sleep) state, or together with the main processor 121 while the main processor 121 is in an active (e.g., application execution) state.
  • according to one embodiment, the auxiliary processor 123 (e.g., an image signal processor or a communication processor) may be implemented as part of another functionally related component (e.g., the camera module 180 or the communication module 190).
  • the auxiliary processor 123 may include a hardware structure specialized for processing artificial intelligence models.
  • Artificial intelligence models can be created through machine learning. For example, such learning may be performed in the electronic device 101 itself, in which the artificial intelligence model is executed, or may be performed through a separate server (e.g., the server 108).
  • Learning algorithms may include, for example, supervised learning, unsupervised learning, semi-supervised learning, or reinforcement learning, but are not limited to the above examples.
  • An artificial intelligence model may include multiple artificial neural network layers.
  • Artificial neural networks may be one of a deep neural network (DNN), a convolutional neural network (CNN), a recurrent neural network (RNN), a restricted Boltzmann machine (RBM), a deep belief network (DBN), a bidirectional recurrent deep neural network (BRDNN), or deep Q-networks, or a combination of two or more of the above, but are not limited to the examples described above.
  • artificial intelligence models may additionally or alternatively include software structures.
  • the memory 130 may store various data used by at least one component (e.g., the processor 120 or the sensor module 176) of the electronic device 101. The data may include, for example, input data or output data for software (e.g., the program 140) and instructions related thereto.
  • Memory 130 may include volatile memory 132 or non-volatile memory 134.
  • the program 140 may be stored as software in the memory 130 and may include, for example, an operating system (OS) 142, middleware 144, or an application 146.
  • the input module 150 may receive commands or data to be used in a component of the electronic device 101 (e.g., the processor 120) from outside the electronic device 101 (e.g., a user).
  • the input module 150 may include, for example, a microphone, a mouse, a keyboard, keys (e.g., buttons), or a digital pen (e.g., a stylus pen).
  • the sound output module 155 may output sound signals to the outside of the electronic device 101.
  • the sound output module 155 may include, for example, a speaker or a receiver. The speaker may be used for general purposes, such as multimedia playback or playback of recordings.
  • the receiver can be used to receive incoming calls. According to one embodiment, the receiver may be implemented separately from the speaker or as part of it.
  • the display module 160 can visually provide information to the outside of the electronic device 101 (e.g., to a user).
  • the display module 160 may include, for example, a display, a hologram device, or a projector, and a control circuit for controlling the device.
  • the display module 160 may include a touch sensor configured to detect a touch, or a pressure sensor configured to measure the intensity of force generated by the touch.
  • the audio module 170 can convert sound into an electrical signal or, conversely, convert an electrical signal into sound. According to one embodiment, the audio module 170 may acquire sound through the input module 150, or may output sound through the sound output module 155 or an external electronic device (e.g., the electronic device 102, such as a speaker or headphones) connected directly or wirelessly to the electronic device 101.
  • the sensor module 176 may detect an operating state (e.g., power or temperature) of the electronic device 101 or an external environmental state (e.g., a user state), and generate an electrical signal or data value corresponding to the detected state.
  • the sensor module 176 may include, for example, a gesture sensor, a gyro sensor, an air pressure sensor, a magnetic sensor, an acceleration sensor, a grip sensor, a proximity sensor, a color sensor, an infrared (IR) sensor, a biometric sensor, a temperature sensor, a humidity sensor, or an illuminance sensor.
  • the interface 177 may support one or more designated protocols that can be used to connect the electronic device 101 directly or wirelessly with an external electronic device (e.g., the electronic device 102).
  • the interface 177 may include, for example, a high definition multimedia interface (HDMI), a universal serial bus (USB) interface, a secure digital (SD) card interface, or an audio interface.
  • the connection terminal 178 may include a connector through which the electronic device 101 can be physically connected to an external electronic device (e.g., the electronic device 102).
  • the connection terminal 178 may include, for example, an HDMI connector, a USB connector, an SD card connector, or an audio connector (e.g., a headphone connector).
  • the haptic module 179 can convert electrical signals into mechanical stimulation (e.g., vibration or movement) or electrical stimulation that the user can perceive through tactile or kinesthetic senses.
  • the haptic module 179 may include, for example, a motor, a piezoelectric element, or an electrical stimulation device.
  • the camera module 180 can capture still images and moving images.
  • the camera module 180 may include one or more lenses, image sensors, image signal processors, or flashes.
  • the power management module 188 can manage power supplied to the electronic device 101.
  • the power management module 188 may be implemented as at least a part of, for example, a power management integrated circuit (PMIC).
  • the battery 189 may supply power to at least one component of the electronic device 101.
  • the battery 189 may include, for example, a non-rechargeable primary battery, a rechargeable secondary battery, or a fuel cell.
  • the communication module 190 may support establishment of a direct (e.g., wired) communication channel or a wireless communication channel between the electronic device 101 and an external electronic device (e.g., the electronic device 102, the electronic device 104, or the server 108), and communication through the established channel. The communication module 190 may operate independently of the processor 120 (e.g., an application processor) and may include one or more communication processors that support direct (e.g., wired) communication or wireless communication.
  • the communication module 190 may include a wireless communication module 192 (e.g., a cellular communication module, a short-range wireless communication module, or a global navigation satellite system (GNSS) communication module) or a wired communication module 194 (e.g., a local area network (LAN) communication module or a power line communication module).
  • the corresponding communication module among these communication modules may communicate with the external electronic device 104 through the first network 198 (e.g., a short-range communication network such as Bluetooth, wireless fidelity (WiFi) direct, or infrared data association (IrDA)) or the second network 199 (e.g., a long-range communication network such as a legacy cellular network, a 5G network, a next-generation communication network, the Internet, or a computer network (e.g., a LAN or wide area network (WAN))).
  • the wireless communication module 192 may use subscriber information (e.g., an international mobile subscriber identity (IMSI)) stored in the subscriber identification module 196 to identify or authenticate the electronic device 101 within a communication network, such as the first network 198 or the second network 199.
  • the wireless communication module 192 may support a 5G network after a 4G network, and next-generation communication technology, for example, new radio (NR) access technology.
  • the NR access technology may support high-speed transmission of high-capacity data (enhanced mobile broadband (eMBB)), minimization of terminal power and access by multiple terminals (massive machine type communications (mMTC)), or high reliability and low latency (ultra-reliable and low-latency communications (URLLC)).
  • the wireless communication module 192 may support a high-frequency band (e.g., a mmWave band), for example, to achieve a high data rate.
  • the wireless communication module 192 may use various technologies for securing performance in a high-frequency band, for example, beamforming, massive multiple-input and multiple-output (massive MIMO), full-dimensional MIMO (FD-MIMO), an array antenna, analog beamforming, or a large-scale antenna.
  • the wireless communication module 192 may support various requirements specified in the electronic device 101, an external electronic device (e.g., the electronic device 104), or a network system (e.g., the second network 199). According to one embodiment, the wireless communication module 192 may support a peak data rate (e.g., 20 Gbps or more) for realizing eMBB, loss coverage (e.g., 164 dB or less) for realizing mMTC, or U-plane latency (e.g., 0.5 ms or less for each of downlink (DL) and uplink (UL), or a round trip of 1 ms or less) for realizing URLLC.
  • the antenna module 197 may transmit or receive signals or power to or from the outside (e.g., an external electronic device).
  • the antenna module 197 may include an antenna including a radiator made of a conductor or a conductive pattern formed on a substrate (e.g., a printed circuit board (PCB)).
  • the antenna module 197 may include a plurality of antennas (e.g., an array antenna). In this case, at least one antenna suitable for the communication scheme used in the communication network, such as the first network 198 or the second network 199, may be selected from the plurality of antennas by, for example, the communication module 190. Signals or power may be transmitted or received between the communication module 190 and an external electronic device through the selected at least one antenna.
  • according to some embodiments, another component (e.g., a radio frequency integrated circuit (RFIC)) may be additionally formed as part of the antenna module 197.
  • the antenna module 197 may form a mmWave antenna module.
  • according to one embodiment, a mmWave antenna module may include a printed circuit board, an RFIC disposed on or adjacent to a first surface (e.g., the bottom surface) of the printed circuit board and capable of supporting a designated high-frequency band (e.g., a mmWave band), and a plurality of antennas (e.g., an array antenna) disposed on or adjacent to a second surface (e.g., the top or side surface) of the printed circuit board and capable of transmitting or receiving signals in the designated high-frequency band.
  • at least some of the above-described components may be connected to each other and exchange signals (e.g., commands or data) through an inter-peripheral communication scheme (e.g., a bus, general purpose input and output (GPIO), serial peripheral interface (SPI), or mobile industry processor interface (MIPI)).
  • according to one embodiment, commands or data may be transmitted or received between the electronic device 101 and the external electronic device 104 through the server 108 connected to the second network 199.
  • Each of the external electronic devices 102 or 104 may be of the same or different type as the electronic device 101.
  • all or part of the operations performed in the electronic device 101 may be executed in one or more of the external electronic devices 102, 104, or 108.
  • for example, when the electronic device 101 needs to perform a function or service, the electronic device 101, instead of (or in addition to) executing the function or service on its own, may request one or more external electronic devices to perform at least part of the function or service.
  • One or more external electronic devices that have received the request may execute at least part of the requested function or service, or an additional function or service related to the request, and transmit the result of the execution to the electronic device 101.
  • the electronic device 101 may process the result as is or additionally and provide it as at least part of a response to the request.
  • to this end, for example, cloud computing, distributed computing, mobile edge computing (MEC), or client-server computing technology may be used.
  • the electronic device 101 may provide an ultra-low latency service using, for example, distributed computing or mobile edge computing.
  • the external electronic device 104 may include an Internet of Things (IoT) device.
  • Server 108 may be an intelligent server using machine learning and/or neural networks.
  • the external electronic device 104 or server 108 may be included in the second network 199.
  • the electronic device 101 may be applied to intelligent services (e.g., smart home, smart city, smart car, or healthcare) based on 5G communication technology and IoT-related technology.
  • Electronic devices may be of various types.
  • Electronic devices may include, for example, portable communication devices (e.g., smartphones), computer devices, portable multimedia devices, portable medical devices, cameras, wearable devices, or home appliances.
  • Electronic devices according to embodiments of this document are not limited to the above-described devices.
  • terms such as 'first' and 'second' may be used simply to distinguish one component from another, and do not limit the components in other respects (e.g., importance or order).
  • when one (e.g., first) component is referred to as being 'coupled' or 'connected' to another (e.g., second) component, with or without the term 'functionally' or 'communicatively', it means that the component can be connected to the other component directly (e.g., by wire), wirelessly, or through a third component.
  • the term 'module' used in various embodiments of this document may include a unit implemented in hardware, software, or firmware, and may be used interchangeably with terms such as logic, logic block, component, or circuit. A module may be an integrally formed part, or a minimum unit of the part or a portion thereof, that performs one or more functions. For example, according to one embodiment, the module may be implemented in the form of an application-specific integrated circuit (ASIC).
  • Various embodiments of this document may be implemented as software (e.g., the program 140) including one or more instructions stored in a storage medium (e.g., the built-in memory 136 or the external memory 138) that can be read by a machine (e.g., the electronic device 101). For example, a processor (e.g., the processor 120) of the machine may call at least one of the one or more instructions stored in the storage medium and execute it.
  • the one or more instructions may include code generated by a compiler or code that can be executed by an interpreter.
  • a storage medium that can be read by a machine may be provided in the form of a non-transitory storage medium.
  • here, 'non-transitory' only means that the storage medium is a tangible device and does not contain signals (e.g., electromagnetic waves); this term does not distinguish between cases where data is stored semi-permanently in the storage medium and cases where it is stored temporarily.
  • Computer program products are commodities and can be traded between sellers and buyers.
  • the computer program product may be distributed in the form of a machine-readable storage medium (e.g., a compact disc read only memory (CD-ROM)), or may be distributed (e.g., downloaded or uploaded) online through an application store (e.g., Play Store™) or directly between two user devices (e.g., smartphones).
  • in the case of online distribution, at least a portion of the computer program product may be at least temporarily stored or temporarily created in a machine-readable storage medium, such as the memory of a manufacturer's server, an application store's server, or a relay server.
  • each component (e.g., a module or a program) of the above-described components may include a single entity or a plurality of entities, and some of the plurality of entities may be separately disposed in another component.
  • one or more of the components or operations described above may be omitted, or one or more other components or operations may be added.
  • alternatively or additionally, a plurality of components (e.g., modules or programs) may be integrated into one component.
  • in such a case, the integrated component may perform one or more functions of each of the plurality of components in the same or similar manner as they were performed by the corresponding component of the plurality of components prior to the integration.
  • according to various embodiments, operations performed by a module, a program, or another component may be executed sequentially, in parallel, repeatedly, or heuristically; one or more of the operations may be executed in a different order or omitted; or one or more other operations may be added.
  • Figure 2 is a block diagram 200 of a display module according to an embodiment of the present disclosure.
  • the display module 160 may include a display 210 and a display driver IC (DDI) 230 for controlling the display 210.
  • the DDI 230 may include an interface module 231, a memory 233 (e.g., a buffer memory), an image processing module 235, or a mapping module 237.
  • the DDI 230 may receive image information, including image data or an image control signal corresponding to a command for controlling the image data, from other components of the electronic device 101 through the interface module 231.
  • for example, the image information may be received from the processor 120 (e.g., the main processor 121, such as an application processor) or an auxiliary processor 123 (e.g., a graphics processing unit) that operates independently of the functions of the main processor 121.
  • DDI 230 can communicate with the touch circuit 250 or sensor module 176 through the interface module 231. Additionally, the DDI 230 may store at least some of the received image information in the memory 233, for example, on a frame basis.
  • the image processing module 235 may perform pre-processing or post-processing (e.g., resolution, brightness, or size adjustment) on at least a portion of the image data, based on the characteristics of the image data or the characteristics of the display 210.
  • the mapping module 237 may generate a voltage value or current value corresponding to the image data pre-processed or post-processed through the image processing module 235.
  • according to one embodiment, the generation of the voltage value or current value may be performed based at least in part on, for example, the properties of the pixels of the display 210 (e.g., the pixel array (an RGB stripe or PenTile structure), or the size of each subpixel).
  • At least some pixels of the display 210 may be driven, for example, based at least in part on the voltage value or the current value, so that visual information (e.g., text, an image, or an icon) corresponding to the image data is displayed through the display 210.
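  • As a rough conceptual illustration of the data path just described (receive image information, buffer a frame, pre/post-process it, then map pixel values to drive levels), here is a hedged Kotlin sketch. A real DDI is dedicated hardware; the class below, its names, and the linear voltage mapping are assumptions for explanation only.

    // Conceptual model of the DDI data path; not real display-driver firmware.
    class ConceptualDdi(private val maxDriveVoltage: Double = 5.0) {
        private var frameBuffer = IntArray(0) // plays the role of the memory 233

        // Role of the interface module 231: receive a frame of 8-bit pixel values.
        fun receiveFrame(pixels: IntArray) { frameBuffer = pixels.copyOf() }

        // Role of the image processing module 235: e.g., brightness adjustment.
        fun adjustBrightness(gain: Double) {
            for (i in frameBuffer.indices) {
                frameBuffer[i] = (frameBuffer[i] * gain).toInt().coerceIn(0, 255)
            }
        }

        // Role of the mapping module 237: map pixel values to drive voltages
        // (a linear mapping is assumed; real panels use panel-specific curves).
        fun toDriveVoltages(): DoubleArray =
            DoubleArray(frameBuffer.size) { i -> frameBuffer[i] / 255.0 * maxDriveVoltage }
    }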
  • the display module 160 may further include a touch circuit 250.
  • the touch circuit 250 may include a touch sensor 251 and a touch sensor IC 253 for controlling the touch sensor 251.
  • the touch sensor IC 253 may control the touch sensor 251 to detect a touch input or a hovering input for a specific position of the display 210.
  • the touch sensor IC 253 may detect a touch input or hovering input by measuring a change in a signal (e.g., voltage, light amount, resistance, or charge amount) for a specific position of the display 210.
  • the touch sensor IC 253 may provide information (e.g., location, area, pressure, or time) about the detected touch input or hovering input to the processor 120.
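  • Because the touch sensor IC reports the location, area, pressure, and time of each contact, grip-related logic of the kind described in this disclosure could, in principle, be driven by that stream. The Kotlin sketch below uses an assumed, simplified heuristic (long-lived, large-area contacts near a screen edge are treated as grip contacts); the thresholds and names are illustrative, not the disclosure's algorithm.

    data class TouchInfo(val x: Float, val y: Float, val areaPx: Float,
                         val pressure: Float, val durationMs: Long)

    enum class InferredGrip { LEFT_HAND, RIGHT_HAND, UNKNOWN }

    // Assumed criteria for a grip-like contact: large area, long duration, near an edge.
    fun inferGrip(touches: List<TouchInfo>, screenWidth: Float, edgeBand: Float = 60f,
                  minArea: Float = 400f, minDurationMs: Long = 500): InferredGrip {
        val gripTouches = touches.filter {
            it.areaPx >= minArea && it.durationMs >= minDurationMs &&
                (it.x <= edgeBand || it.x >= screenWidth - edgeBand)
        }
        val left = gripTouches.count { it.x <= edgeBand }
        val right = gripTouches.size - left
        return when {
            left > right -> InferredGrip.LEFT_HAND
            right > left -> InferredGrip.RIGHT_HAND
            else -> InferredGrip.UNKNOWN
        }
    }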
  • At least a portion of the touch circuit 250 (e.g., the touch sensor IC 253) may be included as part of the DDI 230 or the display 210, or as part of another component disposed outside the display module 160 (e.g., the auxiliary processor 123).
  • the display module 160 may further include at least one sensor (e.g., a fingerprint sensor, an iris sensor, a pressure sensor, or an illuminance sensor) of the sensor module 176, or a control circuit therefor.
  • the at least one sensor or a control circuit therefor may be embedded in a part of the display module 160 (e.g., the display 210 or the DDI 230) or a part of the touch circuit 250.
  • for example, when the sensor module 176 embedded in the display module 160 includes a biometric sensor (e.g., a fingerprint sensor), the biometric sensor may acquire biometric information (e.g., a fingerprint image) associated with a touch input through a portion of the display 210.
  • when the sensor module 176 includes a pressure sensor, the pressure sensor may acquire pressure information associated with a touch input through a part or the entire area of the display 210.
  • the touch sensor 251 or the sensor module 176 may be disposed between pixels of a pixel layer of the display 210, or above or below the pixel layer.
  • FIGS. 3, 4, 5, and 6 are diagrams illustrating an example of an electronic device according to an embodiment of the present disclosure.
  • Figures 3, 4, 5, and 6 may show examples of how the display form (or the state of the electronic device) changes in the electronic device 101 according to an embodiment.
  • for example, the figures may show examples of changing the form of a display (e.g., the display 210 of FIG. 2) in an electronic device 101 that includes a foldable (or bendable) display (e.g., a foldable display or flexible display).
  • the electronic device 101 may be implemented in various forms, and depending on the implementation form of the electronic device 101, the display may be folded and unfolded in various ways.
  • the examples of the electronic device 101 shown in FIGS. 3 and 4 may each represent an example in which the electronic device 101 has one folding axis.
  • for example, FIGS. 3 and 4 may show examples of changing the form of the display (e.g., the display 210 of FIG. 2) in the electronic device 101 including two display surfaces (e.g., a first display surface and a second display surface).
  • the example of the electronic device 101 shown in FIGS. 5 and 6 may represent an example in which the electronic device 101 has two folding axes.
  • for example, FIGS. 5 and 6 may show examples of changing the form of the display 210 in the electronic device 101 including three display surfaces (e.g., a first display surface, a second display surface, and a third display surface).
  • however, this is an example and the various embodiments are not limited thereto; the number of folding axes that the electronic device 101 can have is not limited.
  • the electronic device 101 may include a foldable device that can be folded and unfolded.
  • the electronic device 101 is equipped with a foldable (or bendable) display (eg, a foldable display or flexible display) and can be used by folding or unfolding.
  • for example, when the in-folding type electronic device 101 is folded (e.g., in a folded state (or closed state) based on the form of FIG. 3 or FIG. 5), at least a portion of the first display surface (or first area) and the second display surface (or second area) of the display 210 may be in a closed state in which they contact each other with respect to the folding point (e.g., a folding axis or hinge axis).
  • for example, when the out-folding type electronic device 101 is folded (e.g., in a folded state based on the form of FIG. 4), at least a portion of the first portion and the second portion of the housing (e.g., cover) may be in a closed state in which they contact each other with respect to the folding point (e.g., a folding axis or hinge axis).
  • for example, when the in/out-folding type electronic device 101 is folded (e.g., in a folded state based on the form of FIG. 6), at least a portion of the first portion and the second portion of the housing (e.g., cover) contact each other with respect to a first folding point (e.g., a first folding axis), and at least a portion of the second display surface (or second area) and the third display surface (or third area) may be in a closed state in which they contact each other.
  • Figures 3, 4, 5, and 6 may show a state (e.g., a partially closed state) in which the display of the electronic device 101 is folded at a certain angle (e.g., about 180 degrees or less) with respect to the folding axis (or hinge axis).
  • the electronic device 101, such as in the examples of Figure 3, Figure 4, Figure 5, or Figure 6, may include a display 210 (e.g., a liquid crystal display (LCD)) operatively connected to a processor (e.g., the processor 120 of Figure 1), together with a display driver IC (DDI) for driving the display 210.
  • the first display surface and the second display surface may be connected to one DDI.
  • in another embodiment, the electronic device 101 may include a first DDI operatively or electrically connected to the first display surface, and a second DDI operatively or electrically connected to the second display surface.
  • the first display surface and the second display surface may be operatively or electrically connected to each other, and may be formed by one display 210 (e.g., a foldable display or flexible display).
  • the display 210 of the electronic device 101 may be folded or unfolded in various ways (e.g., in-folding, out-folding, or in/out-folding) depending on the implementation type.
  • Referring to FIG. 3, the electronic device 101 may include a housing 350 that holds a display 210 including a first display surface 310 (or first area) and a second display surface 320 (or second area).
  • the housing 350 may include a foldable structure (e.g., a hinge structure); the first part 301 and the second part 303 may be formed to face away from each other in the folded state (or closed state), and to face the same direction in the unfolded state (or open state).
  • the electronic device 101 may include a vertical folding axis 390 passing through the center of the electronic device 101 (e.g., the center of the display 210, between the first display surface 310 and the second display surface 320). The electronic device 101 may be folded, unfolded, or bent with respect to the folding axis 390. According to one embodiment, FIG. 3 may show a form in which the display 210 (e.g., the first display surface 310 and the second display surface 320) of the electronic device 101 is folded inward so as not to be exposed to the outside.
  • that any two parts included in the display 210 of the electronic device 101 face each other may mean that the two parts are completely parallel or almost parallel.
  • that the electronic device 101 is completely folded may mean that the two parts of the electronic device 101 are arranged almost closely, although they do not necessarily need to be in contact.
  • that the electronic device 101 is completely unfolded may indicate a state in which the first display surface 310 and the second display surface 320 of the display 210 are visually exposed to the outside and form a flat surface like one display 210, so that the area of the display 210 exposed to the outside is the largest or approaches the largest area.
  • Referring to FIG. 4, the electronic device 101 may include a housing 450 that holds a display 210 including a first display surface 410 (or first area) and a second display surface 420 (or second area).
  • the housing 450 may include a foldable structure (e.g., a hinge structure); the first part 401 and the second part 403 may be formed to face opposite directions in the folded state, and to face the same direction in the unfolded state.
  • the electronic device 101 may include a vertical folding axis 490 passing through the center of the electronic device 101.
  • the electronic device 101 may be folded, unfolded, or bent based on the folding axis 490.
  • FIG. 4 may show a form in which the display 210 (e.g., the first display surface 410 and the second display surface 420) of the electronic device 101 is folded outward so as to be visually exposed to the outside of the electronic device 101.
  • that the electronic device 101 is completely folded may indicate that the two parts (e.g., of the housing) included on one side (e.g., the back) of the electronic device 101 face each other and are completely parallel or almost parallel.
  • that the electronic device 101 is completely folded may mean that the two parts included on one side of the electronic device 101 do not necessarily have to be in contact, but are arranged almost closely.
  • that the electronic device 101 is completely unfolded may indicate a state in which the first display surface 410 and the second display surface 420 of the display 210 are exposed to the outside and form a flat surface like one display 210, so that the area of the display 210 exposed to the outside is the largest or is close to the largest area.
  • in FIGS. 3 and 4, the folding axes 390 and 490 are shown as passing through the center of the electronic device 101, but the folding axes 390 and 490 may be located at any position in the electronic device 101.
  • the electronic device 101 may be folded or bent asymmetrically about the folding axes 390 and 490; in that case, in the folded state, the sizes of the two display surfaces (or two areas) divided by the folding axes 390 and 490 and facing each other may be different.
  • the shape of the electronic device 101 may also be in an intermediate form between fully folded and fully unfolded.
  • the electronic device 101 may detect the folded state or degree of folding of the electronic device 101. According to one embodiment, the electronic device 101 may activate or deactivate some display surfaces (or partial areas) of the display 210 by detecting the folded state or degree of folding.
  • for example, when the electronic device 101 of FIG. 3 is unfolded, all display surfaces of the display 210 (e.g., the first display surface 310 and the second display surface 320) may be activated.
  • for example, in the case of the electronic device 101 of FIG. 4, it may be determined whether the first display surface 410 (e.g., front surface) or the second display surface 420 (e.g., rear surface) is used; based on the result of the determination, the side of the display 210 that is used can be activated, and the other side of the display 210 that is not used can be deactivated.
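  • The activation logic described above can be summarized with a small Kotlin sketch; the names and the policy (all surfaces active when unfolded, only the surface in use active when folded) are assumptions illustrating the description, not a prescribed implementation.

    enum class FoldState { UNFOLDED, FOLDED }
    enum class Surface { FRONT, REAR }

    // Assumed policy: unfolded -> both surfaces form one screen;
    // folded (e.g., the out-folding device of FIG. 4) -> only the used surface stays on.
    fun selectActiveSurfaces(state: FoldState, surfaceInUse: Surface): Set<Surface> =
        when (state) {
            FoldState.UNFOLDED -> setOf(Surface.FRONT, Surface.REAR)
            FoldState.FOLDED -> setOf(surfaceInUse)
        }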
  • Referring to FIG. 5, the electronic device 101 may include a housing 550 that fixes the display 210 including a first display surface 510 (or first area), a second display surface 520 (or second area), and a third display surface 530 (or third area).
  • the housing 550 may include a foldable structure (e.g., a hinge structure). When the electronic device 101 is folded, the first part 501 and the third part 505 may face in the opposite direction from the second part 503, and the first part 501 and the third part 505 may face the same direction as each other. When the electronic device 101 is unfolded, the first part 501, the second part 503, and the third part 505 may be formed to face the same direction.
  • FIG. 5 may show an example of an electronic device 101 including two folding axes 590 and 595.
  • the two folding axes 590 and 595 may each be employed in a vertical direction to divide the electronic device 101 into thirds.
  • the electronic device 101 may be folded, unfolded, or bent based on the folding axes 590 and 595.
  • the electronic device 101 illustrated in FIG. 5 may represent an example of the G-fold type; FIG. 5 may show the electronic device 101 in a partially folded state (or partially unfolded state) viewed from the front, and the folded state viewed from the back (or below).
  • the electronic device 101 as illustrated in FIG. 5 may have different folding or bending directions with respect to each of the folding axes 590 and 595. This is an example, and the electronic device 101 may also be folded or bent in the same direction with respect to each of the folding axes 590 and 595. According to one embodiment, the electronic device 101 may be folded so that the first display surface 510 and the second display surface 520 of the display 210 face each other, and so that the second display surface 520 and the third display surface 530 face each other. Without being limited thereto, the electronic device 101 illustrated in FIG. 5 may also be folded so that the first display surface 510, the second display surface 520, and the third display surface 530 are exposed to the outside, with the back of the first display surface 510 and the back of the second display surface 520 folded to face each other, and the back of the second display surface 520 and the back of the third display surface 530 folded to face each other.
  • the electronic device 101 may be folded or bent asymmetrically with respect to each of the folding axes 590 and 595; even when the electronic device 101 is completely folded with respect to the folding axes 590 and 595, the display surfaces (or areas) of the electronic device 101 separated by the folding axes 590 and 595 may not completely overlap.
  • the display 210 may be employed on the front and/or rear of the electronic device 101. The display 210 can be activated or deactivated in a manner similar to that described with reference to FIGS. 3 and 4 above.
  • the electronic device 101 may include a housing 650 that fixes the display 210 including a first display surface 610 (or first area), a second display surface 620 (or second area), and a third display surface 630 (or third area).
  • the housing 650 may include a foldable structure (e.g., a hinge structure). When the electronic device 101 is folded, the first part 601 and the third part 605 face in opposite directions from the second part 603, and the first part 601 and the third part 605 may face the same direction as each other. When the electronic device 101 is unfolded, the first part 601, the second part 603, and the third part 605 may be formed to face the same direction.
  • FIG. 6 may show an example of an electronic device 101 including two folding axes 690 and 695.
  • the two folding axes 690 and 695 may each be employed in a vertical direction to divide the electronic device 101 into thirds.
  • the electronic device 101 may be folded, unfolded, or bent based on the folding axes 690 and 695.
  • the electronic device 101 illustrated in FIG. 6 may represent an example of the S-fold type, showing the electronic device 101 in a partially folded state (or partially unfolded state) viewed from the front, and in a folded state viewed from the back.
  • the electronic device 101 as illustrated in FIG. 6 may have different folding or bending directions based on each folding axis 690 and 695. This is an example, and the electronic device 101 may be folded or bent in the same direction based on each folding axis 690 and 695.
  • the electronic device 101 may be folded so that the first display surface 610 of the display 210 is exposed to the outside, so that the rear surface of the first display surface 610 and the rear surface of the second display surface 620 face each other, and so that the second display surface 620 and the third display surface 630 of the display 210 face each other.
  • the electronic device 101 may be folded asymmetrically with respect to each of the folding axes 690 and 695. Accordingly, even when the electronic device 101 is completely folded with respect to the folding axes 690 and 695, each display surface (or each area) separated by the folding axes 690 and 695 may not completely overlap.
  • the display 210 may be employed on the front and/or rear of the electronic device 101. The display 210 can be activated or deactivated in a similar manner as described in the description with reference to FIGS. 3 and 4 above.
  • the electronic device 101 may detect a change in shape (eg, folding or unfolding) of the display 210 based on various methods.
  • the electronic device 101 may include a state detection sensor based on at least one sensor (eg, the sensor module 176 of FIG. 1).
  • the state detection sensor may include, for example, at least one of a proximity sensor, an illumination sensor, a magnetic sensor, a hall sensor, a gesture sensor, a bending sensor, an infrared sensor, a touch sensor, a pressure sensor, or an infrared camera, or a combination thereof.
  • the state detection sensor may be located on any side of the electronic device 101 (e.g., the folding axis, the end of the housing, the bottom of the display 210 (e.g., under the panel), and/or the bezel of the display 210) to measure the folding (or unfolding) angle of the electronic device 101.
  • the unfolding angle represents the angle formed between the two display surfaces divided by the folding axis of the electronic device 101.
  • the electronic device 101 may measure the unfolding angle to determine whether the electronic device 101 is in a fully folded state, a fully unfolded state, or an unfolded (or folded) state at a certain angle.
  • when the unfolding angle (or folding angle) measured by the electronic device 101 is about 180 degrees or an angle close to it, the display 210 of the electronic device 101 may be judged to be fully unfolded (e.g., unfolded state). According to one embodiment, when the measured unfolding angle is about 0 degrees or an angle close to it, the display 210 of the electronic device 101 may be judged to be completely folded (e.g., folded state).
  • when the unfolding angle measured by the electronic device 101 is greater than or equal to a first specified angle (e.g., an angle that ensures the user's field of view in a partially folded state, e.g., about 90 degrees) and less than a second specified angle (e.g., about 180 degrees), the display 210 of the electronic device 101 may be judged to be partially folded (or partially unfolded) (hereinafter referred to as the 'partially folded state').
  • based on data acquired from at least one sensor of the state detection sensor, if the measured unfolding angle (or folding angle) is within a predetermined angle range (e.g., first specified angle (e.g., approximately 90 degrees) ≤ angle < second specified angle (e.g., approximately 180 degrees)), the electronic device 101 may determine that the display 210 of the electronic device 101 is folded, bent, or unfolded to a predetermined degree.
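  • As a purely illustrative sketch, the angle-based state decision described above can be expressed in code. The thresholds (about 0 degrees folded, about 90 to 180 degrees partially folded, about 180 degrees unfolded) come from this description; the type and function names and the tolerance value are assumptions, not part of the disclosure.

```kotlin
// Illustrative sketch only: thresholds follow the description above; names are hypothetical.
enum class FoldState { FOLDED, PARTIALLY_FOLDED, UNFOLDED, UNKNOWN }

fun classifyFoldState(
    unfoldingAngleDeg: Float,
    firstSpecifiedAngleDeg: Float = 90f,   // angle that ensures the user's field of view
    secondSpecifiedAngleDeg: Float = 180f,
    toleranceDeg: Float = 5f               // "an angle close to it" (assumed value)
): FoldState = when {
    unfoldingAngleDeg <= toleranceDeg -> FoldState.FOLDED
    unfoldingAngleDeg >= secondSpecifiedAngleDeg - toleranceDeg -> FoldState.UNFOLDED
    unfoldingAngleDeg >= firstSpecifiedAngleDeg -> FoldState.PARTIALLY_FOLDED
    else -> FoldState.UNKNOWN
}

fun main() {
    println(classifyFoldState(2f))    // FOLDED
    println(classifyFoldState(120f))  // PARTIALLY_FOLDED
    println(classifyFoldState(179f))  // UNFOLDED
}
```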
  • FIGS. 7A and 7B are diagrams illustrating an example of an electronic device according to an embodiment of the present disclosure.
  • FIGS. 7A and 7B illustrate, as an example, an electronic device 101 including a roll-up display (e.g., a rollable display), that is, a rollable device.
  • depending on the extent to which the user unfolds the display 210, the area where the display 210 is visually exposed to the outside may be relatively narrow (e.g., example <701>) or relatively large (e.g., example <703>).
  • when the display 210 has a relatively small area exposed to the outside as in example <701> (e.g., when unfolded to a set first range), the electronic device 101 may be used in a first form (e.g., a folded state) (or a bar-type form).
  • when the display 210 has a relatively large area exposed to the outside as in example <703> (e.g., when unfolded to a set second range), the electronic device 101 may be used in a second form (e.g., an unfolded state) (or a tablet-type or extended-display-type form).
  • the electronic device 101 may obtain information related to the size of the area where the display 210 is exposed to the outside, based on the degree of curvature (e.g., radius of curvature) at which the display 210 is unfolded.
  • the electronic device 101 may measure the unfolding curvature of the display 210 (or the electronic device 101) based on a state detection sensor.
  • the electronic device 101 may have a threshold curvature determined in advance for measuring the degree of unfolding; accordingly, the electronic device 101 can obtain the size of the area of the display 210 unfolded with a curvature greater than the threshold curvature. Based on the information related to the acquired size, the electronic device 101 can determine whether the electronic device 101 is used in the first form (e.g., folded state) as in example <701> or in the second form (e.g., unfolded state) as in example <703>.
  • the electronic device 101 may obtain information related to the size of the area where the display 210 is exposed to the outside based on a virtual threshold line 790 placed on the display 210. For example, based on the state detection sensor, the electronic device 101 may obtain information about the difference in curvature of two adjacent parts located in opposite directions with respect to the threshold line 790 on the display 210; if the curvature difference is greater than a predetermined value, it may be determined that the display 210 is exposed to the outside by an area exceeding the threshold line 790. Based on the information related to the acquired size, the electronic device 101 can determine whether the electronic device 101 is used in the first form as in example <701> or in the second form as in example <703>.
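  • For illustration only, the threshold-line decision described above might look like the following sketch; the enum, the function names, and the numeric "predetermined value" are assumptions rather than disclosed values.

```kotlin
// Illustrative sketch: deciding between the first form (example <701>) and the second
// form (example <703>) from the curvature difference at the virtual threshold line 790.
enum class RollableForm { FIRST_FORM /* bar type */, SECOND_FORM /* extended display */ }

// "If the curvature difference is greater than a predetermined value, it may be
// determined that the display is exposed beyond the threshold line."
fun exposedBeyondThresholdLine(curvatureDiff: Float, predeterminedValue: Float = 0.1f): Boolean =
    curvatureDiff > predeterminedValue

fun determineForm(curvatureDiffAtLine790: Float): RollableForm =
    if (exposedBeyondThresholdLine(curvatureDiffAtLine790)) RollableForm.SECOND_FORM
    else RollableForm.FIRST_FORM

fun main() {
    println(determineForm(0.35f)) // SECOND_FORM: unfolded past the threshold line
    println(determineForm(0.02f)) // FIRST_FORM: still rolled up
}
```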
  • depending on the extent to which the user unfolds the display 210 (e.g., the first display 740 and the second display 750), the area of the display 210 exposed to the outside may be relatively small or relatively large.
  • the roller unit 730 of the electronic device 101 may have a roughly cylindrical shape, but is not limited thereto.
  • the roller unit 730 may include a roller 720 and a roller housing (not shown) surrounding the roller 720.
  • the roller 720 may be rotatably mounted inside the roller unit 730 using a bearing (not shown).
  • a portion of the second display 750 of the display 210 may be drawn into the roller unit 730 and come into contact with the roller 720.
  • the roller 720 may be coated with an adhesive layer (not shown) with an adhesive function and may be in close contact with the second display 750.
  • the electronic device 101 may at least partially expand or reduce the display screen of the display 210 based on the slide movement of the display 210 (e.g., the second display 750).
  • the electronic device 101 may be operated in a sliding manner so that the width of one side of the first display 740 is partially variable.
  • the electronic device 101 may be operated in a sliding manner so that the width of the other side of the first display 740 is partially variable.
  • the electronic device 101 may be operated so that the first display 740 has a first distance D1 (or first width W1) in a closed state.
  • the electronic device 101 may move the second display 750 to have a second distance D2 (or second width W2), so that the display screen of the display 210 has a third distance D3 (e.g., third width W3) that is greater than the first distance D1 (e.g., increased by the second distance D2).
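  • A minimal sketch of the width relationship just described, where the third width W3 equals the first width W1 plus the slide-out amount W2; the class name and the millimetre values are illustrative assumptions.

```kotlin
// Illustrative sketch: closed, the screen has the first width W1; sliding the second
// display out by W2 yields the third width W3 = W1 + W2.
data class SlidableScreen(val w1: Float, val slideOut: Float) {
    val width: Float get() = w1 + slideOut   // W3 when slideOut == W2, W1 when closed
}

fun main() {
    val closed = SlidableScreen(w1 = 70f, slideOut = 0f)
    val open = SlidableScreen(w1 = 70f, slideOut = 55f)  // W2 = 55 mm (arbitrary example)
    println("closed width: ${closed.width} mm, open width: ${open.width} mm")
}
```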
  • the display 210 may include a first display 740 that is always visible from the outside, and a second display 750 that extends from a portion of the first display 740, is not visible from the outside in the folded state, and can be at least partially retracted into the internal space of the electronic device 101.
  • the first display 740 of the display 210 may be always exposed to the outside, both when the electronic device 101 is in a folded state (e.g., a slide-in state of the second display 750) and when the electronic device 101 is in an unfolded state (e.g., a slide-out state of the second display 750).
  • the second display 750 of the display 210 is disposed so as not to be exposed to the outside in the internal space of the electronic device 101 when the electronic device 101 is folded (or in a retracted state). When the electronic device 101 is unfolded (or taken out), it can be exposed to the outside to extend from a portion of the first display 740. Accordingly, the electronic device 101 may partially expand the display screen of the display 210 according to the opening operation (eg, slide movement) of the second display 750.
  • the display 210 may have a display area corresponding to the first width W1 in a folded state. According to one embodiment, when the display 210 is in a partially unfolded state (e.g., the second display 750 is pulled out), a portion of the display 210 has a display area expanded beyond the first width W1. You can have it.
  • the electronic device 101 may include a foldable (or flexible, or rollable) display 210 that can be folded, bent, rolled, or unfolded.
  • the electronic device 101 may be folded, bent, or unfolded based on one or more folding axes, as in the examples of FIG. 3, 4, 5, or 6, and may be rolled or unfolded based on the roller 720, as in the example of FIG. 7A or 7B.
  • the form that the electronic device 101 can have may vary.
  • a vertically foldable display is described as an example, but the present disclosure is not limited thereto and can also be applied in the form of a horizontally foldable display; the form of the display 210 that the electronic device 101 can have may vary.
  • FIG. 8A is a diagram illustrating an unfolded state of an electronic device according to an embodiment of the present disclosure.
  • FIG. 8A is a diagram showing the front 800 and the back 850 of the electronic device 101 in an unfolded state.
  • the electronic device 101 may include a first housing 810 including a first surface 811 and a third surface 813, and a second housing 820 including a second surface 821 and a fourth surface 823.
  • the first surface 811 of the first housing 810 and the second surface 821 of the second housing 820 represent the front surface 800 of the electronic device 101, and the third surface 813 of the first housing 810 and the fourth surface 823 of the second housing 820 may represent the rear surface 850 of the electronic device 101.
  • the first housing 810 and the second housing 820 are disposed on both sides about a folding axis (e.g., the A axis) and may have an overall symmetrical shape with respect to the folding axis.
  • the first housing 810 may be disposed on the left side of the electronic device 101 with respect to the folding axis, and the second housing 820 may be disposed on the right side of the electronic device 101.
  • the first housing 810 and the second housing 820 may be designed to be folded relative to each other.
  • a hinge structure 860 is formed between the first housing 810 and the second housing 820, so that the front 800 of the electronic device 101 can be folded.
  • the angle or distance formed between the first housing 810 and the second housing 820 may vary depending on whether the electronic device 101 is in an unfolded (or open) state, a folded (or closed) state, or an intermediate state.
  • the unfolded state may mean an open state or a flat (or even) state (e.g., FIG. 8A).
  • the unfolded state is a state in which the first housing 810 and the second housing 820 are arranged side by side, and may mean a state in which the electronic device 101 is fully unfolded.
  • in the unfolded state, the angle between the first housing 810 and the second housing 820 is about 180 degrees, and the first surface 811 of the first housing 810 and the second surface 821 of the second housing 820 may be arranged to face the same direction (e.g., the first direction).
  • FIG. 8A is a diagram illustrating the front 800 and the rear 850 of the electronic device 101 when the electronic device 101 is unfolded.
  • the folded state may mean a closed state or a close state (e.g., FIG. 8B).
  • the folded state is a state in which the first housing 810 and the second housing 820 are disposed to face each other, and may mean a state in which the electronic device 101 is completely folded.
  • in the folded state, the angle between the first housing 810 and the second housing 820 forms a narrow angle (e.g., about 0 degrees to about 5 degrees), and the first surface 811 of the first housing 810 and the second surface 821 of the second housing 820 may face each other.
  • the electronic device 101 implemented in an in-folding method is described, but the same or a similar method may be applied to an electronic device 101 implemented in an out-folding method.
  • the intermediate state is a state in which the first housing 810 and the second housing 820 are arranged at a certain angle, and the electronic device 101 may not be in an unfolded state or a folded state.
  • the intermediate state may mean a state in which the first surface 811 of the first housing 810 and the second surface 821 of the second housing 820 form a certain angle (e.g., about 6 degrees to about 179 degrees).
  • the electronic device 101 may include a first display 830 (e.g., main display) on the first surface 811 and the second surface 821, which form the front surface 800 of the electronic device 101.
  • the first display 830 may be formed entirely on the front surface 800 (eg, in the first direction of the electronic device 101).
  • the first display 830 may include a flexible display in which at least some areas can be transformed into a flat or curved surface.
  • the first display 830 can be folded left and right based on the folding axis (e.g., the A axis).
  • the first display 830 may include a first display area corresponding to the first surface 811 or a second display area corresponding to the second surface 821.
  • the electronic device 101 may place the first camera 814 on the second side 821.
  • although the drawing shows one first camera 814, there may be a plurality of first cameras 814.
  • although the drawing shows the first camera 814 disposed on the second surface 821, the first camera 814 may also be formed on the first surface 811.
  • the electronic device 101 may form a second display 840 (e.g., a cover display or sub-display) (e.g., the display module 160 of FIG. 1 and/or FIG. 2) on a portion of the rear surface 850 of the electronic device 101.
  • the second display 840 may be formed on at least a portion of the third surface 813 of the electronic device 101.
  • the electronic device 101 may include a plurality of cameras (e.g., 815, 817, 819, and 825) on the rear surface 850 of the electronic device 101. For example, the electronic device 101 may arrange the second camera 815, the third camera 817, and the fourth camera 819 on the fourth surface 823, and the fifth camera 825 on the third surface 813.
  • the second camera 815, the third camera 817, the fourth camera 819, and the fifth camera 825 may have the same or different performance (e.g., angle of view, resolution).
  • for example, the second camera 815 may have an angle of view greater than about 125 degrees (e.g., ultra-wide), the third camera 817 may have an angle of view between about 90 degrees and about 125 degrees (e.g., wide), the fourth camera 819 may have an angle of view of approximately 90 degrees with a 2x zoom (e.g., tele), and the fifth camera 825 may have an angle of view of approximately 90 degrees and a normal magnification.
  • the electronic device 101 may further include a sensor area 841 on the fourth surface 823. Similar to the sensor module 176 of FIG. 1, an infrared sensor, a fingerprint sensor, or an illumination sensor may be placed in the sensor area 841.
  • when the electronic device 101 is unfolded (e.g., FIG. 8A), the first display 830 may be on (or activated), and the second display 840 may be off (or deactivated).
  • when the first display 830 is turned on and no user input (e.g., touch, button selection) is detected for a certain period of time (e.g., about 5 seconds, about 10 seconds, about 1 minute), the electronic device 101 can turn off the first display 830.
  • the electronic device 101 may turn on the second display 840 (e.g., when the electronic device 101 is folded).
  • when the second display 840 is turned on, the first display 830 may be turned off. According to one embodiment, even if the second display 840 is turned on, the electronic device 101 may maintain the first display 830 in the on state for a certain period of time, and then turn off the first display 830 when no user input is detected on the first display 830 even after the certain period of time has elapsed.
  • the electronic device 101 may further include a sensor module (eg, sensor module 176 in FIG. 1).
  • the electronic device 101 may include a sensor module 176 in at least one of the first housing 810 or the second housing 820.
  • the sensor module 176 may include at least one of an acceleration sensor, a gyroscope sensor, a geomagnetic sensor, a proximity sensor, an illumination sensor, a gesture sensor, or a hall sensor.
  • An acceleration sensor is a sensor that detects acceleration, and a gyroscope sensor can detect angular velocity, which is the rotational speed of an object.
  • a geomagnetic sensor is a sensor that detects geomagnetism, and like a compass, it can detect geomagnetic directions (e.g., azimuth) such as east, west, south, and north.
  • a proximity sensor detects whether an object is close, and an illuminance sensor can measure the amount of surrounding light (e.g., illuminance) in real time or periodically.
  • Gesture sensors can detect infrared rays.
  • Hall sensors can detect changes in electrical signals based on the proximity or distance of an object with magnetism (or magnetic force). When a Hall sensor is used to detect the folding state of the electronic device 101, the electronic device 101 may further include a magnet corresponding to the Hall sensor.
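  • A hedged sketch of Hall-sensor-based fold detection as just described, assuming (per the text) a magnet in the opposite housing that approaches the sensor as the device closes; the field threshold and all names are illustrative assumptions.

```kotlin
// Illustrative sketch only: a strong field suggests the magnet (in the other housing)
// is near, i.e., the device is likely folded. Threshold is an assumed value.
fun isLikelyFolded(hallFieldMicrotesla: Float, closedThreshold: Float = 300f): Boolean =
    hallFieldMicrotesla >= closedThreshold

fun main() {
    println(isLikelyFolded(450f)) // true: housings close together
    println(isLikelyFolded(20f))  // false: device unfolded
}
```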
  • FIG. 8B is a diagram illustrating a folded state of an electronic device according to an embodiment of the present disclosure.
  • the electronic device 101 may have a hinge structure 860 formed around a folding axis (eg, A-axis), so that the front 800 of the electronic device 101 may be in a folded state.
  • FIG. 8B is a diagram showing the third side 813 of the first housing 810 and the fourth side 823 of the second housing 820 when the electronic device 101 is folded.
  • the first display 830 may be turned off and the second display 840 may be turned on. If a user input is not detected for a certain period of time while the second display 840 is turned on, the electronic device 101 may turn off the second display 840.
  • FIG. 8C is a diagram illustrating a partially folded state or an intermediate state of an electronic device according to an embodiment of the present disclosure.
  • the electronic device 101 has a hinge structure 860 formed around a folding axis (e.g., the A axis), so that the first housing 810 and the second housing 820 may form a certain angle (e.g., the intermediate state 801).
  • in the intermediate state 801, one side of the first housing 810 or the second housing 820 may be placed on the floor; for example, the second housing 820 may be placed on the floor while the first housing 810 is not.
  • FIG. 8C shows a state in which the fourth surface 823 of the second housing 820 is placed on the floor, and the first surface 811 of the first housing 810 and the second surface 821 of the second housing 820 form a certain angle.
  • the first display 830 may be activated and a user interface may be displayed through the first display 830.
  • the user interface may be displayed on the entire screen of the first display 830, or may be divided into two parts (or areas), such as a split screen.
  • for example, an output unit (e.g., an application execution screen) may be displayed through one area (e.g., the first surface 811), and an input unit (e.g., a keypad) may be displayed through the second surface 821 of the second housing 820.
  • the electronic device 101 may have asymmetric front and rear displays (e.g., a first display on the front and a second display on the back (or cover side)).
  • the various embodiments according to the present disclosure are not limited thereto.
  • the following describes an example in which the electronic device 101 according to an embodiment is a foldable device with one folding axis, but embodiments of the present disclosure are not limited thereto.
  • the various embodiments described below merely present specific components to easily explain the technical content of the present disclosure and aid understanding of the present disclosure, and are not intended to limit the scope of the present disclosure. Accordingly, the scope of the present disclosure should be construed as including all changes or modified forms derived based on the technical idea of the present disclosure in addition to the embodiments disclosed herein.
  • FIG. 9 is a diagram schematically showing the configuration of an electronic device according to an embodiment of the present disclosure.
  • the electronic device 101 may include various devices that can be used while held by a user with one or both hands.
  • the electronic device 101 may include a foldable device or a rollable device as described in the description with reference to FIGS. 3 to 7 above.
  • the electronic device 101 may include a display 210, a memory 130, a camera module 180, a sensor module 176, and/or a processor 120. According to one embodiment, the electronic device 101 may not include at least one component (e.g., the camera module 180). According to one embodiment, the electronic device 101 may include one or more other components (e.g., the communication module 190, power management module 188, and/or battery 189 of FIG. 1). For example, the electronic device 101 may include all or at least some of the components of the electronic device 101 as described in the description with reference to FIG. 1.
  • the display 210 may correspond to the display module 160 or the display 210 as described in the description with reference to FIG. 1 and/or FIG. 2. According to one embodiment, the display 210 may visually provide various information to the outside of the electronic device 101 (eg, a user). According to one embodiment, the display 210 may visually provide various information related to the application under the control of the processor 120.
  • the display 210 may include a first display (eg, main display) on the front and a second display (eg, cover display) on the back.
  • the display 210 may include a touch sensor, a pressure sensor capable of measuring the intensity of a touch, and/or a touch panel (e.g., digitizer) that detects a magnetic-field-type stylus pen.
  • the display 210 can detect touch input and/or hovering input (or proximity input) by measuring changes in a signal (e.g., voltage, light amount, resistance, electromagnetic signal, and/or charge amount) at a specific position of the display 210, based on the touch sensor, pressure sensor, and/or touch panel.
  • the display 210 may include a liquid crystal display (LCD), an organic light-emitting diode (OLED) display, or an active-matrix organic light-emitting diode (AMOLED) display. According to one embodiment, the display 210 may include a flexible display.
  • the memory 130 may correspond to the memory 130 described in the description with reference to FIG. 1 .
  • the memory 130 may store various data used by the electronic device 101.
  • data may include, for example, input data or output data for an application (e.g., program 140 of FIG. 1) and instructions associated with the application.
  • the data may include various sensor data (eg, acceleration sensor data, gyro sensor data, or barometric pressure sensor data) acquired from the sensor module 176.
  • the data may include sensor data (eg, touch sensor data) obtained from the touch sensor of the display 210.
  • the data may include various reference data set in the memory 130 for a designated state of the electronic device 101 (e.g., a first designated state (e.g., unfolded state), a second designated state (e.g., partially folded state), and a third designated state (e.g., folded state)).
  • the data may include various schemas (or algorithms) set in memory 130 to identify the user's grip state.
  • the data may include various schemas (or algorithms) set in the memory 130 for measuring the distance between the user's grip point and at least one object (e.g., a control object or function controller) for controlling a function related to the application on the execution screen of the application.
  • the memory 130 may store instructions that cause the processor 120 to operate when executed.
  • an application may be stored as software (eg, program 140 in FIG. 1) on the memory 130 and may be executable by the processor 120.
  • the application may be any of a variety of applications that can provide various services (or functions) in the electronic device 101, for example, a service for optimizing usability according to the grip state (hereinafter referred to as the 'optimization service').
  • the optimization service may represent a service that supports redundantly providing, in the most optimized area, the control objects (or controllers) located in the most inaccessible area while the user is holding the electronic device 101.
  • the optimization service may represent a service that supports providing at least one control object for controlling functions related to an application running on the electronic device 101 in an optimized area according to the state in which the user holds the electronic device 101, thereby minimizing the user's finger movement distance.
  • control object may include one or more control objects for controlling functions related to an application running in the foreground.
  • control object may represent a control object provided on the execution screen currently displayed through the display 210 among various control objects related to the application.
  • the control object with the greatest distance from the user's grip position (e.g., touched point) may be selected as the target of the duplicate object (e.g., target control object) for the optimization service.
  • the grip location may include, for example, a point where the user's finger touches the front display 210.
  • the sensor module 176 may correspond to the sensor module 176 described in the description with reference to FIG. 1 .
  • the sensor module 176 may include a state detection sensor.
  • the state detection sensor may include, for example, at least one of a proximity sensor, an illumination sensor, a magnetic sensor, a hall sensor, a gesture sensor, a bending sensor, an infrared sensor, a touch sensor, a pressure sensor, or an infrared camera, or a combination thereof.
  • the state detection sensor may be located on any side of the electronic device 101 (e.g., the folding axis, the end of the housing, the bottom of the display 210 (e.g., under the panel), and/or the bezel of the display 210) to measure the folding (or unfolding) angle of the electronic device 101.
  • the electronic device 101 may collect sensor data using the state detection sensor of the sensor module 176, and the designated state of the electronic device 101 may be determined based on the measured folding angle.
  • the camera module 180 may include a first camera module disposed on the first surface (e.g., the front first display side or the second display side) of the electronic device 101, and/or a second camera module disposed on the second surface (e.g., a rear housing). According to one embodiment, the camera module 180 may include one or more lenses, an image sensor, and/or an image signal processor (ISP). According to one embodiment, two or more lenses (e.g., wide-angle and telephoto lenses) and image sensors may be disposed on one side of the electronic device 101.
  • the camera module 180 may be used to identify the user's grip state.
  • the camera module 180 (e.g., a camera module disposed in the rear housing) may be activated during the operation of identifying the user's grip state under the control of the processor 120.
  • the camera module 180 may capture a subject and transmit related results (eg, a captured image) to the processor 120.
  • the processor 120 may operate to determine the presence or absence of an object corresponding to the user's finger through object recognition from the related results (eg, captured image) of the camera module 180.
  • the processor 120 may perform an application layer processing function required by the user of the electronic device 101. According to one embodiment, the processor 120 may provide commands and control of functions for various blocks of the electronic device 101. According to one embodiment, the processor 120 may perform operations or data processing related to control and/or communication of each component of the electronic device 101. For example, the processor 120 may include at least some of the components and/or functions of the processor 120 of FIG. 1. The processor 120 may be operatively connected to components of the electronic device 101. The processor 120 may load commands or data received from other components of the electronic device 101 into the memory 130, process the commands or data stored in the memory 130, and store the resulting data.
  • the processor 120 may include processing circuitry and/or executable program elements. According to one embodiment, the processor 120 may control (or process) operations related to providing optimization services based on the processing circuitry and/or executable program elements. According to one embodiment, based on the processing circuitry and/or executable program elements, the processor 120 may control (or process) operations related to providing at least one control object for controlling functions related to an application, in a designated state of the electronic device 101, to an optimized area corresponding to the user's grip state on the electronic device 101.
  • the processor 120 may control the display 210 to display an execution screen of an application in a designated state (eg, a first designated state or a second designated state) of the electronic device 101.
  • the processor 120 may detect a control object in the execution screen.
  • the processor 120 may determine the user's grip state.
  • the processor 120 may determine at least one target control object from the control objects based on the specified state and the grip state.
  • the processor 120 may generate a duplicate control object based on a control object corresponding to the determined target control object.
  • the processor 120 may provide duplicate control objects by floating them in an optimization area corresponding to the grip state.
  • the processor 120 may detect one or more control objects for controlling functions related to an application running in the foreground. According to one embodiment, the processor 120 may detect a control object from an execution screen currently displayed through the display 210.
  • the processor 120 may determine the grip position for the optimization area based on the grip state. According to one embodiment, the processor 120 may determine the target control object based on the distance between the determined grip position and the control object.
  • the processor 120 may determine a control object located at the greatest distance from the grip position in a designated state (e.g., a first designated state or a second designated state) as the target control object. According to one embodiment, the processor 120 may provide a duplicate control object corresponding to the control object determined as the target control object at the grip position.
  • the processor 120 may divide the display 210 into a plurality of virtual partition areas in the background based on designated partition information. According to one embodiment, the processor 120 may determine the target control object based on the control object in the partition area located at the furthest distance from the partition area at the grip position.
  • the processor 120 may calculate the straight-line distance between the grip position and the control object in the first designated state of the electronic device 101 and determine the control object at the farthest distance as the target control object. According to one embodiment, the processor 120 may determine the control object in the farthest hinge area as the target control object based on calculation of the three-dimensional distance between the grip position and the control object in the second designated state of the electronic device 101.
  • operations performed by the processor 120 may be implemented as a recording medium (or computer program product).
  • the recording medium may include a non-transitory computer-readable recording medium on which a program for executing various operations performed by the processor 120 is recorded.
  • Embodiments described in this disclosure may be implemented in a recording medium readable by a computer or similar device using software, hardware, or a combination thereof.
  • the operations described in one embodiment may be implemented using application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, micro-controllers, microprocessors, and/or other electrical units for performing functions.
  • the recording medium may include a computer-readable recording medium on which a program is recorded for executing: an operation of displaying an execution screen of an application in a designated state (e.g., a first designated state or a second designated state) of the electronic device 101; an operation of detecting a control object (or controller) in the execution screen; an operation of determining a user's grip state; an operation of identifying at least one target control object from the control objects based on the designated state and the grip state; and an operation of providing a duplicate control object corresponding to the control object identified as the target control object to the optimization area corresponding to the grip state.
  • the electronic device 101 may include a display (e.g., the display module 160 of FIG. 1, the display 210 of FIG. 2 or FIG. 9), a memory (e.g., the memory 130 of FIG. 1 or FIG. 9), and a processor (e.g., the processor 120 of FIG. 1 or FIG. 9) operatively connected to the display and the memory.
  • the processor 120 may perform an operation of controlling the display to display an execution screen of an application in a designated state of the electronic device 101. According to one embodiment, the processor 120 may perform an operation of detecting at least one control object in the execution screen. According to one embodiment, the processor 120 may perform an operation to determine the user's grip state. According to one embodiment, the processor 120 may perform an operation of identifying a target control object from the at least one control object based on the designated state and the grip state. According to one embodiment, the processor 120 may perform an operation of providing a duplicate control object corresponding to the target control object to an optimization area corresponding to the grip state.
  • the first designated state may include a state in which the electronic device is fully unfolded.
  • the second designated state may include a state in which the electronic device is partially folded.
  • control object may include one or more control objects for controlling functions related to an application running in the foreground.
  • control object may be detected from an execution screen currently displayed through the display.
  • the processor 120 may perform an operation of determining a grip position for the optimized area based on the grip state. According to one embodiment, the processor 120 may perform an operation of determining the target control object based on the distance between the grip position and the control object.
  • the processor 120 may perform an operation of determining a control object located at the greatest distance from the grip position in the designated state as the target control object. According to one embodiment, the processor 120 may perform an operation of creating a duplicate control object based on the control object determined as the target control object. According to one embodiment, the processor 120 may perform an operation of providing the duplicate control object to the grip position.
  • the processor 120 may perform an operation in the background to divide the display into a plurality of virtual partitions based on designated partition information. According to one embodiment, the processor 120 may perform an operation of determining a target control object based on a control object in a partition area located at the greatest distance from the partition area at the grip position.
  • the processor 120 may calculate the straight-line distance between the grip position and the control object in the first designated state of the electronic device 101 and determine the control object at the farthest distance. According to one embodiment, the processor 120 may determine the control object in the farthest hinge area based on calculation of a three-dimensional distance between the grip position and the control object in the second designated state of the electronic device 101.
  • the processor 120 may consider a specified condition along with the distance and perform an operation of determining the target control object based on additional points.
  • the processor 120 may perform an operation of determining the optimized area based on the gripping position or a surrounding area based on the gripping position.
  • the optimized area may be an area corresponding to the gripping position, or may include an area that does not overlap with other objects based on the gripping position.
  • the processor 120 may maintain the target control object in its original position and perform an operation of providing a duplicate control object with the corresponding function by floating it in the optimization area.
  • the processor 120 may perform an operation of providing a floating action button based on the grip position.
  • the floating action button may include a call object for calling a duplicate control object corresponding to a control object designated as the target control object.
  • the processor 120 may perform an operation of providing the floating action button to the grip position. According to one embodiment, the processor 120 may perform an operation of providing the duplicate control object corresponding to the control object to the grip position in response to a user input based on the floating action button.
  • the processor 120 may perform an operation of providing a visual cue to which a visual effect is applied based on a control object determined as the target control object.
  • the visual cue may be maintained while a floating action button or redundant control object is provided.
  • the processor 120 may perform an operation of configuring a duplicate control object having a function corresponding to a control object designated as a target control object. According to one embodiment, the processor 120 may perform an operation of mapping the location where the duplicate control object will be provided with the location of the floating action button.
  • the processor 120 may perform an operation to determine a designated state of the electronic device 101. According to one embodiment, the processor 120 may perform an operation of determining a target control object based on a first specified condition between the grip position and the control object in the first specified state. According to one embodiment, the processor 120 may perform an operation of determining a target control object based on a second designated condition between the grip position and the control object in the second designated state.
  • the control object may include a single control object format and/or a bundled control object format.
  • the processor 120 may perform an operation of creating a duplicate control object in a single control object format corresponding to the control object.
  • the processor 120 may perform an operation of creating a duplicate control object in the bundled control object format corresponding to the control object.
  • the processor 120 may perform an operation of determining whether it is possible to secure space for the duplicate control object in the area corresponding to the grip position, based on designated conditions for the area corresponding to the duplicate control object and the grip position.
  • the processor 120 may perform an operation to determine an optimization area in which the duplicate control object will be provided, based on whether it is possible to secure space for the duplicate control object.
  • the processor 120 may perform an operation to determine the area of the grip position as the optimization area. According to one embodiment, if it is not possible to secure the space, the processor 120 may perform an operation to determine the area around the grip position as the optimization area. According to one embodiment, when determining the optimization area, the processor 120 may perform an operation to correct the optimization area based on the presence or absence of other objects overlapping in the determined optimization area.
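  • A hedged sketch of the optimization-area decision just described: try the grip-position area first, fall back to surrounding areas if space cannot be secured, and skip placements that overlap other objects. Rect and all helper names are hypothetical stand-ins for the device's real geometry types, not the disclosed implementation.

```kotlin
// Illustrative sketch of choosing an optimization area for the duplicate control object.
data class Rect(val x: Float, val y: Float, val w: Float, val h: Float) {
    fun overlaps(o: Rect) = x < o.x + o.w && o.x < x + w && y < o.y + o.h && o.y < y + h
    fun fitsInside(outer: Rect) =
        x >= outer.x && y >= outer.y && x + w <= outer.x + outer.w && y + h <= outer.y + outer.h
}

fun chooseOptimizationArea(
    screen: Rect,
    gripArea: Rect,                  // area at the grip position, tried first
    surroundingAreas: List<Rect>,    // candidate areas around the grip position
    duplicateW: Float,
    duplicateH: Float,
    otherObjects: List<Rect>         // existing objects that must not be overlapped
): Rect? = (listOf(gripArea) + surroundingAreas)
    .map { Rect(it.x, it.y, duplicateW, duplicateH) }  // place the duplicate in the candidate
    .firstOrNull { placed ->
        placed.fitsInside(screen) &&                   // space can be secured
        otherObjects.none { placed.overlaps(it) }      // corrected for overlapping objects
    }
```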
  • the processor 120 may perform an operation of moving, removing, or switching the target control object of the floating action button or duplicate control object based on the user's interaction with the floating action button or duplicate control object.
  • operations performed in the electronic device 101 may be executed by the processor 120 including various processing circuitry and/or executable program elements of the electronic device 101. According to one embodiment, operations performed by the electronic device 101 may be stored in the memory 130 and, when executed, may be executed by instructions that cause the processor 120 to operate.
  • FIG. 10 is a flowchart illustrating a method of operating an electronic device according to an embodiment of the present disclosure.
  • FIG. 11 is a diagram illustrating an example of a designated state of an electronic device according to an embodiment of the present disclosure.
  • FIG. 10 may show an example of supporting an optimization service in which, in the electronic device 101 according to one embodiment, control objects in the most difficult-to-access area while the user is holding the electronic device 101 may be provided in duplicate in the most optimized area.
  • a method of supporting an optimization service may be performed, for example, according to the flowchart shown in FIG. 10.
  • the flowchart shown in FIG. 10 is merely a flowchart according to an embodiment of the optimization service method for the electronic device 101; the order of at least some operations may be changed, the operations may be performed in parallel or as independent operations, or at least some other operations may be performed complementarily to at least some of the operations.
  • operations 1001 to 1009 may be performed by at least one processor 120 of the electronic device 101.
  • an operation method performed by the electronic device 101 may include: an operation 1001 of operating in a designated state; an operation 1003 of detecting a control object in the execution screen of the application; an operation 1005 of determining the user's grip state; an operation 1007 of identifying the target control object based on the designated state and grip state; and an operation 1009 of providing the target control object to the optimization area based on the grip state.
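  • The flow of operations 1001 to 1009 can be summarized in a hedged, high-level sketch; the interface, the type names, and the String-typed state are hypothetical placeholders rather than the disclosed implementation.

```kotlin
// Illustrative sketch of the operation flow 1001 -> 1009 described above.
data class ControlObject(val id: String, val x: Float, val y: Float)
data class Grip(val hand: String, val x: Float, val y: Float)

interface OptimizationService {
    fun designatedState(): String                                   // operation 1001
    fun detectControlObjects(): List<ControlObject>                 // operation 1003
    fun determineGripState(): Grip                                  // operation 1005
    fun identifyTarget(state: String, grip: Grip,
                       objects: List<ControlObject>): ControlObject // operation 1007
    fun floatDuplicate(target: ControlObject, grip: Grip)           // operation 1009
}

fun runOptimizationService(service: OptimizationService) {
    val state = service.designatedState()
    val objects = service.detectControlObjects()
    val grip = service.determineGripState()
    val target = service.identifyTarget(state, grip, objects)
    service.floatDuplicate(target, grip)  // duplicate floated in the optimization area
}
```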
  • the processor 120 of the electronic device 101 may perform an operation in a designated state of the electronic device 101.
  • the designated state may mean various state information related to the device state in which the electronic device 101 is being used by the user.
  • the designated state may be used to encompass a state determined based on various state information regarding whether the electronic device 101 is unfolded, the unfolding (or folding) angle, the operating mode (e.g., flex mode, unfolded mode, folded mode), and/or the executing application.
  • the processor 120 may control the display 210 to display an execution screen of an application in a designated state (eg, a first designated state or a second designated state) of the electronic device 101.
  • the user may use the electronic device 101 in a first designated state (eg, fully unfolded state) or a second designated state (eg, partially folded state). An example of this is shown in Figure 11.
  • the first designated state may represent a state in which the electronic device 101 is fully unfolded, as in example 1101.
  • the first designated state may include a state in which the first display surface 1110 of the display 210 of the electronic device 101 and the second display surface 1120 of the display 210 are exposed to the outside, forming one display 210.
  • the second designated state may represent a state in which the electronic device 101 is partially folded, as in example 1103.
  • the second designated state may include a state in which the first display surface 1110 of the display 210 of the electronic device 101 and the second display surface 1120 of the display 210 form a certain angle (e.g., a state in which the unfolding angle (or folding angle) is within a specified angle range).
  • the second designated state may include a state in which the unfolding angle (or folding angle) between the first display surface 1110 and the second display surface 1120 is greater than or equal to a first specified angle that ensures the user's field of view in the partially folded state (e.g., about 90 degrees) and less than a second specified angle (e.g., about 180 degrees).
  • the processor 120 may perform an operation of detecting a control object on the application execution screen.
  • the processor 120 may detect one or more control objects related to control of functions supportable by the application on the execution screen of the application being displayed through the display 210.
  • a control object may include various objects selectable by the user to control functions related to an application running in the foreground.
  • the processor 120 may detect control objects related to the functions of an application (e.g., browser) on an Internet browser screen (e.g., objects related to go to previous page, go to next page, go to home, refresh, favorites, app icon, login, and/or category selection).
  • the processor 120 may detect control objects related to the functions of an application (e.g., player) on a media (e.g., image or video) playback screen (e.g., objects related to pause/play, previous/next content selection, horizontal/vertical mode switching, sound adjustment, brightness adjustment, and/or playback speed selection).
  • the processor 120 may detect various control objects executable on the electronic device 101 from the execution screen currently displayed through the display 210, such as control objects related to the functions of an application (e.g., gallery) on a gallery screen, control objects related to the functions of an application (e.g., calendar) on a calendar screen, and/or control objects related to the functions of an application (e.g., message or messenger) on a message or messenger screen.
  • the processor 120 may perform an operation to determine the user's grip state. According to one embodiment, the processor 120 may identify a grip location on the display 210 based on a touched point on the display 210.
  • the processor 120 may identify the user's grip state (e.g., left-hand grip, right-hand grip, or two-hand grip, and the area in which the user's hand is located on the display 210) based at least on data (e.g., sensor data and/or captured images) acquired from the sensor module 176 (e.g., a grip sensor in the bezel area), the rear camera module 180, and/or the rear display (e.g., a touch sensor of the cover display), and may identify the grip position based on the grip state. For example, the processor 120 may receive sensor data (e.g., position values) from the sensor module 176 and/or the touch sensor of the cover display, and determine the grip state based on the sensor data.
  • the processor 120 may acquire a captured image (e.g., an image for identifying the presence or absence of a user's finger) from the rear camera module 180, and may determine the grip state by determining the presence or absence of an object based on the captured image. According to one embodiment, the operation of determining a user's grip state will be described with reference to the drawings described later.
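  • A hedged sketch of one possible fusion of the signals mentioned above (bezel grip sensor, cover-display touch sensor, rear-camera image) into a grip-state decision; the decision rule, the half-width heuristic, and all names are assumptions for illustration only.

```kotlin
// Illustrative sketch of a grip-state decision from several input signals.
enum class GripState { LEFT_HAND, RIGHT_HAND, BOTH_HANDS, UNKNOWN }

fun determineGripState(
    bezelLeftTouched: Boolean,        // grip sensor in the bezel area, left edge
    bezelRightTouched: Boolean,       // grip sensor in the bezel area, right edge
    coverTouchX: Float?,              // touch position on the cover display, if any
    fingerInRearCameraImage: Boolean, // object recognition result from the rear camera
    coverWidth: Float = 100f
): GripState = when {
    bezelLeftTouched && bezelRightTouched -> GripState.BOTH_HANDS
    bezelLeftTouched -> GripState.LEFT_HAND
    bezelRightTouched -> GripState.RIGHT_HAND
    // fall back to the cover-display touch position
    coverTouchX != null ->
        if (coverTouchX < coverWidth / 2) GripState.LEFT_HAND else GripState.RIGHT_HAND
    fingerInRearCameraImage -> GripState.UNKNOWN  // a hand is present, side undetermined
    else -> GripState.UNKNOWN
}
```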
  • operations 1003 and 1005 are not limited to the order shown and may be performed in parallel, sequentially, counter-sequentially, or heuristically.
  • the processor 120 may perform an operation of identifying a target control object based on the designated state and grip state. According to one embodiment, the processor 120 may determine at least one target control object among a plurality of control objects based on the distance between the grip position and the control object. For example, the processor 120 may determine a control object located at the greatest distance from the grip position in a designated state (e.g., a first designated state or a second designated state) of the electronic device 101 as the target control object.
  • the processor 120 may divide the display 210 in the background into a plurality of virtual partitions based on specified partition information (e.g., number of screen divisions), and determine the target control object based on the distance between the partition area of the grip position and the partition area in which the control object is located. For example, the processor 120 may determine the target control object based on the control object in the partition area located at the farthest distance from the partition area of the grip position.
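  • For illustration, the background partitioning and farthest-partition selection just described might be sketched as follows; the grid size and the Chebyshev cell distance are assumptions, not disclosed parameters.

```kotlin
// Illustrative sketch: divide the display into an R x C grid in the background, then
// pick the control-object cell farthest from the grip cell.
import kotlin.math.abs
import kotlin.math.max

data class Cell(val row: Int, val col: Int)

fun cellOf(x: Float, y: Float, width: Float, height: Float, rows: Int = 3, cols: Int = 3) =
    Cell((y / height * rows).toInt().coerceIn(0, rows - 1),
         (x / width * cols).toInt().coerceIn(0, cols - 1))

// Chebyshev distance between grid cells, as one possible inter-partition measure.
fun cellDistance(a: Cell, b: Cell) = max(abs(a.row - b.row), abs(a.col - b.col))

// Returns the control-object cell farthest from the grip cell, if any.
fun farthestFromGrip(gripCell: Cell, objectCells: List<Cell>): Cell? =
    objectCells.maxByOrNull { cellDistance(gripCell, it) }
```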
  • the processor 120 may identify the control object at the farthest distance based on calculating the straight-line distance on the screen between the grip position and the control object in the first designated state of the electronic device 101. According to one embodiment, in the second designated state of the electronic device 101, the processor 120 may identify the control object at the farthest distance based on calculating the three-dimensional distance between the grip position and the control object.
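  • The two distance measures just described can be illustrated with a sketch: a straight-line screen distance for the first designated state, and a three-dimensional distance for the second designated state obtained by rotating the half beyond the hinge by the unfolding angle. The geometry (a vertical hinge at x = hingeX) and all names are illustrative assumptions.

```kotlin
// Illustrative sketch of the 2D and 3D distance calculations described above.
import kotlin.math.PI
import kotlin.math.cos
import kotlin.math.sin
import kotlin.math.sqrt

data class P2(val x: Float, val y: Float)

fun straightLineDistance(a: P2, b: P2): Float {
    val dx = b.x - a.x
    val dy = b.y - a.y
    return sqrt(dx * dx + dy * dy)
}

// With an unfolding angle of 180 degrees this reduces to the straight-line distance.
fun foldedDistance(grip: P2, target: P2, hingeX: Float, unfoldingAngleDeg: Float): Float {
    if (target.x <= hingeX) return straightLineDistance(grip, target)
    val r = target.x - hingeX                   // how far the target lies past the hinge
    val a = unfoldingAngleDeg * PI / 180.0
    val x3 = hingeX - (r * cos(a)).toFloat()    // folded half rotated about the hinge
    val z3 = (r * sin(a)).toFloat()             // height of the target above the flat half
    val dx = x3 - grip.x
    val dy = target.y - grip.y
    return sqrt(dx * dx + dy * dy + z3 * z3)
}

fun main() {
    val grip = P2(10f, 150f)
    val target = P2(130f, 20f)
    println(straightLineDistance(grip, target))       // first designated state
    println(foldedDistance(grip, target, 80f, 120f))  // second designated state
}
```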
  • the processor 120 when determining the target control object, the processor 120 considers the distance and specified conditions (e.g., frequency of use, time of use, designated priority, and/or number of bundles of function objects), and determines the target control object. You can also determine the control object. According to one embodiment, the processor 120 may grant additional points to a control object located on the folding axis in the second designated state. According to one embodiment, an operation of determining a target control object is described with reference to the drawings described below.
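• As a rough illustration of the distance-based selection described above (straight-line distance in the first designated state), the following Kotlin sketch picks the control object farthest from the grip position; the ControlObject type and its fields are hypothetical names, not taken from the patent.

```kotlin
import kotlin.math.hypot

// Hypothetical representation of an on-screen control object by its center point.
data class ControlObject(val id: String, val centerX: Float, val centerY: Float)

// Returns the control object at the greatest straight-line (2D) distance from
// the grip position, i.e., a candidate target control object.
fun farthestControlObject(
    gripX: Float,
    gripY: Float,
    objects: List<ControlObject>
): ControlObject? =
    objects.maxByOrNull { hypot(it.centerX - gripX, it.centerY - gripY) }
```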
• According to one embodiment, the processor 120 may perform an operation of providing the target control object to the optimization area based on the grip state. According to one embodiment, the processor 120 may determine the optimization area based on the grip position or a surrounding area of the grip position, and may provide the target control object to the determined optimization area.
• According to one embodiment, the optimization area may be an area corresponding to the grip position, or may be an area that does not overlap with other objects based on the grip position.
• According to one embodiment, the processor 120 may maintain the target control object in its original location and provide a duplicate control object having a corresponding function.
• According to one embodiment, a control object (e.g., a duplicate control object) identical to the target control object may be configured (e.g., copied) and provided by floating the duplicate control object in the optimization area.
• According to one embodiment, when providing the target control object, the processor 120 may directly provide a duplicate control object corresponding to the target control object at the grip position, or may provide it in association with a user input using a floating action button (FAB). According to one embodiment, an operation of providing the target control object is described with reference to the drawings described below.
  • FIG. 12 is a flowchart illustrating a method of operating an electronic device according to an embodiment of the present disclosure.
  • FIG. 13 is a diagram illustrating an example of virtual division of a display area in an electronic device according to an embodiment of the present disclosure.
  • FIG. 14 is a diagram illustrating an example of providing duplicate control objects in an electronic device according to an embodiment of the present disclosure.
• FIG. 12 may show an example of supporting an optimization service in which, in the electronic device 101 according to one embodiment, a control object in the most inaccessible area is provided in duplicate in the most optimized area while the user is holding the electronic device 101.
• According to an embodiment, the operations described in FIG. 12 may be performed, for example, heuristically in combination with the operations described in FIG. 10, or may be performed heuristically as detailed operations of some of the described operations.
  • a method of supporting an optimization service may be performed, for example, according to the flowchart shown in FIG. 12.
• the flowchart shown in FIG. 12 is merely a flowchart according to an embodiment of a method for optimizing usability according to the holding state of the electronic device 101; the order of at least some operations may be changed, the operations may be performed in parallel or as independent operations, or at least some other operations may be performed complementarily to at least some of the operations.
  • operations 1201 to 1213 may be performed by at least one processor 120 of the electronic device 101.
• According to one embodiment, an operation method performed by the electronic device 101 may include operations ranging from an operation 1201 of virtually dividing the display area to an operation 1213 of providing a duplicate control object, as illustrated in FIG. 12.
• In operation 1201, the processor 120 of the electronic device 101 may virtually divide the display area.
• According to one embodiment, the processor 120 may divide the display 210 in the background (or internally) into a plurality of virtual partitions based on specified division information (e.g., the number of screen divisions). An example of this is shown in FIG. 13.
  • reference numeral 1300 in FIG. 13 may represent a virtual dividing line.
  • the virtual dividing line 1300 is shown for convenience of explanation, and may not be substantially displayed on the display 210 of the electronic device 101.
  • the processor 120 may divide the display area in the background into virtual partition areas corresponding to designated partition information.
• According to one embodiment, the virtual dividing line 1300 may be provided as visual information on the display 210 based on the settings of the electronic device 101 so that the user can intuitively perceive the divided areas.
• example <1301> may represent an example in which the area of the display 210 is virtually divided into six areas (e.g., a first divided area, a second divided area, a third divided area, ..., and a sixth divided area).
  • the specified division information may be “3x2 grid”.
  • the processor 120 may divide the entire area of the display 210 into six virtual areas in a 3x2 grid.
• the area of the display 210 of example <1303> may represent an example virtually divided into 18 areas (e.g., a first divided area, a second divided area, a third divided area, ..., and an 18th divided area).
  • the specified segmentation information may be “9x2 grid”.
  • the processor 120 may divide the entire area of the display 210 into 18 virtual areas in a 9x2 grid.
• According to one embodiment, the virtual division area is not limited to the 6 divisions (e.g., 3x2 grid) of example <1301> or the 18 divisions (e.g., 9x2 grid) of example <1303>, and may be set in various ways.
• For example, the division may be set in various ways, such as a 2x3 grid or a 1x9 grid, 9 divisions (e.g., a 3x3 grid), 8 divisions (e.g., a 4x2 grid), 10 divisions (e.g., a 5x2 grid), 12 divisions (e.g., a 4x3 grid or a 3x4 grid), or 27 divisions (e.g., a 9x3 grid or a 3x9 grid).
  • the virtual partition area may have different sizes and shapes depending on the partition method.
  • the virtual partition area may be set based on the position where the electronic device 101 is folded.
  • the virtual divided area may be set as two areas, left/right or top/bottom, in the horizontal or vertical direction of the folding axis according to the horizontal or vertical mode of the electronic device 101.
  • region division of the display 210 may be performed in the background by the processor 120.
  • the divided area of the display 210 may be virtually divided to distinguish the user's grip position and/or the position of the control object.
  • the operation of virtually dividing the area of the display 210 is for the purpose of accurately distinguishing the user's grip position and/or the position of the control object, and the embodiment of the present disclosure is not limited thereto.
  • the processor 120 may operate without dividing the display 210 into regions. For example, the processor 120 may operate by identifying the grip position and directly calculating the distance between the grip position and each control object.
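• The virtual division described above might look like the following sketch; the grid parameters mirror the "3x2 grid" example of FIG. 13, while the function and type names are assumptions for illustration.

```kotlin
// Illustrative sketch: divides the display into cols x rows virtual partitions
// in the background (nothing is drawn) and maps a point to its partition.
data class Partition(
    val index: Int,
    val left: Float, val top: Float,
    val right: Float, val bottom: Float
)

fun divideDisplay(width: Float, height: Float, cols: Int, rows: Int): List<Partition> {
    val cellW = width / cols
    val cellH = height / rows
    return (0 until rows * cols).map { i ->
        val col = i % cols
        val row = i / cols
        // Partition indices start at 1, matching "first divided area", etc.
        Partition(i + 1, col * cellW, row * cellH, (col + 1) * cellW, (row + 1) * cellH)
    }
}

// Maps a touched point (e.g., the grip position) to the partition containing it.
fun partitionOf(x: Float, y: Float, partitions: List<Partition>): Partition? =
    partitions.firstOrNull { x >= it.left && x < it.right && y >= it.top && y < it.bottom }
```

• For example, divideDisplay(1080f, 1920f, 3, 2) would yield the six areas of example <1301>, and partitionOf(...) could then distinguish the partition of the grip position from the partition of each control object.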
  • the processor 120 may perform an operation of providing a floating action button to the corresponding partition area based on the grip state.
• According to one embodiment, a floating action button may represent, for example, a calling object (e.g., an icon and/or text) for calling a duplicate control object corresponding to the target control object. An example of this is shown in FIG. 14.
• In FIG. 14, reference numeral 1400 may represent an example of a floating action button, reference numeral 1405 may represent an optimization area, reference numeral 1410 may represent a first control object (e.g., a bundle (or package) control object), reference numeral 1420 may represent a second control object (e.g., a uniform resource locator (URL) input object), and reference numeral 1430 may represent a third control object (e.g., a screen movement object).
• According to one embodiment, the first control object 1410 may be the object determined as the target control object among a plurality of control objects (e.g., the first control object 1410, the second control object 1420, and the third control object 1430) associated with the application.
  • reference numeral 1440 may indicate an example of a visual cue for a visual effect indicating that the first control object 1410 is designated as the target control object.
  • reference numeral 1450 may indicate an example of a duplicate control object corresponding to the first control object 1410 designated as the target control object.
• According to one embodiment, a floating action button 1400 may be provided (e.g., displayed) in response to the user's grip position (e.g., the point where the user's finger touches the display 210) according to the user's grip state.
  • the floating action button 1400 may be provided in a floating manner in an area corresponding to the grip position on the display 210.
  • the floating action button 1400 may be arranged so as not to overlap other touch areas in the optimization area 1405.
  • the processor 120 may omit the operation based on the floating action button and directly perform the operation of providing a duplicate control object.
  • the processor 120 may not perform operation 1203.
  • the processor 120 may operate to directly provide a duplicate control object 1450 as illustrated by reference numeral 1450 in FIG. 14 without displaying the floating action button 1400.
• According to one embodiment, when providing the target control object 1410, the processor 120 may immediately provide a duplicate control object 1450 corresponding to the target control object at the grip position, or, as illustrated in FIGS. 12 and 14, may provide it in association with a user input using the floating action button 1400.
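• The two provision paths described above (immediate duplication at the grip position, or duplication on demand via the floating action button) might be wired up as in the following sketch, reusing the hypothetical ControlObject type from the earlier sketch; the rendering callbacks are placeholders standing in for actual UI code.

```kotlin
// Illustrative sketch of the two provision paths described in the text.
fun provideTargetControlObject(
    useFloatingActionButton: Boolean,
    gripX: Float,
    gripY: Float,
    target: ControlObject,
    showFloatingActionButton: (Float, Float, () -> Unit) -> Unit, // hypothetical UI hook
    showDuplicate: (Float, Float, ControlObject) -> Unit          // hypothetical UI hook
) {
    if (useFloatingActionButton) {
        // Path 1: show a FAB at the grip position; tapping it calls up the
        // duplicate control object mapped to the target control object.
        showFloatingActionButton(gripX, gripY) {
            showDuplicate(gripX, gripY, target)
        }
    } else {
        // Path 2: provide the duplicate control object immediately.
        showDuplicate(gripX, gripY, target)
    }
}
```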
  • the processor 120 may perform an operation of extracting a control object (e.g., the first control object 1410, the second control object 1420, and the third control object 1430) associated with the application.
  • the control object may include one or more control objects for controlling functions related to an application running in the foreground.
• According to one embodiment, the control objects may be control objects (e.g., the first control object 1410, the second control object 1420, and the third control object 1430) provided on the execution screen currently displayed through the display 210, among various control objects related to the application.
  • the processor 120 may extract a control object associated with an application based on the partition area.
• For example, the processor 120 may extract at least one control object (e.g., the first control object 1410, the second control object 1420, and the third control object 1430) displayed on the execution screen displayed through the display 210 among the control objects related to the application.
• According to one embodiment, the processor 120 may operate to identify, based on the divided areas, the partition located relatively far from the partition area of the grip position, and to extract the control objects within the identified partition.
  • the processor 120 may perform an operation to identify a target control object.
• According to one embodiment, the processor 120 may determine a control object (e.g., the first control object 1410) corresponding to a specified condition among the extracted control objects as the target control object.
• According to one embodiment, the processor 120 may determine the target control object based on the distance between the grip position and each control object (e.g., the first control object 1410, the second control object 1420, and the third control object 1430). For example, the processor 120 may determine the control object located at the farthest distance from the grip position (e.g., the first control object 1410) as the target control object.
• According to one embodiment, the processor 120 may determine the target control object based on the control object in the partition area located at the farthest distance from the partition area of the grip position. According to one embodiment, the processor 120 may identify the control object at the farthest distance in the first designated state of the electronic device 101 by calculating the straight-line distance on the screen between the grip position and each control object. According to one embodiment, the processor 120 may identify the farthest control object in the second designated state of the electronic device 101 based on a three-dimensional (3D) distance calculation of the distance between the grip position and each control object. An example of this is shown in FIG. 14.
• According to one embodiment, the control object of reference numeral 1410 (e.g., the first control object 1410) may be the control object located at the farthest distance from the user's grip position (e.g., corresponding to the position where the floating action button 1400 is provided in FIG. 14).
  • the processor 120 may determine the first control object 1410 as the target control object.
• According to one embodiment, when determining the target control object, the processor 120 may determine the target control object based on additional points, considering specified conditions (e.g., frequency of use, time of use, designated priority, and/or the number of bundled function objects) together with the distance.
  • the processor 120 may grant additional points to a control object located on the folding axis in the second designated state. According to one embodiment, an operation of determining a target control object is described with reference to the drawings described below.
  • the processor 120 may perform an operation to provide a visual cue 1440.
• According to one embodiment, the processor 120 may provide a designated visual cue 1440 to the target control object 1410 so that the user can intuitively recognize the target control object 1410. An example of this is shown in FIG. 14. For example, a visual cue 1440 with a visual effect applied to the first control object 1410 determined as the target control object may be provided.
  • the visual cue 1440 may include visual effects such as highlighting, animation, color change, and/or display of an indicator object (e.g., an arrow image or icon).
  • the visual cue 1440 may be provided as various elements that can focus the user's attention on the target control object.
  • the processor 120 may perform an operation of mapping the floating action button 1400 and the first control object 1410 designated as the target control object.
• According to one embodiment, the processor 120 may configure (e.g., copy or create) a duplicate control object 1450 that is identical (e.g., has a corresponding function) to the target control object, and may map the location where the duplicate control object 1450 will be provided (e.g., the optimization area 1405) to the location of the floating action button 1400.
• According to one embodiment, the processor 120 may determine an optimization area 1405 for providing the duplicate control object 1450.
• According to one embodiment, in the case of an operation that directly provides the duplicate control object 1450 without using the floating action button 1400, the processor 120 may operate to map the location where the duplicate control object 1450 will be provided based on the user's grip position (e.g., the touched point) instead of the floating action button 1400.
• According to one embodiment, the processor 120 may perform an operation of providing the duplicate control object 1450 in the area corresponding to the floating action button 1400 (e.g., the optimization area 1405).
• According to one embodiment, when the processor 120 provides the target control object, the first control object 1410 designated as the target control object may be maintained in its original position while a duplicate control object 1450 having a corresponding function is provided in duplicate on the floating action button 1400 (or the user's grip position).
• For example, the processor 120 may provide a duplicate control object 1450 identical to the first control object 1410 designated as the target control object by floating it in the optimization area 1405. An example of this is shown in FIG. 14.
• According to one embodiment, the target control object (e.g., the first control object 1410) and the visual cue 1440 related thereto may be maintained on the display 210.
  • a duplicate control object 1450 may be displayed at a location (or optimization area 1405) corresponding to the user's grip position.
• According to one embodiment, the duplicate control object 1450 may be arranged so as not to overlap other touch areas in the optimization area 1405.
  • a duplicate control object 1450 may be displayed in place of the floating action button 1400.
  • FIGS. 15A and 15B are diagrams for explaining a control object and an example in which it is provided according to an embodiment of the present disclosure.
  • FIG. 15A may represent an example of a single control object 1510 or a bundle (or package) control object 1520.
  • the control object provided by the application may be provided in the form of a single control object or a bundle of multiple control objects.
  • a single control object 1510 may be provided as an icon and/or text with a designated range of the touch area 1515.
• According to one embodiment, the bundle control object 1520 may be provided as a plurality of icons (e.g., Home, Bookmarks, Tabs, and Tools) and/or text arranged sequentially without space between the plurality of touch areas 1521, 1523, 1525, and 1527.
  • each control object forming the bundled control object 1520 may be a control object supporting different functions.
  • the bundled control object 1520 may be recognized as a single control object.
• FIG. 15B may show an example of mapping a duplicate control object 1500 to a reference position 1400 (e.g., a floating action button or a grip position (or touched point)) in order to provide the duplicate control object 1500.
  • the duplicate control object 1500 may be created and provided based on the center point C1 of the reference position 1400 and the center point C2 of the duplicate control object 1500.
  • the center point C2 of the duplicate control object 1500 may overlap the center point C1 of the reference position 1400.
• According to one embodiment, if there is no space in which the duplicate control object 1500 can be displayed according to the grip position (e.g., the grip position is biased toward a certain edge area), for example, the edge point (E1 or E2) inside the screen opposite the edge area may be mapped to the center point C1 of the reference position 1400.
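• The center-point mapping of FIG. 15B can be sketched as follows: the duplicate object's center C2 is placed on the reference position's center C1, and if the grip position is biased toward an edge, the rectangle is pulled back inside the screen, standing in for the E1/E2 edge-point correction. The names and the clamping strategy are assumptions for illustration.

```kotlin
// Illustrative sketch: positions a duplicate control object of size (w, h) so
// that its center C2 coincides with the reference center C1, then clamps the
// rectangle inside the screen bounds (assumes w <= screenW and h <= screenH).
data class Rect(val left: Float, val top: Float, val right: Float, val bottom: Float)

fun placeDuplicate(
    c1x: Float, c1y: Float,        // center point C1 of the reference position
    w: Float, h: Float,            // size of the duplicate control object
    screenW: Float, screenH: Float
): Rect {
    // Start with C2 == C1.
    var left = c1x - w / 2
    var top = c1y - h / 2
    // Edge correction: pull the rectangle back inside the screen, which
    // approximates mapping an inner edge point (E1 or E2) to C1.
    left = left.coerceIn(0f, screenW - w)
    top = top.coerceIn(0f, screenH - h)
    return Rect(left, top, left + w, top + h)
}
```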
  • FIG. 16 is a flowchart illustrating a method of operating an electronic device according to an embodiment of the present disclosure.
• FIG. 16 may show an example of supporting an optimization service in which, in the electronic device 101 according to one embodiment, a control object in the area most difficult to access while the user is holding the electronic device 101 is provided in duplicate in the most optimized area.
• According to an embodiment, the operations described in FIG. 16 may be performed, for example, heuristically in combination with the operations described in FIG. 10 and/or FIG. 12, or may be performed heuristically as detailed operations of some of the described operations.
  • a method of supporting an optimization service may be performed, for example, according to the flowchart shown in FIG. 16.
• the flowchart shown in FIG. 16 is merely a flowchart according to an embodiment of a usability optimization method according to the holding state of the electronic device 101; the order of at least some operations may be changed, the operations may be performed in parallel or as independent operations, or at least some other operations may be performed complementarily to at least some of the operations.
  • operations 1601 to 1625 may be performed by at least one processor 120 of the electronic device 101.
• According to one embodiment, an operation method performed by the electronic device 101 may include an operation 1601 of detecting a control object, an operation 1603 of determining the user's grip state, an operation 1605 of determining the designated state of the electronic device 101, an operation 1607 of determining a target control object based on a first specified condition when the electronic device 101 operates in the first designated state, an operation 1609 of determining a target control object based on a second specified condition when the electronic device 101 operates in the second designated state, an operation of creating a duplicate control object corresponding to the target control object and determining whether space can be secured in the area corresponding to the grip position (e.g., operations 1611 to 1615), an operation 1617 of determining the area of the grip position as an optimization area (e.g., a first optimization area), an operation 1619 of providing a duplicate control object based on the area of the grip position (e.g., the optimization area), an operation 1621 of determining the surrounding area of the grip position as an optimization area (e.g., a second optimization area), an operation 1623 of providing a duplicate control object based on the surrounding area of the grip position (e.g., the optimization area), and an operation 1625 of performing the corresponding operation based on a user input based on the duplicate control object.
  • the processor 120 of the electronic device 101 may perform an operation to detect a control object.
  • the processor 120 may detect one or more control objects related to control of functions supportable by the application on the execution screen of the application being displayed through the display 210.
• According to one embodiment, the processor 120 may detect various control objects from an execution screen of an application executable on the electronic device 101 and currently displayed through the display 210.
  • the processor 120 may perform an operation to determine the user's grip state.
  • processor 120 may identify a grip location on display 210 based on a touched point on display 210 .
• According to one embodiment, the processor 120 may identify the grip position based on the grip state, based on data acquired from the sensor module 176 (e.g., a grip sensor in the bezel area), the rear camera module 180, and/or the rear display (e.g., a touch sensor of the cover display).
  • an operation of determining a user's grip state will be described with reference to the drawings described later.
• operations 1601 and 1603 are not limited to the order shown and may be performed in parallel, sequentially, in reverse order, or heuristically.
  • the processor 120 may perform an operation to determine a designated state of the electronic device 101.
• According to one embodiment, the user may use the electronic device 101 in the first designated state (e.g., a fully unfolded state) or the second designated state (e.g., a partially folded state), as illustrated in FIG. 11.
  • the processor 120 may measure the unfolding (or folding) angle of the electronic device 101 based on sensor data from a state detection sensor.
• According to one embodiment, the unfolding angle may represent the angle formed, about the folding axis, between the two display surfaces (e.g., the first display surface and the second display surface) of the electronic device 101 divided by the folding axis.
• According to one embodiment, based on the unfolding angle of the electronic device 101, the processor 120 may determine whether the electronic device 101 is in a fully unfolded state (e.g., the first designated state) or a state folded at a certain angle (e.g., the second designated state).
• For example, when the unfolding angle (or folding angle) measured by the state detection sensor is about 180 degrees or an angle close to it, the processor 120 may determine that the display 210 of the electronic device 101 is in the fully unfolded state (e.g., the first designated state).
• According to one embodiment, if the unfolding angle measured by the state detection sensor is greater than or equal to a first specified angle (e.g., an angle at which the user's field of view is ensured in a partially folded state, such as about 90 degrees) and less than a second specified angle (e.g., about 180 degrees), the processor 120 may determine that the electronic device 101 is in the partially folded state (e.g., the second designated state).
• According to one embodiment, based on data acquired from at least one state detection sensor, when the measured unfolding angle (or folding angle) is within a predetermined angle range (e.g., first specified angle (e.g., about 90 degrees) ≤ angle < second specified angle (e.g., about 180 degrees)), the processor 120 may determine that the display 210 of the electronic device 101 is folded or unfolded to a predetermined degree.
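• A minimal sketch of the angle-based state determination above, assuming an unfolding angle already measured by a state detection sensor; the threshold values follow the approximate angles named in the text, and the tolerance for "an angle close to" 180 degrees is an assumption.

```kotlin
enum class DesignatedState { FIRST_FULLY_UNFOLDED, SECOND_PARTIALLY_FOLDED, OTHER }

// Illustrative sketch: ~180 degrees (or close to it) -> first designated state;
// between the first specified angle (~90 degrees) and the second specified
// angle (~180 degrees) -> second designated state.
fun classifyState(
    unfoldingAngleDeg: Float,
    firstSpecifiedAngle: Float = 90f,
    secondSpecifiedAngle: Float = 180f,
    fullyUnfoldedTolerance: Float = 5f // assumed tolerance for "close to 180"
): DesignatedState = when {
    unfoldingAngleDeg >= secondSpecifiedAngle - fullyUnfoldedTolerance ->
        DesignatedState.FIRST_FULLY_UNFOLDED
    unfoldingAngleDeg >= firstSpecifiedAngle ->
        DesignatedState.SECOND_PARTIALLY_FOLDED
    else -> DesignatedState.OTHER
}
```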
• According to one embodiment, when the electronic device 101 operates in the first designated state, the processor 120 may perform an operation of determining a target control object based on the first specified condition (e.g., operation 1607). According to one embodiment, the processor 120 may determine the control object located at the greatest distance from the grip position among the control objects detected in the first designated state as the target control object.
• According to one embodiment, the processor 120 may identify the control object at the farthest distance in the first designated state of the electronic device 101 by calculating the straight-line distance on the screen between the grip position and each control object.
• According to one embodiment, when determining the target control object, the processor 120 may determine the target control object based on additional points, considering specified conditions (e.g., frequency of use, time of use, designated priority, and/or the number of bundled function objects) together with the distance.
• According to one embodiment, the first specified condition and an operation of determining a target control object according to the first specified condition in the first designated state are described with reference to the drawings described below.
• According to one embodiment, when the electronic device 101 operates in the second designated state, the processor 120 may perform an operation of determining a target control object based on the second specified condition (e.g., operation 1609). According to one embodiment, among the control objects detected in the second designated state, the processor 120 may determine, as the target control object, a control object located on the folding axis and/or the control object located at the farthest distance from the grip position based on a three-dimensional (3D) distance calculation.
• According to one embodiment, when determining the target control object, the processor 120 may determine the target control object based on additional points, considering specified conditions (e.g., frequency of use, time of use, designated priority, and/or the number of bundled function objects) together with the distance. According to one embodiment, the processor 120 may grant additional points to a control object located on the folding axis in the second designated state. According to one embodiment, the second specified condition and an operation of determining a target control object according to the second specified condition in the second designated state are described with reference to the drawings described below.
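• One possible way to realize the 3D distance calculation mentioned above (an assumption for illustration, not the patent's stated formula) is to model the screen as two half-planes meeting at the folding axis with the measured unfolding angle between them, lift each 2D point into 3D, and take the Euclidean distance:

```kotlin
import kotlin.math.cos
import kotlin.math.sin
import kotlin.math.sqrt

// Illustrative model: a vertical folding axis at x = foldX; points on the far
// half are rotated out of the plane by (180 - unfolding angle) degrees.
fun to3d(x: Float, y: Float, foldX: Float, unfoldingAngleDeg: Float): Triple<Float, Float, Float> {
    if (x <= foldX) return Triple(x, y, 0f)
    val d = x - foldX
    val tilt = Math.toRadians(180.0 - unfoldingAngleDeg)
    return Triple(foldX + (d * cos(tilt)).toFloat(), y, (d * sin(tilt)).toFloat())
}

// Euclidean distance between two screen points lifted into the folded geometry,
// e.g., between the grip position and a control object's center.
fun distance3d(
    ax: Float, ay: Float, bx: Float, by: Float,
    foldX: Float, unfoldingAngleDeg: Float
): Float {
    val (x1, y1, z1) = to3d(ax, ay, foldX, unfoldingAngleDeg)
    val (x2, y2, z2) = to3d(bx, by, foldX, unfoldingAngleDeg)
    return sqrt((x1 - x2) * (x1 - x2) + (y1 - y2) * (y1 - y2) + (z1 - z2) * (z1 - z2))
}
```

• At 180 degrees this reduces to the flat straight-line distance, and as the fold closes, the far half tilts toward the near half, which shortens the 3D distance accordingly.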
  • the processor 120 may perform an operation to create a duplicate control object corresponding to the target control object.
  • the processor 120 may generate a duplicate control object corresponding to the target control object determined according to the specified condition (e.g., the first specified condition in operation 1607 or the second specified condition in operation 1609).
• According to one embodiment, the control object may include a single control object format (e.g., the control object 1510 in FIG. 15A) and/or a bundled (or packaged) control object format (e.g., the control object 1520 in FIG. 15A).
  • the processor 120 may operate to recognize a bundled control object as one control object.
  • the processor 120 may create a duplicate control object in a single control object format corresponding to the control object.
  • the processor 120 may create a duplicate control object in a bundled control object format corresponding to the control object.
  • the target control object may include one control object or may include a plurality of other control objects.
• According to one embodiment, the processor 120 may display the target control object through a designated visual cue at its original location, while configuring (e.g., copying) a control object (e.g., a duplicate control object) having the function of the corresponding target control object (or identical to the target control object).
• According to one embodiment, the processor 120 may determine the condition of the area corresponding to the grip position based on the duplicate control object. According to one embodiment, the processor 120 may determine specified conditions, such as whether the area corresponding to the size (or area) of the duplicate control object and/or the grip position overlaps (or is overlapped) with other objects (e.g., tabs, images, text, and/or toolbars associated with the application).
• According to one embodiment, the processor 120 may determine whether it is possible to secure space for the duplicate control object in the area (or range) corresponding to the grip position on the display 210. According to one embodiment, the processor 120 may determine whether space can be secured based on whether the area corresponding to the grip position can have a size (or area) capable of displaying the duplicate control object and/or on the presence or absence of other overlapping objects in the corresponding area.
• According to one embodiment, based on whether it is possible to secure space for the duplicate control object (or to provide the duplicate control object) in the area corresponding to the grip position, the processor 120 may determine an optimization area based on the user's touch point on the display 210 (e.g., a first optimization area) or an optimization area based on a surrounding area of the touch point (e.g., a second optimization area).
• According to one embodiment, if it is possible to secure space in the area corresponding to the grip position (e.g., "Yes" in operation 1615), the processor 120 may perform an operation of determining the area of the grip position as the optimization area (e.g., the first optimization area).
  • the processor 120 may provide the corresponding area so that it does not overlap with other objects (e.g., tabs, images, text, and/or toolbars related to the application). For example, the processor 120 may correct the optimization area based on the presence or absence of other objects that overlap (or overlap) in the determined optimization area.
  • the processor 120 may perform an operation of providing a duplicate control object based on an area of the grip position (e.g., an optimization area).
  • the processor 120 may float and provide duplicate control objects with functions corresponding to the target control object in the determined optimization area.
• According to one embodiment, the processor 120 may display the target control object through a designated visual cue at its original location, while providing a control object (e.g., a duplicate control object) having the function of the corresponding target control object (or identical to the target control object) in duplicate in the optimization area.
• In operation 1621, if it is not possible to secure space (e.g., space is not secured) in the area corresponding to the grip position (e.g., the touched point) (e.g., "No" in operation 1615), the processor 120 may perform an operation of determining an area surrounding the grip position as the optimization area (e.g., a second optimization area). According to one embodiment, when determining the optimization area, the processor 120 may provide the corresponding area so that it does not overlap with other objects (e.g., tabs, images, text, and/or toolbars related to the application). For example, the processor 120 may correct the optimization area based on the presence or absence of other overlapping objects in the determined optimization area.
• According to one embodiment, the processor 120 may perform an operation of providing a duplicate control object based on a surrounding area (e.g., the optimization area) of the grip position.
• According to one embodiment, the processor 120 may float and provide a duplicate control object having a function corresponding to the target control object in the determined optimization area (e.g., the second optimization area).
• According to one embodiment, the processor 120 may display the target control object through a designated visual cue at its original location, while providing a control object (e.g., a duplicate control object) having the function of the corresponding target control object (or identical to the target control object) in duplicate in the optimization area.
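• The space-securing decision of operations 1615 to 1623 could look roughly like the following sketch, reusing the Rect type from the earlier placement sketch: try a rectangle at the grip position first (the first optimization area), and fall back to candidate offsets around it (the second optimization area) when the rectangle would overflow the screen or overlap other objects. The candidate offsets are an assumption for illustration.

```kotlin
fun overlaps(a: Rect, b: Rect): Boolean =
    a.left < b.right && b.left < a.right && a.top < b.bottom && b.top < a.bottom

fun fitsOnScreen(r: Rect, screenW: Float, screenH: Float): Boolean =
    r.left >= 0f && r.top >= 0f && r.right <= screenW && r.bottom <= screenH

// Tries the grip area first, then surrounding candidates, returning the first
// area that fits on screen and overlaps no other object (e.g., tabs, toolbars).
fun determineOptimizationArea(
    gripX: Float, gripY: Float,
    w: Float, h: Float,
    screenW: Float, screenH: Float,
    otherObjects: List<Rect>
): Rect? {
    val offsets = listOf(
        0f to 0f,            // first optimization area: the grip position itself
        -w to 0f, w to 0f,   // second optimization area: areas surrounding the grip
        0f to -h, 0f to h
    )
    for ((dx, dy) in offsets) {
        val candidate = Rect(
            gripX - w / 2 + dx, gripY - h / 2 + dy,
            gripX + w / 2 + dx, gripY + h / 2 + dy
        )
        if (fitsOnScreen(candidate, screenW, screenH) &&
            otherObjects.none { overlaps(it, candidate) }
        ) return candidate
    }
    return null // no suitable area found; further correction would be needed
}
```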
• According to one embodiment, the processor 120 may operate to perform a corresponding operation based on a user input on the duplicate control object.
  • the processor 120 may perform various operations based on interaction with the user.
  • the processor 120 may perform an operation to move (e.g., change location), remove, or replace (e.g., create and display another duplicate control object) a duplicate control object according to a user input.
  • the processor 120 may control the operation of the application using functions related to duplicate control objects according to user input.
  • various operations performed based on interaction with a user will be described with reference to the drawings described below.
  • an operation using a floating action button may be performed in parallel, as described in the description with reference to FIG. 12 above.
• For example, a floating action button may be provided in advance, and, based on interaction with the user using the floating action button, operations of moving, removing, or changing the floating action button (e.g., changing the target control object) and/or replacing it (e.g., providing a duplicate control object in place of the floating action button at its position) may be performed in parallel.
  • various operations performed based on user interaction using a floating action button will be described with reference to the drawings described below.
  • FIG. 17 is a diagram illustrating an example of providing a target control object based on a grip position in an electronic device according to an embodiment of the present disclosure.
  • FIG. 17 may illustrate an example of an operation of selecting a target control object for each user's grip state in the portrait mode of the first designated state (e.g., fully unfolded state) of the electronic device 101.
• According to one embodiment, FIG. 17 may represent an example of providing, as the target control object, the control object located in the position that is most difficult for the user to manipulate (e.g., the control object located at the greatest distance from the grip position (or area) in the first designated state, or the control object in the area located at the farthest distance) to the optimization area.
• According to one embodiment, the control object located in an area (e.g., a diagonal area) that is physically farthest from the point (or area) touched by the user's finger may be the target control object for the duplicate control object.
  • the optimization area may represent the most appropriate area within the maximum range that the user's fingers (e.g., thumb) can reach while maintaining the current grip state.
  • a duplicate control object corresponding to the target control object may be placed in the optimization area.
• example <1701> may represent an example in which the user holds the lower left side (e.g., area 4) of the electronic device 101 with the left hand.
• According to one embodiment, the control object in the area farthest from the left-hand grip position (e.g., area 4) may be the control object 1710 located at a diagonal position (e.g., area 3) from the left-hand grip position (e.g., area 4).
• According to one embodiment, the electronic device 101 may select, as the target control object, the control object 1710 located farthest away (e.g., in an area that cannot be touched with the user's finger (e.g., thumb) in a left-hand grip state) based on the distance and/or direction between the user's grip position and the control object 1710. According to one embodiment, the optimized area that can be touched may be area 4, and the farthest area may be area 3.
• According to one embodiment, the electronic device 101 may place the control object 1710 in duplicate in the optimization area (e.g., area 4) corresponding to the user's grip position through a duplicate control object corresponding to the control object 1710.
• example <1703> may represent an example in which the user grips the upper left corner (e.g., area 1) of the electronic device 101 with the left hand.
• According to one embodiment, the control object in the area farthest from the left-hand grip position (e.g., area 1) may be the control object 1720 located at a diagonal position (e.g., area 6) from the left-hand grip position (e.g., area 1).
• According to one embodiment, the electronic device 101 may select, as the target control object, the control object 1720 located farthest away (e.g., in an area that cannot be touched with the user's finger (e.g., thumb) in a left-hand grip state) based on the distance and/or direction between the user's grip position and the control object 1720. According to one embodiment, the optimized area that can be touched may be area 1, and the farthest area may be area 6.
• According to one embodiment, the electronic device 101 may place the control object 1720 in duplicate in the optimization area (e.g., area 1) corresponding to the user's grip position through a duplicate control object corresponding to the control object 1720.
• example <1705> may represent an example in which the user holds the upper right corner (e.g., area 3) of the electronic device 101 with the right hand.
• According to one embodiment, the control object in the area farthest from the right-hand grip position (e.g., area 3) may be the control object 1730 located at a diagonal position (e.g., area 4) from the right-hand grip position (e.g., area 3).
• According to one embodiment, the electronic device 101 may select, as the target control object, the control object 1730 located farthest away (e.g., in an area that cannot be touched with the user's finger (e.g., thumb) in a right-hand grip state) based on the distance and/or direction between the user's grip position and the control object 1730. According to one embodiment, the optimized area that can be touched may be area 3, and the farthest area may be area 4.
• According to one embodiment, the electronic device 101 may place the control object 1730 in duplicate in the optimization area (e.g., area 3) corresponding to the user's grip position through a duplicate control object corresponding to the control object 1730.
• example <1707> may represent an example in which the user holds the lower right corner (e.g., area 6) of the electronic device 101 with the right hand.
• According to one embodiment, the control object in the area farthest from the right-hand grip position (e.g., area 6) may be the control object 1740 located at a diagonal position (e.g., area 1) from the right-hand grip position (e.g., area 6).
• According to one embodiment, the electronic device 101 may select, as the target control object, the control object 1740 located farthest away (e.g., in an area that cannot be touched with the user's finger (e.g., thumb) in a right-hand grip state) based on the distance and/or direction between the user's grip position and the control object 1740. According to one embodiment, the optimized area that can be touched may be area 6, and the farthest area may be area 1.
• According to one embodiment, the electronic device 101 may place the control object 1740 in duplicate in the optimization area (e.g., area 6) corresponding to the user's grip position through a duplicate control object corresponding to the control object 1740.
• example <1709> may represent an example in which the user holds both lower ends (e.g., area 4 and area 6) of the electronic device 101 with both hands.
• According to one embodiment, the control object in the area farthest from the two-hand grip position (e.g., area 4 and area 6) may be the control object 1750 located at a diagonal position (e.g., area 2) from the two-hand grip position (e.g., area 4 and area 6).
• According to one embodiment, the electronic device 101 may select, as the target control object, the control object 1750 located farthest away (e.g., in an area that cannot be touched with the user's fingers (e.g., each thumb) in a two-handed grip state) based on the distance and/or direction between the user's grip position and the control object 1750.
• According to one embodiment, the touchable optimized area may be area 4 and/or area 6, and the farthest area may be area 2.
• According to one embodiment, the electronic device 101 may place the control object 1750 in duplicate in the optimization area (e.g., area 4 and/or area 6) corresponding to the user's grip position through a duplicate control object corresponding to the control object 1750.
• For example, the control object 1750 may be provided in one of the optimization areas (e.g., area 4 or area 6), or control objects 1750 may be provided separately to each optimization area corresponding to the two-handed grip.
• According to one embodiment, a diagonal position (e.g., area 2) that converges based on the position of the two-handed grip may be determined as the farthest area. The disclosure is not limited to this; in the case of a two-handed grip, the diagonal position based on the left-hand grip position (e.g., area 3) and the diagonal position based on the right-hand grip position (e.g., area 1) may each be determined as the farthest areas. In this case, based on each grip position, different duplicate control objects, each corresponding to a control object in the corresponding area (e.g., area 1 and area 3), may be provided in each corresponding optimization area (e.g., area 4 and area 6).
• example <1711> may represent an example in which the user holds both upper ends (e.g., area 1 and area 3) of the electronic device 101 with both hands.
• According to one embodiment, the control object in the area farthest from the two-hand grip position (e.g., area 1 and area 3) may be the control object 1760 located at a diagonal position (e.g., area 5) from the two-hand grip position (e.g., area 1 and area 3).
• According to one embodiment, the electronic device 101 may select, as the target control object, the control object 1760 located farthest away (e.g., in an area that cannot be touched with the user's fingers (e.g., each thumb) in a two-handed grip state) based on the distance and/or direction between the user's grip position and the control object 1760.
• According to one embodiment, the touchable optimized area may be area 1 and/or area 3, and the farthest area may be area 5.
• According to one embodiment, the electronic device 101 may place the control object 1760 in duplicate in the optimization area (e.g., area 1 and/or area 3) corresponding to the user's grip position through a duplicate control object corresponding to the control object 1760.
• For example, the control object 1760 may be provided in one of the optimization areas (e.g., area 1 or area 3), or control objects 1760 may be provided separately to each optimization area corresponding to the two-handed grip.
• According to one embodiment, a diagonal position (e.g., area 5) that converges based on the position of the two-handed grip may be determined as the farthest area. The disclosure is not limited to this; in the case of a two-handed grip, the diagonal position based on the left-hand grip position (e.g., area 6) and the diagonal position based on the right-hand grip position (e.g., area 4) may each be determined as the farthest areas. In this case, based on each grip position, different duplicate control objects, each corresponding to a control object in the corresponding area (e.g., area 4 and area 6), may be provided in each corresponding optimization area (e.g., area 1 and area 3).
  • FIG. 18 is a diagram illustrating an example of providing a target control object based on a grip position in an electronic device according to an embodiment of the present disclosure.
  • FIG. 18 may illustrate an example of an operation of selecting a target control object for each user's grip state in the landscape mode of the electronic device 101.
• According to one embodiment, FIG. 18 may show an example of providing, as the target control object, the control object 1810 located at the farthest distance from the user's grip position (or area) among various control objects displayed on the execution screen (or the control object 1810 of the area located at the farthest distance) to the optimization area.
  • the control object 1810 located in an area (e.g., a diagonal area) that is physically farthest from the point (or area) touched by the user's finger may be the target control object for the duplicate control object.
  • the optimization area may represent the most appropriate area within the maximum range that the user's fingers (e.g., thumb) can reach while maintaining the current grip state.
  • a duplicate control object corresponding to the target control object may be placed in the optimization area.
• FIG. 18 illustrates an example in which the user holds the lower left side (e.g., area 4) of the electronic device 101 with the left hand, but the present disclosure is not limited thereto.
• According to one embodiment, the operation of providing a target control object for each user's grip state in landscape mode may include various operations corresponding to the operation of providing a target control object for each of the user's various grip states in portrait mode, as described with reference to FIG. 17 above.
  • FIGS. 19A and 19B are diagrams illustrating an example of determining a target control object in an electronic device according to an embodiment of the present disclosure.
• FIGS. 19A and 19B may show an example of selecting a target control object based on the distance and additional points between the user's grip position 1900 and the control objects 1910 to 1960 among the plurality of control objects displayed on the execution screen in the first designated state of the electronic device 101.
• FIG. 19A may show an example of determining a target control object based on the distance between the grip position 1900 and the control objects 1910, 1920, and 1930.
• FIG. 19A may represent an example of a case where a plurality of control objects 1910, 1920, and 1930 exist in the same area (e.g., area 3) on the currently displayed execution screen, such as a first control object 1910 (e.g., a bundled control object of object A, object B, and object C), a second control object 1920 (e.g., object D), and a third control object 1930 (e.g., object E).
• According to one embodiment, the electronic device 101 may give priority to a control object that is physically distant from the user's grip position 1900. For example, the electronic device 101 may calculate the distances 1915, 1925, and 1935 to each control object 1910, 1920, and 1930 from the center reference of the optimization area (e.g., the grip position 1900), and may determine the first control object 1910 having the longest distance (e.g., distance 1915) as the target control object.
• According to one embodiment, when determining the target control object, the electronic device 101 may identify the user's usage conditions, such as usage frequency, usage time, assigned priority, and/or the number of bundled function objects, for each control object 1910, 1920, and 1930 in the area (e.g., area 3) farthest from the optimization area of the grip position 1900.
  • the electronic device 101 may use distance bonus points and usage conditions (e.g., usage frequency, usage time, designated priority, and/or number of bundles of function objects) as criteria for selecting a target control object.
  • the electronic device 101 may give priority to control objects with higher usage conditions among the control objects 1910, 1920, and 1930 based on usage conditions.
• According to one embodiment, when the usage conditions (e.g., usage frequency, usage time, assigned priority, and/or number of bundled function objects) of the control objects 1910, 1920, and 1930 are the same, the priority may be set high based on a representative (e.g., maximum, minimum, median, mode, or average) value of each control object 1910, 1920, and 1930.
• For example, if the first distance 1915 between the grip position 1900 and the first control object 1910 is the longest and the first control object 1910 has a high usage condition (e.g., frequency of use), the electronic device 101 may select the first control object 1910 as the target control object and provide the first control object 1910 in duplicate to the optimization area.
  • the score of the first control object 1910 may be a total of 17 points, which is the sum of the usage frequency of 10 (e.g., the usage frequency of object A) + the additional point of 7 for the first distance 1915.
  • the score of the second control object 1920 may be a total of 20 points, which is the sum of the frequency of use of 15 (e.g., the frequency of use of the D object) + the additional point of 5 for the second distance 1925.
• For example, the score of the third control object 1930 may exceed a total of about 20 points, as the sum of a usage frequency exceeding about 20 (e.g., the usage frequency of object E) + the additional points of 0 for the third distance 1935.
• In this case, the electronic device 101 may select the third control object 1930 as the target control object and provide the third control object 1930 in duplicate to the optimization area.
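• The additional-point scheme of FIGS. 19A and 19B amounts to score = distance additional points + usage condition, with the highest score winning. The following sketch reproduces the 17-point vs. 20-point comparison from the FIG. 19A example; the Candidate type and field names are hypothetical.

```kotlin
// Illustrative scoring sketch: score = distance additional points + usage frequency.
data class Candidate(val name: String, val distancePoints: Int, val usageFrequency: Int) {
    val score: Int get() = distancePoints + usageFrequency
}

fun selectTarget(candidates: List<Candidate>): Candidate? =
    candidates.maxByOrNull { it.score }

fun main() {
    // Values from the FIG. 19A example: 10 + 7 = 17 vs. 15 + 5 = 20.
    val candidates = listOf(
        Candidate("first control object (object A bundle)", distancePoints = 7, usageFrequency = 10),
        Candidate("second control object (object D)", distancePoints = 5, usageFrequency = 15)
    )
    println(selectTarget(candidates)?.name) // -> second control object (object D)
}
```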
• FIG. 19B may show an example of determining a target control object based on the distance between the grip position 1900 and the control objects 1940, 1950, 1960, and 1970.
• FIG. 19B may represent an example of a case where a plurality of control objects 1940, 1950, 1960, and 1970 exist in different areas (e.g., area 1, area 2, area 3, and area 6) on the currently displayed execution screen, such as a fourth control object 1940 (e.g., a bundled control object of object A, object B, and object C), a fifth control object 1950 (e.g., object D), a sixth control object 1960 (e.g., object E), and a seventh control object 1970.
• According to one embodiment, the electronic device 101 may give priority to the fifth control object 1950 that is physically distant from the user's grip position 1900. For example, the electronic device 101 may calculate the distances 1945, 1955, 1965, and 1975 to each control object 1940, 1950, 1960, and 1970 from the center reference of the optimization area (e.g., the grip position 1900), and may determine the fifth control object 1950 with the longest distance (e.g., distance 1955) as the target control object.
• According to one embodiment, the electronic device 101 may determine a control object in an area other than the control object in the farthest area (e.g., the fifth control object 1950) as the target control object. For example, the electronic device 101 may use distance additional points and usage conditions (e.g., usage frequency and/or usage time) as criteria for selecting the target control object.
• For example, assume the additional points are 10 for the fourth distance 1945, 15 for the fifth distance 1955, 0 for the sixth distance 1965, and 6 for the seventh distance 1975, and the usage frequencies are 10 for the fourth control object 1940, 1 for the fifth control object 1950, 10 for the sixth control object 1960, and 10 for the seventh control object 1970.
  • the score of the fourth control object 1940 may be a total of 20 points, which is the sum of 10 additional points for the fourth distance 1945 + 10 frequency of use.
  • the score of the fifth control object 1950 may be a total of 16 points, which is the sum of 15 additional points for the fifth distance 1955 + 1 frequency of use.
  • the score of the sixth control object 1960 may be a total of 10 points, which is the sum of the additional point 0 of the sixth distance 1965 + the frequency of use 10.
  • the score of the seventh control object 1970 may be a total of 16 points, which is the sum of 6 additional points for the seventh distance 1975 + 10 frequency of use.
  • the electronic device 101 may select the fourth control object 1940 as the target control object and provide the fourth control object 1940 in duplicate to the optimization area.
• As another example, assume the same additional points (e.g., 10 for the fourth distance 1945, 15 for the fifth distance 1955, 0 for the sixth distance 1965, and 6 for the seventh distance 1975), and usage frequencies of 10 for the fourth control object 1940, 1 for the fifth control object 1950, 30 for the sixth control object 1960, and 10 for the seventh control object 1970.
  • the score of the fourth control object 1940 may be a total of 20 points, which is the sum of 10 additional points for the fourth distance 1945 + 10 frequency of use.
  • the score of the fifth control object 1950 may be a total of 16 points, which is the sum of 15 additional points for the fifth distance 1955 + 1 frequency of use.
  • the score of the sixth control object 1960 may be a total of 30 points, which is the sum of the additional point 0 of the sixth distance 1965 + the frequency of use 30.
  • the score of the seventh control object 1970 may be a total of 16 points, which is the sum of 6 additional points for the seventh distance 1975 + 10 frequency of use.
  • the electronic device 101 may select the sixth control object 1960 as the target control object and provide the sixth control object 1960 in duplicate to the optimization area.
• As another example, assume the same additional points, and usage frequencies of 10 for the fourth control object 1940, 1 for the fifth control object 1950, 10 for the sixth control object 1960, and 14 for the seventh control object 1970.
  • the score of the fourth control object 1940 may be a total of 20 points, which is the sum of 10 additional points for the fourth distance 1945 + 10 frequency of use.
  • the score of the fifth control object 1950 may be a total of 16 points, which is the sum of 15 additional points for the fifth distance 1955 + 1 frequency of use.
• For example, the score of the sixth control object 1960 may be a total of 10 points, which is the sum of the additional points of 0 for the sixth distance 1965 + the frequency of use 10, and the score of the seventh control object 1970 may be a total of 20 points, which is the sum of 6 additional points for the seventh distance 1975 + 14 frequency of use, tying it with the fourth control object 1940.
• According to one embodiment, the priority for each area may be determined based on the user's grip position. For example, as shown in the example of FIG. 19B, based on a lower-left-hand grip, the priority may be defined as area 3 > area 6 > area 2 > area 5 > area 1.
• According to one embodiment, the electronic device 101 may distinguish the user's grip type (e.g., left-hand grip, right-hand grip, or both-hand grip) and the position at which the electronic device 101 is gripped (e.g., the top or bottom of the electronic device 101) to determine the priority for each area. For example, the electronic device 101 may select the fourth control object 1940 in area 3, which has a high per-area priority, as the target control object, and provide the fourth control object 1940 in duplicate to the optimization area.
  • FIGS. 20 and 21 are diagrams illustrating an example of providing control objects in duplicate on an execution screen in an electronic device according to an embodiment of the present disclosure.
• FIG. 20 may show an example of a state in which a message (or messenger) application displays execution screens of different depths (or layers) (e.g., an upper depth and a lower depth) based on screen division.
  • the electronic device 101 may provide control objects in duplicate based on the user's grip position regardless of the depth (or layer) of the execution screen.
  • the target control object 2020 in FIG. 20 may be a control object for operating a lower-depth function (or mapped to lower-depth function control).
  • the duplicate control object 2030 may be located at a higher depth.
• According to one embodiment, the electronic device 101 may operate the function of the depth to which the target control object 2020 corresponding to the duplicate control object 2030 is mapped, regardless of the depth at which the duplicate control object 2030 is located. For example, when a function is executed by the duplicate control object 2030 located at the upper depth, the function of the lower depth to which the target control object 2020 is mapped may be operated.
  • a floating action button 2010 may be provided based on the grip position 2000 within the optimized area according to the grip of the user's lower left hand.
  • the control object 2020 that is the furthest from the grip position 2000 among various control objects on the execution screen may be the target control object.
• According to one embodiment, the electronic device 101 may determine the optimization area based on a user input (e.g., tapping or touching the floating action button 2010) and/or the position of the floating action button 2010 (e.g., the user's grip position 2000), and may float the duplicate control object 2030 corresponding to the control object 2020 in the optimization area and provide it in duplicate.
• According to one embodiment, when the electronic device 101 provides the duplicate control object 2030, the control object 2020 may maintain its original position and may be highlighted through a designated visual cue so that the user can recognize it together with the duplicate control object 2030.
  • FIG. 21 may show an example of a state in which a gallery application displays an execution screen based on one entire screen without splitting the screen.
  • the electronic device 101 may provide control objects in duplicate based on the user's grip position regardless of the depth (or layer) of the execution screen.
  • a floating action button 2110 may be provided based on the grip position 2100 within the optimization area according to the user's lower-left-hand grip.
  • the control object 2120 that is the furthest from the grip position 2100 among various control objects on the execution screen may be the target control object.
  • the electronic device 101 may determine an optimization area based on a user input to the floating action button 2110 (e.g., tapping or touching the floating action button 2110) and/or the location of the floating action button 2110 (e.g., the user's grip position 2100), and may provide the duplicate control object 2130 corresponding to the control object 2120 in duplicate by plotting it in the optimization area.
  • when the electronic device 101 provides the duplicate control object 2130, the control object 2120 may maintain its original position and may be highlighted through a designated visual cue so that the user can recognize the control object 2120 corresponding to the duplicate control object 2130.
  • if other objects exist in the area where the duplicate control objects 2030 and 2130 are to be provided, the positions of the duplicate control objects 2030 and 2130 may be moved (e.g., correction of the optimization area) so that they do not overlap the other objects.
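  • A minimal sketch of this overlap correction, assuming control objects are represented as axis-aligned rectangles (an assumption; the disclosure does not specify a representation):

```python
# Sketch of the "optimization area correction" described above: if the
# duplicate control object would overlap an existing object, nudge it in
# small steps until it no longer overlaps.

def overlaps(a, b):
    # Rectangles as (x, y, w, h); True if a and b intersect.
    ax, ay, aw, ah = a
    bx, by, bw, bh = b
    return ax < bx + bw and bx < ax + aw and ay < by + bh and by < ay + ah

def correct_position(duplicate, others, step=10, max_tries=50):
    # Move the duplicate upward step by step until it clears all other objects.
    x, y, w, h = duplicate
    for _ in range(max_tries):
        if not any(overlaps((x, y, w, h), o) for o in others):
            break
        y -= step
    return (x, y, w, h)

existing = [(100, 500, 80, 40)]
print(correct_position((100, 510, 80, 40), existing))  # nudged above the existing object
```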
  • FIGS. 20 and 21 may show an example in which, when operating in the first designated state, the floating action buttons 2010 and 2110 are provided at the user's grip positions 2000 and 2100, and the duplicate control objects 2030 and 2130 are provided by floating based on a user input to the floating action buttons 2010 and 2110.
  • duplicate control objects 2030 and 2130 may be directly provided based on the user's grip positions 2000 and 2100.
  • FIGS. 22 and 23 are diagrams illustrating an example of providing a target control object based on a grip position in an electronic device according to an embodiment of the present disclosure.
  • FIGS. 22 and 23 may represent examples of operations for selecting a target control object for each grip state of the user in the portrait mode of the second designated state (e.g., partially folded state) of the electronic device 101.
  • FIGS. 22 and 23 may show an example in which the area of the display 210 is divided into a 9x2 grid: nine areas in the vertical direction (e.g., area 1 to area 9), in proportion to the size (e.g., width) of the hinge area (e.g., area 5) of the folding axis, and two areas in the horizontal direction (e.g., area A and area B).
  • FIGS. 22 and 23 may show an example of providing, to the optimization area as the target control object, the control object located at the position that is most difficult for the user to manipulate among the various control objects displayed on the execution screen (e.g., the control object located in the area that is physically most difficult to operate from the point (or area) where the user's finger touches, such as the hinge area farthest from the grip position).
  • the optimization area may represent the most appropriate area within the maximum range that the user's fingers (e.g., thumb) can reach while maintaining the current grip state.
  • a duplicate control object corresponding to the target control object may be placed in the optimization area.
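  • The grid lookup can be made concrete with a short sketch. The following assumes pixel coordinates and a known hinge width, and splits the non-hinge width evenly (both assumptions; the disclosure does not fix the dimensions):

```python
# Sketch of the 9x2 partition described above: nine columns along the
# folding axis, with the middle column (area 5) sized to the hinge width,
# and two rows (A above, B below).

def grid_cell(x, y, width, height, hinge_width):
    # Columns 1..9: column 5 covers the hinge; the remaining width is
    # split evenly into four columns on each side.
    side = (width - hinge_width) / 8          # width of each non-hinge column
    hinge_left = 4 * side
    if x < hinge_left:
        col = int(x // side) + 1              # columns 1-4
    elif x < hinge_left + hinge_width:
        col = 5                               # hinge column
    else:
        col = int((x - hinge_left - hinge_width) // side) + 6  # columns 6-9
    row = "A" if y < height / 2 else "B"
    return f"{col}-{row}"

print(grid_cell(x=50, y=1700, width=1800, height=2000, hinge_width=200))  # "1-B"
```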
  • FIG. 22 may show an example in which the user holds the lower left corner of the electronic device 101 with the left hand while the electronic device 101 is in the second designated state (e.g., a partially folded state, such as an in-folded state at a certain angle (e.g., about 100 degrees) or less that is maintained for N seconds (e.g., about 0.5 seconds) or more).
  • in the example of FIG. 22, the target control object may be the control object 2210 located in the hinge area (e.g., area 5-A) at a diagonal position from the left-hand grip position.
  • the electronic device 101 may select, as the target control object, the control object 2210 located in the farthest hinge area (e.g., area 5-A, an area that cannot be touched by the user's finger (e.g., thumb) in the grip state (e.g., left-hand grip)) based on the distance and/or direction between the user's grip position and the control object 2210.
  • in the example of FIG. 22, the touchable optimization area may be the lower left area (e.g., area 1-B, area 2-B, area 3-B), and the hinge area farthest from the user's operation may be area 5-A.
  • the electronic device 101 may place the control object 2210 in duplicate, through a duplicate control object corresponding to the control object 2210, in the optimization area (e.g., area 1-B, area 2-B, area 3-B) corresponding to the user's grip position.
  • FIG. 23 may show an example in which the user holds the upper left corner of the electronic device 101 with the left hand while the electronic device 101 is in the second designated state (e.g., a partially folded state, such as an in-folded state at a certain angle (e.g., about 100 degrees) or less that is maintained for N seconds (e.g., about 0.5 seconds) or more).
  • in the example of FIG. 23, the target control object may be the control object 2310 located in the hinge area (e.g., area 5-B) at a diagonal position from the left-hand grip position.
  • the electronic device 101 may select, as the target control object, the control object 2310 located in the farthest hinge area (e.g., area 5-B, an area that cannot be touched by the user's finger (e.g., thumb) in the grip state (e.g., left-hand grip)) based on the distance and/or direction between the user's grip position and the control object 2310.
  • in the example of FIG. 23, the touchable optimization area may be the upper left area (e.g., area 1-A, area 2-A, area 3-A), and the hinge area farthest from the user's operation may be area 5-B.
  • the electronic device 101 may place the control object 2310 in duplicate, through a duplicate control object corresponding to the control object 2310, in the optimization area (e.g., area 1-A, area 2-A, area 3-A) corresponding to the user's grip position.
  • when the user's grip position is at the bottom (e.g., the lower left area, the lower right area, or a both-hands bottom grip), the hinge area farthest from the bottom area may have priority, such as area 5-A > area 5-B.
  • when the user's grip position is at the top (e.g., the upper left area, the upper right area, or a both-hands top grip), the hinge area farthest from the top area may have priority, such as area 5-B > area 5-A.
  • the electronic device 101 may select the control object in the furthest hinge area as the target control object based on the user's grip position and provide the control object in duplicate to the optimization area.
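  • A small sketch of this priority rule, including the fallback to the next hinge cell when the preferred cell is empty (the string encoding of areas is an assumption for illustration):

```python
# Sketch of hinge-area priority by grip position: a bottom grip prefers the
# top hinge cell 5-A, a top grip prefers 5-B; fall back if the cell is empty.

def hinge_priority(grip_vertical):
    # grip_vertical: "bottom" (lower-left/lower-right/both-hands bottom)
    #                or "top" (upper-left/upper-right/both-hands top)
    return ["5-A", "5-B"] if grip_vertical == "bottom" else ["5-B", "5-A"]

def pick_target(objects_by_area, grip_vertical):
    # Return the first control object found in the highest-priority hinge cell.
    for area in hinge_priority(grip_vertical):
        if objects_by_area.get(area):
            return objects_by_area[area][0]
    return None

objs = {"5-B": ["brightness_slider"]}
print(pick_target(objs, "bottom"))  # 5-A is empty, so falls back to 5-B
```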
  • when the electronic device 101 operates in the second designated state, it may first provide a floating action button and then provide the target control object, as in the first designated state, or it may operate to immediately provide the target control object in the farthest hinge area without providing a floating action button.
  • as in the operation in the first designated state, the operation in the second designated state may include providing a visual cue based on the control object corresponding to the target control object, and moving or deleting the floating action button or the duplicate control object according to user interaction.
  • FIGS. 22 and 23 illustrate an example in which the user holds the lower left or upper left corner of the electronic device 101 with the left hand, but the present disclosure is not limited thereto.
  • the operation of providing a target control object for each grip state of the user in the second designated state may include various operations corresponding to the user's various grip states (e.g., lower right-hand grip, upper right-hand grip, both-hands lower grip, and both-hands upper grip).
  • FIG. 24 is a diagram illustrating an example of determining a target control object in an electronic device according to an embodiment of the present disclosure.
  • FIG. 24 may show an example of selecting a target control object based on a 3-dimensional distance calculation between the user's grip position 2400 and the control objects 2410 and 2420 among the plurality of control objects displayed on the execution screen in the second designated state of the electronic device 101.
  • FIG. 24 may show an example of determining a target control object based on the distance between the grip position 2400 and the control objects 2410 and 2420 in a state in which the user grips the lower left corner of the electronic device 101 with the left hand in the second designated state of the electronic device 101.
  • in the example of FIG. 24, a plurality of control objects 2410 and 2420 may exist in different areas on the currently displayed execution screen, such as a first control object 2410 (e.g., object A in the hinge area) and a second control object 2420 (e.g., object B in a flat area).
  • the distance between the user's grip position 2400 and the first control object 2410 may be equal to the distance 2415 on the screen.
  • on the screen, the distance between the user's grip position 2400 and the second control object 2420 is the distance corresponding to the first distance (e.g., B1) plus the second distance (e.g., B2) (e.g., B1+B2), but in the second designated state, the actual distance between the grip position 2400 and the second control object 2420 may be the straight-line distance in space 2425.
  • when a control object exists on the display surface (e.g., the second display surface) opposite to the display surface (e.g., the first display surface) corresponding to the grip position 2400, the electronic device 101 may calculate the distance to the corresponding control object as a straight-line distance in space rather than a distance on the screen. According to one embodiment, the straight-line distance in space may use various 3D distance calculation methods.
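  • One plausible realization (an assumption; the disclosure only says various 3D calculation methods may be used) measures each point's offset from the folding axis and applies the law of cosines with the fold angle read from a hinge sensor:

```python
import math

# Sketch of a 3D straight-line distance across the fold (assumed geometry):
# x1, x2 are distances from the folding axis along each display half,
# y1, y2 are positions along the folding axis, and theta_deg is the
# dihedral fold angle (180 = fully unfolded).

def distance_across_fold(x1, y1, x2, y2, theta_deg):
    theta = math.radians(theta_deg)
    # Law of cosines for the component perpendicular to the hinge,
    # plus the offset along the hinge.
    perp_sq = x1 * x1 + x2 * x2 - 2 * x1 * x2 * math.cos(theta)
    return math.sqrt(perp_sq + (y1 - y2) ** 2)

# Fully unfolded: reduces to the on-screen distance B1 + B2.
print(distance_across_fold(60, 0, 40, 0, 180))  # 100.0
# Partially folded: the spatial chord is shorter than B1 + B2.
print(distance_across_fold(60, 0, 40, 0, 100))  # ~77.7
```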
  • the electronic device 101 may give priority to a control object that is far from the user's grip position 2400 in the second designated state. For example, the electronic device 101 may calculate the distances 2415 and 2425 to each control object 2410 and 2420 from the center reference of the optimization area (e.g., grip position 2400), select the first control object 2410 having the longest distance (e.g., distance 2415) as the target control object, and provide the first control object 2410 in duplicate in the optimization area.
  • the electronic device 101 may use distance bonus points and usage conditions (e.g., usage frequency and/or usage time) in parallel as criteria for selecting the target control object.
  • the electronic device 101 may give priority to control objects with higher usage conditions among the control objects 2410 and 2420 based on usage conditions.
  • the electronic device 101 may also set the priority higher based on a representative value (e.g., maximum, minimum, median, mode, or average) of the usage conditions of each control object 2410 and 2420.
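  • As an illustration, a representative value such as the median can break ties between equally distant objects (the per-day usage counts below are invented):

```python
import statistics

# Sketch of tie-breaking by a representative usage value: each control
# object carries per-day usage counts, and a representative value (here
# the median) decides priority.

usage = {
    "first_control_object_2410": [3, 5, 4, 6, 5],
    "second_control_object_2420": [1, 0, 2, 1, 1],
}

def prioritize(objects):
    # Higher representative usage -> higher priority.
    return sorted(objects, key=lambda name: statistics.median(usage[name]), reverse=True)

print(prioritize(list(usage)))  # first_control_object_2410 first (median 5 vs 1)
```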
  • FIGS. 25 and 26 are diagrams illustrating an example of providing control objects in duplicate on an execution screen in an electronic device according to an embodiment of the present disclosure.
  • FIGS. 25 and 26 may represent an example of a state in which the electronic device 101 displays an execution screen including at least one control object based on one entire screen, without splitting the screen, in the second designated state.
  • FIG. 25 may represent an example of a state in which a player application displays an execution screen (eg, a video play screen) including a plurality of control objects in different areas.
  • the electronic device 101 may provide a duplicate control object 2520 in the optimization area based on the user's lower-left-hand grip and the grip position 2500.
  • the control object 2510 that is furthest from the grip position 2500 among various control objects on the execution screen may be the target control object.
  • in the example of FIG. 25, the area (or hinge area) farthest from the grip position 2500 is area 5-A, but a control object may not exist in area 5-A.
  • the electronic device 101 may select, as the target control object, the control object 2510 in the farthest hinge area in which a control object exists (e.g., area 5-B) based on the distance between the various control objects on the execution screen and the grip position 2500.
  • the electronic device 101 may determine an optimization area based on the user's grip position 2500, and may provide the duplicate control object 2520 corresponding to the control object 2510 by plotting it in the optimization area.
  • when the duplicate control object 2520 is provided, the control object 2510 may maintain its original position and may be highlighted through a designated visual cue so that the user can recognize the control object 2510 corresponding to the duplicate control object 2520.
  • FIG. 26 may show an example of a state in which a quick panel runs in the foreground and an execution screen (e.g., a quick panel screen) including a plurality of control objects in different areas is displayed in the quick panel.
  • the electronic device 101 may provide a duplicate control object 2620 in the optimization area based on the user's lower-left-hand grip and the grip position 2600.
  • the control object 2610 that is furthest from the grip position 2600 among various control objects on the execution screen may be the target control object.
  • the area (or hinge area) furthest from the grip position 2600 may be area 5-A.
  • the electronic device 101 may select, as the target control object, the control object 2610 in the farthest hinge area (e.g., area 5-A) based on the distance between the various control objects on the execution screen and the grip position 2600.
  • the electronic device 101 may determine an optimization area based on the user's grip position 2600, and may provide the duplicate control object 2620 corresponding to the control object 2610 by plotting it in the optimization area.
  • when the duplicate control object 2620 is provided, the control object 2610 may maintain its original position and may be highlighted through a designated visual cue so that the user can recognize the control object 2610 corresponding to the duplicate control object 2620.
  • for example, as illustrated in FIG. 26, a second application (e.g., the quick panel) may switch to the foreground and be operated.
  • the duplicate control object 2520 illustrated in FIG. 25 may be removed, and the duplicate control object 2620 illustrated in FIG. 26 may be provided by floating in an optimization area corresponding to the user's grip position.
  • if other objects exist in the area where the duplicate control objects 2520 and 2620 are to be provided, the positions of the duplicate control objects 2520 and 2620 may be moved (e.g., correction of the optimization area) so that they do not overlap the other objects.
  • FIGS. 25 and 26 may show an example of providing duplicate control objects 2520 and 2620 directly to the user's grip positions 2500 and 2600 when operating in the second designated state.
  • in other embodiments, a floating action button may be provided at the user's grip position 2500 or 2600, and the duplicate control objects 2520 and 2620 may also be provided in duplicate by floating based on a user input to the floating action button.
  • FIGS. 27A and 27B are diagrams illustrating an example of selecting a target control object based on interaction with a user in an electronic device according to an embodiment of the present disclosure.
  • FIGS. 27A and 27B may show an example of selecting a user-specified target control object through switching between a plurality of control objects 2710, 2720, 2730, 2740, and 2750, based on a user input using the floating action button 2700, when the plurality of control objects 2710, 2720, 2730, 2740, and 2750 exist on the execution screen.
  • for example, when a third control object 2730 (e.g., object E) is designated as the target control object, a visual cue may be displayed based on the third control object 2730.
  • the electronic device 101 may switch to another control object in the same area or another area according to a user input and designate it as the target control object.
  • the electronic device 101 may provide a visual cue through a corresponding control object based on switching between control objects.
  • the user may perform an input to move (e.g., drag) the floating action button 2700 selectively or sequentially in a first direction (direction 1) (e.g., downward), a second direction (direction 2) (e.g., diagonally toward the upper left), or a third direction (direction 3) (e.g., rightward).
  • in response to a user input indicating a specified direction (e.g., dragging in the corresponding direction), the electronic device 101 may selectively or sequentially switch from the third control object 2730 to the fifth control object 2750, the first control object 2710, or the second control object 2720.
  • the electronic device 101 may move to an object corresponding to the direction according to the user input, with the current target control object (eg, the third control object 2730) as the center.
  • for example, the second control object 2720 may be located on the left side of the third control object 2730, and the fourth control object 2740 (e.g., object F) and the fifth control object 2750 (e.g., object G) may be located in the other designated directions.
  • when the electronic device 101 receives a first user input (e.g., a swipe input in a specified direction), the electronic device 101 may select the target control object by moving one object at a time in the specified direction centered on the current target control object (e.g., the third control object 2730).
  • when the electronic device 101 receives a second user input different from the first user input (e.g., a long press (e.g., drag and hold) after dragging in a specified direction), movement between objects may continue in the designated direction centered on the current target control object (e.g., the third control object 2730) until the second user input is released.
  • a bundled control object, such as the second control object 2720, may be selected as one object (e.g., a group or bundle).
  • when moving to select a target control object based on a user input, the electronic device 101 may skip control objects with low usage conditions (e.g., frequency of use) and move (e.g., jump) directly to the next object.
  • the electronic device 101 may switch the visual cue based on the control object selected according to the transition between control objects to emphasize the corresponding control object.
  • the user may perform an input that continuously or repeatedly moves (e.g., drags) the floating action button 2700 in the same fourth direction (D1, D2, D3) (e.g., rightward).
  • in response to a user input indicating a specified direction (e.g., dragging in the corresponding direction), the electronic device 101 may sequentially switch from the third control object 2730 to the fourth control object 2740, the fifth control object 2750, and the first control object 2710.
  • the electronic device 101 may switch the visual cue based on the control object selected according to the transition between control objects to emphasize the corresponding control object.
  • the electronic device 101 may enter a transition mode for the control object based on a specified user input (or gesture) (e.g., long press or tap & hold), and may operate to support transitions between control objects in response to user input in the transition mode. For example, the electronic device 101 may allow the floating action button to operate like a jog dial.
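  • A minimal sketch of jog-dial-style switching, assuming a fixed ordering of the on-screen control objects (an illustrative assumption; the disclosure does not define the ordering):

```python
# Sketch of direction-based switching: the current target moves to its
# neighbor in the dragged direction, wrapping through a fixed ordering
# like a jog dial.

ring = ["C", "D", "E", "F", "G"]  # e.g., objects 2710..2750 in a fixed order

def switch(current, direction):
    # direction: +1 for one drag step forward, -1 for backward.
    i = ring.index(current)
    return ring[(i + direction) % len(ring)]

target = "E"                      # e.g., third control object 2730
for _ in range(3):                # three repeated drags in the same direction
    target = switch(target, +1)
print(target)                     # E -> F -> G -> C (wraps around)
```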
  • FIG. 28 is a diagram illustrating an example of selecting a target control object based on user interaction in an electronic device according to an embodiment of the present disclosure.
  • FIG. 28 may represent an example of selecting a user-specified target control object by adjusting the number of control objects in a bundle, based on a user input using the floating action button 2800 for the bundled control object 2850.
  • example <2801> may represent an example in which a bundled control object 2850 (e.g., object A 2810, object B 2820, and object C 2830) is selected as the target control object.
  • the electronic device 101 may adjust the number of bundled control objects 2850 according to user input and designate them as the target control object.
  • while the bundled control object 2850 is selected, the user may switch to a mode for controlling the number of control objects based on a designated user input (or gesture) using the floating action button 2800 (e.g., a gesture of rotating after touching the floating action button 2800).
  • the electronic device 101 may enter the number control mode for the control object and, in response to entering the number control mode, may provide a designated visual cue 2870 to the bundled control object 2850 as in example <2803> (e.g., by changing the visual cue to a blinking line or a dotted line around the group).
  • the user may perform an input to move (e.g., drag) the floating action button 2800 in a first direction (direction 1) (e.g., downward).
  • the electronic device 101 may reduce the number of bundled control objects in response to a user input indicating a specified direction (e.g., dragging in the corresponding direction), as shown in example <2805>.
  • for example, in the number control mode, according to the input of moving in the designated first direction (direction 1) using the floating action button 2800, the number of bundled (or selected) control objects in the bundled control object 2850 may sequentially decrease.
  • the A object 2810 is excluded from the visual cue 2870, and the B object 2820 and C object 2830 may be provided as bundled control objects.
  • subsequently, the A object 2810 and the B object 2820 may be excluded from the visual cue 2870, and the C object 2830 may be provided as the bundled control object.
  • the user may perform an input to move (e.g., drag) the floating action button 2800 in a second direction (2 direction) (e.g., upward direction).
  • the electronic device 101 may increase (or add to) the number of bundled control objects in response to a user input indicating a specified direction (e.g., dragging in the corresponding direction), as in example <2807>.
  • for example, in the number control mode, according to the input of moving in the designated second direction (direction 2) using the floating action button 2800, the number of bundled (or selected) control objects in the bundled control object 2850 may be sequentially increased (or added).
  • the A object 2810 may be added to the B object 2820 and the C object 2830 in the visual cue 2870 and provided as part of the bundled control object.
  • the electronic device 101 may enter the number adjustment mode for control objects based on a specified user input, and may operate to support adjusting the selected number of control objects in response to user input in the number adjustment mode. For example, the electronic device 101 may support adjusting the number of individual control objects included in a bundled control object for a bundled control object in which the control object is selected as a bundle.
  • when the bundled control object is changed, the electronic device 101 may record (or save the settings of) the changed bundled control object. For example, when selecting a target control object in a subsequent operation, the electronic device 101 may select the changed bundled control object (e.g., object B 2820 and object C 2830) as the target control object instead of the existing bundled control object (e.g., object A 2810, object B 2820, and object C 2830), and may provide in duplicate, by plotting, duplicate control objects corresponding to the changed bundled control object (e.g., object B 2820 and object C 2830).
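  • A compact sketch of this bundle-count adjustment (class and method names are invented for illustration; the drop order, A first, follows the example above):

```python
# Sketch of the bundle-count adjustment: dragging down removes the
# outermost object from the bundle, dragging up adds it back, and the
# result can be saved for later target selections.

class Bundle:
    def __init__(self, members):
        self.all_members = list(members)   # e.g., ["A", "B", "C"]
        self.count = len(members)          # how many are currently selected

    def drag(self, direction):
        # direction: "down" shrinks the selection, "up" grows it.
        if direction == "down" and self.count > 1:
            self.count -= 1
        elif direction == "up" and self.count < len(self.all_members):
            self.count += 1

    def selected(self):
        # Keep the last `count` members, dropping from the front (A first).
        return self.all_members[len(self.all_members) - self.count:]

bundle = Bundle(["A", "B", "C"])
bundle.drag("down")
print(bundle.selected())  # ["B", "C"]: object A excluded, as in example <2805>
```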
  • FIG. 29 is a diagram illustrating an example of selecting a target control object based on user interaction in an electronic device according to an embodiment of the present disclosure.
  • FIG. 29 may show an example of selecting a user-specified target control object through switching between a plurality of control objects (e.g., a first control object 2910, a second control object 2920, and a third control object 2930) existing on the execution screen, based on a user input using the floating action button 2900.
  • FIG. 29 may also show an example in which the floating action button 2900 is provided by changing to a representative object corresponding to the changed target control object.
  • in FIG. 29, reference numeral 2900 may represent an example of a floating action button, reference numeral 2905 an optimization area, reference numeral 2910 a first control object (e.g., a bundle (or package) control object), reference numeral 2920 a second control object (e.g., a URL input object), and reference numeral 2930 a third control object (e.g., a screen movement object).
  • initially, the first control object 2910 may be designated as the target control object among the plurality of control objects (e.g., the first control object 2910, the second control object 2920, and the third control object 2930) associated with the application.
  • reference numeral 2940 may indicate an example of a visual cue (eg, first visual cue) for a visual effect indicating that the first control object 2910 is designated as the target control object.
  • reference numeral 2950 may indicate an example of a visual cue (e.g., a second visual cue) for a visual effect indicating that the third control object 2930 is designated as the target control object, based on the target control object changing from the first control object 2910 to the third control object 2930.
  • reference numeral 2960 may indicate an example of a duplicate control object corresponding to the third control object 2930 designated as the target control object.
  • FIG. 29 may represent an example in which the target control object is changed from the first control object 2910 to the third control object 2930 according to a user input (e.g., corresponding to the user input for changing the target control object in FIGS. 27A to 28).
  • a floating action button 2900 (e.g., a first floating action button 2900A) may be provided (e.g., displayed) in response to the user's grip position (e.g., the point where the user's finger touches the display 210) according to the user's grip state.
  • the floating action button 2900 may be provided in a floating manner in an area corresponding to the grip position on the display 210.
  • the floating action button 2900 may be arranged so as not to overlap other touch areas in the optimization area 2905.
  • a first visual cue 2940 to which a visual effect is applied may be provided for the first control object 2910 determined as the target control object among the control objects displayed on the display 210 (e.g., the first control object 2910, the second control object 2920, and the third control object 2930).
  • the first visual cue 2940 may include visual effects such as highlighting, animation, color change, and/or display of an indicating object (eg, an arrow image or icon).
  • the user may switch the target control object from the first control object 2910 to the third control object 2930 through a sequential input of moving (e.g., dragging) the floating action button 2900 (e.g., the first floating action button 2900A) in one direction (e.g., the left direction in FIG. 29).
  • the electronic device 101 may selectively or sequentially switch the target control object from the first control object 2910 to the second control object 2920 or the third control object 2930 in response to the user input.
  • according to the switch of the target control object, the electronic device 101 may provide a second visual cue 2950 that applies a visual effect to the control object (e.g., the third control object 2930) designated as the switched target control object.
  • the electronic device 101 may remove the first visual cue 2940 for the first control object 2910 and provide the second visual cue 2950 for the third control object 2930.
  • a user may maintain user input on floating action button 2900 while switching target control objects.
  • the floating action button 2900 may be provided as a representative object (or representative image) (e.g., an icon) indicating the control object (e.g., the first control object 2910 or the third control object 2930 in FIG. 29) corresponding to the currently designated target control object.
  • for example, when the first control object 2910 is the target control object, the floating action button 2900 may be provided as the first floating action button 2900A based on a first representative object (e.g., a home icon) indicating the first control object 2910.
  • when the third control object 2930 is the target control object, the floating action button 2900 may be provided as the second floating action button 2900B based on a second representative object (e.g., a screen movement icon) indicating the third control object 2930.
  • a representative object for the floating action button 2900 may be determined according to specified conditions. For example, in the case of a bundled control object, such as the first control object 2910 or the third control object 2930, a representative image corresponding to at least one control object selected based on specified conditions (e.g., frequency of use, time of use, and/or assigned priority) (e.g., the leftmost control object among the bundled control objects) may be provided.
  • a plurality of floating action buttons 2900 may be provided.
  • one or more floating action buttons 2900 may be provided based on the settings of the electronic device 101 (e.g., settings regarding a method of providing floating action buttons).
  • one floating action button corresponding to a specified control object may be provided based on the first setting for the floating action button 2900 (eg, one floating action button call setting).
  • based on a second setting for the floating action button 2900 (e.g., a setting to call a plurality of floating action buttons), a plurality of floating action buttons corresponding to a plurality of specified control objects (e.g., corresponding to the number of control objects included in a bundled control object) may be provided.
  • the floating action button 2900 may change from the first floating action button 2900A to the second floating action button 2900B.
  • the electronic device 101 may change the floating action button 2900 from the first floating action button 2900A to the second floating action button 2900B based on the target control object changing from the first control object 2910 to the third control object 2930.
  • the electronic device 101 may provide a duplicate control object 2960 corresponding to the target control object (e.g., the third control object 2930) in the area corresponding to the floating action button 2900 (e.g., the optimization area 2905).
  • a duplicate control object 2960 may be displayed at a location corresponding to the user's grip position (or optimization area 2905).
  • the user may release the user input (e.g., touch) on the floating action button 2900 (e.g., the second floating action button 2900B) while the third control object 2930 is selected as the target control object.
  • in response to the release of the user input, the electronic device 101 may provide the duplicate control object 2960 corresponding to the control object currently designated as the target control object (e.g., the third control object 2930) to the optimization area 2905.
  • the electronic device 101 may provide a duplicate control object 2960 that is identical to the third control object 2930 designated as the target control object by floating it in the optimization area 2905.
  • the duplicate control object 2960 may be arranged so as not to overlap other touch areas in the optimization area 2905.
  • the electronic device 101 may support various settings in relation to providing control objects related to a running application to an optimization area according to the state in which the user grips the electronic device 101.
  • the electronic device 101 may support various settings for how to provide redundant control objects through a designated menu (e.g., quick panel or settings menu) of the electronic device 101.
  • the settings for how to provide duplicate control objects may include an on/off setting for whether to run duplicate control objects from the quick panel, an on/off setting for whether to use floating action buttons, and/or a setting for how floating action buttons are called (e.g., whether to call multiple floating action buttons).
  • the electronic device 101 may support setting of a corresponding function through a switch object (eg, on/off switch) for setting the on/off option of the function.
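  • These options can be pictured as a small settings record (the field names below are invented for illustration; the disclosure does not name them):

```python
# Sketch of the provision settings described above: a small settings
# record toggled from a quick panel or settings menu.

from dataclasses import dataclass

@dataclass
class DuplicateControlSettings:
    run_from_quick_panel: bool = True        # on/off: run duplicate control objects
    use_floating_action_button: bool = True  # on/off: use floating action buttons
    multiple_floating_buttons: bool = False  # call one button, or one per object

settings = DuplicateControlSettings()
settings.multiple_floating_buttons = True    # e.g., user flips the on/off switch
print(settings)
```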
  • FIG. 30 is a diagram illustrating an example of providing a duplicate control object based on user interaction in an electronic device according to an embodiment of the present disclosure.
  • FIG. 30 is a diagram for explaining an example of operating (eg, moving) a floating action button 3010 and/or a duplicate control object 3030 based on interaction with a user.
  • the electronic device 101 may provide a floating action button 3010 in response to detection of the user's grip state in a designated state according to a set first method, and may then provide a duplicate control object 3030 according to a user input based on the floating action button 3010. According to one embodiment, according to a set second method, the electronic device 101 may operate to immediately provide the duplicate control object 3030 without providing the floating action button 3010 in response to detection of the grip state in the designated state.
  • grip state detection may be performed based on, for example, detecting a touch (e.g., a palm touch) of an area of a specified size or more on the cover display and/or detecting an object corresponding to the user's hand based on a rear camera module, together with detecting a touch (e.g., a finger touch) at a certain point on the display 210.
  • FIG. 30 may show an example in which a floating action button 3010 is provided based on the first method, the floating action button 3010 is moved based on interaction with the user, and a duplicate control object 3030 is provided in the area corresponding to the floating action button 3010.
  • the electronic device 101 may detect the user's grip state and provide a floating action button 3010 based on the grip position corresponding to the grip state. According to one embodiment, when providing the floating action button 3010, the electronic device 101 may provide the floating action button 3010 to the user by applying a specified visual cue to the control object 3020 (e.g., the target control object) to be provided in duplicate.
  • the user may call the duplicate control object 3030 through a specified user input.
  • based on a specified user input, the electronic device 101 may provide the duplicate control object 3030 in the area of the floating action button 3010 while maintaining the visual cue of the control object 3020 being duplicated.
  • the floating action button 3010 may be moved and provided in response to a user input specified on the floating action button 3010 (e.g., moving (e.g., dragging) while long-pressing).
  • the user may move the floating action button 3010 and then release the user input.
  • the electronic device 101 may call the duplicate control object 3030 corresponding to the control object 3020 in the area to which the floating action button 3010 has been moved.
  • the electronic device 101 may maintain the visual cue of the control object 3020 while the duplicate control object 3030 is provided.
  • the duplicate control object 3030 may be moved and provided according to a user input to the duplicate control object 3030.
  • the electronic device 101 may enter a movement mode for the floating action button 3010 or the duplicate control object 3030 based on a designated user input (e.g., a drag and drop moving beyond a designated area within the execution screen), and may operate to support moving the position of the floating action button 3010 or the duplicate control object 3030 in response to user input in the movement mode.
  • the electronic device 101 may enter a removal mode based on a designated user input (e.g., a drag and drop moving to an area outside (or at the edge of) the execution screen) and, according to the removal mode, may operate to remove the floating action button 3010 or the duplicate control object 3030 from the screen. According to one embodiment, when removing the floating action button 3010 or the duplicate control object 3030 according to the removal mode, the electronic device 101 may operate to stop displaying the visual cue of the control object 3020.
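  • A minimal sketch of distinguishing the movement mode from the removal mode by the drop position (screen dimensions and the boundary test are assumptions for illustration):

```python
# Sketch of the drag-and-drop handling described above: a drop inside the
# screen moves the button; a drop outside (or on the edge) removes it and
# clears the visual cue of the original control object.

def handle_drop(drop_x, drop_y, screen_w, screen_h):
    inside = 0 <= drop_x < screen_w and 0 <= drop_y < screen_h
    if inside:
        return ("move", (drop_x, drop_y))    # movement mode: relocate
    return ("remove", None)                   # removal mode: delete + stop cue

print(handle_drop(300, 400, 1080, 2340))   # ('move', (300, 400))
print(handle_drop(-20, 400, 1080, 2340))   # ('remove', None)
```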
  • FIG. 31 is a diagram illustrating an example in which a duplicate control object is provided in an electronic device according to an embodiment of the present disclosure.
  • FIG. 31 is a diagram for explaining an example of operating (e.g., removing) a visual cue for a control object 3110 or a duplicate control object 3120 being provided, based on interaction with the user.
  • FIG. 31 may show an example in which a visual cue of the control object 3110 selected as the target control object and a duplicate control object 3120 corresponding to the control object 3110 are provided, and the visual cue of the control object 3110 and the duplicate control object 3120 are removed according to specified conditions.
  • the electronic device 101 may remove the visual cue and the duplicate control object 3120 when the control object 3110 is removed from the execution screen.
  • for example, when the control object 3110 is removed from the execution screen, the electronic device 101 may remove the duplicate control object 3120 that is being provided so that it is no longer displayed.
  • the electronic device 101 may also remove the duplicate control object 3120 that is being provided, so that it is not displayed, when the designated state of the electronic device 101 changes (e.g., a change in the mechanical state of the electronic device 101). For example, while the electronic device 101 is providing the duplicate control object 3120 in the first designated state, the electronic device 101 may change state from the first designated state to the second designated state. For example, while the electronic device 101 is providing the duplicate control object 3120 in the second designated state, the electronic device 101 may change state from the second designated state to the first designated state. According to one embodiment, when the electronic device 101 detects a state change while providing the duplicate control object 3120, the electronic device 101 may remove the duplicate control object 3120 and the visual cue of the corresponding control object 3110.
  • FIGS. 32, 33, 34, 35, 36, and 37 are diagrams illustrating an example of detecting a holding state in an electronic device according to an embodiment of the present disclosure.
  • FIGS. 32 and 36 are diagrams illustrating an example of the front of the electronic device 101.
  • FIGS. 32 and 36 may represent a projection view from the front of the electronic device 101.
  • FIGS. 33 and 34 may show an example of detecting the area where the user's hand touches the rear cover display based on a grid.
  • FIGS. 35 and 37 are diagrams illustrating an example of the back of the electronic device 101.
  • FIGS. 35 and 37 may represent an example of an image that can be acquired by the user's palm touch area and/or the rear camera module 3580 when viewed from the rear of the electronic device 101.
  • FIGS. 32, 33, 34, 35, 36, and 37 may show an example in which the form factor of the electronic device 101 is a foldable device with asymmetric displays on the front and back.
  • the electronic device 101 may include a touch circuit (e.g., a touch sensor) and may identify the user's grip state based on at least one touch area detected through the touch circuit of the first display 3210 (or main display or front display).
  • the electronic device 101 may include a second display 3220 (or a cover display or rear display), and may identify the user's grip state based on at least one touch area (e.g., a palm touch area) detected through the touch circuit of the second display 3220 and the touch area (e.g., a finger touch area) detected through the touch circuit of the first display 3210.
  • the electronic device 101 may include a rear camera module 3580, and may identify the user's grip state based on the presence or absence of an object (e.g., the user's finger) determined from an image acquired by the rear camera module, at least one touch area (e.g., a palm touch area) detected through the touch circuit of the second display 3220, and the touch area (e.g., a finger touch area) detected through the touch circuit of the first display 3210.
  • FIGS. 32 and 35 may represent an example in which the electronic device 101 detects the user's grip state (e.g., a lower-left-hand grip state) when the user grips the lower part of the electronic device 101 with the left hand.
  • when the electronic device 101 detects the user's finger touch 3200 on the first display 3210 on the front (e.g., the front display), the electronic device 101 may activate the touch circuit of the second display 3220 on the back (e.g., the cover display) (e.g., drive the touch circuit while the screen remains off).
  • the electronic device 101 may determine the grip position (e.g., an optimization area within the touch-operable screen area) based on the finger touch on the first display 3210, and may determine the grip state based on the palm touch 3500.
  • the first display 3210 on the front (e.g., front display) and the second display 3220 on the back (e.g., cover display) may be divided into designated multi-sections for determining the grip state.
  • the first display 3210 may be divided into a first partition in units of blocks (or groups) of a plurality of pixels of the first display 3210.
  • the second display 3220 may be divided into a second partition in units of blocks (or groups) of a plurality of pixels of the second display 3220.
  • the electronic device 101 may determine the grip state (e.g., grip posture, grip direction, grip type, and/or operable range) by combining the section number according to the first partition of the first display 3210 and the section number according to the second partition of the second display 3220. Examples of this are shown in FIGS. 33 and 34.
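  • A sketch of combining front and back section numbers (the encoding and the table entries are hypothetical, purely for illustration):

```python
# Sketch of grip-state lookup by combined partitions: the pair
# (front finger section, back palm sections) keys into a table of
# known grip states.

def grip_key(front_section, back_sections):
    # e.g., front finger touch in block 12, palm covering back blocks 40-42.
    return (front_section, tuple(sorted(back_sections)))

GRIP_TABLE = {
    # Hypothetical entries mapping section combinations to grip states.
    (12, (40, 41, 42)): "left_hand_lower_grip",
    (3, (7, 8)): "right_hand_upper_grip",
}

print(GRIP_TABLE.get(grip_key(12, [41, 40, 42]), "unknown"))  # left_hand_lower_grip
```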
  • the electronic device 101 may divide the second display 3220 into a grid and identify the shape of the grip according to the grid areas touched on the second display 3220. For example, when the user holds the electronic device 101 as shown in the examples of FIGS. 32 and 35, the electronic device 101 may detect at least one touch area 3300 (e.g., palm touch area 3300) through the touch circuit of the second display 3220 as in example <3301> of FIG. 33, and may detect the touch area 3200 (e.g., finger touch area 3200) as in example <3303> of FIG. 33.
  • the second display 3220 may be divided into a plurality of grids (e.g., on a pixel basis or on a block basis of a set of pixels).
  • the palm touch area 3300 may include a first palm touch point 3310 with a relatively large contact area (e.g., a complete palm touch point) and a second palm touch point 3320 with a relatively small contact area (e.g., an incomplete palm touch point).
  • the electronic device 101 may determine the palm touch area 3300 on the second display 3220 based on the first palm touch point 3310 and the second palm touch point 3320.
  • the electronic device 101 may determine the grip state by identifying the shape (or touch posture) (e.g., grip direction and/or hand size) of the palm touch area 3300 of the second display 3220 and the finger touch area 3200 of the first display 3210.
  • the grip state may include a one-hand grip state and a two-hand grip state.
  • the one-hand grip states may include states such as a left-hand lower grip, a left-hand middle grip, a left-hand upper grip, a left-hand lowermost grip (e.g., of the lower part of the electronic device 101), and a left-hand uppermost grip (e.g., of the upper part of the electronic device 101).
  • the two-hand grip state may include states such as two-hand bottom grip, two-hand middle grip, and two-hand top grip.
  • the electronic device 101 may determine an optimization area (e.g., a manipulable area) according to the grip state. For example, as illustrated in FIG. 34, when the user is in a one-handed grip (e.g., left-hand grip) state, the grip type may be determined based on the palm touch area 3300 of the second display 3220 and the finger touch area 3200 of the first display 3210. According to one embodiment, the electronic device 101 may determine an optimization area on the first display 3210 based on the determined grip type. For example, the electronic device 101 may determine the optimization area based on the fan-shaped areas 3410, 3420, and 3430 (e.g., one-handed operation areas) including the finger touch area 3200 of the first display 3210.
  • the electronic device 101 may predict the touch posture (e.g., grip direction and/or one-hand size) based on the palm touch area 3300 of the second display 3220, and may determine the optimization area based on the finger touch area 3200 of the first display 3210 and the one-hand manipulation area matched to the predicted touch posture.
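  • A fan-shaped reach area can be modeled as a circular sector anchored at the grip corner (thumb radius and sector angles below are assumptions; the disclosure does not quantify them):

```python
import math

# Sketch of a fan-shaped (sector) one-hand operation area: a point is
# reachable if it lies within a thumb-length radius and an angular
# sector anchored at the grip corner.

def in_reach(px, py, grip_x, grip_y, radius, start_deg, end_deg):
    dx, dy = px - grip_x, py - grip_y
    if math.hypot(dx, dy) > radius:
        return False
    angle = math.degrees(math.atan2(dy, dx)) % 360
    return start_deg <= angle <= end_deg

# Lower-left grip at (0, 0) with y pointing up; thumb reach ~600 px,
# sector spanning from straight right (0 deg) to straight up (90 deg).
print(in_reach(300, 300, 0, 0, 600, 0, 90))   # True: inside the fan
print(in_reach(700, 100, 0, 0, 600, 0, 90))   # False: beyond thumb reach
```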
  • example <3401> may show an example of determining the optimization area 3410 (e.g., a one-handed operation area) centered on the lower left corner of the first display 3210 including the finger touch area 3200, according to a first type of one-handed grip (e.g., a lower left-hand grip).
  • example <3403> may show an example of determining the optimization area 3420 centered on the left middle part of the first display 3210 including the finger touch area 3200, according to a second type of one-handed grip (e.g., a left-hand middle grip).
  • example <3405> may show an example of determining the optimization area 3430 centered on the lower middle of the first display 3210 including the finger touch area 3200, according to a third type of one-handed grip (e.g., a left-hand bottom grip (e.g., a grip of the lower part of the electronic device 101)).
  • the method of determining the grip state is not limited to determining the grip state by a combination of a front finger touch and a rear palm touch.
  • the accuracy of grip state detection may also be improved by combining additional sensing elements, such as a finger touch on the front, sensor data from a grip sensor, a palm touch on the back, and/or an image acquired from a camera module.
  • FIGS. 36 and 37 may represent an example in which the electronic device 101 detects the user's grip state (e.g., an upper-right-hand grip state) when the user grips the upper part of the electronic device 101 with the right hand.
  • when the electronic device 101 detects the user's finger touch 3600 on the first display 3210 on the front (e.g., the front display), the electronic device 101 may activate the touch circuit of the second display 3220 on the back (e.g., the cover display) and the camera module 3580 on the rear.
  • the electronic device 101 may determine the grip position (e.g., an optimization area within the touch-operable screen area) based on the finger touch on the first display 3210, and may determine the grip state based on the image 3700 acquired through the camera module 3580 and/or the palm touch.
  • the first display 3210 on the front and the second display 3220 on the back may be divided into designated sections for determining the grip state.
  • the first display 3210 may be divided into a first partition in units of blocks (or groups) of a plurality of pixels of the first display 3210.
  • the second display 3220 may be divided into a second partition in units of blocks (or groups) of a plurality of pixels of the second display 3220.
  • the electronic device 101 may determine the grip state (e.g., grip posture, grip direction, grip type, and/or operable range) by combining the section number according to the first partition of the first display 3210 and the section number according to the second partition of the second display 3220.
  • the method of determining the grip state is not limited to determining the grip state by a combination of a front finger touch and a rear palm touch.
  • the accuracy of grip state detection may also be improved by combining additional sensing elements, such as a finger touch on the front, sensor data from a grip sensor, a palm touch on the back, data from a gyro sensor or an acceleration sensor, and/or an image acquired from a camera module.
  • An operation method performed by the electronic device 101 may include performing an operation of displaying an execution screen of an application in a designated state of the electronic device 101.
  • An operating method according to an embodiment may include performing an operation of detecting at least one control object in the execution screen.
  • An operation method according to an embodiment may include performing an operation that determines the user's grip state.
  • An operating method according to an embodiment may include performing an operation of identifying a target control object from the at least one control object based on the specified state and the grip state.
  • An operating method according to an embodiment may include performing an operation of providing a duplicate control object corresponding to the target control object to an optimization area corresponding to a grip state.
  • the operating method may include performing an operation of determining a gripping position for the optimization area based on the gripping state.
  • the operation method according to one embodiment may include performing an operation of determining a control object located at the farthest distance as the target control object based on the distance between the grip position and the control object in the specified state.
  • An operating method according to an embodiment may include performing an operation of creating a duplicate control object based on a control object corresponding to the target control object.
  • An operating method according to an embodiment may include performing an operation of providing the duplicate control object in duplicate by floating it at the grip position.
  • the designated state may include a first designated state and a second designated state.
  • the first designated state may include a state in which the electronic device is fully unfolded.
  • the second designated state may include a state in which the electronic device is partially folded.
  • the control object may include a control element for controlling functions related to the application when the application is running in the foreground.
  • the at least one control object may be detected from the execution screen when the execution screen is displayed on the display.
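  • Taken together, the method can be summarized with a short end-to-end sketch (all data and helper names below are invented for illustration; this is not the claimed implementation):

```python
import math

# End-to-end sketch of the operating method above: given the grip
# position, pick the farthest on-screen control object as the target and
# "float" a duplicate of it in the optimization area.

def provide_duplicate(control_objects, grip_pos, optimization_area):
    # control_objects: {name: (x, y)}; grip_pos / optimization_area: (x, y).
    target = max(control_objects,
                 key=lambda n: math.dist(grip_pos, control_objects[n]))
    duplicate = {"copies": target, "position": optimization_area}
    return target, duplicate

objs = {"back": (40, 60), "share": (1000, 2200)}
target, dup = provide_duplicate(objs, grip_pos=(100, 2300),
                                optimization_area=(180, 2150))
print(target, dup)  # "back" is farthest from a lower-left grip; duplicate floats nearby
```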

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Signal Processing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Environmental & Geological Engineering (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Embodiments of the present disclosure relate to an electronic device having a flexible display, and to a method for providing a control object based on a user's grip state for the electronic device. The electronic device may comprise a display, a sensor module, a memory, and a processor. The processor may control the display to display an execution screen of an application in a designated state of the electronic device. The processor may operate such that a control object is detected in the execution screen. The processor may operate such that a grip state of the user is determined. The processor may operate such that at least one target control object is identified from the control object based on the designated state and the grip state. The processor may operate such that a duplicate control object corresponding to a control object identified as the target control object is provided in duplicate to an optimization area corresponding to the grip state.
PCT/KR2023/013229 2022-09-05 2023-09-05 Electronic device having flexible display and method for providing control object based on gripping state thereof WO2024053988A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US18/488,464 US20240077956A1 (en) 2022-09-05 2023-10-17 Electronic device having flexible display and method for providing control object based on gripping state thereof

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
KR10-2022-0112182 2022-09-05
KR20220112182 2022-09-05
KR1020220133910A KR20240033613A (ko) Electronic device having flexible display and method for providing control object based on gripping state thereof
KR10-2022-0133910 2022-10-18

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US18/488,464 Continuation US20240077956A1 (en) 2022-09-05 2023-10-17 Electronic device having flexible display and method for providing control object based on gripping state thereof

Publications (1)

Publication Number Publication Date
WO2024053988A1 (fr)

Family

ID=90191477

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2023/013229 WO2024053988A1 (fr) 2022-09-05 2023-09-05 Electronic device having flexible display and method for providing control object based on gripping state thereof

Country Status (1)

Country Link
WO (1) WO2024053988A1 (fr)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20120024299A (ko) * 2010-09-06 2012-03-14 LG Electronics Inc. Method for providing a user interface and mobile terminal using the same
KR20130111087A (ko) * 2012-03-30 2013-10-10 Infobank Corp. Method for configuring a menu in a portable terminal
JP2016038622A (ja) * 2014-08-05 2016-03-22 Sharp Corporation Electronic apparatus
KR20160108705A (ko) * 2015-03-05 2016-09-20 Samsung Display Co., Ltd. Display device
KR20200045660A (ko) * 2018-10-23 2020-05-06 Samsung Electronics Co., Ltd. Foldable electronic device for controlling user interface and method of operating the same

Similar Documents

Publication Publication Date Title
WO2020171591A1 (fr) Electronic device and method for controlling display thereof
WO2021101189A1 (fr) Method and device for providing a user interface in an electronic device having a foldable screen
WO2015178714A1 (fr) Foldable device and method for controlling the same
WO2015076463A1 (fr) Mobile terminal and method for controlling the same
WO2015093667A1 (fr) Electronic device and method for controlling the electronic device
WO2016200136A1 (fr) Portable device and method for changing a screen of the portable device
WO2021256890A1 (fr) Electronic device for providing information and/or functions through an icon, and method for controlling the same
WO2021246801A1 (fr) Electronic device comprising a plurality of displays, and method for operating the same
WO2015093665A1 (fr) Electronic device and method for controlling the electronic device
WO2021246802A1 (fr) Electronic device comprising multiple screens and method for implementing the same
WO2022154500A1 (fr) Electronic device and method for configuring a layout based on a folding state of the electronic device
WO2021246800A1 (fr) Electronic device comprising a plurality of displays and method for operating the same
WO2022025720A1 (fr) Electronic device comprising a flexible display module and method for operating the same
WO2022080814A1 (fr) Electronic device and screen control method therefor
WO2022108271A1 (fr) Electronic device comprising a flexible display, and touch control method therefor
WO2022019488A1 (fr) Electronic device for displaying an application execution screen and method for operating the same
WO2022031049A1 (fr) Electronic device supporting multiple multi-window modes and method for controlling the same
WO2022025451A1 (fr) Sliding electronic device and method for controlling the same
WO2024053988A1 (fr) Electronic device having flexible display and method for providing control object based on gripping state thereof
WO2022014958A1 (fr) Electronic device comprising a stretchable display
WO2022103014A1 (fr) Electronic device provided with a flexible display, and display method using the same
WO2023063558A1 (fr) Electronic device and method for operating an electronic device
WO2022186578A1 (fr) Electronic device interacting with an external electronic device, and interaction method thereof
WO2024080611A1 (fr) Electronic device and method for controlling a screen according to user interaction by means of the same
WO2022108239A1 (fr) Electronic device having a flexible display and method for operating the same

Legal Events

Date Code Title Description
121 EP: The EPO has been informed by WIPO that EP was designated in this application
Ref document number: 23863460
Country of ref document: EP
Kind code of ref document: A1