WO2022182025A1 - Electronic device and control method therefor - Google Patents

Electronic device and control method therefor

Info

Publication number
WO2022182025A1
Authority
WO
WIPO (PCT)
Prior art keywords
electronic device
display
processor
user gesture
main information
Application number
PCT/KR2022/002074
Other languages
English (en)
Korean (ko)
Inventor
허승혁
Original Assignee
삼성전자(주) (Samsung Electronics Co., Ltd.)
Application filed by 삼성전자(주) (Samsung Electronics Co., Ltd.)
Publication of WO2022182025A1



Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04883 Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, for inputting data by handwriting, e.g. gesture or text
    • G06F 16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F 16/90 Details of database functions independent of the retrieved data types
    • G06F 16/93 Document management systems
    • G06F 16/94 Hypermedia
    • G06F 3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/04842 Selection of displayed objects or displayed text elements

Definitions

  • the present invention relates to an electronic device and a control method thereof, and more particularly, to an electronic device capable of detecting a user gesture and a control method thereof.
  • The electronic device may perform complex functions such as, for example, capturing a picture or a video, playing a music or video file, playing a game, or receiving a broadcast.
  • the electronic device may provide various information according to the performance of a function through a display, and is provided to receive various types of user inputs such as user gestures.
  • the present invention provides an electronic device capable of displaying an object corresponding to main information with high visibility based on a user gesture, and a method for controlling the same.
  • the present invention provides an electronic device capable of more intuitively displaying an object appropriately corresponding to a user's gesture, and a method for controlling the same.
  • An electronic device includes a display; and a processor configured to identify a region of the display corresponding to a user gesture based on detection of a predefined user gesture with respect to a partial region of the display, to select at least one object that is related to the identified region and corresponds to main information from among a plurality of objects displayed on the display, and to control the display to provide an identification indication for the selected object.
  • the processor may identify an operation state of the electronic device and select an object corresponding to the main information based on the identified operation state.
  • the operating state may include characteristics of an application being executed in the electronic device.
  • the processor may select at least one of two or more objects located in the identified area as an object corresponding to the main information.
  • The object may include text, and the processor may control the display to provide an identification indication for the text corresponding to the main information.
  • the processor may control the display such that the protection processing is provided for an object other than the text to which the identification indication is provided among the plurality of objects.
  • the text may include link information, and the processor may control the display to display a preview corresponding to the link information.
  • The identification mark may include applying at least one of highlighting, high contrast, magnification, or a luminous effect to the selected object.
  • the processor may identify a surrounding environment state of the electronic device and determine an applied identification mark based on the identified surrounding environment state.
  • The user gesture may be a first user gesture, and the processor may provide an identification mark for the object in response to the movement of a second user gesture, based on detection of the second user gesture moving in succession to the first user gesture.
  • the processor may control the display so that a predetermined function object is changed and displayed in response to the second user gesture, and may control the function corresponding to the function object to be performed based on the movement of the second user gesture.
  • A method of controlling an electronic device includes: based on detection of a predefined user gesture with respect to a partial region of a display, identifying a region of the display corresponding to the user gesture; selecting at least one object related to the identified region and corresponding to main information from among a plurality of objects displayed on the display; and providing an identification indication for the selected object.
  • the method may further include identifying an operation state of the electronic device and selecting an object corresponding to the main information based on the identified operation state.
  • the operating state may include characteristics of an application being executed in the electronic device.
  • the method may further include selecting at least one of two or more objects located in the identified area as an object corresponding to the main information.
  • The object may include text, and the method may further include an operation of providing an identification mark for the text corresponding to the main information.
  • The method may further include providing protection processing to an object other than the text to which the identification mark is provided among the plurality of objects.
  • The text may include link information, and the method may further include an operation of displaying a preview corresponding to the link information.
  • the method may further include identifying a surrounding environment state of the electronic device and determining an identification mark to be applied based on the identified surrounding environment state.
  • The user gesture may be a first user gesture, and the method may further include, based on detection of a second user gesture moving in succession to the first user gesture, providing an identification mark for the object in response to the movement of the second user gesture.
  • FIG. 1 is a block diagram of an electronic device according to various embodiments of the present disclosure.
  • FIG. 2 is a block diagram of a display module according to various embodiments of the present disclosure.
  • FIG. 3 is a flowchart illustrating a method of providing an identification mark in an electronic device according to an embodiment of the present invention.
  • FIG. 4 shows an example of a user gesture according to an embodiment of the present invention.
  • FIG. 5 illustrates an example of providing an identification mark for an object according to an embodiment of the present invention.
  • FIG. 6 shows an example of setting an identification mark according to an embodiment of the present invention.
  • FIG. 7 is a flowchart illustrating a process of deriving main information in an electronic device according to an embodiment of the present invention.
  • FIG. 16 is a flowchart illustrating a method of providing an identification mark based on a movement of a user gesture in an electronic device according to an embodiment of the present invention.
  • FIGS. 17 and 18 show an example of providing an identification mark based on movement according to an embodiment of the present invention.
  • A 'module' or 'unit' performs at least one function or operation, may be implemented as hardware, software, or a combination of hardware and software, and may be integrated into and implemented as at least one module.
  • The expression 'at least one of' a plurality of elements refers not only to all of the plurality of elements, but also to each one of them, or any combination thereof, excluding the rest.
  • FIG. 1 is a block diagram of an electronic device according to various embodiments of the present disclosure.
  • The electronic device 101 may include various stationary or portable digital devices such as, for example, a smartphone, a tablet, a smart pad, a smart note, a personal digital assistant (PDA), a portable multimedia player (PMP), a navigation device, an MP3 player, or a laptop (notebook) computer.
  • The electronic device 101 may be a wearable device (digital accessory, smart accessory) that can be worn on the user's body, such as a smart watch, a smart band, or a wireless headset (e.g., a Bluetooth headset), or may include an appcessory.
  • The electronic device 101 may be an input device, including a remote control (remote control unit), that transmits a control command to another device such as a television (e.g., a smart TV) or a set-top box.
  • The electronic device 101 may include various types of devices provided as things or smart things operating based on Internet of Things (IoT) technology, such as healthcare, remote meter reading, smart home, and smart car devices.
  • the electronic device 101 may be provided with a sensor for performing an operation of each device and sensing a surrounding environment.
  • the electronic device 101 communicates with the electronic device 102 through the first network 198 (eg, a short-range wireless communication network) in a network environment, or the second network 199 (eg: It may communicate with at least one of the electronic device 104 and the server 108 through a long-distance wireless communication network. According to an embodiment, the electronic device 101 may communicate with the electronic device 104 through the server 108 .
  • the first network 198 eg, a short-range wireless communication network
  • the second network 199 eg: It may communicate with at least one of the electronic device 104 and the server 108 through a long-distance wireless communication network.
  • the electronic device 101 may communicate with the electronic device 104 through the server 108 .
  • The electronic device 101 may include a processor 120, a memory 130, an input module 150, a sound output module 155, a display module 160, an audio module 170, a sensor module 176, an interface 177, a connection terminal 178, a haptic module 179, a camera module 180, a power management module 188, a battery 189, a communication module 190, a subscriber identification module 196, or an antenna module 197.
  • In some embodiments, at least one of these components (e.g., the connection terminal 178) may be omitted from the electronic device 101, or one or more other components may be added; some of these components may be integrated into one component (e.g., the display module 160).
  • The processor 120 may, for example, execute software (e.g., a program 140) to control at least one other component (e.g., a hardware or software component) of the electronic device 101 connected to the processor 120, and may perform various data processing or operations. According to an embodiment, as at least part of the data processing or operations, the processor 120 may store commands or data received from other components (e.g., the sensor module 176 or the communication module 190) in a volatile memory 132, process the commands or data stored in the volatile memory 132, and store the resulting data in a non-volatile memory 134.
  • The processor 120 may include a main processor 121 (e.g., a central processing unit or an application processor) or an auxiliary processor 123 that can be operated independently of or together with the main processor (e.g., a graphics processing unit, a neural processing unit (NPU), an image signal processor, a sensor hub processor, or a communication processor).
  • The auxiliary processor 123 may, for example, control at least some of the functions or states related to at least one of the components of the electronic device 101 (e.g., the display module 160, the sensor module 176, or the communication module 190), on behalf of the main processor 121 while the main processor 121 is in an inactive (e.g., sleep) state, or together with the main processor 121 while the main processor 121 is in an active (e.g., application execution) state.
  • According to an embodiment, the auxiliary processor 123 (e.g., an image signal processor or a communication processor) may be implemented as part of another functionally related component (e.g., the camera module 180 or the communication module 190).
  • the auxiliary processor 123 may include a hardware structure specialized for processing an artificial intelligence model.
  • Artificial intelligence models can be created through machine learning. Such learning may be performed, for example, in the electronic device 101 itself on which the artificial intelligence model is performed, or may be performed through a separate server (eg, the server 108).
  • The learning algorithm may include, for example, supervised learning, unsupervised learning, semi-supervised learning, or reinforcement learning, but is not limited to the above examples.
  • the artificial intelligence model may include a plurality of artificial neural network layers.
  • The artificial neural network may be one of a deep neural network (DNN), a convolutional neural network (CNN), a recurrent neural network (RNN), a restricted Boltzmann machine (RBM), a deep belief network (DBN), a bidirectional recurrent deep neural network (BRDNN), a deep Q-network, or a combination of two or more of the above, but is not limited to the above examples.
  • The artificial intelligence model may, additionally or alternatively, include a software structure in addition to the hardware structure.
  • the memory 130 may store various data used by at least one component (eg, the processor 120 or the sensor module 176 ) of the electronic device 101 .
  • the data may include, for example, input data or output data for software (eg, the program 140 ) and instructions related thereto.
  • The memory 130 may include the volatile memory 132 or the non-volatile memory 134.
  • The program 140 may be stored as software in the memory 130, and may include, for example, an operating system 142 for controlling one or more resources of the electronic device 101, middleware 144, or an application 146 executable on the operating system 142.
  • the middleware 144 may provide various functions to the application 146 so that functions or information provided from one or more resources of the electronic device 101 may be used by the application 146 .
  • the middleware 144 may include, for example, a power manager.
  • The power manager, for example, manages the capacity, temperature, or power of the battery 189, and may determine or provide related information necessary for the operation of the electronic device 101 by using the corresponding information.
  • the power manager may interwork with a basic input/output system (BIOS) (not shown) of the electronic device 101 .
  • the input module 150 may receive a command or data to be used by a component (eg, the processor 120 ) of the electronic device 101 from the outside (eg, a user) of the electronic device 101 .
  • the input module 150 may include, for example, a microphone, a mouse, a keyboard, a key (eg, a button), or a digital pen (eg, a stylus pen).
  • the sound output module 155 may output a sound signal to the outside of the electronic device 101 .
  • the sound output module 155 may include, for example, a speaker or a receiver.
  • the speaker can be used for general purposes such as multimedia playback or recording playback.
  • the receiver can be used to receive incoming calls. According to one embodiment, the receiver may be implemented separately from or as part of the speaker.
  • the display module 160 may visually provide information to the outside (eg, a user) of the electronic device 101 .
  • the display module 160 may include, for example, a control circuit for controlling a display, a hologram device, or a projector and a corresponding device.
  • the display module 160 may include a touch sensor configured to sense a touch or a pressure sensor configured to measure the intensity of a force generated by the touch.
  • The audio module 170 may convert a sound into an electric signal or, conversely, convert an electric signal into a sound. According to an embodiment, the audio module 170 may acquire a sound through the input module 150, or may output a sound through the sound output module 155 or an external electronic device (e.g., the electronic device 102, such as a speaker or headphones) directly or wirelessly connected to the electronic device 101.
  • The sensor module 176 may detect an operating state (e.g., power or temperature) of the electronic device 101 or an external environmental state (e.g., a user state), and may generate an electrical signal or data value corresponding to the sensed state.
  • The sensor module 176 may include, for example, a gesture sensor, a gyro sensor, a barometric pressure sensor, a magnetic sensor, an acceleration sensor, a grip sensor, a proximity sensor, a color sensor, an infrared (IR) sensor, a biometric sensor, a temperature sensor, a humidity sensor, or an illuminance sensor.
  • the interface 177 may support one or more specified protocols that may be used by the electronic device 101 to directly or wirelessly connect with an external electronic device (eg, the electronic device 102 ).
  • the interface 177 may include, for example, a high definition multimedia interface (HDMI), a universal serial bus (USB) interface, an SD card interface, or an audio interface.
  • the connection terminal 178 may include a connector through which the electronic device 101 can be physically connected to an external electronic device (eg, the electronic device 102 ).
  • the connection terminal 178 may include, for example, an HDMI connector, a USB connector, an SD card connector, or an audio connector (eg, a headphone connector).
  • the electronic device 101 may be connected to an external power source capable of charging the battery 189 through the connection terminal 178 .
  • the haptic module 179 may convert an electrical signal into a mechanical stimulus (eg, vibration or movement) or an electrical stimulus that the user can perceive through tactile or kinesthetic sense.
  • the haptic module 179 may include, for example, a motor, a piezoelectric element, or an electrical stimulation device.
  • the camera module 180 may capture still images and moving images. According to an embodiment, the camera module 180 may include one or more lenses, image sensors, image signal processors, or flashes.
  • the power management module 188 may manage power supplied to the electronic device 101 .
  • the power management module 188 may be implemented as, for example, at least a part of a power management integrated circuit (PMIC).
  • the battery 189 may supply power to at least one component of the electronic device 101 .
  • battery 189 may include, for example, a non-rechargeable primary cell, a rechargeable secondary cell, or a fuel cell.
  • The communication module 190 may support establishment of a direct (e.g., wired) communication channel or a wireless communication channel between the electronic device 101 and an external electronic device (e.g., the electronic device 102, the electronic device 104, or the server 108), and communication through the established communication channel.
  • the communication module 190 may include one or more communication processors that operate independently of the processor 120 (eg, an application processor) and support direct (eg, wired) communication or wireless communication.
  • The communication module 190 may include a wireless communication module 192 (e.g., a cellular communication module, a short-range wireless communication module, or a global navigation satellite system (GNSS) communication module) or a wired communication module 194 (e.g., a local area network (LAN) communication module or a power line communication module).
  • A corresponding communication module among these communication modules may communicate with the external electronic device 104 through the first network 198 (e.g., a short-range communication network such as Bluetooth, wireless fidelity (WiFi) Direct, or infrared data association (IrDA)) or the second network 199 (e.g., a long-range communication network such as a legacy cellular network, a 5G network, a next-generation communication network, the Internet, or a computer network (e.g., a LAN or a WAN)).
  • The wireless communication module 192 may identify or authenticate the electronic device 101 within a communication network, such as the first network 198 or the second network 199, using subscriber information (e.g., an International Mobile Subscriber Identity (IMSI)) stored in the subscriber identification module 196.
  • The wireless communication module 192 may support a 5G network, following a 4G network, and next-generation communication technology, for example, new radio (NR) access technology.
  • The NR access technology may support high-speed transmission of high-capacity data (enhanced mobile broadband (eMBB)), minimization of terminal power and access by multiple terminals (massive machine type communications (mMTC)), or high reliability and low latency (ultra-reliable and low-latency communications (URLLC)).
  • The wireless communication module 192 may support a high-frequency band (e.g., the mmWave band) to achieve a high data rate, for example.
  • The wireless communication module 192 may use various techniques for securing performance in a high-frequency band, for example, beamforming, massive multiple-input and multiple-output (MIMO), full-dimensional MIMO (FD-MIMO), an array antenna, analog beamforming, or a large-scale antenna.
  • the wireless communication module 192 may support various requirements defined in the electronic device 101 , an external electronic device (eg, the electronic device 104 ), or a network system (eg, the second network 199 ).
  • The wireless communication module 192 may support a peak data rate for realizing eMBB (e.g., 20 Gbps or more), loss coverage for realizing mMTC (e.g., 164 dB or less), or U-plane latency for realizing URLLC (e.g., 0.5 ms or less for each of downlink (DL) and uplink (UL), or 1 ms or less for a round trip).
  • the antenna module 197 may transmit or receive a signal or power to the outside (eg, an external electronic device).
  • the antenna module 197 may include an antenna including a conductor formed on a substrate (eg, a PCB) or a radiator formed of a conductive pattern.
  • The antenna module 197 may include a plurality of antennas (e.g., an array antenna). In this case, at least one antenna suitable for a communication method used in a communication network, such as the first network 198 or the second network 199, may be selected from the plurality of antennas by, for example, the communication module 190. A signal or power may be transmitted or received between the communication module 190 and an external electronic device through the selected at least one antenna.
  • According to an embodiment, other components (e.g., a radio frequency integrated circuit (RFIC)) in addition to the radiator may be additionally formed as part of the antenna module 197.
  • The antenna module 197 may form an mmWave antenna module.
  • According to an embodiment, the mmWave antenna module may include a printed circuit board; an RFIC disposed on or adjacent to a first surface (e.g., the bottom surface) of the printed circuit board and capable of supporting a designated high-frequency band (e.g., the mmWave band); and a plurality of antennas (e.g., an array antenna) disposed on or adjacent to a second surface (e.g., the top or side surface) of the printed circuit board and capable of transmitting or receiving signals in the designated high-frequency band.
  • At least some of the above components may be connected to each other and exchange signals (e.g., commands or data) through a communication method between peripheral devices (e.g., a bus, general purpose input and output (GPIO), serial peripheral interface (SPI), or mobile industry processor interface (MIPI)).
  • the command or data may be transmitted or received between the electronic device 101 and the external electronic device 104 through the server 108 connected to the second network 199 .
  • Each of the external electronic devices 102 or 104 may be the same as or different from the electronic device 101 .
  • each of the external electronic devices 102 or 104 includes at least some of the components of the electronic device 101 illustrated in FIG. 1 , and may further include other components.
  • all or part of the operations performed by the electronic device 101 may be executed by one or more external electronic devices 102 , 104 , or 108 .
  • For example, when the electronic device 101 needs to perform a function or a service, the electronic device 101 may, instead of executing the function or service by itself, or in addition to doing so, request one or more external electronic devices to perform at least a part of the function or the service.
  • One or more external electronic devices that have received the request may execute at least a part of the requested function or service, or an additional function or service related to the request, and transmit a result of the execution to the electronic device 101 .
  • the electronic device 101 may process the result as it is or additionally and provide it as at least a part of a response to the request.
  • To this end, for example, cloud computing, distributed computing, mobile edge computing (MEC), or client-server computing technology may be used.
  • the electronic device 101 may provide an ultra-low latency service using, for example, distributed computing or mobile edge computing.
  • the external electronic device 104 may include an Internet of things (IoT) device.
  • the server 108 may be an intelligent server using machine learning and/or neural networks.
  • the external electronic device 104 or the server 108 may be included in the second network 199 .
  • the electronic device 101 may be applied to an intelligent service (eg, smart home, smart city, smart car, or health care) based on 5G communication technology and IoT-related technology.
  • FIG. 2 is a block diagram of a display module according to various embodiments of the present disclosure.
  • the display module 160 may include a display 210 and a display driver IC (DDI) 230 for controlling the display 210 .
  • the DDI 230 may include an interface module 231 , a memory 233 (eg, a buffer memory), an image processing module 235 , or a mapping module 237 .
  • The DDI 230 may receive, through the interface module 231, image data, or image information including an image control signal corresponding to a command for controlling the image data, from other components of the electronic device 101. For example, the image information may be received from the processor 120 (e.g., the main processor 121, such as an application processor, or the auxiliary processor 123, such as a graphics processing unit, operated independently of the function of the main processor 121).
  • the DDI 230 may communicate with the touch circuit 250 or the sensor module 176 through the interface module 231.
  • The DDI 230 may store at least a portion of the received image information in the memory 233, for example, in units of frames. The image processing module 235 may perform pre-processing or post-processing (e.g., resolution, brightness, or size adjustment) on at least a portion of the image data, based at least on the characteristics of the image data or the characteristics of the display 210.
  • The mapping module 237 may generate a voltage value or a current value corresponding to the image data pre-processed or post-processed through the image processing module 235. According to an embodiment, the generation of the voltage value or the current value may be performed based at least in part on the properties of the pixels of the display 210 (e.g., the arrangement of the pixels (an RGB stripe or PenTile structure), or the size of each sub-pixel). As at least some pixels of the display 210 are driven based at least in part on the voltage value or the current value, visual information (e.g., text, an image, or an icon) corresponding to the image data may be displayed through the display 210.
  • the display module 160 may further include a touch circuit 250 .
  • the touch circuit 250 may include a touch sensor 251 and a touch sensor IC 253 for controlling the touch sensor 251 .
  • the touch sensor IC 253 may control the touch sensor 251 to sense a touch input or a hovering input for a specific position of the display 210 , for example.
  • the touch sensor IC 253 may detect a touch input or a hovering input by measuring a change in a signal (eg, voltage, light amount, resistance, or electric charge amount) for a specific position of the display 210 .
  • the touch sensor IC 253 may provide information (eg, location, area, pressure, or time) regarding the sensed touch input or hovering input to the processor 120 .
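  • As a concrete illustration of the kind of information the touch sensor IC 253 can report, the following minimal Kotlin sketch reads location, contact area, pressure, and time from Android's MotionEvent; the platform API is used here only by assumption, since the patent is not tied to Android.

```kotlin
import android.view.MotionEvent

// Minimal sketch: the touch information named above (location, area,
// pressure, time) as surfaced by Android's MotionEvent API.
fun describeTouch(event: MotionEvent): String =
    "x=${event.x}, y=${event.y}, " +      // location of the touch on the display
    "size=${event.size}, " +              // approximate contact area (0..1)
    "pressure=${event.pressure}, " +      // pressure, where the hardware reports it
    "t=${event.eventTime} ms"             // time at which the input occurred
```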
  • At least a part of the touch circuit 250 (e.g., the touch sensor IC 253) may be included as a part of the display driver IC 230 or the display 210, or as a part of another component disposed outside the display module 160 (e.g., the auxiliary processor 123).
  • the display module 160 may further include at least one sensor (eg, a fingerprint sensor, an iris sensor, a pressure sensor, or an illuminance sensor) of the sensor module 176 , or a control circuit therefor.
  • the at least one sensor or a control circuit therefor may be embedded in a part of the display module 160 (eg, the display 210 or the DDI 230 ) or a part of the touch circuit 250 .
  • When the sensor module 176 embedded in the display module 160 includes a biometric sensor (e.g., a fingerprint sensor), the biometric sensor may acquire biometric information (e.g., a fingerprint image) related to a touch input through a partial area of the display 210.
  • the pressure sensor may acquire pressure information related to a touch input through a part or the entire area of the display 210 .
  • the touch sensor 251 or the sensor module 176 may be disposed between pixels of the pixel layer of the display 210 , or above or below the pixel layer.
  • The electronic device 101 may detect a predefined user gesture (e.g., a touch gesture).
  • The electronic device 101 may sense that a predefined user gesture with respect to a partial region of the display 210 is received, by at least one sensor provided in the sensor module 176 or by an image sensor of the camera module 180.
  • The electronic device 101 may identify a region of the display 210 corresponding to the detected gesture, and may provide a predefined identification mark (e.g., emphasis, contrast, magnification, etc.) for at least one object related to the identified region among the objects displayed on the display 210, as sketched below.
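  • The detect-identify-mark flow recapped above can be sketched in Kotlin as follows. All names (DisplayRegion, ScreenObject, GestureController) are hypothetical stand-ins for the patent's components, not terms from the disclosure.

```kotlin
// Hypothetical domain types standing in for the display region and the
// objects (text, icons, menus, ...) currently shown on the display.
data class DisplayRegion(val left: Int, val top: Int, val right: Int, val bottom: Int) {
    fun contains(x: Int, y: Int) = x in left..right && y in top..bottom
}

data class ScreenObject(val id: String, val x: Int, val y: Int, val isMainInfo: Boolean)

class GestureController(private val displayed: List<ScreenObject>) {

    // Called when the predefined gesture (e.g., a hand covering part of the
    // screen) has been detected over some region of the display.
    fun onPredefinedGesture(region: DisplayRegion) {
        displayed
            .filter { region.contains(it.x, it.y) }  // objects related to the region
            .filter { it.isMainInfo }                // ... that carry main information
            .forEach { applyIdentificationMark(it) }
    }

    private fun applyIdentificationMark(obj: ScreenObject) {
        // Placeholder for the identification mark: highlight, high contrast,
        // magnification, or a luminous effect.
        println("identification mark applied to ${obj.id}")
    }
}
```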
  • FIG. 3 is a flowchart illustrating a method of providing an identification mark in an electronic device according to an embodiment of the present invention.
  • FIG. 4 shows an example of a user gesture according to an embodiment of the present invention.
  • FIG. 5 illustrates an example of providing an identification mark for an object according to an embodiment of the present invention.
  • FIG. 6 shows an example of setting an identification mark according to an embodiment of the present invention.
  • In operation 311, the processor 120 of the electronic device 101 may detect that a predefined user gesture occurs with respect to a partial area of the display 210.
  • the processor 120 may identify that a user gesture is detected as a touch input to a partial region of the display 210 by the touch sensor 251 .
  • The user gesture may, for example, as shown in FIG. 4, have a shape of covering or enclosing a partial area of the display 210 with the user's hand 401.
  • For example, when the electronic device 101 is a smart watch, referring to FIG. 4, a user may make a circle or ring shape using a finger or the like, so that a touch input is generated on, for example, the edge region 410 of the display 210 adjacent to the bezel.
  • A user may likewise make a circle or a ring shape to cause a touch input on a partial area of the display 210.
  • The shape of the user gesture and the region where it is generated are not limited to those shown in FIG. 4, and gestures of various shapes may be generated with respect to various regions of the display 210 provided in various types of electronic devices 101.
  • the user gesture detected in operation 311 of FIG. 3 may be referred to as a first user gesture to distinguish it from the second user gesture in the embodiment of FIG. 16 to be described later.
  • In operation 312, the processor 120 may identify the area of the display 210 corresponding to the user gesture detected in operation 311.
  • For example, when a user gesture is sensed in the electronic device 101 implemented as a smart watch, as shown in FIG. 4, the processor 120 may identify the edge region 410 as the region corresponding to the sensed user gesture.
  • In operation 313, the processor 120 may select at least one object related to the region identified in operation 312 from among a plurality of objects displayed on the display 210.
  • The processor 120 may select at least one of the objects located in the identified area as the object corresponding to the main information.
  • The main information may be defined as information, among the information displayed on the display 210, that is provided to the user in relation to the operating state of the electronic device 101, for example, the characteristics of a running application (e.g., the application type) or the surrounding environment.
  • The object may include, for example, text, an icon, an image, a menu, or a button, but is not limited thereto.
  • the processor 120 may identify the operating state of the electronic device 101 .
  • the operating state may include, for example, a characteristic of an application being executed in the electronic device 101 , a surrounding environment state of the electronic device 101 , and the like.
  • the processor 120 may select an object corresponding to main information from among at least one object related to the area identified in operation 312 based on the identified operation state.
  • The processor 120 may select the object corresponding to the main information with reference to the operating state of the electronic device 101 that includes the characteristics of the executed application, for example, the application type, the execution state (execution screen), the control state (controller), a user selection, a predetermined criterion, text information, and the like.
  • The processor 120 may also refer to the state of the surrounding environment, for example, the ambient brightness (illuminance) or a spatial characteristic (e.g., a public place), as the operating state of the electronic device 101.
  • In operation 314, the processor 120 may provide an identification mark for the object selected in operation 313.
  • the identification mark includes a mark that gives a predefined visual effect so that the selected object is distinguished from the other objects.
  • The identification mark may include, for example, an emphasis mark such as a highlight, a contrast mark such as high contrast, an enlarged mark such as a zoom-in, or a luminous mark.
  • The identification mark may include a mark imparting a continuously or temporarily (instantaneously) changing visual effect.
  • The identification mark may also include processing that hides an unselected object, or that makes the unselected object less visible to the user by adjusting its contrast or color value.
  • the identification mark may further include a masking process.
  • the processor 120 may further apply a masking process to an object corresponding to the main information when the main information is identified as security information such as a password or an authentication code, for example.
  • the type of the identification mark provided to the object corresponding to the main information may be preselected by the user.
  • the user may differently select an applied identification mark according to the type of application executed in the electronic device 101 or may select a commonly applied identification mark.
  • The identification mark may be determined according to the surrounding environment of the electronic device 101.
  • For example, when the surrounding environment is outdoors where the illuminance is very high, high-contrast processing may be provided for an object corresponding to the main information to improve visibility.
  • In addition, masking processing may be additionally provided for an object corresponding to main information determined to be security information; a minimal sketch of this selection logic follows.
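  • A minimal sketch of the mark-selection logic, under the assumption that the per-app preference map, the illuminance threshold, and the type names are all illustrative rather than taken from the disclosure:

```kotlin
// Candidate identification marks named in the text above.
enum class Mark { HIGHLIGHT, HIGH_CONTRAST, MAGNIFY, LUMINOUS, MASK }

data class MarkContext(
    val appType: String,                            // e.g., "watch", "music", "finance"
    val ambientLux: Float,                          // from an illuminance sensor
    val isSecurityInfo: Boolean,                    // e.g., password or auth code
    val userChoice: Map<String, Mark> = emptyMap()  // per-app user preference
)

// Assumed threshold for "very bright outdoors"; not a value from the patent.
const val OUTDOOR_LUX = 10_000f

fun chooseMarks(ctx: MarkContext): Set<Mark> {
    val marks = mutableSetOf(ctx.userChoice[ctx.appType] ?: Mark.HIGHLIGHT)
    if (ctx.ambientLux >= OUTDOOR_LUX) marks += Mark.HIGH_CONTRAST  // sunlight readability
    if (ctx.isSecurityInfo) marks += Mark.MASK                      // masking for security info
    return marks
}
```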
  • For example, the processor 120 may identify that the operating state of the electronic device 101 is displaying the current time.
  • Based on the identified operating state, the processor 120 may select objects corresponding to the main information, for example, the numbers, the hour hand, and the minute hand, from among the plurality of objects 511, 512, 513, 514, and 515 (e.g., numbers, hour hand, minute hand, second hand, sunset time, etc.) displayed on the display 210, and may control the display 210 to provide an identification indication for the selected objects 521, 522, and 523, as shown in FIG. 5.
  • the electronic device 101 may operate in a digital watch mode, and in this case, a number item representing a time, for example, an hour and a minute, respectively, may be selected as an object corresponding to the main information.
  • the identification display provided in operation 314 may correspond to a display mode previously selected by the user.
  • the user may select an advanced menu 611 from the setting menu 610 of the electronic device 101 to set a display mode corresponding to the identification display.
  • the display mode may include, for example, a nightglow mode in which the selected object is displayed to emit light.
  • the processor 120 may provide identification marks for the selected objects 521 , 522 , and 523 according to the display mode set or selected by the user as described above.
  • As described above, based on a user gesture, the electronic device 101 may selectively display, with high visibility, an object corresponding to main information among a plurality of objects, and may display an appropriately corresponding object more intuitively, so that the user can easily obtain the necessary information.
  • FIG. 7 is a flowchart illustrating a process of deriving main information in an electronic device according to an embodiment of the present invention.
  • FIG. 7 may be a detailed flowchart of an operation of selecting an object corresponding to main information in operation 313 of FIG. 3 .
  • In operation 711, the processor 120 may identify the operating state of the electronic device 101.
  • For example, the processor 120 may identify the operating state of the electronic device 101 based on the detection of a predefined user gesture with respect to a partial region of the display 210.
  • In operation 712, the processor 120 may derive main information based on the operating state identified in operation 711.
  • In operation 713, the processor 120 may select an object corresponding to the main information derived in operation 712.
  • The object selected in this way corresponds to the object selected in operation 313 of FIG. 3 and becomes the target of the identification display in operation 314; a sketch of this derivation logic, keyed on the application type, follows.
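  • A hypothetical Kotlin sketch of operations 711 to 713; the application types and the information derived for each mirror the examples of FIGS. 8 to 15, but the names themselves are illustrative:

```kotlin
// Operating states (running application types) used in the later examples.
enum class AppType { WATCH, MUSIC, VIDEO, CAMERA, ALARM, BROWSER }

// Operation 712: derive main information from the identified operating state.
fun deriveMainInfo(app: AppType): Set<String> = when (app) {
    AppType.WATCH   -> setOf("numbers", "hour hand", "minute hand")
    AppType.MUSIC   -> setOf("progress bar", "title")
    AppType.VIDEO   -> setOf("progress bar", "title")
    AppType.CAMERA  -> setOf("shutter", "timer")
    AppType.ALARM   -> setOf("timer bar", "on/off")
    AppType.BROWSER -> setOf("text in gesture area")
}

// Operation 713: select the displayed objects matching the main information.
fun selectObjects(displayed: List<String>, mainInfo: Set<String>): List<String> =
    displayed.filter { it in mainInfo }
```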
  • FIGS. 8 to 15 show examples in which main information is identified and displayed according to various embodiments of the present disclosure. Specifically, FIGS. 8 to 15 show examples in which main information corresponding to an object selected according to the operation shown in FIGS. 3 and 7 is identified and displayed.
  • the processor 120 may identify the currently running application as the operating state of the electronic device 101 , and may derive main information based on characteristics of the identified application.
  • the characteristics of the application may include, for example, the type of the application, the execution state (execution screen), and the like. According to an embodiment, even in the same application, main information may be derived differently according to an execution state (execution screen).
  • The processor 120 may derive the main information based on the control state (controller) of the running application.
  • For example, when a music playback application (e.g., a music app) is running, a playback progress state and the music (song) title may be derived as main information based on the operating state.
  • The processor 120 may select, as the objects corresponding to the derived main information, the progress bar 811 and the title item 812 from among the plurality of objects 811, 812, and 813 (e.g., a progress bar, a title item, a volume icon, etc.) displayed on the display 210.
  • The processor 120 may control the display 210 to apply the identification mark described in operation 314 of FIG. 3, for example, high contrast, to the thus-selected objects 821 and 822.
  • Similarly, when a video playback application is running, a playback progress state and the video title may be derived as main information based on the operating state. Accordingly, the processor 120 may select the objects corresponding to the derived main information, for example, a progress bar and a title item, from among the plurality of objects displayed on the display 210, and provide an identification mark for the selected objects.
  • As another example, when a camera application is running, a shutter and a timer may be derived as main information based on the operating state.
  • The processor 120 may select, as the objects corresponding to the derived main information, the shutter button 911 and the timer button 912 from among the plurality of objects 911 and 912 (e.g., a shutter button, a timer button, a preview image, etc.) displayed on the display 210. As shown in FIG. 9, the processor 120 may control the display 210 to provide the identification display described in operation 314 of FIG. 3, for example, an enlarged display, for the thus-selected objects 921 and 922.
  • The processor 120 may provide an identification mark for the object corresponding to the main information derived from among a plurality of objects displayed on the display 210, that is, from among menu items, for example, a brightness control bar.
  • the processor 120 may derive the main information based on the user's selection.
  • an application selected by the user may be derived as main information based on the operation state.
  • The processor 120 may select, as the object corresponding to the derived main information, the item 1011 currently selected by the user from among the plurality of objects 1011 and 1012 (e.g., items of recently executed applications) displayed on the display 210. As shown in FIG. 10, the processor 120 may control the display 210 to apply the identification mark described in operation 314 of FIG. 3, for example, a highlight mark, to the object 1021 thus selected.
  • main information may be derived based on a selected tab (eg, a bottom tab).
  • the processor 120 may, for example, extract main information from a web page corresponding to a tab in which a user gesture is detected, among a plurality of tabs constituting a web browser.
  • the processor 120 may derive main information based on a predetermined criterion.
  • For example, when an alarm application (e.g., an alarm widget) is running, timer progress information and on/off information may be derived as main information based on the operating state in operation 712 of FIG. 7.
  • The processor 120 may select the objects corresponding to the derived main information from among the plurality of objects 1111, 1112, and 1113 (e.g., a timer bar, an on/off item, a time item, a day item, etc.) displayed on the display 210.
  • As shown in FIG. 11, the processor 120 may control the display 210 to apply the identification mark described in operation 314 of FIG. 3, for example, a luminous display, to the objects 1121 and 1122 thus selected.
  • the processor 120 may derive main information based on the text of the application being executed.
  • For example, when an application (e.g., a web browser app) displaying text is running, the text of the area where the user gesture is generated may be derived as main information based on the operating state.
  • the processor 120 may select the text 1211 in the area 1210 corresponding to the user gesture as an object corresponding to the main information among the text displayed on the display 210 . As shown in FIG. 12 , the processor 120 may control the display 210 to apply an identification mark, eg, a contrast effect, to the thus-selected object, that is, the text 1211 . Accordingly, it can be confirmed that the text 1221 to which the contrast effect is applied is highlighted compared to the text 1222 to which the contrast effect is not applied.
  • the processor 120 may select a text in the area 1310 corresponding to the user gesture as an object corresponding to the main information among the text displayed on the display 210 .
  • the processor 120 may control the display 210 to apply an identification mark, for example, an enlarged display or a magnifying glass effect, to the selected text. Accordingly, it can be confirmed that the text 1320 to which the enlarged display or the magnifying effect is applied is highlighted.
  • The processor 120 may extract the main information based on a text attribute of the region corresponding to the user gesture. For example, the processor 120 may derive, as main information, a password consisting of numbers or a combination of numbers and letters, or text having link information to a web page as an attribute; a sketch of this attribute-based derivation follows.
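  • A sketch of such attribute-based derivation; the password pattern is an assumed regular expression for illustration only:

```kotlin
// A text run as displayed on screen, optionally carrying link information.
data class TextRun(val text: String, val linkUrl: String? = null)

// Assumed pattern: 6+ alphanumeric characters containing at least one digit,
// approximating "numbers or a combination of numbers and letters".
val passwordLike = Regex("""\b(?=\w*\d)[A-Za-z0-9]{6,}\b""")

fun deriveFromText(runs: List<TextRun>): List<TextRun> =
    runs.filter { run ->
        run.linkUrl != null ||                   // text whose attribute is a web link
        passwordLike.containsMatchIn(run.text)   // password-like token
    }
```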
  • The processor 120 may select the text 1411 in the area 1410 corresponding to the user gesture as the object corresponding to the main information from among the plurality of objects displayed on the display 210.
  • The processor 120 may control the display 210 to apply an identification mark, for example, an enlarged display, to the thus-selected object, that is, the text 1411. Accordingly, it can be confirmed that the text 1421 to which the enlarged display is applied is emphasized.
  • The processor 120 may control the display 210 to provide protection processing (e.g., blur processing, blind processing, masking processing, etc.) for the object 1412 other than the text to which the identification mark is applied, so that it is difficult for the user to visually recognize it. Accordingly, it can be confirmed that the object 1422 to which the protection processing is applied is less visible than the text 1421 to which the enlarged display is applied; a sketch of this protection step follows.
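  • A sketch of the protection step using the Android View API, shown here with alpha-dimming as a portable stand-in for blur, blind, or masking processing (a true blur would need, e.g., RenderEffect on API 31+):

```kotlin
import android.view.View
import android.view.ViewGroup

// Dim every child of the container except the marked (selected) view so the
// remaining objects are hard to read while staying laid out in place.
fun protectOthers(container: ViewGroup, selected: View) {
    for (i in 0 until container.childCount) {
        val child = container.getChildAt(i)
        child.alpha = if (child == selected) 1f else 0.2f
    }
}
```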
  • the processor 120 may select a text 1511 in an area 1510 corresponding to a user gesture as an object corresponding to main information among a plurality of objects displayed on the display 210 .
  • The processor 120 may identify that the selected object, that is, the text 1511, includes link information (a web link) to a web page as its attribute, and may control the display 210 to display a preview 1521 (or a preview image) corresponding to the link.
  • In addition, the processor 120 may control the display 210 to provide protection processing (e.g., blur processing, blind processing, etc.) for objects other than the web link to which the preview is applied, making them difficult for the user to visually recognize. Accordingly, it can be seen that the object 1522 to which the protection processing is applied is less visible than the preview 1521; a sketch of this preview step follows.
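  • A sketch of the preview step, reusing the TextRun type from the earlier sketch; PreviewLoader and Preview are assumed types, not platform APIs:

```kotlin
// Assumed preview abstraction: something that can fetch a thumbnail and a
// title for a web link.
data class Preview(val imageBytes: ByteArray, val title: String)

interface PreviewLoader {
    fun load(url: String): Preview
}

// Show a preview only for text that carries link information as an attribute.
fun linkPreviewFor(text: TextRun, loader: PreviewLoader): Preview? =
    text.linkUrl?.let { url -> loader.load(url) }
```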
  • the processor 120 may control the display 210 to protect objects except for main information based on the characteristics of the application being executed in the electronic device 101 . For example, in a state in which a security application such as a financial app is executed, protection processing may be applied to an object corresponding to information requiring security other than main information.
  • The processor 120 may identify the surrounding environment (or external environment) state as the operating state of the electronic device 101 in operations 711 and 712 of FIG. 7, and may derive the main information based on the identified surrounding environment state.
  • The processor 120 may identify the surrounding environment state by at least one sensor (e.g., an illuminance sensor, a temperature sensor, a humidity sensor, etc.) of the sensor module 176.
  • For example, when the ambient illuminance is equal to or greater than a predetermined reference value, it may correspond to a case in which the visibility of the display 210 is poor due to sunlight outdoors.
  • The processor 120 may control to provide an identification mark for the object corresponding to the main information according to the identified surrounding environment state.
  • In this case, the processor 120 may control the display 210 to provide an identification indication applying high contrast, to improve visibility, for an object corresponding to the main information, such as a button for a user input or text in the area corresponding to the user gesture; a minimal sensor-driven sketch follows.
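  • A minimal sensor-driven sketch using Android's light sensor; the 10,000 lux threshold is an assumption, not a value from the patent:

```kotlin
import android.content.Context
import android.hardware.Sensor
import android.hardware.SensorEvent
import android.hardware.SensorEventListener
import android.hardware.SensorManager

// Watches ambient illuminance and invokes the callback when it suggests
// direct sunlight, so the caller can switch marks to high contrast.
class IlluminanceWatcher(context: Context, private val onVeryBright: () -> Unit) :
    SensorEventListener {

    private val manager =
        context.getSystemService(Context.SENSOR_SERVICE) as SensorManager

    fun start() {
        manager.getDefaultSensor(Sensor.TYPE_LIGHT)?.let { light ->
            manager.registerListener(this, light, SensorManager.SENSOR_DELAY_NORMAL)
        }
    }

    fun stop() = manager.unregisterListener(this)

    override fun onSensorChanged(event: SensorEvent) {
        if (event.values[0] >= 10_000f) onVeryBright()  // lux value from the sensor
    }

    override fun onAccuracyChanged(sensor: Sensor?, accuracy: Int) = Unit
}
```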
  • the processor 120 may identify the location information of the electronic device 101 as surrounding environment information, for example, based on a satellite signal received through the GPS reception module.
  • for example, when the current location of the electronic device 101 is identified as a crowded outdoor public place and a security application such as a financial app is running, the processor 120 may derive account information, a password, a recent transaction history, and the like as the main information.
  • the processor 120 may control the display 210 to provide an identification mark such as a masking process for an object such as a password requiring security among the main information.
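A sketch of the location-identification step feeding the masking decision above, assuming ACCESS_FINE_LOCATION has already been granted; classifying the result as a "crowded outdoor public place" would require additional data and is not shown.

```kotlin
import android.annotation.SuppressLint
import android.content.Context
import android.location.Location
import android.location.LocationManager

// Returns the last GPS fix as surrounding-environment information, or null.
@SuppressLint("MissingPermission") // assumes ACCESS_FINE_LOCATION is granted
fun currentLocation(context: Context): Location? {
    val lm = context.getSystemService(Context.LOCATION_SERVICE) as LocationManager
    return lm.getLastKnownLocation(LocationManager.GPS_PROVIDER)
}
```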
  • the processor 120 may control the display 210 to provide an identification indication by setting, for the object corresponding to the main information, a light-emitting mode in which light does not leak to the outside.
  • the processor 120 may control the display 210 to provide an identification indication for an object corresponding to the main information at a time specified by the user. For example, when the user wants to check the current time as main information at bedtime, the processor 120 may control the display 210 to provide an identification indication for the object indicating the current time when the electronic device 101, operating in a watch mode, reaches the specified time.
  • the time designation described above may be performed through the menu setting described with reference to FIG. 6.
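A sketch of triggering the indication at a user-specified time using a main-thread Handler; `highlightClockObject` is a hypothetical callback that applies the identification indication to the clock object. A production implementation would more likely use AlarmManager so the trigger survives the process being killed.

```kotlin
import android.os.Handler
import android.os.Looper
import java.util.Calendar

fun scheduleTimeHighlight(hour: Int, minute: Int, highlightClockObject: () -> Unit) {
    val now = Calendar.getInstance()
    val target = (now.clone() as Calendar).apply {
        set(Calendar.HOUR_OF_DAY, hour)
        set(Calendar.MINUTE, minute)
        set(Calendar.SECOND, 0)
        if (before(now)) add(Calendar.DAY_OF_MONTH, 1) // roll to the next occurrence
    }
    val delayMs = target.timeInMillis - now.timeInMillis
    // Apply the identification indication when the specified time arrives.
    Handler(Looper.getMainLooper()).postDelayed({ highlightClockObject() }, delayMs)
}
```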
  • the electronic device 101 may further detect a second user gesture that continues from and moves following the user gesture of operation 311 of FIG. 3 (i.e., the first user gesture), and may provide an identification indication for the object based on the movement of the second user gesture.
  • FIG. 16 is a flowchart illustrating a method of providing an identification indication based on the movement of a user gesture in an electronic device according to an embodiment of the present invention.
  • FIGS. 17 and 18 show examples of providing an identification indication based on movement according to an embodiment of the present invention.
  • operations 1611 to 1614 illustrated in FIG. 16 correspond to operations 311 to 314 illustrated in FIG. 3 , respectively. Accordingly, since various embodiments related to operations 311 to 314 of FIG. 3 are also applied to operations 1611 to 1614, a detailed description thereof will be omitted herein.
  • in operation 1615, the processor 120 of the electronic device 101 may detect that a second user gesture, which continues from and moves following the first user gesture detected in operation 1611 (the user gesture detected in operation 311 of FIG. 3), is generated.
  • the processor 120 may identify that the moving second user gesture is occurring in continuation of the first user gesture with respect to a partial area of the display 210, for example, an edge area.
  • the second user gesture may have a movement in a predetermined direction (e.g., rotating to the right or left) while maintaining the shape of the first user gesture in which the user's hand 401 covers a partial area of the display 210, as shown, for example, in FIG. 17.
  • the second user gesture may also have a movement in a predetermined direction (e.g., moving to the right, left, up, down, or diagonally) while maintaining the shape of the first user gesture in which the user's hand 401 covers a partial area (e.g., the lower left corner) of the display 210, as shown, for example, in FIG. 18.
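One way to realize this two-stage detection is an OnTouchListener that records the first gesture's touch-down point and reports subsequent movement as the second gesture. In the sketch below, `onSecondGesture(dx, dy)` is a hypothetical callback, and validating that the first gesture actually covers the edge area (e.g., by touch-area size) is assumed to happen elsewhere.

```kotlin
import android.view.MotionEvent
import android.view.View

class EdgeGestureListener(
    private val onSecondGesture: (dx: Float, dy: Float) -> Unit
) : View.OnTouchListener {

    private var downX = 0f
    private var downY = 0f

    override fun onTouch(v: View, event: MotionEvent): Boolean {
        when (event.actionMasked) {
            MotionEvent.ACTION_DOWN -> { // first user gesture begins
                downX = event.x
                downY = event.y
            }
            MotionEvent.ACTION_MOVE -> { // continuous, moving second gesture
                onSecondGesture(event.x - downX, event.y - downY)
            }
        }
        return true
    }
}
```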
  • the processor 120 may control the display 210 to provide an identification indication for the object in response to the movement of the second user gesture detected in operation 1615.
  • the object for which the identification indication is provided may correspond to the object selected in operation 313 of FIG. 3 or operation 713 of FIG. 7.
  • based on the detection of the first user gesture, the processor 120 may provide an identification indication for the objects 1711 and 1721 that are related to the area where the generation of the first user gesture is identified and that correspond to main information (e.g., a progress bar).
  • based on the detection of the second user gesture moving in succession to the first user gesture, the processor 120 may provide an identification indication for the object 1721 corresponding to the main information in response to the movement of the second user gesture.
  • for example, an identification indication may be provided in which the progress bar 1731 is advanced, i.e., displayed as stretched, to correspond to the movement direction and distance of the second user gesture rotating a predetermined distance to the right (adjusting the timeline interface). Accordingly, the user may intuitively check the feedback for the second user gesture.
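A sketch of this timeline adjustment, mapping the horizontal component of the second gesture's movement onto a SeekBar standing in for the progress bar 1731; `pixelsPerStep` is an arbitrary sensitivity constant.

```kotlin
import android.widget.SeekBar

// Advance (stretch) the progress bar in proportion to horizontal movement.
fun adjustTimeline(progressBar: SeekBar, dx: Float, pixelsPerStep: Float = 20f) {
    val delta = (dx / pixelsPerStep).toInt()
    progressBar.progress = (progressBar.progress + delta)
        .coerceIn(0, progressBar.max)
}
```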
  • the processor 120 may control the display 210 to provide an identification indication for predetermined functional objects 1821 and 1831 related to the areas 1810, 1820, and 1830 of the display 210 corresponding to the first user gesture.
  • the functional objects 1821 and 1831 may include, for example, a status bar representing a function of adjusting the brightness of the display 210, as shown in FIG. 18.
  • FIG. 18 shows one example of a functional object; in the present invention, functional objects related to various functions other than brightness adjustment may also be provided.
  • based on the detection of the second user gesture moving in succession to the first user gesture, the processor 120 may provide an identification indication for the functional objects 1821 and 1831 in response to the movement of the second user gesture.
  • for example, an identification indication may be provided in which the status bar of the functional objects 1821 and 1831 is changed to correspond to the movement direction and distance of the second user gesture moving (e.g., sliding) a predetermined distance to the right.
  • the processor 120 may control the function of the functional objects 1821 and 1831 to be performed in response to the movement of the second user gesture. Accordingly, as shown in FIG. 18, it can be confirmed that the brightness 1832 of the display 210 is adjusted in response to the movement of the status bar of the functional objects 1821 and 1831.
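A sketch of performing the brightness function of FIG. 18, assuming the status-bar position has been normalized to a fraction in [0, 1]; note this adjusts only the current window's brightness, not the system-wide setting.

```kotlin
import android.app.Activity

fun applyBrightness(activity: Activity, fraction: Float) {
    val lp = activity.window.attributes
    lp.screenBrightness = fraction.coerceIn(0f, 1f) // 0.0 = dark, 1.0 = full
    activity.window.attributes = lp // reassign so the change takes effect
}
```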
  • the electronic device according to various embodiments may be a device of various types.
  • the electronic device may include, for example, a portable communication device (eg, a smart phone), a computer device, a portable multimedia device, a portable medical device, a camera, a wearable device, or a home appliance device.
  • the terms "first", "second", or "first or second" may simply be used to distinguish one element from other elements in question, and do not limit the elements in other aspects (e.g., importance or order). When one (e.g., first) component is referred to as being "coupled" or "connected" to another (e.g., second) component, with or without the terms "functionally" or "communicatively", it means that the one component can be connected to the other component directly (e.g., by wire), wirelessly, or through a third component.
  • the term "module" used in various embodiments of this document may include a unit implemented in hardware, software, or firmware, and may be used interchangeably with terms such as logic, logic block, component, or circuit.
  • a module may be an integrally formed part, or a minimum unit or a part thereof that performs one or more functions.
  • the module may be implemented in the form of an application-specific integrated circuit (ASIC).
  • Various embodiments of this document may be implemented as software including one or more instructions stored in a storage medium (e.g., the internal memory 136 or the external memory 138) readable by a machine (e.g., the electronic device 101). For example, the processor (e.g., the processor 120) of the device (e.g., the electronic device 101) may call at least one of the one or more stored instructions from the storage medium and execute it.
  • the one or more instructions may include code generated by a compiler or code executable by an interpreter.
  • the device-readable storage medium may be provided in the form of a non-transitory storage medium.
  • "non-transitory" only means that the storage medium is a tangible device and does not contain a signal (e.g., an electromagnetic wave); this term does not distinguish between a case in which data is semi-permanently stored in the storage medium and a case in which it is temporarily stored.
  • the method according to various embodiments disclosed in this document may be included and provided in a computer program product.
  • Computer program products may be traded between sellers and buyers as commodities.
  • the computer program product may be distributed in the form of a machine-readable storage medium (e.g., a compact disc read-only memory (CD-ROM)), or may be distributed (e.g., downloaded or uploaded) online through an application store (e.g., Play Store™) or directly between two user devices (e.g., smartphones).
  • a portion of the computer program product may be temporarily stored or temporarily generated in a machine-readable storage medium, such as a memory of a manufacturer's server, an application store's server, or a relay server.
  • each component (e.g., a module or a program) of the above-described components may include a singular entity or a plurality of entities, and some of the plurality of entities may be separately disposed in another component.
  • according to various embodiments, one or more of the above-described components or operations may be omitted, or one or more other components or operations may be added.
  • according to various embodiments, a plurality of components (e.g., modules or programs) may be integrated into one component. In this case, the integrated component may perform one or more functions of each of the plurality of components identically or similarly to the way the corresponding component among the plurality of components performed them prior to the integration.
  • according to various embodiments, operations performed by a module, a program, or another component may be executed sequentially, in parallel, repeatedly, or heuristically; one or more of the operations may be executed in a different order or omitted, or one or more other operations may be added.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Databases & Information Systems (AREA)
  • Business, Economics & Management (AREA)
  • General Business, Economics & Management (AREA)
  • Data Mining & Analysis (AREA)
  • User Interface Of Digital Computer (AREA)
  • Electrotherapy Devices (AREA)

Abstract

The present invention relates to an electronic device and a control method therefor. The electronic device comprises: a display; and a processor that identifies an area of the display corresponding to a user gesture on the basis of detecting a predefined user gesture with respect to a partial area of the display, selects, from among a plurality of objects displayed on the display, at least one object that is related to the identified area and that corresponds to main information, and controls the display to provide an identification indication for the selected object.
PCT/KR2022/002074 2021-02-26 2022-02-11 Electronic device and control method therefor WO2022182025A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR10-2021-0026819 2021-02-26
KR1020210026819A KR20220122328A (ko) Electronic device and control method therefor

Publications (1)

Publication Number Publication Date
WO2022182025A1 true WO2022182025A1 (fr) 2022-09-01

Family

ID=83048356

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2022/002074 WO2022182025A1 (fr) Electronic device and control method therefor

Country Status (2)

Country Link
KR (1) KR20220122328A (fr)
WO (1) WO2022182025A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20100041006A (ko) * 2008-10-13 2010-04-22 엘지전자 주식회사 3차원 멀티 터치를 이용한 사용자 인터페이스 제어방법
KR20140131262A (ko) * 2013-05-02 2014-11-12 삼성전자주식회사 스크린을 제어하는 휴대 단말 및 방법
KR101622196B1 (ko) * 2009-09-07 2016-05-18 삼성전자주식회사 휴대용 단말기에서 피오아이 정보 제공 방법 및 장치
KR20160097924A (ko) * 2015-02-10 2016-08-18 엘지전자 주식회사 이동 단말기 및 그 제어방법
KR20170041219A (ko) * 2014-08-12 2017-04-14 마이크로소프트 테크놀로지 라이센싱, 엘엘씨 렌더링된 콘텐츠와의 호버 기반 상호작용


Also Published As

Publication number Publication date
KR20220122328A (ko) 2022-09-02

Similar Documents

Publication Publication Date Title
WO2021075786A1 Electronic device and method for processing a pop-up window using a multi-window thereof
WO2022085885A1 Window control method and electronic device therefor
WO2022031051A1 Method for providing a capture function and electronic device therefor
WO2022030996A1 Electronic device comprising a display and operating method thereof
WO2022019635A1 Electronic device for providing a shared screen and a private screen, and control method therefor
WO2022119276A1 Flexible display electronic device and operating method thereof
WO2022098195A1 Electronic device and method for operating in flexible mode
WO2022108192A1 Electronic device and multi-window control method of the electronic device
WO2022030890A1 Multi-window image capture method and electronic device therefor
WO2022031055A1 Electronic device and method for controlling its vibration output
WO2022097858A1 Electronic device capable of expanding a display region and method for controlling a screen thereof
WO2022025494A1 Electronic device for controlling the luminance of a display device and operating method thereof
WO2021261949A1 Method for use according to the folding state of a display and electronic apparatus using same
WO2022114548A1 Method and apparatus for controlling a user interface of a flexible display
WO2022030998A1 Electronic device comprising a display unit and operating method therefor
WO2022014836A1 Method and apparatus for displaying virtual objects at different brightnesses
WO2022030921A1 Electronic device and method for controlling its screen
WO2022103021A1 Electronic device with flexible display and control method for said device
WO2022086272A1 Electronic device for providing a user interface, and method therefor
WO2022055178A1 Foldable electronic apparatus for displaying various types of content, and operating method therefor
WO2022182025A1 Electronic device and control method therefor
WO2022010092A1 Electronic device for supporting content sharing
WO2023214675A1 Electronic device and method for processing touch input
WO2022119118A1 Electronic device comprising a flexible display and screen control method
WO2022114509A1 Electronic device for presenting a screen with modified visibility according to the extension of a flexible display, and method for controlling said device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22759943

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 22759943

Country of ref document: EP

Kind code of ref document: A1