WO2024019311A1 - Electronic device and method for processing contact of an external object on a display screen - Google Patents

Electronic device and method for processing contact of an external object on a display screen

Info

Publication number
WO2024019311A1
Authority
WO
WIPO (PCT)
Prior art keywords
area
partial
partial region
nodes
external object
Prior art date
Application number
PCT/KR2023/007319
Other languages
English (en)
Korean (ko)
Inventor
이경택
장원일
Original Assignee
삼성전자주식회사
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from KR1020220099276A (published as KR20240013618A)
Application filed by 삼성전자주식회사
Publication of WO2024019311A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/044 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means, by capacitive means
    • G06F3/046 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means, by electromagnetic means

Definitions

  • the descriptions below relate to an electronic device and method for handling contact of an external object on a display panel.
  • the electronic device may include touch circuitry within the display to execute a function in response to a finger or stylus pen touching the display.
  • the touch circuit may include a touch sensor for identifying the contact based on a capacitive method, a resistive method, an infra-red method, an acoustic method, and/or a pressure method, and a control circuit for processing data obtained from the touch sensor.
  • the electronic device may include a display panel including an area capable of receiving a touch input.
  • the electronic device may include a touch circuit including a control circuit and a touch sensor.
  • the electronic device may include a processor.
  • the control circuit may be configured to identify, through the touch sensor, a first partial region in the region in contact with the external object, based at least in part on an external object at least partially in contact with the region.
  • the control circuit may be configured to, based on the first partial region being adjacent to or at least partially overlapping a second partial region with creases within the region and having an area greater than a critical area, provide the processor with data representing at least part of the first partial region and the second partial region as a touch area, so as to recognize the contact of the external object as a touch input on at least a portion of the first partial region and the second partial region.
  • a method is provided.
  • the method may be executed in an electronic device including a touch sensor and a display panel including an area capable of receiving a touch input.
  • the method may include an operation of identifying, through the touch sensor, a first partial region in the region in contact with the external object, based at least in part on an external object at least partially in contact with the region.
  • the method may include an operation of acquiring, based on the first partial region being adjacent to or at least partially overlapping a second partial region with creases within the region and having an area greater than a critical area, data representing at least part of the first partial region and the second partial region as a touch area, so as to recognize the contact of the external object as a touch input on at least a portion of the first partial region and the second partial region.
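The operations summarized above can be sketched in code. The following Python sketch is purely illustrative and is not the patented implementation: the cell-set representation of regions, the function names, and the `critical_area` value are all assumptions made for illustration.

```python
# Illustrative sketch: merging a contact region with an adjacent creased
# region so a touch spanning a fold is reported as one touch area.

def cells_adjacent_or_overlapping(region_a, region_b):
    """True if any cell of region_a overlaps or is 8-adjacent to region_b."""
    for (r, c) in region_a:
        for dr in (-1, 0, 1):
            for dc in (-1, 0, 1):
                if (r + dr, c + dc) in region_b:
                    return True
    return False

def touch_area(first_region, creased_region, critical_area):
    """Return the set of sensor cells reported as the touch area.

    first_region:   cells where the external object is in contact
    creased_region: cells known to have creases (e.g., along a fold)
    critical_area:  minimum contact area (in cells) required for merging
    """
    if (len(first_region) > critical_area
            and cells_adjacent_or_overlapping(first_region, creased_region)):
        # Report the union so the crease does not split the touch.
        return first_region | creased_region
    return first_region

# A palm-sized touch (area 6 > critical area 4) next to a one-column crease
# is merged with the crease; a small fingertip touch is not.
palm = {(2, 0), (2, 1), (3, 0), (3, 1), (4, 0), (4, 1)}
crease = {(2, 2), (3, 2), (4, 2)}
merged = touch_area(palm, crease, critical_area=4)
```

The critical-area check in this sketch keeps small, intentional fingertip touches from being inflated by the crease; only large contacts (e.g., a palm) are merged.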
  • FIG. 1 is a block diagram of an electronic device in a network environment according to various embodiments.
  • FIG. 2 is a block diagram of a display module, according to various embodiments.
  • FIG. 3 is a simplified block diagram of an example electronic device.
  • FIGS. 4 and 5 illustrate examples of states of a display panel of an example electronic device, according to one embodiment.
  • FIG. 6 shows an example of a flat partial region and a wrinkled partial region of a display panel.
  • FIG. 7 shows an example of data identified through a touch sensor based on contact with an external object.
  • FIGS. 8 and 9 illustrate example methods for identifying partial areas with wrinkles.
  • FIGS. 10 and 11 show an example method of processing values obtained through nodes within a partial area of a display panel with wrinkles.
  • FIG. 12 is a flowchart illustrating an example method of providing data representing a first partial area and a second partial area as a touch area.
  • FIG. 13 is a flowchart illustrating an example method of providing data representing a first partial area and a second partial area as a touch area based on the state of the display panel.
  • FIG. 1 is a block diagram of an electronic device 101 in a network environment 100, according to various embodiments.
  • the electronic device 101 may communicate with the electronic device 102 through a first network 198 (e.g., a short-range wireless communication network), or with at least one of the electronic device 104 or the server 108 through a second network 199 (e.g., a long-range wireless communication network). According to one embodiment, the electronic device 101 may communicate with the electronic device 104 through the server 108.
  • the electronic device 101 may include a processor 120, a memory 130, an input module 150, a sound output module 155, a display module 160, an audio module 170, a sensor module 176, an interface 177, a connection terminal 178, a haptic module 179, a camera module 180, a power management module 188, a battery 189, a communication module 190, a subscriber identification module 196, or an antenna module 197.
  • in some embodiments, at least one of these components (e.g., the connection terminal 178) may be omitted, or one or more other components may be added to the electronic device 101.
  • in some embodiments, some of these components (e.g., the sensor module 176, the camera module 180, or the antenna module 197) may be integrated into one component (e.g., the display module 160).
  • the processor 120 may, for example, execute software (e.g., program 140) to control at least one other component (e.g., a hardware or software component) of the electronic device 101 connected to the processor 120, and may perform various data processing or computations. According to one embodiment, as at least part of the data processing or computation, the processor 120 may store commands or data received from another component (e.g., the sensor module 176 or the communication module 190) in volatile memory 132, process the commands or data stored in the volatile memory 132, and store the resulting data in non-volatile memory 134.
  • the processor 120 may include a main processor 121 (e.g., a central processing unit or an application processor) or an auxiliary processor 123 (e.g., a graphics processing unit, a neural processing unit (NPU), an image signal processor, a sensor hub processor, or a communication processor) that can operate independently of, or together with, the main processor 121.
  • for example, when the electronic device 101 includes a main processor 121 and an auxiliary processor 123, the auxiliary processor 123 may be set to use lower power than the main processor 121 or to be specialized for a designated function.
  • the auxiliary processor 123 may be implemented separately from the main processor 121 or as part of it.
  • the auxiliary processor 123 may, for example, control at least some of the functions or states related to at least one of the components of the electronic device 101 (e.g., the display module 160, the sensor module 176, or the communication module 190), on behalf of the main processor 121 while the main processor 121 is in an inactive (e.g., sleep) state, or together with the main processor 121 while the main processor 121 is in an active (e.g., application execution) state.
  • according to one embodiment, the auxiliary processor 123 (e.g., an image signal processor or a communication processor) may be implemented as part of another functionally related component (e.g., the camera module 180 or the communication module 190).
  • the auxiliary processor 123 may include a hardware structure specialized for processing artificial intelligence models.
  • Artificial intelligence models can be created through machine learning. For example, such learning may be performed in the electronic device 101 itself, in which the artificial intelligence model is executed, or may be performed through a separate server (e.g., server 108).
  • Learning algorithms may include, for example, supervised learning, unsupervised learning, semi-supervised learning, or reinforcement learning, but are not limited thereto.
  • An artificial intelligence model may include multiple artificial neural network layers.
  • An artificial neural network may be one of a deep neural network (DNN), a convolutional neural network (CNN), a recurrent neural network (RNN), a restricted Boltzmann machine (RBM), a deep belief network (DBN), a bidirectional recurrent deep neural network (BRDNN), or deep Q-networks, or a combination of two or more of the above, but is not limited to the examples described above.
  • artificial intelligence models may additionally or alternatively include software structures.
  • the memory 130 may store various data used by at least one component (e.g., the processor 120 or the sensor module 176) of the electronic device 101. Data may include, for example, input data or output data for software (e.g., program 140) and instructions related thereto.
  • Memory 130 may include volatile memory 132 or non-volatile memory 134.
  • the program 140 may be stored as software in the memory 130 and may include, for example, an operating system 142, middleware 144, or application 146.
  • the input module 150 may receive commands or data to be used in a component of the electronic device 101 (e.g., the processor 120) from outside the electronic device 101 (e.g., a user).
  • the input module 150 may include, for example, a microphone, a mouse, a keyboard, keys (e.g., buttons), or a digital pen (e.g., a stylus pen).
  • the sound output module 155 may output sound signals to the outside of the electronic device 101.
  • the sound output module 155 may include, for example, a speaker or a receiver. The speaker can be used for general purposes such as multimedia playback or recording playback, and the receiver can be used to receive incoming calls. According to one embodiment, the receiver may be implemented separately from the speaker or as part of it.
  • the display module 160 can visually provide information to the outside of the electronic device 101 (e.g., to a user).
  • the display module 160 may include, for example, a display, a hologram device, or a projector, and a control circuit for controlling the device.
  • the display module 160 may include a touch sensor configured to detect a touch, or a pressure sensor configured to measure the intensity of force generated by the touch.
  • the audio module 170 may convert sound into an electrical signal or, conversely, convert an electrical signal into sound. According to one embodiment, the audio module 170 may acquire sound through the input module 150, or output sound through the sound output module 155 or an external electronic device (e.g., the electronic device 102, such as a speaker or headphones) connected directly or wirelessly to the electronic device 101.
  • the sensor module 176 may detect the operating state (e.g., power or temperature) of the electronic device 101 or an external environmental state (e.g., a user state), and generate an electrical signal or data value corresponding to the detected state.
  • the sensor module 176 may include, for example, a gesture sensor, a gyro sensor, an air pressure sensor, a magnetic sensor, an acceleration sensor, a grip sensor, a proximity sensor, a color sensor, an infrared (IR) sensor, a biometric sensor, a temperature sensor, a humidity sensor, or an illuminance sensor.
  • the interface 177 may support one or more designated protocols that can be used to connect the electronic device 101 directly or wirelessly with an external electronic device (e.g., the electronic device 102).
  • the interface 177 may include, for example, a high definition multimedia interface (HDMI), a universal serial bus (USB) interface, an SD card interface, or an audio interface.
  • the connection terminal 178 may include a connector through which the electronic device 101 can be physically connected to an external electronic device (e.g., the electronic device 102).
  • the connection terminal 178 may include, for example, an HDMI connector, a USB connector, an SD card connector, or an audio connector (e.g., a headphone connector).
  • the haptic module 179 can convert electrical signals into mechanical stimulation (e.g., vibration or movement) or electrical stimulation that the user can perceive through tactile or kinesthetic senses.
  • the haptic module 179 may include, for example, a motor, a piezoelectric element, or an electrical stimulation device.
  • the camera module 180 can capture still images and moving images.
  • the camera module 180 may include one or more lenses, image sensors, image signal processors, or flashes.
  • the power management module 188 can manage power supplied to the electronic device 101.
  • the power management module 188 may be implemented as at least a part of, for example, a power management integrated circuit (PMIC).
  • the battery 189 may supply power to at least one component of the electronic device 101.
  • the battery 189 may include, for example, a non-rechargeable primary battery, a rechargeable secondary battery, or a fuel cell.
  • the communication module 190 may support establishment of a direct (e.g., wired) communication channel or a wireless communication channel between the electronic device 101 and an external electronic device (e.g., the electronic device 102, the electronic device 104, or the server 108), and communication through the established communication channel. The communication module 190 may operate independently of the processor 120 (e.g., an application processor) and may include one or more communication processors that support direct (e.g., wired) communication or wireless communication.
  • according to one embodiment, the communication module 190 may include a wireless communication module 192 (e.g., a cellular communication module, a short-range wireless communication module, or a global navigation satellite system (GNSS) communication module) or a wired communication module 194 (e.g., a local area network (LAN) communication module or a power line communication module).
  • among these communication modules, the corresponding communication module may communicate with the external electronic device 104 through the first network 198 (e.g., a short-range communication network such as Bluetooth, wireless fidelity (WiFi) direct, or infrared data association (IrDA)) or the second network 199 (e.g., a telecommunication network such as a legacy cellular network, a 5G network, a next-generation communication network, the Internet, or a computer network (e.g., LAN or WAN)).
  • the wireless communication module 192 may identify or authenticate the electronic device 101 within a communication network, such as the first network 198 or the second network 199, using subscriber information (e.g., an International Mobile Subscriber Identity (IMSI)) stored in the subscriber identification module 196.
  • the wireless communication module 192 may support a 5G network, after a 4G network, and next-generation communication technology, for example, new radio (NR) access technology.
  • NR access technology may support high-speed transmission of high-capacity data (enhanced mobile broadband (eMBB)), minimization of terminal power and access by multiple terminals (massive machine type communications (mMTC)), or high reliability and low latency (ultra-reliable and low-latency communications (URLLC)).
  • the wireless communication module 192 may support a high frequency band (e.g., a mmWave band), for example, to achieve a high data rate.
  • the wireless communication module 192 may use various technologies to secure performance in a high frequency band, for example, beamforming, massive multiple-input and multiple-output (massive MIMO), full-dimensional MIMO (FD-MIMO), array antenna, analog beamforming, or large scale antenna.
  • the wireless communication module 192 may support various requirements specified in the electronic device 101, an external electronic device (e.g., electronic device 104), or a network system (e.g., second network 199).
  • the wireless communication module 192 may support a peak data rate for realizing eMBB (e.g., 20 Gbps or more), loss coverage for realizing mMTC (e.g., 164 dB or less), or U-plane latency for realizing URLLC (e.g., 0.5 ms or less for each of downlink (DL) and uplink (UL), or a round trip of 1 ms or less).
  • the antenna module 197 may transmit or receive a signal or power to or from the outside (e.g., an external electronic device).
  • according to one embodiment, the antenna module 197 may include an antenna including a radiator made of a conductor or a conductive pattern formed on a substrate (e.g., a PCB).
  • according to one embodiment, the antenna module 197 may include a plurality of antennas (e.g., an array antenna). In this case, at least one antenna suitable for the communication method used in the communication network, such as the first network 198 or the second network 199, may be selected from the plurality of antennas by, for example, the communication module 190. A signal or power may be transmitted or received between the communication module 190 and an external electronic device through the selected at least one antenna.
  • according to some embodiments, other components (e.g., a radio frequency integrated circuit (RFIC)) may be additionally formed as part of the antenna module 197.
  • the antenna module 197 may form a mmWave antenna module.
  • according to one embodiment, a mmWave antenna module may include a printed circuit board, an RFIC disposed on or adjacent to a first side (e.g., the bottom side) of the printed circuit board and capable of supporting a designated high frequency band (e.g., the mmWave band), and a plurality of antennas (e.g., an array antenna) disposed on or adjacent to a second side (e.g., the top or a side) of the printed circuit board and capable of transmitting or receiving signals in the designated high frequency band.
  • at least some of the above components may be connected to each other through a communication method between peripheral devices (e.g., a bus, general purpose input and output (GPIO), serial peripheral interface (SPI), or mobile industry processor interface (MIPI)) and may exchange signals (e.g., commands or data) with each other.
  • commands or data may be transmitted or received between the electronic device 101 and the external electronic device 104 through the server 108 connected to the second network 199.
  • Each of the external electronic devices 102 or 104 may be of the same or different type as the electronic device 101.
  • all or part of the operations performed in the electronic device 101 may be executed in one or more of the external electronic devices 102, 104, or 108.
  • for example, when the electronic device 101 is to perform a function or service automatically, or in response to a request from a user or another device, the electronic device 101 may, instead of or in addition to executing the function or service on its own, request one or more external electronic devices to perform at least part of the function or service.
  • One or more external electronic devices that have received the request may execute at least part of the requested function or service, or an additional function or service related to the request, and transmit the result of the execution to the electronic device 101.
  • the electronic device 101 may process the result, as is or additionally, and provide it as at least part of a response to the request.
  • to this end, for example, cloud computing, distributed computing, mobile edge computing (MEC), or client-server computing technology may be used.
  • the electronic device 101 may provide an ultra-low latency service using, for example, distributed computing or mobile edge computing.
  • the external electronic device 104 may include an Internet of Things (IoT) device.
  • Server 108 may be an intelligent server using machine learning and/or neural networks.
  • the external electronic device 104 or server 108 may be included in the second network 199.
  • the electronic device 101 may be applied to intelligent services (e.g., smart home, smart city, smart car, or healthcare) based on 5G communication technology and IoT-related technology.
  • FIG. 2 is a block diagram 200 of the display device 160 according to various embodiments.
  • the display device 160 may include a display 210 and a display driver IC (DDI) 230 for controlling the display 210 .
  • the DDI 230 may include an interface module 231, a memory 233 (e.g., a buffer memory), an image processing module 235, or a mapping module 237.
  • the DDI 230 may receive, through the interface module 231, image information including image data, or an image control signal corresponding to a command for controlling the image data, from another component of the electronic device 101.
  • for example, the image information may be received from the processor 120 (e.g., the main processor 121, such as an application processor) or the auxiliary processor 123 (e.g., a graphics processing unit) that operates independently of the function of the main processor 121.
  • the DDI 230 can communicate with the touch circuit 250 or the sensor module 176, etc. through the interface module 231.
  • At least a portion of the received image information may be stored, for example, in frame units, in the memory 233.
  • the image processing module 235 may, for example, perform preprocessing or postprocessing (e.g., resolution, brightness, or size adjustment) on at least a portion of the image data, based at least on the characteristics of the image data or the characteristics of the display 210.
  • the mapping module 237 may generate a voltage value or current value corresponding to the image data preprocessed or postprocessed through the image processing module 235.
  • according to one embodiment, the generation of the voltage value or current value may be performed based at least in part on, for example, attributes of the pixels of the display 210 (e.g., the arrangement of the pixels (an RGB stripe or PenTile structure), or the size of each subpixel). At least some pixels of the display 210 may be driven based at least in part on the voltage value or the current value, so that visual information (e.g., text, an image, or an icon) corresponding to the image data may be displayed through the display 210.
  • the display device 160 may further include a touch circuit 250.
  • the touch circuit 250 may include a touch sensor 251 and a touch sensor IC 253 for controlling the touch sensor 251.
  • the touch sensor IC 253 may control the touch sensor 251 to detect a touch input or a hovering input for a specific position of the display 210.
  • for example, the touch sensor IC 253 may detect a touch input or a hovering input by measuring a change in a signal (e.g., a voltage, light amount, resistance, or charge amount) at a specific position of the display 210.
  • the touch sensor IC 253 may provide information (e.g., location, area, pressure, or time) about the detected touch input or hovering input to the processor 120.
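As a rough illustration of the measurement principle described above, the sketch below compares each node's value against a per-node baseline and flags nodes whose signal change exceeds a threshold. The grid layout, baseline values, and threshold are hypothetical assumptions for illustration, not values taken from this document.

```python
# Illustrative sketch: detecting touched nodes from per-node signal changes
# relative to an untouched baseline (e.g., mutual-capacitance counts).

def detect_touch_nodes(measured, baseline, threshold):
    """Return (row, col) positions where the signal change exceeds threshold."""
    touched = []
    for r, row in enumerate(measured):
        for c, value in enumerate(row):
            if abs(value - baseline[r][c]) > threshold:
                touched.append((r, c))
    return touched

baseline = [[100, 100, 100],
            [100, 100, 100]]
measured = [[101, 130, 128],
            [ 99, 127, 102]]
# Nodes whose signal changed by more than 20 counts are treated as touched.
print(detect_touch_nodes(measured, baseline, threshold=20))
# prints [(0, 1), (0, 2), (1, 1)]
```

From the flagged nodes, information such as location and area can then be derived and provided to the processor, as the passage above describes.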
  • according to one embodiment, at least a portion of the touch circuit 250 (e.g., the touch sensor IC 253) may be included as part of the display driver IC 230 or the display 210, or as part of another component disposed outside the display device 160 (e.g., the auxiliary processor 123).
  • according to one embodiment, the display device 160 may further include at least one sensor of the sensor module 176 (e.g., a fingerprint sensor, an iris sensor, a pressure sensor, or an illumination sensor), or a control circuit therefor.
  • in this case, the at least one sensor or a control circuit therefor may be embedded in a part of the display device 160 (e.g., the display 210 or the DDI 230) or a part of the touch circuit 250.
  • for example, when the sensor module 176 embedded in the display device 160 includes a biometric sensor (e.g., a fingerprint sensor), the biometric sensor may acquire biometric information (e.g., a fingerprint image) associated with a touch input through a portion of the display 210. As another example, when the sensor module 176 includes a pressure sensor, the pressure sensor may acquire pressure information associated with a touch input through part or the entire area of the display 210.
  • the touch sensor 251 or the sensor module 176 may be disposed between pixels of a pixel layer of the display 210, or above or below the pixel layer.
  • An electronic device may include a deformable or flexible display panel to provide a larger display area while having enhanced portability.
  • the display panel can be in multiple states.
  • the state of the display panel may be changed from the unfolded state to the folded state, or from the folded state to the unfolded state.
  • the state of the display panel may be changed from a first state, in which at least a portion of the display panel is rolled, to a second state, in which the at least a portion of the display panel is flat, or from the second state to the first state.
  • an area of the display panel that can receive a touch input may include a partial area with creases due to this change in state.
  • a touch input having a contact area larger than the reference area may be used within the electronic device to provide an enhanced user experience.
  • however, when such a touch input contacts the partial area with creases, the identified contact area may be narrower than the reference area, contrary to the user's intention that caused the touch input.
  • FIG. 3 is a simplified block diagram of an example electronic device.
  • the electronic device 300 may include a processor 120 and a display 330.
  • the electronic device 300 may further include a strain gauge sensor arranged with respect to the display 330.
  • the processor 120 may include the main processor 121 and/or the auxiliary processor 123 (eg, the sensor hub processor) shown in FIG. 1 .
  • processor 120 may be operatively coupled to display 330 .
  • display 330 may include display module 160 shown in FIG. 1 or 2 .
  • display 330 may include display 210 shown in FIG. 2 .
  • the display 330 may include a display panel 331 and a touch circuit 332.
  • the display panel 331 may be used to display a screen or image.
  • the display panel 331 may provide an area that can receive touch input.
  • the area may correspond to a display area. However, it is not limited to this.
  • the touch circuit 332 may include a control circuit 333 and a touch sensor 334.
  • the control circuit 333 may be used to control the touch sensor 334.
  • the control circuit 333 may process data on a plurality of values obtained from the touch sensor 334.
  • the control circuit 333 may generate or obtain data to be provided to the processor 120 based on the processing.
  • the control circuit 333 may provide at least part of the data to the processor 120.
  • the touch sensor 334 may obtain or identify the plurality of values through a plurality of nodes in response to an external object that is at least partially in contact with the area.
  • the plurality of nodes of the touch sensor 334 will be illustrated through FIGS. 7, 10, and 11.
  • the external object may include an object and/or a body part that is in contact with the area.
  • the external object may be a user's finger touching the area.
  • the external object may be the blade of the user's hand, the palm of the user's hand, and/or the user's clenched hand.
  • the external object may be a stylus pen associated with the electronic device 300 that is in contact with the area.
  • the display 330 may include a flexible display (or flexible display panel).
  • the display 330 may include a slidable display or a rollable display (or a slidable display panel or a rollable display panel).
  • the electronic device 300 may include a foldable type electronic device.
  • the display 330 (or display panel 331), which is the foldable display, may have a plurality of states.
  • the plurality of states may include a first state and a second state, which will be illustrated below.
  • the plurality of states can be illustrated through FIGS. 4 and 5.
  • FIGS. 4 and 5 illustrate examples of states of a display panel of an example electronic device, according to one embodiment.
  • the display panel 331 may include a display area corresponding to an area capable of receiving a touch input.
  • the display area may include a first display area 431 disposed or formed on the first housing 410 of the electronic device 300, a second display area 432 disposed or formed on the second housing 420 of the electronic device 300, and a third display area 433 between the first display area 431 and the second display area 432.
  • the third display area 433 may be disposed across a folding housing (e.g., folding housing 565 in FIG. 5) including a hinge structure (not shown in FIGS. 4 and 5) for providing the plurality of states.
  • the display panel 331 may be in state 400, which is the first state among the plurality of states.
  • state 400 may be an unfolded state in which the second housing 420 is fully folded out relative to the first housing 410 by the folding housing (e.g., folding housing 565 shown in FIG. 5).
  • the state 400 may be a state in which the first direction 401 in which the first display area 431 faces is substantially parallel to the second direction 402 in which the second display area 432 faces.
  • the state 400 may be a state in which the first display area 431, the second display area 432, and the third display area 433 are arranged on one virtual plane.
  • state 400 may be a state in which the angle 403 between the first display area 431 and the second display area 432 is about 180 degrees. However, it is not limited to this.
  • the display panel 331 may be in the second state among the plurality of states.
  • the display panel 331 may be in state 500, which is the second state.
  • state 500 may be a folded state in which the second housing 420 is completely folded in by the folding housing 565 with respect to the first housing 410 .
  • state 500 may be a state in which the first direction 401 is substantially opposite to the second direction 402 .
  • state 500 may be a state in which the first display area 431 substantially overlaps the second display area 432 .
  • state 500 may be a state in which the angle 503 between the first display area 431 and the second display area 432 is approximately 0 degrees. However, it is not limited to this.
  • display panel 331 can be bent by rotation provided through folding housing 565.
  • the third display area 433 may be curved within state 500.
  • the third display area 433 may be in a curved state in state 500 to prevent damage to the display panel 331 .
  • it is not limited to this.
  • the plurality of states may include one or more intermediate states between the state 400 and the state 500 .
  • the processor 120 may identify which state among the plurality of states the display panel 331 is in, or a change of that state, through one or more sensors included in the electronic device 300.
  • the one or more sensors may include a Hall sensor in the electronic device 300, an acceleration sensor in the electronic device 300, a gyro sensor in the electronic device 300, a rotation sensor in the folding housing (e.g., the folding housing 565 in FIG. 5), an illumination sensor in the electronic device 300, and/or a proximity sensor in the electronic device 300.
  • the state of the display panel 331 may be changed from the first state to another state among the plurality of states (e.g., the second state or the one or more intermediate states).
  • the state of the display panel 331 may change from the second state to another state among the plurality of states (e.g., the first state or the one or more intermediate states).
  • the state of the display panel 331 may change from an intermediate state between the first state and the second state to another state (e.g., the first state, the second state, or another intermediate state distinct from that intermediate state).
  • the third display area 433 may include creases due to a change in the state of the display panel 331.
  • the third display area 433, unlike the first display area 431 and the second display area 432, may include a partial area with wrinkles.
  • the third display area 433, unlike the first display area 431 and the second display area 432, may include the partial area, which is spaced apart from an external object in contact with the display panel 331.
  • the partial area can be illustrated through FIG. 6.
  • Figure 6 shows an example of a flat partial area and a wrinkled partial area of a display panel.
  • the first display area 431 and the second display area 432 may be substantially flat, while the third display area 433 may include a partial region 600 having wrinkles due to the change in the state of the display panel 331.
  • when the external object 601 is located on the first display area 431 or the second display area 432, the external object 601 may be in contact with the first display area 431 and the second display area 432, or may be spaced less than a certain distance from the first display area 431 and the second display area 432.
  • when the external object 601 is located on the third display area 433, the external object 601 may be spaced apart from the partial area 600 by a certain distance or more.
  • a space 610 created according to a wrinkle may be formed between the external object 601 and the area 600.
  • although the user intended a touch input (e.g., a palm touch input), the external object 601 being spaced apart from the partial area 600 may result in providing a different response that is distinct from the response to the intended touch input.
  • the control circuit 333 in the touch circuit 332 may obtain a plurality of values through the plurality of nodes of the touch sensor 334, based at least in part on an external object 601 at least partially in contact with the area of the display panel 331 capable of receiving a touch input.
  • the plurality of nodes may be evenly included within the area.
  • the plurality of nodes may be spaced apart from each other at certain intervals.
  • the plurality of nodes may be included in the area as one or more patterns.
  • at least some of the plurality of values may represent a state of the area that is at least partially changed according to the contact of the external object.
  • at least some of the plurality of values may represent the strength of the contact of the external object.
  • the control circuit 333 may obtain a plurality of values through the plurality of nodes based at least in part on the external object 601 located on the partial region 600. For example, based on at least some of the plurality of values, the control circuit 333 may identify data indicating a touch input having a contact area narrower than the contact area corresponding to the user's intention, for the external object 601 located on at least a portion of the partial area 600. This identification can be illustrated through FIG. 7.
  • Figure 7 shows an example of data identified through a touch sensor based on contact with an external object.
  • the control circuit 333, based at least in part on an external object at least partially in contact with the first display area 431 (or the second display area 432), may identify the first partial area 701 within the area that is in contact with the external object. For example, the control circuit 333 may identify the first partial area 701 by identifying values greater than 300 among the plurality of values obtained through the plurality of nodes based at least in part on the external object. For example, the area of the first partial area 701 may be larger than the reference area. For example, the control circuit 333 may provide data representing the first partial area 701 as a touch area to the processor 120.
  • the processor 120, based at least in part on the data representing the first partial area 701, may recognize the contact of the external object as a touch input (hereinafter referred to as a palm touch input) having a contact area larger than the reference area.
  • the processor 120 may obtain a response to the palm touch input based on the contact of the external object.
  • the processor 120 may provide the response.
  • the processor 120 may provide the response by changing at least a portion of the screen displayed on the display panel 331.
  • the response is not limited thereto.
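  • The palm-touch classification described above can be sketched in Python as follows, assuming a 2D grid of per-node values. The helper name `classify_contact` and the reference area of 6 nodes are hypothetical; the per-node threshold of 300 follows the FIG. 7 description.

```python
import numpy as np

TOUCH_THRESHOLD = 300  # per-node value treated as "contacted" (FIG. 7 description)
REFERENCE_AREA = 6     # hypothetical reference area, measured in node counts

def classify_contact(node_values):
    """Return ('palm', mask) when the contacted region exceeds the
    reference area, otherwise ('touch', mask)."""
    mask = node_values >= TOUCH_THRESHOLD
    kind = "palm" if int(mask.sum()) > REFERENCE_AREA else "touch"
    return kind, mask
```

In this sketch the contact area is approximated by the count of nodes at or above the threshold; an actual controller would additionally require those nodes to form one connected region.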
  • the control circuit 333, based at least in part on an external object at least partially in contact with the third display area 433, may identify the partial area in contact with the external object as a first partial area 751 and a second partial area 752 within the area.
  • the external object may be located at least partially on a partial region having wrinkles (eg, partial region 600 in FIG. 6).
  • the external object may be located at least partially on the partial area for the palm touch input.
  • the control circuit 333 may identify the first partial region 751 and the second partial region 752 by identifying values of 300 or more among the plurality of values obtained through the plurality of nodes based at least in part on the external object.
  • the area of each of the first partial region 751 and the second partial region 752 may be smaller than the reference area.
  • the third partial area 753 between the first partial area 751 and the second partial area 752 may at least partially correspond to the space 610 .
  • the nodes in the third partial region 753 may obtain values less than 300, based at least in part on the external object.
  • because the nodes in the third partial area 753 obtain the values less than 300, the control circuit 333 may fail to identify data representing the palm touch input corresponding to the user's intent.
  • the control circuit 333 may provide the processor 120 with data representing each of the first partial area 751 and the second partial area 752 as a touch area.
  • the processor 120, based at least in part on the data representing each of the first partial area 751 and the second partial area 752 as a touch area, may recognize the contact of the external object as a touch input (e.g., a touch input from two fingers) having two partial areas, each with a contact area smaller than the reference area.
  • the processor 120 may obtain a response based on the contact of the external object.
  • processor 120 may provide the response. Since the response is different from the user request corresponding to the response to the palm touch input, the user may perceive the response as a malfunction of the electronic device 300.
  • the electronic device 300 may identify the third partial area 753 (or partial area 600 in FIG. 6) that is not recognized as a touch input. For example, for a touch input located at least partially on the third partial area 753 (or the partial area 600 in FIG. 6), the electronic device 300 may execute processing that is distinct from the processing for a touch input located on the first display area 431 and the second display area 432.
  • the above examples describe the electronic device 300 as a foldable type device, but the electronic device 300 may be a rollable type device.
  • the partial region having wrinkles may be located within at least a portion of the rollable region.
  • the partial region having wrinkles within the area may be identified through obtaining self-capacitance values through the touch sensor 334, or through obtaining values through the strain gauge sensor. For example, identifying the partial region through obtaining self-capacitance values can be illustrated through FIG. 8. For example, identifying the partial region through obtaining values through the strain gauge sensor can be illustrated through FIG. 9.
  • Figures 8 and 9 illustrate example methods for identifying partial areas with wrinkles.
  • a plurality of self-capacitance values may be obtained through the touch sensor 334.
  • the control circuit 333 or the processor 120 may identify self-capacitance values that are less than a threshold value among the plurality of self-capacitance values. For example, because a self-capacitance value below the threshold may indicate that a wrinkle is located on the node that obtained it, the control circuit 333 or the processor 120 may identify the partial region corresponding to the nodes that obtained self-capacitance values below the threshold value as the partial region having wrinkles.
  • the threshold value may change according to a change in the size of the wrinkle.
  • information about the partial region having wrinkles may be stored in the electronic device 300 (eg, non-volatile memory 134). However, it is not limited to this.
  • obtaining the plurality of self-capacitance values and identifying the self-capacitance values may be executed while at least a portion of the display panel 331 is rolled. However, it is not limited to this.
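  • The self-capacitance check described above can be sketched as follows. The threshold value of 100.0 and the helper name are hypothetical; per the text, a real threshold would change with the size of the wrinkle.

```python
import numpy as np

SELF_CAP_THRESHOLD = 100.0  # hypothetical; would track the wrinkle size in practice

def find_wrinkled_nodes(self_cap_values):
    """Mark nodes whose self-capacitance falls below the threshold as
    lying on a crease; the resulting mask can be persisted (e.g., in
    non-volatile memory) for later touch processing."""
    return np.asarray(self_cap_values) < SELF_CAP_THRESHOLD
```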
  • the electronic device 300 may include a strain gauge sensor 900, which is the strain gauge sensor.
  • strain gauge sensor 900 may be located within the area of display panel 331 capable of receiving touch input.
  • the strain gauge sensor 900 may be located within the third display area 433, which may include the partial area with wrinkles. However, it is not limited to this.
  • the strain gauge sensor 900 may include a sensing unit 901 and a terminal unit 902.
  • the sensing unit 901 may have a strain sensitive pattern, as shown in FIG. 9 .
  • the terminal unit 902 may be used to identify or measure the resistance value of the sensing unit 901.
  • the control circuit 333 or the processor 120 may identify the resistance value of the sensing unit 901 through the terminal unit 902.
  • the control circuit 333 or the processor 120 may identify the partial region having wrinkles through the resistance value. For example, as in state 930, when the area where the strain gauge sensor 900 is located is flat, the control circuit 333 or the processor 120 may obtain a first resistance value through the strain gauge sensor 900.
  • for example, when the area where the strain gauge sensor 900 is located is convex, the control circuit 333 or the processor 120 may obtain a second resistance value greater than the first resistance value. For example, as in state 990, when the area where the strain gauge sensor 900 is located is compressed or the area is concave, the control circuit 333 or the processor 120 may obtain, through the strain gauge sensor 900, a third resistance value smaller than the first resistance value. For example, the control circuit 333 or the processor 120 may identify the partial region containing the location where the second resistance value and/or the third resistance value was obtained as the partial region having a wrinkle.
  • the control circuit 333 or the processor 120 may identify a partial region that includes a position where the resistance changed from the first resistance value to the second resistance value and/or from the first resistance value to the third resistance value as the partial region having wrinkles.
  • the difference between the first resistance value and the second resistance value, and the difference between the first resistance value and the third resistance value, may change depending on changes in the degree to which the region is convex and the degree to which the region is concave, respectively. However, it is not limited to this.
  • information about the partial region having wrinkles may be stored in the electronic device 300 (eg, non-volatile memory 134). However, it is not limited to this.
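  • The strain-gauge check described above can be sketched as follows. The baseline resistance, tolerance, and helper name are hypothetical: a resistance above the flat baseline (convex region) or below it (compressed or concave region) beyond the tolerance marks the gauge position as wrinkled.

```python
FLAT_RESISTANCE = 120.0  # hypothetical first resistance value, measured when flat
TOLERANCE = 5.0          # hypothetical deviation treated as indicating a crease

def is_wrinkled(measured_resistance):
    """A second resistance value (greater, convex) or third resistance
    value (smaller, concave) deviating from the flat baseline by more
    than the tolerance indicates a wrinkle at the gauge position."""
    return abs(measured_resistance - FLAT_RESISTANCE) > TOLERANCE
```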
  • based at least in part on an external object at least partially in contact with the area of the display panel 331 capable of receiving a touch input, the control circuit 333 may identify, through the touch sensor 334, the first partial area within the area where the external object is in contact.
  • the control circuit 333 may obtain a plurality of values through the plurality of nodes of the touch sensor 334 based at least in part on the external object.
  • the control circuit 333 may identify the first partial area by identifying first nodes that have obtained values greater than or equal to a threshold value (e.g., 300 in the description of FIG. 7) among the plurality of nodes that have obtained the plurality of values.
  • the values obtained through the first nodes may be referred to as first values.
  • the control circuit 333, based at least in part on the first partial region being adjacent to or at least partially overlapping a second partial region having wrinkles within the area (e.g., partial region 600 in FIG. 6 and/or third partial region 753 in FIG. 7) and having an area larger than a critical area, may provide the processor 120 with data representing the first partial area and the second partial area as a touch area, for recognizing the contact of the external object as a touch input made on the first partial area and the second partial area.
  • information about the second partial region may be obtained and stored in the electronic device 300 before the contact of the external object.
  • the control circuit 333 may identify that the first partial region is adjacent to or at least partially overlaps the second partial region, based at least in part on the information.
  • the control circuit 333 may identify that the area of the first partial region is larger than the critical area.
  • the critical area may be narrower than the reference area.
  • the critical area may be a parameter for identifying whether the external object is contacted for the palm touch input.
  • the area of the first partial region adjacent to or at least partially overlapping the second partial region being larger than the critical area may indicate that the external object is in contact for the palm touch input.
  • the area of the first partial region adjacent to or at least partially overlapping the second partial region being equal to or narrower than the critical area may indicate that the external object is in contact for a touch input distinct from the palm touch input.
  • in some examples, the area of the first partial region may be narrower than the critical area.
  • in such examples, the control circuit 333 may bypass the comparison between the area of the first partial region and the critical area, and obtain the data representing the first partial area and the second partial area as a touch area based on the relative positional relationship between the first partial area and the second partial area.
  • based at least in part on the first partial region being adjacent to or at least partially overlapping the second partial region and having the area larger than the critical area, the control circuit 333 may provide the data representing the first partial area and the second partial area as a touch area to the processor 120.
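  • The adjacency-and-area decision described above can be sketched as follows, with regions modeled as sets of node coordinates. The function name, the 4-neighbour adjacency test, and measuring area as a node count are assumptions, not part of the disclosure.

```python
def merge_touch_area(first_region, wrinkle_region, critical_area):
    """first_region / wrinkle_region are sets of (row, col) node
    coordinates. If the contacted region is adjacent to (or overlaps)
    the stored wrinkle region and its area exceeds the critical area,
    both regions are reported together as one touch area."""
    def adjacent_or_overlapping(a, b):
        # Manhattan distance <= 1 covers overlap and 4-neighbour adjacency.
        return any(abs(r1 - r2) + abs(c1 - c2) <= 1
                   for (r1, c1) in a for (r2, c2) in b)
    if len(first_region) > critical_area and \
            adjacent_or_overlapping(first_region, wrinkle_region):
        return first_region | wrinkle_region
    return set(first_region)
```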
  • the processor 120 may recognize the contact of the external object as a touch input made in contact with the first partial area and the second partial area.
  • the processor 120 may obtain a response based on the result of the recognition.
  • processor 120 may provide the response.
  • the processor 120 may provide the response by at least partially changing the state of the screen displayed on the display panel 331.
  • the processor 120 may provide the response by stopping output of a notification (eg, sound) from the electronic device 300.
  • the processor 120 may provide the response by executing a predetermined function corresponding to the touch input.
  • the data may be obtained based on compensating values obtained through second nodes located within the second partial area, among the plurality of values obtained through the plurality of nodes in response to the external object.
  • values obtained through the second nodes may be referred to as second values. Compensation of the second values can be illustrated through FIG. 10.
  • FIG. 10 shows an example method of processing values obtained through nodes within a partial region of a display panel with wrinkles.
  • the control circuit 333 may obtain a plurality of values, such as state 1000, based at least in part on the external object through the plurality of nodes.
  • the control circuit 333 may identify a first partial area 1010 in which the first nodes, which have obtained the first values greater than or equal to the threshold value (e.g., 400), are located, among the plurality of nodes that have obtained the plurality of values.
  • the control circuit 333 may identify that the first partial region 1010 is adjacent to the second partial region 1020.
  • the control circuit 333 may identify that the area of the first partial region 1010 is larger than the critical area.
  • the control circuit 333 may, based at least in part on the first values, compensate the second values obtained through the nodes within the second partial region 1020, among the plurality of nodes that obtained the plurality of values.
  • the control circuit 333 may change the second values to third values that are greater than the second values, based at least in part on the first values.
  • the control circuit 333 may identify or obtain, based on the change, data representing at least a portion of the first partial region 1010 and the second partial region 1020 as the touch region 1040.
  • the data may be used to recognize the contact of the external object as a touch input contacted on at least a portion of the first partial region 1010 and the second partial region 1020.
  • the control circuit 333 may obtain a plurality of values, such as state 1060, based at least in part on the external object through the plurality of nodes.
  • the external object may be located on the first display area 431 and the second display area 432, and may be located above the third display area 433.
  • the control circuit 333 may identify the first partial area 1061, in which the first nodes that have obtained the first values greater than or equal to the threshold value (e.g., 400) are located within the first display area 431, among the plurality of nodes that have obtained the plurality of values.
  • the control circuit 333 may identify a third partial area 1063, in which the third nodes that have obtained third values greater than or equal to the threshold value (e.g., 400) are located within the second display area 432, among the plurality of nodes that have obtained the plurality of values.
  • the control circuit 333 may identify that a partial region 1062 within the region having wrinkles is located between the first partial region 1061 and the third partial region 1063.
  • the control circuit 333 may identify that the sum of the area of the first partial region 1061 and the area of the third partial region 1063 is greater than the critical area.
  • in response to the identification, the control circuit 333 may compensate, based at least in part on the first values and the third values, the second values obtained through the nodes within the second partial region 1062, among the plurality of nodes that obtained the plurality of values. For example, as in state 1090, the control circuit 333 may change the second values to fourth values that are greater than the second values, based at least in part on the first values and the third values.
  • the fourth values may be obtained by performing interpolation between the first values and the third values, based on the second values.
  • the control circuit 333 may identify or obtain data representing the first partial region 1061, at least a portion of the second partial region 1062, and the third partial region 1063 as the touch region 1095.
  • the data may be used to recognize the contact of the external object as a touch input made on the first partial region 1061, the at least part of the second partial region 1062, and the third partial region 1063.
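  • The compensation described above can be sketched for one row of nodes as follows, assuming linear interpolation, which is one plausible reading of the FIG. 10 description; the helper name and array representation are hypothetical.

```python
import numpy as np

def compensate_row(values, wrinkle_cols):
    """Replace the depressed second values measured under the crease
    with fourth values interpolated from the first/third values on
    either side (one row of nodes)."""
    out = np.asarray(values, dtype=float).copy()
    cols = np.arange(out.size)
    keep = np.array([c not in wrinkle_cols for c in cols])
    # Linear interpolation across the wrinkled columns only.
    out[~keep] = np.interp(cols[~keep], cols[keep], out[keep])
    return out
```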
  • the data may be obtained based on a first threshold value (i.e., the threshold value) applied to the first values (and/or the third values) obtained through the first nodes (and/or the third nodes) in response to the external object, and a second threshold value applied to the second values obtained through the second nodes in response to the external object.
  • the second threshold may be smaller than the first threshold. Applying the second threshold can be illustrated through FIG. 11.
  • FIG. 11 shows an example method of processing values obtained through nodes within a partial region of a display panel with wrinkles.
  • the control circuit 333 may obtain a plurality of values, as in state 1000, through the plurality of nodes based at least in part on the external object. For example, the control circuit 333 may identify the first partial region 1010 in which the first nodes, which have obtained the first values greater than or equal to the first threshold value (e.g., 400), are located among the plurality of nodes that have obtained the plurality of values. For example, the control circuit 333 may identify that the first partial region 1010 is adjacent to the second partial region 1020. For example, the control circuit 333 may identify that the area of the first partial region 1010 is larger than the critical area.
  • in response to the identification, the control circuitry 333 may identify the second values obtained through the second nodes within the second partial area 1020, based on the second threshold value (e.g., 220) that is less than the first threshold value.
  • the control circuit 333 may identify at least a portion of the second partial area 1020 that includes at least a portion of the second nodes that obtained, among the second values, values equal to or greater than the second threshold value.
  • in response to the identification, the control circuitry 333 may identify or obtain data representing at least a portion of the first partial area 1010 and the second partial area 1020 as a touch area 1110.
  • the data may be used to recognize the contact of the external object as a touch input contacted on at least a portion of the first partial region 1010 and the second partial region 1020.
  • the control circuit 333 may obtain a plurality of values, such as state 1060, based at least in part on the external object through the plurality of nodes.
  • the external object may be located on the first display area 431 and the second display area 432, and may be located above the third display area 433.
  • the control circuit 333 may identify, among the plurality of nodes that have obtained the plurality of values, the first partial area 1061 in which the first nodes that have obtained the first values greater than or equal to the first threshold value (e.g., 400) are located, and the third partial area 1063 in which the third nodes that have obtained the third values greater than or equal to the first threshold value (e.g., 400) are located.
  • the control circuit 333 may identify that a partial region 1062 within the region having wrinkles is located between the first partial region 1061 and the third partial region 1063. For example, the control circuit 333 may identify that the sum of the area of the first partial region 1061 and the area of the third partial region 1063 is greater than the critical area. For example, in response to the identification, the control circuitry 333 may identify the second values obtained through the second nodes within the second partial area 1062, based on the second threshold value (e.g., 220) that is less than the first threshold value. For example, the control circuit 333 may identify at least a portion of the second partial area 1062 that includes at least a portion of the second nodes that obtained, among the second values, values equal to or greater than the second threshold value.
  • in response to the identification, the control circuitry 333 may identify or obtain data representing the first partial region 1061, the at least a portion of the second partial region 1062, and the third partial region 1063 as the touch area 1160.
  • the data may be used to recognize the contact of the external object as a touch input made on the first partial region 1061, the at least part of the second partial region 1062, and the third partial region 1063.
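  • The two-threshold scheme described above can be sketched as follows. The values 400 and 220 follow the FIG. 11 description; the per-node formulation and helper name are assumptions.

```python
import numpy as np

FIRST_THRESHOLD = 400   # applied outside the wrinkled region (FIG. 11 description)
SECOND_THRESHOLD = 220  # smaller threshold inside the wrinkled region (FIG. 11 description)

def touch_mask(values, wrinkle_mask):
    """Per-node contact decision: an ordinary node must reach the first
    threshold, while a node under a crease only needs to reach the
    second, smaller one."""
    values = np.asarray(values)
    thresholds = np.where(np.asarray(wrinkle_mask), SECOND_THRESHOLD, FIRST_THRESHOLD)
    return values >= thresholds
```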
  • the control circuit 333 may execute processing of a partial region within the region having wrinkles (e.g., partial region 600 in FIG. 6, third partial region 753 in FIG. 7, second partial region 1020 in FIG. 10, second partial region 1062 in FIG. 10, second partial region 1020 in FIG. 11, and/or second partial region 1062 in FIG. 11) differently depending on the state of the display panel 331.
  • for example, under the condition that the angle between the first direction (e.g., first direction 401) and the second direction (e.g., second direction 402) is outside the reference range, or based on the display panel 331 being in the first state among the plurality of states, the control circuit 333 may provide the data representing the first partial area and the second partial area as the touch area to the processor 120.
  • the control circuit 333 may, based on the display panel 331 being in a third state between the first state and the second state among the plurality of states, in which the angle is within the reference range, provide the processor 120 with data representing only the first partial area, among the first partial area and the second partial area, as the touch area.
  • the reference range may correspond to the angle at which reception of the palm touch input is deactivated within the electronic device 300. However, it is not limited to this.
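  • The state-dependent gating described above can be sketched as follows. The reference-range endpoints of 60 and 120 degrees and the function name are invented for illustration; the disclosure does not specify the range.

```python
def palm_touch_merging_enabled(angle_deg, reference_range=(60.0, 120.0)):
    """Hypothetical gate: merged palm-touch data (first + second partial
    areas) is provided only when the angle between the two display
    directions lies outside the reference range; inside the range,
    reception of the palm touch input is deactivated."""
    lo, hi = reference_range
    return not (lo <= angle_deg <= hi)
```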
  • control circuit 333 performs compensation of the second values and/or application of the second threshold;
  • the above application may be executed by the processor 120.
  • the processor 120 in response to contact with the external object, obtains a plurality of values obtained through the plurality of nodes from the control circuit 333, and in response to the acquisition, the second Compensation of values and/or application of the second threshold may be performed.
  • although an example is shown in which the control circuit 333 provides the processor 120 with data indicating at least a portion of the first partial area and the second partial area as a touch area based on the first partial area being adjacent to the second partial area, this is for convenience of explanation.
  • the control circuit 333 may provide the processor 120 with the data representing at least part of the first partial region and the second partial region as the touch area, based on identifying that the first partial region, initially away from the second partial region, is moved toward the second partial region.
  • the control circuit 333 may provide the data based on identifying that the first partial region is moved toward the second partial region within a reference time from the timing at which the first partial region was identified.
  • the control circuit 333 may provide the data based further on the speed at which the first partial region moves toward the second partial region. However, it is not limited to this.
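The movement-based condition above can be illustrated with a small sketch. All specifics here (tracking the region by its centroid x-coordinate, the crease position, and the value of the reference time) are illustrative assumptions; the disclosure does not fix a data format or a concrete reference time.

```python
def moving_toward_crease(positions, crease_x, reference_time_s=0.3):
    """Return True if the first partial region moves toward the crease within
    the reference time from when it was first identified.

    positions: list of (timestamp_s, centroid_x) samples for the first partial
    region, starting at the moment the region was identified. The sample
    format and reference_time_s are assumptions, not values from the disclosure.
    """
    t0, x0 = positions[0]
    for t, x in positions[1:]:
        if t - t0 > reference_time_s:
            break  # samples after the reference time are not considered
        if abs(x - crease_x) < abs(x0 - crease_x):
            return True  # the region got closer to the crease in time
    return False
```

A speed-based variant (as the last bullet suggests) could additionally require that the distance decrease per unit time exceed some threshold.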
  • FIG. 12 is a flowchart illustrating an example method of providing data representing a first partial area and a second partial area as a touch area.
  • the method may be executed by the electronic device 101 in FIG. 1, the processor 120 in FIG. 1, the touch sensor IC 253 in FIG. 2, the electronic device 300 in FIG. 3, the processor 120 in FIG. 3, and/or the control circuit 333 in FIG. 3.
  • the control circuit 333 may identify, based at least in part on an external object at least partially in contact on an area of the display panel 331 capable of receiving a touch input, the first partial region within the area that the external object has contacted.
  • the control circuit 333 may identify the first partial area where first nodes are located, the first nodes being nodes that obtained the first values greater than or equal to the threshold value among the plurality of nodes of the touch sensor 334 that acquire the plurality of values in response to the contact of the external object.
  • the first partial area may be wider than the threshold area. However, it is not limited to this.
  • the control circuit 333 may identify whether the first partial region is adjacent to a second partial region, which is a partial region within the predetermined region, or whether the first partial region at least partially overlaps the second partial region.
  • the second partial region may be a partial region within the region having wrinkles.
  • the control circuit 333 may execute operation 1205 based at least in part on the first partial region being adjacent to or at least partially overlapping the second partial region, and may execute operation 1207 based at least in part on the first partial region being spaced apart from the second partial region.
  • the control circuit 333 may, on the condition that the first partial region is adjacent to the second partial region or the first partial region at least partially overlaps the second partial region, provide the processor 120 with data representing at least a portion of the first partial region and the second partial region as a touch area. For example, the control circuit 333 may obtain the data by compensating the second values obtained through the second nodes in the second partial region based on the first values obtained through the first nodes in the first partial region. For example, the control circuit 333 may obtain the data by identifying, among the second values obtained through the second nodes in the second partial region, values greater than or equal to a second threshold value that is less than the first threshold value applied to identify the first values. The control circuit 333 may provide the data to the processor 120.
  • the control circuit 333 may provide the processor 120 with data indicating the first partial area as a touch area under the condition that the first partial area is separated from the second partial area. For example, the control circuit 333 may refrain from executing, or may bypass, processing related to the second partial area, based at least in part on the first partial area being separated from the second partial area by a certain distance or more. However, it is not limited to this.
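The decision flow of FIG. 12 (identify the first partial region, test adjacency/overlap with the creased second partial region, then either include crease nodes via a lower threshold or report only the first region) can be sketched as follows. The grid layout of nodes, the 4-neighbor adjacency test, and the concrete threshold values are assumptions for illustration; the disclosure does not prescribe them.

```python
def resolve_touch_area(values, first_threshold, second_threshold, crease_nodes):
    """Return the set of nodes reported to the processor as the touch area.

    values: dict mapping node coordinate (x, y) -> sensed value
    crease_nodes: nodes inside the creased second partial region
    """
    # Identify the first partial region: nodes at or above the first
    # threshold, excluding nodes that lie inside the creased region.
    first_region = {n for n, v in values.items()
                    if v >= first_threshold and n not in crease_nodes}

    # Operation 1203: is the first partial region adjacent to (or overlapping)
    # the creased second partial region? Here "adjacent" = sharing a grid edge.
    def adjacent(a, b):
        (ax, ay), (bx, by) = a, b
        return abs(ax - bx) + abs(ay - by) == 1

    touches_crease = any(adjacent(n, c) for n in first_region
                         for c in crease_nodes)

    if touches_crease:
        # Operation 1205: re-examine crease nodes with the lower second
        # threshold and report the union as the touch area.
        second_region = {n for n in crease_nodes
                         if values[n] >= second_threshold}
        return first_region | second_region

    # Operation 1207: report only the first partial region; processing of the
    # creased region is skipped (bypassed).
    return first_region
```

With this sketch, a palm spanning the crease is reported as one contiguous touch area, while a touch far from the crease skips crease processing entirely.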
  • through the processing of the second partial area having wrinkles, the electronic device 300 can improve the performance of recognizing a touch input (e.g., the palm touch input) having an area larger than the reference area.
  • FIG. 13 is a flowchart illustrating an example method of providing data representing a first partial area and a second partial area as a touch area based on the state of the display panel.
  • the method may be executed by the electronic device 101 in FIG. 1, the processor 120 in FIG. 1, the touch sensor IC 253 in FIG. 2, the electronic device 300 in FIG. 3, the processor 120 in FIG. 3, and/or the control circuit 333 in FIG. 3.
  • the control circuit 333 may identify, based at least in part on an external object at least partially in contact on an area of the display panel 331 capable of receiving a touch input, the first partial region within the area that the external object has contacted.
  • the control circuit 333 may identify the first partial area where first nodes are located, the first nodes being nodes that obtained the first values greater than or equal to the threshold value among the plurality of nodes of the touch sensor 334 that acquire the plurality of values in response to the contact of the external object.
  • the first partial area may be wider than the threshold area. However, it is not limited to this.
  • the control circuit 333 may identify whether the angle between the first direction (e.g., first direction 401) and the second direction (e.g., second direction 402) is within a reference range. For example, identifying whether the angle is within the reference range may be replaced with identifying whether the angle 403 between the first display area 431 and the second display area 432 is within the reference range.
  • Figure 13 shows an example in which operation 1303 is executed after operation 1301, but this is for convenience of explanation. Operation 1303 may be executed in parallel with operation 1301 or may be executed before operation 1301.
  • the identification in operation 1303 may be performed based on a predetermined signal obtained from processor 120.
  • the predetermined signal may indicate that the angle is within the reference range.
  • the predetermined signal may be provided from processor 120 to control circuit 333 in response to processor 120's identification that the angle is within the reference range.
  • the predetermined signal may include data about the angle and be provided from the processor 120 to the control circuit 333 based on a predetermined period. However, it is not limited to this.
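The angle gate described above (operation 1303) can be sketched as a simple predicate. The numeric bounds of the reference range are an illustrative assumption; the disclosure only says the range may correspond to angles at which palm touch reception is deactivated.

```python
def crease_processing_enabled(angle_degrees, reference_range=(0.0, 30.0)):
    """Gate for operations 1305/1307: crease-region processing runs only while
    the angle between the two display directions is OUTSIDE the reference
    range. Within the range (e.g., while palm-touch reception is deactivated),
    only the first partial region is reported (operation 1309).

    reference_range bounds are hypothetical values for illustration.
    """
    low, high = reference_range
    return not (low <= angle_degrees <= high)
```

The processor could evaluate this on each periodic angle report and send the predetermined signal to the control circuit when the result changes.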
  • control circuit 333 may execute operation 1305 based on the angle outside the reference range and execute operation 1309 based on the angle within the reference range.
  • the control circuit 333 may, on the condition that the angle is outside the reference range, identify whether the first partial area is adjacent to the second partial area or the first partial area at least partially overlaps the second partial area.
  • operation 1305 may correspond to operation 1203 of FIG. 12 .
  • the control circuit 333 may execute operation 1307 based at least in part on the first partial region being adjacent to or at least partially overlapping the second partial region, and may execute operation 1309 based at least in part on the first partial area being spaced apart from the second partial area.
  • the control circuit 333 may, on the condition that the first partial region is adjacent to the second partial region or the first partial region at least partially overlaps the second partial region, provide the processor 120 with data representing at least a portion of the first partial region and the second partial region as a touch area.
  • operation 1307 may correspond to operation 1205 of FIG. 12 .
  • in operation 1309, the control circuit 333 may provide the processor 120 with data representing the first partial area as a touch area, on the condition that the first partial area is spaced apart from the second partial area or on the condition that the angle is within the reference range. For example, operation 1309 may correspond to operation 1207 of FIG. 12.
  • the electronic device 300 may perform different operations depending on the state of the display panel 331. Through this execution, the electronic device 300 can save resources used for processing the second partial region having wrinkles. For example, through this execution, the electronic device 300 can enhance the performance of recognizing a touch input (eg, the palm touch input) having an area larger than the reference area.
  • the electronic device 300 may include a display panel 331 including an area capable of receiving a touch input.
  • the electronic device 300 may include a touch circuit 332 including a control circuit 333 and a touch sensor 334.
  • the electronic device 300 may include a processor 120.
  • the control circuit 333 may be configured to identify, through the touch sensor 334, based at least in part on an external object at least partially in contact with the area, a first partial area (1010; 1061) in the area in contact with the external object.
  • the control circuit 333 may be configured to, based at least in part on the first partial area (1010; 1061) being adjacent to or at least partially overlapping the second partial area (1020; 1062) in the region having creases, and having an area larger than the threshold area, provide the processor 120 with data representing the at least part of the first partial area (1010; 1061) and the second partial area (1020; 1062) as touch areas (1040; 1095; 1110; 1160), in order to recognize the contact of the external object as a touch input contacted on at least a part of the first partial area (1010; 1061) and the second partial area (1020; 1062).
  • the display panel 331 may include a first display area 431, a second display area 432, and a third display area 433 between the first display area and the second display area, and may be changeable between a plurality of states including a first state 400 and a second state 500 in which the first direction 401 is opposite to the second direction 402.
  • the second partial area 1062 may be located within the third display area 433.
  • the first partial area 1061 may be located within the first display area 431.
  • the control circuit 333 may be configured to identify, based at least in part on the external object at least partially in contact with the area, the first partial region 1061 in contact with the external object and a third partial area 1063, located within the second display area 432, in contact with the external object.
  • the control circuit 333 may be configured to, based at least in part on identifying that the second partial region 1062 is located between the first partial region 1061 and the third partial region 1063 and that the sum of the area of the first partial region 1061 and the area of the third partial region 1063 is greater than the threshold area, provide the processor 120 with the data representing, as the touch area 1160, the first partial region 1061, the at least portion of the second partial region 1062, and the third partial region 1063, in order to recognize the contact of the external object as a touch input contacted on the first partial region 1061, at least a portion of the second partial region 1062, and the third partial region 1063.
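The merge condition in the bullet above reduces to a small predicate: if the creased region sits between the two contacted regions and their combined area exceeds the threshold area, all three are reported as one touch area. Measuring area as a node count and the threshold value are illustrative assumptions.

```python
def merge_across_crease(area1, area3, crease_between, threshold_area):
    """Decide whether the regions on either side of the crease are one touch.

    area1, area3: areas (e.g., node counts) of the first and third partial
    regions; crease_between: True if the creased second partial region lies
    between them. threshold_area is a hypothetical tuning value.
    """
    # Report a single merged touch area only when the crease separates the
    # two contacts AND their summed area exceeds the threshold area.
    return crease_between and (area1 + area3) > threshold_area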
  • the touch sensor 334 may include a plurality of nodes within the area.
  • the plurality of nodes include first nodes located within the first partial area 1061, second nodes located within the second partial area 1062, and the third partial area It may include third nodes located within 1063.
  • the control circuit 333 may be configured to obtain the data by processing the second values, obtained through the second nodes based on the contact of the external object, based on first values obtained through the first nodes and third values obtained through the third nodes based on the contact of the external object.
  • the control circuit 333 may be configured to provide the data to the processor 120.
  • the control circuit 333 may be configured to process the second values by using the first values and the third values to change the second values to fourth values that are larger than the second values.
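One plausible way to realize the bullet above is to boost the attenuated crease readings toward the mean level of the readings on either side. The disclosure does not fix a formula; the mean-ratio gain below is purely an illustrative assumption.

```python
def compensate_crease_values(second_values, first_values, third_values):
    """Return 'fourth values': the crease (second) readings raised toward the
    level of the first/third readings flanking the crease.

    The gain formula (ratio of mean flank level to mean crease level) is a
    hypothetical choice, not taken from the disclosure.
    """
    # Target level: average of the mean readings on the two sides of the crease.
    target = (sum(first_values) / len(first_values)
              + sum(third_values) / len(third_values)) / 2.0
    mean_second = sum(second_values) / len(second_values)
    if mean_second >= target:
        return list(second_values)  # already at flank level; no boost needed
    gain = target / mean_second
    return [v * gain for v in second_values]
```

After compensation, the crease nodes can pass the same threshold as the flanking nodes, so the three regions register as one contiguous contact.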
  • the plurality of nodes include first nodes located within the first partial area 1061, second nodes located within the second partial area 1062, and the third partial area It may include third nodes located within 1063.
  • the control circuit 333 may be configured to, in response to the contact of the external object, obtain the data by identifying first values obtained through the first nodes and third values obtained through the third nodes based on a first threshold value, and by identifying second values obtained through the second nodes based on a second threshold value that is smaller than the first threshold value.
  • the control circuit 333 may be configured to provide the data to the processor 120.
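The per-region threshold selection in the two bullets above can be sketched directly: crease nodes (second partial region) are compared against the lower second threshold, while nodes in the first and third partial regions use the first threshold. The node labels and threshold values are illustrative assumptions.

```python
def identify_touch_nodes(readings, region_of, first_threshold, second_threshold):
    """Return the set of nodes counted as touched.

    readings: node -> sensed value; region_of: node -> 1, 2, or 3, naming the
    partial region the node lies in (2 = creased second partial region).
    """
    touched = set()
    for node, value in readings.items():
        # Crease nodes use the smaller second threshold, compensating for the
        # weaker readings under the creased area.
        threshold = second_threshold if region_of[node] == 2 else first_threshold
        if value >= threshold:
            touched.add(node)
    return touched
```

Choosing the second threshold low enough lets the crease nodes join the touch area, while keeping the first threshold unchanged elsewhere avoids false touches on the flat regions.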
  • the second partial region 1062 may be identified based at least in part on self-capacitance values.
  • the second partial area 1062 may be identified based at least in part on values obtained through at least one strain gauge sensor located within the third display area 433.
  • information representing the second partial region 1062 may be stored in the electronic device 300 before the external object at least partially touches the region.
  • the control circuit 333 may be configured to provide the data to the processor 120 based further on the display panel 331 being in the first state 400 among the plurality of states.
  • the plurality of states may include a third state, between the first state 400 and the second state 500, in which the angle between the first direction 401 and the second direction 402 is within a reference range.
  • the control circuit 333 may be configured to, based on the display panel 331 in the third state, provide the processor 120 with data representing the first partial area 1061, of the first partial area 1061 and the second partial area 1062, as the touch area, in order to recognize the contact of the external object as a touch input contacted on the first partial area 1061.
  • the plurality of nodes may include first nodes located within the first partial area (1010; 1061) and second nodes located within the second partial area (1020; 1062).
  • the control circuit 333 may be configured to obtain the data by compensating the second values, obtained through the second nodes based on the contact of the external object, based at least in part on first values obtained through the first nodes based on the contact of the external object.
  • the control circuit 333 may be configured to provide the data to the processor 120.
  • the control circuit 333 may be configured to, in response to the contact of the external object, obtain the data by identifying first values obtained through the first nodes based on a first threshold value, and by identifying second values obtained through the second nodes based on a second threshold value less than the first threshold value. According to one embodiment, the control circuit 333 may be configured to provide the data to the processor 120.
  • the processor 120 may be configured to identify a response to the touch of the external object based on the data. According to one embodiment, the processor 120 may be configured to provide the response.
  • the processor 120 may be configured to provide the response by changing at least a portion of the screen displayed on the display panel 331.
  • the electronic device 300 may include a housing containing components of the electronic device.
  • the display panel 331 may include the area including a display area that can be rolled into the housing.
  • the second partial area 1020 may be located within the display area.
  • the method, executed within the electronic device 300 including the touch sensor 334 and the display panel 331 including an area capable of receiving a touch input, may include an operation of identifying, through the touch sensor 334, based at least in part on an external object at least partially in contact on the area, a first partial area (1010; 1061) within the area in which the external object is in contact.
  • the method may include an operation of acquiring data representing the at least part of the first partial area (1010; 1061) and the second partial area (1020; 1062) as a touch area (1040; 1095; 1110; 1160), in order to recognize the contact of the external object as a touch input contacted on at least a part of the first partial area (1010; 1061) and the second partial area (1020; 1062), based at least in part on the first partial area (1010; 1061) being adjacent to or at least partially overlapping the second partial area (1020; 1062) in the region having creases, and having an area larger than the threshold area.
  • the method may include an operation of identifying a response to the contact of the external object based on the data. According to one embodiment, the method may include an operation of providing the response.
  • the operation of providing the identified response may include the operation of changing at least a portion of the screen displayed on the display panel 331.
  • Electronic devices may be of various types.
  • Electronic devices may include, for example, portable communication devices (e.g., smartphones), computer devices, portable multimedia devices, portable medical devices, cameras, wearable devices, or home appliances.
  • Electronic devices according to embodiments of this document are not limited to the above-described devices.
  • terms such as first, second, or first or second may be used simply to distinguish one element from another, and do not limit such elements in other respects (e.g., importance or order).
  • When one (e.g., first) component is referred to as "coupled" or "connected" to another (e.g., second) component, with or without the terms "functionally" or "communicatively", it means that the component can be connected to the other component directly (e.g., wired), wirelessly, or through a third component.
  • The term "module" used in various embodiments of this document may include a unit implemented in hardware, software, or firmware, and may be used interchangeably with terms such as logic, logic block, component, or circuit. A module may be an integrally formed part, or a minimum unit of the part or a portion thereof, that performs one or more functions. For example, according to one embodiment, the module may be implemented in the form of an application-specific integrated circuit (ASIC).
  • Various embodiments of the present document may be implemented as software (e.g., program 140) including one or more instructions stored in a storage medium (e.g., built-in memory 136 or external memory 138) that can be read by a machine (e.g., electronic device 101). For example, the machine may invoke at least one of the stored one or more instructions and execute it under the control of a processor (e.g., processor 120).
  • the one or more instructions may include code generated by a compiler or code that can be executed by an interpreter.
  • A storage medium that can be read by a machine may be provided in the form of a non-transitory storage medium. Here, "non-transitory" only means that the storage medium is a tangible device and does not contain signals (e.g., electromagnetic waves); this term does not distinguish between cases where data is semi-permanently stored in the storage medium and cases where it is temporarily stored.
  • Computer program products are commodities and can be traded between sellers and buyers.
  • a computer program product may be distributed in the form of a machine-readable storage medium (e.g., compact disc read only memory (CD-ROM)), or may be distributed (e.g., downloaded or uploaded) online through an application store (e.g., Play Store) or directly between two user devices (e.g., smart phones).
  • at least a portion of the computer program product may be at least temporarily stored or temporarily created in a machine-readable storage medium, such as the memory of a manufacturer's server, an application store's server, or a relay server.
  • each component (e.g., module or program) of the above-described components may include a single entity or plural entities, and some of the plural entities may be separately disposed in other components.
  • one or more of the components or operations described above may be omitted, or one or more other components or operations may be added.
  • multiple components (e.g., modules or programs) may be integrated into one component. In this case, the integrated component may perform one or more functions of each of the plurality of components in the same or similar manner as performed by the corresponding component of the plurality of components prior to the integration.
  • operations performed by a module, program, or other component may be executed sequentially, in parallel, repeatedly, or heuristically; one or more of the operations may be executed in a different order or omitted, or one or more other operations may be added.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • General Physics & Mathematics (AREA)
  • Electromagnetism (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention provides a method. The method may be executed in an electronic device comprising a touch sensor and a display panel comprising a region capable of receiving a touch input. The method may comprise a step of identifying, through the touch sensor, based at least in part on an external object at least partially in contact with the region, a first partial region within the region with which the external object is in contact. The method may comprise a step of obtaining data indicating, as a touch region, at least a part of the first partial region and a second partial region having creases within the region, in order to recognize the contact of the external object as a touch input contacted on the at least part of the first partial region and the second partial region, based at least in part on the first partial region being adjacent to or at least partially overlapping the second partial region and having an area larger than a threshold area.
PCT/KR2023/007319 2022-07-22 2023-05-26 Dispositif électronique et procédé de traitement de contact d'objet externe sur un écran d'affichage WO2024019311A1 (fr)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
KR10-2022-0091369 2022-07-22
KR20220091369 2022-07-22
KR10-2022-0099276 2022-08-09
KR1020220099276A KR20240013618A (ko) 2022-08-09 Electronic device and method for processing contact of an external object on a display panel

Publications (1)

Publication Number Publication Date
WO2024019311A1 true WO2024019311A1 (fr) 2024-01-25

Family

ID=89618140

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2023/007319 WO2024019311A1 (fr) 2022-07-22 2023-05-26 Dispositif électronique et procédé de traitement de contact d'objet externe sur un écran d'affichage

Country Status (1)

Country Link
WO (1) WO2024019311A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20140046178A (ko) * 2012-10-10 2014-04-18 한국과학기술원 Flexible display device and touch correction method in a flexible display device
KR101521219B1 (ko) * 2008-11-10 2015-05-18 엘지전자 주식회사 Mobile terminal using a flexible display and control method thereof
JP2015141596A (ja) * 2014-01-29 2015-08-03 京セラ株式会社 Portable device, touch position correction method, and program
KR20150092588A (ko) * 2014-02-05 2015-08-13 삼성전자주식회사 Method and apparatus for controlling display of a flexible display in an electronic device
KR20180026024A (ko) * 2016-09-01 2018-03-12 삼성디스플레이 주식회사 Flexible display device and driving method thereof


Similar Documents

Publication Publication Date Title
WO2022085885A1 (fr) Procédé de commande de fenêtre et dispositif électronique associé
WO2022097857A1 (fr) Dispositif électronique et procédé d'affichage d'image sur un écran souple
WO2022114416A1 (fr) Dispositif électronique pour fournir une multifenêtre en utilisant un écran extensible
WO2021261949A1 (fr) Procédé d'utilisation selon l'état de pliage d'un afficheur et appareil électronique l'utilisant
WO2022030804A1 (fr) Dispositif électronique pliable pour commander la rotation d'un écran, et son procédé de fonctionnement
WO2022119311A1 (fr) Dispositif électronique comprenant un écran flexible, et procédé de fonctionnement associé
WO2022030921A1 (fr) Dispositif électronique, et procédé de commande de son écran
WO2022103021A1 (fr) Dispositif électronique à affichage flexible et procédé de commande dudit dispositif
WO2022030890A1 (fr) Procédé de capture d'image à fenêtres multiples et dispositif électronique associé
WO2024019311A1 (fr) Dispositif électronique et procédé de traitement de contact d'objet externe sur un écran d'affichage
WO2024101704A1 (fr) Dispositif pouvant être porté et procédé d'identification d'entrée tactile et support de stockage lisible par ordinateur non transitoire
WO2022103010A1 (fr) Dispositif électronique et son procédé de fonctionnement
WO2022177105A1 (fr) Dispositif électronique à affichage transparent et procédé de fonctionnement dudit dispositif
WO2024019295A1 (fr) Procédé de fourniture de source d'alimentation électrique, et dispositif électronique pour exécuter un procédé
WO2024063368A1 (fr) Dispositif électronique et procédé de commande d'entrée tactile
WO2024019300A1 (fr) Dispositif électronique et procédé de détection de fixation d'un dispositif d'entrée d'utilisateur
WO2022108402A1 (fr) Procédé de fonctionnement d'écran souple, et dispositif électronique
WO2023106830A1 (fr) Dispositif électronique prenant en charge un mode de fonctionnement à une seule main et procédé de fonctionnement du dispositif électronique
WO2023063584A1 (fr) Dispositif électronique pour identifier un état en utilisant un capteur
WO2022114648A1 (fr) Dispositif électronique de paramétrage d'un écran d'arrière-plan et procédé de fonctionnement dudit dispositif
WO2022098038A1 (fr) Appareil électronique à affichage extensible
WO2023096221A1 (fr) Appareil électronique et procédé permettant de détecter une entrée tactile d'un appareil électronique
WO2023287057A1 (fr) Dispositif électronique permettant de rapidement mettre à jour un écran lorsqu'une entrée est reçue en provenance d'un dispositif périphérique
WO2022103156A1 (fr) Dispositif électronique comprenant un afficheur flexible et son procédé d'utilisation
WO2022050627A1 (fr) Dispositif électronique comprenant un affichage souple et procédé de fonctionnement de celui-ci

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23843168

Country of ref document: EP

Kind code of ref document: A1