WO2021025456A1 - Control method based on touch input and electronic device therefor - Google Patents

Control method based on touch input and electronic device therefor

Info

Publication number
WO2021025456A1
Authority
WO
WIPO (PCT)
Prior art keywords
touch
electronic device
touch input
signal
type touch
Prior art date
Application number
PCT/KR2020/010322
Other languages
English (en)
Korean (ko)
Inventor
고승훈
유대현
Original Assignee
삼성전자 주식회사
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from KR1020190095215A external-priority patent/KR102719976B1/ko
Application filed by 삼성전자 주식회사 filed Critical 삼성전자 주식회사
Priority to CN202080052624.1A priority Critical patent/CN114144749B/zh
Priority to EP20850608.9A priority patent/EP3982240B1/fr
Priority to US16/975,532 priority patent/US11294496B2/en
Publication of WO2021025456A1 publication Critical patent/WO2021025456A1/fr

Links

Images

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/016Input arrangements with force or tactile feedback as computer generated output to the user
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0416Control or interface arrangements specially adapted for digitisers
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures

Definitions

  • the present disclosure relates to a method of operation based on a touch input and an electronic device thereof.
  • A wearable electronic device, for example, a wrist-worn electronic device such as a smart watch, has a display for displaying content.
  • a display provided in a wearable electronic device generally includes a touch-sensitive display, a so-called touch screen display.
  • Because wearable electronic devices have a smaller screen than typical smartphones or tablets, touch input is restricted, and a mechanical or electronic input means, such as a crown or wheel, may be additionally provided to assist such touch input.
  • the touch screen display may include a display panel (eg, an OLED panel) for outputting content and a touch panel for recognizing a touch input.
  • a touch panel detects a change in capacitance to recognize a touch input and determine its coordinates.
  • The touch panel can distinguish a touch input by direct contact from a touch input by proximity (eg, hovering) based on the magnitude of the capacitance change.
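The magnitude-based distinction described above can be sketched as a simple threshold comparison. This is an illustrative sketch only: the two threshold values and the label names are assumptions made for the example, not values from the patent.

```python
# Hypothetical classification of a capacitance change into direct touch,
# hover, or no input. Threshold values are illustrative assumptions.

DIRECT_TOUCH_THRESHOLD = 80   # assumed delta for direct contact
HOVER_THRESHOLD = 20          # assumed delta for proximity (hovering)

def classify_capacitance_delta(delta: int) -> str:
    """Classify one channel's capacitance change by its magnitude."""
    if delta >= DIRECT_TOUCH_THRESHOLD:
        return "direct_touch"   # large change: finger touching the panel
    if delta >= HOVER_THRESHOLD:
        return "hover"          # smaller change: finger near the panel
    return "no_input"           # below noise floor
```

In a real controller the thresholds would be calibrated per panel and per channel; the point here is only that one measured quantity, compared against two levels, yields the two input classes plus a rejection case.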
  • the wearable electronic device uses a metal frame that forms the side surface of the wearable electronic device as an antenna to support cellular network communication, but if a rotatable wheel is disposed on the metal frame, the radiation performance of the antenna may be degraded. In addition, simply removing the wheel reduces the number of means to assist touch input. According to an embodiment of the present disclosure, an electronic device and method capable of accurately selecting an object displayed on a display using a touch input from an outer area or a bezel area of a display may be provided.
  • An electronic device according to an embodiment includes a housing; a touch panel and a display disposed inside the housing; a cover window disposed on the touch panel and including a first region corresponding to the touch panel and a second region corresponding to an outer area of the touch panel, the first region including a first sub-region corresponding to an inner area of the touch panel and a second sub-region corresponding to an outer area of the touch panel; and a processor operatively coupled to the touch panel and the display.
  • The processor may provide a user interface through the display, obtain a signal generated by a touch input from the touch panel while the user interface is provided, and determine the area where the touch input first occurred based on the signal. In response to the touch input first occurring in the first sub-region, the processor may determine a touch input greater than a first threshold value to be a first type touch and, based on the user interface and the first type touch, execute an event corresponding to the first type touch.
  • In response to the touch input first occurring in the second sub-region or the second region, the processor may sense whether the touch input includes movement of touch coordinates and, in response to the touch input including movement of touch coordinates, determine the touch input to be a second type touch distinguished from the first type touch and execute an event corresponding to the second type touch based on the user interface and the second type touch.
  • A method according to an embodiment is a method of controlling an electronic device including a cover window, a touch panel, and a display. The method includes providing a user interface through the display, obtaining a signal generated by a touch input from the touch panel while the user interface is provided, and determining the area where the touch input first occurred based on the signal.
  • In response to the touch input first occurring in a first sub-region corresponding to an inner area of the touch panel, a touch input greater than a first threshold value may be determined to be a first type touch, and an event corresponding to the first type touch may be executed based on the user interface and the first type touch.
  • In response to the touch input first occurring in a second sub-region corresponding to an outer area of the touch panel or in a second region of the cover window, the method senses whether the touch input includes movement of touch coordinates. In response to the touch input including movement of touch coordinates, the touch input may be determined to be a second type touch distinguished from the first type touch, and an event corresponding to the second type touch may be executed based on the user interface and the second type touch.
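The region-based determination above can be sketched as follows. This is an illustrative sketch, not the claimed implementation: the region labels, the threshold value, and the returned type names are assumptions introduced for the example.

```python
# Hypothetical sketch of the claimed touch-type determination: the region
# where the touch first occurs selects the classification rule.

from dataclasses import dataclass

FIRST_THRESHOLD = 50  # assumed signal-strength threshold for a first type touch

@dataclass
class TouchSample:
    region: str    # "first_sub", "second_sub", or "second" (assumed labels)
    strength: int  # signal magnitude reported by the touch sensor IC
    moved: bool    # whether the touch coordinates moved after touch-down

def determine_touch_type(sample: TouchSample) -> str:
    if sample.region == "first_sub":
        # Inner area of the touch panel: a strong-enough touch is a
        # normal (first type) touch that selects an on-screen object.
        if sample.strength > FIRST_THRESHOLD:
            return "first_type"
        return "ignored"
    # Outer sub-region of the panel or bezel (second) region: a touch whose
    # coordinates move becomes a wheel-like second type touch.
    if sample.moved:
        return "second_type"
    return "pending"
```

The design point is that the same raw signal is interpreted differently depending on where the touch begins, so a drag starting at the bezel can scroll the interface without being mistaken for a tap on displayed content.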
  • One or more features selected from any one embodiment described in the present disclosure may be combined with one or more features selected from any other embodiment described in the present disclosure, or may be substituted for such features.
  • Two or more physically separate components may alternatively be integrated into a single component; the integration is possible if the same function is performed by the single component thus formed.
  • a single component of any embodiment described in the present disclosure may alternatively be implemented as two or more separate components that achieve the same function, as appropriate.
  • the electronic device may accurately select an object displayed on the display of the electronic device through a touch input input to an outer area of the display or a bezel area.
  • the electronic device may allow the user to more accurately control the electronic device by providing a wheel touch that allows the user's finger to manipulate a user interface displayed on the display without covering the display.
  • the user may control the electronic device according to the user's intention by touching the outer area of the cover window instead of directly touching the inside of the display with a finger.
  • FIG. 1 illustrates an electronic device in a network environment, according to an embodiment.
  • FIG. 2 illustrates an electronic device according to an embodiment.
  • FIG. 3 is a diagram illustrating a region where a touch input occurs in the electronic device and a cross section of the electronic device, according to an exemplary embodiment.
  • FIG. 4A is an enlarged view of area B of FIG. 3.
  • FIG. 4B is a graph illustrating characteristics of signals generated in an outermost channel of a touch panel according to an exemplary embodiment.
  • FIG. 4C is a graph comparing a signal generated in an outermost channel of a touch panel with a signal generated in a channel adjacent to the outermost channel, according to an exemplary embodiment.
  • FIG. 5 illustrates a second type of touch initiated from a second area of a cover window in an electronic device according to an exemplary embodiment.
  • FIG. 6 is a diagram illustrating an operation of a second type touch started in a second sub-region 221-2 of a cover window in an electronic device according to an exemplary embodiment.
  • FIG. 7 illustrates haptic feedback corresponding to a second type of touch according to an embodiment.
  • FIG. 8 is a flowchart illustrating a method of executing an event corresponding to a touch input of an electronic device according to an exemplary embodiment.
  • FIG. 9 is a flowchart of determining a type of a touch input sensed from a touch panel in an electronic device according to an exemplary embodiment.
  • FIG. 1 is a block diagram of an electronic device 101 in a network environment 100 according to an exemplary embodiment.
  • The electronic device 101 may communicate with the electronic device 102 through a first network 198 (eg, a short-range wireless communication network), or may communicate with the electronic device 104 or the server 108 through a second network 199 (eg, a long-range wireless communication network).
  • the electronic device 101 may communicate with the electronic device 104 through the server 108.
  • The electronic device 101 may include a processor 120, a memory 130, an input device 150, a sound output device 155, a display device 160, an audio module 170, a sensor module 176, an interface 177, a haptic module 179, a camera module 180, a power management module 188, a battery 189, a communication module 190, a subscriber identification module 196, or an antenna module 197.
  • at least one of these components may be omitted or one or more other components may be added to the electronic device 101.
  • some of these components may be implemented as one integrated circuit.
  • The processor 120 may, for example, execute software (eg, a program 140) to control at least one other component (eg, a hardware or software component) of the electronic device 101 connected to the processor 120, and may perform various data processing or operations. According to an embodiment, as at least part of the data processing or operations, the processor 120 may store commands or data received from other components (eg, the sensor module 176 or the communication module 190) in the volatile memory 132, process the commands or data stored in the volatile memory 132, and store result data in the nonvolatile memory 134.
  • The processor 120 may include a main processor 121 (eg, a central processing unit or an application processor) and a secondary processor 123 (eg, a graphics processing unit, an image signal processor, a sensor hub processor, or a communication processor) that can operate independently of or together with the main processor. Additionally or alternatively, the secondary processor 123 may be set to use less power than the main processor 121 or to be specialized for a designated function. The secondary processor 123 may be implemented separately from the main processor 121 or as a part of it.
  • The secondary processor 123 may, for example, control at least some of the functions or states related to at least one of the components of the electronic device 101 (eg, the display device 160, the sensor module 176, or the communication module 190) on behalf of the main processor 121 while the main processor 121 is in an inactive (eg, sleep) state, or together with the main processor 121 while the main processor 121 is in an active (eg, application execution) state. According to an embodiment, the secondary processor 123 (eg, an image signal processor or a communication processor) may be implemented as part of another functionally related component (eg, the camera module 180 or the communication module 190).
  • the memory 130 may store various data used by at least one component of the electronic device 101 (eg, the processor 120 or the sensor module 176).
  • the data may include, for example, software (eg, the program 140) and input data or output data for commands related thereto.
  • the memory 130 may include a volatile memory 132 or a nonvolatile memory 134.
  • the program 140 may be stored as software in the memory 130, and may include, for example, an operating system 142, middleware 144, or an application 146.
  • the input device 150 may receive a command or data to be used for a component of the electronic device 101 (eg, the processor 120) from an outside (eg, a user) of the electronic device 101.
  • the input device 150 may include, for example, a microphone, a mouse, a keyboard, or a digital pen (eg, a stylus pen).
  • The sound output device 155 may output a sound signal to the outside of the electronic device 101.
  • the sound output device 155 may include, for example, a speaker or a receiver.
  • The speaker may be used for general purposes such as multimedia playback or recording playback, and the receiver may be used to receive incoming calls. According to an embodiment, the receiver may be implemented separately from the speaker or as a part of it.
  • the display device 160 may visually provide information to the outside of the electronic device 101 (eg, a user).
  • the display device 160 may include, for example, a display, a hologram device, or a projector and a control circuit for controlling the device.
  • The display device 160 may include a touch circuitry set to sense a touch, or a sensor circuit (eg, a pressure sensor) set to measure the strength of a force generated by a touch.
  • The audio module 170 may convert sound into an electrical signal or, conversely, convert an electrical signal into sound. According to an embodiment, the audio module 170 may acquire sound through the input device 150, or may output sound through the sound output device 155 or an external electronic device (eg, the electronic device 102, such as a speaker or headphones) directly or wirelessly connected to the electronic device 101.
  • The sensor module 176 may detect an operating state (eg, power or temperature) of the electronic device 101 or an external environmental state (eg, a user state), and generate an electrical signal or data value corresponding to the detected state.
  • the sensor module 176 may include, for example, a gesture sensor, a gyro sensor, an atmospheric pressure sensor, a magnetic sensor, an acceleration sensor, a grip sensor, a proximity sensor, a color sensor, an IR (infrared) sensor, a biometric sensor, It may include a temperature sensor, a humidity sensor, or an illuminance sensor.
  • the interface 177 may support one or more designated protocols that may be used for the electronic device 101 to directly or wirelessly connect with an external electronic device (eg, the electronic device 102 ).
  • the interface 177 may include, for example, a high definition multimedia interface (HDMI), a universal serial bus (USB) interface, an SD card interface, or an audio interface.
  • the connection terminal 178 may include a connector through which the electronic device 101 can be physically connected to an external electronic device (eg, the electronic device 102 ).
  • the connection terminal 178 may include, for example, an HDMI connector, a USB connector, an SD card connector, or an audio connector (eg, a headphone connector).
  • the haptic module 179 may convert an electrical signal into a mechanical stimulus (eg, vibration or movement) or an electrical stimulus that a user can perceive through a tactile or motor sense.
  • the haptic module 179 may include, for example, a motor, a piezoelectric element, or an electrical stimulation device.
  • the camera module 180 may capture a still image and a video.
  • the camera module 180 may include one or more lenses, image sensors, image signal processors, or flashes.
  • the power management module 188 may manage power supplied to the electronic device 101.
  • The power management module 188 may be implemented as, for example, at least a part of a power management integrated circuit (PMIC).
  • the battery 189 may supply power to at least one component of the electronic device 101.
  • the battery 189 may include, for example, a non-rechargeable primary cell, a rechargeable secondary cell, or a fuel cell.
  • The communication module 190 may support establishment of a direct (eg, wired) communication channel or a wireless communication channel between the electronic device 101 and an external electronic device (eg, the electronic device 102, the electronic device 104, or the server 108), and communication through the established channel.
  • the communication module 190 operates independently of the processor 120 (eg, an application processor), and may include one or more communication processors that support direct (eg, wired) communication or wireless communication.
  • The communication module 190 may include a wireless communication module 192 (eg, a cellular communication module, a short-range wireless communication module, or a global navigation satellite system (GNSS) communication module) or a wired communication module 194 (eg, a local area network (LAN) communication module or a power line communication module).
  • A corresponding communication module may communicate with external electronic devices through the first network 198 (eg, a short-range communication network such as Bluetooth, WiFi Direct, or infrared data association (IrDA)) or the second network 199 (eg, a long-range communication network such as a cellular network, the Internet, or a computer network such as a LAN or WAN).
  • The wireless communication module 192 may identify and authenticate the electronic device 101 in a communication network such as the first network 198 or the second network 199 using subscriber information (eg, an International Mobile Subscriber Identity (IMSI)) stored in the subscriber identification module 196.
  • the antenna module 197 may transmit a signal or power to the outside (eg, an external electronic device) or receive from the outside.
  • The antenna module may include one antenna including a radiator formed of a conductor or a conductive pattern formed on a substrate (eg, a PCB).
  • According to an embodiment, the antenna module 197 may include a plurality of antennas. In this case, at least one antenna suitable for the communication method used in a communication network such as the first network 198 or the second network 199 may be selected from the plurality of antennas by, for example, the communication module 190.
  • the signal or power may be transmitted or received between the communication module 190 and an external electronic device through the at least one selected antenna.
  • Other components (eg, an RFIC) other than the radiator may be additionally formed as part of the antenna module 197.
  • At least some of the components may be connected to each other through an inter-peripheral communication method (eg, a bus, general purpose input and output (GPIO), serial peripheral interface (SPI), or mobile industry processor interface (MIPI)) and may exchange signals (eg, commands or data) with each other.
  • commands or data may be transmitted or received between the electronic device 101 and the external electronic device 104 through the server 108 connected to the second network 199.
  • Each of the electronic devices 102 and 104 may be a device of the same type as, or a different type from, the electronic device 101.
  • all or part of the operations executed by the electronic device 101 may be executed by one or more of the external electronic devices 102, 104, or 108.
  • For example, when the electronic device 101 needs to perform a function or service automatically, or in response to a request from a user or another device, the electronic device 101 may request one or more external electronic devices to perform at least part of the function or service instead of, or in addition to, executing the function or service by itself.
  • One or more external electronic devices receiving the request may execute at least a part of the requested function or service, or an additional function or service related to the request, and transmit the execution result to the electronic device 101.
  • the electronic device 101 may process the result as it is or additionally and provide it as at least a part of a response to the request.
  • cloud computing, distributed computing, or client-server computing technology may be used.
  • the electronic device may be a device of various types.
  • the electronic device may include, for example, a portable communication device (eg, a smart phone), a computer device, a portable multimedia device, a portable medical device, a camera, a wearable device, or a home appliance.
  • the electronic device according to the embodiment of the present document is not limited to the above-described devices.
  • Phrases such as “at least one of A, B, or C” may include any one of the items listed together in the corresponding phrase, or all possible combinations thereof.
  • Terms such as “first” and “second” may be used simply to distinguish a component from other corresponding components, and do not limit the components in other aspects (eg, importance or order).
  • When a (eg, first) component is referred to as “coupled” or “connected” to another (eg, second) component, with or without the terms “functionally” or “communicatively”, it means that the component may be connected to the other component directly (eg, by wire), wirelessly, or via a third component.
  • The term “module” used in this document may include a unit implemented in hardware, software, or firmware, and may be used interchangeably with terms such as logic, logic block, part, or circuit.
  • The module may be an integrally configured component, or a minimum unit or a part thereof, that performs one or more functions.
  • the module may be implemented in the form of an application-specific integrated circuit (ASIC).
  • An embodiment of this document may be implemented as software (eg, the program 140) including one or more commands stored in a storage medium (eg, the internal memory 136 or the external memory 138) that can be read by a machine (eg, the electronic device 101). For example, the processor (eg, the processor 120) of the device (eg, the electronic device 101) may call at least one of the one or more commands stored in the storage medium and execute it. This makes it possible for the device to perform at least one function according to the called command.
  • the one or more instructions may include code generated by a compiler or code executable by an interpreter.
  • a storage medium that can be read by a device may be provided in the form of a non-transitory storage medium.
  • Here, “non-transitory” only means that the storage medium is a tangible device and does not contain a signal (eg, an electromagnetic wave); this term does not distinguish between a case where data is semi-permanently stored in the storage medium and a case where it is temporarily stored.
  • a method according to an embodiment disclosed in this document may be provided in a computer program product.
  • Computer program products can be traded between sellers and buyers as commodities.
  • The computer program product may be distributed in the form of a device-readable storage medium (eg, a compact disc read only memory (CD-ROM)), or may be distributed (eg, downloaded or uploaded) online through an application store (eg, Play Store™) or directly between two user devices (eg, smartphones).
  • at least a portion of the computer program product may be temporarily stored or temporarily generated in a storage medium that can be read by a device such as a server of a manufacturer, a server of an application store, or a memory of a relay server.
  • Each of the above-described components (eg, a module or a program) may include a single entity or a plurality of entities.
  • one or more components or operations among the above-described corresponding components may be omitted, or one or more other components or operations may be added.
  • According to an embodiment, a plurality of components (eg, modules or programs) may be integrated into a single component. In this case, the integrated component may perform one or more functions of each of the plurality of components in the same or similar manner as the corresponding component performed prior to the integration.
  • Operations performed by a module, program, or other component may be executed sequentially, in parallel, repetitively, or heuristically; one or more of the operations may be executed in a different order or omitted, or one or more other operations may be added.
  • FIG. 2 illustrates an electronic device according to an embodiment.
  • An electronic device 200 according to an embodiment may include a processor 210, a memory 220, a touch panel 230, a touch sensor integrated circuit (IC) 240, a display 250, a display driver IC (DDI) 260, a haptic device 270, or a combination thereof. In an embodiment, the electronic device 200 may omit at least one of the components or may additionally include another component.
  • the processor 210 may control the overall operation of the electronic device 200. In one embodiment, the processor 210 may execute applications that provide messages, alarms, photos, advertisements, Internet, games, videos, and the like. In an embodiment, the processor 210 may include one processor core or may include a plurality of processor cores.
  • The processor 210 may identify signals of other components of the electronic device 200 or may receive data from other components of the electronic device 200. In an embodiment, the processor 210 may perform an operation based on signals or data of other components of the electronic device 200, or a combination thereof. In an embodiment, the processor 210 may store a result of performing an operation in the memory 220. In an embodiment, the processor 210 may transmit a command to other components of the electronic device 200 based on a result of performing an operation. In an embodiment, the processor 210 may control operations of other components of the electronic device 200 by transmitting commands to them.
  • the processor 210 may process data or signals generated or generated by an application. In an embodiment, the processor 210 may process a command stored in the memory 220 to execute or control an application.
  • the memory 220 may include a volatile memory or a nonvolatile memory. In an embodiment, the memory 220 may store various types of data used by at least one component of the electronic device 200 (eg, the processor 210 ).
  • the touch panel 230 may include a plurality of touch sensors or channels that generate a detection signal (eg, a touch detecting signal and a proximity detecting signal). In an embodiment, the touch panel 230 may transmit a detection signal to the touch sensor IC 240.
  • the touch sensor IC 240 may control the touch panel 230 to detect, for example, a touch input or a hovering input for a specific position of the display 250.
  • the touch sensor IC 240 may detect a touch input or a hovering input by measuring a change in a signal (eg, voltage, light quantity, resistance, or charge quantity) for a specific location of the display 250.
  • the touch sensor IC 240 may provide information (eg, location, area, pressure, or time) on the sensed touch input or hovering input to the processor 210.
  • the touch sensor IC 240 may be included as a part of the DDI 260 or the display 250, or as a part of the processor 210.
  • the display 250 may display a screen corresponding to data generated by the processor 210.
  • the DDI 260 may receive image data, or image information including an image control signal corresponding to a command for controlling the image data, from other components of the electronic device 200 through an interface module. According to an embodiment, image information may be received from the processor 210. The DDI 260 may communicate with the touch sensor IC 240 through the interface module.
  • the display 250 and the touch panel 230 may be configured as a touch screen. In one embodiment, when the display 250 and the touch panel 230 are configured as a touch screen, the touch panel 230 may be disposed above the display 250, disposed below the display 250, or disposed within the display 250.
  • the haptic device 270 may generate vibration in response to a signal generated by the processor 210.
  • the processor 210 may control the haptic device 270 to generate vibration based on a signal generated from the touch panel 230.
  • the processor 210 may be understood as a control unit.
  • the processor 210 may include at least one of an application processor (AP) and a communication processor (CP) that control functions of the electronic device 200.
  • the memory 220 may be understood as a storage unit.
  • the memory 220 may be understood as various types of storage units that store data of the electronic device 200.
  • the display panel 250 according to an exemplary embodiment may be understood as a display unit.
  • the display panel 250 according to an embodiment may include a display unit indicating contents on the screen of the electronic device 200.
  • FIG. 3 is a diagram illustrating a region where a touch input occurs and a cross section (A-A') of the electronic device in an electronic device (eg, the electronic device 200 of FIG. 2) according to an exemplary embodiment.
  • an electronic device 300 includes a housing 310, a cover window 320 forming a surface of the housing 310, a touch panel 330 disposed under the cover window 320, a display 340 disposed under the cover window 320 and visible from the outside through the cover window 320, a circuit board 350 inside the housing, at least one processor 360 mounted on the circuit board 350, and/or a haptic device 370 mounted on the circuit board 350.
  • the touch panel 330 may be disposed below or above the display 340.
  • the display 340 and the touch panel 330 may be integrally formed.
  • the electronic device 300 may include a touch screen panel (TSP) in which the display 340 and the touch panel 330 are integrally formed.
  • the UI of the application may be displayed through the display 340.
  • the touch sensor IC may receive a signal caused by a touch input from the touch panel 330 and obtain data on a location where the touch input occurs based on the signal caused by the touch input.
  • the processor 360 may receive data regarding location information from the touch sensor IC.
  • the touch sensor IC may be included as a part of the processor 360.
  • the operation of the touch sensor IC may be understood as being included in the operation of the processor.
  • the touch input may be performed by an external object detectable by the touch panel 330 such as a user's finger, a touch pen, or a stylus pen.
  • the touch panel 330 may include a plurality of channels, and the processor 360 may determine the location of the touch input based on signals generated from the plurality of channels. For example, when a signal is generated from a first channel disposed at a first position among the plurality of channels due to a touch input, the processor 360 may determine, based on the signal generated from the first channel disposed at the first position, that the touch input has occurred at the first position of the touch panel.
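The channel-to-location mapping described above can be sketched as follows. This is a hypothetical illustration, not the patented implementation: the channel positions, signal vector, and threshold are all assumed example values.

```python
# Hypothetical sketch: locating a touch from per-channel signal levels.
# Channel positions, signal levels, and threshold are illustrative only.

def locate_touch(channel_positions, signals, threshold):
    """Return the position of the strongest channel whose signal exceeds
    the threshold, or None when no channel registers a touch."""
    best = None
    for pos, level in zip(channel_positions, signals):
        if level > threshold and (best is None or level > best[1]):
            best = (pos, level)
    return best[0] if best else None

positions = [(0, 0), (0, 1), (1, 0), (1, 1)]
print(locate_touch(positions, [0.1, 0.9, 0.2, 0.3], threshold=0.5))  # (0, 1)
print(locate_touch(positions, [0.1, 0.2, 0.2, 0.3], threshold=0.5))  # None
```

A real controller would interpolate between neighboring channels for sub-channel accuracy; the nearest-channel rule here only mirrors the single-channel example in the text.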
  • the processor 360 may determine that a touch input has occurred in a specific area of the touch panel 330 in response to a signal size generated from the touch panel 330 exceeding a specified threshold value.
  • the processor 360 of the electronic device 300 may be operatively coupled to a memory (eg, the memory 220 of FIG. 2), the display 340, and/or the touch panel 330, and, based on the location of the touch input acquired through the touch panel 330, may control a UI corresponding to the location of the touch input.
  • the processor 360 may provide a UI related to a touch input while the processor 360 is in an activated or deactivated state. For example, even when the processor 360 is in the sleep mode, the processor 360 may detect a touch input through the touch panel 330.
  • the processor 360 may provide a UI related to a touch input while the display 340 is in an active state or in an inactive state.
  • the processor 360 may detect a touch input through the touch panel 330 even when the display 340 is turned off.
  • the processor 360 may sense a touch input through the entire area of the touch panel 330 in an always on display (AOD) mode in which only a portion of the display 340 is activated.
  • a portion of the UI displayed on the display 340 related to a wheel event may be manipulated by a touch input.
  • the processor 360 may control the object to be scrolled or rotated based on a touch input through the touch panel 330.
  • the cover window 320 may be disposed on the touch panel 330 and the display 340 to protect the touch panel 330 and the display 340 from external impacts.
  • the cover window 320 may be formed of a transparent material (eg, polymer or glass) so that the display 340 can be seen from the outside through the cover window 320.
  • the cover window 320 may include a first area 321 corresponding to the touch panel 330 (or display 340) and a second area 322 corresponding to the area 380 where the touch panel 330 (or display 340) does not exist.
  • the second area 322 of the cover window 320 is an area that does not overlap with the touch panel, and the touch panel 330 and the display 340 may not be disposed under the second area 322.
  • the first region 321 of the cover window may include a first sub-region 321-1 corresponding to a first group among a plurality of channels of the touch panel 330 and a second sub-region 321-2 corresponding to a second group among the plurality of channels of the touch panel 330.
  • the second group may correspond to outer channels among the plurality of channels of the touch panel 330, and the first group may correspond to channels other than the outer channels among the plurality of channels of the touch panel 330.
  • the second group may correspond to the outermost channels among the plurality of channels of the touch panel 330, and the first group may correspond to channels excluding the outermost channels among the plurality of channels of the touch panel 330.
  • a touch input may occur on the cover window 320, and the processor 360 may determine, through the touch panel 330 disposed under the cover window 320, at which location on the cover window 320 the touch input originated. For example, the processor 360 may determine in which of the first sub-region 321-1, the second sub-region 321-2, or the second region 322 of the cover window a touch input received through the touch panel 330 occurred.
  • the touch input may also include a hovering input. In an embodiment, the processor (eg, the touch sensor IC) may detect a hovering input.
  • the touch input may not occur on the surface of the cover window 320; for example, it may occur when the user's finger is within a certain distance from the surface of the cover window 320.
  • by lowering the threshold value for sensing a touch input (eg, to the second threshold value in FIG. 4B), a hovering input that occurs apart from the surface of the cover window 320 and generates a signal lower than a touch generated on the surface of the cover window 320 may be detected by the processor 360.
  • the touch input that the processor 360 can detect through the touch panel 330 may include a first type touch and a second type touch.
  • the first type of touch may correspond to a touch directly input to an area corresponding to a UI to be controlled.
  • when the processor 360 detects a first type touch through the touch panel 330, the processor 360 may select, move, scroll, enlarge, or reduce an object on the UI corresponding to the area where the first type touch is detected.
  • the second type touch may correspond to a touch that is input to a different area from the UI to be controlled to indirectly control the UI.
  • the second type touch is a touch input that generates a wheel event, and may include a first touch and a touch that follows in a state in which the first touch is not released.
  • the second type touch may correspond to a touch input in which the touch moves along the outer edge of the cover window while the first touch is not released.
  • the first touch of the second type touch may occur in the second sub-region 321-2 as well as the second region 322.
  • when the processor 360 detects a second type touch, the processor 360 may select, change, convert, move, scroll, enlarge, or reduce an object on the UI corresponding to the first area 321, not the second area 322 in which the second type touch is detected.
  • when a second type touch is detected in the second sub-region 321-2, the processor 360 may select, change, convert, move, scroll, enlarge, or reduce part or all of the UI corresponding to the first sub-region 321-1, not the second sub-region 321-2 where the second type touch is detected.
  • an event corresponding to the second type touch may be executed.
  • the event corresponding to the second type touch may include an operation of selecting, changing, converting, moving, scrolling, expanding or reducing a part or all of the UI.
  • the processor 360 may convert a first application displayed on the display 340 at a corresponding touch input point into a second application different from the first application in response to the second type touch.
  • the event corresponding to the second type touch may include an operation in which an object on the UI is not changed.
  • an operation such as changing the brightness of the display 340 or adjusting the volume in a multimedia application may not cause a change in an object on the UI.
  • the haptic device 370 may generate vibration in response to a touch input.
  • when the processor 360 detects a touch input through the touch panel 330, the processor 360 may vibrate the electronic device 300 through the haptic device 370 to give the user feedback corresponding to the touch input.
  • the processor 360 may control the haptic device 370 to generate vibration based on at least one of the distance traveled by the touch input, the intensity of the touch input, or the duration of the touch input.
  • FIG. 4A is an enlarged view of area B of FIG. 3, FIG. 4B is a graph showing a signal generated in an outermost channel of a touch panel according to a location where a touch is input on a cover window in an electronic device according to an exemplary embodiment, and FIG. 4C is a graph comparing a signal generated in an outermost channel of a touch panel with a signal generated in a channel adjacent to the outermost channel.
  • the electronic device 300 may determine a type of a touch input based on at least one of a size of a signal sensed through the touch panel 330, an area in which the signal is generated, or whether a moving touch is generated. When a signal exceeding a specific threshold value is generated, the electronic device 300 may detect the signal as a touch input. When a touch input is detected, the electronic device 300 may determine in which region of the cover window 320 the corresponding touch input occurs. The electronic device 300 may determine the type of the touch input based on whether a moving touch following the first touch is detected. According to an embodiment, the moving touch may correspond to moving touch coordinates acquired through the touch panel 330 while the touch is maintained.
  • the electronic device 300 may compare the magnitude of the signal generated from the touch panel 330 with the first threshold value V1 in order to detect a touch input. When the signal level exceeds the first threshold value V1, the electronic device 300 may detect a touch input and perform an operation corresponding to the corresponding touch input.
  • when a touch T2 is input on the second area 322, the distance d2 between the outermost channel 332 of the touch panel 330 and the finger may be greater than the distance d1 between the outermost channel 332 of the touch panel 330 and the finger when the touch T1 is input in the second sub-region 321-2.
  • a signal generated in the outermost channel 332 due to the touch input T2 on the second area 322 may be lower than the first threshold value V1. Therefore, when the electronic device 300 detects a touch input based on the first threshold value V1, it may not detect the touch input occurring in the second area 322.
  • the electronic device 300 may detect a touch input in the second area 322. According to an embodiment, the electronic device 300 may detect a touch input generated in the second area 322 by comparing the signal due to the touch T2 on the second area 322 with a second threshold value V2 lower than the first threshold value V1. For example, even if a signal lower than the first threshold value V1 is generated in the outermost channel 332 constituting the touch panel 330 by a touch input, the electronic device 300 may recognize the signal as a touch input.
  • the electronic device 300 should be able to identify whether the corresponding signal occurred in the second sub-region 321-2 or in the second area 322.
  • by comparing the signal Vb generated in the outermost channel 332 with the signal Va generated in the channel 331 adjacent to the outermost channel 332, it is possible to identify the region in which the corresponding signal occurred.
  • when the ratio of the signal Vb generated in the outermost channel 332 to the signal Va generated in the channel 331 adjacent to the outermost channel 332 exceeds a specified value, the electronic device 300 may identify the corresponding signal as occurring in the second region 322. When the ratio of the signal Vb generated in the outermost channel 332 to the signal Va generated in the channel 331 adjacent to the outermost channel 332 is within the specified value, the electronic device 300 may identify the corresponding signal as occurring in the second sub-region 321-2.
  • when a signal detected in the outermost channel 332 exceeds the first threshold value V1, the electronic device 300 may recognize that a touch input has occurred in the second sub-region 321-2.
  • when the signal Vb detected in the outermost channel 332 exceeds the second threshold value V2 and the ratio of the signal Vb to the signal Va generated in the channel 331 adjacent to the outermost channel 332 exceeds the specified value, the electronic device 300 may recognize that a touch input has occurred in the second area 322.
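The two-threshold, ratio-based region identification described above can be sketched in a few lines. This is a hypothetical sketch: the threshold values V1 and V2 and the Vb/Va ratio limit are assumed tuning constants, not values from the patent.

```python
# Hypothetical sketch of identifying where a signal originated.
# V1, V2, and RATIO_LIMIT are assumed tuning constants.

V1 = 100.0          # first (higher) threshold
V2 = 40.0           # second (lower) threshold for bezel-side touches
RATIO_LIMIT = 2.0   # specified Vb/Va ratio separating the two regions

def identify_region(vb, va):
    """Classify a signal by its origin.

    vb: signal on the outermost channel (332)
    va: signal on the channel adjacent to the outermost channel (331)
    """
    if vb > V1:
        return "second sub-region (321-2)"   # strong signal: on-panel touch
    if vb > V2:
        if va > 0 and vb / va > RATIO_LIMIT:
            return "second area (322)"       # bezel-side touch
        return "second sub-region (321-2)"   # ratio within limit: on-panel
    return "no touch"                        # below both thresholds

print(identify_region(vb=120, va=80))  # second sub-region (321-2)
print(identify_region(vb=60, va=20))   # second area (322)
print(identify_region(vb=30, va=10))   # no touch
```

The ratio check exploits the geometry of FIG. 4A: a bezel touch is farther from the adjacent channel than from the outermost channel, so Vb dominates Va.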
  • the electronic device 300 may detect whether the sensed touch input includes movement of touch coordinates and determine the type of the corresponding touch input. For example, when a first touch input detected in the second sub-region 321-2 includes a moving touch following the first touch, the electronic device 300 may determine the first touch input as the second type touch. On the other hand, when the first touch input does not include a moving touch following the first touch, the first touch input may be determined as a first type touch.
  • the electronic device 300 may recognize the first touch input as either a first type touch or a second type touch depending on whether a moving touch following the first touch occurs after the first point in time.
  • the electronic device may determine a touch input sensed in the first area as a first type touch.
  • the touch input exceeding the first threshold value and generated in the first sub-region 321-1 may be a first type touch.
  • when a touch input sensed in the second sub-region 321-2 includes a moving touch, the electronic device may determine the corresponding touch input as the second type touch. For example, a touch input that exceeds the first threshold value, occurs in the second sub-region 321-2, and includes a moving touch may be a second type touch. Meanwhile, a touch input that exceeds the first threshold value and occurs in the second sub-region 321-2, but does not include a moving touch, may be a first type touch.
  • when a touch input sensed in the second area 322 includes a moving touch, the electronic device may determine the corresponding touch input as the second type touch. For example, a touch input that exceeds the second threshold value, occurs in the second region 322, and includes a moving touch may be a second type touch.
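Combining the rules above, the region of the first touch and the presence of a moving touch jointly determine the touch type. The sketch below is hypothetical: the region labels are illustrative, and the handling of a stationary bezel touch (second area, no movement) is an assumption, since the text does not specify it.

```python
# Hypothetical sketch of the touch-type decision described above.

def classify_touch(region, has_moving_touch):
    """Return the touch type for a detected touch input.

    region: where the first touch landed ('first sub-region',
            'second sub-region', or 'second area')
    has_moving_touch: True when a moving touch follows the first touch
    """
    if region == "first sub-region":
        return "first type"   # direct touch on the UI to be controlled
    if region == "second sub-region":
        # outer-channel touches become wheel-style input only when they move
        return "second type" if has_moving_touch else "first type"
    if region == "second area":
        # bezel touches count as second type only when they move; the
        # handling of a stationary bezel touch is not specified here
        return "second type" if has_moving_touch else None
    raise ValueError(f"unknown region: {region}")

print(classify_touch("second sub-region", True))   # second type
print(classify_touch("second sub-region", False))  # first type
```

A second type touch then drives an indirect (wheel-event) action, while a first type touch directly manipulates the object under the finger.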
  • FIG. 5 illustrates a second type of touch initiated from a second area of a cover window in an electronic device according to an exemplary embodiment.
  • the electronic device 300 may include a housing 310, a cover window 320 forming a surface of the housing 310, a touch panel disposed under the cover window 320, a display disposed under the cover window 320 and visible from the outside through the cover window 320, a circuit board inside the housing 310, at least one processor mounted on the circuit board, and/or a haptic device mounted on the circuit board.
  • the cover window 320 may include a first area 321 corresponding to the touch panel 330 (or display 340) and a second region 322 corresponding to the bezel area 380 in which the touch panel (or display) does not exist.
  • the second area 322 of the cover window 320 is an area that does not overlap the touch panel, and a touch panel and/or a display may not be disposed under the second area 322.
  • the first region 321 of the cover window may include a first sub-region 321-1 corresponding to a first group among a plurality of channels of the touch panel and a second sub-region 321-2 corresponding to a second group among the plurality of channels of the touch panel.
  • the second group may correspond to outer channels among the plurality of channels of the touch panel, and the first group may correspond to channels excluding the outer channels among the plurality of channels of the touch panel.
  • the second group may correspond to the outermost channels among the plurality of channels of the touch panel, and the first group may correspond to channels excluding the outermost channels among the plurality of channels of the touch panel.
  • when a signal exceeding a second threshold (eg, the second threshold value V2 of FIG. 4B) is generated, the electronic device 300 may detect the corresponding signal as a touch input (T1, T2).
  • when the detected touch inputs include a moving touch, the electronic device 300 may determine the first touch input T1 and the second touch input T2 as the second type touch. For example, a moving touch of the touch inputs T1 and T2 may occur on the first area 321 as well as the second area 322.
  • the electronic device may determine the first touch input T1 as a second type touch based on the first touch input T1 including a touch moving from a first point P1 on the second area 322 to a second point P2 on the second area 322. Specifically, the first touch input T1 may include a touch sensed at the first point P1 at a first time point, a drag input moving from the first point P1 to the second point P2 between the first time point and a second time point, and an input released at the second point P2 at the second time point.
  • the electronic device may determine the second touch input T2 as a second type touch based on the second touch input T2 including a touch moving from the first point P1 on the second area 322 to a third point P3 on the first area 321. Specifically, the second touch input T2 may include a touch sensed at the first point P1 at a first time point, a drag input moving from the first point P1 to the third point P3 between the first time point and a second time point, and an input released at the third point P3 at the second time point.
  • when an angle θ between the first point P1 and the second point P2, measured from the center of the cover window 320, is equal to or greater than a specified angle, the electronic device 300 may determine the first touch input T1 as a second type touch. For example, when the angle θ between the first point P1 and the second point P2 from the center of the cover window 320 is 15 degrees or more, the electronic device may determine the first touch input T1 as a second type touch and generate an event corresponding to the second type touch.
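The angular check above can be computed from the touch coordinates and the window center. This is a hypothetical sketch: the coordinate convention and the 15-degree default come from the example in the text, but the function names and geometry handling are assumptions.

```python
# Hypothetical sketch: treat a drag as a second type (wheel) touch once the
# arc it sweeps around the window center reaches a specified angle.

import math

def swept_angle_deg(center, p1, p2):
    """Angle at `center` between points p1 and p2, in degrees (0..180)."""
    a1 = math.atan2(p1[1] - center[1], p1[0] - center[0])
    a2 = math.atan2(p2[1] - center[1], p2[0] - center[0])
    diff = abs(a2 - a1) % (2 * math.pi)
    return math.degrees(min(diff, 2 * math.pi - diff))

def is_second_type(center, p1, p2, min_angle_deg=15.0):
    """True when the drag from p1 to p2 sweeps at least min_angle_deg."""
    return swept_angle_deg(center, p1, p2) >= min_angle_deg

c = (0.0, 0.0)
p20 = (math.cos(math.radians(20)), math.sin(math.radians(20)))
p5 = (math.cos(math.radians(5)), math.sin(math.radians(5)))
print(is_second_type(c, (1.0, 0.0), p20))  # True  (20 degrees >= 15)
print(is_second_type(c, (1.0, 0.0), p5))   # False (5 degrees < 15)
```

Measuring the angle rather than the linear drag distance makes the gesture threshold independent of how far from the center the finger travels.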
  • the second type touch may include a touch (not shown) that moves from the first point P1 to the second point P2 while the touch is maintained, stops at the second point P2 for a certain period of time, and then moves to the third point P3.
  • the movement direction of the second type touch may include a clockwise direction, a counterclockwise direction, and a combination thereof.
  • the second type touch may include a touch that moves in a clockwise direction while the touch is maintained and then moves in a counterclockwise direction again.
  • the second type touch may include a touch that moves in a clockwise direction while the touch is maintained, moves in a counterclockwise direction, and finally moves in a clockwise direction.
  • the path of the second type touch is not limited to the above-described embodiments, and may be various.
  • FIG. 6 is a diagram illustrating an operation of a second type touch started in a second sub-region 321-2 of a cover window in an electronic device according to an exemplary embodiment. Redundant description of FIG. 5 will be omitted.
  • when a signal exceeding a first threshold value (eg, the first threshold value V1 of FIG. 4B) is generated in a channel of the touch panel corresponding to a first point P1 on the second sub-region 321-2 at a first time point, the electronic device 300 may detect the corresponding signal as touch inputs T1 and T2.
  • when the electronic device 300 detects the first touch of the first touch input T1 and the second touch input T2 in the second sub-region 321-2 and the touch input includes a moving touch, the electronic device 300 may determine the first touch input T1 and the second touch input T2 as the second type touch regardless of the path of the moving touch. For example, the moving touch of the touch inputs T1 and T2 may occur on the first sub-region 321-1 and/or the second area 322 as well as the second sub-region 321-2.
  • the electronic device may determine the first touch input T1 as the second type touch based on the first touch input T1, input at the first point P1 on the second sub-region 321-2 at a first point in time, including a touch that follows the first touch and moves from the first point P1 to the second point P2 on the second sub-region 321-2.
  • the electronic device may determine the second touch input T2 as a second type touch based on the second touch input T2, input at the first point P1 on the second sub-region 321-2 at a first time point, being followed by a touch moving from the first point P1 to the third point P3 on the second region 322.
  • when a signal exceeding a first threshold value (eg, the first threshold value V1 of FIG. 4B) is generated in a channel of the touch panel corresponding to the fourth point P4 on the second sub-region 321-2, the electronic device 300 may detect the corresponding signal as a touch input.
  • based on the third touch input T3 on the second sub-region 321-2 not including a moving touch following the first touch, the third touch input T3 may be determined as the first type touch.
  • FIG. 7 illustrates haptic feedback corresponding to a second type of touch according to an embodiment.
  • the processor of the electronic device 300 may control a haptic device (eg, the haptic device 370 of FIG. 3) to generate vibration while an event corresponding to a second type touch occurs, so that feedback related to the event corresponding to the second type touch may be provided to the user.
  • the electronic device 300 may adjust the intensity and/or frequency of vibration generated through the haptic device based on at least one of a moving speed and acceleration of the touch.
  • vibration may occur based on the distance traveled by the second type touch. For example, vibration may be generated at a second point P2 moved by a first angle θ1 from the first point P1, and at a third point P3 moved by a second angle θ2 from the second point P2. According to an embodiment, the first angle θ1 and the second angle θ2 may be the same.
  • when an upper limit and a lower limit exist for an action on the UI related to an event corresponding to a second type touch, the electronic device may generate vibration through the haptic device when the second type touch reaches the lower or upper limit, notifying the user that the operation on the UI has reached the upper or lower limit.
  • the electronic device may generate vibration with a first intensity through the haptic device before the second type touch reaches the upper or lower limit, and may generate vibration with a second intensity when the second type touch reaches the upper or lower limit.
  • when the second type touch reaches a fourth point P4, the electronic device 300 may generate vibration through the haptic device. In an embodiment, the intensity of the vibration generated at the fourth point P4 may be greater than the intensity of the vibration generated at the second point P2 and the third point P3.
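The feedback pattern described above (a tick per fixed angular step, plus a stronger pulse at the limit) can be sketched as follows. This is a hypothetical illustration: the step size, intensity levels, and limit angle are all assumed example values.

```python
# Hypothetical sketch: emit a vibration tick per angular step of a wheel
# gesture, with a stronger pulse when the gesture reaches the upper limit.

STEP_DEG = 15.0        # angular distance between vibration ticks
TICK_INTENSITY = 0.3   # first intensity, used before the limit
LIMIT_INTENSITY = 1.0  # second, stronger intensity at the limit

def haptic_events(total_angle_deg, limit_deg):
    """Return (angle, intensity) pairs for a wheel gesture that has swept
    `total_angle_deg` degrees, with an upper limit at `limit_deg`."""
    events = []
    angle = STEP_DEG
    while angle < min(total_angle_deg, limit_deg):
        events.append((angle, TICK_INTENSITY))   # regular tick
        angle += STEP_DEG
    if total_angle_deg >= limit_deg:
        events.append((limit_deg, LIMIT_INTENSITY))  # stronger pulse at the limit
    return events

print(haptic_events(50.0, limit_deg=45.0))
# [(15.0, 0.3), (30.0, 0.3), (45.0, 1.0)]
```

On a real device each returned pair would be mapped to a driver call on the haptic actuator; the list form here just makes the pattern easy to inspect.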
  • a pattern of vibration generated through the haptic device is not limited to the above-described embodiments, and may be various.
  • the processor of the electronic device 300 may generate sound through an acoustic device (eg, a speaker) provided in the electronic device 300 while an event corresponding to the second type touch occurs, to give the user feedback related to the event.
  • FIG. 8 is a flowchart illustrating a method of executing an event corresponding to a touch input of an electronic device according to an exemplary embodiment.
  • the processor of the electronic device may detect a signal generated on the touch panel by a touch input.
  • when the user's finger approaches the cover window (for example, the cover window 320 of FIG. 3), the touch panel generates a signal in response to a change in capacitance according to the approach of the user's finger, and the processor included in the electronic device can detect the signal.
  • the touch panel may sense the proximity of the user's finger as well as the user's touch pen or stylus pen, and the contact between the finger and the cover window described below may include contact between the touch pen and the cover window.
  • the processor may determine whether the signal detected in operation 810 is caused by contact between the user's finger and the touch screen by comparing the detected signal with a specific threshold.
  • the contact between the finger and the touch screen can be understood as contact between the finger and a partial area of the cover window corresponding to the touch screen.
  • when a signal detected in a specific channel among a plurality of channels included in the touch panel of the touch screen has a touch sensitivity lower than a first threshold value (eg, the first threshold value V1 in FIG. 4B), the electronic device may ignore the corresponding signal without recognizing it as a touch input.
  • when a signal detected in a specific channel (for example, the channels included in the first group) exceeds the first threshold value, the electronic device may recognize the signal as a touch input.
  • channels included in the first group may be channels of a touch panel disposed within a specified range from the center of the touch screen.
  • the processor may detect a touch input having a touch sensitivity smaller than the first threshold value but greater than the second threshold value (eg, the second threshold value V2 in FIG. 4B). For example, when a signal detected in some of the plurality of channels included in the touch panel has a touch sensitivity between the first threshold value and the second threshold value, the processor can use the signal to perform a specified function. For example, if a signal detected in some channels, for example, the channels included in the second group, has a touch sensitivity between the first threshold and the second threshold, the touch input may be determined as the second type touch.
  • channels included in the second group may be channels of a touch panel disposed outside a designated range based on the center of the touch screen. In an embodiment, channels included in the second group may correspond to channels disposed at the outermost side of the touch screen or other channels not included in the first group.
  • the processor may determine the type of the sensed touch input. For example, the electronic device may determine whether the touch input is the second type touch based on at least one of the area in which the touch input is generated, the signal size corresponding to the touch input, and whether the touch input moves in a specified direction or angle from the time when the touch input is initially detected.
  • the processor may generate an event corresponding to the touch input based on the determined type of the touch input.
  • an event corresponding to the second type of touch may include rotation and scrolling of an object displayed on the display.
  • the processor may control a haptic device (eg, the haptic device 370 of FIG. 3) included in the electronic device to generate a vibration corresponding to the event.
  • FIG. 9 is a flowchart of determining a type of a touch input sensed from a touch panel in an electronic device according to an exemplary embodiment. FIG. 9 may be described with reference to FIGS. 3, 4A, 4B, and 4C.
  • the processor 360 may detect a signal from a user input from the touch panel 330.
  • the processor 360 may determine whether the signal Vb generated in the outermost channel 332 exceeds the second threshold value V2. When the signal Vb generated in the outermost channel 332 is within the second threshold value V2, the corresponding signal may not be recognized as a touch input.
  • When the signal Vb generated in the outermost channel 332 exceeds the second threshold value V2, in operation 930 the processor 360 may identify that the corresponding signal occurred in the second region 322 when the ratio of the signal Vb generated in the outermost channel 332 to the signal Va generated in the channel 331 adjacent to the outermost channel 332 exceeds the specified value α. In operation 930, when the ratio of the signal Vb generated in the outermost channel 332 to the signal Va generated in the channel 331 adjacent to the outermost channel 332 is within the specified value α, the processor 360 may identify that the corresponding signal occurred in the second sub-region 321-2. In operation 940, the processor 360 may determine whether the touch input sensed in the second area 322 includes a moving touch.
  • when the touch input sensed in the second area 322 includes a moving touch, the processor 360 may determine the touch input as the second type touch in operation 970.
  • the processor 360 may determine whether the signal Vb sensed in the outermost channel 332 exceeds the first threshold value V1.
  • the processor 360 may recognize that a touch input has occurred in the second sub-region 321-2.
  • the processor 360 may determine the touch input as the second type touch in operation 970.
  • the processor 360 may determine a touch input sensed in a second sub-region as a first type touch in operation 980.
  • the processor 360 may execute an event corresponding to the determined type for each touch input.
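The decision flow of FIG. 9 (operations 920 through 980) can be summarized in a short sketch. This is a purely illustrative reconstruction, not code from the disclosure: the function name, the concrete values of the thresholds V1 and V2 and the ratio α, and the handling of branches the text leaves unspecified (marked in the comments) are all assumptions.

```python
# Illustrative reconstruction of the FIG. 9 flow. Va is the signal from the
# channel 331 adjacent to the outermost channel; Vb is the signal from the
# outermost channel 332. V1 > V2; alpha is the specified ratio. All concrete
# default values are assumed, not taken from the disclosure.

def classify_touch(va: float, vb: float, moving: bool,
                   v1: float = 100.0, v2: float = 40.0,
                   alpha: float = 1.5) -> str:
    if vb <= v2:            # operation 920: Vb within V2 -> not a touch input
        return "not_a_touch"
    if vb / va > alpha:     # operation 930: signal occurred in second area 322
        # operations 940/970: a moving touch in the second area is a second
        # type touch; the non-moving branch is unspecified, so ignored here.
        return "second_type" if moving else "not_a_touch"
    # ratio within alpha: signal occurred in the second sub-region 321-2
    if vb <= v1:            # Vb below V1: treated as no touch (assumption)
        return "not_a_touch"
    # touch recognized in the second sub-region: moving -> second type
    # (operation 970), otherwise first type (operation 980)
    return "second_type" if moving else "first_type"
```

A weak outermost-channel signal is dropped at operation 920, a strongly edge-weighted signal is routed to the second-area branch, and the remaining cases are split by the first threshold and the presence of movement.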
  • the electronic device (eg, the electronic device 300 of FIG. 3) according to the above-described embodiment includes a housing (eg, the housing 310 of FIG. 3), a touch panel disposed inside the housing (eg, the touch panel 330 of FIG. 3), a display disposed inside the housing (eg, the display 340 of FIG. 3), a cover window (eg, the cover window 320 of FIG. 3) disposed on the touch panel and including a first area corresponding to the touch panel (eg, the first area 321 of FIG. 3) and a second area corresponding to the outer area of the touch panel (eg, the second area 322 of FIG. 3), the first area including a first sub-region corresponding to an inner region of the touch panel (eg, the first sub-region 321-1 of FIG. 3) and a second sub-region corresponding to an outer region of the touch panel (eg, the second sub-region 321-2 of FIG. 3), and a processor operatively coupled to the touch panel and the display (eg, the processor 360 of FIG. 3).
  • the processor may be set to provide a user interface through the display, receive a signal by a touch input from the touch panel while the user interface is provided, determine whether the touch input first occurred in the second sub-region or the second area, determine, in response to the touch input first occurring in the second sub-region or the second area, whether the touch input includes movement of touch coordinates, determine the touch input as a second type touch distinguished from the first type touch in response to the touch input including movement of touch coordinates, and execute an event corresponding to the second type touch based on the user interface and the second type touch.
  • the touch panel of the electronic device includes a plurality of channels, and the plurality of channels include a first group corresponding to channels adjacent to an outermost channel among the plurality of channels (eg, the channel 331 adjacent to the outermost channel in FIG. 4A) and a second group corresponding to outermost channels among the plurality of channels (eg, the outermost channel 332 of FIG. 4A).
  • the processor may be further set to obtain a first signal and a second signal according to the touch input from the first group and the second group, respectively, determine that the touch input occurred in the second sub-region in response to the magnitude of the second signal being greater than the first threshold value, and determine that the signal occurred in the second area in response to the magnitude of the second signal exceeding a second threshold value lower than the first threshold value and the ratio of the magnitude of the second signal to the magnitude of the first signal exceeding the specified value.
  • the movement of the touch coordinates of the electronic device may correspond to movement of a predetermined angle or more with respect to the center of the touch panel.
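One way to read the "predetermined angle" condition is to compare the angle swept by the touch point around the panel center against a threshold. The following sketch is hypothetical: the function name, the coordinate convention, and the 10-degree default threshold are illustrative assumptions, not details from the disclosure.

```python
import math

# Hypothetical check for whether a touch's movement corresponds to a rotation
# of at least a specified angle about the center of the touch panel.

def moved_by_angle(start, end, center, min_angle_deg: float = 10.0) -> bool:
    """True if the touch rotated at least `min_angle_deg` around `center`."""
    a0 = math.atan2(start[1] - center[1], start[0] - center[0])
    a1 = math.atan2(end[1] - center[1], end[0] - center[0])
    delta = math.degrees(abs(a1 - a0))
    delta = min(delta, 360.0 - delta)  # shortest angular distance
    return delta >= min_angle_deg
```

A drag along the bezel of a round watch face sweeps a large angle about the center and would qualify, while a short tap-and-wiggle would not.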
  • the processor of the electronic device may be further set to ignore a touch input on the cover window other than the second type touch while the second type touch is being input.
  • An event corresponding to the second type touch of the electronic device may include at least one of selecting, changing, switching, scrolling, moving, enlarging, or reducing some or all of the user interface according to the second type touch.
  • An event corresponding to the second type touch of the electronic device may include an event that occurs while maintaining the user interface.
  • the electronic device further includes a haptic device (eg, the haptic device 370 of FIG. 3) operatively coupled to the processor, and the processor may be further set to control the haptic device to generate vibration based on the second type touch while an event corresponding to the second type touch is executed.
  • the processor of the electronic device may be further set to control the haptic device to generate the vibration based on at least one of a distance traveled by the second type touch, a speed of the second type touch, or an acceleration of the second type touch.
  • the processor of the electronic device may be further set to, when the user interface has an upper limit or a lower limit related to the second type touch, detect whether the second type touch reaches the upper limit or the lower limit, control the haptic device to generate vibration with a first intensity before the second type touch reaches the upper limit or the lower limit, and control the haptic device to generate vibration with a second intensity different from the first intensity when the second type touch reaches the upper limit or the lower limit.
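One plausible reading of this intensity scheme, as a sketch: the haptic intensity switches from an ordinary value to a distinct value once the second type touch drives the interface to its bound. The function name, the intensity values, and the inclusive bound convention are assumptions, not details from the disclosure.

```python
# Hypothetical intensity selection for a bounded, scrollable user interface.
# `position` is the current scroll position driven by the second type touch;
# `lower`/`upper` are the interface's limits. Intensities are illustrative.

def haptic_intensity(position: float, lower: float, upper: float,
                     first: float = 0.3, second: float = 0.8) -> float:
    """Vibrate at `first` while scrolling, and at `second` on hitting a limit."""
    if position <= lower or position >= upper:
        return second  # limit reached: distinct feedback intensity
    return first       # ordinary second-type-touch feedback
```

The distinct second intensity gives the user a non-visual cue that further rotation or scrolling will have no effect.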
  • the second type touch of the electronic device may pass through the first area and/or the second area.
  • the control method of an electronic device including a cover window, a touch panel, and a display according to the above-described embodiment includes: providing a user interface through the display; receiving a signal by a touch input from the touch panel while the user interface is provided; determining a touch input whose signal is greater than a first threshold value as a first type touch, and executing an event corresponding to the first type touch based on the user interface and the first type touch; and, in response to the touch input including movement of touch coordinates, determining the touch input as a second type touch distinguished from the first type touch, and executing an event corresponding to the second type touch based on the user interface and the second type touch.
  • An electronic device control method according to an embodiment may further include: acquiring a first signal and a second signal according to the touch input from a first group corresponding to channels adjacent to an outermost channel among a plurality of channels included in the touch panel and from a second group corresponding to outermost channels among the plurality of channels, respectively; determining that the touch input occurred in the second sub-region in response to the magnitude of the second signal being greater than the first threshold value; and determining that the signal occurred in the second area in response to the magnitude of the second signal exceeding a second threshold value lower than the first threshold value and the ratio of the magnitude of the second signal to the magnitude of the first signal exceeding the specified value.
  • the movement of the touch coordinates in the method of controlling an electronic device may correspond to movement of a specified angle or more based on the center of the touch panel.
  • the event corresponding to the second type touch in the electronic device control method according to an embodiment may include at least one of selecting, changing, switching, scrolling, moving, enlarging, or reducing some or all of the user interface according to the second type touch.
  • the event corresponding to the second type of touch in the electronic device control method may include an event that occurs while maintaining the user interface.
  • An electronic device control method may further include an operation of controlling a haptic device provided in the electronic device to generate vibration based on the second type touch while an event corresponding to the second type touch is executed.
  • An electronic device control method may further include an operation of controlling the haptic device to generate vibration based on at least one of a distance traveled by the second type touch, a speed of the second type touch, or an acceleration of the second type touch.
  • An electronic device control method according to an embodiment may further include: detecting, when the user interface has an upper limit or a lower limit related to the second type touch, whether the second type touch reaches the upper limit or the lower limit; controlling the haptic device to generate vibration with a first intensity before the second type touch reaches the upper limit or the lower limit; and controlling the haptic device to generate vibration with a second intensity different from the first intensity when the second type touch reaches the upper limit or the lower limit.
  • the second type of touch in the method of controlling an electronic device according to an embodiment may pass through the first area and/or the second area.
  • Examples described in this disclosure include non-limiting example implementations of components corresponding to one or more features specified by the appended independent claims, and these features (or their corresponding components), individually or in combination, may contribute to ameliorating one or more technical problems that a person skilled in the art may infer from this disclosure.
  • Additional example implementations can be realized by combining, jointly or individually, in any and all permutations, one or more components of any implementation described herein. Still other example implementations may also be realized by combining one or more features of the appended claims with selected components of any example implementation described in this disclosure.
  • In realizing such additional example implementations, some components of any example implementation described in this disclosure may be omitted.
  • The one or more components that may be omitted are those that a person skilled in the art would directly and clearly understand as not being essential to the function of the present technology in light of a technical problem discernible from this disclosure.
  • The person skilled in the art would understand that replacing or removing such an omitted component requires no modification of other components or features of the further alternative example to compensate for the change.
  • Accordingly, further example implementations may be included within the present disclosure, in accordance with the present technology, even though a selected combination of their features and/or components is not specifically mentioned.
  • Two or more physically separate components of any described example implementation may alternatively be integrated into a single component if their integration is possible; integration is possible if the single component so formed performs the same function. Conversely, a single component of any example implementation described in this disclosure may, where appropriate, alternatively be implemented as two or more separate components that achieve the same function.
  • a computer-readable storage medium storing one or more programs (software modules) may be provided.
  • One or more programs stored in a computer-readable storage medium are configured to be executable by one or more processors in an electronic device (device).
  • the one or more programs include instructions that cause the electronic device to execute methods according to embodiments described in the claims or specification of the present disclosure.
  • These programs may be stored in random access memory (RAM), non-volatile memory including flash memory, read only memory (ROM), electrically erasable programmable read only memory (EEPROM), a magnetic disc storage device, a compact disc-ROM (CD-ROM), digital versatile discs (DVDs), another type of optical storage device, or a magnetic cassette. Alternatively, they may be stored in a memory composed of a combination of some or all of these, and a plurality of such memories may be included.
  • In addition, the program may be stored in an attachable storage device that can be accessed through a communication network such as the Internet, an intranet, a local area network (LAN), a wide LAN (WLAN), or a storage area network (SAN), or a communication network consisting of a combination thereof. Such a storage device may access a device performing an embodiment of the present disclosure through an external port. In addition, a separate storage device on the communication network may access a device performing an embodiment of the present disclosure.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

An electronic device according to an embodiment of the present invention comprises: a housing; a touch screen and a display disposed inside the housing; a cover window comprising a first region corresponding to the touch screen and a second region corresponding to the outer region of the touch screen, the first region comprising a first sub-region corresponding to the inner region of the touch screen and a second sub-region corresponding to the outer region of the touch screen; and a processor coupled to the touch screen and the display. The processor may be configured to provide a user interface through the display, acquire a touch input signal while the user interface is provided, determine the region in which the touch input first occurs on the basis of the signal, and execute an event corresponding to the type of touch input determined on the basis of the user interface.
PCT/KR2020/010322 2019-08-05 2020-08-05 Control method based on touch input and electronic device therefor WO2021025456A1 (fr)

Priority Applications (3)

Application Number Priority Date Filing Date Title
CN202080052624.1A CN114144749B (zh) 2019-08-05 2020-08-05 Operation method based on touch input and electronic device thereof
EP20850608.9A EP3982240B1 (fr) 2019-08-05 2020-08-05 Control method based on touch input and electronic device therefor
US16/975,532 US11294496B2 (en) 2019-08-05 2020-08-05 Operation method based on touch input and electronic device thereof

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020190095215A Operation method based on touch input and electronic device therefor
KR10-2019-0095215 2019-08-05

Publications (1)

Publication Number Publication Date
WO2021025456A1 true WO2021025456A1 (fr) 2021-02-11

Family

ID=74503233

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2020/010322 WO2021025456A1 (fr) Control method based on touch input and electronic device therefor

Country Status (1)

Country Link
WO (1) WO2021025456A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090295753A1 (en) * 2005-03-04 2009-12-03 Nick King Electronic device having display and surrounding touch sensitive bezel for user interface and control
KR20130099717A * 2012-02-29 2013-09-06 Pantech Co., Ltd. Apparatus and method for providing a touch screen-based user interface
KR20140106996A * 2013-02-27 2014-09-04 Samsung Electronics Co., Ltd. Method and apparatus for providing haptics
KR20150095540A * 2014-02-13 2015-08-21 Samsung Electronics Co., Ltd. User terminal device and display method thereof
KR20170107872A * 2016-03-16 2017-09-26 LG Electronics Inc. Watch-type mobile terminal and control method therefor


Also Published As

Publication number Publication date
KR20210016875A (ko) 2021-02-17

Similar Documents

Publication Publication Date Title
WO2020085789A1 Foldable electronic device for controlling a user interface and operating method thereof
WO2019209041A1 Flexible display and electronic device equipped therewith
WO2019168318A1 Electronic device and fingerprint authentication interface method
WO2020218742A1 Foldable electronic device and operating method thereof
WO2021118061A1 Electronic device and layout configuration method using same
WO2020013528A1 Flexible display and electronic device comprising same
WO2020017743A1 Electronic device comprising a display on which an execution screen for multiple applications is displayed, and operating method of the electronic device
WO2021091286A1 Electronic device comprising a sensor for detecting an external input
AU2019318996B2 Electronic device including button and method for operation in electronic device
WO2019160347A1 Touch input processing method and electronic device supporting same
WO2021060889A1 Foldable electronic device and multi-window operating method using same
WO2020159308A1 Electronic device and method for mapping a function to a button input
WO2020190028A1 Electronic device, method, and computer-readable medium for displaying a screen in a deformable display panel
WO2020106019A1 Electronic device and method for providing an in-vehicle infotainment service
WO2021118187A1 Foldable electronic device having a rotatable camera and image capturing method thereof
WO2019160348A1 Electronic device for acquiring a user input in a submerged state using a pressure sensor, and control method of the electronic device
WO2021221421A1 Display control method and electronic device therefor
WO2021080360A1 Electronic device and method for controlling operation of the display thereof
WO2021145692A1 Foldable electronic device and screen display method
WO2019039729A1 Method for changing the size of content displayed on a display, and electronic device therefor
WO2020222428A1 Electronic apparatus and image output method thereof
WO2019199086A1 Electronic device and control method for electronic device
AU2018321518B2 Method for determining input detection region corresponding to user interface and electronic device thereof
WO2021133123A1 Electronic device comprising a flexible display and operating method thereof
WO2020091530A1 Method and device for determining compensation for touch data based on an operating mode of a display

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20850608

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2020850608

Country of ref document: EP

Effective date: 20220106

NENP Non-entry into the national phase

Ref country code: DE