WO2018143744A1 - Touch-sensitive display device and screen control method thereof - Google Patents

Touch-sensitive display device and screen control method thereof

Info

Publication number
WO2018143744A1
Authority
WO
WIPO (PCT)
Prior art keywords
screen
pressure
contact point
touch
zoom
Prior art date
Application number
PCT/KR2018/001506
Other languages
English (en)
Korean (ko)
Inventor
전재범
Original Assignee
주식회사 하이딥
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 주식회사 하이딥 filed Critical 주식회사 하이딥
Publication of WO2018143744A1 publication Critical patent/WO2018143744A1/fr

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04883 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F 3/0414 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means using force sensing means to determine a position
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F 3/0416 Control or interface arrangements specially adapted for digitisers
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/04815 Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/47 End-user applications
    • H04N 21/472 End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content
    • H04N 21/4728 End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content for selecting a Region Of Interest [ROI], e.g. for requesting a higher resolution version of a selected region
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 2203/00 Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F 2203/041 Indexing scheme relating to G06F3/041 - G06F3/045
    • G06F 2203/04104 Multi-touch detection in digitiser, i.e. details about the simultaneous detection of a plurality of touching locations, e.g. multiple fingers or pen and finger
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 2203/00 Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F 2203/048 Indexing scheme relating to G06F3/048
    • G06F 2203/04806 Zoom, i.e. interaction techniques or interactors for controlling the zooming operation

Definitions

  • The present invention relates to a touch-sensitive display device and a screen control method thereof and, more particularly, to improving a user's ability to operate a device by controlling the screen display in response to a pressure touch input on a device having a touch-sensitive display capable of sensing touch pressure.
  • the present invention relates to a touch-sensitive display device and a screen control method thereof.
  • Touch screens capable of detecting touch pressure have also appeared, enabling a variety of touch manipulation methods.
  • In one prior approach, the map image is retrieved and rendered by increasing the zoom level around the geographic origin and the geographic destination and decreasing the zoom level of other parts according to the intensity of the user's touch pressure, in an attempt to shorten the retrieval and rendering time.
  • An object of the present invention is to provide a touch sensitive display device and a method of controlling the same, which control a screen display in response to touch pressure.
  • Another object of the present invention is to provide a touch-sensitive display device and a method of controlling the same, which can easily manipulate a vector image.
  • the touch-sensitive display device of the present invention includes a touch screen, pressure sensing means, and a controller.
  • the touch screen includes a display element and a touch sensing element that senses a touch on a screen.
  • the control unit controls the screen according to the pressure of the contact point.
  • The control unit moves the screen in consideration of the position of the contact point.
  • the controller updates the resolution of the map to match the size of the map at the time of the hard press.
  • In another embodiment, when the user presses the screen beyond the threshold pressure at the end of a zoom gesture, the screen is enlarged or reduced beyond the current magnification.
  • the viewpoint of the 3D vector image is adjusted and displayed according to the position of the contact point.
  • the screen display of the vector type image is controlled in response to the touch pressure, thereby simplifying the screen operation.
  • the zoom-in or zoom-out operation using two fingers can be more conveniently utilized.
  • the operation of changing the viewpoint in the 3D vector image can be easily manipulated.
  • FIG. 1 is a block diagram of a touch-sensitive display device according to an embodiment of the present invention.
  • FIG. 2 is a view for explaining a zoom-in operation according to a first embodiment of the present invention.
  • FIG. 3 is a view for explaining a zoom-in operation according to a second embodiment of the present invention.
  • FIG. 4 is a diagram for explaining a zoom-in operation according to a third exemplary embodiment of the present invention.
  • FIG. 5 is a view for explaining a view point change operation according to a fourth embodiment of the present invention.
  • FIG. 6 shows a spherical coordinate system used when rendering a three-dimensional vector image.
  • FIG. 7 is a diagram for describing a correlation between a touch point on a screen and a viewpoint change in a fourth exemplary embodiment of the present invention.
  • The touch-sensitive display device described herein includes a mobile phone equipped with a touch screen, a smartphone, a laptop computer, a digital broadcasting terminal, a personal digital assistant, a navigation device, a slate PC, a tablet PC, an ultrabook, a wearable device, a kiosk, and the like.
  • FIG. 1 is a block diagram of a touch-sensitive display device 100 according to an embodiment of the present invention, showing an example in which the present invention is applied to a smartphone.
  • The touch-sensitive display device 100 includes a wireless communication unit 110, an input unit 120, a sensing unit 130, an output unit 150, an interface unit 160, a memory 140, a controller 180, a power supply unit 190, and the like.
  • the components shown in FIG. 1 are not essential to implementing a mobile terminal device, and the mobile terminal device described herein may have more or fewer components than those listed above.
  • The wireless communication unit 110 may include one or more modules that enable wireless communication between the touch-sensitive display device 100 and a wireless communication system, between the touch-sensitive display device 100 and another touch-sensitive display device 100, or between the touch-sensitive display device 100 and an external server. In addition, the wireless communication unit 110 may include one or more modules for connecting the touch-sensitive display device 100 to one or more networks.
  • the wireless communication unit 110 may include at least one of the mobile communication module 112, the wireless internet module 113, the short range communication module 114, and the location information module 115.
  • the mobile communication module 112 transmits and receives a radio signal with at least one of a base station, an external terminal, and a server on a mobile communication network constructed according to technical standards or communication schemes for mobile communication.
  • the wireless internet module 113 refers to a module for wireless internet access and may be built in or external to the touch-sensitive display device 100.
  • the wireless internet module 113 is configured to transmit and receive wireless signals in a communication network according to wireless internet technologies such as a wireless local area network (WLAN), wireless-fidelity (Wi-Fi), and the like.
  • The short-range communication module 114 supports short-range communication using technologies such as Bluetooth™, Radio Frequency Identification (RFID), Infrared Data Association (IrDA), ZigBee, and Near Field Communication (NFC).
  • the location information module 115 is a module for obtaining the location (or current location) of the mobile terminal device.
  • Representative examples include a Global Positioning System (GPS) module and a Wireless Fidelity (Wi-Fi) module, although the location information module is not limited to modules that directly calculate or obtain the location.
  • The input unit 120 may include a camera 121 or an image input unit for inputting an image signal, a microphone 122 or an audio input unit for inputting an audio signal, and a user input unit 123 (for example, touch keys and mechanical keys) for receiving information from a user.
  • the voice data or the image data collected by the input unit 120 may be analyzed and processed as a control command of the user.
  • the camera 121 processes image frames such as still images or moving images obtained by the image sensor in the video call mode or the photographing mode.
  • the processed image frame may be displayed on the display unit 151 or stored in the memory 140.
  • the microphone 122 processes external sound signals into electrical voice data.
  • the processed voice data may be utilized in various ways according to a function (or an application program being executed) performed by the touch-sensitive display device 100.
  • The user input unit 123 is for receiving information from a user. When information is input through the user input unit 123, the controller 180 may control the operation of the touch-sensitive display device 100 to correspond to the input information.
  • The user input unit 123 may include mechanical input means (for example, buttons, dome switches, jog wheels, or jog switches located on the front, rear, or side surfaces of the touch-sensitive display apparatus 100) and touch input means.
  • The touch input means may include a virtual key, a soft key, or a visual key displayed on the touch screen through software processing, or a touch key disposed at a portion other than the touch screen.
  • The virtual key or the visual key may be displayed on the touch screen in various forms, for example, as a graphic, text, an icon, a video, or a combination thereof.
  • the sensing unit 130 may include one or more sensors for sensing at least one of information in the mobile terminal device, surrounding environment information surrounding the mobile terminal device, and user information.
  • The sensing unit 130 may include a proximity sensor 131, an illumination sensor 132, a touch sensor, an acceleration sensor, a magnetic sensor, a gravity sensor (G-sensor), a gyroscope sensor, a motion sensor, and the like.
  • The output unit 150 generates output related to sight, hearing, or touch, and may include at least one of a display unit 151, a sound output unit 152, a haptic module 153, and a light output unit 154.
  • The display unit 151 may include a liquid crystal display (LCD), a thin-film-transistor liquid crystal display (TFT-LCD), an organic light-emitting diode (OLED) display, a flexible display, a 3D display, or an e-ink display.
  • the display unit 151 forms a layer structure with or is integrally formed with the touch sensor, thereby implementing a touch screen.
  • The touch screen may function as the user input unit 123 providing an input interface between the touch-sensitive display device 100 and the user and, at the same time, may provide an output interface between the touch-sensitive display device 100 and the user.
  • the display unit 151 may include a touch sensor that senses a touch on the display unit 151 so as to receive a control command by a touch method.
  • When a touch occurs, the touch sensor senses it, and the controller 180 generates a control command corresponding to the touch based on the sensed input.
  • The content input by the touch method may be letters, numbers, or menu items that can be indicated or designated in various modes.
  • the touch sensor may be formed in a film form having a touch pattern and disposed between the window and the display on the rear surface of the window, or may be a metal wire directly patterned on the rear surface of the window.
  • the sound output unit 152 outputs an audio signal such as music or voice and may include a receiver, a speaker, a buzzer, and the like.
  • the haptic module 153 generates various haptic effects that a user can feel. A representative example of the tactile effect generated by the haptic module 153 may be vibration.
  • the light output unit 154 outputs a signal for notifying occurrence of an event by using light of a light source of the touch-sensitive display device 100. Examples of events generated in the touch-sensitive display device 100 may include message reception, call signal reception, missed call, alarm, schedule notification, email reception, and information reception through an application.
  • the memory 140 stores data supporting various functions of the touch sensitive display apparatus 100.
  • The memory 140 may store a plurality of application programs (or applications) driven in the touch-sensitive display apparatus 100, as well as data and instructions for operating the touch-sensitive display apparatus 100. At least some of these applications may be downloaded from an external server via wireless communication. In addition, at least some of these applications may exist on the touch-sensitive display device 100 from the time of shipment to provide its basic functions (for example, receiving and placing calls, and receiving and sending messages).
  • the application program may be stored in the memory 140 and installed on the touch-sensitive display device 100 to be driven by the controller 180 to perform an operation (or function) of the mobile terminal device.
  • In addition to operations related to application programs, the controller 180 typically controls the overall operation of the touch-sensitive display device 100.
  • the controller 180 may provide or process information or a function appropriate to a user by processing signals, data, information, etc. input or output through the above-described components, or by driving an application program stored in the memory 140.
  • the controller 180 may control at least some of the components in order to drive an application program stored in the memory 140.
  • the controller 180 may operate at least two or more of the components included in the touch-sensitive display apparatus 100 in combination with each other to drive the application program.
  • the power supply unit 190 receives power from an external power source and an internal power source under the control of the controller 180 to supply power to each component included in the touch-sensitive display device 100.
  • the power supply unit 190 may include a battery, and the battery may be a built-in battery or a replaceable battery.
  • At least some of the components may operate in cooperation with each other in order to implement an operation, control, or control method of the mobile terminal apparatus according to various embodiments described below.
  • the operation, control, or control method of the mobile terminal device may be implemented on the mobile terminal device by driving at least one application program stored in the memory 140.
  • The foregoing description assumes that the present invention is applied to a smartphone.
  • However, depending on the nature of the apparatus to which the present invention is applied, components may be added or omitted as appropriate; for example, wired communication may be used instead of wireless communication, and the camera and microphone may be omitted.
  • The touch-sensitive display device 100 may distinguish the type of touch command based on the pressure. For example, the touch-sensitive display device 100 may recognize a touch gesture below a preset pressure as a selection command for the touched area, and may recognize a touch gesture at or above the preset pressure as an additional command.
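  • As a concrete illustration of this pressure-based discrimination, the following minimal Python sketch dispatches a touch event on a preset threshold. The threshold value, the normalized pressure scale, and the handler strings are illustrative assumptions, not details from the patent.

```python
# Minimal sketch of pressure-based command discrimination.
# The normalized pressure scale and the threshold value are assumptions.
SELECT_PRESSURE_THRESHOLD = 0.5  # assumed scale: 0.0 (no press) .. 1.0 (hard press)

def handle_touch(x: float, y: float, pressure: float) -> str:
    """Dispatch a touch event according to its sensed pressure."""
    if pressure < SELECT_PRESSURE_THRESHOLD:
        return f"select at ({x}, {y})"          # ordinary touch: selection command
    return f"additional command at ({x}, {y})"  # pressure touch: additional command
```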
  • the touch-sensitive display device 100 of the present invention is provided with a pressure sensing means.
  • the pressure sensing means may be integrally coupled to the touch screen or provided as a separate component, and the present invention is not limited to a specific pressure sensing scheme.
  • the pressure of the touch gesture can be detected using various methods.
  • the display unit 151 of the touch-sensitive display device 100 may include a touch recognition layer capable of sensing a touch and a fingerprint recognition layer capable of sensing a fingerprint.
  • Depending on the touch pressure, the image quality of the sensed fingerprint at the touched part may vary.
  • When the user touches the display unit 151 lightly, the fingerprint image of the touched portion may be blurred.
  • The display unit 151 including the fingerprint recognition layer may recognize the touched part with an image quality proportional to the touch pressure.
  • Accordingly, the touch-sensitive display device 100 may detect the intensity of the touch pressure from the image quality of the touched part.
  • the touch sensitive display apparatus 100 may detect the strength of the touch pressure by using a touch area recognized by the touch recognition layer. When the user presses lightly on the display unit 151, the touched area may be relatively small. In addition, when the user presses hard, the touched area may be relatively large. The touch sensitive display apparatus 100 may calculate the touch pressure by using a relationship between the touched area and the pressure. Accordingly, the touch-sensitive display device 100 may recognize a touch gesture of a predetermined pressure or more.
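  • A rough Python sketch of the area-based estimation just described follows; the linear area-to-pressure model and its coefficients are assumptions for illustration, and a real device would calibrate this mapping empirically.

```python
# Sketch: estimating touch pressure from the sensed contact area.
# Larger contact area -> harder press, per the text; the linear model is assumed.
def pressure_from_area(area_mm2: float,
                       base_area_mm2: float = 20.0,   # assumed light-touch area
                       gain: float = 0.05) -> float:  # assumed calibration gain
    """Map a contact area (mm^2) to an estimated pressure value."""
    return max(0.0, (area_mm2 - base_area_mm2) * gain)
```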
  • The touch-sensitive display apparatus 100 may also detect the pressure of the touch gesture using a piezoelectric element.
  • a piezoelectric element refers to a device that detects pressure or generates deformation / vibration by using a piezoelectric effect.
  • When mechanical stress (more precisely, a mechanical force or pressure) is applied to a piezoelectric material and deformation occurs, polarization occurs in the solid and electric charges accumulate.
  • The accumulated charge appears in the form of an electrical signal, i.e., a voltage, between the electrodes of the material. This phenomenon is called the piezoelectric effect, the solid material is called a piezoelectric material, and the accumulated charge is called piezoelectricity.
  • the touch-sensitive display device 100 may include a sensing unit (not shown) including a layer made of a piezoelectric material that can be driven by the piezoelectric effect.
  • The sensing unit may detect the electrical energy (a voltage, which is a kind of electrical signal) generated by the deformation caused by the applied mechanical energy (force or pressure), and may determine the applied mechanical force or pressure based on the detected voltage.
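  • As a sketch of this voltage-based detection, assuming a simple linear charge model; the coefficient values below are placeholders, not device data:

```python
# For a piezoelectric layer, the accumulated charge q produced by a force F is
# approximately q = d * F (d: piezoelectric coefficient), and the voltage across
# the electrodes is v = q / C (C: electrode capacitance). Inverting gives F.
PIEZO_COEFF_C_PER_N = 2.3e-12   # d, charge per newton (placeholder, quartz-like)
LAYER_CAPACITANCE_F = 1.0e-9    # C, electrode capacitance (placeholder)

def force_from_voltage(v: float) -> float:
    """Recover the applied force in newtons from the measured voltage."""
    return v * LAYER_CAPACITANCE_F / PIEZO_COEFF_C_PER_N
```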
  • the touch-sensitive display device 100 may include three or more pressure sensors as pressure sensing means. Three or more pressure sensors may be disposed in different layers in the display unit 151 area or may be disposed in the bezel area.
  • the pressure sensor may sense the amount of pressure applied.
  • The strength of the pressure detected by each pressure sensor may be inversely proportional to the distance between the touch point and that sensor.
  • the intensity of the pressure detected by the pressure sensor may be proportional to the touch pressure.
  • The touch-sensitive display apparatus 100 may calculate the touch point and the intensity of the actual touch pressure by using the intensity of the pressure detected by each pressure sensor.
  • the touch sensitive display apparatus 100 may include a touch input layer that detects a touch input to detect a touch point.
  • the touch-sensitive display apparatus 100 may calculate the intensity of the touch pressure of the touch point by using the detected touch point and the intensity of the pressure detected by each pressure sensor.
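  • The following Python sketch combines the touch point reported by the touch input layer with the readings of three pressure sensors, under the inverse-distance model stated above (each reading taken as proportional to the touch pressure and inversely proportional to the sensor's distance from the contact point). The sensor positions, units, and averaging step are assumptions.

```python
import math

# Assumed bezel positions of three pressure sensors, in mm.
SENSORS = [(0.0, 0.0), (60.0, 0.0), (30.0, 120.0)]

def estimate_pressure(touch: tuple[float, float], readings: list[float]) -> float:
    """Estimate the touch pressure P from the model s_i ~ P / d_i, i.e.
    P ~ s_i * d_i, averaging the per-sensor estimates to reduce noise."""
    tx, ty = touch
    estimates = []
    for (sx, sy), s in zip(SENSORS, readings):
        d = max(math.hypot(tx - sx, ty - sy), 1e-6)  # guard a touch on a sensor
        estimates.append(s * d)
    return sum(estimates) / len(estimates)
```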
  • FIG. 2 is a diagram for explaining a zoom-in operation according to the first exemplary embodiment of the present invention, and illustrates a case where the map screen is enlarged.
  • the present invention can be applied to enlarged viewing of various screens such as a web browser screen, a photo screen, a CAD screen, in addition to the enlarged view of the map screen.
  • The zoom-in gesture for enlarging the screen generally consists of touching two contact points P1 and P2 on the screen with two fingers (initiation of the zoom-in gesture), spreading the two fingers apart, for example in the direction of the arrows shown in FIG. 2(a) (reaching the state of FIG. 2(b)), and then lifting the fingers off the screen.
  • While the zoom-in gesture is performed, the controller 180 enlarges and displays the screen accordingly. That is, the magnification of the screen is increased according to how far the fingers have spread, from the beginning of the zoom-in gesture until the fingers stop.
  • The enlargement of the screen is made around the midpoint between the two fingers or the center point of the screen.
  • the screen is moved in consideration of the position of the contact point.
  • In the present embodiment, the controller 180 enlarges and displays the screen accordingly and, as shown in FIG. 2(c), when the user presses one of the contact points (P2') harder than the threshold pressure at the end of the zoom-in gesture (this pressure is hereinafter referred to as the 'end pressure'), the touch screen is controlled to move the screen, in its enlarged state, in consideration of the position of the pressed contact point. The screen at this time is shown in FIG. 2(d).
  • the screen may be moved so that the first contact point is in the center of the screen, or the screen may be moved by a predetermined distance such that the first contact point is moved in the center direction of the screen.
  • the screen may be moved so that the center of the screen moves by a predetermined distance in the direction of the first contact point.
  • the moving distance may be configured to be adjusted according to the magnitude of the end pressure, or the screen may be moved while the end pressure of the first contact point is maintained larger than the threshold pressure.
  • The threshold pressure can be set appropriately according to the apparatus, application field, and the like to which it is applied.
  • For example, the threshold pressure may be set a predetermined ratio higher, e.g., 20% higher, than the pressure at the first contact point at the beginning of the zoom-in gesture.
  • Alternatively, the threshold pressure may be set to a pressure higher than the pressure of the first contact point at the beginning of the zoom-in gesture by a predetermined magnitude, or to a fixed magnitude of pressure.
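  • A minimal Python sketch of one variant of this first embodiment follows, in which the enlarged view pans a small step at a time toward the pressed contact point while the end pressure stays above the threshold. The step size and the pressure-polling callback are illustrative assumptions.

```python
import math
from typing import Callable, Tuple

def pan_toward_contact(center: Tuple[float, float],
                       contact: Tuple[float, float],
                       read_pressure: Callable[[], float],
                       threshold: float,
                       step_mm: float = 2.0) -> Tuple[float, float]:
    """Move the enlarged view's center toward the pressed contact point,
    one small step per pressure sample, while the press exceeds the threshold."""
    cx, cy = center
    px, py = contact
    while read_pressure() > threshold:
        dx, dy = px - cx, py - cy
        dist = math.hypot(dx, dy)
        if dist <= step_mm:          # contact point has reached the screen center
            return (px, py)
        cx += dx / dist * step_mm
        cy += dy / dist * step_mm
    return (cx, cy)
```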
  • FIG. 3 is a view for explaining a zoom-in operation according to a second exemplary embodiment of the present invention, which shows an enlarged view of a map screen.
  • the present invention can be applied to enlarged viewing of various screens such as a web browser screen, a photo screen, a CAD screen, in addition to the enlarged view of the map screen.
  • The zoom-in gesture for enlarging the screen generally consists of touching two contact points P3 and P4 on the screen with two fingers (initiation of the zoom-in gesture) and then spreading the two fingers apart, for example in the direction of the arrows shown in FIG. 3.
  • While the zoom-in gesture is performed, the controller 180 enlarges and displays the screen accordingly. That is, the magnification of the screen is increased according to how far the fingers have spread, from the beginning of the zoom-in gesture until the fingers stop.
  • Conventionally, the map at its existing resolution is simply scaled up during the zoom-in operation, and only after the zoom-in gesture is over is the map reloaded to fit the current screen size.
  • That is, a map with a resolution matching the current map size is displayed only after the user lifts the fingers off the screen following the enlargement of the current-resolution map, so the user could not see, during the zoom-in gesture, whether the desired information appeared at the desired resolution.
  • In contrast, the present embodiment is configured to update the resolution of the map to match the current size of the map when the user presses at least one contact point harder than the threshold pressure during the zoom-in gesture.
  • the controller 180 enlarges and displays the screen accordingly.
  • When the pressure increases beyond the threshold, the controller updates the resolution of the screen to fit the enlarged screen and displays it.
  • Alternatively, the resolution may be updated only when the pressures at the two contact points P3' and P4' are both greater than the threshold pressure.
  • The threshold pressure can be set appropriately according to the apparatus, application field, and the like to which it is applied.
  • For example, the threshold pressure may be set a predetermined ratio higher, e.g., 20% higher, than the pressure at the contact point at the beginning of the zoom-in gesture.
  • Alternatively, the threshold pressure may be set to a pressure higher than the pressure of the contact point at the beginning of the zoom-in gesture by a predetermined magnitude, or to a fixed magnitude of pressure.
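  • The per-frame decision of this second embodiment can be sketched as follows in Python; `render_map` stands in for a hypothetical map renderer, and the 20% threshold rule follows the example above.

```python
def on_zoom_frame(scale: float,
                  pressures: tuple[float, float],
                  initial_pressure: float,
                  render_map) -> None:
    """During a pinch-out: redraw at full resolution once a contact point is
    pressed beyond the threshold, otherwise just scale the old bitmap."""
    threshold = initial_pressure * 1.2     # e.g., 20% above the initial pressure
    if max(pressures) > threshold:         # variant: require min(pressures) instead
        render_map(scale, resample=True)   # fetch/redraw at the matching resolution
    else:
        render_map(scale, resample=False)  # cheap scaling of the old-resolution map
```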
  • FIG. 4 is a view for explaining a zoom-in operation according to a third exemplary embodiment of the present invention, which shows a case where the map screen is enlarged.
  • the present embodiment may be applied to a zoom out operation in which the map screen is reduced in addition to the enlarged view of the map screen.
  • the present invention can be applied to zoom-in and zoom-out operations of various screens, such as a web browser screen, a photo screen, and a CAD screen.
  • The zoom-in gesture for enlarging the screen generally consists of touching two contact points P1 and P2 on the screen with two fingers (initiation of the zoom-in gesture), spreading the two fingers apart, for example in the direction of the arrows shown in FIG. 4(a) (reaching the state of FIG. 4(b)), and then lifting the fingers off the screen.
  • While the zoom-in gesture is performed, the controller 180 enlarges and displays the screen accordingly. That is, the magnification of the screen is increased according to how far the fingers have spread, from the beginning of the zoom-in gesture until the fingers stop.
  • In the zoom-out gesture, the screen is displayed while its magnification is reduced as the fingers are pinched together from the position of FIG. 4(b) in the direction opposite to the arrows shown in FIG. 4(a).
  • If the magnification could be increased directly from the minimum-magnification screen to the maximum-magnification screen in a single gesture, the magnification would change too rapidly in one operation, making it difficult to adjust it as precisely as desired. Therefore, in the related art, the screen can be enlarged from the minimum magnification to the maximum magnification only through two or more zoom-in gestures. This, however, is inconvenient, since the zoom gesture must be repeated several times to obtain a screen of the desired magnification.
  • In the present embodiment, when the screen is pressed beyond the threshold pressure at the end of the zoom gesture, the screen is enlarged or reduced beyond the current magnification.
  • Likewise, pressing at least one contact point P5' or P6' harder than the threshold pressure at the end of the zoom-out gesture further reduces and displays the screen.
  • Alternatively, this function may be configured so that both contact points P5' and P6' must be pressed harder than the threshold pressure.
  • the final screen magnification in FIG. 4D may be determined in various ways.
  • For example, the controller 180 may adjust the final screen magnification according to the magnitude of the pressure detected at the contact point at the end of the zoom gesture (for example, the pressure at the first contact point, or the average of the pressures at the first and second contact points). In the case of the zoom-in operation, when the pressure is higher than a first threshold pressure and lower than a second threshold pressure, the final screen magnification may be one step higher than the magnification in FIG. 4(c), and when the pressure is higher than the second threshold pressure, the final magnification may be set to the maximum magnification.
  • the controller 180 may control to enlarge or reduce the screen continuously while the pressure of the contact point detected at the end of the zoom gesture is larger than the threshold pressure.
  • For example, the device may be configured to continuously increase the magnification (e.g., by a factor of 2) while the user maintains the press in the state of FIG. 4(c).
  • Alternatively, the controller 180 may control the screen to be enlarged or reduced continuously from the magnification at that time until the final screen magnification reaches the limit magnification. For example, in the case of the zoom-out operation, if the user presses at least one of the contact points harder at the end of the zoom-out gesture so that the pressure exceeds the threshold pressure, the controller 180 may sequentially reduce the magnification of the screen from the magnification at the time of pressing down to the lowest magnification, which is the zoom-out limit magnification.
  • The threshold pressure can be set appropriately according to the apparatus, application field, and the like to which it is applied.
  • For example, the threshold pressure may be set a predetermined ratio higher, e.g., 20% higher, than the pressure at the first contact point at the beginning of the zoom gesture.
  • Alternatively, the threshold pressure may be set to a pressure higher than the pressure of the contact point at the beginning of the zoom gesture by a predetermined magnitude, or to a fixed magnitude of pressure.
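  • The continuous variant of this third embodiment can be sketched in Python as follows; the zoom rate per pressure sample and the limit magnifications are illustrative assumptions.

```python
from typing import Callable

def continue_zoom(magnification: float,
                  read_pressure: Callable[[], float],
                  threshold: float,
                  zoom_in: bool,
                  rate_per_tick: float = 1.02,      # assumed per-sample zoom rate
                  min_mag: float = 1.0,             # assumed zoom-out limit
                  max_mag: float = 16.0) -> float:  # assumed zoom-in limit
    """Keep enlarging (or reducing) the screen while the end-of-gesture press
    stays above the threshold, stopping at the limit magnification."""
    while read_pressure() > threshold:
        magnification *= rate_per_tick if zoom_in else 1.0 / rate_per_tick
        if magnification <= min_mag or magnification >= max_mag:
            return min(max(magnification, min_mag), max_mag)  # clamp at the limit
    return magnification
```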
  • FIG. 5 is a view for explaining a viewpoint change operation according to a fourth embodiment of the present invention, and illustrates a case of changing and viewing a viewpoint of a 3D map screen.
  • the present invention can be applied to a viewpoint change operation of various screens including three-dimensional vector images, such as a three-dimensional CAD screen and a three-dimensional video game screen, in addition to changing and viewing a three-dimensional map screen.
  • Conventionally, in a scroll gesture, the map image corresponding to the coordinates of the portion where the finger first touched moves together with the movement of the finger. Accordingly, the user may move a desired portion of the map to the center of the screen using a scroll gesture.
  • Meanwhile, it may be desired to change the viewpoint without changing the coordinate on the map located at the center of the screen.
  • Conventionally, a separately provided viewpoint-change icon must be used to change the viewpoint, which is inconvenient, and the area in which the map is displayed is narrowed because the icon must occupy a certain portion of the screen.
  • In the present embodiment, when the user presses a point on the screen with at least the threshold pressure, the viewpoint of the three-dimensional vector image is adjusted in correspondence with the position of the contact point and displayed.
  • Various methods may be used as a method of adjusting the view corresponding to the position of the contact point.
  • First, a spherical coordinate system used when rendering a 3D vector image will be described. FIG. 6 shows such a spherical coordinate system.
  • The object to be observed is located at the origin O of the coordinate system, and the observer is located at a point P on the surface of a sphere of radius r.
  • The coordinates of the observer may be represented by the radius r, a horizontal angle θ, and a vertical angle φ.
  • The viewing point of the observer can then be represented by the horizontal angle θ and the vertical angle φ.
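  • For reference, the observer position P(r, θ, φ) can be converted to Cartesian coordinates with the usual spherical-coordinate relations. The Python sketch below treats θ as the horizontal angle in the x-y plane and φ as the vertical angle measured up from that plane; the exact angle convention of FIG. 6 may differ.

```python
import math

def observer_position(r: float, theta: float, phi: float) -> tuple[float, float, float]:
    """Cartesian position of an observer at radius r, horizontal angle theta,
    and vertical angle phi (elevation above the x-y plane), looking at the origin."""
    x = r * math.cos(phi) * math.cos(theta)
    y = r * math.cos(phi) * math.sin(theta)
    z = r * math.sin(phi)
    return (x, y, z)
```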
  • FIG. 7 is a diagram for describing a correlation between a touch point on a screen and a viewpoint change in a fourth exemplary embodiment of the present invention.
  • Suppose the coordinate of the contact point P7, relative to the center of the screen, is (x1, y1).
  • the viewpoint is changed according to the value of the coordinate (x1, y1) of the contact point P7.
  • The viewpoint vertical angle corresponds to the vertical angle φ in FIG. 6.
  • The viewpoint horizontal angle corresponds to the horizontal angle θ in FIG. 6.
  • The viewpoint may be changed and displayed according to the y-axis coordinate value alone. For example, if the position of the contact point P7 is above the center of the screen, that is, if y1 has a positive value, the viewpoint vertical angle is increased; if it is below the center of the screen, that is, if y1 has a negative value, the viewpoint vertical angle is decreased.
  • FIG. 5 is an example of such a case. Since the position of the contact point P7 is above the center of the screen, the viewpoint vertical angle is increased so that the viewpoint becomes closer to the ground, as shown in the screen of FIG. 5(b).
  • Alternatively, the viewpoint may be changed and displayed according to the x-axis coordinate value alone. For example, when the position of the contact point P7 is to the left of the center of the screen, the viewpoint is moved to the right (that is, the viewpoint horizontal angle is increased), and when it is to the right of the center of the screen, the viewpoint is moved to the left (that is, the viewpoint horizontal angle is decreased). Intuitively, when a point to the left of the center of the screen of FIG. 5(a) is pressed, the left part of the scene moves back from the origin and the right part comes forward.
  • Alternatively, the viewpoint may be changed in consideration of both the x-axis and y-axis coordinate values. For example, when the position of the contact point P7 is in the upper left of the screen, that is, when the value of y1 is positive and the value of x1 is negative, both the viewpoint vertical angle and the viewpoint horizontal angle are increased; the other positions of the contact point P7 are handled correspondingly.
  • The degree of viewpoint change may be determined according to any one or more of the pressure of the contact, the duration of the contact, and the distance from the center of the screen to the contact point.
  • For example, the harder the press, the greater the degree of viewpoint change; the longer the contact time, the greater the change; and the farther from the center of the screen the press is, the greater the change.
  • In addition, while the pressure at the contact point P7 is maintained above the threshold pressure, the viewpoint of the screen may be continuously adjusted and displayed.
  • The threshold pressure can be set appropriately according to the apparatus, application field, and the like to which it is applied.
  • For example, the threshold pressure can be set to a fixed magnitude of pressure, which can be determined appropriately according to hardware characteristics, software characteristics, and the like.
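  • Putting this fourth embodiment together, the Python sketch below maps the contact point (x1, y1), measured from the screen center, to changes in the two viewpoint angles; the gain and the multiplicative weighting of pressure and contact time are assumptions chosen for illustration.

```python
def viewpoint_delta(x1: float, y1: float,
                    pressure: float, contact_time_s: float,
                    gain: float = 0.01) -> tuple[float, float]:
    """Return (d_theta, d_phi). Per the text: a press left of center moves the
    viewpoint right (horizontal angle up); a press above center raises the
    vertical angle; harder, longer, or farther-off-center presses change more."""
    weight = gain * pressure * contact_time_s
    d_theta = -x1 * weight   # x1 < 0 (left of center)  -> d_theta > 0
    d_phi = y1 * weight      # y1 > 0 (above center)    -> d_phi > 0
    return (d_theta, d_phi)
```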
  • the screen display of the vector type image is controlled in response to the touch pressure, thereby simplifying the screen operation.
  • the zoom-in or zoom-out operation using two fingers can be more conveniently utilized.
  • the operation of changing the viewpoint in the 3D vector image can be easily manipulated.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Databases & Information Systems (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A touch-sensitive display device of the present invention includes a touch screen, pressure sensing means, and a control unit. The touch screen includes a display element and a touch sensing element for sensing contact on the screen. The control unit controls the screen according to the pressure at a contact point. The control unit moves the screen, updates its resolution, or further adjusts the screen magnification in consideration of the position of a contact point where a pressure greater than a threshold pressure is applied. In another embodiment of the present invention, when a user presses a point (the "contact point") on the screen with a pressure equal to or greater than a threshold pressure, the viewpoint of a three-dimensional vector image is adjusted according to the position of the contact point, and the adjusted vector image is displayed. According to the present invention, manipulation of a vector image is further simplified.
PCT/KR2018/001506 2017-02-03 2018-02-05 Touch-sensitive display device and screen control method thereof WO2018143744A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR10-2017-0015403 2017-02-03
KR1020170015403A KR101911680B1 (ko) 2017-02-03 2017-02-03 터치 감지 디스플레이 장치 및 그 화면 제어 방법

Publications (1)

Publication Number Publication Date
WO2018143744A1 true WO2018143744A1 (fr) 2018-08-09

Family

ID=63040896

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2018/001506 WO2018143744A1 (fr) Touch-sensitive display device and screen control method thereof

Country Status (2)

Country Link
KR (1) KR101911680B1 (fr)
WO (1) WO2018143744A1 (fr)


Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102484768B1 (ko) 2022-08-25 2023-01-09 주식회사 엔에스로보텍 터칭 로봇
KR102557082B1 (ko) 2022-12-12 2023-07-20 주식회사 엔에스로보텍 다관절 터칭로봇을 이용한 무인환경측정소 대기측정 및 분석장비의 원격 제어방법과 그 시스템


Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20110008715A (ko) * 2009-07-21 2011-01-27 주식회사 만도 차량 엔진 제어 방법 및 장치
KR20120006672A (ko) * 2010-07-13 2012-01-19 엘지전자 주식회사 이동 단말기 및 그 제어 방법
KR20120135723A (ko) * 2011-06-07 2012-12-17 김연수 터치패널 타입의 신호입력장치
JP2014052852A (ja) * 2012-09-07 2014-03-20 Sharp Corp 情報処理装置
KR20140111188A (ko) * 2013-03-08 2014-09-18 삼성디스플레이 주식회사 단말기 및 그의 조작 방법

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112911362A (zh) * 2021-01-29 2021-06-04 广州虎牙科技有限公司 视频画面移动缩放方法、装置、电子设备及可读存储介质
CN112911362B (zh) * 2021-01-29 2024-03-08 广州虎牙科技有限公司 视频画面移动缩放方法、装置、电子设备及可读存储介质

Also Published As

Publication number Publication date
KR20180090488A (ko) 2018-08-13
KR101911680B1 (ko) 2018-10-25

Similar Documents

Publication Publication Date Title
WO2020258929A1 Folder interface switching method and terminal device
WO2021218902A1 Display control method and apparatus, and electronic device
WO2016195291A1 User terminal apparatus and control method therefor
WO2015016527A1 Method and apparatus for controlling lock/unlock
CN110908558B Image display method and electronic device
WO2015046809A1 Method for displaying previews in a widget
WO2018194275A1 Apparatus capable of detecting touch and touch pressure, and control method therefor
WO2015030488A1 Multi display method, storage medium, and electronic device
WO2020134744A1 Icon moving method and mobile terminal
EP3695591A1 Electronic device for controlling a plurality of applications
WO2019156437A1 Fingerprint detection method using pressure in standby mode of touch input device, and touch input device
WO2021121121A1 Display control method and electronic device
JP2023500149A Screen display control method and electronic device
WO2010151053A2 Mobile terminal using a touch sensor attached to the casing, and control method therefor
WO2020173235A1 Task switching method and terminal device
KR20120009851A Method for executing protection mode in a mobile terminal and mobile terminal using the same
WO2015178661A1 Method and apparatus for processing an input signal by means of a display device
WO2020215982A1 Desktop icon management method and terminal device
WO2018143744A1 Touch-sensitive display device and screen control method thereof
WO2018236047A1 Device and control method capable of touch sensing and touch pressure sensing
WO2017126709A1 Mobile terminal and control method therefor
US11526320B2 Multi-screen interface control method and terminal device
JP7413546B2 Photographing method and electronic device
WO2021133123A1 Electronic device including flexible display and operating method thereof
WO2021129721A1 Icon arrangement method, electronic apparatus, and computer-readable storage medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18748427

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18748427

Country of ref document: EP

Kind code of ref document: A1