WO2021201603A1 - Electronic device and control method therefor - Google Patents

Electronic device and control method therefor

Info

Publication number
WO2021201603A1
WO2021201603A1 (PCT/KR2021/004011)
Authority
WO
WIPO (PCT)
Prior art keywords
volume control
electronic device
touch screen
volume
control function
Prior art date
Application number
PCT/KR2021/004011
Other languages
English (en)
Korean (ko)
Inventor
김정현
정한철
최봉학
김봉건
복일근
Original Assignee
삼성전자 주식회사 (Samsung Electronics Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 삼성전자 주식회사 (Samsung Electronics Co., Ltd.)
Publication of WO2021201603A1

Classifications

    • G06F3/16 Sound input; Sound output
    • G06F3/165 Management of the audio stream, e.g. setting of volume, audio stream path
    • G06F11/32 Monitoring with visual or acoustical indication of the functioning of the machine
    • G06F11/324 Display of status information
    • G06F11/327 Alarm or error message display
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/016 Input arrangements with force or tactile feedback as computer generated output to the user
    • G06F3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0416 Control or interface arrangements specially adapted for digitisers
    • G06F3/0486 Drag-and-drop
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, e.g. input of commands through traced gestures

Definitions

  • An embodiment of the present disclosure relates to an electronic device capable of controlling a volume through a touch screen and a control method thereof.
  • the electronic device may convert a sound within an audible frequency band, such as music, a notification sound, or a human voice, into an audio signal that is electrical energy and record the sound on a recording medium, or convert the recorded audio signal into sound and output it through a speaker.
  • the electronic device is shipped with the volume of the output audio signal adjustable between a minimum level and a maximum level, and the user can adjust the volume of the audio signal output through the speaker to a desired level.
  • conventionally, the volume was adjusted by using a physical button provided on the outer frame of the electronic device, or by displaying a volume control UI on the touch screen through multiple selections and then manipulating that UI.
  • Various embodiments of the present disclosure may provide a method for instantly and accurately controlling a volume through a touch screen without using a physical button, and an electronic device supporting the same.
  • An electronic device includes a touch screen and a processor operatively connected to the touch screen, wherein the processor is configured to activate a volume control function when a preset touch input is detected in one area of an edge area of the touch screen.
  • the method for controlling an electronic device may include, when a preset touch input for activating a volume control function is detected in one region of an edge region of the touch screen of the electronic device, providing feedback notifying that the volume control function is activated, and, when a drag input is sensed along the edge region after the feedback is provided, changing a volume based on a direction and a distance of the drag input.
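The drag-to-volume mapping described above can be sketched as follows. This is a hypothetical illustration, not text from the publication: the volume range, the pixels-per-step ratio, and all names are assumptions made for the example.

```python
# Hypothetical sketch of changing the volume based on the direction and
# distance of a drag input along the screen edge. MIN/MAX levels and the
# drag distance per step are assumed values, not taken from the patent.

MIN_VOLUME, MAX_VOLUME = 0, 15   # assumed shipped volume range
PIXELS_PER_STEP = 40             # assumed drag distance per volume step

def volume_after_drag(current: int, drag_start_y: int, drag_end_y: int) -> int:
    """Return the new volume after a vertical drag along the edge region.

    Dragging upward (toward smaller screen y) raises the volume; dragging
    downward lowers it. The result is clamped to the min/max levels.
    """
    distance = drag_start_y - drag_end_y   # positive means an upward drag
    steps = distance // PIXELS_PER_STEP    # whole volume steps travelled
    return max(MIN_VOLUME, min(MAX_VOLUME, current + steps))
```

An upward drag of 160 px from volume 7 would raise it by four steps to 11, while a long downward drag from a low volume stops at the minimum level rather than going negative.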
  • a new experience may be provided to the user by controlling the volume by touching an edge region of the touch screen that meets the frame of the electronic device.
  • the position of the volume control UI can be freely adjusted within the edge area of the screen, so it can be customized to whether the user's dominant hand is the right or left hand, or to the user's hand size.
  • a malfunction of the volume control due to an unintentional touch input may be prevented.
  • FIG. 1 is a block diagram of an electronic device in a network environment, according to an embodiment.
  • FIG. 2 is a block diagram of an electronic device capable of controlling a volume using a touch screen, according to an embodiment.
  • FIG. 3 is a flowchart illustrating an operation of controlling a volume using a touch screen of an electronic device, according to an exemplary embodiment.
  • FIG. 4 is a diagram for describing one area among edge areas related to activation of a volume control function, according to an exemplary embodiment.
  • FIG. 5 is a view for explaining one of the edge areas related to activation of a volume control function, according to an exemplary embodiment.
  • FIG. 6 is a diagram for describing a volume control UI displayed according to activation of a volume control function, according to an exemplary embodiment.
  • FIG. 7 is a diagram for describing a volume control UI displayed according to activation of a volume control function, according to an exemplary embodiment.
  • FIG. 8 is a diagram for explaining an operation of moving a location of a volume control UI, according to an exemplary embodiment.
  • FIG. 9 is a diagram for explaining an operation of performing a different function according to a type of a touch input, according to an exemplary embodiment.
  • FIG. 10 is a diagram for describing a volume control UI that changes according to a user's situation, according to an exemplary embodiment.
  • FIG. 11 is a diagram for explaining a volume control operation that varies according to a user's situation, according to an exemplary embodiment.
  • FIG. 12 is a view for explaining an embodiment to which the present disclosure is applied when an electronic device is in an out-fold form.
  • FIG. 13 is a diagram for describing an embodiment to which the present disclosure is applied when an electronic device is a smart watch.
  • FIG. 1 is a block diagram of an electronic device 101 in a network environment 100, according to an embodiment.
  • the electronic device 101 may communicate with the electronic device 102 through a first network 198 (e.g., a short-range wireless communication network), or may communicate with the electronic device 104 or the server 108 through a second network 199 (e.g., a long-range wireless communication network). According to an embodiment, the electronic device 101 may communicate with the electronic device 104 through the server 108.
  • the electronic device 101 may include a processor 120, a memory 130, an input device 150, a sound output device 155, a display device 160, an audio module 170, a sensor module 176, an interface 177, a haptic module 179, a camera module 180, a power management module 188, a battery 189, a communication module 190, a subscriber identification module 196, or an antenna module 197. In some embodiments, at least one of these components (e.g., the display device 160 or the camera module 180) may be omitted, or one or more other components may be added to the electronic device 101. In some embodiments, some of these components may be implemented as a single integrated circuit. For example, the sensor module 176 (e.g., a fingerprint sensor, an iris sensor, or an illuminance sensor) may be implemented while being embedded in the display device 160 (e.g., a display).
  • the processor 120 may, for example, execute software (e.g., a program 140) to control at least one other component (e.g., a hardware or software component) of the electronic device 101 connected to the processor 120, and may perform various data processing or operations. According to an embodiment, as at least part of the data processing or operations, the processor 120 may load commands or data received from other components (e.g., the sensor module 176 or the communication module 190) into the volatile memory 132, process the commands or data stored in the volatile memory 132, and store the resulting data in the non-volatile memory 134.
  • the processor 120 may include a main processor 121 (e.g., a central processing unit or an application processor) and an auxiliary processor 123 (e.g., a graphics processing unit, an image signal processor, a sensor hub processor, or a communication processor) that can operate independently of or together with the main processor 121. Additionally or alternatively, the auxiliary processor 123 may be configured to use less power than the main processor 121 or to be specialized for a designated function. The auxiliary processor 123 may be implemented separately from, or as a part of, the main processor 121.
  • the auxiliary processor 123 may, for example, control at least some of the functions or states related to at least one of the components of the electronic device 101 (e.g., the display device 160, the sensor module 176, or the communication module 190), on behalf of the main processor 121 while the main processor 121 is in an inactive (e.g., sleep) state, or together with the main processor 121 while the main processor 121 is in an active (e.g., application-executing) state.
  • According to an embodiment, the auxiliary processor 123 (e.g., an image signal processor or a communication processor) may be implemented as a part of another component (e.g., the camera module 180 or the communication module 190) that is functionally related thereto.
  • the memory 130 may store various data used by at least one component (eg, the processor 120 or the sensor module 176 ) of the electronic device 101 .
  • the data may include, for example, input data or output data for software (eg, the program 140 ) and instructions related thereto.
  • the memory 130 may include a volatile memory 132 or a non-volatile memory 134 .
  • the program 140 may be stored as software in the memory 130 , and may include, for example, an operating system 142 , middleware 144 , or an application 146 .
  • the input device 150 may receive a command or data to be used by a component (eg, the processor 120 ) of the electronic device 101 from the outside (eg, a user) of the electronic device 101 .
  • the input device 150 may include, for example, a microphone, a mouse, a keyboard, or a digital pen (eg, a stylus pen).
  • the sound output device 155 may output a sound signal to the outside of the electronic device 101 .
  • the sound output device 155 may include, for example, a speaker or a receiver.
  • the speaker can be used for general purposes such as multimedia playback or recording playback, and the receiver can be used to receive incoming calls. According to an embodiment, the receiver may be implemented separately from or as part of the speaker.
  • the display device 160 may visually provide information to the outside (eg, a user) of the electronic device 101 .
  • the display device 160 may include, for example, a display, a hologram device, or a projector and a control circuit for controlling the corresponding device.
  • the display device 160 may include touch circuitry configured to sense a touch, or a sensor circuit (e.g., a pressure sensor) configured to measure the intensity of a force generated by a touch.
  • the audio module 170 may convert a sound into an electric signal or, conversely, convert an electric signal into a sound. According to an embodiment, the audio module 170 may acquire a sound through the input device 150, or may output a sound through the sound output device 155 or through an external electronic device (e.g., the electronic device 102, such as a speaker or headphones) connected directly or wirelessly with the electronic device 101.
  • the sensor module 176 may detect an operating state (e.g., power or temperature) of the electronic device 101 or an external environmental state (e.g., a user state), and may generate an electrical signal or data value corresponding to the sensed state.
  • the sensor module 176 may include, for example, a gesture sensor, a gyro sensor, a barometric pressure sensor, a magnetic sensor, an acceleration sensor, a grip sensor, a proximity sensor, a color sensor, an infrared (IR) sensor, a biometric sensor, a temperature sensor, a humidity sensor, or an illuminance sensor.
  • the interface 177 may support one or more designated protocols that may be used for the electronic device 101 to directly or wirelessly connect with an external electronic device (eg, the electronic device 102 ).
  • the interface 177 may include, for example, a high definition multimedia interface (HDMI), a universal serial bus (USB) interface, an SD card interface, or an audio interface.
  • the connection terminal 178 may include a connector through which the electronic device 101 can be physically connected to an external electronic device (eg, the electronic device 102 ).
  • the connection terminal 178 may include, for example, an HDMI connector, a USB connector, an SD card connector, or an audio connector (eg, a headphone connector).
  • the haptic module 179 may convert an electrical signal into a mechanical stimulus (eg, vibration or movement) or an electrical stimulus that the user can perceive through tactile or kinesthetic sense.
  • the haptic module 179 may include, for example, a motor, a piezoelectric element, or an electrical stimulation device.
  • the camera module 180 may capture still images and moving images. According to an embodiment, the camera module 180 may include one or more lenses, image sensors, image signal processors, or flashes.
  • the power management module 188 may manage power supplied to the electronic device 101 .
  • the power management module 188 may be implemented as, for example, at least a part of a power management integrated circuit (PMIC).
  • the battery 189 may supply power to at least one component of the electronic device 101 .
  • the battery 189 may include, for example, a non-rechargeable primary cell, a rechargeable secondary cell, or a fuel cell.
  • the communication module 190 may support establishment of a direct (e.g., wired) communication channel or a wireless communication channel between the electronic device 101 and an external electronic device (e.g., the electronic device 102, the electronic device 104, or the server 108), and communication through the established communication channel.
  • the communication module 190 may include one or more communication processors that operate independently of the processor 120 (eg, an application processor) and support direct (eg, wired) communication or wireless communication.
  • the communication module 190 may include a wireless communication module 192 (e.g., a cellular communication module, a short-range wireless communication module, or a global navigation satellite system (GNSS) communication module) or a wired communication module 194 (e.g., a local area network (LAN) communication module or a power line communication module).
  • a corresponding communication module may communicate with an external electronic device via the first network 198 (e.g., a short-range communication network such as Bluetooth, WiFi Direct, or infrared data association (IrDA)) or the second network 199 (e.g., a long-range communication network such as a cellular network, the Internet, or a computer network (e.g., a LAN or WAN)).
  • These various types of communication modules may be integrated into one component (eg, a single chip) or may be implemented as a plurality of components (eg, multiple chips) separate from each other.
  • the wireless communication module 192 may identify and authenticate the electronic device 101 within a communication network, such as the first network 198 or the second network 199, using subscriber information (e.g., an International Mobile Subscriber Identity (IMSI)) stored in the subscriber identification module 196.
  • the antenna module 197 may transmit or receive a signal or power to the outside (eg, an external electronic device).
  • the antenna module may include one antenna including a conductor formed on a substrate (eg, a PCB) or a radiator formed of a conductive pattern.
  • the antenna module 197 may include a plurality of antennas. In this case, at least one antenna suitable for a communication method used in a communication network, such as the first network 198 or the second network 199, may be selected from the plurality of antennas by, for example, the communication module 190. A signal or power may be transmitted or received between the communication module 190 and an external electronic device through the selected at least one antenna.
  • According to some embodiments, other components (e.g., an RFIC) other than the radiator may be additionally formed as a part of the antenna module 197.
  • At least some of the above components may be connected to each other through a communication method between peripheral devices (e.g., a bus, general purpose input and output (GPIO), serial peripheral interface (SPI), or mobile industry processor interface (MIPI)) and may exchange signals (e.g., commands or data) with each other.
  • the command or data may be transmitted or received between the electronic device 101 and the external electronic device 104 through the server 108 connected to the second network 199 .
  • Each of the electronic devices 102 and 104 may be a device of the same type as, or a different type from, the electronic device 101.
  • all or part of the operations performed by the electronic device 101 may be executed by one or more of the external electronic devices 102 , 104 , or 108 .
  • For example, instead of executing the function or service itself, or in addition thereto, the electronic device 101 may request one or more external electronic devices to perform at least a part of the function or service.
  • the one or more external electronic devices that have received the request may execute at least a part of the requested function or service, or an additional function or service related to the request, and transmit a result of the execution to the electronic device 101 .
  • the electronic device 101 may provide the result, as it is or after additional processing, as at least a part of a response to the request.
  • To this end, for example, cloud computing, distributed computing, or client-server computing technology may be used.
  • FIG. 2 is a block diagram of an electronic device capable of controlling a volume using a touch screen, according to an embodiment.
  • the electronic device 101 may include a touch screen 210 , an audio output unit 220 , a sensor 230 , and a processor 240 .
  • the touch screen 210 may receive a control signal or a graphic signal under the control of the processor 240 to display a corresponding screen.
  • the touch screen 210 may display a lock screen, a home screen, content received from an external device through a communication module (e.g., the communication module 190), or content stored in a memory (e.g., the memory 130).
  • the content may mean a single file (or a set of files) that can be loaded by the processor 240 and displayed on the touch screen 210 or transmitted to the outside through a communication module, and there are no restrictions on its type or format.
  • the type of content may include at least one of a text document, an application, a photo, a picture, and a video, but is not limited thereto.
  • the touch screen 210 may detect a touch input sensed by the sensing panel.
  • the touch screen 210 may receive a user input and transmit it to the processor 240 .
  • the touch screen 210 may include at least one of a touch panel and a digitizer.
  • the touch screen 210 may sense a user's touch input.
  • the touch screen 210 may be implemented in various types such as a capacitive type, a pressure sensitive type, a piezoelectric type, and the like.
  • the capacitive type calculates touch coordinates by sensing the micro-current excited by the user's body when a part of the user's body touches the surface of the touch screen 210, using a dielectric coated on the surface of the touch panel.
  • the pressure-sensitive type includes two electrode plates embedded in the touch panel of the touch screen 210; when the user touches the screen, touch coordinates are calculated by sensing that the upper and lower plates at the touched point come into contact and current flows.
  • the touch screen 210 may detect a touch (or hovering) input by a part of the user's body (e.g., a finger) and transmit touch information (e.g., at least one of a touch position and a touch type) to the processor 240.
  • the touch panel of the touch screen 210 in the present disclosure may include a touch screen panel including a plurality of electrodes and a touch screen panel IC (TSP IC).
  • the touch screen 210 may detect a touch (or hovering) input by at least one of an active electrostatic solution (AES) type pen or an electro capacitive resonance (ECR) type pen.
  • the digitizer may include a sensor substrate (eg, an EMR sensing panel) formed of an array of loop coils to operate in an EMR (ElectroMagnetic Resonance) sensing method.
  • the digitizer may include an EMR sensing panel and a controller for processing an electrical signal output from the EMR sensing panel.
  • the window may be formed of, for example, tempered glass forming an outer layer of the sensing panel.
  • the digitizer mounts a plurality of coils on the sensor substrate and senses the position of the pen by detecting the electromagnetic change generated when the resonance circuit included in the pen approaches.
  • the EMR sensing panel does not necessarily need to be disposed on the upper part of the touch screen 210 , and may be mounted on the lower part of the touch screen 210 .
  • the EMR sensing panel and the capacitive touch screen 210 capable of detecting a user's finger contact may be provided in the form of a multi-input device that enables simultaneous use.
  • a display is installed on the upper layer of the EMR sensing panel, a touch panel may be installed on the upper layer of the display, and the input point of the finger can be measured by measuring the change in capacitance at the point where the user's finger touches.
  • the processor 240 may activate the volume control function.
  • the edge area may mean a partial area including a boundary of the touch screen 210 that can be viewed by a user.
  • the boundary of the touch screen 210 may mean a portion of the touch screen 210 that meets the outer frame of the electronic device 101 .
  • when the electronic device 101 is capable of folding the touch screen 210 and has an out-fold shape in which both outer surfaces are the touch screen 210, the boundary of the touch screen 210 may include areas in three directions (e.g., the upper side, one of the left and right sides, and the lower side) that meet the external frame, and an area in which the touch screen 210 is folded (e.g., the other of the left and right sides).
  • one region of the edge region of the touch screen 210 may be a partial region including the boundary of the touch screen 210 in four directions (e.g., upper, left, right, and lower), or may be a partial region including the left and right boundaries of the touch screen 210.
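The two variants of the edge region just described (all four boundaries vs. only the left/right boundaries) amount to a hit test on touch coordinates. A minimal sketch, with the band width and all names assumed for illustration:

```python
# Hypothetical sketch: does a touch point fall inside the edge region?
# The edge band width (32 px) and the `sides` values are assumptions.

def in_edge_region(x: int, y: int, width: int, height: int,
                   edge: int = 32, sides: str = "all") -> bool:
    """Return True if (x, y) lies in the edge region of a width x height screen.

    sides="all"        -> a band along all four boundaries (upper/left/right/lower)
    sides="left_right" -> bands along the left and right boundaries only
    """
    near_lr = x < edge or x >= width - edge    # near left or right boundary
    near_tb = y < edge or y >= height - edge   # near upper or lower boundary
    return near_lr if sides == "left_right" else (near_lr or near_tb)
```

On a 1080 x 2400 screen, a touch at (10, 500) is in the edge region in either mode, while a touch near the top center counts only when all four sides are included.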
  • An embodiment in which a volume control function is activated when a touch input is sensed in at least a portion of an edge region of the touch screen 210 will be described below with reference to FIG. 4.
  • the processor 240 may display an object (eg, a visual cue) related to a volume control function on one region of an edge region of the touch screen 210 . According to an embodiment, when a preset touch input for changing the volume is detected in the area where the object is displayed, the processor 240 may activate the volume control function.
  • the object related to the volume control function may be movable within the edge area of the touch screen 210 . This will be described below with reference to FIG. 8 .
  • According to an embodiment, an object related to the volume control function is displayed on one area of the edge area of the touch screen 210, and when a preset touch input is detected in the area where the object is displayed, the processor 240 activates the volume control function; thus, even when the user cannot see the touch screen 210 of the electronic device 101, the volume control function may be activated by touching the area where the object is displayed.
  • the preset touch input for activating the volume control function may be touching (eg, touching and holding, long press) an area of the edge area of the touch screen 210 for more than a preset time.
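The touch-and-hold activation check described above can be sketched as follows. This is a hypothetical Python illustration; the threshold value and names are assumptions, not part of the disclosure.

```python
# Hypothetical sketch of the "touch and hold for more than a preset time"
# activation check. Timestamps are in seconds.

HOLD_THRESHOLD_S = 0.5  # assumed preset time for a long press

def is_activation_hold(touch_down_t, touch_up_t, moved):
    """A touch activates the volume control function if it is held in
    place (no significant movement) for at least the preset time."""
    return (not moved) and (touch_up_t - touch_down_t) >= HOLD_THRESHOLD_S

print(is_activation_hold(10.0, 10.7, moved=False))  # True: held 0.7 s
print(is_activation_hold(10.0, 10.2, moved=False))  # False: too short
print(is_activation_hold(10.0, 10.7, moved=True))   # False: it was a drag
```

Other activation gestures mentioned below (continuous double touch, multi-touch) would be separate predicates dispatched from the same touch-event stream.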
  • the present invention is not limited thereto, and various types of touch input may be set in order to activate the volume control function (eg, continuous double touch, multi-touch that touches two or more points at the same time, etc.).
  • various functions may be performed according to the type of touch input applied to the area where the object is displayed. For example, when a touch and hold, which is the preset touch input for activating the volume control function, is input to the area where the object is displayed, the processor 240 activates the volume control function, and when the area where the object is displayed is touched and then swiped, a UI for performing a function other than the volume control function may be displayed. For example, when a touch input of touching the area on which the object is displayed and then swiping is detected, the processor 240 may display a tray on which icons for executing other applications are displayed, display a UI for adjusting the brightness of the touch screen 210, or display a UI capable of controlling various volumes (ring tone volume, alarm volume, system volume, etc.) of the electronic device 101.
  • An embodiment of performing different functions according to the type of touch input as described above will be described with reference to FIG. 9 .
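The mapping of touch-input types to functions described above can be sketched as a dispatch table. This is an illustrative Python sketch; the gesture names and handler names are assumptions, not part of the disclosure.

```python
# Hypothetical dispatch table mapping the type of touch input detected on
# the displayed object to a function, as in the FIG. 9 description.

def handle_object_gesture(gesture):
    handlers = {
        "touch_and_hold": "activate_volume_control",  # preset activation input
        "swipe_inward": "show_app_tray",              # tray of app icons
        "double_touch": "show_brightness_ui",         # brightness adjustment
        "multi_touch": "show_multi_volume_ui",        # ring/alarm/system volumes
    }
    return handlers.get(gesture, "ignore")

print(handle_object_gesture("touch_and_hold"))  # activate_volume_control
print(handle_object_gesture("swipe_inward"))    # show_app_tray
print(handle_object_gesture("tap"))             # ignore
```

The table form makes the point of the embodiment concrete: many functions are reachable from one object without displaying a menu per function.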
  • the processor 240 may provide feedback indicating that the volume control function is activated. For example, when the preset touch input for activating the volume control function is detected, the processor 240 may display the volume control UI on the touch screen 210. In another embodiment, as feedback notifying that the volume control function is activated, the processor 240 may provide vibration feedback or sound feedback, display the content shown on the touch screen 210 so that it shakes for a preset time, or brighten the touch screen 210 for a preset time.
  • the processor 240 may change the volume based on the direction and distance of a drag input. For example, after the volume control function is activated and feedback indicating that the volume control function is activated is provided, if a drag input is detected along the edge area of the touch screen 210, the processor 240 may change the volume based on the direction and distance of the drag input. According to an embodiment, the drag input may be detected continuously with the preset touch input for activating the volume control function, or the touch input may be released after the preset touch input and a new drag input may then be detected.
  • the processor 240 may change the volume based on the direction and distance of the drag input and change the volume control UI accordingly. An embodiment of changing the volume control UI based on a drag input will be described below with reference to FIG. 7.
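The drag-to-volume mapping described above can be sketched as follows. This is a hypothetical Python illustration; the maximum volume, the pixels-per-step scale, and the function name are assumptions, not part of the disclosure.

```python
# Hypothetical sketch of mapping a drag along the edge region to a volume
# change: upward drags raise the volume, downward drags lower it, and the
# result is clamped to the valid range.

MAX_VOLUME = 15
PIXELS_PER_STEP = 60  # assumed drag distance per volume step

def apply_drag(volume, start_y, end_y):
    """Screen y grows downward, so a smaller end_y means an upward drag."""
    steps = (start_y - end_y) // PIXELS_PER_STEP
    return max(0, min(MAX_VOLUME, volume + steps))

print(apply_drag(7, start_y=1200, end_y=1020))   # 10: 180 px up = +3 steps
print(apply_drag(7, start_y=1200, end_y=1320))   # 5: 120 px down = -2 steps
print(apply_drag(14, start_y=1200, end_y=600))   # 15: clamped at maximum
```

The same function serves the FIG. 7 behavior of moving the UI element corresponding to the current volume state: the returned value drives both the audio level and the UI position.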
  • when it is confirmed that the user cannot see the touch screen 210, the processor 240 may display an enlarged volume control UI on the touch screen 210.
  • An embodiment of displaying a volume control UI having a different size according to a user situation will be described below with reference to FIG. 10 .
  • the processor 240 may be operatively connected to the touch screen 210 , the audio output unit 220 , and the sensor 230 to control overall operations and functions of the electronic device 101 .
  • the processor 240 may control the touch screen 210 to display content.
  • the content displayed on the touch screen 210 may be selected by the user.
  • displaying the content may mean that the electronic device 101 executes the content using an application corresponding to the content and displays the execution screen of the executed application (eg, an application execution screen corresponding to the content).
  • the audio output unit 220 may output a sound signal to the outside of the electronic device 101 .
  • the audio output unit 220 may be a speaker, an earphone jack, or a communication module (eg, the communication module 190 ).
  • when the audio signal is output through the communication module, the sound may be output through a wireless earphone or an external speaker wirelessly connected to the electronic device 101.
  • the audio output unit 220 may be used for general audio signal reproduction, such as multimedia reproduction or recording reproduction, or may be used for outputting a received voice signal during a call.
  • the processor 240 may output an audio signal of a volume changed based on the direction and distance of the drag input through the audio output unit 220 .
  • the sensor 230 may detect an external environment state (eg, ambient light, movement of the electronic device) and generate data corresponding to the sensed state.
  • the sensor 230 may include an illuminance sensor, a proximity sensor, a grip sensor, an acceleration sensor, or a gyro sensor.
  • the processor 240 may determine whether the user can view the screen of the electronic device 101 based on the data received from the sensor 230. For example, the processor 240 may determine, based on the sensing value of the illuminance sensor, the acceleration sensor, or the gyro sensor, that the electronic device 101 is in a pocket and thus that the user cannot see the screen of the electronic device 101. In another embodiment, when the electronic device 101 is currently transmitting and receiving a call signal and the sensing value of the proximity sensor indicates that the user is holding the electronic device 101 to the ear, the processor 240 may confirm that the user cannot see the screen of the electronic device 101.
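The sensor-based determination described above can be sketched as a simple heuristic. This is an illustrative Python sketch; the lux threshold and all parameter names are assumptions, not part of the disclosure.

```python
# Hypothetical heuristic combining sensor readings to decide whether the
# user can currently see the screen.

def user_can_see_screen(lux, in_call, proximity_near, in_pocket_motion):
    # Very dark plus a pocket-like motion pattern: device likely in a pocket.
    if lux < 5 and in_pocket_motion:
        return False
    # Call in progress with something near the proximity sensor:
    # device likely held to the ear.
    if in_call and proximity_near:
        return False
    return True

print(user_can_see_screen(lux=300, in_call=False,
                          proximity_near=False, in_pocket_motion=False))  # True
print(user_can_see_screen(lux=1, in_call=False,
                          proximity_near=False, in_pocket_motion=True))   # False
print(user_can_see_screen(lux=300, in_call=True,
                          proximity_near=True, in_pocket_motion=False))   # False
```

The result of this check gates the later behavior: display the activation object when it returns True, or arm the whole edge region when it returns False.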
  • by activating the volume control function only through a preset touch input for activating the volume control function, it is possible to prevent malfunction of the volume control due to an unintentional touch input.
  • FIG. 3 is a flowchart illustrating an operation of controlling a volume using a touch screen of an electronic device, according to an exemplary embodiment.
  • when the electronic device detects a preset touch input for activating the volume control function in one of the edge regions of the touch screen, it can provide feedback indicating that the volume control function is activated.
  • an object 420 related to a volume control function may be displayed on one of the edge regions of the touch screen.
  • the electronic device displays the content screen 410 on the touch screen (eg, the touch screen 210 ), and displays the object 420 related to the activation of the volume control function in one area of the edge area of the touch screen.
  • the content screen 410 may be an execution screen, a home screen, or a lock screen of an application selected by the user.
  • the object 420 related to the volume control function may be displayed only when an audio signal is being output, or may be displayed even when the audio signal is not output.
  • the execution screen of the application selected by the user may be a screen of an application related to the audio signal being output or a screen of an application not related to the audio signal being output.
  • when a preset touch input for activating the volume control function is detected in at least some of the edge regions 520 of the touch screen, the electronic device may activate the volume control function.
  • the electronic device displays the content screen 510 on the touch screen, and when a preset touch input for activating the volume control function is detected in one area of the edge area 520 of the touch screen, the volume control function is performed. can be activated.
  • the content screen 510 may be an execution screen, a home screen, or a lock screen of an application selected by the user.
  • the electronic device may activate only the edge area 520 of the touch panel of the touch screen. Accordingly, even when the screen of the touch screen is turned off (eg, when the electronic device is in a pocket), touch input detection is enabled only in the edge area 520 of the touch screen, so an effect of increased power efficiency while using the volume control function can be expected.
  • the execution screen of the application selected by the user may be a screen of an application related to the audio signal being output or a screen of an application not related to the audio signal being output.
  • the preset touch input for activating the volume control function is illustrated as being detected through the edge area 520 in four directions (eg, upper, left, right, lower) of the touch screen, but the present invention is not limited thereto, and the preset touch input for activating the volume control function may be sensed through edge regions in two directions (eg, left and right) of the touch screen.
  • the edge area 520 of the touch screen is shown for convenience of explanation, but this is only to delimit the area; in actual implementation, the edge area 520 may not be displayed on the touch screen of the electronic device.
  • the electronic device may provide feedback indicating that the volume control function is activated when a preset touch input is sensed in one of the edge regions of the touch screen shown in FIG. 4 or FIG. 5 .
  • the electronic device may display the volume control UI on the touch screen as feedback indicating that the volume control function is activated.
  • when a preset touch input 620 for activating the volume control function is detected while the electronic device is displaying the content screen 610 on the touch screen, the volume control function is activated and the volume control UI 630 may be displayed on the touch screen.
  • the volume control UI 630 may include a UI 631 corresponding to the current volume state.
  • the electronic device may also display a UI 632 for moving the volume control UI 630 together.
  • for convenience, only the case in which an audio signal is being output and an application screen related to the output audio signal is displayed is illustrated. However, an application screen, a home screen, or a lock screen unrelated to the output audio signal may be displayed, and even when an audio signal is not being output, when a preset touch input is detected, the volume control function is activated and the volume control UI may be displayed as feedback informing of this.
  • the electronic device may change the volume based on the direction and distance of the drag input.
  • the electronic device may increase the volume by an amount corresponding to the distance of the drag input.
  • the electronic device may move the UI 731-1 corresponding to the current volume state upward to correspond to the increased volume.
  • the electronic device may decrease the volume by an amount corresponding to the distance of the drag input. According to an embodiment, the electronic device may move the UI 731-2 corresponding to the current volume state downward to correspond to the decreased volume.
  • for convenience, only the case in which an audio signal is being output and an application screen related to the output audio signal is displayed is illustrated. However, an application screen, a home screen, or a lock screen unrelated to the output audio signal may be displayed, and even when an audio signal is not being output, when a preset touch input is detected, the volume control function is activated and the volume control UI may be displayed as feedback informing of this.
  • when a preset touch input for activating the volume control function is detected in one of the edge regions of the touch screen, the electronic device may provide at least one of vibration feedback and sound feedback, instead of the volume control UI, as feedback indicating that the volume control function is activated. According to various embodiments, the electronic device may provide at least one of vibration feedback and sound feedback together with the volume control UI.
  • in a state in which an audio signal is being output, the electronic device may adjust the volume of the audio signal being output based on a drag input and output it.
  • when the electronic device is not outputting audio, the electronic device may provide sound feedback at the adjusted volume according to the volume control, or may not provide feedback according to the volume control.
  • the sound feedback adjusted to the volume corresponding to the drag input may be provided at a preset period while the drag input is moving, or the sound feedback adjusted to the volume may be provided when the drag input ends.
  • the electronic device may remove the volume control UI 630 if neither a touch input nor a drag input is detected for a preset time after the volume control UI 630 is displayed.
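The inactivity-based dismissal described above can be sketched as a simple timeout check. This is an illustrative Python sketch; the timeout value and names are assumptions, not part of the disclosure.

```python
# Hypothetical sketch of dismissing the volume control UI when no touch or
# drag input arrives for a preset time.

DISMISS_TIMEOUT_S = 3.0  # assumed preset time

def should_dismiss(last_input_t, now_t):
    """True once the preset time has elapsed since the last touch/drag."""
    return (now_t - last_input_t) >= DISMISS_TIMEOUT_S

print(should_dismiss(last_input_t=100.0, now_t=102.0))  # False: still active
print(should_dismiss(last_input_t=100.0, now_t=103.5))  # True: remove the UI
```

In practice `last_input_t` would be refreshed on every touch or drag event while the volume control UI is displayed.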
  • FIG. 8 is a diagram for explaining an operation of moving a location of a volume control UI, according to an exemplary embodiment.
  • the electronic device may display a volume control UI informing that the volume control function is activated.
  • the volume control UI may also display a UI 832 for moving the location of the volume control UI.
  • when the electronic device detects a touch input in the area where the UI 832 for moving the location of the volume control UI is displayed, followed by a drag input 833, the electronic device may move the position of the volume control UI 810 based on the drag input 833, as shown in FIG. 8(b). For example, the electronic device may move the location of the volume control UI 810 to the area of the edge region closest to the location where the drag input 833 ends.
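The "closest edge region" selection described above can be sketched as follows. This is a hypothetical Python illustration; the screen dimensions and edge names are assumptions, not part of the disclosure.

```python
# Hypothetical sketch of moving the volume control UI to the edge region
# closest to where the relocation drag ends, as in FIG. 8.

def nearest_edge(x, y, width, height):
    """Return the edge ('left', 'right', 'top', 'bottom') with the
    smallest distance from the drag end point (x, y)."""
    distances = {
        "left": x,
        "right": width - x,
        "top": y,
        "bottom": height - y,
    }
    return min(distances, key=distances.get)

print(nearest_edge(1050, 1200, 1080, 2340))  # right: drag ended near the right boundary
print(nearest_edge(20, 1200, 1080, 2340))    # left
```

The UI would then be anchored to the returned edge, matching FIG. 8(a)'s left-to-right relocation example.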
  • after the electronic device moves and displays the location of the volume control UI 810, when a drag input 820 is detected along the edge area, the electronic device may control the volume based on the direction and distance of the drag input 820.
  • FIG. 8(a) shows an embodiment in which the volume control UI moves from the left edge area to the right edge area, but the present invention is not limited thereto. When a touch input is applied to the area of the UI 832 for moving the location of the volume control UI, followed by a drag input, the electronic device may also move the volume control UI in the vertical direction along the edge area where the volume control UI is displayed. According to an embodiment, the UI 832 for moving the volume control UI may not be displayed, and the electronic device may move the volume control UI based on a drag input applied after the volume control UI is touched.
  • it is possible to adjust the position of the volume control UI according to each user's usage of the electronic device, and since volume control becomes more convenient regardless of whether the user uses the right hand or the left hand, or of the size of the user's hand, the user's convenience may be increased.
  • FIG. 9 is a diagram for explaining an operation of performing a different function according to a type of a touch input, according to an exemplary embodiment.
  • referring to FIG. 9(a), the electronic device (eg, the electronic device 101) may display an object 910 related to the volume control function on one area of the edge area of the touch screen.
  • when a touch input of a different type from the preset touch input for activating the volume control function is detected on the object 910 displayed on the touch screen, the electronic device may perform a function corresponding to that different type of touch input. For example, when a touch input of touching the object 910 displayed on the edge area of the touch screen and then swiping 911 in the direction opposite to the edge area is sensed, the electronic device may perform a function corresponding to the swiping touch input 911, as shown in FIG. 9(b).
  • the function corresponding to a touch input of a different type from the preset touch input for activating the volume control function may be displaying the tray 920 on which icons for executing other applications are displayed, displaying a UI for adjusting the brightness of the touch screen, or displaying a UI capable of controlling various volumes (ringer volume, alarm volume, system volume, etc.) of the electronic device.
  • by mapping different functions to each of a plurality of touch inputs applied to the object displayed on the touch screen, there is an effect that many functions can be easily implemented without displaying a menu corresponding to each of the plurality of functions.
  • FIG. 10 is a diagram for describing a volume control UI that changes according to a user's situation, according to an exemplary embodiment.
  • the electronic device checks whether the user can see the touch screen.
  • the electronic device may determine, based on whether a call signal is being transmitted and received and on the sensing value of a sensor (eg, a proximity sensor or an illuminance sensor) of the electronic device, that the user is holding the electronic device to his or her ear for a call and therefore cannot see the touch screen.
  • when it is confirmed that the user cannot see the touch screen, the electronic device activates the volume control function and may display the enlarged volume control UI 1010 on the touch screen as feedback informing of this.
  • after displaying the enlarged volume control UI 1010, when a drag input is detected not only on the edge area of the touch screen but also on the enlarged volume control UI 1010, the electronic device may change the volume based on the direction and distance of the drag input.
  • according to another embodiment, when the electronic device determines that the user cannot see the touch screen, the electronic device may not display the enlarged volume control UI 1010 on the touch screen, and may instead activate the touch panel of the touch screen so that a touch input is allowed only in an area corresponding to the enlarged volume control UI 1010.
  • accordingly, the area capable of detecting a drag input for changing the volume is expanded, so that the volume can be changed more easily.
  • FIG. 11 is a diagram for explaining a volume control operation that varies according to a user's situation, according to an exemplary embodiment.
  • the electronic device may determine whether a preset condition is satisfied in operation 1110 .
  • the electronic device may determine whether a preset condition is satisfied based on an operation state of the electronic device and data sensed from a sensor.
  • satisfying the preset condition may mean that the user can see the touch screen of the electronic device. For example, if it is determined, based on a sensing value of an illuminance sensor, an acceleration sensor, or a gyro sensor of the electronic device, that the electronic device is in a pocket, the electronic device may determine that the user cannot see the touch screen. As another embodiment, when the electronic device is transmitting and receiving a call signal and it is confirmed, based on the sensing value of the proximity sensor, that the user is holding the electronic device to his or her ear, the electronic device may confirm that this is a situation in which the user cannot see the touch screen.
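The branch in FIG. 11 can be sketched as follows: when the preset condition is satisfied (the user can see the screen), display the activation object; otherwise skip the object and arm the whole edge region. This is an illustrative Python sketch; the state names are assumptions, not part of the disclosure.

```python
# Hypothetical sketch of the FIG. 11 branch between the two activation paths.

def configure_volume_activation(user_can_see_screen):
    if user_can_see_screen:
        # Preset condition satisfied: show the object, accept the preset
        # touch input only on the object area, feed back via the volume UI.
        return {"show_object": True, "active_area": "object_region",
                "feedback": "volume_control_ui"}
    # Condition not satisfied: no object; accept the preset touch input
    # anywhere in the edge region and feed back via vibration.
    return {"show_object": False, "active_area": "entire_edge_region",
            "feedback": "vibration"}

print(configure_volume_activation(True))
print(configure_volume_activation(False))
```

Either path then funnels into the same drag-based volume change described in operations that follow.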
  • the electronic device may display an object in one region of an edge region of the touch screen.
  • the electronic device may display an object related to a volume control function in one area of an edge area of the touch screen.
  • a preset touch input to the object area may be detected.
  • the electronic device may detect a preset touch input for activating a volume control function on an area in which an object is displayed among edge areas of the touch screen.
  • the electronic device may provide volume control UI feedback indicating that the volume control function is activated. For example, when the preset touch input for activating the volume control function is detected, the electronic device may activate the volume control function and display the volume control UI as feedback informing of this. According to an embodiment, the electronic device may provide at least one of vibration feedback and sound feedback along with the volume control UI display.
  • the electronic device may change the volume based on the direction and distance of the drag input. For example, if the position where the touch input is sensed moves upward along the edge area of the touch screen after the volume control UI is displayed, the electronic device may increase the volume by an amount corresponding to the distance of the drag input. According to another embodiment, after the volume control UI is displayed, if the position where the touch input is sensed moves downward along the edge area of the touch screen, the electronic device may decrease the volume by an amount corresponding to the distance of the drag input. . According to an embodiment, the electronic device may change and display the volume control UI by reflecting the changed state of the volume.
  • the electronic device may detect a preset touch input in at least some of the edge regions of the touch screen. For example, if it is determined that the user cannot see the touch screen, the electronic device does not display the object in the edge area of the touch screen, and may activate the volume control function when the preset touch input for activating the volume control function is detected in at least some of the edge areas of the touch screen.
  • the electronic device may provide a vibration feedback indicating that the volume control function is activated.
  • the electronic device may provide at least one of vibration feedback and sound feedback, and may not provide feedback according to an embodiment.
  • the electronic device may change the volume based on the direction and distance of the drag input. For example, if the position where the touch input is sensed moves upward along the edge area of the touch screen after the volume control UI is displayed, the electronic device may increase the volume by an amount corresponding to the distance of the drag input. According to another embodiment, after the volume control UI is displayed, if the position where the touch input is sensed moves downward along the edge area of the touch screen, the electronic device may decrease the volume by an amount corresponding to the distance of the drag input. . According to an embodiment, the electronic device may change and display the volume control UI by reflecting the changed state of the volume.
  • in a situation in which the user can see the touch screen, the object is displayed, and the electronic device may ignore the preset touch input for activating the volume control function even if it is detected in parts of the edge regions other than the object area; this prevents a collision with other touch input functions. In a situation in which the user cannot see the touch screen, the volume control function may be activated through at least some of the edge regions of the touch screen without displaying the object.
  • FIG. 12 is a view for explaining an embodiment to which the present disclosure is applied when an electronic device is in an out-fold form.
  • the electronic device (eg, the electronic device 101) may have an out-fold shape in which the touch screen 1210 is folded to form both outer surfaces.
  • the edge area of the touch screen 1210 may include the boundary regions in three directions of the touch screen 1210 (eg, the upper side, one of the left and right sides, and the lower side) that meet the external frame of the electronic device, and the region 1211 in which the touch screen 1210 is folded (eg, the other of the left and right sides).
  • a preset touch input for activating the volume control function may be detected not only in the boundary region of the touch screen 1210 meeting the external frame of the electronic device, but also in one region of the region 1211 in which the touch screen 1210 is folded, so that the volume control function can be activated. Alternatively, an object related to the volume control function may be displayed on one of the regions 1211 in which the touch screen is folded, and the volume control function may be activated when a preset touch input for activating the volume control function is detected on the object.
  • the electronic device may change the volume based on the direction and distance of the drag input.
  • FIG. 13 is a diagram for describing an embodiment to which the present disclosure is applied when an electronic device is a smart watch.
  • an electronic device (eg, the electronic device 101 ) may be a smart watch including a touch screen 1310 .
  • the edge region of the touch screen 1310 may be a region that meets the external frame of the electronic device.
  • the touch screen 1310 is illustrated as having a circular shape, but the present invention is not limited thereto and may have a polygonal shape.
  • when the electronic device is a smart watch, the electronic device may control the volume of an audio signal output from itself, or may work with an external electronic device (eg, a smartphone or a tablet PC) to control the volume of an audio signal output from the external electronic device.
  • when a preset touch input for activating the volume control function is detected in one of the edge regions of the touch screen 1310, the electronic device may activate the volume control function. Alternatively, an object related to the volume control function may be displayed on one of the edge regions, and the volume control function may be activated when a preset touch input for activating the volume control function is detected on the object.
  • the electronic device may change the volume based on the direction and distance of the drag input.
  • the electronic device 101 includes a touch screen 210 and a processor 240 operatively connected to the touch screen 210, and when a preset touch input for activating the volume control function is detected in one area of the edge area of the touch screen, the processor provides feedback indicating that the volume control function is activated, and when a drag input is detected along the edge area after the feedback is provided, the processor may change the volume based on the direction and distance of the drag input.
  • the processor 240 controls the touch screen 210 to display an object for activating the volume control function on the one area of the edge area, and when the preset touch input is detected on the object, may provide feedback indicating that the volume control function is activated.
  • the object may be movable within the edge area after the volume control function is activated.
  • when a first touch input for activating the volume control function is detected on the object, the processor 240 provides a volume control UI (user interface) for volume control as the feedback, and when a second touch input different from the first touch input is detected on the object, the processor 240 may display a UI for a function different from the volume control function.
  • when it is confirmed that the user cannot see the touch screen, the processor 240 does not display the object on the touch screen 210, and when the preset touch input is detected in at least a part of the edge area, the volume control function may be activated and at least one of vibration and sound may be provided as the feedback.
  • the processor 240 provides a volume control user interface (UI) for volume control as the feedback, and when a drag input is detected along the edge region after the feedback is provided, the processor 240 may change the volume based on the direction and distance of the drag input and change the volume control UI based on the changed volume.
  • according to an embodiment, when it is confirmed that the user cannot see the touch screen, an enlarged volume control UI is displayed on the touch screen 210, and when a drag input is detected on the enlarged volume control UI, the volume may be changed based on the direction and distance of the drag input.
  • the electronic device further includes an audio output unit 220, and the processor 240 outputs an audio signal through the audio output unit 220; when the preset touch input and the drag input are sensed while the audio signal is being output, the processor 240 may output, through the audio output unit 220, an audio signal of a volume changed based on the direction and distance of the drag input.
  • according to an embodiment, when the touch screen 210 is foldable into an out-fold shape, the processor 240 may activate the volume control function when the preset touch input is detected in one area of the folded region of the touch screen 210.
  • a method of controlling the electronic device 101 may include detecting a preset touch input for activating the volume control function in one area of the edge area of the touch screen 210 of the electronic device 101, providing feedback indicating that the volume control function is activated, and, when a drag input is sensed along the edge area after the feedback is provided, changing the volume based on the direction and distance of the drag input.
  • the method may further include displaying an object for activating the volume control function on the one area of the edge area, and providing the feedback may include detecting the preset touch input to the object. In this case, feedback indicating that the volume control function is activated may be provided.
  • the object may be movable within the edge area after the volume control function is activated.
  • the providing of the feedback may include, when a first touch input for activating the volume control function is sensed on the object, providing a volume control user interface (UI) for volume control as the feedback, and when a second touch input different from the first touch input is detected on the object, displaying a UI for a function different from the volume control function.
  • the displaying of the object may include, when it is confirmed that the user cannot see the touch screen 210, not displaying the object on the touch screen 210, and the providing of the feedback may include activating the volume control function and providing at least one of vibration and sound as the feedback when the preset touch input is sensed in at least a portion of the edge region.
  • the providing of the feedback may include, when the preset touch input for activating the volume control function is detected in the one area among the edge areas, providing a volume control user interface (UI) for volume control as the feedback, and the changing of the volume may further include, when a drag input is detected along the edge region after the feedback is provided, changing the volume based on the direction and distance of the drag input and changing the volume control UI based on the changed volume.
  • when it is confirmed that the user cannot see the touch screen 210, the method further includes displaying an enlarged volume control UI on the touch screen 210, and in the changing of the volume, when a drag input is detected on the enlarged volume control UI, the volume may be changed based on the direction and distance of the drag input.
  • the method may further include outputting an audio signal through the audio output unit 220 of the electronic device 101 and, when the preset touch input and the drag input are detected while the audio signal is being output, outputting, through the audio output unit 220, an audio signal of a volume changed based on the direction and distance of the drag input.
  • the providing of the feedback may include displaying the volume control UI on the touch screen 210.
  • the electronic device 101 may have various types of devices.
  • the electronic device may include, for example, a portable communication device (eg, a smartphone, a tablet PC, or an e-book reader), a computer device, a portable multimedia device, a portable medical device, a camera, a wearable device, or a home appliance.
  • the electronic device according to an embodiment of the present disclosure is not limited to the above-described devices.
  • terms such as “first” or “second” may simply be used to distinguish a component from other components in question and do not limit the components in other aspects (eg, importance or order). When one (eg, first) component is referred to as being “coupled” or “connected” to another (eg, second) component, with or without the terms “functionally” or “communicatively”, it means that the one component can be connected to the other component directly (eg, by wire), wirelessly, or through a third component.
  • the term “module” used in the present disclosure may include a unit implemented in hardware, software, or firmware, and may be used interchangeably with terms such as, for example, logic, logic block, component, or circuit.
  • a module may be an integrally formed part, or a minimum unit or a portion thereof, that performs one or more functions.
  • the module may be implemented in the form of an application-specific integrated circuit (ASIC).
  • various embodiments may be implemented as software (eg, the program 140) including one or more instructions stored in a storage medium readable by the device (eg, the electronic device 101).
  • the processor (eg, the processor 120) of the device (eg, the electronic device 101) may call at least one of the one or more stored instructions from the storage medium and execute it.
  • the one or more instructions may include code generated by a compiler or code executable by an interpreter.
  • the device-readable storage medium may be provided in the form of a non-transitory storage medium.
  • 'non-transitory' only means that the storage medium is a tangible device and does not contain a signal (eg, an electromagnetic wave); this term does not distinguish between cases where data is semi-permanently stored in the storage medium and cases where it is temporarily stored.
  • the method according to an embodiment disclosed in this document may be provided by being included in a computer program product.
  • Computer program products may be traded between sellers and buyers as commodities.
  • the computer program product may be distributed in the form of a device-readable storage medium (eg, a compact disc read only memory (CD-ROM)), or distributed (eg, downloaded or uploaded) online, either through an application store (eg, Play StoreTM) or directly between two user devices (eg, smartphones).
  • in the case of online distribution, at least a part of the computer program product may be temporarily stored or temporarily created in a machine-readable storage medium such as a memory of a manufacturer's server, an application store's server, or a relay server.
  • each component (eg, a module or a program) of the above-described components may include a singular entity or a plurality of entities.
  • one or more components or operations among the above-described corresponding components may be omitted, or one or more other components or operations may be added.
  • a plurality of components (eg, modules or programs) may be integrated into a single component, and the integrated component may perform one or more functions of each of the plurality of components identically or similarly to those performed by the corresponding component among the plurality of components prior to the integration.
  • operations performed by a module, program, or other component may be executed sequentially, in parallel, repeatedly, or heuristically; one or more of the operations may be executed in a different order or omitted, or one or more other operations may be added.
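
The edge-gesture volume control described above can be sketched as a small state machine: a preset touch in the edge region activates the function and triggers feedback, and a subsequent drag along the edge changes the volume by direction and distance. The class name, screen dimensions, and thresholds below are illustrative assumptions for the example; the patent does not specify concrete values.

```python
# Illustrative sketch of the edge-gesture volume control described above.
# EdgeVolumeController, EDGE_WIDTH, and VOLUME_PER_PX are assumed names and
# values, not from the patent.

class EdgeVolumeController:
    EDGE_WIDTH = 20          # px: width of the touch-screen edge region
    VOLUME_PER_PX = 0.5      # volume change per pixel of drag distance

    def __init__(self, width=1080, height=2280, volume=50):
        self.width, self.height = width, height
        self.volume = volume         # current volume, 0..100
        self.active = False          # volume control function activated?
        self.anchor_y = None         # y position where the drag started

    def in_edge_region(self, x, y):
        """True if (x, y) lies in the left or right edge region."""
        return x < self.EDGE_WIDTH or x > self.width - self.EDGE_WIDTH

    def on_preset_touch(self, x, y):
        """Preset touch input (e.g., a long press) detected at (x, y)."""
        if self.in_edge_region(x, y):
            self.active = True
            self.anchor_y = y
            return "feedback"        # e.g., vibration, sound, or a volume UI
        return None                  # touch outside the edge: no activation

    def on_drag(self, x, y):
        """Drag along the edge: change volume by direction and distance."""
        if not self.active or not self.in_edge_region(x, y):
            return self.volume
        delta = self.anchor_y - y    # upward drag -> positive -> louder
        self.volume = max(0, min(100, self.volume + delta * self.VOLUME_PER_PX))
        self.anchor_y = y            # update incrementally as the drag proceeds
        return self.volume

ctrl = EdgeVolumeController()
ctrl.on_preset_touch(5, 1000)   # touch in the left edge region -> feedback
ctrl.on_drag(5, 960)            # drag 40 px upward along the edge
print(ctrl.volume)              # 70.0 (50 + 40 * 0.5)
```

Updating the anchor on every drag event makes the volume track the finger incrementally, which matches the described behavior of updating the volume control UI as the drag proceeds.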

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Health & Medical Sciences (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • General Health & Medical Sciences (AREA)
  • Quality & Reliability (AREA)
  • Multimedia (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

An electronic device according to various embodiments comprises: a touch screen; and a processor operatively connected to the touch screen, wherein the processor may, when a preset touch input for activating a volume control function is detected in a portion of an edge region of the touch screen, provide feedback indicating that the volume control function is activated, and, when a drag input is detected along the edge region after the feedback is provided, change the volume based on the direction and distance of the drag input. Various other embodiments are possible.
PCT/KR2021/004011 2020-03-31 2021-03-31 Dispositif électronique et son procédé de commande WO2021201603A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020200039367A KR20210121918A (ko) 2020-03-31 2020-03-31 전자 장치 및 이의 제어 방법
KR10-2020-0039367 2020-03-31

Publications (1)

Publication Number Publication Date
WO2021201603A1 true WO2021201603A1 (fr) 2021-10-07

Family

ID=77929529

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2021/004011 WO2021201603A1 (fr) 2020-03-31 2021-03-31 Dispositif électronique et son procédé de commande

Country Status (2)

Country Link
KR (1) KR20210121918A (fr)
WO (1) WO2021201603A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080204423A1 (en) * 2007-02-28 2008-08-28 Lg Electronics Inc. Executing functions through touch input device
KR20130114413A (ko) * 2012-04-09 2013-10-17 김정훈 엣지 슬라이딩 ui가 적용된 스마트 리모트 컨트롤러
US20150248213A1 (en) * 2014-02-28 2015-09-03 Samsung Electrônica da Amazônia Ltda. Method to enable hard keys of a device from the screen
KR20160080036A (ko) * 2014-12-29 2016-07-07 삼성전자주식회사 사용자 단말 장치 및 그의 제어 방법
KR20170080799A (ko) * 2015-12-30 2017-07-11 삼성디스플레이 주식회사 플렉서블 디스플레이 장치


Also Published As

Publication number Publication date
KR20210121918A (ko) 2021-10-08

Similar Documents

Publication Publication Date Title
WO2020045947A1 (fr) Dispositif électronique de commande de propriété d'écran sur la base de la distance entre un dispositif d'entrée de stylet et le dispositif électronique et son procédé de commande
WO2020085789A1 (fr) Dispositif électronique pliable pour commander une interface utilisateur et son procédé de fonctionnement
WO2020013528A1 (fr) Affichage souple et dispositif électronique le comportant
WO2021118061A1 (fr) Dispositif électronique et procédé de configuration d'agencement utilisant ledit dispositif
WO2019117566A1 (fr) Dispositif électronique et procédé de commande d'entrée associé
WO2021162435A1 (fr) Dispositif électronique et procédé d'activation de capteur d'empreinte digitale
WO2019103396A1 (fr) Procédé de configuration d'interface d'entrée et dispositif électronique associé
WO2020017743A1 (fr) Dispositif électronique comprenant une unité d'affichage sur laquelle est affiché un écran d'exécution pour de multiples applications, et procédé de fonctionnement du dispositif électronique
WO2020085628A1 (fr) Procédé d'affichage d'objets et dispositif électronique d'utilisation associé
WO2020159308A1 (fr) Dispositif électronique et procédé permettant de mapper une fonction avec une entrée de bouton
WO2021091286A1 (fr) Dispositif électronique comprenant un capteur pour détecter une entrée externe
WO2020171608A1 (fr) Dispositif électronique permettant la fourniture d'une fonction d'entrée d'écriture manuscrite et procédé de fonctionnement correspondant
WO2019035607A1 (fr) Dispositif électronique et procédé de commande de signaux de détection de toucher, et support d'informations
WO2021133123A1 (fr) Dispositif électronique comprenant un écran flexible et son procédé de fonctionnement
AU2018321518B2 (en) Method for determining input detection region corresponding to user interface and electronic device thereof
WO2020091538A1 (fr) Dispositif électronique pour afficher un écran par l'intermédiaire d'un panneau d'affichage en mode de faible puissance et son procédé de fonctionnement
WO2021194252A1 (fr) Dispositif électronique et procédé de partage d'écran
WO2020091530A1 (fr) Procédé et dispositif de détermination de compensation pour des données tactiles sur la base d'un mode de fonctionnement d'un dispositif d'affichage
WO2020013542A1 (fr) Dispositif électronique et procédé d'exécution de fonction de dispositif électronique
WO2021137321A1 (fr) Dispositif électronique destiné à la fourniture de contenu et son procédé de commande
WO2020032512A1 (fr) Dispositif électronique et procédé d'affichage d'une mise à disposition pour fournir une charge de batterie de dispositif externe par l'intermédiaire d'un dispositif d'affichage
WO2021145614A1 (fr) Dispositif électronique pour commander un dispositif électronique externe et procédé associé
WO2021025456A1 (fr) Procédé de commande reposant sur une entrée tactile et dispositif électronique associé
WO2021201603A1 (fr) Dispositif électronique et son procédé de commande
WO2022085940A1 (fr) Procédé et appareil de commande d'affichage d'une pluralité d'objets sur un dispositif électronique

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21782026

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21782026

Country of ref document: EP

Kind code of ref document: A1