WO2018131928A1 - Apparatus and method for providing an adaptive user interface - Google Patents

Apparatus and method for providing an adaptive user interface

Info

Publication number
WO2018131928A1
Authority
WO
WIPO (PCT)
Prior art keywords
electronic device
size
input
user
frequency value
Application number
PCT/KR2018/000610
Other languages
English (en)
Korean (ko)
Inventor
미로소 세프척데미안
Original Assignee
Samsung Electronics Co., Ltd.
Application filed by Samsung Electronics Co., Ltd.
Priority to EP18738701.4A (EP3565284B1)
Priority to US16/477,851 (US10852904B2)
Publication of WO2018131928A1

Classifications

    • G06F 3/0481 — Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 16/00 — Information retrieval; Database structures therefor; File system structures therefor
    • G06F 16/9538 — Presentation of query results
    • G06F 3/01 — Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/0346 — Pointing devices displaced or positioned by the user, with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • G06F 3/038 — Control and interface arrangements therefor, e.g. drivers or device-embedded control circuitry
    • G06F 3/04812 — Interaction techniques based on cursor appearance or behaviour, e.g. being affected by the presence of displayed objects
    • G06F 3/0484 — Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/04847 — Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
    • G06F 3/04886 — Interaction techniques using a touch-screen or digitiser, by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
    • G06F 3/04897 — Special input arrangements or commands for improving display capability
    • G06F 9/451 — Execution arrangements for user interfaces
    • G06V 40/166 — Human faces: detection, localisation or normalisation using acquisition arrangements
    • H04W 4/021 — Services related to particular areas, e.g. point of interest [POI] services, venue services or geofences
    • G06F 2203/04806 — Zoom, i.e. interaction techniques or interactors for controlling the zooming operation
    • G09B 21/008 — Teaching or communicating with blind persons using visual presentation of the information for the partially sighted

Definitions

  • the present disclosure relates generally to an electronic device, and more particularly, to an apparatus and a method for providing a user interface (UI) adaptively according to a user's state.
  • Electronic devices such as laptop computers, tablet PCs, ultra-mobile PCs, and smartphones provide users with greater mobility and convenience.
  • The electronic device is required to provide an improved user interface (UI) according to the user's situation. For example, when the user is moving, when the sensitivity of touch input is reduced due to cold winter weather, when the user wears gloves, when the user's vision is poor, or when the user is unable to make accurate touch inputs due to a certain disease, the user may not correctly select the button the user intends to input from among the displayed input buttons. Accordingly, the electronic device is required to provide a UI adaptively according to the state of the user.
  • the present disclosure provides an apparatus and method for adaptively providing a user interface (UI) according to a state of a user.
  • According to various embodiments, a method of operating an electronic device may include: displaying a user interface (UI) including an object having a first size for executing at least one function of an application executed in the electronic device; detecting a movement of the electronic device based on data acquired through a sensor of the electronic device; and displaying, on the UI, the object with its first size converted to a second size based on the detected movement and information indicating an input frequency of user inputs detected on the object.
  • According to various embodiments, an electronic device may include a display that displays a UI and a processor coupled to the display. The processor may be configured to display, on the UI, an object having a first size for executing at least one function of an application executed in the electronic device, to detect a movement of the electronic device based on data obtained through a sensor of the electronic device, and to display, on the UI, the object with its first size converted to a second size based on the detected movement and information indicating an input frequency of user inputs detected on the object.
  • An electronic device and a method according to various embodiments of the present disclosure may improve the accuracy of a user's touch input by displaying an object converted based on previously stored information and a detected movement of the electronic device.
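  • As an illustrative (non-limiting) aid to the summary above, the following Python sketch shows the basic flow of re-displaying a frequently used object at a larger size once movement is detected. All names, sizes, thresholds, and ratios here are assumptions chosen for illustration and are not part of the disclosure.

```python
from dataclasses import dataclass

@dataclass
class UiObject:
    name: str
    width: int            # first size (pixels), assumed
    height: int
    input_frequency: int  # user inputs detected on this object over some period

def resize_for_movement(obj: UiObject, device_moving: bool,
                        reference_frequency: int = 100,
                        scale: float = 1.5) -> UiObject:
    """Return the object at its second size when the device is moving and the
    object is frequently used; otherwise keep the first size."""
    if device_moving and obj.input_frequency >= reference_frequency:
        return UiObject(obj.name, int(obj.width * scale),
                        int(obj.height * scale), obj.input_frequency)
    return obj

# Example: a frequently used play button grows once movement is detected.
play = UiObject("play", width=64, height=64, input_frequency=130)
print(resize_for_movement(play, device_moving=True))
```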
  • FIG. 1 is a flowchart illustrating an operation of an electronic device for providing a user interface (UI) according to various embodiments of the present disclosure.
  • FIG. 2 illustrates an operation of displaying a UI including objects whose sizes are converted according to various embodiments of the present disclosure.
  • FIGS. 3A to 3C illustrate an operation of an electronic device that stores heat map information on user inputs according to various embodiments of the present disclosure.
  • FIGS. 4A to 4C illustrate an operation of an electronic device for adaptively providing a UI according to various embodiments of the present disclosure.
  • FIGS. 5A and 5B illustrate an operation of an electronic device for providing a UI based on the input frequency value level of each object according to various embodiments of the present disclosure.
  • FIGS. 6A and 6B illustrate an operation of an electronic device for providing a UI based on an acquired face image of a user according to various embodiments of the present disclosure.
  • FIGS. 7A and 7B illustrate an operation of an electronic device for providing a UI based on information about a detected movement of the electronic device according to various embodiments of the present disclosure.
  • FIG. 8 illustrates an operation of an electronic device for providing a UI in response to detecting that the electronic device enters a vehicle according to an embodiment of the present disclosure.
  • FIG. 9 illustrates an operation of an electronic device for providing a UI in response to detecting that the electronic device enters a vehicle according to another embodiment.
  • FIG. 10 illustrates an operation of displaying a UI including an enlarged object according to various embodiments of the present disclosure.
  • FIG. 11 illustrates a network environment including an electronic device according to various embodiments of the present disclosure.
  • FIG. 12 is a block diagram of an electronic device according to various embodiments of the present disclosure.
  • FIG. 13 is a block diagram of a program module according to various embodiments of the present disclosure.
  • In the present disclosure, expressions such as "A or B" or "at least one of A and/or B" may include all possible combinations of the items listed together. Expressions such as "first" and "second" may modify components regardless of order or importance and are used only to distinguish one component from another; they do not limit the components. When a (e.g., first) component is said to be "connected" or "coupled" to another (e.g., second) component, the first component may be connected to the other component directly or through yet another component (e.g., a third component).
  • In the present disclosure, the expression "a device configured to" may mean that the device is "capable of" performing an operation together with other devices or components.
  • For example, "a processor configured (or set) to perform A, B, and C" may mean a dedicated processor (e.g., an embedded processor) for performing the corresponding operations, or a general-purpose processor (e.g., a CPU or an application processor) capable of performing the corresponding operations by executing one or more software programs stored in a memory device.
  • An electronic device may be, for example, a smartphone, a tablet PC, a mobile phone, a video phone, an e-book reader, a desktop PC, a laptop PC, a netbook computer, a workstation, a server, a PDA, a portable multimedia player (PMP), an MP3 player, a medical device, a camera, or a wearable device.
  • A wearable device may be an accessory-type device (e.g., a watch, a ring, a bracelet, an anklet, a necklace, eyeglasses, a contact lens, or a head-mounted device (HMD)) or a textile- or clothing-integrated device, among others.
  • In some embodiments, the electronic device may include at least one of, for example, a television, a digital video disk (DVD) player, an audio system, a refrigerator, an air conditioner, a vacuum cleaner, an oven, a microwave oven, a washing machine, an air purifier, a set-top box, a home automation control panel, a security control panel, a media box (e.g., Samsung HomeSync™, Apple TV™, or Google TV™), a game console (e.g., Xbox™, PlayStation™), an electronic dictionary, an electronic key, a camcorder, or a digital photo frame.
  • In another embodiment, the electronic device may include at least one of various medical devices (e.g., various portable medical measuring devices such as a blood glucose meter, a heart rate monitor, a blood pressure monitor, or a body temperature meter, magnetic resonance angiography (MRA), magnetic resonance imaging (MRI), computed tomography (CT), cameras, or ultrasound devices), a navigation device, a global navigation satellite system (GNSS), an event data recorder (EDR), a flight data recorder (FDR), an automotive infotainment device, or ship electronic equipment, among others.
  • In some embodiments, the electronic device may be a piece of furniture, a building/structure, a part of a vehicle, an electronic board, an electronic signature receiving device, a projector, or one of various measuring instruments (e.g., an instrument for measuring water, electricity, gas, or radio waves).
  • The electronic device may be flexible, or may be a combination of two or more of the aforementioned devices.
  • An electronic device according to an embodiment of the present disclosure is not limited to the above-described devices.
  • In the present disclosure, the term "user" may refer to a person who uses the electronic device or to a device (e.g., an artificial intelligence electronic device) that uses the electronic device.
  • In the present disclosure, an object is an element presented on the display, such as an image or text, on which a user input can be sensed.
  • For example, the object may be an icon displayed on a background screen, or a play button, a pause button, a stop button, and/or a keyboard input button displayed by a music application.
  • According to a specific user state, a user of the electronic device may be unable to correctly select an object displayed on the display of the electronic device. For example, when the user is moving while using the electronic device, the user may not accurately select an object displayed on the display. Likewise, when the user wears gloves, when the user's eyesight is poor, when the sensitivity of the input sensing sensor included in the display of the electronic device decreases due to cold weather, or when the user suffers from a specific disease, the user may not correctly select the displayed object.
  • FIG. 1 is a flowchart illustrating an operation of an electronic device for providing a UI according to various embodiments of the present disclosure.
  • the operations illustrated in FIG. 1 are illustrated as being performed by an electronic device, but each of the operations may be performed by a component included in the electronic device.
  • each of the operations may be executed by at least one processor included in the electronic device.
  • Each of the operations may be implemented as an application programming interface (API) or as a set of instructions stored in a memory of the electronic device.
  • Depending on the implementation, some of the operations illustrated in FIG. 1 may be omitted, other operations may be added, and the same operation may be performed repeatedly.
  • the electronic device displays a UI including an object having a first size for executing at least one function of an application executed in the electronic device.
  • the first size means a default size preset by an application.
  • For example, the object may include an album image of a specific artist, a play button, a pause button, a stop button, a rewind button, a fast forward button, a repeat play button, a random button, and the like.
  • the electronic device detects a movement of the electronic device based on data obtained through the sensor of the electronic device.
  • Methods of detecting the movement of the electronic device may vary according to implementation methods.
  • For example, the electronic device may detect its movement using one or more of a global positioning system (GPS), a geomagnetic sensor, a gravity sensor, an acceleration sensor, or a gyro sensor included in the electronic device.
  • In another example, the electronic device may detect that it has entered a vehicle by communicating with a beacon device or a vehicle system installed in the vehicle.
  • The electronic device displays a UI including the object with its first size converted to a second size, based on the detected movement and information indicating the input frequency of user inputs detected on the object.
  • the second size means a size different from the first size, and the second size may be determined according to various methods.
  • the second size may be a size according to a preset ratio in the electronic device.
  • In another embodiment, the second size may be determined according to a level into which the input frequency value of the object is classified on a predetermined basis. For example, when the input frequency value of the object corresponds to the first level, the electronic device may enlarge the size of the object by a ratio corresponding to the first level.
  • When the input frequency value of the object is larger than the value corresponding to the first level (that is, when it corresponds to the second level), the electronic device may enlarge the size of the object by a ratio corresponding to the second level, which is larger than the ratio corresponding to the first level.
  • In another embodiment, the second size may be determined based on information about incorrect inputs. For example, when the electronic device detects that the user frequently inputs the region above the object instead of inputting the object itself, the electronic device may enlarge the size of the object in the vertical direction.
  • Information for indicating the input frequency may vary depending on the implementation method.
  • the information for indicating the frequency of the input may be heatmap information indicating the input frequency of the user input detected within a region of the object.
  • the heat map is information for displaying the frequency of occurrence of an event (eg, sensed user input) occurring in a specific area in different colors.
  • the electronic device may determine what object the user of the electronic device frequently inputs using the heat map information generated according to the frequency value of the user input detected for a certain time.
  • In another embodiment, the information indicating the input frequency may be a frequency value of user inputs detected outside the area of the object.
  • For example, while walking, the user may fail to correctly input the play button displayed by the electronic device and may instead input an area outside the play button.
  • In this case, the electronic device may generate heat map information according to the frequency value of user inputs detected outside the area of the object for a predetermined time (in the present disclosure, a user input detected outside the area of an object is referred to as an incorrect input), and may use this information to identify the pattern of the user's incorrect inputs.
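  • The heat-map information described above can be reduced to per-object counters of inputs landing inside each object's area, plus counters for inputs landing outside every object (incorrect inputs). The following Python sketch illustrates one way to accumulate such frequencies; the object rectangles, grid size, and names are assumptions for illustration only.

```python
from collections import Counter

# Object areas as (left, top, right, bottom) rectangles; values are assumptions.
OBJECT_AREAS = {
    "rewind":       (10, 300, 70, 360),
    "pause":        (90, 300, 150, 360),
    "stop":         (170, 300, 230, 360),
    "fast_forward": (250, 300, 310, 360),
}

in_area_counts = Counter()    # input frequency per object (heat-map source)
incorrect_counts = Counter()  # inputs detected outside every object's area

def record_touch(x: int, y: int) -> None:
    """Accumulate one touch event into the frequency counters."""
    for name, (l, t, r, b) in OBJECT_AREAS.items():
        if l <= x <= r and t <= y <= b:
            in_area_counts[name] += 1
            return
    # No object contained the touch: treat it as an incorrect input,
    # keyed by a coarse grid cell so a spatial pattern can be identified.
    incorrect_counts[(x // 20, y // 20)] += 1

record_touch(40, 330)   # lands on "rewind"
record_touch(40, 290)   # just above "rewind" -> incorrect input
print(in_area_counts, incorrect_counts)
```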
  • In various embodiments, the electronic device may use previously stored information in addition to the information indicating the input frequency for the object.
  • the pre-stored information may be size expansion information preset in the electronic device.
  • the electronic device may display a UI in which the first size is enlarged (or reduced) to the second size according to a preset ratio in the electronic device.
  • the pre-stored information may include one or more of information on the age, gender, vision, and disease history of the user of the electronic device.
  • For example, when the user's vision is below a certain reference value (e.g., a visual acuity of 0.1), making it difficult to identify objects displayed on the electronic device, or when the user cannot make accurate inputs due to a certain disease (e.g., Parkinson's disease), the electronic device may display a UI including the object with its first size converted to the second size.
  • the pre-stored information may be updated every certain period (eg, one day, one week, or one month).
  • FIG. 2 illustrates an operation of displaying a UI including objects whose sizes are converted according to various embodiments of the present disclosure.
  • FIG. 2 assumes that a music application is executed in the electronic device; however, the electronic device can provide a UI according to the same principle even when another application (for example, a media playback application, a game application, a schedule application, or a text or call application) is executed.
  • the electronic device displays a UI 200a indicating a state in which music is being played.
  • the UI 200a includes at least one object.
  • the UI 200a includes an image 210 related to the music currently being played, a rewind button 212a, a pause button 212b, a stop button 212c, a fast forward button 212d, a random button 214a, and a repeat play button 214b.
  • another object may be included in the UI 200a according to an embodiment.
  • a play button, a song list button, and / or a menu button may be further included.
  • The user of the electronic device can normally select an object included in the UI 200a in an ordinary situation. However, if the electronic device detects that the user may be unable to select objects normally (for example, when the user is moving), the electronic device may display the UI 200b in which the size of some objects is enlarged. The remaining objects, which are determined not to be input frequently by the user, may be displayed at the same size as, or a smaller size than, in the ordinary state. For example, the rewind button 212a, the pause button 212b, the stop button 212c, and the fast forward button 212d may be displayed at a larger size. In addition, if it is determined that the user does not frequently select the image 210, the electronic device may display the image 210 at a reduced size.
  • Each object included in the UI 200b may be rearranged according to the changed size.
  • the rewind button 212a and the fast forward button 212d may be located at the center of the UI 200b
  • the pause button 212b and the stop button 212c may be located at the bottom of the UI 200b.
  • the image 210, the random button 214a, and the repeat play button 214b may be aligned at regular intervals.
  • FIGS. 3A to 3C illustrate an operation of an electronic device that stores heat map information on user inputs according to various embodiments of the present disclosure.
  • Depending on the implementation, some of the operations illustrated in FIG. 3A may be omitted, other operations may be added, and the same operation may be performed repeatedly.
  • the electronic device determines whether a user input is detected within an area of the displayed object in step 310.
  • The area of the object means the input range within which an input is recognized as selecting the object.
  • When a user input is detected within the area of one of the displayed objects, the electronic device may determine that an input has been detected within the area of that object. In this case, the electronic device performs step 320.
  • If the user input is not detected within the area of the object, the electronic device performs step 330.
  • The method of determining whether a user input is sensed outside the area of an object varies depending on the implementation method. In one embodiment, if a user input is detected outside the areas of the displayed plurality of objects (that is, in an area where no object is displayed), the electronic device may determine that the user input is not detected within the area of an object. For example, referring to FIG. 3C, a user of the electronic device may, in a specific user situation (for example, when the user's eyesight is poor), fail to correctly input the rewind button 212a and may instead repeatedly input the area above the rewind button 212a.
  • the electronic device may determine that the user input detected at the upper portion of the rewind button 212a is an incorrect input.
  • the user of the electronic device may repeatedly input the area between the pause button 212b and the stop button 212c without properly inputting the pause button 212b and the stop button 212c according to a specific user situation.
  • the electronic device may determine that a user input detected between the pause button 212b and the stop button 212c is an incorrect input.
  • In another embodiment, the electronic device may determine that the user has selected an unintended object. For example, when the electronic device receives a user input for one object from among a plurality of displayed objects and then receives an input cancelling that user input within a specified time, the electronic device may determine that the user input an unintended object (that is, that an incorrect input has been detected).
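  • As a hedged sketch of the decision just described, a touch might be counted as an incorrect input either when it falls outside every object's area or when the resulting selection is cancelled within a specified time; the timeout below is an assumed value, not one defined by the disclosure.

```python
from typing import Optional

CANCEL_TIMEOUT_S = 2.0  # assumed "specified time" for treating a cancel as a mis-tap

def is_incorrect_input(hit_object: Optional[str],
                       cancelled_after_s: Optional[float]) -> bool:
    """Return True when a touch should be counted as an incorrect input."""
    if hit_object is None:
        return True  # the touch fell outside the area of every displayed object
    if cancelled_after_s is not None and cancelled_after_s <= CANCEL_TIMEOUT_S:
        return True  # the selection was cancelled almost immediately: likely unintended
    return False

print(is_incorrect_input(None, None))     # True  (outside all objects)
print(is_incorrect_input("pause", 1.2))   # True  (cancelled within the timeout)
print(is_incorrect_input("pause", None))  # False (a normal, kept selection)
```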
  • the electronic device stores heat map information on the object generated based on an input frequency value of the user input sensed for a certain period of time.
  • the heat map is information for displaying the frequency of occurrence of an event (eg, sensed user input) occurring within a specific area in different colors.
  • The heat map information may be generated based on input frequency values (e.g., 10, 50, or 100) accumulated from user inputs over a certain period of time (e.g., one year, six months, one month, or one week).
  • The heat map may be displayed in different colors according to the input frequency value; as the input frequency increases, more colors may be used. For example, referring to FIG. 3B, the electronic device generates a heat map according to the detected input frequency values.
  • When the frequency value of user inputs detected on the rewind button 212a and the fast forward button 212d is greater than the frequency value of user inputs detected on the pause button 212b and the stop button 212c (for example, greater than 75 times per week), the electronic device may generate a heat map showing more colors on the rewind button 212a and the fast forward button 212d.
  • For example, the heat map for each of the pause button 212b and the stop button 212c may be displayed using one or more of green and yellow, while the heat map for each of the rewind button 212a and the fast forward button 212d may be displayed using one or more of green, yellow, and red.
  • the electronic device may store information about the generated heat map in a storage unit of the electronic device.
  • the electronic device may update the heat map information based on a newly detected user input for a certain period of time.
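  • The color coding used in the heat map of FIG. 3B can be viewed as a simple bucketing of accumulated frequency values. The sketch below is illustrative only; the cut-off values and color sets are assumptions consistent with the example numbers above (about 75 inputs per week for the pause/stop buttons and 130 for rewind/fast-forward).

```python
from typing import List

def heatmap_colors(frequency_per_week: int) -> List[str]:
    """Return the set of colors used to draw an object's heat map,
    growing with the input frequency (thresholds are illustrative)."""
    if frequency_per_week < 50:
        return ["green"]
    if frequency_per_week < 100:
        return ["green", "yellow"]        # e.g. pause/stop at ~75 inputs per week
    return ["green", "yellow", "red"]     # e.g. rewind/fast-forward at 130

for name, freq in [("pause", 75), ("stop", 75), ("rewind", 130)]:
    print(name, heatmap_colors(freq))
```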
  • the electronic device stores heat map information generated based on an input frequency value of the wrong input.
  • the electronic device may generate heat map information on the erroneous input using a principle similar to the operation implemented in operation 340. For example, referring to FIG. 3C, the electronic device may generate heat map information on a user input repeatedly detected on the rewind button 212a (or fast forward button 212d). In addition, the electronic device may generate heat map information on a user input repeatedly detected between the pause button 212b and the stop button 212c.
  • FIGS. 4A to 4C illustrate an operation of an electronic device for adaptively providing a UI according to various embodiments of the present disclosure.
  • The operations illustrated in FIG. 4A implement operation 130 of FIG. 1 in more detail.
  • Depending on the implementation, some of the operations illustrated in FIG. 4A may be omitted, other operations may be added, and the same operation may be performed repeatedly.
  • In step 410, the electronic device determines whether to load the input frequency value of user inputs detected within the area of an object or the input frequency value of incorrect inputs.
  • the electronic device may automatically load an input frequency value of a user input detected within an area of the object. In this case, the electronic device may change the setting according to the update history of the information indicating the detected frequency of input of the user input.
  • For example, even when the electronic device is set to load the input frequency value of user inputs detected within the area of an object, if the frequency of incorrect inputs detected during a certain period exceeds the frequency of user inputs detected within the area of the object by a predetermined ratio, the electronic device may change the setting so as to load the input frequency value of incorrect inputs.
  • In another embodiment, the user of the electronic device may specify which information is to be loaded. For example, the user may set the electronic device to load the input frequency value of user inputs detected within the area of an object when movement of the electronic device is detected. If it is determined that the input frequency value of user inputs detected within the area of the object is to be loaded, the electronic device loads heat map information on the user inputs detected within the area of the object in step 420. If it is determined that the input frequency value of incorrect inputs is to be loaded, the electronic device loads heat map information on the incorrect inputs in step 450.
  • In step 430, the electronic device determines whether the input frequency value of the object is equal to or greater than a predetermined reference frequency value.
  • the reference frequency value may be variously set according to an implementation method.
  • the reference frequency value may be a default value preset in the electronic device.
  • the reference frequency value may be a value specified by a user of the electronic device. For example, the user of the electronic device may set a reference frequency value to detect 100 user inputs during one week period in one object. In this case, if 120 user inputs are detected in a specific object for one week, the electronic device may determine that an input frequency value for the object is equal to or greater than a predetermined reference frequency value.
  • If the input frequency value of the object is less than the predetermined reference frequency value, the electronic device terminates the algorithm for the object. In other words, the electronic device may display the object at the same size as, or a smaller size than, the already displayed object. If the input frequency value of the object is equal to or greater than the predetermined reference frequency value, the electronic device performs step 440.
  • In step 440, the electronic device displays a UI including the object with its first size converted to the second size according to a predetermined ratio.
  • the ratio may be determined in various ways depending on the implementation method.
  • the ratio may be a preset value in the electronic device.
  • the ratio may be a value specified by the user of the electronic device.
  • the ratio may be set in percentage (%) or multiples (eg 1.5 times, 2 times, 3 times).
  • For example, before the movement of the electronic device is detected, the electronic device may display the UI 400a including the image 210, the random button 214a, the repeat play button 214b, the rewind button 212a, the pause button 212b, the stop button 212c, and the fast forward button 212d.
  • When the movement of the electronic device is detected, the electronic device may display the UI 400b including the rewind button 212a, the pause button 212b, the stop button 212c, and the fast forward button 212d enlarged according to a predetermined ratio.
  • the image 210 may be displayed in a smaller size than before in the UI 400b since the user input sensed for a predetermined period is less than the reference frequency value.
  • the random button 214a and the repeat play button 214b may be displayed on the UI 400b in the same size as before.
  • the electronic device determines whether an input frequency value of an incorrect input is equal to or greater than a predetermined reference frequency value.
  • the reference frequency value for the wrong input may be set in various ways according to the implementation method.
  • the reference frequency value for the wrong input may be the same value as the reference frequency value described in step 430 or may be an independently determined value. If the frequency value of the wrong input for the object is less than a predetermined reference frequency value, the electronic device terminates the algorithm for the object. In other words, the electronic device may display an object of the same size or smaller size than the object already displayed. If the frequency value of the wrong input to the object is greater than or equal to a predetermined reference frequency value, the electronic device performs step 470.
  • In step 470, the electronic device displays a UI including the object whose first size is converted based on the incorrect inputs. For example, referring to FIG. 4C, before the movement of the electronic device is detected, the electronic device may display the UI 400a including the image 210, the random button 214a, the repeat play button 214b, the rewind button 212a, the pause button 212b, the stop button 212c, and the fast forward button 212d. When the movement of the electronic device is detected, the electronic device may display the UI 400c in which the shape of each object is changed.
  • In the case of the rewind button 212a and the fast forward button 212d, since user inputs are mainly sensed above the area of each object, the electronic device may display the rewind button 212a and the fast forward button 212d enlarged in the vertical direction to compensate for the user's incorrect inputs.
  • Similarly, the electronic device may increase the horizontal size of the pause button 212b and the stop button 212c to compensate for the user's incorrect inputs.
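  • One way to read the behavior in FIG. 4C is that the enlargement direction follows where the incorrect inputs cluster around the object: misses above or below suggest vertical growth, misses to the left or right suggest horizontal growth. The following sketch illustrates that heuristic under assumed coordinates; it is not the claimed method itself.

```python
from typing import List, Tuple

def growth_direction(area: Tuple[int, int, int, int],
                     misses: List[Tuple[int, int]]) -> str:
    """Decide whether to enlarge an object vertically or horizontally,
    based on the offsets of incorrect inputs from its bounding box."""
    l, t, r, b = area
    vertical = horizontal = 0
    for x, y in misses:
        if y < t or y > b:
            vertical += 1     # missed above or below the object
        elif x < l or x > r:
            horizontal += 1   # missed to the left or right of the object
    return "vertical" if vertical >= horizontal else "horizontal"

# Misses clustered above the rewind button -> grow it vertically.
print(growth_direction((10, 300, 70, 360), [(40, 280), (45, 285), (38, 290)]))
# Misses between the pause and stop buttons -> grow those buttons horizontally.
print(growth_direction((90, 300, 150, 360), [(160, 330), (158, 335)]))
```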
  • FIGS. 5A and 5B illustrate an operation of an electronic device for providing a UI based on the input frequency value level of each object according to various embodiments of the present disclosure.
  • Depending on the implementation, some of the operations illustrated in FIG. 5A may be omitted, other operations may be added, and the same operation may be performed repeatedly.
  • After the electronic device loads heat map information on the user inputs detected within the area of the object in step 420, the electronic device determines, in step 510, the level corresponding to the input frequency value of the object.
  • The level may be set variously according to the implementation method. According to an embodiment of the present disclosure, the level may be a preset value in the electronic device. In another embodiment, the level may be a value specified by a user of the electronic device. For example, the user of the electronic device may set the level of an object to the first level when the frequency value of user inputs detected on the object during a certain period (for example, one year, six months, one month, or one week) is less than 50.
  • The user of the electronic device may set the level of the object to the second level when the frequency value of user inputs detected on the object during the same period is 50 or more and less than 100.
  • The user of the electronic device may set the level of the object to the third level when the frequency value of user inputs detected on the object during the same period is 100 or more.
  • For example, referring to FIG. 5A, the electronic device may determine the level of the image 210, the random button 214a, and the repeat play button 214b, on which user inputs are rarely detected in the displayed UI 500a (that is, whose input frequency value is less than 50), to be the first level.
  • The electronic device may determine the level of the pause button 212b and the stop button 212c, which have an input frequency value of 75, to be the second level.
  • The electronic device may determine the level of the rewind button 212a and the fast forward button 212d, which have an input frequency value of 130, to be the third level.
  • The periods and counts used to classify the levels are for illustration only, and the scope of rights is not limited to a specific period or count.
  • Although FIG. 5A illustrates only three levels, more than three levels may be set depending on the implementation method.
  • The electronic device determines whether the level corresponding to the object is the first level. When the level of the object is the first level, the electronic device displays the object at the same size as, or a smaller size than, the previously displayed object and ends the algorithm. If the level of the object is not the first level, the electronic device performs step 530.
  • In step 530, the electronic device determines whether the level corresponding to the object is the second level. If the level of the object is the second level, the electronic device displays, in step 540, the object with its first size converted to the second size. If the level of the object is not the second level, the electronic device displays, in step 550, the object with its first size converted to the third size.
  • The third size means a size larger than the second size. For example, referring to FIG. 5B, the pause button 212b and the stop button 212c corresponding to the second level are displayed at a larger size than the image 210, the random button 214a, and the repeat play button 214b corresponding to the first level.
  • The rewind button 212a and the fast forward button 212d corresponding to the third level are displayed at a larger size than the pause button 212b and the stop button 212c corresponding to the second level.
  • the rate at which the size of the object is enlarged at each level may be determined in various ways according to the implementation method.
  • the ratio may be a preset value in the electronic device.
  • the ratio may be a value specified by the user of the electronic device.
  • the ratio may be set in percentage (%) or multiples (eg 1.5 times, 2 times, 3 times).
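  • Putting the level classification of FIGS. 5A and 5B together, an object's input frequency maps to a level, and the level maps to an enlargement ratio. The sketch below uses the 50/100 thresholds from the example above; the per-level ratios are assumptions, since the disclosure only requires higher levels to use larger ratios.

```python
def frequency_level(inputs_per_period: int) -> int:
    """Classify an input frequency value into level 1, 2, or 3
    (thresholds taken from the example above)."""
    if inputs_per_period < 50:
        return 1
    if inputs_per_period < 100:
        return 2
    return 3

# Enlargement ratio per level; the actual ratios are implementation choices.
LEVEL_RATIOS = {1: 1.0, 2: 1.5, 3: 2.0}

def scaled_size(first_size: int, inputs_per_period: int) -> int:
    return int(first_size * LEVEL_RATIOS[frequency_level(inputs_per_period)])

print(scaled_size(64, 30))    # level 1: unchanged
print(scaled_size(64, 75))    # level 2: pause / stop buttons
print(scaled_size(64, 130))   # level 3: rewind / fast-forward buttons
```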
  • the electronic device may display a UI including an object in which the first size is converted to the second size using information previously stored in the electronic device, in addition to information for indicating the frequency of input to the object.
  • For example, the pre-stored information may include the user's vision information and information indicating whether the user normally wears glasses. If the user's vision is below a certain level and the user is not wearing glasses, the electronic device needs to provide a UI in which the size of the displayed objects is enlarged.
  • FIGS. 6A and 6B illustrate an operation of an electronic device for providing a UI based on an acquired face image of a user according to various embodiments of the present disclosure.
  • Depending on the implementation, some of the operations illustrated in FIG. 6A may be omitted, other operations may be added, and the same operation may be performed repeatedly.
  • When the movement of the electronic device is detected in step 120, the electronic device obtains a face image of the user using an image sensor included in the electronic device in step 610.
  • the operation of acquiring the face image by the electronic device may be triggered according to various methods.
  • the electronic device may acquire the face image in response to detecting that the inclination of the electronic device changes.
  • In another embodiment, the electronic device may acquire the face image in response to initiating execution of an application in the electronic device. For example, referring to reference numeral 600a, when the user of the electronic device 610a holds the electronic device 610a in front of the user's eyes to check the UI displayed on the electronic device 610a, the image sensor 610b included in the electronic device 610a may acquire a face image of the user.
  • the electronic device determines whether the user of the electronic device wears glasses based on the acquired face image. When it is determined that the user wears glasses as shown by reference numeral 600b of FIG. 6B, the electronic device ends the algorithm and displays the UI 600c in which the size of the previously displayed objects is maintained. If it is determined that the user does not wear the glasses as shown by reference numeral 600d of FIG. 6B, the electronic device performs step 630.
  • In step 630, the electronic device displays a UI including the object with its first size converted to the second size.
  • the electronic device may display a UI 600e including a rewind button 212a, a pause button 212b, a stop button 212c, and a fast forward button 212d enlarged according to a predetermined ratio.
  • the ratio may be a value preset in the electronic device or may be a value specified by a user of the electronic device.
  • the ratio may be set in percentage (%) or multiples (eg 1.5 times, 2 times, 3 times).
  • FIGS. 7A and 7B illustrate an operation of an electronic device for providing a UI based on information about a detected movement of the electronic device according to various embodiments of the present disclosure.
  • Depending on the implementation, some of the operations illustrated in FIG. 7A may be omitted, other operations may be added, and the same operation may be performed repeatedly.
  • The modes of the electronic device described below (for example, the normal mode, the walking mode, and the running mode) are merely names used for convenience of description, and the scope of rights is not limited thereto.
  • the electronic device measures a moving speed of the electronic device using at least one sensor included in the electronic device.
  • the moving speed of the electronic device can be variously measured according to the implementation method.
  • For example, the electronic device may measure the position of the electronic device using a GPS included in the electronic device, and may measure the moving speed of the electronic device based on the distance the electronic device has moved during a predetermined time.
  • the electronic device may measure the tilt, rotation, and / or movement speed of the electronic device using at least one of a gravity sensor, a gyro sensor, and an acceleration sensor.
  • In addition, the electronic device may measure the degree of movement of the electronic device as well as its moving speed. For example, the electronic device may count the number of times (for example, 5 or 10 times) that movement of the electronic device is detected by the at least one sensor during a predetermined time (for example, 3 seconds, 5 seconds, or 10 seconds). The electronic device measures the moving speed (or the degree of movement) of the electronic device to determine whether the user is not moving (normal mode), walking (walking mode), or running (running mode).
  • the electronic device determines whether the measured speed is equal to or greater than a first speed threshold.
  • The first speed threshold may be variously set according to the implementation method. According to an embodiment of the present disclosure, the first speed threshold may be a preset value in the electronic device. In another embodiment, the first speed threshold may be arbitrarily set by a user of the electronic device. In another embodiment, the first speed threshold may be a value updated in consideration of the pre-stored gender, age, and disease history of the user. For example, the first speed threshold may be set to the walking speed of a typical adult male (e.g., 6 to 8 km/h, or 100 movements per minute).
  • In consideration of the user's situation, the first speed threshold may be updated to a lower value (for example, 2 to 3 km/h, or 50 movements per minute). If it is determined that the measured speed is less than the first speed threshold, the electronic device performs step 740. If it is determined that the measured speed is greater than or equal to the first speed threshold, the electronic device performs step 750.
  • In step 740, the electronic device determines that the electronic device is in the normal mode. For example, referring to FIG. 7B, as shown by reference numeral 700a, the electronic device may determine that the user is not moving. In this case, the electronic device terminates the algorithm and displays the UI 700b in which the size of the previously displayed objects is maintained.
  • In step 750, the electronic device determines whether the measured speed is greater than or equal to a second speed threshold.
  • the second speed threshold may be variously set in a similar principle to the first speed threshold.
  • the second speed threshold may be a preset value, a value arbitrarily designated by a user, or an updated value based on prestored user information.
  • the second speed threshold may be set to a running speed of a typical adult male (eg, 8 km / h or more, or 150 times per minute). If it is determined that the measured speed is less than the second speed threshold, the electronic device performs step 760. If it is determined that the measured speed is greater than or equal to the second speed threshold, the electronic device performs step 780.
  • In step 760, the electronic device determines that the electronic device is in the walking mode. For example, referring to FIG. 7B, as shown by reference numeral 700c, the electronic device may determine that the user is walking at a constant speed.
  • the electronic device may display a UI including an object in which the first size is converted to the second size according to a predetermined ratio.
  • the electronic device may display a UI 700d including a rewind button 212a, a pause button 212b, a stop button 212c, and a fast forward button 212d having a size larger than the size displayed in the normal mode.
  • the ratio may be determined in various ways depending on the implementation method.
  • the ratio may be a value preset in the electronic device or may be a value specified by a user of the electronic device.
  • the ratio may be set in percentage (%) or multiples (eg 1.5 times, 2 times, 3 times).
  • In step 780, the electronic device determines that the electronic device is in the running mode. For example, referring to FIG. 7B, as shown by reference numeral 700e, the electronic device may determine that the user is running at a constant speed. In this case, in step 790, the electronic device may display a UI including the object with its first size converted to a third size that is larger than the second size by a certain ratio. For example, as shown in FIG. 7B, the electronic device may display a UI including the rewind button 212a, the pause button 212b, the stop button 212c, and the fast forward button 212d at a size larger than the size displayed in the walking mode.
  • the ratio may be determined in various ways depending on the implementation method. The ratio may be a value preset in the electronic device or may be a value specified by a user of the electronic device. The ratio can be set in percentage or multiples.
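  • The mode selection of FIGS. 7A and 7B can likewise be sketched as two threshold comparisons followed by a ratio lookup. The thresholds below reuse the example walking and running speeds quoted above (6 km/h and 8 km/h); the ratios are assumptions, with the only requirement being that the running-mode ratio exceeds the walking-mode ratio.

```python
WALK_THRESHOLD_KMH = 6.0   # first speed threshold (example adult walking speed)
RUN_THRESHOLD_KMH = 8.0    # second speed threshold (example adult running speed)

def movement_mode(speed_kmh: float) -> str:
    """Classify the measured moving speed into one of the three modes."""
    if speed_kmh < WALK_THRESHOLD_KMH:
        return "normal"
    if speed_kmh < RUN_THRESHOLD_KMH:
        return "walking"
    return "running"

# Enlargement ratios per mode are assumptions; the disclosure only requires
# the running-mode ratio to exceed the walking-mode ratio.
MODE_RATIOS = {"normal": 1.0, "walking": 1.5, "running": 2.0}

for speed in (2.0, 7.0, 10.0):
    mode = movement_mode(speed)
    print(speed, mode, MODE_RATIOS[mode])
```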
  • FIG. 8 illustrates an operation of an electronic device for providing a UI in response to detecting that the electronic device enters a vehicle according to an embodiment of the present disclosure.
  • Depending on the implementation, some of the operations illustrated in FIG. 8 may be omitted, other operations may be added, and the same operation may be performed repeatedly.
  • the electronic device 810 displays a UI including an object having a first size.
  • For example, the object may be an album image of a specific artist, a play button, a pause button, a stop button, a rewind button, a fast forward button, a repeat play button, a random button, and the like.
  • the electronic device 810 receives a beacon signal from the beacon device 820.
  • the electronic device 810 may receive a beacon signal from the beacon device 820 through various communication methods.
  • the beacon device 820 may broadcast fixed information included in a packet to the electronic device 810.
  • the fixed information may include various information.
  • the beacon signal may include an identifier of the beacon device 820.
  • the electronic device 810 may determine that the electronic device 810 has entered the vehicle, based on the received beacon signal. For example, the electronic device 810 may identify the vehicle system 830 of the vehicle in which the electronic device 810 is located based on the identification information included in the beacon signal.
  • the vehicle system 830 may be a device embedded in a vehicle, or may be an external device that is wired or wirelessly connected to the vehicle.
  • the electronic device 810 displays a UI including an object in which the first size is converted to the second size according to a predetermined ratio based on previously stored information.
  • The pre-stored information may include one or more of heat map information indicating the input frequency value of user inputs for the object, heat map information indicating the input frequency value of user inputs detected outside the area of the object, or information about the user of the electronic device 810 (for example, the user's eyesight information).
  • the ratio may be a value preset in the electronic device 810, a value predetermined by the user of the electronic device 810, or a value updated based on the information about the user.
  • FIG. 8 illustrates an operation in which the electronic device 810 automatically displays a UI including the object converted from the first size to the second size in response to receiving the beacon signal; however, depending on the implementation, the electronic device 810 may display a new UI asking whether to display the UI including the object converted to the second size. In this case, the electronic device 810 may display the UI including the object converted to the second size in response to a user input confirming that the UI including the converted object is to be displayed.
  • the electronic device 810 establishes a communication link with the vehicle system 830 to share information about an application running in the electronic device 810 with the vehicle system 830.
  • the electronic device 810 may establish a communication connection with the vehicle system 830 through a communication standard such as WiFi Direct (wireless fidelity direct), infrared (IR), Bluetooth, Zigbee, Z-Wave, visible light communication (VLC), 3G, LTE, or 5G.
  • the electronic device 810 may transmit information about an application running in the electronic device 810 to the vehicle system 830 through the connected wireless communication connection.
  • the information about the application may include information about the UI of the application displayed on the electronic device 810 and information about the size expansion of the objects included in the UI.
  • the vehicle system 830 displays the UI displayed on the electronic device 810 based on the information received from the electronic device 810.
  • the vehicle system 830 may display a UI including objects such as a play button, a pause button, a stop button, a rewind button, a fast forward button, a repeat play button, and a random button.
  • the objects may be displayed with their size enlarged.
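  • The beacon-driven flow of FIG. 8 can be pictured with the following illustrative Kotlin sketch; it is an assumption for explanation only (BeaconSignal, knownVehicleBeacons, and onBeaconReceived are not names from the disclosure) and merely maps a received beacon identifier to a known vehicle system before switching to the enlarged UI.

// Illustrative sketch (assumption): decide whether to switch to an enlarged "vehicle" UI
// based on the identifier carried in a received beacon packet.
data class BeaconSignal(val beaconId: String)

// Hypothetical registry mapping beacon identifiers to known vehicle systems.
val knownVehicleBeacons = mapOf("beacon-830" to "vehicleSystem-830")

fun onBeaconReceived(signal: BeaconSignal): String? =
    knownVehicleBeacons[signal.beaconId]  // non-null means the device has entered that vehicle

fun main() {
    val vehicle = onBeaconReceived(BeaconSignal("beacon-830"))
    if (vehicle != null) {
        println("Entered $vehicle: display UI with objects converted to the second size")
    } else {
        println("No vehicle detected: keep the first-size UI")
    }
}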
  • FIG. 9 illustrates an operation of an electronic device for providing a UI in response to detecting that the electronic device enters a vehicle according to another embodiment. For each of the operations shown in FIG. 9, some operations may be omitted or added according to an implementation method, and the same operation may be repeatedly performed.
  • the electronic device 910 displays a UI including an object having a first size.
  • the object may be an album image of a specific artist, a play button, a pause button, a stop button, a rewind button, a fast forward button, a repeat play button, a random button, or the like.
  • the electronic device 910 detects a communication event with the vehicle system 920.
  • the electronic device 910 may detect that communication with the vehicle system 920 can be performed through communication standards such as WiFi Direct, IR, near field communication (NFC), Bluetooth, Zigbee, Z-Wave, VLC, 3G, LTE, and 5G.
  • the vehicle system 920 may be a device embedded in the vehicle, or may be an external device connected to the vehicle by wire or wirelessly.
  • the electronic device 910 may determine that the electronic device 910 enters the inside of the vehicle including the vehicle system 920.
  • the electronic device 910 displays a UI including an object in which the first size is converted to the second size according to a predetermined ratio based on previously stored information.
  • FIG. 9 illustrates an operation in which the electronic device 910 automatically displays a UI including the object converted to the second size in response to detecting the communication event; however, depending on the implementation, the electronic device 910 may display a new UI asking whether to display the UI including the object converted to the second size. In this case, the electronic device 910 may display the UI including the object converted to the second size in response to a user input confirming that the UI including the converted object is to be displayed.
  • the electronic device 910 transmits information about the application running on the electronic device 910 to the vehicle system 920 in order to share that information with the vehicle system 920.
  • the information about the application may include an identifier of the application and information (eg, heat map information, enlargement ratio, etc.) for expanding the first size to the second size.
  • the vehicle system 920 displays the UI displayed on the electronic device 910 based on the information about the application received from the electronic device 910.
  • the vehicle system 920 may display a UI including objects such as a play button, a pause button, a stop button, a rewind button, a fast forward button, a repeat play button, and a random button. The objects may be displayed with their sizes enlarged.
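  • The application information shared with the vehicle system (application identifier, heat map information, enlargement ratio) might be packaged as in the hypothetical Kotlin sketch below; the field names and values are assumptions, and the actual serialization and transport over the established link are outside this illustration.

// Illustrative sketch (field names are assumptions): the payload an electronic device
// might send so a vehicle system can mirror the application's UI with enlarged objects.
data class ApplicationUiInfo(
    val applicationId: String,        // identifier of the running application
    val heatMap: Map<String, Int>,    // per-object input frequency values
    val enlargementRatio: Double      // ratio used to expand the first size to the second size
)

fun main() {
    val info = ApplicationUiInfo(
        applicationId = "music.player",
        heatMap = mapOf("play" to 120, "pause" to 80, "stop" to 5),
        enlargementRatio = 1.5
    )
    // In practice the object would be serialized and sent over the established link
    // (e.g. Bluetooth or WiFi Direct); here we only print it.
    println(info)
}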
  • FIG. 10 illustrates an operation of displaying a UI including an enlarged object according to various embodiments of the present disclosure.
  • FIG. 10 illustrates a UI 1000a for an application executed in the electronic device in a general situation (ie, a normal mode).
  • the electronic device may display a UI including images or input buttons that are not enlarged in size.
  • in response to the movement of the electronic device being detected, the electronic device displays a UI 1010, superimposed on the displayed UI 1000a, for determining whether to switch to the magnification mode.
  • the magnification mode may include one of a mode in which the sizes of some objects are enlarged based on heat map information, a mode in which the sizes of some objects are enlarged based on a level corresponding to each object, and a mode in which the sizes of the corresponding objects are enlarged based on inputs detected outside the area of each object.
  • the detected movement may include one of a moving speed of the electronic device, a face image of the user, whether the user wears glasses, and whether the electronic device enters the vehicle.
  • the UI 1010 may include selection buttons 1012 and 1014.
  • when a user input is detected on the selection button 1012 indicating "YES", the electronic device displays the UI 1000c in which the sizes of the objects are enlarged. When a user input is detected on the selection button 1014 indicating "NO", the electronic device maintains the previously displayed UI 1000a.
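  • A minimal Kotlin sketch of this confirmation step is shown below; the enum names and the mapping of choices to UIs are assumptions used only to make the flow of FIG. 10 concrete.

// Illustrative sketch (assumption): show a yes/no prompt after a movement is detected and
// only switch to the magnification mode when the user confirms.
enum class UserChoice { YES, NO }
enum class MagnificationVariant { HEAT_MAP_BASED, LEVEL_BASED, OUTSIDE_INPUT_BASED }

fun resolveUi(choice: UserChoice, variant: MagnificationVariant): String =
    if (choice == UserChoice.YES) "show UI 1000c with $variant enlargement"
    else "keep UI 1000a (original sizes)"

fun main() {
    println(resolveUi(UserChoice.YES, MagnificationVariant.HEAT_MAP_BASED))
    println(resolveUi(UserChoice.NO, MagnificationVariant.LEVEL_BASED))
}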
  • according to various embodiments, a method of operating an electronic device providing a user interface includes: displaying a UI including an object having a first size for executing at least one function of an application executed in the electronic device; storing information indicating an input frequency of user inputs detected on the object; detecting a movement of the electronic device based on data acquired through a sensor of the electronic device; and displaying a UI including the object in which the first size is converted to a second size in the UI, based on the detected movement and the information indicating the input frequency.
  • the storing of the information indicating the input frequency may include determining an input frequency value of user inputs detected within an area of the object, and storing, in the electronic device, heat map information for the object generated based on the determined input frequency value.
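  • The heat map storage described above can be sketched as follows in Kotlin; this is an illustrative assumption (the classes Rect and InputHeatMap are not from the disclosure) showing only how touches inside an object's area could be counted into per-object input frequency values.

// Illustrative sketch (assumption): count touch inputs that land inside each object's
// bounds and keep the per-object frequency values as simple heat map data.
data class Rect(val left: Int, val top: Int, val right: Int, val bottom: Int) {
    fun contains(x: Int, y: Int) = x in left..right && y in top..bottom
}

class InputHeatMap(private val objectBounds: Map<String, Rect>) {
    private val frequency = mutableMapOf<String, Int>()

    // Record a touch; if it falls inside an object's area, increase that object's frequency value.
    fun onTouch(x: Int, y: Int) {
        for ((id, bounds) in objectBounds) {
            if (bounds.contains(x, y)) {
                frequency[id] = (frequency[id] ?: 0) + 1
                return
            }
        }
    }

    fun frequencyOf(objectId: String): Int = frequency[objectId] ?: 0
}

fun main() {
    val heatMap = InputHeatMap(mapOf("play" to Rect(0, 0, 100, 100)))
    heatMap.onTouch(50, 50)
    heatMap.onTouch(60, 40)
    println(heatMap.frequencyOf("play"))  // 2
}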
  • the displaying of the UI including the object having the first size converted to the second size may include checking whether the input frequency value of the object is equal to or greater than a predetermined reference frequency value, and, if the input frequency value of the object is equal to or greater than the predetermined reference frequency value, displaying a UI including the object in which the first size is converted to the second size.
  • the displaying of the UI including the object having the first size converted to the second size may include determining a level corresponding to an input frequency value of the object, and displaying a UI including the object in which the first size is converted to the second size or a third size according to the level, wherein the third size is larger than the second size, and the second size is larger than the first size.
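  • The reference-frequency check and the level-based size selection can be combined into one illustrative Kotlin function; the threshold values below are assumptions, not values given in the disclosure.

// Illustrative sketch (thresholds are assumptions): pick the displayed size from the
// object's input frequency value, using a reference frequency and a second, higher level.
fun displayedSize(firstSize: Int, secondSize: Int, thirdSize: Int,
                  inputFrequency: Int,
                  referenceFrequency: Int = 50,
                  higherLevelFrequency: Int = 200): Int =
    when {
        inputFrequency >= higherLevelFrequency -> thirdSize   // highest level
        inputFrequency >= referenceFrequency -> secondSize    // at or above the reference value
        else -> firstSize                                     // below the reference value
    }

fun main() {
    println(displayedSize(48, 72, 96, inputFrequency = 30))   // 48
    println(displayedSize(48, 72, 96, inputFrequency = 120))  // 72
    println(displayedSize(48, 72, 96, inputFrequency = 300))  // 96
}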
  • the storing of information indicating the input frequency may include determining a user input detected in an area other than the object to be an incorrect input, storing heat map information on the incorrect input in the electronic device, and displaying the UI including the object having the first size converted to the second size, based on the detected movement and a frequency value of the incorrect input.
  • the method may further include displaying a UI including the object whose shape is converted based on the wrong input.
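  • One way to picture the incorrect-input handling and the shape conversion is the hypothetical Kotlin sketch below; the 40-pixel miss region, the miss threshold, and the one-sided stretch are assumptions chosen only to illustrate the idea of deforming the object toward repeated misses.

// Illustrative sketch (assumption): touches that land near, but outside, an object's area
// are counted as incorrect inputs; when they accumulate, the object's bounds are stretched
// toward the missed touches so that its shape (not only its size) changes.
data class Bounds(var left: Int, var top: Int, var right: Int, var bottom: Int)

class MissTracker(private val bounds: Bounds, private val missThreshold: Int = 5) {
    private var missesRight = 0

    fun onTouch(x: Int, y: Int) {
        val inside = x in bounds.left..bounds.right && y in bounds.top..bounds.bottom
        // A touch just to the right of the object is treated as an incorrect input here.
        if (!inside && x > bounds.right && x <= bounds.right + 40 && y in bounds.top..bounds.bottom) {
            missesRight++
            if (missesRight >= missThreshold) bounds.right += 20  // stretch the shape toward the misses
        }
    }
}

fun main() {
    val bounds = Bounds(0, 0, 100, 100)
    val tracker = MissTracker(bounds)
    repeat(5) { tracker.onTouch(110, 50) }
    println(bounds)  // right edge extended after repeated incorrect inputs
}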
  • the method may further include displaying a UI including the object in which the first size is converted to the second size in the UI based on the detected movement and information previously stored in the electronic device.
  • the stored information may include at least one of size magnification information preset in the electronic device, information regarding the age, gender, vision, and disease history of the user of the electronic device.
  • the method may further include determining whether the user wears glasses, and the displaying of the UI including the object in which the first size is converted to the second size may include, when it is confirmed that the user does not wear the glasses, displaying the UI including the object having the first size converted to the second size.
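  • A hedged Kotlin sketch of how the glasses check and stored user information (eg, eyesight) could influence the enlargement is given below; the eyesight cutoff and the ratio values are assumptions and not part of the disclosure.

// Illustrative sketch (assumption): keep the first size while glasses are worn, and pick a
// larger ratio when the user is detected without glasses, optionally weighted by eyesight.
data class UserProfile(val age: Int, val eyesight: Double)

fun enlargementRatio(profile: UserProfile, wearsGlasses: Boolean): Double =
    when {
        wearsGlasses -> 1.0              // keep the first size
        profile.eyesight < 0.5 -> 2.0    // hypothetical: weaker eyesight, larger ratio
        else -> 1.5
    }

fun main() {
    println(enlargementRatio(UserProfile(age = 52, eyesight = 0.4), wearsGlasses = false))  // 2.0
    println(enlargementRatio(UserProfile(age = 52, eyesight = 0.4), wearsGlasses = true))   // 1.0
}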
  • the detecting of the movement of the electronic device may include measuring, through at least one sensor included in the electronic device, a speed at which the electronic device moves, determining that the electronic device is in a normal mode when the measured speed is less than a predetermined first speed threshold, determining that the electronic device is in a walking mode when the measured speed is greater than or equal to the first speed threshold and less than a predetermined second speed threshold, and determining that the electronic device is in a running mode when the measured speed is greater than or equal to the second speed threshold. The displaying may include displaying, in the UI, the object whose first size is converted to the second size when the electronic device is in the walking mode, and the object whose first size is converted to a third size when the electronic device is in the running mode. The third size may be larger than the second size, and the second size may be larger than the first size.
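  • The speed-threshold classification above maps naturally to a small Kotlin sketch; the threshold values (0.5 m/s and 2.5 m/s) and the example sizes are assumptions, not values specified in the disclosure.

// Illustrative sketch (threshold values are assumptions): classify the device's motion
// from the measured speed and map each mode to the size used for the displayed object.
enum class Mode { NORMAL, WALKING, RUNNING }

fun classify(speedMps: Double,
             firstThreshold: Double = 0.5,
             secondThreshold: Double = 2.5): Mode =
    when {
        speedMps < firstThreshold -> Mode.NORMAL
        speedMps < secondThreshold -> Mode.WALKING
        else -> Mode.RUNNING
    }

fun sizeFor(mode: Mode, firstSize: Int, secondSize: Int, thirdSize: Int): Int =
    when (mode) {
        Mode.NORMAL -> firstSize
        Mode.WALKING -> secondSize
        Mode.RUNNING -> thirdSize
    }

fun main() {
    val mode = classify(1.4)            // e.g. derived from accelerometer or GPS data
    println(sizeFor(mode, 48, 72, 96))  // 72 in walking mode
}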
  • the detecting of the movement of the electronic device may include confirming that the electronic device has entered the vehicle, and the confirming may be performed by any one of the following processes: confirming that the electronic device has entered the interior by receiving a beacon signal from a beacon device installed in the vehicle, and confirming that the electronic device has entered the interior by performing proximity communication with a vehicle system installed in the vehicle.
  • FIG. 11 illustrates a network environment including an electronic device according to various embodiments of the present disclosure.
  • an electronic device 1101 may include a bus 1110, a processor 1120, a memory 1130, an input / output interface 1150, a display 1160, and a communication interface 1170.
  • the electronic device 1101 may omit at least one of the components or additionally include other components.
  • the bus 1110 may include circuits that connect the components 1110-1170 to each other and transfer communication (eg, control messages or data) between the components.
  • the processor 1120 may include one or more of a central processing unit (CPU), an application processor (AP), or a communication processor (CP).
  • the processor 1120 may execute an operation or data processing related to control and / or communication of at least one other element of the electronic device 1101.
  • the memory 1130 may include volatile and / or nonvolatile memory.
  • the memory 1130 may store commands or data related to at least one other element of the electronic device 1101.
  • the memory 1130 may store software and / or a program 1140.
  • the program 1140 may include, for example, a kernel 1141, middleware 1143, an application programming interface (API) 1145, and / or an application program (or “application”) 1147. At least a portion of kernel 1141, middleware 1143, or API 1145 may be referred to as an operating system.
  • the kernel 1141 may, for example, control or manage system resources (eg, the bus 1110, the processor 1120, or the memory 1130) used to execute an operation or function implemented in other programs (eg, the middleware 1143, the API 1145, or the application program 1147). In addition, the kernel 1141 may provide an interface through which the middleware 1143, the API 1145, or the application program 1147 can access individual components of the electronic device 1101 to control or manage the system resources.
  • the middleware 1143 may serve as an intermediary to allow the API 1145 or the application program 1147 to communicate with the kernel 1141 to exchange data.
  • the middleware 1143 may process one or more work requests received from the application program 1147 according to priority.
  • for example, the middleware 1143 may assign to at least one of the application programs 1147 a priority for using the system resources (eg, the bus 1110, the processor 1120, or the memory 1130) of the electronic device 1101, and may process the one or more work requests according to that priority.
  • the API 1145 is an interface through which the application 1147 controls functions provided by the kernel 1141 or the middleware 1143, and may include, for example, at least one interface or function (eg, a command) for file control, window control, image processing, or character control.
  • the input / output interface 1150 may, for example, transmit a command or data input from a user or another external device to the other component(s) of the electronic device 1101, or output a command or data received from the other component(s) of the electronic device 1101 to the user or another external device.
  • the display 1160 may include, for example, a liquid crystal display (LCD), a light emitting diode (LED) display, an organic light emitting diode (OLED) display, a microelectromechanical systems (MEMS) display, or an electronic paper display.
  • the display 1160 may display various contents (or objects, for example, text, images, videos, icons, and / or symbols, etc.) to the user.
  • the display 1160 may include a touch screen.
  • the display 1160 may receive a touch, gesture, proximity, or hovering input using an electronic pen or a part of a user's body.
  • the communication interface 1170 may establish communication between the electronic device 1101 and an external device (eg, the first external electronic device 1102, the second external electronic device 1104, or the server 1106).
  • the communication interface 1170 may be connected to the network 1162 through wireless or wired communication to communicate with an external device (eg, the second external electronic device 1104 or the server 1106).
  • the wireless communication may include, for example, at least one of long term evolution (LTE), LTE Advanced (LTE-A), code division multiple access (CDMA), wideband CDMA (WCDMA), universal mobile telecommunications system (UMTS), wireless broadband (WiBro), or global system for mobile communications (GSM).
  • the wireless communication may include, for example, at least one of wireless fidelity (WiFi), Bluetooth, Bluetooth low energy (BLE), Zigbee, near field communication (NFC), magnetic secure transmission, radio frequency (RF), or body area network (BAN).
  • the wireless communication may include a GNSS.
  • the GNSS may be, for example, a Global Positioning System (GPS), a Global Navigation Satellite System (Glonass), a Beidou Navigation Satellite System (hereinafter referred to as "Beidou"), or Galileo, the European global satellite-based navigation system.
  • Wired communication may include, for example, at least one of a universal serial bus (USB), a high definition multimedia interface (HDMI), recommended standard 232 (RS-232), power line communication, or a plain old telephone service (POTS).
  • the network 1162 may include at least one of a telecommunication network, for example, a computer network (eg, LAN or WAN), the Internet, or a telephone network.
  • Each of the first external electronic device 1102 and the second external electronic device 1104 may be the same or different type of device as the electronic device 1101.
  • all or part of operations executed in the electronic device 1101 may be executed in another or a plurality of electronic devices (for example, the electronic device 1102, the electronic device 1104, or the server 1106).
  • for example, when the electronic device 1101 needs to perform a function or service automatically or upon request, the electronic device 1101 may, instead of or in addition to executing the function or service by itself, request another device (eg, the electronic device 1102, the electronic device 1104, or the server 1106) to perform at least some functions associated therewith.
  • the other electronic device may execute the requested function or the additional function and transmit the result to the electronic device 1101.
  • the electronic device 1101 may process the received result as it is or additionally to provide the requested function or service.
  • cloud computing, distributed computing, or client-server computing technology may be used.
  • FIG. 12 is a block diagram of an electronic device 1201 according to various embodiments of the present disclosure.
  • the electronic device 1201 may include all or part of the electronic device 1101 illustrated in FIG. 11.
  • the electronic device 1201 may include at least one processor (eg, AP) 1210, a communication module 1220, a subscriber identification module 1224, a memory 1230, a sensor module 1240, an input device 1250, a display 1260, an interface 1270, an audio module 1280, a camera module 1291, a power management module 1295, a battery 1296, an indicator 1297, and a motor 1298.
  • the processor 1210 may, for example, run an operating system or an application program to control a plurality of hardware or software components connected to the processor 1210 and perform various data processing and operations.
  • the processor 1210 may be implemented with, for example, a system on chip (SoC).
  • the processor 1210 may further include a graphic processing unit (GPU) and / or an image signal processor.
  • the processor 1210 may include at least some of the components illustrated in FIG. 12 (eg, the cellular module 1221).
  • the processor 1210 may load and process instructions or data received from at least one of the other components (eg, nonvolatile memory) into the volatile memory, and store the result data in the nonvolatile memory.
  • the communication module 1220 may have a configuration that is the same as or similar to that of the communication interface 1170 illustrated in FIG. 11, for example.
  • the communication module 1220 may include, for example, a cellular module 1221, a WiFi module 1223, a Bluetooth module 1225, a GNSS module 1227, an NFC module 1228, and an RF module 1229.
  • the cellular module 1221 may provide, for example, a voice call, a video call, a text service, or an internet service through a communication network.
  • the cellular module 1221 may perform identification and authentication of the electronic device 1201 in a communication network using a subscriber identification module (eg, a SIM card) 1224.
  • the cellular module 1221 may perform at least some of the functions that the processor 1210 may provide.
  • the cellular module 1221 may include a communication processor (CP).
  • at least some (eg, two or more) of the cellular module 1221, the WiFi module 1223, the Bluetooth module 1225, the GNSS module 1227, or the NFC module 1228 may be included in one integrated chip (IC) or IC package.
  • the RF module 1229 may transmit / receive, for example, a communication signal (eg, an RF signal).
  • the RF module 1229 may include, for example, a transceiver, a power amp module (PAM), a frequency filter, a low noise amplifier (LNA), an antenna, or the like.
  • at least one of the cellular module 1221, the WiFi module 1223, the Bluetooth module 1225, the GNSS module 1227, or the NFC module 1228 may transmit and receive an RF signal through a separate RF module.
  • Subscriber identification module 1224 may include, for example, a card or embedded SIM that includes a subscriber identification module, and may contain unique identification information (such as an integrated circuit card identifier (ICCID)) or subscriber information (such as an international mobile subscriber identity (IMSI)).
  • the memory 1230 may include, for example, an internal memory 1232 or an external memory 1234.
  • the internal memory 1232 may include, for example, at least one of volatile memory (such as DRAM, SRAM, or SDRAM) and nonvolatile memory (such as one time programmable ROM (OTPROM), PROM, EPROM, EEPROM, mask ROM, flash ROM, flash memory, a hard drive, or a solid state drive (SSD)).
  • the external memory 1234 may include a flash drive, for example, compact flash (CF), secure digital (SD), Micro-SD, Mini-SD, extreme digital (xD), a multi-media card (MMC), a memory stick, or the like.
  • the external memory 1234 may be functionally or physically connected to the electronic device 1201 through various interfaces.
  • the sensor module 1240 may measure a physical quantity or detect an operation state of the electronic device 1201 to convert the measured or detected information into an electrical signal.
  • the sensor module 1240 may include, for example, at least one of a gesture sensor 1240A, a gyro sensor 1240B, a barometric pressure sensor 1240C, a magnetic sensor 1240D, an acceleration sensor 1240E, a grip sensor 1240F, a proximity sensor 1240G, a color sensor 1240H (e.g., a red, green, blue (RGB) sensor), a biometric sensor 1240I, a temperature / humidity sensor 1240J, an illuminance sensor 1240K, or an ultraviolet (UV) sensor 1240M.
  • additionally or alternatively, the sensor module 1240 may include, for example, an e-nose sensor, an electromyography (EMG) sensor, an electroencephalogram (EEG) sensor, an electrocardiogram (ECG) sensor, an infrared (IR) sensor, an iris sensor, and / or a fingerprint sensor.
  • the sensor module 1240 may further include a control circuit for controlling at least one or more sensors belonging thereto.
  • the electronic device 1201 may further include a processor, configured as a part of the processor 1210 or separately therefrom, to control the sensor module 1240 while the processor 1210 is in a sleep state.
  • the input device 1250 may include, for example, a touch panel 1252, a (digital) pen sensor 1254, a key 1256, or an ultrasonic input device 1258.
  • the touch panel 1252 may use at least one of capacitive, resistive, infrared, or ultrasonic methods, for example.
  • the touch panel 1252 may further include a control circuit.
  • the touch panel 1252 may further include a tactile layer to provide a tactile response to the user.
  • the (digital) pen sensor 1254 may be, for example, part of a touch panel or may include a separate recognition sheet.
  • the key 1256 may include, for example, a physical button, an optical key, or a keypad.
  • the ultrasonic input device 1258 may detect ultrasonic waves generated by an input tool through a microphone (for example, a microphone 1288) and check data corresponding to the detected ultrasonic waves.
  • the display 1260 may include a panel 1262, a hologram device 1264, a projector 1266, and / or a control circuit for controlling them.
  • the panel 1262 may be implemented to be, for example, flexible, transparent, or wearable.
  • the panel 1262 may be configured with the touch panel 1252 and one or more modules.
  • the panel 1262 may include a pressure sensor (or force sensor) capable of measuring the intensity of pressure of a user's touch.
  • the pressure sensor may be integrated with the touch panel 1252 or may be implemented with one or more sensors separate from the touch panel 1252.
  • the hologram device 1264 may show a stereoscopic image in the air by using interference of light.
  • the projector 1266 may display an image by projecting light onto a screen.
  • the screen may be located inside or outside the electronic device 1201.
  • the interface 1270 may include, for example, an HDMI 1272, a USB 1274, an optical interface 1276, or a D-subminiature 1278.
  • the interface 1270 may be included in, for example, the communication interface 1170 illustrated in FIG. 11. Additionally or alternatively, the interface 1270 may include, for example, a mobile high-definition link (MHL) interface, an SD card / multi-media card (MMC) interface, or an infrared data association (IrDA) standard interface.
  • the audio module 1280 may convert, for example, sound into an electrical signal and vice versa. At least some components of the audio module 1280 may be included in, for example, the input / output interface 1150 illustrated in FIG. 11.
  • the audio module 1280 may process sound information input or output through, for example, a speaker 1282, a receiver 1284, an earphone 1286, a microphone 1288, or the like.
  • the camera module 1291 is, for example, a device capable of capturing still images and moving images. According to an embodiment, the camera module 1291 may include at least one image sensor (eg, a front sensor or a rear sensor), a lens, an image signal processor (ISP), or a flash (eg, an LED or a xenon lamp).
  • the power management module 1295 may manage, for example, power of the electronic device 1201.
  • the power management module 1295 may include a power management integrated circuit (PMIC), a charger IC, or a battery or fuel gauge.
  • the PMIC may have a wired and / or wireless charging scheme.
  • the wireless charging method may include, for example, a magnetic resonance method, a magnetic induction method, or an electromagnetic wave method, and may further include additional circuits for wireless charging, such as a coil loop, a resonance circuit, or a rectifier.
  • the battery gauge may measure, for example, the remaining capacity of the battery 1296, the voltage, the current, or the temperature during charging.
  • Battery 1296 may include, for example, a rechargeable cell and / or a solar cell.
  • the indicator 1297 may display a specific state of the electronic device 1201 or a portion thereof (for example, the processor 1210), for example, a booting state, a message state, or a charging state.
  • the motor 1298 may convert electrical signals into mechanical vibrations, and may generate vibrations or haptic effects.
  • the electronic device 1201 may include, for example, a mobile TV supporting device (eg, a GPU) capable of processing media data according to a standard such as digital multimedia broadcasting (DMB), digital video broadcasting (DVB), or mediaFlo™.
  • Each of the components described in this document may be composed of one or more components, and the name of the corresponding component may vary according to the type of electronic device.
  • in various embodiments, the electronic device (for example, the electronic device 1201) may omit some components, further include additional components, or combine some of the components into a single entity that performs the same functions as the corresponding components before the combination.
  • the program module 1310 (eg, the program 1140) may include an operating system that controls resources related to the electronic device (eg, the electronic device 1101) and / or various applications (eg, the application program 1147) running on the operating system.
  • the operating system may include, for example, Android™, iOS™, Windows™, Symbian™, Tizen™, or Bada™.
  • the program module 1310 may include a kernel 1320 (eg, the kernel 1141), middleware 1330 (eg, the middleware 1143), an API 1360 (eg, the API 1145), and / or an application 1370 (eg, the application program 1147). At least a part of the program module 1310 may be preloaded on the electronic device or may be downloaded from an external electronic device (eg, the electronic device 1102, the electronic device 1104, or the server 1106).
  • the kernel 1320 may include, for example, a system resource manager 1321 and / or a device driver 1323.
  • the system resource manager 1321 may perform control, allocation, or retrieval of system resources.
  • the system resource manager 1321 may include a process manager, a memory manager, or a file system manager.
  • the device driver 1323 may include, for example, a display driver, a camera driver, a Bluetooth driver, a shared memory driver, a USB driver, a keypad driver, a WiFi driver, an audio driver, or an inter-process communication (IPC) driver.
  • the middleware 1330 may provide, for example, a function commonly required by the application 1370, or provide various functions to the application 1370 through the API 1360 so that the application 1370 may use limited system resources inside the electronic device.
  • the middleware 1330 may include, for example, at least one of a runtime library 1335, an application manager 1341, a window manager 1342, a multimedia manager 1343, a resource manager 1344, a power manager 1345, a database manager 1346, a package manager 1347, a connectivity manager 1348, a notification manager 1349, a location manager 1350, a graphic manager 1351, or a security manager 1352.
  • the runtime library 1335 may include, for example, a library module that the compiler uses to add new functionality through the programming language while the application 1370 is running.
  • the runtime library 1335 may perform input / output management, memory management, or arithmetic function processing.
  • the application manager 1341 may manage, for example, the life cycle of the application 1370.
  • the window manager 1342 may manage GUI resources used on the screen.
  • the multimedia manager 1343 may identify a format required for playing the media files, and may encode or decode the media file using a codec suitable for the format.
  • the resource manager 1344 may manage space of source code or memory of the application 1370.
  • the power manager 1345 may manage, for example, the capacity or power of the battery and provide power information necessary for the operation of the electronic device.
  • the power manager 1345 may interwork with a basic input / output system (BIOS).
  • the database manager 1346 may create, retrieve, or change a database to be used, for example, in the application 1370.
  • the package manager 1347 may manage installation or update of an application distributed in the form of a package file.
  • the connectivity manager 1348 may manage, for example, a wireless connection.
  • the notification manager 1349 may provide an event to the user, for example, an arrival message, an appointment, a proximity notification, and the like.
  • the location manager 1350 may manage location information of the electronic device, for example.
  • the graphic manager 1351 may manage, for example, graphic effects to be provided to the user or a user interface related thereto.
  • the security manager 1352 may provide system security or user authentication, for example.
  • the middleware 1330 may include a telephony manager for managing a voice or video telephony function of the electronic device or a middleware module capable of forming a combination of functions of the above-described components.
  • the middleware 1330 may provide a module specialized for each type of OS.
  • the middleware 1330 may dynamically delete some of the existing components or add new components.
  • API 1360 is a set of API programming functions, for example, and may be provided in different configurations depending on the operating system. For example, in the case of Android or iOS, one API set may be provided for each platform, and in Tizen, two or more API sets may be provided for each platform.
  • the applications 1370 may include, for example, home 1371, dialer 1372, SMS / MMS 1373, instant message (IM) 1374, browser 1375, camera 1376, alarm 1377, contact 1378, voice dial 1379, email 1380, calendar 1381, media player 1382, album 1383, watch 1384, health care (eg, measuring exercise or blood glucose), or environmental information (eg, barometric pressure, humidity, or temperature information) applications.
  • the application 1370 may include an information exchange application capable of supporting information exchange between the electronic device and the external electronic device.
  • the information exchange application may include, for example, a notification relay application for delivering specific information to the external electronic device, or a device management application for managing the external electronic device.
  • the notification delivery application may deliver notification information generated by another application of the electronic device to the external electronic device, or receive notification information from the external electronic device and provide the notification information to the user.
  • the device management application may, for example, control a function of an external electronic device communicating with the electronic device (eg, turning on / off the external electronic device itself (or some components thereof) or adjusting the brightness (or resolution) of its display), or install, delete, or update an application running on the external electronic device.
  • the application 1370 may include an application (eg, a health care application of a mobile medical device) designated according to an attribute of the external electronic device.
  • the application 1370 may include an application received from an external electronic device.
  • At least some of the program module 1310 may be implemented (eg, executed) in software, firmware, hardware (eg, the processor 1210), or a combination of at least two or more thereof, and may include a module, a program, a routine, an instruction set, or a process for performing one or more functions.
  • The term "module" includes a unit composed of hardware, software, or firmware, and may be used interchangeably with terms such as logic, logic block, component, or circuit.
  • the module may be an integrally formed part or a minimum unit or part of performing one or more functions.
  • Modules may be implemented mechanically or electronically and may include, for example, an application-specific integrated circuit (ASIC) chip, field-programmable gate arrays (FPGAs), or a programmable logic device, known or to be developed, that performs certain operations.
  • At least a portion of an apparatus (e.g., modules or functions thereof) or method (e.g., operations) according to various embodiments may be implemented with instructions stored on a computer-readable storage medium (e.g., the memory 1130) in the form of a program module. When the instructions are executed by a processor (e.g., the processor 1120), the processor may perform a function corresponding to the instructions.
  • Computer-readable recording media include hard disks, floppy disks, magnetic media (e.g. magnetic tape), optical recording media (e.g. CD-ROM, DVD), magneto-optical media (e.g. floptical disks), internal memory, and the like.
  • Instructions can include code generated by a compiler or code that can be executed by an interpreter.
  • Modules or program modules may include at least one or more of the above-described components, some of which may be omitted, or may further include other components. According to various embodiments, operations performed by a module, a program module, or another component may be executed sequentially, in parallel, repeatedly, or heuristically, or at least some of the operations may be executed in a different order or omitted, or another operation may be added.
  • according to various embodiments, an electronic device providing a user interface includes a display for displaying a UI and a processor coupled to the display, wherein the processor is configured to display a UI including an object having a first size for executing at least one function of an application executed in the electronic device, to detect a movement of the electronic device based on data acquired through a sensor of the electronic device, and to display a UI including the object in which the first size is converted to a second size, based on the detected movement and information, stored in the electronic device, indicating an input frequency of user inputs for the object.
  • the processor is further configured to determine an input frequency value of user inputs sensed in the area of the object and to store, in the electronic device, heat map information for the object generated based on the determined input frequency value.
  • the processor may check whether the input frequency value of the object is greater than or equal to a predetermined reference frequency value, and, if the input frequency value of the object is greater than or equal to the predetermined reference frequency value, display the UI including the object in which the first size is converted to the second size.
  • the processor determines a level corresponding to the input frequency value of the object and displays the UI including the object in which the first size is converted to the second size or a third size according to the level, wherein the third size is larger than the second size and the second size is larger than the first size.
  • the processor may determine that a user input detected in an area other than the object is an incorrect input, determine a frequency value for the incorrect input, and, based on the detected movement and the frequency value for the incorrect input, display a UI including the object whose shape is converted based on the incorrect input.
  • the processor is further configured to display a UI including the object in which the first size is converted to the second size in the UI based on the detected movement and information previously stored in the electronic device.
  • the pre-stored information includes at least one of size magnification information preset in the electronic device, information on an age, gender, and visual acuity of a user of the electronic device.
  • the processor is further configured to acquire a face image of the user using an image sensor included in the electronic device, to determine whether the user wears glasses based on the acquired face image, and, if it is determined that the user does not wear the glasses, to display a UI including the object in which the first size is converted to the second size.
  • the processor measures, through the sensor included in the electronic device, a speed at which the electronic device moves, determines that the electronic device is in a normal mode when the measured speed is less than a predetermined first speed threshold, determines that the electronic device is in a walking mode when the measured speed is greater than or equal to the first speed threshold and less than a predetermined second speed threshold, determines that the electronic device is in a running mode when the measured speed is greater than or equal to the second speed threshold, and, when the electronic device is in the walking mode, displays the UI including the object in which the first size is converted to the second size.
  • the processor is further configured to confirm that the electronic device has entered the vehicle by performing any one of the following operations: confirming that the electronic device has entered the interior by receiving a beacon signal from a beacon device installed in the vehicle, and confirming that the electronic device has entered the interior by performing proximity communication with a vehicle system installed in the vehicle.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Databases & Information Systems (AREA)
  • Software Systems (AREA)
  • Data Mining & Analysis (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Multimedia (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

According to various embodiments, the present invention relates to an electronic device and a method for providing a user interface (UI). The electronic device for providing a UI comprises: a display for displaying a UI; and a processor connected to the display, the processor being configured to: display a UI including an object having a first size, for executing at least one function of an application executed by the electronic device; detect a movement of the electronic device based on data acquired through a sensor of the electronic device; and display a UI including the object whose size is changed in the UI from the first size to a second size, based on the detected movement and information indicating the frequency of user inputs detected on the object.
PCT/KR2018/000610 2017-01-12 2018-01-12 Appareil et procédé de fourniture d'interface utilisateur adaptative WO2018131928A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
EP18738701.4A EP3565284B1 (fr) 2017-01-12 2018-01-12 Appareil et procédé de fourniture d'interface utilisateur adaptative
US16/477,851 US10852904B2 (en) 2017-01-12 2018-01-12 Apparatus and method for providing adaptive user interface

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR10-2017-0005430 2017-01-12
KR1020170005430A KR102488580B1 (ko) 2017-01-12 2017-01-12 적응적인 사용자 인터페이스를 제공하기 위한 장치 및 방법

Publications (1)

Publication Number Publication Date
WO2018131928A1 true WO2018131928A1 (fr) 2018-07-19

Family

ID=62840129

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2018/000610 WO2018131928A1 (fr) 2017-01-12 2018-01-12 Appareil et procédé de fourniture d'interface utilisateur adaptative

Country Status (4)

Country Link
US (1) US10852904B2 (fr)
EP (1) EP3565284B1 (fr)
KR (1) KR102488580B1 (fr)
WO (1) WO2018131928A1 (fr)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7070495B2 (ja) * 2019-04-18 2022-05-18 カシオ計算機株式会社 電子機器、停止判定方法、および停止判定プログラム
KR20210152789A (ko) * 2020-06-09 2021-12-16 삼성전자주식회사 디스플레이 장치, 디스플레이 장치 제어방법 및 디스플레이 시스템
WO2022108019A1 (fr) 2020-11-23 2022-05-27 Samsung Electronics Co., Ltd. Dispositif électronique et procédé d'optimisation d'interface utilisateur d'application
WO2023106862A1 (fr) * 2021-12-08 2023-06-15 삼성전자 주식회사 Dispositif électronique et procédé de fonctionnement d'un dispositif électronique
US12074956B2 (en) 2021-12-08 2024-08-27 Samsung Electronics Co., Ltd. Electronic device and method for operating thereof

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080074384A1 (en) * 2006-09-22 2008-03-27 Research In Motion Limited System and method for adjusting icons, text and images on an electronic device
KR20130065317A (ko) * 2011-12-09 2013-06-19 도시바삼성스토리지테크놀러지코리아 주식회사 그래픽 사용자 인터페이스를 제공하는 방법 및 장치
US20130234929A1 (en) * 2012-03-07 2013-09-12 Evernote Corporation Adapting mobile user interface to unfavorable usage conditions
KR101367060B1 (ko) * 2009-12-23 2014-02-24 모토로라 모빌리티 엘엘씨 시각적 보상을 위한 방법 및 장치
KR20160060386A (ko) * 2014-11-20 2016-05-30 삼성전자주식회사 윈도우의 크기를 변경하는 디바이스 및 그 제어 방법

Family Cites Families (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7401300B2 (en) 2004-01-09 2008-07-15 Nokia Corporation Adaptive user interface input device
US20070277111A1 (en) * 2006-02-01 2007-11-29 Wright State University Graphical user interface using perception-action icons and a method thereof
US20100146444A1 (en) 2008-12-05 2010-06-10 Microsoft Corporation Motion Adaptive User Interface Service
CN102549533A (zh) 2009-09-25 2012-07-04 日本电气株式会社 输入接收设备、输入接收方法、记录介质和移动通信终端
JP5566676B2 (ja) * 2009-12-18 2014-08-06 富士通コンポーネント株式会社 タッチパネル、及びタッチパネルの座標検出方法
AU2010241260B2 (en) * 2010-10-29 2013-12-19 Canon Kabushiki Kaisha Foreground background separation in a scene with unstable textures
US9536197B1 (en) * 2011-04-22 2017-01-03 Angel A. Penilla Methods and systems for processing data streams from data producing objects of vehicle and home entities and generating recommendations and settings
JP5902505B2 (ja) * 2012-02-21 2016-04-13 京セラ株式会社 携帯端末
US20150089360A1 (en) * 2013-09-25 2015-03-26 At&T Mobility Ii Llc Intelligent Adaptation of User Interfaces
WO2015044830A1 (fr) 2013-09-27 2015-04-02 Visuality Imaging Ltd Procédés et système permettant d'améliorer la lisibilité d'un texte affiché sur l'écran d'un dispositif électronique
US10353567B2 (en) * 2013-11-28 2019-07-16 Kyocera Corporation Electronic device
JP6201770B2 (ja) * 2014-01-15 2017-09-27 富士通株式会社 ジェスチャui装置、ジェスチャui方法及びプログラム
US20160080438A1 (en) * 2014-08-04 2016-03-17 Place Pixel Inc. Method and Apparatus for Tile-Based Geographic Social Interaction
US9367129B1 (en) 2015-02-05 2016-06-14 Wipro Limited Method and system for controlling display of content to user
JP6206459B2 (ja) * 2015-09-02 2017-10-04 カシオ計算機株式会社 ネットワークシステム、情報機器、表示方法並びにプログラム
KR102354328B1 (ko) * 2015-09-22 2022-01-21 삼성전자주식회사 영상 표시 장치 및 그 동작 방법
US10409480B2 (en) * 2016-12-28 2019-09-10 Amazon Technologies, Inc. Interruption and resumption of feedback animation for touch-based interactions

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080074384A1 (en) * 2006-09-22 2008-03-27 Research In Motion Limited System and method for adjusting icons, text and images on an electronic device
KR101367060B1 (ko) * 2009-12-23 2014-02-24 모토로라 모빌리티 엘엘씨 시각적 보상을 위한 방법 및 장치
KR20130065317A (ko) * 2011-12-09 2013-06-19 도시바삼성스토리지테크놀러지코리아 주식회사 그래픽 사용자 인터페이스를 제공하는 방법 및 장치
US20130234929A1 (en) * 2012-03-07 2013-09-12 Evernote Corporation Adapting mobile user interface to unfavorable usage conditions
KR20160060386A (ko) * 2014-11-20 2016-05-30 삼성전자주식회사 윈도우의 크기를 변경하는 디바이스 및 그 제어 방법

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP3565284A4 *

Also Published As

Publication number Publication date
US20190361588A1 (en) 2019-11-28
EP3565284A4 (fr) 2020-02-19
EP3565284C0 (fr) 2023-12-27
EP3565284A1 (fr) 2019-11-06
EP3565284B1 (fr) 2023-12-27
KR102488580B1 (ko) 2023-01-13
US10852904B2 (en) 2020-12-01
KR20180083185A (ko) 2018-07-20

Similar Documents

Publication Publication Date Title
WO2018084580A1 (fr) Dispositif d'exécution de charge par voie sans fil et son procédé
WO2017188577A1 (fr) Procédé de commande de charge de batterie et dispositif électronique associé
WO2017209528A1 (fr) Appareil électronique et son procédé de fonctionnement
WO2018093060A1 (fr) Dispositif électronique et procédé de commande de dispositif électronique
WO2018128389A1 (fr) Dispositif électronique et procédé d'affichage d'historique d'application exécutée dans celui-ci
WO2016204551A1 (fr) Dispositif et procédé de fourniture d'une notification concernant l'état de charge sans fil
AU2018216529B2 (en) Method for switching applications, and electronic device thereof
WO2018131928A1 (fr) Appareil et procédé de fourniture d'interface utilisateur adaptative
WO2018074898A1 (fr) Procédé et dispositif électronique permettant de commander une puissance d'émission
WO2017142195A1 (fr) Dispositif électronique et procédé de commutation et d'alignement d'applications correspondantes
WO2018155928A1 (fr) Dispositif électronique permettant d'effectuer une authentification à l'aide de multiples capteurs biométriques et son procédé de fonctionnement
WO2018034544A1 (fr) Procédé de commande de connexion de réseau de communication, support d'informations et dispositif électronique associé
WO2018004275A1 (fr) Procédé pour déterminer le rôle d'un dispositif électronique, et dispositif électronique correspondant
WO2018128460A1 (fr) Procédé d'affichage d'image d'écran et son dispositif électronique
WO2018143643A1 (fr) Dispositif électronique et procédé de commande de bio-capteur connecté à un affichage à l'aide de celui-ci
WO2017175962A1 (fr) Dispositif électronique permettant l'affichage d'image et son procédé de commande
WO2018034416A1 (fr) Dispositif électronique et procédé d'affichage d'image du dispositif électronique
WO2017082554A1 (fr) Dispositif électronique pour détecter un dispositif accessoire et son procédé de fonctionnement
WO2018106022A2 (fr) Dispositif électronique et procédé de commande de chaleur générée sur la surface d'un dispositif électronique
WO2018124775A1 (fr) Procédé de permettant de relier un dispositif externe et dispositif électronique le prenant en charge
WO2018021726A1 (fr) Dispositif électronique et procédé de commande d'activation d'un module de caméra
WO2017126767A1 (fr) Dispositif électronique et procédé pour faire fonctionner le dispositif électronique
WO2017047967A1 (fr) Dispositif électronique et procédé de commande associé
WO2017183931A1 (fr) Dispositif électronique et procédé de commande du dispositif électronique
WO2017138722A1 (fr) Dispositif électronique et procédé pour fournir des informations d'itinéraire

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18738701

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: 2018738701

Country of ref document: EP

Effective date: 20190801