WO2017142341A1 - Method for integrating and providing collected data from multiple devices and electronic device for implementing same


Info

Publication number
WO2017142341A1
WO2017142341A1 (PCT/KR2017/001752)
Authority
WO
WIPO (PCT)
Prior art keywords
processor
data
electronic device
activity
steps
Prior art date
Application number
PCT/KR2017/001752
Other languages
French (fr)
Inventor
Kwangjo LEE
Hyunsu Kim
Jihye SON
Moonbae SONG
Sanghwa Lee
Woosang LEE
Jae-Hwan Lee
Jaeyoun Jeong
Bum-Sung CHO
Original Assignee
Samsung Electronics Co., Ltd.
Priority date
Filing date
Publication date
Application filed by Samsung Electronics Co., Ltd.
Priority to EP17753502.8A (published as EP3403166A4)
Priority to CN201780012097.XA (published as CN108701495B)
Priority to MYPI2018702903A (published as MY193558A)
Publication of WO2017142341A1

Classifications

    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 50/00 ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H 50/30 ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for calculating health indices; for individual health risk assessment
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63B APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B 24/00 Electric or electronic controls for exercising apparatus of preceding groups; Controlling or monitoring of exercises, sportive games, training or athletic performances
    • A63B 24/0075 Means for generating exercise programs or schemes, e.g. computerized virtual trainer, e.g. using expert databases
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 22/00 Measuring distance traversed on the ground by vehicles, persons, animals or other moving solid bodies, e.g. using odometers, using pedometers
    • G01C 22/02 Measuring distance traversed on the ground by vehicles, persons, animals or other moving solid bodies, e.g. using odometers, using pedometers by conversion into electric waveforms and subsequent integration, e.g. using tachometer generator
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 20/00 ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance
    • G16H 20/30 ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance relating to physical therapies or activities, e.g. physiotherapy, acupressure or exercising
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 40/00 ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
    • G16H 40/60 ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices
    • G16H 40/63 ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices for local operation
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M 1/00 Substation equipment, e.g. for use by subscribers
    • H04M 1/72 Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M 1/724 User interfaces specially adapted for cordless or mobile telephones
    • H04M 1/72403 User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W 4/00 Services specially adapted for wireless communication networks; Facilities therefor
    • H04W 4/02 Services making use of location information
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W 4/00 Services specially adapted for wireless communication networks; Facilities therefor
    • H04W 4/02 Services making use of location information
    • H04W 4/029 Location-based management or tracking services
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/02 Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for, e.g. using combinations of techniques provided for in this group with electrocardiography or electroauscultation; Heart catheters for measuring blood pressure
    • A61B 5/024 Detecting, measuring or recording pulse rate or heart rate
    • A61B 5/02438 Detecting, measuring or recording pulse rate or heart rate with portable devices, e.g. worn by the patient
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/103 Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B 5/11 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B 5/1112 Global tracking of patients, e.g. by using GPS
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/103 Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B 5/11 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B 5/1118 Determining activity level
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M 1/00 Substation equipment, e.g. for use by subscribers
    • H04M 1/72 Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M 1/724 User interfaces specially adapted for cordless or mobile telephones
    • H04M 1/72403 User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
    • H04M 1/72409 User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality by interfacing with external accessories
    • H04M 1/72412 User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality by interfacing with external accessories using two-way short-range wireless interfaces
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M 1/00 Substation equipment, e.g. for use by subscribers
    • H04M 1/72 Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M 1/724 User interfaces specially adapted for cordless or mobile telephones
    • H04M 1/72448 User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions
    • H04M 1/72457 User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions according to geographic location
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M 2250/00 Details of telephonic subscriber devices
    • H04M 2250/12 Details of telephonic subscriber devices including a sensor for measuring a physical value, e.g. temperature or motion

Definitions

  • Various embodiments relate to a method for integrating and providing collected data from multiple devices, and an electronic device for implementing the same.
  • the electronic device provides various functions such as voice calls, message transmission (e.g., Short Message Service (SMS)/Multimedia Message Service (MMS)), video calls, an electronic organizer, photography, email transmission/reception, broadcast reproduction, Internet access, music reproduction, schedule management, Social Networking Services (SNS), messengers, dictionaries, games, and the like.
  • wearable devices that measure user activity, or applications that show the measured user activity are actively being developed.
  • the wearable devices may also be widely used for medical services in the future. Since conventional user activity display technology usually displays data measured by a single device (e.g., a wearable device), it cannot provide a seamless user experience when a user uses a plurality of devices.
  • An electronic device includes: a housing; a display exposed through a part of the housing; a first motion sensor disposed within the housing and configured to detect the movement of the housing; a wireless communication circuit disposed within the housing; a processor disposed within the housing and electrically connected to the display, the first motion sensor, and the wireless communication circuit; and a memory electrically connected to the processor, wherein the memory stores instructions which, when executed by a processor, cause the processor to perform operations including: generating a wireless communication channel with an external electronic device including a second motion sensor, using the wireless communication circuit; monitoring the movement of the housing using the first motion sensor, so as to generate first data for a first time period; receiving second data acquired for the first time period through the wireless communication channel, using the second motion sensor; calculating, as a value for the first time period, a value smaller than the sum of a first value based on the first data and a second value based on the second data; and displaying the calculated value through a user interface displayed on the display.
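  • As an illustration of the calculation above, a merge that avoids double counting yields a period total smaller than the simple sum of the two devices' values. The sketch below is a minimal, hypothetical Kotlin example; the bucket structure and the max-per-bucket rule are assumptions, not the claimed algorithm itself.

    // Hypothetical sketch (not the claimed algorithm): merging step counts from the
    // device's own motion sensor and an external wearable for the same time period
    // without double counting. Bucket size and the max-per-bucket rule are assumptions.
    data class StepBucket(val startMillis: Long, val steps: Int)

    fun mergedStepTotal(local: List<StepBucket>, remote: List<StepBucket>): Int =
        (local + remote)
            .groupBy { it.startMillis }                              // align buckets of the same interval
            .values
            .sumOf { sameInterval -> sameInterval.maxOf { it.steps } } // keep only the larger count

  Because overlapping intervals contribute only the larger of the two counts, the displayed total is never more than, and is typically smaller than, the sum of the two per-device totals.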
  • An electronic device includes: a housing; a display exposed through a part of the housing; a motion sensor disposed within the housing and configured to detect the movement of the housing; a wireless communication circuit disposed within the housing; a processor disposed within the housing and electrically connected to the display, the motion sensor, and the wireless communication circuit; and a memory electrically connected to the processor, wherein the memory stores instructions which, when executed by a processor, cause the processor to perform operations including: monitoring the movement of the housing using the motion sensor so as to generate first data for a first time period; determining a first attribute of the movement during a first session of the first time period, using a first portion of the first data; determining a second attribute of the movement during a second session of the first time period, using a second portion of the first data; selecting one of the first attribute and the second attribute; and displaying at least one of an image, a text, or a symbol representing the selected attribute through a user interface displayed on the display.
  • An electronic device includes a memory, a display, a communication interface, and a processor functionally connected to the memory, the display, or the communication interface, wherein the processor may be configured to acquire, through the communication interface, health related data collected from an external device, correct the acquired data per unit time, analyze the corrected data to extract activity information, store the extracted activity information in the memory, and display a user interface including the activity information on the display in response to a user request.
  • An operation method for an electronic device may include: acquiring health related data; correcting the acquired data per unit time; analyzing the corrected data to extract activity information; and displaying a user interface including the activity information in response to a user request.
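  • A compact sketch of this operation flow is given below, assuming a simple per-record format; the class names, the unit time, and the step-based analysis rule are illustrative assumptions rather than the method defined here.

    // Hypothetical outline of the described flow: acquire health related data,
    // correct it per unit time, extract activity information, and display it.
    data class HealthSample(val timestampMillis: Long, val steps: Int)

    fun correctAndExtract(raw: List<HealthSample>, unitMillis: Long = 10 * 60_000L): List<HealthSample> {
        // Correct the acquired data per unit time by merging samples into fixed buckets.
        val corrected = raw
            .groupBy { it.timestampMillis / unitMillis }
            .map { (slot, samples) -> HealthSample(slot * unitMillis, samples.sumOf { it.steps }) }
            .sortedBy { it.timestampMillis }
        // Analyze the corrected data: here, an "active" interval is simply one with steps.
        return corrected.filter { it.steps > 0 }
    }

  The extracted activity information would then be rendered in a user interface (e.g., a list or chart of active intervals) in response to a user request.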
  • a user interface or user experience can be provided, which can analyze health related data collected from one or more electronic devices so as to find a meaningful active interval for the user, and allow the user to intuitively determine data characteristics according to the active interval.
  • health related data collected from a plurality of electronic devices may be integrated, so as to provide a seamless user experience for the number of steps, activity information, or non-activity information of the user.
  • FIG. 1 illustrates an electronic device within a network environment according to various embodiments
  • FIG. 2 illustrates a configuration of an electronic device according to various embodiments
  • FIG. 3 illustrates a program module according to various embodiments
  • FIG. 4 illustrates the configuration of an electronic device and a wearable device according to various embodiments
  • FIG. 5 illustrates an example of a health related data flow according to various embodiments
  • FIG. 6 illustrates an operation method for an electronic device according to various embodiments
  • FIGS. 7A and 7B illustrate an example of data analysis according to various embodiments
  • FIG. 8 illustrates a data integration method by an electronic device according to various embodiments
  • FIG. 9 illustrates a method for integrating the number of steps by an electronic device according to various embodiments.
  • FIG. 10 illustrates an example of integrating the number of steps according to various embodiments
  • FIG. 11 illustrates a method for integrating activity information by an electronic device according to various embodiments
  • FIG. 12 illustrates an example of integrating activity information according to various embodiments
  • FIG. 13 illustrates a method for integrating non-activity information by an electronic device according to various embodiments
  • FIGS. 14A and 14B illustrate an example of integrating non-activity information according to various embodiments
  • FIG. 15 illustrates an example of integrating data of various activity types according to various embodiments
  • FIG. 16 illustrates an example of a user interface for activity information according to various embodiments
  • FIG. 17 illustrates an example of a user interface for sharing activity information according to various embodiments
  • FIG. 18 illustrates an example of a user interface for configuring a recognition priority of a wearable device according to various embodiments
  • FIG. 19 illustrates an example of a user interface for configuring location information according to various embodiments
  • FIG. 20 illustrates a method for displaying a user interface by an electronic device according to various embodiments
  • FIG. 21 illustrates an example of calculating an active area according to various embodiments
  • FIG. 22A to FIG. 24B illustrate an example of correcting an icon in an active area based on a threshold according to various embodiments.
  • FIG. 25A to FIG. 26C illustrate an example of processing overlapped icons according to various embodiments.
  • FIGURES 1 through 26C discussed below, and the various embodiments used to describe the principles of the present disclosure in this patent document are by way of illustration only and should not be construed in any way to limit the scope of the disclosure. Those skilled in the art will understand that the principles of the present disclosure may be implemented in any suitably arranged electronic device.
  • the expression “A or B”, “at least one of A or/and B”, or “one or more of A or/and B” may include all possible combinations of the items listed.
  • the expression “A or B”, “at least one of A and B”, or “at least one of A or B” refers to all of (1) including at least one A, (2) including at least one B, or (3) including all of at least one A and at least one B.
  • the expression “a first”, “a second”, “the first”, or “the second” used in various embodiments of the present disclosure may modify various components regardless of the order and/or the importance but does not limit the corresponding components.
  • a first user device and a second user device indicate different user devices although both of them are user devices.
  • a first element may be termed a second element, and similarly, a second element may be termed a first element without departing from the scope of the present disclosure.
  • when an element (e.g., a first element) is referred to as being (operatively or communicatively) "connected" or "coupled" to another element (e.g., a second element), it may be directly connected or coupled to the other element, or any other element (e.g., a third element) may be interposed between them.
  • when an element (e.g., a first element) is referred to as being "directly connected" or "directly coupled" to another element (e.g., a second element), there is no element (e.g., a third element) interposed between them.
  • the expression “configured to” used in the present disclosure may be exchanged with, for example, “suitable for”, “having the capacity to”, “designed to”, “adapted to”, “made to”, or “capable of” according to the situation.
  • the term “configured to” may not necessarily imply “specifically designed to” in hardware.
  • the expression “device configured to” may mean that the device, together with other devices or components, “is able to”.
  • the phrase “processor adapted (or configured) to perform A, B, and C” may mean a dedicated processor (e.g. embedded processor) only for performing the corresponding operations or a generic-purpose processor (e.g., central processing unit (CPU) or application processor (AP)) that can perform the corresponding operations by executing one or more software programs stored in a memory device.
  • An electronic device may include at least one of, for example, a smart phone, a tablet Personal Computer (PC), a mobile phone, a video phone, an electronic book reader (e-book reader), a desktop PC, a laptop PC, a netbook computer, a workstation, a server, a Personal Digital Assistant (PDA), a Portable Multimedia Player (PMP), a MPEG-1 audio layer-3 (MP3) player, a mobile medical device, a camera, and a wearable device.
  • the wearable device may include at least one of an accessory type (e.g., a watch, a ring, a bracelet, an anklet, a necklace, glasses, a contact lens, or a Head-Mounted Device (HMD)), a fabric or clothing integrated type (e.g., electronic clothing), a body-mounted type (e.g., a skin pad or tattoo), and a bio-implantable type (e.g., an implantable circuit).
  • the electronic device may be a home appliance.
  • the home appliance may include at least one of, for example, a television, a Digital Video Disk (DVD) player, an audio, a refrigerator, an air conditioner, a vacuum cleaner, an oven, a microwave oven, a washing machine, an air cleaner, a set-top box, a home automation control panel, a security control panel, a TV box (e.g., SAMSUNG HOMESYNC TM , APPLE TV ® , or GOOGLE TV ® ), a game console (e.g., XBOX ® and PLAYSTATION ® ), an electronic dictionary, an electronic key, a camcorder, and an electronic photo frame.
  • the electronic device may include at least one of various medical devices (e.g., various portable medical measuring devices (a blood glucose monitoring device, a heart rate monitoring device, a blood pressure measuring device, a body temperature measuring device, etc.), a Magnetic Resonance Angiography (MRA) machine, a Magnetic Resonance Imaging (MRI) machine, a Computed Tomography (CT) machine, and an ultrasonic machine), a navigation device, a Global Positioning System (GPS) receiver, an Event Data Recorder (EDR), a Flight Data Recorder (FDR), a vehicle infotainment device, electronic devices for a ship (e.g., a navigation device for a ship and a gyro-compass), avionics, security devices, an automotive head unit, a robot for home or industry, an Automatic Teller Machine (ATM) in a bank, a Point Of Sales (POS) terminal in a shop, or an Internet of Things device (e.g., a light bulb, various sensors, electric or gas ...).
  • the electronic device may include at least one of a part of furniture or a building/structure, an electronic board, an electronic signature receiving device, a projector, and various kinds of measuring instruments (e.g., a water meter, an electric meter, a gas meter, and a radio wave meter).
  • the electronic device according to various embodiments of the present disclosure may be a combination of one or more of the aforementioned various devices.
  • the electronic device according to some embodiments of the present disclosure may be a flexible device.
  • the electronic device according to an embodiment of the present disclosure is not limited to the aforementioned devices, and may include a new electronic device according to the development of technology.
  • the term "user” may indicate a person who uses an electronic device or a device (e.g., an artificial intelligence electronic device) that uses an electronic device.
  • FIG. 1 illustrates a network environment including an electronic device according to various embodiments of the present disclosure.
  • the electronic device 101 may include a bus 110, a processor 120, a memory 130, an input/output interface 150, a display 160, and a communication interface 170. According to an embodiment of the present disclosure, the electronic device 101 may omit at least one of the above components or may further include other components.
  • the bus 110 may include, for example, a circuit which interconnects the components 110 to 170 and delivers a communication (e.g., a control message and/or data) between the components 110 to 170.
  • the processor 120 may include one or more of a Central Processing Unit (CPU), an Application Processor (AP), and a Communication Processor (CP).
  • the processor 120 may carry out, for example, calculation or data processing relating to control and/or communication of at least one other component of the electronic device 101.
  • the memory 130 may include a volatile memory and/or a non-volatile memory.
  • the memory 130 may store, for example, commands or data relevant to at least one other component of the electronic device 101.
  • the memory 130 may store software and/or a program 140.
  • the program 140 may include, for example, a kernel 141, middleware 143, an Application Programming Interface (API) 145, and/or application programs (or "applications") 147.
  • At least some of the kernel 141, the middleware 143, and the API 145 may be referred to as an Operating System (OS).
  • the kernel 141 may control or manage system resources (e.g., the bus 110, the processor 120, or the memory 130) used for performing an operation or function implemented in the other programs (e.g., the middleware 143, the API 145, or the application programs 147). Furthermore, the kernel 141 may provide an interface through which the middleware 143, the API 145, or the application programs 147 may access the individual components of the electronic device 101 to control or manage the system resources.
  • the middleware 143 may serve as an intermediary for allowing the API 145 or the application programs 147 to communicate with the kernel 141 to exchange data. Also, the middleware 143 may process one or more task requests received from the application programs 147 according to priorities thereof. For example, the middleware 143 may assign, to at least one of the application programs 147, priorities for using the system resources (e.g., the bus 110, the processor 120, the memory 130, or the like) of the electronic device 101, and may perform scheduling or load balancing on the one or more task requests by processing them according to the priorities assigned thereto, as sketched below.
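  • The following is a minimal, hypothetical illustration of such priority-based handling; the TaskRequest type and the meaning of the priority value are assumptions made only for this sketch, not part of the middleware 143 itself.

    // Task requests are served in the order of the priorities assigned to the
    // requesting applications (smaller value = served earlier in this sketch).
    import java.util.PriorityQueue

    data class TaskRequest(val app: String, val priority: Int)

    fun processingOrder(requests: List<TaskRequest>): List<TaskRequest> {
        val queue = PriorityQueue(compareBy<TaskRequest> { it.priority })
        queue.addAll(requests)
        return generateSequence { queue.poll() }.toList() // drain the queue in priority order
    }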
  • the API 145 is an interface through which the applications 147 control functions provided from the kernel 141 or the middleware 143, and may include, for example, at least one interface or function (e.g., instruction) for file control, window control, image processing, character control, and the like.
  • the input/output interface 150 may function as an interface that may transfer commands or data input from a user or another external device to the other element(s) of the electronic device 101. Furthermore, the input/output interface 150 may output the commands or data received from the other element(s) of the electronic device 101 to the user or another external device.
  • Examples of the display 160 may include a Liquid Crystal Display (LCD), a Light-Emitting Diode (LED) display, an Organic Light-Emitting Diode (OLED) display, a MicroElectroMechanical Systems (MEMS) display, and an electronic paper display.
  • the display 160 may display, for example, various types of contents (e.g., text, images, videos, icons, or symbols) to users.
  • the display 160 may include a touch screen, and may receive, for example, a touch, gesture, proximity, or hovering input using an electronic pen or a user's body part.
  • the communication interface 170 may establish communication, for example, between the electronic device 101 and an external device (e.g., a first external electronic device 102, a second external electronic device 104, or a server 106).
  • the communication interface 170 may be connected to a network 162 through wireless or wired communication, and may communicate with an external device (e.g., the second external electronic device 104 or the server 106).
  • the wireless communication may use at least one of, for example, Long Term Evolution (LTE), LTE-Advance (LTE-A), Code Division Multiple Access (CDMA), Wideband CDMA (WCDMA), Universal Mobile Telecommunications System (UMTS), Wireless Broadband (WiBro), and Global System for Mobile Communications (GSM), as a cellular communication protocol.
  • the wireless communication may include, for example, short range communication 164.
  • the short-range communication 164 may include at least one of, for example, Wi-Fi, Bluetooth, Near Field Communication (NFC), and Global Navigation Satellite System (GNSS).
  • GNSS may include, for example, at least one of the Global Positioning System (GPS), the Global Navigation Satellite System (GLONASS), the BeiDou Navigation Satellite System (BeiDou), or Galileo (the European global satellite-based navigation system), depending on the area of use, the bandwidth, or the like.
  • the wired communication may include, for example, at least one of a Universal Serial Bus (USB), a High Definition Multimedia Interface (HDMI), Recommended Standard 232 (RS-232), and a Plain Old Telephone Service (POTS).
  • the network 162 may include at least one of a telecommunication network, such as a computer network (e.g., a LAN or a WAN), the Internet, and a telephone network.
  • Each of the first and second external electronic devices 102 and 104 may be of a type identical to or different from that of the electronic device 101.
  • the server 106 may include a group of one or more servers. According to various embodiments of the present disclosure, all or some of the operations performed in the electronic device 101 may be executed in another electronic device or a plurality of electronic devices (e.g., the electronic devices 102 and 104, or the server 106).
  • the electronic device 101 may request another device (e.g., the electronic device 102 or 104 or the server 106) to execute at least some functions relating thereto instead of or in addition to autonomously performing the functions or services.
  • another electronic device (e.g., the electronic device 102 or 104, or the server 106) may execute the requested functions or additional functions, and may deliver the result of the execution to the electronic device 101.
  • the electronic device 101 may process the received result as it is or additionally, and may provide the requested functions or services.
  • cloud computing, distributed computing, or client-server computing technologies may be used.
  • FIG. 2 illustrates an electronic device according to various embodiments of the present disclosure.
  • the electronic device 201 may include, for example, all or a part of the electronic device 101 shown in FIG. 1.
  • the electronic device 201 may include one or more processors 210 (e.g., Application Processors (AP)), a communication module 220, a memory 230, a sensor module 240, an input device 250, a display 260, an interface 270, an audio module 280, a camera module 291, a power management module 295, a battery 296, an indicator 297, and a motor 298.
  • the processor 210 may control a plurality of hardware or software components connected to the processor 210 by driving an operating system or an application program, and perform processing of various pieces of data and calculations.
  • the processor 210 may be embodied as, for example, a System on Chip (SoC).
  • the processor 210 may further include a Graphic Processing Unit (GPU) and/or an image signal processor.
  • the processor 210 may include at least some (for example, a cellular module 221) of the components illustrated in FIG. 2.
  • the processor 210 may load, into a volatile memory, commands or data received from at least one (e.g., a non-volatile memory) of the other components and may process the loaded commands or data, and may store various data in a non-volatile memory.
  • the communication module 220 may have a configuration equal or similar to that of the communication interface 170 of FIG. 1.
  • the communication module 220 may include, for example, a cellular module 221, a Wi-Fi module 223, a BT module 225, a GNSS module 227 (e.g., a GPS module 227, a Glonass module, a Beidou module, or a Galileo module), an NFC module 228, and a Radio Frequency (RF) module 229.
  • the cellular module 221 may provide a voice call, a video call, a text message service, or an Internet service through a communication network.
  • the cellular module 221 may distinguish and authenticate the electronic device 201 in a communication network using a subscriber identification module 224 (e.g., a SIM card).
  • the cellular module 221 may perform at least some of the functions that the AP 210 may provide.
  • the cellular module 221 may include a communication processor (CP).
  • each of the Wi-Fi module 223, the BT module 225, the GNSS module 227, and the NFC module 228 may include a processor for processing data transmitted/received through a corresponding module.
  • at least some (e.g., two or more) of the cellular module 221, the Wi-Fi module 223, the BT module 225, the GNSS module 227, and the NFC module 228 may be included in one Integrated Chip (IC) or IC package.
  • the RF module 229 may transmit/receive a communication signal (e.g., an RF signal).
  • the RF module 229 may include, for example, a transceiver, a Power Amplifier Module (PAM), a frequency filter, a Low Noise Amplifier (LNA), and an antenna.
  • at least one of the cellular module 221, the Wi-Fi module 223, the BT module 225, the GNSS module 227, and the NFC module 228 may transmit/receive an RF signal through a separate RF module.
  • the subscriber identification module 224 may include, for example, a card including a subscriber identity module and/or an embedded SIM, and may contain unique identification information (e.g., an Integrated Circuit Card Identifier (ICCID)) or subscriber information (e.g., an International Mobile Subscriber Identity (IMSI)).
  • the memory 230 may include, for example, an embedded memory 232 or an external memory 234.
  • the embedded memory 232 may include at least one of a volatile memory (e.g., a Dynamic Random Access Memory (DRAM), a Static RAM (SRAM), a Synchronous Dynamic RAM (SDRAM), and the like) and a non-volatile memory (e.g., a One Time Programmable Read Only Memory (OTPROM), a Programmable ROM (PROM), an Erasable and Programmable ROM (EPROM), an Electrically Erasable and Programmable ROM (EEPROM), a mask ROM, a flash ROM, a flash memory (e.g., a NAND flash memory or a NOR flash memory), a hard disc drive, a Solid State Drive (SSD), and the like).
  • the external memory 234 may further include a flash drive, for example, a Compact Flash (CF), a Secure Digital (SD), a Micro Secure Digital (Micro-SD), a Mini Secure Digital (Mini-SD), an eXtreme Digital (xD), a MultiMediaCard (MMC), a memory stick, or the like.
  • the external memory 234 may be functionally and/or physically connected to the electronic device 201 through various interfaces.
  • the sensor module 240 may measure a physical quantity or detect an operation state of the electronic device 201, and may convert the measured or detected information into an electrical signal.
  • the sensor module 240 may include, for example, at least one of a gesture sensor 240A, a gyro sensor 240B, an atmospheric pressure sensor (barometer) 240C, a magnetic sensor 240D, an acceleration sensor 240E, a grip sensor 240F, a proximity sensor 240G, a color sensor 240H (e.g., a red, green, and blue (RGB) sensor), a biometric sensor (medical sensor) 240I, a temperature/humidity sensor 240J, an illuminance sensor 240K, and an Ultra Violet (UV) sensor 240M.
  • the sensor module 240 may include, for example, an E-nose sensor, an electromyography (EMG) sensor, an electroencephalogram (EEG) sensor, an electrocardiogram (ECG) sensor, an Infrared (IR) sensor, an iris scan sensor, and/or a finger scan sensor.
  • the sensor module 240 may further include a control circuit for controlling one or more sensors included therein.
  • the electronic device 201 may further include a processor configured to control the sensor module 240, as a part of the processor 210 or separately from the processor 210, and may control the sensor module 240 while the processor 210 is in a sleep state.
  • the input device 250 may include, for example, a touch panel 252, a (digital) pen sensor 254, a key 256, or an ultrasonic input device 258.
  • the touch panel 252 may use, for example, at least one of a capacitive type, a resistive type, an infrared type, and an ultrasonic type.
  • the touch panel 252 may further include a control circuit.
  • the touch panel 252 may further include a tactile layer, and provide a tactile reaction to the user.
  • the (digital) pen sensor 254 may include, for example, a recognition sheet which is a part of the touch panel or is separated from the touch panel.
  • the key 256 may include, for example, a physical button, an optical key or a keypad.
  • the ultrasonic input device 258 may detect, through a microphone (e.g., the microphone 288), ultrasonic waves generated by an input tool, and identify data corresponding to the detected ultrasonic waves.
  • the display 260 may include a panel 262, a hologram device 264, or a projector 266.
  • the panel 262 may include a configuration identical or similar to the display 160 illustrated in FIG. 1.
  • the panel 262 may be implemented to be, for example, flexible, transparent, or wearable.
  • the panel 262 may be embodied as a single module with the touch panel 252.
  • the hologram device 264 may show a three dimensional (3D) image in the air by using an interference of light.
  • the projector 266 may project light onto a screen to display an image.
  • the screen may be located, for example, in the interior of or on the exterior of the electronic device 201.
  • the display 260 may further include a control circuit for controlling the panel 262, the hologram device 264, or the projector 266.
  • the interface 270 may include, for example, a High-Definition Multimedia Interface (HDMI) 272, a Universal Serial Bus (USB) 274, an optical interface 276, or a D-subminiature (D-sub) 278.
  • the interface 270 may be included in, for example, the communication interface 170 illustrated in FIG. 1. Additionally or alternatively, the interface 270 may include, for example, a Mobile High-definition Link (MHL) interface, a Secure Digital (SD) card/Multi-Media Card (MMC) interface, or an Infrared Data Association (IrDA) standard interface.
  • the audio module 280 may bilaterally convert a sound and an electrical signal. At least some components of the audio module 280 may be included in, for example, the input/output interface 150 illustrated in FIG. 1.
  • the audio module 280 may process voice information input or output through, for example, a speaker 282, a receiver 284, earphones 286, or the microphone 288.
  • the camera module 291 is, for example, a device which may photograph a still image and a video. According to an embodiment of the present disclosure, the camera module 291 may include one or more image sensors (e.g., a front sensor or a back sensor), a lens, an Image Signal Processor (ISP) or a flash (e.g., LED or xenon lamp).
  • the power management module 295 may manage, for example, power of the electronic device 201.
  • the power management module 295 may include a Power Management Integrated Circuit (PMIC), a charger Integrated Circuit (IC), or a battery or fuel gauge.
  • the PMIC may use a wired and/or wireless charging method.
  • Examples of the wireless charging method may include, for example, a magnetic resonance method, a magnetic induction method, an electromagnetic wave method, and the like. Additional circuits (e.g., a coil loop, a resonance circuit, a rectifier, etc.) for wireless charging may be further included.
  • the battery gauge may measure, for example, a residual quantity of the battery 296, and a voltage, a current, or a temperature while charging.
  • the battery 296 may include, for example, a rechargeable battery and/or a solar battery.
  • the indicator 297 may display a particular state (e.g., a booting state, a message state, a charging state, or the like) of the electronic device 201 or a part (e.g., the processor 210) of the electronic device 201.
  • the motor 298 may convert an electrical signal into a mechanical vibration, and may generate a vibration, a haptic effect, or the like.
  • the electronic device 201 may include a processing device (e.g., a GPU) for supporting a mobile TV.
  • the processing device for supporting a mobile TV may process, for example, media data according to a certain standard such as Digital Multimedia Broadcasting (DMB), Digital Video Broadcasting (DVB), or MEDIAFLO TM .
  • Each of the above-described component elements of hardware according to the present disclosure may be configured with one or more components, and the names of the corresponding component elements may vary based on the type of electronic device.
  • the electronic device may include at least one of the above-described elements. Some of the above-described elements may be omitted from the electronic device, or the electronic device may further include additional elements. Also, some of the hardware components according to various embodiments may be combined into one entity, which may perform functions identical to those of the relevant components before the combination.
  • FIG. 3 illustrates a program module according to various embodiments of the present disclosure.
  • the program module 310 may include an Operating System (OS) for controlling resources related to the electronic device (e.g., the electronic device 101) and/or various applications (e.g., the application programs 147) executed in the operating system.
  • the operating system may be, for example, ANDROID ® , iOS ® , WINDOWS ® , SYMBIAN ® , TIZEN ® , SAMSUNG BADA ® , or the like.
  • the program module 310 may include a kernel 320, middleware 330, an API 360, and/or applications 370. At least some of the program module 310 may be preloaded on an electronic device, or may be downloaded from an external electronic device (e.g., the electronic device 102 or 104, or the server 106).
  • the kernel 320 may include, for example, a system resource manager 321 and/or a device driver 323.
  • the system resource manager 321 may control, allocate, or collect system resources.
  • the system resource manager 321 may include a process management unit, a memory management unit, a file system management unit, and the like.
  • the device driver 323 may include, for example, a display driver, a camera driver, a Bluetooth driver, a shared memory driver, a USB driver, a keypad driver, a Wi-Fi driver, an audio driver, or an Inter-Process Communication (IPC) driver.
  • the middleware 330 may provide a function required in common by the applications 370, or may provide various functions to the applications 370 through the API 360 so as to enable the applications 370 to efficiently use the limited system resources in the electronic device.
  • the middleware 330 (e.g., the middleware 143) may include at least one of a runtime library 335, an application manager 341, a window manager 342, a multimedia manager 343, a resource manager 344, a power manager 345, a database manager 346, a package manager 347, a connectivity manager 348, a notification manager 349, a location manager 350, a graphic manager 351, and a security manager 352.
  • the runtime library 335 may include a library module that a compiler uses in order to add a new function through a programming language while an application 370 is being executed.
  • the runtime library 335 may perform input/output management, memory management, the functionality for an arithmetic function, or the like.
  • the application manager 341 may manage, for example, a life cycle of at least one of the applications 370.
  • the window manager 342 may manage Graphical User Interface (GUI) resources used by a screen.
  • the multimedia manager 343 may recognize a format required for reproduction of various media files, and may perform encoding or decoding of a media file by using a codec suitable for the corresponding format.
  • the resource manager 344 may manage resources of a source code, a memory, and a storage space of at least one of the applications 370.
  • the power manager 345 may operate together with, for example, a Basic Input/Output System (BIOS) or the like to manage a battery or power source and may provide power information or the like required for the operations of the electronic device.
  • the database manager 346 may generate, search for, and/or change a database to be used by at least one of the applications 370.
  • the package manager 347 may manage installation or an update of an application distributed in a form of a package file.
  • the connectivity manager 348 may manage wireless connectivity such as WI-FI ® or BLUETOOTH ® .
  • the notification manager 349 may display or notify of an event, such as an arrival message, an appointment, or a proximity notification, in a way that does not disturb the user.
  • the location manager 350 may manage location information of an electronic device.
  • the graphic manager 351 may manage a graphic effect which will be provided to a user, or a user interface related to the graphic effect.
  • the security manager 352 may provide all security functions required for system security, user authentication, or the like.
  • the middleware 330 may further include a telephony manager for managing a voice call function or a video call function of the electronic device.
  • the middleware 330 may include a middleware module that forms a combination of various functions of the above-described components.
  • the middleware 330 may provide a module specialized for each type of OS in order to provide a differentiated function. Further, the middleware 330 may dynamically remove some of the existing components or add new components.
  • the API 360 (e.g., the API 145) is, for example, a set of API programming functions, and may be provided with a different configuration according to an OS. For example, in the case of Android or iOS, one API set may be provided for each platform. In the case of Tizen, two or more API sets may be provided for each platform.
  • the applications 370 may include, for example, one or more applications which may provide functions such as a home 371, a dialer 372, an SMS/MMS 373, an Instant Message (IM) 374, a browser 375, a camera 376, an alarm 377, contacts 378, a voice dial 379, an email 380, a calendar 381, a media player 382, an album 383, a clock 384, health care (e.g., measuring exercise quantity or blood sugar), or environment information (e.g., providing atmospheric pressure, humidity, or temperature information).
  • the applications 370 may include an application (hereinafter, referred to as an "information exchange application" for convenience of description) that supports exchanging information between the electronic device (e.g., the electronic device 101) and an external electronic device (e.g., the electronic device 102 or 104).
  • the information exchange application may include, for example, a notification relay application for transferring specific information to an external electronic device or a device management application for managing an external electronic device.
  • the notification relay application may include a function of transferring, to the external electronic device (e.g., the electronic device 102 or 104), notification information generated from other applications of the electronic device 101 (e.g., an SMS/MMS application, an e-mail application, a health management application, or an environmental information application). Further, the notification relay application may receive notification information from, for example, an external electronic device and provide the received notification information to a user.
  • the device management application may manage (e.g., install, delete, or update), for example, at least one function of an external electronic device (e.g., the electronic device 102 or 104) communicating with the electronic device (e.g., a function of turning on/off the external electronic device itself (or some components) or a function of adjusting the brightness (or a resolution) of the display), applications operating in the external electronic device, and services provided by the external electronic device (e.g., a call service or a message service).
  • an external electronic device e.g., the electronic device 102 or 104
  • the electronic device e.g., a function of turning on/off the external electronic device itself (or some components) or a function of adjusting the brightness (or a resolution) of the display
  • applications operating in the external electronic device e.g., a call service or a message service.
  • the applications 370 may include applications (e.g., a health care application of a mobile medical appliance or the like) designated according to attributes of an external electronic device (e.g., the electronic device 102 or 104).
  • the applications 370 may include an application received from an external electronic device (e.g., the server 106, or the electronic device 102 or 104).
  • the applications 370 may include a preloaded application or a third party application that may be downloaded from a server.
  • the names of the components of the program module 310 of the illustrated embodiment of the present disclosure may change according to the type of operating system.
  • At least a part of the programming module 310 may be implemented in software, firmware, hardware, or a combination of two or more thereof. At least some of the program module 310 may be implemented (e.g., executed) by, for example, the processor (e.g., the processor 210). At least some of the program module 310 may include, for example, a module, a program, a routine, a set of instructions, and/or a process for performing one or more functions.
  • the term "module" as used herein may, for example, mean a unit including one of hardware, software, and firmware, or a combination of two or more of them.
  • the “module” may be interchangeably used with, for example, the term “unit”, “logic”, “logical block”, “component”, or “circuit”.
  • the “module” may be a minimum unit of an integrated component element or a part thereof.
  • the “module” may be a minimum unit for performing one or more functions or a part thereof.
  • the “module” may be mechanically or electronically implemented.
  • the "module” may include at least one of an Application-Specific Integrated Circuit (ASIC) chip, a Field-Programmable Gate Arrays (FPGA), and a programmable-logic device for performing operations which has been known or are to be developed hereinafter.
  • at least some of the devices (for example, modules or functions thereof) or the method (for example, operations) according to the present disclosure may be implemented by a command stored in a computer-readable storage medium in a programming module form.
  • the instruction when executed by a processor (e.g., the processor 120), may cause the one or more processors to execute the function corresponding to the instruction.
  • the computer-readable recording media may be, for example, the memory 130.
  • FIG. 4 illustrates the configuration of an electronic device and a wearable device according to various embodiments.
  • an electronic device 410 may include a web API manager 411, a wearable manager 412, a data analysis unit 413, a data integration unit 414, and a health DB 415.
  • the web API manager 411 may receive health related data, which is collected from a wearable device 3 440 or a wearable device 4 450, using a protocol released in the form of a web application programming interface (API).
  • the wearable manager 412 may receive the health related data, which is collected from a wearable device 1 420 or a wearable device 2 430, using the communication protocol defined for the wearable device 1 420 or the wearable device 2 430.
  • the received data may be health related data (e.g., number of steps, cycling, swimming, sleep, etc.) measured or collected by the wearable device 1 420 to the wearable device 4 450.
  • the received data may be stored in a health database (DB) 415.
  • the received data is measured or collected by one or more wearable devices; the data may be provided individually for each device, or may be provided as one piece of integrated data.
  • the health DB 415 may store data received for each device, or may integrate data received from a plurality of devices and store the same.
  • the data stored in the health DB 415 may be synchronized with a cloud 460.
  • the data analysis unit 413 may analyze the received data.
  • the data analysis unit 413 may analyze the received data and classify the same depending on activity types.
  • the activity type may include a first activity type for the number of steps, a second activity type for an activity (e.g., workout), and a third activity type for non-activity.
  • the non-activity may mean a stationary state, such as sleeping or sitting, without walking or an activity (e.g., cycling, swimming, etc.). Alternatively, the non-activity may mean that no activity is detected. A minimal classification sketch based on these three activity types is given below.
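  • The following hypothetical Kotlin sketch shows one way such a per-record classification could look; the record fields and the decision rules are assumptions, not the actual logic of the data analysis unit 413.

    // Classify a received record into one of the three activity types described above.
    enum class ActivityType { STEPS, WORKOUT, NON_ACTIVITY }

    data class HealthRecord(val startMillis: Long, val steps: Int, val workout: String?)

    fun classify(record: HealthRecord): ActivityType = when {
        record.workout != null -> ActivityType.WORKOUT      // e.g., cycling or swimming
        record.steps > 0       -> ActivityType.STEPS        // walking detected
        else                   -> ActivityType.NON_ACTIVITY // sleeping, sitting, or nothing detected
    }

  Records classified this way can then be grouped per type (e.g., records.groupBy { classify(it) }) before integration.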
  • the wearable device 1 420 may collect data for the number of steps and activities
  • the wearable device 2 430 may collect data for the number of steps
  • the wearable device 3 440 may collect data for the number of steps and activities
  • the wearable device 4 450 may collect data on the number of steps.
  • the data analysis unit 413 may classify the data depending on the activity types in order to integrate the data received from the wearable device 1 420 to the wearable device 4 450 into one piece of data.
  • the data integration unit 414 may integrate the data classified depending on activity types. For example, the data integration unit 414 may integrate respective data for the number of steps, an activity, and a non-activity. The data integration unit 414 may store the integrated data in the health DB 415.
  • the wearable device 1 420 may include a step number counting unit 421 for measuring the number of steps, and a workout measurement unit 422 for measuring activity.
  • the wearable device 2 430 may include a step number counting unit 431 for measuring the number of steps.
  • the wearable device 1 420 to the wearable device 4 450 may have different unit times for storing the measured data.
  • the wearable device 1 420 and the wearable device 3 440 may store data in a unit of 5 minutes, and
  • the wearable device 2 430 and the wearable device 4 450 may store data in a unit of 10 minutes.
  • FIG. 5 illustrates an example of a health related data flow according to various embodiments.
  • data 540 which is collected from an electronic device 510, a wearable device 520, and an application 530, respectively, may be stored in the health DB 415.
  • the data analysis unit 413 may perform an analysis of data (indicated by reference numeral 550) stored in the health DB 415.
  • the data integration unit 414 may perform integration of data based on the result of analysis.
  • the data integration unit 414 may apply an algorithm 560 to the integrated data to correct the data.
  • the corrected data 570 is for the number of steps, activity, and non-activity, and may be provided through a user interface.
  • the electronic device described below may be at least one of the electronic device 101 of FIG. 1, the electronic device 201 of FIG. 2, or the electronic device 410 of FIG. 4.
  • an electronic device is described as the electronic device 101 of FIG. 1, but the electronic device is not limited to the description thereof.
  • An electronic device includes: a housing; a display exposed through a part of the housing; a first motion sensor disposed within the housing and configured to detect the movement of the housing; a wireless communication circuit disposed within the housing; a processor disposed within the housing and electrically connected to the display, the first motion sensor, and the wireless communication circuit; and a memory electrically connected to the processor, wherein the memory stores instructions which, when executed by a processor, cause the processor to perform operations including: generating a wireless communication channel with an external electronic device including a second motion sensor, using the wireless communication circuit; monitoring the movement of the housing using the first motion sensor so as to generate first data for a first time period; receiving second data acquired for the first time period through the wireless communication channel, using the second motion sensor; calculating, as a value for the first time period, a value smaller than the sum of a first value based on the first data and a second value based on the second data; and displaying the calculated value through a user interface displayed on the display.
  • the instructions cause the processor to display the calculated value after the first time period through the user interface when the electronic device and the external electronic device are being worn or carried by a user.
  • An electronic device includes: a housing; a display exposed through a part of the housing; a motion sensor disposed within the housing and configured to detect the movement of the housing; a wireless communication circuit disposed within the housing; a processor disposed within the housing and electrically connected to the display, the motion sensor, and the wireless communication circuit; and a memory electrically connected to the processor, wherein the memory stores instructions which, when executed by a processor, cause the processor to perform operations including: monitoring the movement of the housing using the motion sensor so as to generate first data for a first time period; determining a first attribute of the movement during a first session of the first time period, using a first portion of the first data; determining a second attribute of the movement during a second session of the first time period, using a second portion of the first data; selecting one of the first attribute and the second attribute; and displaying at least one of an image, text, or a symbol representing the selected attribute through a user interface displayed on the display.
  • the instructions cause the processor to display, through the user interface, a map associated with locations where the housing has been located for the first time period, and to display at least one of the image, the text, or the symbol on the map in a superposed manner.
  • the instructions according to an embodiment may cause the processor to receive second data acquired for the first time period from an external electronic device including a motion sensor through the wireless communication circuit; calculate, as a value for the first time period, a value smaller than the sum of a first value based on the first data and a second value based on the second data; and display the calculated value through a user interface displayed on the display.
  • An electronic device includes a memory, a display, a communication interface, and a processor functionally connected to the memory, the display, or the communication interface, wherein the processor may be configured to acquire, through the communication interface, health related data collected from an external device, correct the acquired data per unit time, analyze the corrected data to extract activity information, store the extracted activity information in the memory, and display, on the display, a user interface including the activity information in response to a user request.
  • the processor may be configured to correct the acquired data based on a time unit for storing data of the electronic device.
  • the processor may be configured to analyze data acquired from the external device and data acquired using a sensor module of the electronic device, so as to classify the data depending on activity types, and integrate the data depending on the activity types.
  • the processor may be configured to integrate data based on a priority of at least one of time, an activity type, a workout type, and a device.
  • the processor may be configured to assign different weights to at least one of the activity type, the workout type, and the device, and correct the integrated data based on the weights.
  • the processor may be configured to classify the number of steps for each device per unit time, determine the maximum number of steps per unit time, and calculate the integrated number of steps based on the determined maximum number of steps, so as to integrate data for the number of steps.
  • the processor may be configured to determine a workout type based on the result of data analysis, determine the start and end of the workout type, and integrate activity information for each workout type based on the priority.
  • the processor may be configured to align inactive intervals for each time, and integrate non-active periods that do not include the activity information into one session, so as to process the integrated session as non-activity information.
  • the processor may be configured to extract location information on the activity information, calculate an active area for activity information based on the extracted location information, calculate a distance between two adjacent active areas, and correct an icon in the active area based on the calculated distance.
  • the processor may be configured to determine such that the icon overlap condition is satisfied when the calculated distance is less than a reference distance, and determine at least one of icons included in the adjacent two active areas.
  • FIG. 6 illustrates an operation method for the electronic device according to various embodiments.
  • FIG. 6 illustrates an operation of extracting activity information using one piece of data.
  • in operation 601, the electronic device 101 (e.g., the processor 120) may acquire health related data.
  • the processor 120 may receive health related data from the external device (or an external electronic device) (e.g., one of the electronic device 102, the electronic device 104, and the wearable device 1 420 to the wearable device 4 450) through the communication interface 170.
  • the electronic device 101 may autonomously acquire health related data.
  • the electronic device 101 may acquire health related data using various sensor modules (e.g., the sensor module 240 of FIG. 2).
  • the health related data may be data collected or measured by the external device or the electronic device 101, for example, the number of steps, running, cycling, swimming, sleeping, resting, and the like.
  • the acquired data may be stored in the memory 130.
  • the electronic device 101 (for example, the processor 120) may correct the acquired data.
  • the electronic device 101 may store data in a unit of one minute, the wearable device 1 420 and the wearable device 3 440 may store data in a unit of five minutes, and the wearable device 2 430 and the wearable device 4 450 may store data in a unit of ten minutes. Since the unit time for storing data is different for each device, the processor 120 may divide the received data according to the unit time for data storage of the electronic device 101.
  • for example, the processor 120 may divide the received data by a unit of one minute. When, in operation 601, the electronic device 101 has autonomously acquired the data rather than receiving it from an external device, operation 603 may not be performed.
  • the unit times described above are only examples; the unit time for storing data in the electronic device 101 or the external device may be 30 seconds, 3 minutes, 5 minutes, or the like.
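  • as an illustrative, non-limiting sketch of this pre-processing (operation 603), the following Python snippet divides a coarse-grained sample into the one-minute unit used by the electronic device 101 by spreading the count evenly, since the exact minute of each step is not known; the function name and structure are assumptions for illustration, not the disclosed implementation.

```python
# Illustrative sketch only: split one coarse-grained step sample (e.g., a
# 10-minute total reported by a wearable device) into per-minute values by
# spreading the count evenly, as in the division-by-10 example of FIG. 10.
def redistribute_to_minutes(step_count, interval_minutes):
    return [step_count / interval_minutes] * interval_minutes

# A wearable reporting 200 steps for a 10-minute window becomes ten
# one-minute samples of 20 steps each.
print(redistribute_to_minutes(200, 10))
```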
  • the electronic device 101 may analyze the corrected data.
  • various clustering or pattern recognition technologies may be utilized for data analysis methods.
  • the processor 120 may analyze the corrected data using various technologies, so as to extract an activity that is determined to be meaningful to a user. According to medical opinion, a person may be healthy if he or she can walk an average of 100 or more steps over 10 minutes.
  • the processor 120 may analyze the received data using density-based spatial clustering of applications with noise (DBSCAN) among various clustering technologies based on the medical opinion.
  • the processor 120 may search for, from data divided by one minute, a cluster having a condition that the minPts of the DBSCAN is 10 steps or more and the eps is one minute, and determine whether a user's activity is walking or running based on the number of steps within the corresponding cluster.
  • the DBSCAN has two parameters: minPts, which is the minimum number of objects to be included in one cluster, and eps, which may mean the distance between objects.
  • the object is the number of steps, and the distance between the objects may be one minute, which is a unit of storing data of the electronic device 101.
  • the processor 120 may include, in one cluster, objects walking 10 or more steps from data divided by one minute.
  • the processor 120 may maintain a state where data is included in the cluster until a termination condition occurs.
  • the initial condition may be 10 or more steps per minute, or an average of 100 or more steps for 10 minutes.
  • the termination condition may correspond to a case where there are no detected steps or less than 10 steps per minute. Data analysis of the number of steps will be described in detail with reference to FIGS. 7A and 7B below.
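  • as a non-limiting sketch of the cluster search described above, the following Python snippet groups consecutive one-minute step counts of 10 or more (minPts) into one workout cluster and ends the cluster when a minute with fewer than 10 steps occurs (the termination condition); the simplified one-dimensional form and the names used are assumptions for illustration rather than the DBSCAN implementation itself.

```python
# Illustrative sketch: cluster per-minute step counts. A minute with at least
# `min_pts` steps starts or continues a cluster; a minute with fewer steps
# (or no steps) terminates it, mirroring the initial/termination conditions.
def find_step_clusters(steps_per_minute, min_pts=10):
    clusters, current = [], []
    for count in steps_per_minute:
        if count >= min_pts:
            current.append(count)
        elif current:
            clusters.append(current)
            current = []
    if current:
        clusters.append(current)
    return clusters

# Two clusters are found; the low-activity minutes in between end the first one.
print(find_step_clusters([120, 110, 3, 0, 15, 40, 90]))  # [[120, 110], [15, 40, 90]]
```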
  • the electronic device 101 may extract activity information based on the result of analysis.
  • the processor 120 may compare the sum of the walking with the sum of the running so as to extract an activity that has a larger value as activity information. For example, when the sum of the walking is 1000 and the sum of the running is 400 in one cluster, the processor 120 may extract a walking time and a total number of steps as the activity information. Alternatively, when the sum of the walking is 500 and the sum of the running is 2000, the processor 120 may extract a running time and a total running distance as activity information.
  • the location information on the walking distance or the running distance may be extracted as the activity information.
  • the processor 120 may determine the workout types by using the number of steps per unit time. For example, the processor 120 may determine as 'running' when the number of steps is 150 or more per minute, and determine as 'walking' when the number of steps is less than 150 per minute. The processor 120 may compare the determined sum of the walking with the sum of the running, so as to extract an activity that has a larger value as activity information.
  • the processor 120 may analyze the frequency of steps per unit time (e.g., one minute) so as to extract activity information.
  • the processor 120 may calculate the speed information using the GNSS module 227, or may determine whether the activity corresponds to an outdoor activity using an ultraviolet-ray sensor. For example, the processor 120 may calculate the moving speed for 10 minutes based on a change in location information when the number of steps generated for 10 minutes is equal to or greater than a predetermined number of steps (e.g., 1000 steps). When the calculated moving speed is equal to or higher than a predetermined speed (e.g., 20 Km/h), the processor 120 may extract the cycling, the moving speed, the moving distance, and the like as activity information.
  • the processor 120 may extract the cycling, the moving speed, the moving distance, and the like as activity information. For example, when the number of steps is equal to or greater than a predetermined number of steps (e.g., 100 steps) or is generated in a predetermined pattern, the processor 120 may recognize the same as the outdoor activity if the sensor value of the ultraviolet sensor is a value that can be collected outdoors.
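  • the heuristics above can be summarized as in the following non-limiting Python sketch, which uses the example thresholds mentioned in the description (150 steps per minute for running, 1,000 steps and 20 km/h over 10 minutes for cycling); the function name and inputs are illustrative assumptions.

```python
# Illustrative sketch of the workout-type heuristics described above.
def classify_activity(steps_per_minute, steps_per_10min, speed_kmh):
    if steps_per_10min >= 1000 and speed_kmh >= 20:
        return "cycling"      # many steps detected but moving too fast to be on foot
    if steps_per_minute >= 150:
        return "running"
    if steps_per_minute >= 10:
        return "walking"
    return "non-activity"

print(classify_activity(155, 900, 8))    # running
print(classify_activity(90, 1200, 25))   # cycling
```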
  • the electronic device 101 may store the extracted activity information.
  • the processor 120 may store the extracted activity information in the memory 130.
  • the activity information represents information such as walking, running, or total number of steps, total running distance, etc., and may be provided through a user interface that may be easily distinguished by a user.
  • FIGS. 7A and 7B illustrate an example of data analysis according to various embodiments.
  • the electronic device 101 may determine activity information based on the number of steps per unit time.
  • the processor 120 may classify health related data in a unit of one minute, and include an object, which has generated 10 steps or more per minute, in a workout 1 710.
  • the workout 1 710 may include objects (for example, 721 and 722) walking an average of 100 steps or more for ten minutes and an object 723 walking 10 steps or more per minute.
  • the processor 120 may include, in the cluster 1 720, the number of steps generated until the termination condition occurs.
  • the processor 120 may also include the object 723, which has generated 10 steps or more per minute, in the cluster 1 720.
  • the processor 120 may determine that such a case satisfies the initial condition for being included in one workout (or cluster).
  • the termination condition may correspond to a case where there are no detected steps or less than 10 steps per minute.
  • the processor 120 may not include, in the workout 1 710, objects (for example, 724, 725, and 726) in which there are no detected steps or the object 727 that has generated less than 10 steps.
  • the processor 120 may classify health related data in a unit of one minute, and if the number of steps per minute is less than a predetermined number of steps (e.g., 10 steps), the processor 120 may determine that the initial condition is not satisfied, and may not include the data in a workout 2 750. For example, since the objects (for example, 767 and 768) generate no steps and do not satisfy the initial condition to be included in the cluster, the objects 767 and 768 may not be included in the workout 2 750.
  • the processor 120 may include, in the cluster 2 760, a case where the number of steps per minute is equal to or greater than a predetermined number of steps (e.g., 10 steps), or where the average number of steps for 10 minutes is equal to or greater than a predetermined number of steps (e.g., average 100 steps).
  • the processor 120 may also include the objects (for example 763, 764, and 765), which have generated 10 steps or more per one minute, in the cluster 2 760.
  • the processor 120 may determine that an object 766 walking less than 10 steps corresponds to a termination condition.
  • the processor 120 may not include the object 766 walking less than 10 steps in the cluster 2 760.
  • FIG. 8 illustrates a data integration method of an electronic device according to various embodiments.
  • FIG. 8 illustrates an operation of extracting activity information using one or more pieces of data.
  • the electronic device 101 (e.g., the processor 120) may collect health related data from one or more devices.
  • the processor 120 may receive health related data from the external device (or external electronic device) (e.g., one of the electronic device 102, the electronic device 104, and the wearable device 1 420 to the wearable device 4 450) through the communication interface 170.
  • the electronic device 101 may acquire health related data using various sensor modules (e.g., the sensor module 240 of FIG. 2).
  • the electronic device 101 may include a first wireless sensor in the housing (or body).
  • the housing may be understood as a frame (or case) that accommodates components of the electronic device 101 (e.g., the processor 120, the memory 130, etc.).
  • the first wireless sensor may be a sensor (e.g., the sensor module 240) that measures the number of steps, activity information, and non-activity information.
  • the processor 120 may monitor the movement of the housing using the first wireless sensor so as to generate first data for a first time period.
  • the external device may include a second wireless sensor.
  • the second wireless sensor may be a sensor (e.g., the step number counting unit 421 and workout measurement unit 422) that measures the number of steps, activity information, and non-activity information of the external device.
  • the processor 120 may generate a wireless communication channel (e.g., a communication protocol associated with the external device) with the external device using the communication interface 170 (or referred to as a "wireless communication circuit").
  • the processor 120 may receive the second data acquired for the first time period through the wireless communication channel, and calculate, as a value for the first time period, a value smaller than the sum of a first value based on the first data and a second value based on the second data.
  • the electronic device 101 may analyze the collected data and classify the same depending on activity types. For example, the processor 120 may correct the data as shown in operation 603 of FIG. 6 before the data analysis is performed. Since the unit time for storing data is different for each device, a pre-processing operation is required to process data received from a plurality of devices. The processor 120 may divide the received data according to a unit time (e.g., one minute) for storing data of the electronic device 101. The processor 120 may analyze the corrected data and classify the corrected data depending on the activity types.
  • the activity types may be classified into three categories of the number of steps (e.g., first activity type), activity information (e.g., second activity type), and non-activity information (e.g., third activity type).
  • three classifications of activity types are described as examples, but the activity types are not limited thereto.
  • the classifications of data depending on the activity types may be for easily integrating data with different characteristics.
  • the processor 120 may classify the number of steps, activity information, and non-activity information for each device.
  • the number of steps may have the characteristics of being constantly taken by a user, and the start and the end may not be clear. This is because, if movement is detected even during sleeping or sitting, it may be determined that a number of steps has been taken by a user.
  • the unit times for storage are very diverse, such as 1 day, 10 minutes, 5 minutes, and the like. Therefore, in order to provide a more accurate and meaningful number of steps to the user, it may be required to correct the number of steps detected by each of the plurality of external devices.
  • the activity information may be the record of activities having a clear start and end such as running, walking, swimming, cycling, yoga, and the like. Such activity information may be automatically recognized and stored by a user or an external device or the electronic device 101. However, the activity information may be detected in duplicate when collected from a plurality of devices.
  • the non-activity information may be a record that represents a state where a user is inactive, such as sleeping, resting (e.g., sitting), and the like. Such non-activity information may contain an empty space in which there is no detection, because the recognition rate differs for each device. When such empty spaces are integrated, erroneous recognition of activities, that is, a case where a sleeping state or a sitting situation is falsely recognized as walking, may be excluded, and more accurate information may be provided to the user.
  • the electronic device 101 may integrate data for each activity type.
  • the processor 120 may list the number of steps, activity information, or non-activity information in a time sequence, and may integrate data of the same activity type into one piece of data.
  • the processor 120 may integrate the data based on the priority of at least one of time, an activity type, a workout type, and a device.
  • the priority may be configured by the user or configured to the electronic device 101 by a default value.
  • when time has a higher priority, the processor 120 may integrate data based on the activity type that occurred the very first time.
  • the processor 120 may integrate the data based on the start and end of at least one of the number of steps, the activity information, or the non-activity information, and may then integrate data for the remaining activity types. For example, when the activity information has a priority, and the activity information and the number of steps or the non-activity information overlap each other, the processor 120 may integrate data on the activity information based on the start and end of the activity information, and then integrate data on the number of steps or the non-activity information.
  • the processor 120 may integrate the data based on the start and end of at least one of walking, running, cycling, or swimming, and may then integrate the data for the remaining activity types. For example, when the cycling has a higher priority, the processor 120 may integrate data for cycling based on the start and end of cycling, and then integrate data on the remaining activity information, number of steps, or non-activity information. Alternatively, when the device has a higher priority, the processor 120 may integrate at least one of the number of steps, activity information, or non-activity information of a device having a lower priority, based on the start and end of a device having a higher priority.
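  • as a non-limiting sketch of this priority-based integration, the following Python snippet trims a lower-priority record so that it no longer overlaps a higher-priority record, which is the behavior described above (and again with reference to FIG. 15); the record structure and field names are assumptions for illustration.

```python
# Illustrative sketch: the higher-priority record keeps its start and end,
# and the overlapping part of the lower-priority record is cut away.
def trim_overlap(high, low):
    if low["end"] <= high["start"] or low["start"] >= high["end"]:
        return low                            # no overlap, nothing to trim
    if low["start"] < high["start"]:
        return {**low, "end": high["start"]}  # keep only the part before the overlap
    return {**low, "start": high["end"]}      # keep only the part after the overlap

cycling = {"type": "cycling", "start": 10, "end": 40}   # higher priority
steps   = {"type": "steps",   "start": 0,  "end": 25}   # lower priority
print(trim_overlap(cycling, steps))  # {'type': 'steps', 'start': 0, 'end': 10}
```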
  • the electronic device 101 may correct the integrated data based on weights.
  • the processor 120 may assign different weights to at least one of an activity type, a workout type, and a device. For example, when a weight is assigned to an activity type, since it may be determined that the activity information is more meaningful to a user than the number of steps or the non-activity information, the processor 120 may assign a higher weight to the activity information. For example, when performing data correction, the processor 120 may configure higher weights in the sequence of activity information, non-activity information, and number of steps. Since the number of steps is always automatically counted and the error recognition rate is high due to technical limitations, it may be configured to assign the lowest weight to the number of steps.
  • more accurate activity information may be provided to the user by assigning a higher weight to cycling, swimming, etc., which have clearer start and end times than walking or running.
  • more accurate activity information may be provided to the user by assigning a higher weight to the electronic device 101 than the external device.
  • the processor 120 may assign different weights to at least one of an activity type, a workout type, and a device, respectively, and may correct data by comprehensively considering each weight. For example, the processor 120 may configure different weights on workout types or devices depending on activity types. When the second activity type on the activity information has a higher weight, the processor 120 may assign a higher weight to the external device than the electronic device 101. Alternatively, the processor 120 may configure a different weight for each device according to the workout type. When the workout type is swimming, the processor 120 may assign a higher weight to the external device than the electronic device 101, and when the workout type is cycling, the processor 120 may assign a higher weight to the electronic device 101 than the external device. This method may be intended to provide more meaningful and more accurate information to the user.
  • the processor 120 may integrate data based on the priorities, but may also correct the integrated data based on the weights. That is, when integrating data according to the priority in operation 805, the processor 120 may skip operation 807 without performing it. Alternatively, when integrating data according to the priority in operation 805, the processor 120 may still perform operation 807.
  • the weight may be configured by the user or configured to the electronic device 101 by a default value.
  • the processor 120 may assign a different weight to each priority, and may correct data by comprehensively considering each weight.
  • the weight configured to perform the data correction may be proportional or inversely proportional to the priority configured to perform the data integration. For example, when the priority configured to perform the data integration is high, the weight configured to perform the data correction may also be high. Alternatively, when the priority configured to perform the data integration is high, the weight configured to perform the data correction may be low.
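  • the weight-based correction can be sketched, in a non-limiting way, as assigning each record a combined weight from its activity type, workout type, and source device and keeping the record with the larger weight when two records conflict; the weight values and structures below are illustrative assumptions, not values disclosed here.

```python
# Illustrative sketch: combine weights per activity type, workout type, and
# device, and keep the heavier record when two records conflict.
TYPE_WEIGHT    = {"activity": 3, "non-activity": 2, "steps": 1}
WORKOUT_WEIGHT = {"swimming": 2, "cycling": 2, "running": 1, "walking": 1}
DEVICE_WEIGHT  = {"wearable": 2, "phone": 1}

def record_weight(record):
    return (TYPE_WEIGHT.get(record.get("type"), 0)
            + WORKOUT_WEIGHT.get(record.get("workout"), 0)
            + DEVICE_WEIGHT.get(record.get("device"), 0))

def pick(a, b):
    return a if record_weight(a) >= record_weight(b) else b

swim_wearable = {"type": "activity", "workout": "swimming", "device": "wearable"}
swim_phone    = {"type": "activity", "workout": "swimming", "device": "phone"}
print(pick(swim_wearable, swim_phone)["device"])  # wearable
```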
  • the electronic device 101 may provide a user interface based on the corrected data.
  • the user interface may provide activity information on the corrected data along with activity information for each device.
  • the processor 120 may generate first data (e.g., number of steps, cycling, swimming, non-activity, etc.) for the first time period by using a motion sensor (e.g., sensor module 240) provided in the electronic device 101, determine a first attribute (e.g., the number of steps) of the movement of the electronic device 101 during a first session of a first time period using a first portion of the first data, determine a second attribute (e.g., activity information) of the movement of the electronic device 101 during a second session of the first time period by using a second portion of the first data, select one of the first attribute or the second attribute, and display at least one of the image, text, or symbol representing the selected attribute through a user interface displayed on the display 160.
  • the processor 120 may provide the user interface through the selected application.
  • the processor 120 may execute an application associated with the connected device to provide the user interface.
  • the processor 120 may attach a tag (e.g., an auto tag) to the activity information on the corrected data so as to allow a user to easily recognize the corrected data.
  • FIG. 9 illustrates a method for integrating the number of steps by an electronic device according to various embodiments.
  • FIG. 9 may be a drawing that embodies the data integration operation 805 of FIG. 8. That is, FIG. 9 illustrates the operation of integrating the number of steps for the first activity type.
  • the electronic device 101 (e.g., the processor 120) may classify the number of steps for each device per unit time. Each device may have a different unit time for storing data according to the performance of its hardware or software. For example, the number of steps may be stored in each device in various units of time, such as 1 minute, 5 minutes, 10 minutes, and the like. Thus, in order to integrate the number of steps collected from devices having different storage units, it is required to classify the data into a predetermined size. To this end, the processor 120 may classify the number of steps for each device depending on the unit time for storing data of the electronic device 101.
  • although the number of steps a user has taken may be integrated in various ways, a method which does not let the user perceive a reduction in the number of steps, while minimizing the error, may be used.
  • the max method may be used to integrate the number of steps. Since a reduction in the number of steps may be a factor that hinders the user's experience, the processor 120 may integrate the number of steps using the max method so as to indicate a value that is always greater than or equal to the number of steps checked in each of the plurality of devices. For reference, when the number of steps is integrated into the average value of a plurality of devices or other integration methods are used, the resulting number of steps may be smaller than the number of steps measured by a single device, and therefore the max method may be suitable.
  • the processor 120 may calculate a value, as the value for a first time period, which is less than the sum of a first value based on first data autonomously generated for the first time period and a second value based on second data measured from the external device.
  • the electronic device 101 may assign an index per unit time. For example, since a day is 24 hours, a total of 1,440 indices may be assigned when the index is assigned in a unit of one minute. Since indices start from 0, the processor 120 may assign indices from 0 to 1,439.
  • the electronic device 101 may determine the maximum number of steps per unit time based on the index. For example, the processor 120 may determine the maximum number of steps per unit time using Equation 1.
  • the source i may refer to an external device (e.g., a wearable device or the electronic device 101) that has acquired the number of steps, x may refer to an index, and combined[x] may refer to the maximum number of steps per unit time. If i is 0, it may refer to the first device that has acquired the number of steps, if i is 1, it may refer to the second device that has acquired the number of steps, and so on.
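  • the body of Equation 1 is not reproduced here; based on the definitions of source i, the index x, and combined[x] above, it may be written as taking, for every one-minute index, the maximum across all sources:

```latex
\text{combined}[x] = \max_{i}\,\text{source}_{i}[x], \qquad x = 0, 1, \ldots, 1439
```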
  • the processor 120 may determine, as the number of steps corresponding to a unit time, the maximum of the numbers of steps measured in that unit time by each of the plurality of devices.
  • the electronic device 101 may calculate the integrated number of steps based on the maximum number of steps.
  • the processor 120 may calculate, as the integrated number of steps for the period over which the maximum numbers of steps are determined, the number of steps obtained by summing all of the maximum numbers of steps per unit time.
  • the processor 120 may integrate the number of steps measured by the wearable device 1 420 and the number of steps measured by the wearable device 3 440.
  • the processor 120 may integrate the number of steps measured by the wearable device 1 420, the number of steps measured by the electronic device 101 itself, and the number of steps measured by the application installed in the electronic device 101.
  • FIG. 10 illustrates an example of integrating the number of steps according to various embodiments.
  • reference numeral '1040' represents the number of steps measured by each device for a predetermined period of time (e.g., 09:00 to 09:10).
  • the electronic device 1010 may store the number of steps in a unit of one minute, so as to acquire a total number of 443 steps 1016 for ten minutes. Since the electronic device 1010 stores the number of steps in a unit of one minute, the processor 120 may acquire, as the number of steps, 100 steps 1011 at 09:01, zero at 09:02, 110 steps 1012 at 09:03, 120 steps 1013 at 09:04, 110 steps 1014 at 09:05, zero at 09:06, 3 steps 1015 at 09:07, and zero from 09:08 to 09:10.
  • the wearable device 1020 may store the number of steps in a unit of ten minutes, so as to acquire a total of 200 steps 1027 for ten minutes. For example, since the wearable device 1020 stores the number of steps in units of 10 minutes, the wearable device 1020 may acquire, as the number of steps, 200 steps 1025 from 09:00 to 09:10.
  • the application 1030 installed in the electronic device 1010 may store the number of steps in a unit of 5 minutes, and acquire a total of 50 steps 1037 for 10 minutes. Since the application 1030 stores the number of steps in a unit of 5 minutes, the application 1030 may acquire, as the number of steps, 50 steps 1031 from 09:00 to 09:05, and zero from 09:06 to 09:10.
  • since the processor 120 may not accurately know the time points at which the steps occurred, the total number of steps may be divided by 10 to obtain an average value per minute.
  • Reference numeral '1050' indicates the classification of the number of steps measured by each device in a unit of one minute for a predetermined time (e.g., from 09:00 to 09:10). Since the wearable device 1020 stores the number of steps in a unit of 10 minutes, the number of steps in a unit of one minute may be calculated by dividing the total number of steps 1027 by 10. The number of steps in a unit of one minute of the wearable device 1020 may be 20 steps 1021. That is, the processor 120 may acquire the number of steps in a unit of one minute of the wearable device 1020, as 20 steps 1021 at 09:01, 20 steps 1022 at 09:02, and 20 steps at each minute from 09:03 to 09:10.
  • for the application 1030, the number of steps in a unit of one minute may be calculated by dividing the 50 steps 1031 measured in the first five minutes (e.g., from 09:00 to 09:05) by five, and dividing the 0 steps measured in the remaining five minutes (e.g., from 09:06 to 09:10) by five.
  • the number of steps in a unit of one minute of the application 1030 may be 10 steps 1035 from 09:00 to 09:05, and may be zero 1036 from 09:06 to 09:10.
  • the processor 120 may acquire 10 steps 1035 from 09:00 to 09:05, and zero 1036 from 09:06 to 09:10, so as to acquire the number of steps of the application 1030 in a unit of one minute.
  • the processor 120 may determine the maximum number of steps among the number of steps per unit time as the number of steps per unit time.
  • the processor 120 may perform the integration 1060 of the number of steps of a plurality of devices by applying Equation 1. For example, at 09:01, the number of steps of the electronic device 1010 is 100 steps 1011, the number of steps of the wearable device 1020 is 20 steps 1021, and the number of steps of the application 1030 is 10 steps 1035; therefore, the maximum number of steps may be 100 steps 1011, which is the number of steps of the electronic device 1010. In this case, the processor 120 may determine the number of steps at the unit time of 09:01 as 100 steps 1061, which is the maximum number of steps.
  • at 09:02, the maximum number of steps may be 20 steps 1022, which is the number of steps of the wearable device 1020.
  • the processor 120 may determine the number of steps at the unit time of 09:02 as 20 steps 1062, which is the maximum number of steps.
  • the processor 120 may calculate the maximum number of steps for each minute from 09:03 to 09:10 as described above, and calculate, as the integrated number of steps, 550 steps 1070, which is the sum of all the calculated maximum numbers of steps.
  • 550 steps 1070, which is the integrated number of steps, is greater than the numbers of steps (e.g., 443 steps 1016, 200 steps 1027, and 50 steps 1037) measured by the individual sources.
  • accordingly, degradation of the user experience caused by a perceived reduction in the number of steps may be avoided.
  • the processor 120 may calculate, as a value (e.g., the integrated number of steps) for a first time period, a value (e.g., 550 steps 1070) which is smaller than the sum of a first value (e.g., 443 steps 1016) based on the first data (e.g., the number of steps measured by the electronic device 1010) acquired for the first time period (e.g., from 09:00 to 09:10) and a second value (e.g., 200 steps 1027) based on the second data (e.g., the number of steps measured by the wearable device 1020).
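  • the max-based integration of FIG. 10 can be followed, in a non-limiting way, with the short Python sketch below; the per-minute values come from the description above, the remaining names are illustrative, and the resulting total depends on how the boundary minutes of the 09:00 to 09:10 window are counted.

```python
# Illustrative sketch of the integration 1060: every source is first expressed
# in one-minute counts, the per-minute maximum is taken across sources
# (Equation 1), and the maxima are summed into the integrated number of steps.
phone    = [100, 0, 110, 120, 110, 0, 3, 0, 0, 0]  # electronic device 1010 (443 steps total)
wearable = [20] * 10                                # wearable device 1020 (200 steps / 10 min)
app      = [10] * 5 + [0] * 5                       # application 1030 (50 steps in first 5 min)

combined = [max(per_minute) for per_minute in zip(phone, wearable, app)]
integrated = sum(combined)
# The integrated value is larger than any single source (443, 200, 50) but
# smaller than their sum, as described for the first time period.
print(combined, integrated)
```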
  • FIG. 11 illustrates a method for integrating activity information by an electronic device according to various embodiments.
  • FIG. 11 may be a drawing that embodies the data integration operation 805 of FIG. 8. That is, FIG. 11 illustrates the operation of integrating the activity information on the second activity type.
  • the electronic device 101 (e.g., the processor 120) may determine a workout type based on the result of data analysis.
  • the workout type may be obtained by classifying various workouts, such as walking, running, cycling, swimming, yoga, and the like.
  • the processor 120 may determine, on the basis of the result of data analysis, that the workout type is 'running' when the number of steps measured for one minute is equal to or greater than the first predetermined number of steps (e.g., 150 steps), and determine that the workout type is 'walking' when the number of steps measured for one minute is less than the first predetermined number of steps.
  • the processor 120 may compare the determined sum of the walking with the sum of the running, and determine a workout, which has a larger value, as the workout type. For example, if the sum of the walking is 1000 and the sum of the running is 400, the processor 120 may determine the type of workout as 'walking'.
  • the first predetermined number of steps may be configured by the user or configured to the electronic device 101 by default.
  • the processor 120 may calculate a movement speed based on a change in location information for 10 minutes when the number of steps taken during a predetermined period of time is equal to or greater than the second predetermined number of steps (e.g., 1000 steps).
  • the processor 120 may identify that the moving speed corresponds to cycling, and determine the workout type as 'cycling'.
  • the processor 120 may identify that the moving speed corresponds to running, and determine the workout type as 'running'.
  • the predetermined speed may be configured by the user or configured to the electronic device 101 by default.
  • the electronic device 101 may identify the start and end of the workout type.
  • the processor 120 determines a workout type using data detected from a plurality of devices, but the start time and end time of the workout type may be different for each device.
  • the processor 120 may check the start time and the end time of the workout type measured by each device. For example, the start time of the cycling, measured by the wearable device 1020, is 09:30, and the start time autonomously measured by the electronic device 101 may be 09:20. Alternatively, the end time of the cycling, measured by the wearable device 1020, is 10:30, and the end time autonomously measured by the electronic device 101 may be 10:20.
  • the electronic device 101 may check the priority.
  • the priority may be at least one of time, an activity type, a workout type, and a device.
  • the priority may be configured by the user or configured to the electronic device 101 by a default value, and then stored in the memory 130.
  • the processor 120 may check whether the priority associated with the activity information is stored in the memory 130.
  • the electronic device 101 may integrate activity information for each workout type based on the priority. For example, when giving a higher priority to time, the processor 120 may integrate activity information based on the start time and the end time of the workout type that has occurred for the very first time. Alternatively, when giving a higher priority to the workout type, the processor 120 may integrate activity information based on the start time and the end time of the workout type that has a higher priority. For example, when the priority is higher in the sequence of cycling, walking, running, and swimming, and the workout times of cycling and walking are partially overlapped, the processor 120 may integrate the activity information of the cycling based on the start time and the end time of cycling. The processor 120 may determine the workout time of walking so as not to overlap with the workout time of cycling, and integrate activity information of the walking.
  • the processor 120 may integrate the activity information based on the workout time of the workout type measured by the external device, and integrate the activity information using the workout time of the workout type measured by the electronic device 101.
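  • as a non-limiting sketch of how the same workout recorded by several devices may be merged into one continuous session, the following Python snippet keeps the earliest start and the latest end of the records (a device priority, as described above, could instead select one record's times); the structures and names are illustrative assumptions.

```python
# Illustrative sketch: merge records of the same workout type, collected from
# different devices, into one continuous session spanning the earliest start
# and the latest end (cf. swimming 1 and swimming 2 in FIG. 12).
def merge_same_workout(records):
    return {"start": min(r["start"] for r in records),
            "end":   max(r["end"]   for r in records)}

swim_phone    = {"start": "10:05", "end": "10:40"}  # measured by the electronic device
swim_wearable = {"start": "10:10", "end": "10:45"}  # measured by the wearable device
print(merge_same_workout([swim_phone, swim_wearable]))
# {'start': '10:05', 'end': '10:45'}
```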
  • FIG. 12 illustrates an example of integrating activity information according to various embodiments.
  • the processor 120 may integrate activity information based on the start time of the workout type. For example, when it is recognized that the cycling 1211 has started first, as a result of analysis of the data measured by the application 1210, the processor 120 may integrate the activity information 1241 on the cycling based on the start time t1 of the cycling 1211.
  • the application 1210 may refer to an application installed in the electronic device 1220.
  • the processor 120 may check the end time t3 of the cycling 1211 and determine whether the end time t3 of the cycling 1211 overlaps with other activity information.
  • the processor 120 may integrate the second activity information based on the end time of the first activity information. For example, it can be seen that the end time t3 of the cycling 1211 overlaps the start time t2 of the swimming 1 1221 measured by the electronic device 1220. In this case, the processor 120 may integrate the activity information 1241 on the cycling until the end time t3 of the cycling 1211, and integrate the activity information on the swimming after the end time t3 of the cycling 1211.
  • a user may swim after configuring the swimming start time on the electronic device 1220 before beginning swimming.
  • a user may configure an expected swimming time (e.g., 30 minutes, one hour, etc.) to the electronic device 1220.
  • the swimming time actually taken by a user may be different from the swimming workout time measured by the electronic device 1220.
  • a user may swim by attaching the electronic device 1220 to the body, and in general the user may swim by wearing the wearable device 1230.
  • the start time t2 of the swimming 1 1221 measured by the electronic device 1220 may be different from the time t4 measured for the swimming 2 1231 measured by the wearable device 1230.
  • the end time t5 of the swimming 1 1221 measured by the electronic device 1220 may be different from the end time t6 measured for the swimming 2 1231 measured by the wearable device 1230.
  • the processor 120 may integrate the activity information 1242 on the swimming based on data on the swimming 1 1221 measured by the electronic device 1220 and data on the swimming 2 1231 measured by the wearable device 1230.
  • the processor 120 may integrate the activity information 1242 on the swimming based on the workout time t2 to t5 of the swimming 1 1221, measured by the electronic device 1220, after the end time t3 of the cycling 1211, and the workout time t4 to t6 of the swimming 2 1231 measured by the wearable device 1230.
  • activity information acquired from different devices is processed as one continuous activity, so that the user may intuitively identify the activity information.
  • when integrating activity information, the activity information may be integrated by assigning different priorities to each device. Unlike the cycling, it can be seen that two devices have measured the swimming.
  • the processor 120 may assign different priorities to each device, so as to integrate the activity information. For example, in FIG. 12, the wearable device 1230 may have a higher priority than the electronic device 1220.
  • the processor 120 may assign a higher priority to the workout time t4 to t6 of the swimming 2 1231 measured by the wearable device 1230 than the workout time t2 to t5 of the swimming 1 1221 measured by the electronic device 1220, so as to integrate the activity information 1242 on the swimming.
  • FIG. 13 illustrates a method for integrating non-activity information by an electronic device according to various embodiments.
  • FIG. 13 may be a drawing that embodies the data integration operation 805 of FIG. 8. That is, FIG. 13 illustrates the operation of integrating non-activity information on a third activity type.
  • the inactive intervals may mean a stationary state such as sleeping, sitting, and the like without walking or activity (e.g., cycling, swimming, etc.). Alternatively, the inactive intervals may mean that there is no detection.
  • the processor 120 may align the inactive intervals for each time.
  • the electronic device 101 may determine whether activity information is included between inactive intervals. For example, a number of steps may be taken by a user during a break. In this case, since the steps are taken when the user moves, such as going to the restroom in the middle of a rest, the activity information may be included between inactive intervals.
  • when the activity information is included between the inactive intervals, operation 1309 may be performed, and when the activity information is not included therebetween, operation 1305 may be performed.
  • the electronic device 101 may integrate the inactive intervals into one session.
  • the inactive interval may be integrated through a distance based clustering. For example, when the inactive interval is continuously displayed, the processor 120 may integrate the inactive intervals that have been continuously detected into one session.
  • the electronic device 101 may process the integrated session as non-activity information.
  • the processor 120 may store the processed non-activity information in the memory 130.
  • the electronic device 101 may determine whether activity information included between inactive intervals exceeds the reference activity information.
  • the reference activity information may be configured by the user or configured to the electronic device 101 by a default value, and then stored in the memory 130.
  • the reference activity information may be used for processing a small movement between inactive intervals as an error, for example, the reference activity information may be 10 steps or less, 5 minutes or less, and so on.
  • the processor 120 may return to the operation 1305 when the activity information included between the inactive intervals is less than or equal to the reference activity information.
  • when the activity information included between the inactive intervals exceeds the reference activity information, the processor 120 may perform operation 1311, and when the activity information included between the inactive intervals is less than or equal to the reference activity information, the processor 120 may perform operation 1305.
  • the electronic device 101 may perform an activity information integration process when the activity information included between inactive intervals exceeds the reference activity information.
  • the activity information integration process may be an operation described in FIG. 11.
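  • the decision flow of FIG. 13 can be sketched, in a non-limiting way, as follows: two adjacent inactive intervals are merged into one session when the activity detected between them does not exceed the reference activity information (e.g., 10 steps), and otherwise the activity-integration process of FIG. 11 is used; the thresholds and structures below are illustrative assumptions.

```python
# Illustrative sketch of FIG. 13: small movement between inactive intervals is
# treated as an error and absorbed into a single non-activity session.
REFERENCE_STEPS = 10

def merge_inactive(first, second, steps_between):
    if steps_between <= REFERENCE_STEPS:
        return {"type": "non-activity", "start": first["start"], "end": second["end"]}
    return None  # exceeds the reference: run the activity integration instead

sleep   = {"start": 0,  "end": 30}   # first inactive interval (minutes)
sitting = {"start": 40, "end": 55}   # second inactive interval
print(merge_inactive(sleep, sitting, steps_between=4))
# {'type': 'non-activity', 'start': 0, 'end': 55}
```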
  • FIGS. 14A and 14B illustrate an example of integrating non-activity information according to various embodiments.
  • the inactive intervals may be aligned in a time sequence.
  • when there is no data detection, the processor 120 may process the case as an 'unknown interval', such as an object 1411 and an object 1412.
  • the unknown interval may refer to an interval in which there is no data detection in an external device, the electronic device 101, or an application, which measure health related data.
  • the processor 120 may analyze data collected in one or more devices, and when there is no measured data, as a result of analysis, the processor 120 may process the case as an 'unknown interval'.
  • the processor 120 may analyze the data measured by the electronic device 101 as sleep 1420.
  • the processor 120 may analyze the data measured by the external device as sitting 1430.
  • the processor 120 may recognize the sleep 1420 measured by the electronic device 101 as an inactive interval (e.g., first inactive interval), and recognize the sitting 1430 measured by the external device as an inactive interval (e.g., second inactive interval). When there is no data detection between data measured by the electronic device 101 and the data measured by the external device, the processor 120 may process the case as an unknown interval 1413.
  • the processor 120 may integrate inactive intervals, as shown in FIG. 14B. For example, the processor 120 may determine that an unknown interval 1413 between the first inactive interval (e.g., sleep 1420) and the second inactive interval (e.g., sitting 1430) forms continuous inactive intervals. The processor 120 may use a minimum distance of 10 minutes and determine a total of 55 minutes as inactivity information. The processor 120 may integrate the interval from the first inactive interval (e.g., sleep 1420) to the second inactive interval (e.g., sitting 1430) into one session. The processor 120 may process the integrated session, which includes the first inactive interval (e.g., sleep 1420), the unknown interval 1413, and the second inactive interval (e.g., sitting 1430), as inactivity information (stationary) 1440.
  • FIG. 15 illustrates an example of integrating various activity type data according to various embodiments.
  • the processor 120 may integrate number of steps 1510, inactivity information 1520, and activity information 1530 according to each activity type, as shown in FIG. 9 to FIG. 14B (indicated by reference numeral 1540), and correct the integrated data by considering the priority (or weight) of the activity type.
  • the processor 120 may integrate the number of steps 1510, inactivity information 1520, and activity information 1530 by considering the priority (or weight) of the activity type.
  • the processor 120 may integrate the number of steps 1 (Step 1, 1511) from t1 to t3 (e.g., 9 minutes), integrate the number of steps 2 (Step 2, 1512) from t5 to t8 (e.g., 9 minutes), and integrate the number of steps 3 (Step 3, 1513) from t9 to t11 (e.g., 30 minutes).
  • the processor 120 may integrate inactivity information 1521 from t6 to t10.
  • the processor 120 may integrate activity information on the cycling 1531 from t2 to t4, and integrate activity information on the swimming 1532 from t4 to t7.
  • the priority may be higher in the order of the activity information 1530, the inactivity information 1520, and the number of steps 1510.
  • the processor 120 may configure a higher priority to the activity information 1530 because it may be determined that the activity information 1530 is more meaningful to a user than the number of steps 1510 or the inactivity information 1520. For reference, since the number of steps 1510 has a high error recognition rate, the processor 120 may configure the lowest priority to the number of steps 1510.
  • the processor 120 may correct the number of steps 1 1511 based on the activity information on the cycling 1531. For example, when the start time t2 of the cycling 1531 falls between the start time t1 and the end time t3 of the number of steps 1 1511, that is, when they overlap each other, the processor 120 may correct the end time t3 of the number of steps 1 1511 to be the start time t2 of the cycling 1531. That is, the corrected number of steps (S) 1541 may be corrected so as to be generated during the time from t1 to t2. In addition, activity information 1542 on the cycling 1531 may be corrected so as to be generated during the time from t2 to t4.
  • the processor 120 may correct the workout time based on at least one of a time, a workout type, and a device. For example, the processor 120 may correct the activity information 1543 on the swimming 1532 after the correction of the activity information 1542 on the cycling 1531, which started first in time, has been completed.
  • the processor 120 may place a priority on the swimming 1532 based on the priority of the activity type, and correct the number of steps 2 1512 and the inactive interval 1521 after the end time t7 of the swimming 1532. Accordingly, activity information 1543 on the swimming 1532 may be corrected so as to be generated during the time from t4 to t7.
  • the processor 120 may correct the number of steps 2 1512 based on the start time t6 of the inactive interval 1521 when the number of steps 2 1512, which starts at t5, and the inactive interval 1521, which starts at t6, overlap each other.
  • the number of steps 2 1512 may not be included in the integrated data. That is, when the number of steps 2 1512 is temporarily generated during the inactive interval 1521, the processor 120 may process the number of steps 2 1512 as an error.
  • the processor 120 may determine whether the number of steps 2 1512 that has been generated from the start time of the inactive interval 1521 exceeds the reference activity information.
  • the reference activity information may be used for processing a small movement between inactive intervals as an error, for example, the reference activity information may be 10 steps or less, 5 minutes or less, and so on.
  • when the number of steps 2 1512 does not exceed the reference activity information, the processor 120 may process the number of steps 2 1512 as an error; that is, the processor 120 may ignore the number of steps 2 1512 and not include it in the integrated data.
  • the processor 120 may correct the number of steps 3 1513 based on the end time t10 of the inactive interval 1521. In this case, the processor 120 may determine whether the number of steps 3 1513 that has been generated from the start time t9 of the number of steps 3 1513 to the end time t10 of the inactive interval 1521 exceeds the reference activity information. The processor 120 may process a part of the number of steps 3 1513 as an error when the number of steps 3 1513 that has been generated until the end time t10 of the inactive interval 1521 is less than or equal to the reference activity information. Therefore, the inactivity information 1544 may be corrected so as to be generated during the time from t7 to t10. In this case, the corrected number of steps 1545 may be corrected so as to be generated during the time from t10 to t11.
  • the processor 120 may store the corrected data 1540 in the memory 130.
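  • For illustration only, the following minimal sketch outlines the priority-based correction described above: each record is modeled as a simple (start, end) interval in minutes, activity and inactivity intervals outrank step intervals, and a trimmed step interval whose remaining duration does not exceed an assumed reference threshold is discarded as an error. The function and constant names (trim_interval, correct_steps, REFERENCE_MINUTES) are hypothetical.

    # Sketch under the stated assumptions: trims lower-priority step intervals
    # against higher-priority activity/inactivity intervals (FIG. 15 style).
    REFERENCE_MINUTES = 5   # assumed stand-in for the "reference activity information"

    def trim_interval(step, higher):
        """Remove the overlap of a (start, end) step interval with a higher-priority
        interval; returns the surviving part (earlier part kept), or None."""
        s, e = step
        hs, he = higher
        if e <= hs or s >= he:
            return (s, e)          # no overlap
        if s < hs:
            return (s, hs)         # step started first: end it at the activity start (t3 -> t2)
        if e > he:
            return (he, e)         # step continues after the higher-priority interval ends
        return None                # fully covered by the higher-priority interval

    def correct_steps(step_intervals, higher_intervals):
        """higher_intervals: activity and inactivity intervals, which outrank steps."""
        corrected = []
        for step in step_intervals:
            for higher in higher_intervals:
                step = trim_interval(step, higher)
                if step is None:
                    break
            # residual movement at or below the reference threshold is treated as an error
            if step is not None and step[1] - step[0] > REFERENCE_MINUTES:
                corrected.append(step)
        return corrected

    # Loosely following FIG. 15 (times in minutes):
    # a step interval 0..12 overlapping cycling 9..20 is corrected to 0..9, and
    # a step interval 30..60 overlapping an inactive interval 25..40 is corrected to 40..60.
    print(correct_steps([(0, 12), (30, 60)], [(9, 20), (25, 40)]))   # [(0, 9), (40, 60)]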
  • FIG. 16 illustrates an example of a user interface for activity information according to various embodiments.
  • a user interface of FIG. 16 may be a user interface provided in operation 807 of FIG. 8.
  • the processor 120 may display a first user interface 1610 based on the corrected data.
  • the first user interface 1610 may include a map image 1611 displaying activity areas for walking and other activity information, integrated information 1612 on consumed calories, distance moved, and the longest period of active time, and activity information on running 1613, walking 1 1614, and walking 2 1615 in time sequence.
  • the processor 120 may include a tag (e.g., auto) in the activity information on the running 1613, the walking 1 1614, and the walking 2 1615.
  • the processor 120 may shade the item color for the corrected data.
  • the processor 120 may differently display the item colors of the corrected data and the uncorrected data.
  • the first user interface 1610 may display at least one of an image, text, or a symbol associated with the corrected data overlaid on a map associated with locations that have been occupied for a predetermined time period.
  • the first user interface 1610 may display, on the map, data associated with the activity information, as icons, text, and the like.
  • the processor 120 may display a second user interface 1620 based on the corrected data.
  • the processor 120 may display the second user interface 1620.
  • the processor 120 may display the second user interface 1620 that includes brief information 1621 indicating the time and distance for the running, a map image 1622 indicating the running area on the map, and a graph 1623 indicating the running speed.
  • the processor 120 may display a third user interface 1630 based on the corrected data.
  • the processor 120 may display the third user interface 1630.
  • the processor 120 may display the third user interface 1630 that includes a duration 1631 and distance 1632 for the running, an amount of calories consumed by the running 1633, speed information 1634, pace information 1635, weather information 1636, and a photo button 1637.
  • the speed information 1634 may include an average speed and a maximum speed according to the time of the running.
  • the pace information 1635 may include an average stride (pace) and a maximum pace according to the distance of the running.
  • the weather information 1636 may include at least one of a weather icon, temperature, a weather type (e.g., clear, cloudy, rain, etc.), humidity, and wind direction.
  • FIG. 17 illustrates an example of a user interface for sharing activity information according to various embodiments.
  • the processor 120 may display a first user interface 1710 for sharing the activity information.
  • when a health-related application is selected by a user input, the processor 120 may display the first user interface 1710 through the selected application.
  • the health-related application may be installed in the electronic device 101 by default, even without a user request. Alternatively, the health-related application may be installed in the electronic device 101 at the request of the user.
  • the first user interface 1710 may be the first screen after executing the application, or may be provided when the user selects 'share' in order to share activity information. For example, when activity information on the walking 1 1614 is selected through the first user interface 1610 of FIG. 16, the processor 120 may display a first user interface 1710.
  • the first user interface 1710 may include an image 1711, an image add button 1712, a photo button 1713, a chart view button 1714, and a share button 1715.
  • the image 1711 may have been photographed by a user or already registered in an application.
  • a gallery application including the photographed photos may be executed.
  • various pop-up menus such as (1) camera execution, (2) gallery execution, and (3) cancellation may be displayed.
  • when camera execution is selected from the pop-up menu, a camera application may be executed, and a preview image captured by the camera may be displayed on the screen of the display 160.
  • the chart view button 1714 may display detailed information including a map image, a moved distance, a speed, and the like for the activity information displayed on the image 1711.
  • when the share button 1715 is selected, a menu for sharing the activity information displayed on the image 1711 with other users may be displayed.
  • the menu for sharing may be a list of applications (e.g., message, e-mail, etc.) for sharing or a list of partner information (e.g., name, phone number, etc.) included in the contact list.
  • the processor 120 may display a second user interface 1720 for sharing the activity information.
  • the second user interface 1720 may include a map image 1721, activity information 1722, a menu list 1723 such as photo, rewards, map view, and chart view, and a share button 1724.
  • the map image 1721 may be a map that designates activity information on the running 1613 as an activity area on the map.
  • the activity information 1722 may include the distance and time for the running 1613.
  • when an item is selected from the menu list 1723, a screen for the corresponding item may be displayed.
  • when the share button 1724 is selected, a menu for sharing the activity information displayed on the map image 1721 with other users may be displayed, similarly to the share button 1715.
  • FIG. 18 illustrates an example of a user interface for configuring a recognition priority of a wearable device according to various embodiments.
  • the processor 120 may display a first user interface 1810 for configuring a recognition priority.
  • the first user interface 1810 may include a recognition on/off item 1811 for configuring such that the currently connected device is preferentially recognized, a connection guide message 1812 displayed when the on/off item 1811 is turned on, and a location on/off item 1813 for configuring location information on the currently connected device.
  • the processor 120 may collect location information on the currently connected device.
  • the processor 120 may display a second user interface 1820 for configuring a recognition priority.
  • the second user interface 1820 may include a recognition on/off item 1821 and a location on/off item 1822 for one of the currently unconnected devices (e.g., the first device) or for a device (e.g., the first device) selected by a user from a list of devices.
  • the second user interface 1820 may be configured such that the recognition on/off item 1821 is ON and the location on/off item 1822 is OFF. In order to protect the personal information of a user, the location on/off item 1822 may generally be turned OFF.
  • the processor 120 may display a third user interface 1830.
  • the third user interface 1830 may be configured such that a recognition on/off item 1831 is OFF and a location on/off item 1822 is OFF.
  • the processor 120 may display the third user interface 1830.
  • the user input for changing the ON state to the OFF state may be a drag input that touches ON and then moves the touched location toward OFF.
  • FIG. 19 illustrates an example of a user interface for configuring location information according to various embodiments.
  • the processor 120 may display a first user interface 1910.
  • the first user interface 1910 may be similar to the second user interface 1820 of FIG. 18.
  • the first user interface 1910 may be configured such that a recognition on/off item 1911 is ON and a location on/off item 1912 is OFF.
  • the processor 120 may display a second user interface 1920.
  • the second user interface 1920 may be the first user interface 1910 with a pop-up message 1921 displayed thereon.
  • the pop-up message 1921 may include a notification message prompting entry into the setting menu of the electronic device 101 in order to change the setting for providing the location information, a location setting 1922, a cancel button 1923, and a setting button 1924.
  • the location setting 1922 may be provided in a form of a check box. When the check box is checked (e.g., selected), the location setting 1922 is activated, and when the check box is not checked (e.g., not selected), the location setting 1922 may be deactivated.
  • the processor 120 may display a third user interface 1930.
  • the third user interface 1930 may be a screen used for the permission settings of the application of the electronic device 101.
  • the third user interface 1930 may display various items (e.g., contacts, locations, phones, etc.) associated with the permission setting of the application.
  • the location setting may initially be in an OFF state in order to protect the personal information of the user. That is, the location setting item 1931 of the third user interface 1930 may be in an OFF state.
  • the processor 120 may provide a fourth user interface 1940.
  • the processor 120 may display the fourth user interface 1940.
  • the location setting item 1941 in the fourth user interface 1940 may be in an ON state.
  • the processor 120 may provide a second user interface 1950.
  • the second user interface 1950 may be the first user interface 1910 in which the location on/off item 1951 is configured to ON.
  • the processor 120 may collect location information on a device associated with the second user interface 1950.
  • FIG. 20 illustrates a method for displaying a user interface by an electronic device according to various embodiments.
  • the electronic device 101 may detect an event for displaying activity information.
  • the event may be at least one of an application execution event displaying health related data, an event for selecting a map image through the application, and an event for selecting activity information through the application.
  • the processor 120 may provide a user interface or user experience associated with the activity information in response to the detected event. For example, the processor 120 may display the user interface (e.g., 1610 and 1620) of FIG. 16 in response to the detected event.
  • the electronic device 101 may determine whether the location information for the activity information exists.
  • when the location information exists, the processor 120 may display an activity area for the activity information based on the location information.
  • when the location information does not exist, the processor 120 may not display an activity area associated with the activity information on the map.
  • the processor 120 may perform an operation 2005 when the location information on the activity information exists, and perform an operation 2015 when the location information on the activity information does not exist.
  • the electronic device 101 may calculate the activity area associated with the activity information.
  • location information may or may not be provided for each device.
  • the sampling rate for the location information may be different for each device.
  • the processor 120 may represent the activity area in a circular form in order to provide a user interface. For example, when one piece of location information exists for one piece of activity information, the processor 120 may calculate an activity area having a predetermined radius with reference to the location information. The predetermined radius may be configured by the user or configured in the electronic device 101 by default. When two pieces of location information exist for one piece of activity information, the processor 120 may calculate the center point using the two end points.
  • the processor 120 may calculate the center point using the average method as shown in Equation 2: Center(a, b) = ((a_x + b_x) / 2, (a_y + b_y) / 2).
  • Center(a, b) may be the coordinates of the center point, (a_x, a_y) may be the coordinates of the first end point, and (b_x, b_y) may be the coordinates of the second end point.
  • the processor 120 may calculate an activity area for activity information extracted from each device.
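  • For illustration only, a minimal sketch of the activity area calculation: a single location point uses an assumed predetermined radius, and two points use the averaged center point of Equation 2. Deriving the radius as half the distance between the two end points is an added assumption, since only the center calculation is specified above.

    import math

    DEFAULT_RADIUS = 50.0   # assumed predetermined radius (e.g., meters) for a single point

    def activity_area(points, default_radius=DEFAULT_RADIUS):
        """Return (center_x, center_y, radius) for one or two location points."""
        if len(points) == 1:
            (x, y), = points
            return (x, y, default_radius)               # circle around the single point
        (ax, ay), (bx, by) = points[0], points[-1]      # two end points
        cx, cy = (ax + bx) / 2.0, (ay + by) / 2.0       # Equation 2: average of the end points
        radius = math.dist((ax, ay), (bx, by)) / 2.0    # assumed: half the end-point distance
        return (cx, cy, radius)

    print(activity_area([(0.0, 0.0)]))                  # one point -> default radius
    print(activity_area([(0.0, 0.0), (4.0, 3.0)]))      # center (2.0, 1.5), radius 2.5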
  • the electronic device 101 may determine whether there are multiple activity areas.
  • the processor 120 may provide activity information detected by one or more devices, and when there are multiple activity areas, the activity areas may overlap with each other.
  • when there are multiple activity areas, the processor 120 may perform operation 2009, and when there are not, the processor 120 may perform operation 2015.
  • the electronic device 101 may calculate the distance between activity areas adjacent to each other.
  • the processor 120 may calculate the distance between the activity areas using the distance between center points of activity areas.
  • the processor 120 may calculate the distance between two activity areas as shown in Equation 3: dist(a, b) = √((a_x − b_x)² + (a_y − b_y)²).
  • dist(a, b) may be the distance between the two activity areas, (a_x, a_y) may be the center coordinates of the first activity area, and (b_x, b_y) may be the center coordinates of the second activity area.
  • the processor 120 may calculate the distance between two activity areas using the Euclidean distance of Equation 3, and determine whether the two activity areas overlap with each other, using the calculated distance.
  • the electronic device 101 may determine whether the calculated distance is less than a reference distance.
  • the reference distance may be configured by the user or the electronic device 101.
  • the processor 120 may determine whether two activity areas are overlapped with each other as shown in Equation 4.
  • dist(a, b) may be the distance between the two activity areas, α may be the reference distance, radius of icon 1 may be the radius of the icon (or activity area) displayed in the first activity area, and radius of icon 2 may be the radius of the icon (or activity area) displayed in the second activity area.
  • when the calculated distance is less than the reference distance, the processor 120 may perform an overlap control process between the two activity areas.
  • the reference distance (e.g., alpha (α)) may determine how aggressively the overlap control process is applied. When the reference distance is equal to or greater than 1, the overlap control process may be performed even though the two activity areas do not actually overlap with each other, and when the reference distance is less than 1, the overlap control process may be performed only when the two activity areas overlap by a predetermined size or more.
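  • For illustration only, a minimal sketch of the distance and overlap test: Equation 3 is the Euclidean distance between center points, and, since the exact form of Equation 4 is not reproduced above, the sketch assumes a common form in which the overlap control process is triggered when the center distance is smaller than the reference distance alpha scaled by the sum of the icon radii.

    import math

    def dist(a, b):
        """Equation 3: Euclidean distance between the center points of two activity areas."""
        return math.hypot(a[0] - b[0], a[1] - b[1])

    def overlap_condition(center1, radius1, center2, radius2, alpha=1.0):
        """Assumed form of Equation 4: trigger overlap control when the centers are
        closer than alpha times the sum of the icon radii."""
        return dist(center1, center2) < alpha * (radius1 + radius2)

    # alpha >= 1: the process may trigger even when the circles merely come close;
    # alpha < 1 : the process triggers only when the circles overlap by a certain amount.
    print(overlap_condition((0, 0), 1.0, (1.8, 0), 1.0, alpha=1.0))   # True  (1.8 < 2.0)
    print(overlap_condition((0, 0), 1.0, (1.8, 0), 1.0, alpha=0.8))   # False (1.8 >= 1.6)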
  • the processor 120 may configure the reference distance based on a map ratio of a map image where the active area is to be displayed, and based on the size of the activity area. For example, when the map ratio is increased or decreased, the size of the activity area may be increased or decreased. Alternatively, when the map ratio is increased or decreased, the size of the activity area may be decreased or increased. The map ratio and the size of the activity area may be proportional or inversely proportional to each other. According to various embodiments, when the size of the activity area is changed according to the map ratio, the processor 120 may adjust the reference distance according to the map ratio and the size of the activity area.
  • the processor 120 may perform an operation 2013 when the calculated distance is less than the reference distance, and may perform an operation 2015 when the calculated distance is equal to or greater than the reference distance.
  • the electronic device 101 may correct an icon within the activity area.
  • the processor 120 may assign priorities to the activity areas in order to prevent, as much as possible, activity areas for activity information extracted from the plurality of devices from overlapping with each other, thereby reducing the overlapping of the activity areas.
  • the priority may be configured based on at least one of a radius of the activity area, a workout type, and a device.
  • the priority may be configured by the user or the electronic device 101.
  • a device whose data can be processed in duplicate and a device whose data cannot be processed in duplicate may be distinguished and displayed according to the priority. For example, the device whose data cannot be processed in duplicate may be shaded and then provided.
  • the processor 120 may perform the following operations.
  • the processor 120 may configure the priority in descending order of the radius of the activity area, maintain icons in an activity area having a higher priority, and delete icons in an activity area having a lower priority.
  • the processor 120 may maintain the icon in the activity area based on a workout type having a higher priority, and delete the icon in the activity area of the workout type having a lower priority.
  • the processor 120 may maintain icons in the activity area based on the activity information extracted from a device having a higher priority, and delete icons in the activity area extracted from a device having a lower priority.
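  • For illustration only, a minimal sketch of the pairwise priority resolution: given two overlapping activity areas, the one ranked higher by the chosen criterion (radius, workout type, or device) keeps its icon. The ranking tables are assumptions.

    # Assumed ranking tables; a lower value means a higher priority.
    WORKOUT_RANK = {"cycling": 0, "running": 1, "walking": 2}
    DEVICE_RANK = {"wearable": 0, "phone": 1}

    def keep_and_delete(area_a, area_b, key="radius"):
        """Return (kept, deleted) for two overlapping areas, each a dict with
        'radius', 'workout', and 'device' fields."""
        if key == "radius":
            ordered = sorted([area_a, area_b], key=lambda a: -a["radius"])       # larger radius wins
        elif key == "workout":
            ordered = sorted([area_a, area_b], key=lambda a: WORKOUT_RANK[a["workout"]])
        else:
            ordered = sorted([area_a, area_b], key=lambda a: DEVICE_RANK[a["device"]])
        return ordered[0], ordered[1]

    a = {"radius": 30, "workout": "walking", "device": "phone"}
    b = {"radius": 20, "workout": "cycling", "device": "wearable"}
    print(keep_and_delete(a, b, key="radius")[0]["workout"])    # 'walking' kept (larger radius)
    print(keep_and_delete(a, b, key="workout")[0]["workout"])   # 'cycling' kept (workout priority)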
  • the electronic device 101 may display a user interface for the activity information.
  • the processor 120 may display the user interface for the activity information without correcting icons in the activity area associated with the activity information.
  • the processor 120 may display a user interface for activity information on which the icons have been corrected.
  • the processor 120 may display the user interface (e.g., 1610 and 1620) of FIG. 16 as the user interface for the activity information.
  • FIG. 21 illustrates an example of calculating an activity area according to various embodiments.
  • the processor 120 may calculate an activity area based on location information.
  • the processor 120 may calculate the center point using two end points (e.g., 2111 and 2114).
  • the processor 120 may calculate the activity area 2110 using the calculated center point.
  • when one piece of location information (e.g., 2121) exists, the processor 120 may calculate an activity area 2120 which has a predetermined radius with reference to the location information.
  • the processor 120 may calculate the center point using two pieces of location information (e.g., 2131 and 2132).
  • the processor 120 may calculate the activity area 2130 using the calculated center point.
  • FIG. 22A to FIG. 24B illustrate examples of correcting an icon in an activity area based on a threshold according to various embodiments.
  • FIGS. 22A to 22D show an example of correcting an icon in an activity area when the threshold is small.
  • FIG. 22A shows activity areas (e.g., 2211, 2213) for activity information before the overlapped icons are corrected.
  • a first icon 2212 may be displayed in a first activity area 2211
  • a second icon 2214 may be displayed in a second activity area 2213.
  • the processor 120 may perform an icon overlapping process by applying different overlapping ratios to activity areas as shown in FIG. 22A.
  • FIG. 22B shows an example in which the icon overlapping process is performed by lowering the overlapping ratio.
  • FIG. 22C shows an example in which the icon overlapping process is performed by setting the overlapping ratio to a medium value.
  • FIG. 22D shows an example in which the icon overlapping process is performed by increasing the overlapping ratio (e.g., to a high ratio).
  • referring to FIGS. 22B to 22D, the first icon 2212 included in the first activity area 2211 may be displayed, and the second icon 2214 displayed in the second activity area 2213 may be deleted. That is, when the threshold is small, it can be seen that the second icon 2214 displayed in the second activity area 2213 is deleted even if the overlapping ratio is configured differently.
  • FIGS. 23A to 23D show another example of correcting an icon in an activity area when the threshold is small.
  • FIG. 23A shows activity areas for activity information before the overlapped icons are corrected.
  • a first icon 2312 may be displayed in a first activity area 2311
  • a second icon 2314 may be displayed in a second activity area 2313.
  • Icons may also be displayed in a third activity area 2315 and a fourth activity area 2316, respectively.
  • the processor 120 may perform an icon overlapping process by applying different overlapping ratios to the activity areas as shown in FIG. 23A.
  • FIG. 23B shows an example in which the icon overlapping process is performed by lowering the overlapping ratio. When the overlapping ratio is configured to be low, as shown in FIG. 23B, the first icon 2312 may be displayed in the first activity area 2311, and the second icon 2314 may be displayed in the second activity area 2313, as in FIG. 23A. Icons may also be displayed in the third activity area 2315 and the fourth activity area 2316, respectively.
  • FIG. 23C shows an example in which the icon overlapping process is performed by setting the overlapping ratio to a medium value.
  • FIG. 23D shows an example in which the icon overlapping process is performed by increasing the overlapping ratio (e.g., to a high ratio). In this case, the first icon 2312 displayed in the first activity area 2311 may be deleted, the second icon 2314 may be displayed in the second activity area 2313, and icons may also be displayed in the third activity area 2315 and the fourth activity area 2316, respectively.
  • depending on the threshold and the overlapping ratio, an icon may not be deleted even if the icon overlapping process is performed.
  • FIGS. 24A and 24B illustrate an example of correcting an icon in an activity area when the threshold is high.
  • FIG. 24A shows activity areas for activity information before the overlapped icons are corrected.
  • a first icon 2412 may be displayed in a first activity area 2411, and a second icon 2414 may be displayed in a second activity area 2413.
  • the processor 120 may perform the icon overlapping process by increasing the threshold.
  • in this case, the first icon 2412 may be displayed in the first activity area 2411, and the second icon 2414 displayed in the second activity area 2413 may be deleted.
  • when the threshold is high, however, icons that should not be deleted may be deleted.
  • the processor 120 may weigh the importance of each icon (e.g., the priority of the workout type), the physical quantity of the activity information, and the like, so as to maintain an important icon and delete an unimportant icon.
  • FIG. 25A to FIG. 26C illustrate examples of processing overlapped icons according to various embodiments.
  • FIGS. 25A to 25C illustrate an example of processing an overlapped icon using a greedy set cover.
  • the greedy set cover (Chvatal, V.: A Greedy Heuristic for the Set-Covering Problem. Math. of Oper. Res., Vol. 4, 1979, No. 3, pp. 233–235) may be one of the methods of processing an overlapped icon.
  • the processor 120 may display each icon in a first activity area 2510, a second activity area 2520, a third activity area 2530, a fourth activity area 2540, and a fifth activity area 2550, as shown in FIG. 25A.
  • the processor 120 may display each activity area in a time sequence, as shown in FIG. 25A, if the icons are not prioritized.
  • An icon for each activity area may be configured to 1, 2, 3, 4, and 5 in order.
  • the initial set (∅) may be an empty state, and become {1} and {1, 2} in order, and icons 1 and 2 may satisfy the overlapping condition.
  • the overlapping condition is described in operation 2011 described above; when the center point distance between the first activity area 2510 and the second activity area 2520 including the icons 1 and 2 is smaller than the reference distance, the processor 120 may determine that the overlapping condition is satisfied. Since the icons 1 and 2 satisfy the overlapping condition, the processor 120 may delete the icon 2. That is, the processor 120 may delete the icon included in the second activity area 2520 as shown in FIG. 25B.
  • the processor 120 may configure set {1, 3}, and determine the overlapping condition between the icons 1 and 3. Since the overlapping condition is not satisfied between the icons 1 and 3, the processor 120 may configure a set {1, 3, 4} by successively adding an icon 4. Icons 3 and 4 included in the third activity area 2530 and the fourth activity area 2540, which are two adjacent activity areas, may satisfy the overlapping condition. In this case, the processor 120 may delete the icon included in the fourth activity area 2540 as shown in FIG. 25C.
  • the processor 120 may configure a set {1, 3, 5} by adding an icon 5 in the state of set {1, 3}, and determine the overlapping condition between the icons 3 and 5. Since the icons 3 and 5 do not satisfy the overlapping condition, the processor 120 may terminate the icon overlapping process.
  • in FIG. 25C, which has completed the icon overlapping process, only icons for the first activity area 2510, the third activity area 2530, and the fifth activity area 2550 may be maintained, and the icons for the second activity area 2520 and the fourth activity area 2540 may be deleted.
  • the processor 120 may display a user interface that includes activity information as shown in FIG. 25C, in operation 2015.
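  • For illustration only, a minimal sketch of the greedy processing of FIGS. 25A to 26C: icons are visited in order (time order, or descending priority order as in FIGS. 26A to 26C), and an icon is kept only if it does not satisfy the overlapping condition against any icon already kept. The overlap test reuses the assumed form dist(a, b) < alpha × (radius of icon 1 + radius of icon 2).

    import math

    def greedy_icon_cover(areas, alpha=1.0):
        """areas: list of (center_x, center_y, radius) in visiting order.
        Returns the indices of kept icons; the remaining icons would be deleted."""
        kept = []
        for i, (x, y, r) in enumerate(areas):
            overlapped = any(
                math.hypot(x - areas[j][0], y - areas[j][1]) < alpha * (r + areas[j][2])
                for j in kept
            )
            if not overlapped:
                kept.append(i)
        return kept

    # Time-ordered areas laid out so that area 2 overlaps area 1 and area 4 overlaps
    # area 3 (as in FIGS. 25A to 25C): areas 1, 3, and 5 survive (indices 0, 2, 4).
    time_ordered = [(0, 0, 1), (1.5, 0, 1), (5, 0, 1), (6.2, 0, 1), (10, 0, 1)]
    print(greedy_icon_cover(time_ordered))      # [0, 2, 4]

    # Priority-ordered visiting (e.g., cycling areas first, then walking areas, as in
    # FIGS. 26A to 26C) makes the lower-priority icon the one that is deleted.
    cycling_first = [(0, 0, 1), (6.2, 0, 1), (5, 0, 1), (1.5, 0, 1)]
    print(greedy_icon_cover(cycling_first))     # [0, 1] -> only the cycling icons remain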
  • the processor 120 may assign a priority to at least one of a radius of an activity area, a workout type, and a device.
  • FIGS. 26A to 26C illustrate an example of processing overlapped icons by placing a priority on workout types.
  • the processor 120 may display each icon in a first activity area 2611, a second activity area 2612, a third activity area 2613, and a fourth activity area 2614, as shown in FIG. 26A.
  • the first activity area 2611 and the fourth activity area 2614 may correspond to the activity information when the workout type is 'cycling'
  • the second activity area 2612 and the third activity area 2613 may correspond to the activity information when the type of workout is 'walking'.
  • the processor 120 may configure the set in descending order of priority and apply the greedy set cover to it.
  • An icon for each activity area may be configured to 1, 2, 3, and 4 in order.
  • the initial set (∅) may be an empty state, become {1} and {1, 4} in descending order of priority, and icons 1 and 4 may not satisfy the overlapping condition. That is, since the icons of the first activity area 2611 and the fourth activity area 2614 do not overlap each other, the processor 120 may successively add an icon 2 to configure a set {1, 4, 2}. Icons 4 and 2 included in the fourth activity area 2614 and the second activity area 2612, which are two adjacent activity areas, may satisfy the overlapping condition. In this case, the processor 120 may delete the icon included in the second activity area 2612 as shown in FIG. 26B. The icon 2 is deleted based on the priority, while the icon 4 is retained.
  • the processor 120 may configure a set {1, 4, 3} by adding the icon 3 in the state of set {1, 4}, and determine the overlapping condition between the icons 4 and 3. Since the icons 4 and 3 satisfy the overlapping condition, the processor 120 may delete the icon included in the third activity area 2613, as shown in FIG. 26C. The icon 3 is deleted based on the priority, while the icon 4 is retained. Since no activity area remains to be added to the set after performing the above process, the processor 120 may terminate the icon overlapping process. In FIG. 26C, which has completed the icon overlapping process, only icons for the first activity area 2611 and the fourth activity area 2614 may be maintained, and icons for the second activity area 2612 and the third activity area 2613 may be deleted.
  • the processor 120 may perform an icon overlapping process using a greedy set cover and a threshold, so as to provide a user with an activity area in which icons are not overlapped.
  • An operation method for an electronic device may include operations of: acquiring health related data; correcting the acquired data per unit time; extracting activity information by analyzing the corrected data; and displaying a user interface including the activity information in response to a user request.
  • the operation of correcting the acquired data may include an operation of correcting the acquired data based on a time unit for storing data of the electronic device.
  • the operation of extracting the activity information may include an operation of: analyzing data acquired from an external device and data acquired by using the sensor module of the electronic device, so as to classify the data depending on an activity type; and integrating the data depending on the activity type.
  • the data integration operation may include an operation of integrating data based on a priority of at least one of a time, an activity type, a workout type, and a device.
  • the operation method may further include operations of: assigning different weights to at least one of the activity type, the workout type, and the device; and correcting the integrated data based on the weights.
  • the operation method may further include operations of: classifying the number of steps for each device per unit time; determining the maximum number of steps per unit time; and calculating the integrated number of steps based on the determined maximum number of steps.
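  • For illustration only, a minimal sketch of the step integration operations above: for each unit time slot, the maximum step count reported by any device is taken, and the per-slot maxima are summed. The slot granularity and variable names are assumptions.

    def integrate_steps(*per_device_counts):
        """per_device_counts: dicts mapping a unit-time slot to a step count for one device."""
        slots = set().union(*(d.keys() for d in per_device_counts))
        per_slot_max = {t: max(d.get(t, 0) for d in per_device_counts) for t in slots}
        return sum(per_slot_max.values()), per_slot_max

    phone_steps = {"09:00": 60, "09:01": 0, "09:02": 55}
    watch_steps = {"09:00": 58, "09:01": 62, "09:02": 0}
    total, per_slot = integrate_steps(phone_steps, watch_steps)
    print(total)      # 177 (= 60 + 62 + 55), rather than double-counting both devices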
  • the operation method may further include operations of: determining a workout type based on the result of data analysis; determining the start and end of the workout type; and integrating activity information for each workout type based on the priority.
  • the operation method may further include operations of: arranging inactive intervals in time order; and integrating inactive intervals that do not include the activity information into one session, so as to process the integrated session as non-activity information.
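  • For illustration only, a minimal sketch of the inactive-interval integration above: time-ordered inactive intervals are merged into one session whenever no activity interval lies between them. The interval representation and names are assumptions.

    def merge_inactive(inactive, activity):
        """inactive, activity: lists of (start, end) tuples; returns inactive intervals
        merged into sessions when no activity lies between consecutive intervals."""
        merged = []
        for start, end in sorted(inactive):
            if merged:
                prev_start, prev_end = merged[-1]
                gap_has_activity = any(a_start < start and a_end > prev_end
                                       for a_start, a_end in activity)
                if not gap_has_activity:
                    merged[-1] = (prev_start, max(prev_end, end))   # extend the session
                    continue
            merged.append((start, end))
        return merged

    print(merge_inactive([(0, 30), (40, 70)], []))            # [(0, 70)]  -> one session
    print(merge_inactive([(0, 30), (40, 70)], [(32, 38)]))    # two separate inactive intervals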
  • the displaying operation may include operations of: extracting location information for the activity information; calculating an activity area for the activity information based on the extracted location information; calculating a distance between two adjacent activity areas; correcting an icon in an activity area based on the calculated distance; and displaying a user interface including the corrected icon.
  • the operation of correcting the icon may include operations of: determining that the icon overlap condition is satisfied when the calculated distance is less than a reference distance; and deleting at least one of the icons included in the two adjacent activity areas.
  • a computer-readable recording media can include a hard disk, a floppy disk, a magnetic media (e.g., a magnetic tape), an optical media (e.g., a Compact Disc - Read Only Memory (CD-ROM) and/or Digital Versatile Disk (DVD)), a Magneto-Optical Media (e.g., a floptical disk), an internal memory, etc.
  • An instruction can include a code made by a compiler or a code executable by an interpreter.
  • a module or a program module according to various exemplary embodiments can further include at least one or more of the aforementioned constituent elements, or omit some, or further include another constituent element.
  • Operations carried out by a module, a program module or another constituent element according to various exemplary embodiments can be executed in a sequential, parallel, repeated or heuristic method, or at least some operations can be executed in different order or can be omitted, or another operation can be added.

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Public Health (AREA)
  • General Health & Medical Sciences (AREA)
  • Primary Health Care (AREA)
  • Biomedical Technology (AREA)
  • Epidemiology (AREA)
  • Signal Processing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Physical Education & Sports Medicine (AREA)
  • Databases & Information Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Business, Economics & Management (AREA)
  • Human Computer Interaction (AREA)
  • General Business, Economics & Management (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Pathology (AREA)
  • Biophysics (AREA)
  • Data Mining & Analysis (AREA)
  • Software Systems (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • User Interface Of Digital Computer (AREA)
  • Tourism & Hospitality (AREA)
  • Human Resources & Organizations (AREA)
  • Marketing (AREA)
  • Child & Adolescent Psychology (AREA)
  • Strategic Management (AREA)
  • Economics (AREA)
  • Multi-Process Working Machines And Systems (AREA)
  • Arrangements For Transmission Of Measured Signals (AREA)

Abstract

An electronic device includes a housing, with a display exposed through a part of the housing. The housing includes a first motion sensor to detect movement of the housing, a wireless communication circuit, a processor, and a memory that stores instructions to be executed by the processor. The instructions include generating a wireless communication channel with an external electronic device including a second motion sensor; monitoring the movement of the housing to generate first data for a first time period; receiving second data for the first time period through the wireless communication channel; calculating, as a value for the first time period, a value smaller than the sum of a first value based on the first data and a second value based on the second data; and displaying the calculated value through a user interface displayed on the display.

Description

METHOD FOR INTEGRATING AND PROVIDING COLLECTED DATA FROM MULTIPLE DEVICES AND ELECTRONIC DEVICE FOR IMPLEMENTING SAME
Various embodiments relate to a method for integrating and providing collected data from multiple devices, and an electronic device for implementing the same.
Recently, with the development of a digital technology, various types of electronic devices such as a mobile communication terminal, Personal Digital Assistant (PDA), an electronic scheduler, a smart phone, a tablet Personal Computer (PC), a wearable device, and the like, have been widely used. The electronic device has various functions such as a voice call, message transmission like a Short Message Service (SMS)/Multimedia Message Service (MMS), a video call, electronic organizer, photography, email transmission/reception, broadcast reproduction, Internet, music reproduction, schedule management, Social Networking Service (SNS), messenger, dictionary, game, and the like.
As interests in health increase, wearable devices that measure user activity, or applications that show the measured user activity, are actively being developed. The wearable devices may also widely be used for medical services in the future. Since the conventional user activity display technology usually displays activity measured by a single device (e.g., a wearable device), it cannot provide a seamless user experience when a user uses a plurality of devices.
To address the above-discussed deficiencies, it is a primary object to provide a method and apparatus that can provide a user interface or a user experience, which can analyze health related data collected from one or more electronic devices so as to find a meaningful active interval for the user, and allow the user to intuitively determine data characteristics according to the active interval.
An electronic device according to various embodiments includes: a housing; a display exposed through a part of the housing; a first motion sensor disposed within the housing and configured to detect the movement of the housing; a wireless communication circuit disposed within the housing; a processor disposed within the housing and electrically connected to the display, the first motion sensor, and the wireless communication circuit; and a memory electrically connected to the processor, wherein the memory stores instructions which, when executed by a processor, cause the processor to perform operations including: generating a wireless communication channel with an external electronic device including a second motion sensor, using the wireless communication circuit; monitoring the movement of the housing using the first motion sensor, so as to generate first data for a first time period; receiving second data acquired for the first time period through the wireless communication channel, using the second motion sensor; calculating, as a value for the first time period, a value smaller than the sum of a first value based on the first data and a second value based on the second data; and displaying the calculated value through a user interface displayed on the display.
An electronic device according to various embodiments includes: a housing; a display exposed through a part of the housing; a motion sensor disposed within the housing and configured to detect the movement of the housing; a wireless communication circuit disposed within the housing; a processor disposed within the housing and electrically connected to the display, the motion sensor, and the wireless communication circuit; and a memory electrically connected to the processor, wherein the memory stores instructions which, when executed by a processor, cause the processor to perform operations including: monitoring the movement of the housing using the motion sensor so as to generate first data for a first time period; determining a first attribute of the movement during a first session of the first time period, using a first portion of the first data; determining a second attribute of the movement during a second session of the first time period, using a second portion of the first data; selecting one of the first attribute and the second attribute; and displaying at least one of an image, a text, or a symbol representing the selected attribute through a user interface displayed on the display.
An electronic device according to various embodiments includes a memory, a display, a communication interface, and a processor functionally connected to the memory, the display, or the communication interface, wherein the processor may be configured to acquire, through the communication interface, health related data collected from an external device, correct the acquired data per unit time, analyze the corrected data to extract activity information, store the extracted activity information in the memory, and display a user interface including the activity information on the display in response to a user request.
An operation method for an electronic device according to various embodiments may include: acquiring health related data; correcting the acquired data per unit time; analyzing the corrected data to extract activity information; and displaying a user interface including the activity information in response to a user request.
According to various embodiments, a user interface or user experience can be provided, which can analyze health related data collected from one or more electronic devices so as to find a meaningful active interval for the user, and allow the user to intuitively determine data characteristics according to the active interval.
According to various embodiments, health related data collected from a plurality of electronic devices may be integrated, so as to provide a seamless user experience for the number of steps, activity information, or non-activity information of the user.
Before undertaking the DETAILED DESCRIPTION below, it may be advantageous to set forth definitions of certain words and phrases used throughout this patent document: the terms "include" and "comprise," as well as derivatives thereof, mean inclusion without limitation; the term "or," is inclusive, meaning and/or; the phrases "associated with" and "associated therewith," as well as derivatives thereof, may mean to include, be included within, interconnect with, contain, be contained within, connect to or with, couple to or with, be communicable with, cooperate with, interleave, juxtapose, be proximate to, be bound to or with, have, have a property of, or the like; and the term "controller" means any device, system or part thereof that controls at least one operation, such a device may be implemented in hardware, firmware or software, or some combination of at least two of the same. It should be noted that the functionality associated with any particular controller may be centralized or distributed, whether locally or remotely. Definitions for certain words and phrases are provided throughout this patent document, those of ordinary skill in the art should understand that in many, if not most instances, such definitions apply to prior, as well as future uses of such defined words and phrases.
For a more complete understanding of the present disclosure and its advantages, reference is now made to the following description taken in conjunction with the accompanying drawings, in which like reference numerals represent like parts:
FIG. 1 illustrates an electronic device within a network environment according to various embodiments;
FIG. 2 illustrates a configuration of an electronic device according to various embodiments;
FIG. 3 illustrates a program module according to various embodiments;
FIG. 4 illustrates the configuration of an electronic device and a wearable device according to various embodiments;
FIG. 5 illustrates an example of a health related data flow according to various embodiments;
FIG. 6 illustrates an operation method for an electronic device according to various embodiments;
FIGS. 7A and 7B illustrate an example of data analysis according to various embodiments;
FIG. 8 illustrates a data integration method by an electronic device according to various embodiments;
FIG. 9 illustrates a method for integrating the number of steps by an electronic device according to various embodiments;
FIG. 10 illustrates an example of integrating the number of steps according to various embodiments;
FIG. 11 illustrates a method for integrating activity information by an electronic device according to various embodiments;
FIG. 12 illustrates an example of integrating activity information according to various embodiments;
FIG. 13 illustrates a method for integrating non-activity information by an electronic device according to various embodiments;
FIGS. 14A and 14B illustrate an example of integrating non-activity information according to various embodiments;
FIG. 15 illustrates an example of integrating data of various activity type according to various embodiments;
FIG. 16 illustrates an example of a user interface for activity information according to various embodiments;
FIG. 17 illustrates an example of a user interface for sharing activity information according to various embodiments;
FIG. 18 illustrates an example of a user interface for configuring a recognition priority of a wearable device according to various embodiments;
FIG. 19 illustrates an example of a user interface for configuring location information according to various embodiments;
FIG. 20 illustrates a method for displaying a user interface by an electronic device according to various embodiments;
FIG. 21 illustrates an example of calculating an activity area according to various embodiments;
FIG. 22A to FIG. 24B illustrate an example of correcting an icon in an activity area based on a threshold according to various embodiments; and
FIG. 25A to FIG. 26C illustrate an example of processing overlapped icons according to various embodiments.
FIGURES 1 through 26C, discussed below, and the various embodiments used to describe the principles of the present disclosure in this patent document are by way of illustration only and should not be construed in any way to limit the scope of the disclosure. Those skilled in the art will understand that the principles of the present disclosure may be implemented in any suitably arranged electronic device.
Hereinafter, various embodiments of the present disclosure will be described with reference to the accompanying drawings. However, it should be understood that there is no intent to limit the present disclosure to the particular forms disclosed herein; rather, the present disclosure should be construed to cover various modifications, equivalents, and/or alternatives of embodiments of the present disclosure. In describing the drawings, similar reference numerals may be used to designate similar constituent elements. As used herein, the expression "have", "may have", "include", or "may include" refers to the existence of a corresponding feature (e.g., numeral, function, operation, or constituent element such as component), and does not exclude one or more additional features. In the present disclosure, the expression "A or B", "at least one of A or/and B", or "one or more of A or/and B" may include all possible combinations of the items listed. For example, the expression "A or B", "at least one of A and B", or "at least one of A or B" refers to all of (1) including at least one A, (2) including at least one B, or (3) including all of at least one A and at least one B. The expression "a first", "a second", "the first", or "the second" used in various embodiments of the present disclosure may modify various components regardless of the order and/or the importance but does not limit the corresponding components. For example, a first user device and a second user device indicate different user devices although both of them are user devices. For example, a first element may be termed a second element, and similarly, a second element may be termed a first element without departing from the scope of the present disclosure.
It should be understood that when an element (e.g., first element) is referred to as being (operatively or communicatively) "connected," or "coupled," to another element (e.g., second element), it may be directly connected or coupled directly to the other element or any other element (e.g., third element) may be interposed between them. In contrast, it may be understood that when an element (e.g., first element) is referred to as being "directly connected," or "directly coupled" to another element (second element), there is no element (e.g., third element) interposed between them.
The expression "configured to" used in the present disclosure may be exchanged with, for example, "suitable for", "having the capacity to", "designed to", "adapted to", "made to", or "capable of" according to the situation. The term "configured to" may not necessarily imply "specifically designed to" in hardware. Alternatively, in some situations, the expression "device configured to" may mean that the device, together with other devices or components, "is able to". For example, the phrase "processor adapted (or configured) to perform A, B, and C" may mean a dedicated processor (e.g. embedded processor) only for performing the corresponding operations or a generic-purpose processor (e.g., central processing unit (CPU) or application processor (AP)) that can perform the corresponding operations by executing one or more software programs stored in a memory device.
The terms used in the present disclosure are only used to describe specific embodiments, and are not intended to limit the present disclosure. As used herein, singular forms may include plural forms as well unless the context clearly indicates otherwise. Unless defined otherwise, all terms used herein, including technical and scientific terms, have the same meaning as those commonly understood by a person skilled in the art to which the present disclosure pertains. Such terms as those defined in a generally used dictionary may be interpreted to have the meanings equal to the contextual meanings in the relevant field of art, and are not to be interpreted to have ideal or excessively formal meanings unless clearly defined in the present disclosure. In some cases, even the term defined in the present disclosure should not be interpreted to exclude embodiments of the present disclosure.
An electronic device according to various embodiments of the present disclosure may include at least one of, for example, a smart phone, a tablet Personal Computer (PC), a mobile phone, a video phone, an electronic book reader (e-book reader), a desktop PC, a laptop PC, a netbook computer, a workstation, a server, a Personal Digital Assistant (PDA), a Portable Multimedia Player (PMP), a MPEG-1 audio layer-3 (MP3) player, a mobile medical device, a camera, and a wearable device. According to various embodiments, the wearable device may include at least one of an accessory type (e.g., a watch, a ring, a bracelet, an anklet, a necklace, a glasses, a contact lens, or a Head-Mounted Device (HMD)), a fabric or clothing integrated type (e.g., an electronic clothing), a body-mounted type (e.g., a skin pad, or tattoo), and a bio-implantable type (e.g., an implantable circuit). According to some embodiments, the electronic device may be a home appliance. The home appliance may include at least one of, for example, a television, a Digital Video Disk (DVD) player, an audio, a refrigerator, an air conditioner, a vacuum cleaner, an oven, a microwave oven, a washing machine, an air cleaner, a set-top box, a home automation control panel, a security control panel, a TV box (e.g., SAMSUNG HOMESYNCTM, APPLE TV®, or GOOGLE TV®), a game console (e.g., XBOX® and PLAYSTATION®), an electronic dictionary, an electronic key, a camcorder, and an electronic photo frame.
According to another embodiment, the electronic device may include at least one of various medical devices (e.g., various portable medical measuring devices (a blood glucose monitoring device, a heart rate monitoring device, a blood pressure measuring device, a body temperature measuring device, etc.), a Magnetic Resonance Angiography (MRA), a Magnetic Resonance Imaging (MRI), a Computed Tomography (CT) machine, and an ultrasonic machine), a navigation device, a Global Positioning System (GPS) receiver, an Event Data Recorder (EDR), a Flight Data Recorder (FDR), a vehicle infotainment device, electronic devices for a ship (e.g., a navigation device for a ship, and a gyro-compass), avionics, security devices, an automotive head unit, a robot for home or industry, an automatic teller's machine (ATM) in banks, a point of sales (POS) terminal in a shop, or an Internet of Things device (e.g., a light bulb, various sensors, an electric or gas meter, a sprinkler device, a fire alarm, a thermostat, a streetlamp, a toaster, sporting goods, a hot water tank, a heater, a boiler, etc.).
According to some embodiments, the electronic device may include at least one of a part of furniture or a building/structure, an electronic board, an electronic signature receiving device, a projector, and various kinds of measuring instruments (e.g., a water meter, an electric meter, a gas meter, and a radio wave meter). The electronic device according to various embodiments of the present disclosure may be a combination of one or more of the aforementioned various devices. The electronic device according to some embodiments of the present disclosure may be a flexible device. Further, the electronic device according to an embodiment of the present disclosure is not limited to the aforementioned devices, and may include a new electronic device according to the development of technology. Hereinafter, an electronic device according to various embodiments will be described with reference to the accompanying drawings. As used herein, the term "user" may indicate a person who uses an electronic device or a device (e.g., an artificial intelligence electronic device) that uses an electronic device.
FIG. 1 illustrates a network environment including an electronic device according to various embodiments of the present disclosure.
An electronic device 101 within a network environment 100, according to various embodiments, will be described with reference to FIG. 1. The electronic device 101 may include a bus 110, a processor 120, a memory 130, an input/output interface 150, a display 160, and a communication interface 170. According to an embodiment of the present disclosure, the electronic device 101 may omit at least one of the above components or may further include other components.
The bus 110 may include, for example, a circuit which interconnects the components 110 to 170 and delivers a communication (e.g., a control message and/or data) between the components 110 to 170.
The processor 120 may include one or more of a Central Processing Unit (CPU), an Application Processor (AP), and a Communication Processor (CP). The processor 120 may carry out, for example, calculation or data processing relating to control and/or communication of at least one other component of the electronic device 101.
The memory 130 may include a volatile memory and/or a non-volatile memory. The memory 130 may store, for example, commands or data relevant to at least one other component of the electronic device 101. According to an embodiment of the present disclosure, the memory 130 may store software and/or a program 140. The program 140 may include, for example, a kernel 141, middleware 143, an Application Programming Interface (API) 145, and/or application programs (or "applications") 147. At least some of the kernel 141, the middleware 143, and the API 145 may be referred to as an Operating System (OS).
The kernel 141 may control or manage system resources (e.g., the bus 110, the processor 120, or the memory 130) used for performing an operation or function implemented in the other programs (e.g., the middleware 143, the API 145, or the application programs 147). Furthermore, the kernel 141 may provide an interface through which the middleware 143, the API 145, or the application programs 147 may access the individual components of the electronic device 101 to control or manage the system resources.
The middleware 143, for example, may serve as an intermediary for allowing the API 145 or the application programs 147 to communicate with the kernel 141 to exchange data. Also, the middleware 143 may process one or more task requests received from the application programs 147 according to priorities thereof. For example, the middleware 143 may assign priorities for using the system resources (e.g., the bus 110, the processor 120, the memory 130, or the like) of the electronic device 101, to at least one of the application programs 147. For example, the middleware 143 may perform scheduling or load balancing on the one or more task requests by processing the one or more task requests according to the priorities assigned thereto.
The API 145 is an interface through which the applications 147 control functions provided from the kernel 141 or the middleware 143, and may include, for example, at least one interface or function (e.g., instruction) for file control, window control, image processing, character control, and the like.
The input/output interface 150, for example, may function as an interface that may transfer commands or data input from a user or another external device to the other element(s) of the electronic device 101. Furthermore, the input/output interface 150 may output the commands or data received from the other element(s) of the electronic device 101 to the user or another external device.
Examples of the display 160 may include a Liquid Crystal Display (LCD), a Light-Emitting Diode (LED) display, an Organic Light-Emitting Diode (OLED) display, a MicroElectroMechanical Systems (MEMS) display, and an electronic paper display. The display 160 may display, for example, various types of contents (e.g., text, images, videos, icons, or symbols) to users. The display 160 may include a touch screen, and may receive, for example, a touch, gesture, proximity, or hovering input using an electronic pen or a user's body part.
The communication interface 170 may establish communication, for example, between the electronic device 101 and an external device (e.g., a first external electronic device 102, a second external electronic device 104, or a server 106). For example, the communication interface 170 may be connected to a network 162 through wireless or wired communication, and may communicate with an external device (e.g., the second external electronic device 104 or the server 106). The wireless communication may use at least one of, for example, Long Term Evolution (LTE), LTE-Advanced (LTE-A), Code Division Multiple Access (CDMA), Wideband CDMA (WCDMA), Universal Mobile Telecommunications System (UMTS), Wireless Broadband (WiBro), and Global System for Mobile Communications (GSM), as a cellular communication protocol. In addition, the wireless communication may include, for example, short range communication 164.
The short-range communication 164 may include at least one of, for example, Wi-Fi, Bluetooth, Near Field Communication (NFC), and Global Navigation Satellite System (GNSS). GNSS may include, for example, at least one of the Global Positioning System (GPS), the Global Navigation Satellite System (GLONASS), the BeiDou Navigation Satellite System (BEIDOU), or Galileo, the European global satellite-based navigation system, based on a location, a bandwidth, or the like. Hereinafter, in the present disclosure, the "GPS" may be interchangeably used with the "GNSS". The wired communication may include, for example, at least one of a Universal Serial Bus (USB), a High Definition Multimedia Interface (HDMI), Recommended Standard 232 (RS-232), and a Plain Old Telephone Service (POTS). The network 162 may include at least one of a telecommunication network such as a computer network (e.g., a LAN or a WAN), the Internet, and a telephone network.
Each of the first and second external electronic devices 102 and 104 may be of a type identical to or different from that of the electronic device 101. According to an embodiment of the present disclosure, the server 106 may include a group of one or more servers. According to various embodiments of the present disclosure, all or some of the operations performed in the electronic device 101 may be executed in another electronic device or a plurality of electronic devices (e.g., the electronic devices 102 and 104 or the server 106). According to an embodiment of the present disclosure, when the electronic device 101 has to perform some functions or services automatically or in response to a request, the electronic device 101 may request another device (e.g., the electronic device 102 or 104 or the server 106) to execute at least some functions relating thereto instead of or in addition to autonomously performing the functions or services. Another electronic device (e.g., the electronic device 102 or 104, or the server 106) may execute the requested functions or the additional functions, and may deliver a result of the execution to the electronic device 101. The electronic device 101 may process the received result as it is or additionally, and may provide the requested functions or services. To this end, for example, cloud computing, distributed computing, or client-server computing technologies may be used.
FIG. 2 illustrates an electronic device according to various embodiments of the present disclosure.
The electronic device 201 may include, for example, all or a part of the electronic device 101 shown in FIG. 1. The electronic device 201 may include one or more processors 210 (e.g., Application Processors (AP)), a communication module 220, a memory 230, a sensor module 240, an input device 250, a display 260, an interface 270, an audio module 280, a camera module 291, a power management module 295, a battery 296, an indicator 297, and a motor 298.
The processor 210 may control a plurality of hardware or software components connected to the processor 210 by driving an operating system or an application program, and perform processing of various pieces of data and calculations. The processor 210 may be embodied as, for example, a System on Chip (SoC). According to an embodiment of the present disclosure, the processor 210 may further include a Graphic Processing Unit (GPU) and/or an image signal processor. The processor 210 may include at least some (for example, a cellular module 221) of the components illustrated in FIG. 2. The processor 210 may load, into a volatile memory, commands or data received from at least one (e.g., a non-volatile memory) of the other components and may process the loaded commands or data, and may store various data in a non-volatile memory.
The communication module 220 may have a configuration equal or similar to that of the communication interface 170 of FIG. 1. The communication module 220 may include, for example, a cellular module 221, a Wi-Fi module 223, a BT module 225, a GNSS module 227 (e.g., a GPS module 227, a Glonass module, a Beidou module, or a Galileo module), an NFC module 228, and a Radio Frequency (RF) module 229. The cellular module 221, for example, may provide a voice call, a video call, a text message service, or an Internet service through a communication network. According to an embodiment of the present disclosure, the cellular module 221 may distinguish and authenticate the electronic device 201 in a communication network using a subscriber identification module 224 (e.g., a SIM card). According to an embodiment of the present disclosure, the cellular module 221 may perform at least some of the functions that the AP 210 may provide. According to an embodiment of the present disclosure, the cellular module 221 may include a communication processor (CP).
For example, each of the Wi-Fi module 223, the BT module 225, the GNSS module 227, and the NFC module 228 may include a processor for processing data transmitted/received through a corresponding module. According to an embodiment of the present disclosure, at least some (e.g., two or more) of the cellular module 221, the Wi-Fi module 223, the BT module 225, the GNSS module 227, and the NFC module 228 may be included in one Integrated Chip (IC) or IC package. The RF module 229, for example, may transmit/receive a communication signal (e.g., an RF signal). The RF module 229 may include, for example, a transceiver, a Power Amplifier Module (PAM), a frequency filter, a Low Noise Amplifier (LNA), and an antenna. According to another embodiment of the present disclosure, at least one of the cellular module 221, the Wi-Fi module 223, the BT module 225, the GNSS module 227, and the NFC module 228 may transmit/receive an RF signal through a separate RF module. The subscriber identification module 224 may include, for example, a card including a subscriber identity module and/or an embedded SIM, and may contain unique identification information (e.g., an Integrated Circuit Card Identifier (ICCID)) or subscriber information (e.g., an International Mobile Subscriber Identity (IMSI)).
The memory 230 (e.g., the memory 130) may include, for example, an embedded memory 232 or an external memory 234. The embedded memory 232 may include at least one of a volatile memory (e.g., a Dynamic Random Access Memory (DRAM), a Static RAM (SRAM), a Synchronous Dynamic RAM (SDRAM), and the like) and a non-volatile memory (e.g., a One Time Programmable Read Only Memory (OTPROM), a Programmable ROM (PROM), an Erasable and Programmable ROM (EPROM), an Electrically Erasable and Programmable ROM (EEPROM), a mask ROM, a flash ROM, a flash memory (e.g., a NAND flash memory or a NOR flash memory), a hard disc drive, a Solid State Drive (SSD), and the like). The external memory 234 may further include a flash drive, for example, a Compact Flash (CF), a Secure Digital (SD), a Micro Secure Digital (Micro-SD), a Mini Secure Digital (Mini-SD), an eXtreme Digital (xD), a MultiMediaCard (MMC), a memory stick, or the like. The external memory 234 may be functionally and/or physically connected to the electronic device 201 through various interfaces.
The sensor module 240, for example, may measure a physical quantity or detect an operation state of the electronic device 201, and may convert the measured or detected information into an electrical signal. The sensor module 240 may include, for example, at least one of a gesture sensor 240A, a gyro sensor 240B, an atmospheric pressure sensor (barometer) 240C, a magnetic sensor 240D, an acceleration sensor 240E, a grip sensor 240F, a proximity sensor 240G, a color sensor 240H (e.g., a red, green, and blue (RGB) sensor), a biometric sensor (medical sensor) 240I, a temperature/humidity sensor 240J, an illuminance sensor 240K, and an Ultraviolet (UV) sensor 240M. Additionally or alternatively, the sensor module 240 may include, for example, an E-nose sensor, an electromyography (EMG) sensor, an electroencephalogram (EEG) sensor, an electrocardiogram (ECG) sensor, an Infrared (IR) sensor, an iris scan sensor, and/or a finger scan sensor. The sensor module 240 may further include a control circuit for controlling one or more sensors included therein. According to an embodiment of the present disclosure, the electronic device 201 may further include a processor configured to control the sensor module 240, as a part of the processor 210 or separately from the processor 210, and may control the sensor module 240 while the processor 210 is in a sleep state.
The input device 250 may include, for example, a touch panel 252, a (digital) pen sensor 254, a key 256, or an ultrasonic input device 258. The touch panel 252 may use, for example, at least one of a capacitive type, a resistive type, an infrared type, and an ultrasonic type. The touch panel 252 may further include a control circuit. The touch panel 252 may further include a tactile layer, and provide a tactile reaction to the user. The (digital) pen sensor 254 may include, for example, a recognition sheet which is a part of the touch panel or is separated from the touch panel. The key 256 may include, for example, a physical button, an optical key or a keypad. The ultrasonic input device 258 may detect, through a microphone (e.g., the microphone 288), ultrasonic waves generated by an input tool, and identify data corresponding to the detected ultrasonic waves.
The display 260 (e.g., the display 160) may include a panel 262, a hologram device 264, or a projector 266. The panel 262 may include a configuration identical or similar to the display 160 illustrated in FIG. 1. The panel 262 may be implemented to be, for example, flexible, transparent, or wearable. The panel 262 may be embodied as a single module with the touch panel 252. The hologram device 264 may show a three dimensional (3D) image in the air by using an interference of light. The projector 266 may project light onto a screen to display an image. The screen may be located, for example, in the interior of or on the exterior of the electronic device 201. According to an embodiment of the present disclosure, the display 260 may further include a control circuit for controlling the panel 262, the hologram device 264, or the projector 266.
The interface 270 may include, for example, a High-Definition Multimedia Interface (HDMI) 272, a Universal Serial Bus (USB) 274, an optical interface 276, or a D-subminiature (D-sub) 278. The interface 270 may be included in, for example, the communication interface 170 illustrated in FIG. 1. Additionally or alternatively, the interface 270 may include, for example, a Mobile High-definition Link (MHL) interface, a Secure Digital (SD) card/Multi-Media Card (MMC) interface, or an Infrared Data Association (IrDA) standard interface.
The audio module 280, for example, may bilaterally convert a sound and an electrical signal. At least some components of the audio module 280 may be included in, for example, the input/output interface 150 illustrated in FIG. 1. The audio module 280 may process voice information input or output through, for example, a speaker 282, a receiver 284, earphones 286, or the microphone 288. The camera module 291 is, for example, a device which may photograph a still image and a video. According to an embodiment of the present disclosure, the camera module 291 may include one or more image sensors (e.g., a front sensor or a back sensor), a lens, an Image Signal Processor (ISP) or a flash (e.g., LED or xenon lamp).
The power management module 295 may manage, for example, power of the electronic device 201. According to an embodiment of the present disclosure, the power management module 295 may include a Power Management Integrated Circuit (PMIC), a charger Integrated Circuit (IC), or a battery or fuel gauge. The PMIC may use a wired and/or wireless charging method. Examples of the wireless charging method may include, for example, a magnetic resonance method, a magnetic induction method, an electromagnetic wave method, and the like. Additional circuits (e.g., a coil loop, a resonance circuit, a rectifier, etc.) for wireless charging may be further included. The battery gauge may measure, for example, a residual quantity of the battery 296, and a voltage, a current, or a temperature while charging. The battery 296 may include, for example, a rechargeable battery and/or a solar battery.
The indicator 297 may display a particular state (e.g., a booting state, a message state, a charging state, or the like) of the electronic device 201 or a part (e.g., the processor 210) of the electronic device 201. The motor 298 may convert an electrical signal into a mechanical vibration, and may generate a vibration, a haptic effect, or the like. Although not illustrated, the electronic device 201 may include a processing device (e.g., a GPU) for supporting a mobile TV. The processing device for supporting a mobile TV may process, for example, media data according to a certain standard such as Digital Multimedia Broadcasting (DMB), Digital Video Broadcasting (DVB), or MEDIAFLO™.
Each of the above-described component elements of hardware according to the present disclosure may be configured with one or more components, and the names of the corresponding component elements may vary based on the type of electronic device. In various embodiments, the electronic device may include at least one of the above-described elements. Some of the above-described elements may be omitted from the electronic device, or the electronic device may further include additional elements. Also, some of the hardware components according to various embodiments may be combined into one entity, which may perform functions identical to those of the relevant components before the combination.
FIG. 3 illustrates a program module according to various embodiments of the present disclosure.
According to an embodiment of the present disclosure, the program module 310 (e.g., the program 140) may include an Operating System (OS) for controlling resources related to the electronic device (e.g., the electronic device 101) and/or various applications (e.g., the application programs 147) executed in the operating system. The operating system may be, for example, ANDROID®, iOS®, WINDOWS®, SYMBIAN®, TIZEN®, SAMSUNG BADA®, or the like. The program module 310 may include a kernel 320, middleware 330, an API 360, and/or applications 370. At least some of the program module 310 may be preloaded on an electronic device, or may be downloaded from an external electronic device (e.g., the electronic device 102 or 104, or the server 106).
The kernel 320 (e.g., the kernel 141) may include, for example, a system resource manager 321 and/or a device driver 323. The system resource manager 321 may control, allocate, or collect system resources. According to an embodiment of the present disclosure, the system resource manager 321 may include a process management unit, a memory management unit, a file system management unit, and the like. The device driver 323 may include, for example, a display driver, a camera driver, a Bluetooth driver, a shared memory driver, a USB driver, a keypad driver, a Wi-Fi driver, an audio driver, or an Inter-Process Communication (IPC) driver.
For example, the middleware 330 may provide a function required in common by the applications 370, or may provide various functions to the applications 370 through the API 360 so as to enable the applications 370 to efficiently use the limited system resources in the electronic device. According to an embodiment of the present disclosure, the middleware 330 (e.g., the middleware 143) may include at least one of a run time library 335, an application manager 341, a window manager 342, a multimedia manager 343, a resource manager 344, a power manager 345, a database manager 346, a package manager 347, a connectivity manager 348, a notification manager 349, a location manager 350, a graphic manager 351, and a security manager 352.
The runtime library 335 may include a library module that a compiler uses in order to add a new function through a programming language while an application 370 is being executed. The runtime library 335 may perform input/output management, memory management, the functionality for an arithmetic function, or the like.
The application manager 341 may manage, for example, a life cycle of at least one of the applications 370. The window manager 342 may manage Graphical User Interface (GUI) resources used by a screen. The multimedia manager 343 may recognize a format required for reproduction of various media files, and may perform encoding or decoding of a media file by using a codec suitable for the corresponding format. The resource manager 344 may manage resources of a source code, a memory, and a storage space of at least one of the applications 370.
The power manager 345 may operate together with, for example, a Basic Input/Output System (BIOS) or the like to manage a battery or power source and may provide power information or the like required for the operations of the electronic device. The database manager 346 may generate, search for, and/or change a database to be used by at least one of the applications 370. The package manager 347 may manage installation or an update of an application distributed in a form of a package file.
For example, the connectivity manager 348 may manage wireless connectivity such as WI-FI® or BLUETOOTH®. The notification manager 349 may display or notify of an event such as an arrival message, an appointment, a proximity notification, and the like in such a way that does not disturb a user. The location manager 350 may manage location information of an electronic device. The graphic manager 351 may manage a graphic effect which will be provided to a user, or a user interface related to the graphic effect. The security manager 352 may provide all security functions required for system security, user authentication, or the like. According to an embodiment of the present disclosure, when the electronic device (e.g., the electronic device 101) has a telephone call function, the middleware 330 may further include a telephony manager for managing a voice call function or a video call function of the electronic device.
The middleware 330 may include a middleware module that forms a combination of various functions of the above-described components. The middleware 330 may provide a module specialized for each type of OS in order to provide a differentiated function. Further, the middleware 330 may dynamically remove some of the existing components or add new components.
The API 360 (e.g., the API 145) is, for example, a set of API programming functions, and may be provided with a different configuration according to an OS. For example, in the case of Android or iOS, one API set may be provided for each platform. In the case of Tizen, two or more API sets may be provided for each platform.
The applications 370 (e.g., the application programs 147) may include, for example, one or more applications which may provide functions such as a home 371, a dialer 372, an SMS/MMS 373, an Instant Message (IM) 374, a browser 375, a camera 376, an alarm 377, contacts 378, a voice dial 379, an email 380, a calendar 381, a media player 382, an album 383, a clock 384, health care (e.g., measuring exercise quantity or blood sugar), or environment information (e.g., providing atmospheric pressure, humidity, or temperature information).
According to an embodiment of the present disclosure, the applications 370 may include an application (hereinafter, referred to as an "information exchange application" for convenience of description) that supports exchanging information between the electronic device (e.g., the electronic device 101) and an external electronic device (e.g., the electronic device 102 or 104). The information exchange application may include, for example, a notification relay application for transferring specific information to an external electronic device or a device management application for managing an external electronic device.
For example, the notification relay application may include a function of transferring, to the external electronic device (e.g., the electronic device 102 or 104), notification information generated from other applications of the electronic device 101 (e.g., an SMS/MMS application, an e-mail application, a health management application, or an environmental information application). Further, the notification relay application may receive notification information from, for example, an external electronic device and provide the received notification information to a user.
The device management application may manage (e.g., install, delete, or update), for example, at least one function of an external electronic device (e.g., the electronic device 102 or 104) communicating with the electronic device (e.g., a function of turning on/off the external electronic device itself (or some components) or a function of adjusting the brightness (or a resolution) of the display), applications operating in the external electronic device, and services provided by the external electronic device (e.g., a call service or a message service).
According to an embodiment of the present disclosure, the applications 370 may include applications (e.g., a health care application of a mobile medical appliance or the like) designated according to an external electronic device (e.g., attributes of the electronic device 102 or 104). According to an embodiment of the present disclosure, the applications 370 may include an application received from an external electronic device (e.g., the server 106, or the electronic device 102 or 104). According to an embodiment of the present disclosure, the applications 370 may include a preloaded application or a third party application that may be downloaded from a server. The names of the components of the program module 310 of the illustrated embodiment of the present disclosure may change according to the type of operating system.
According to various embodiments, at least a part of the programming module 310 may be implemented in software, firmware, hardware, or a combination of two or more thereof. At least some of the program module 310 may be implemented (e.g., executed) by, for example, the processor (e.g., the processor 210). At least some of the program module 310 may include, for example, a module, a program, a routine, a set of instructions, and/or a process for performing one or more functions.
The term "module" as used herein may, for example, mean a unit including one of hardware, software, and firmware or a combination of two or more of them. The "module" may be interchangeably used with, for example, the term "unit", "logic", "logical block", "component", or "circuit". The "module" may be a minimum unit of an integrated component element or a part thereof. The "module" may be a minimum unit for performing one or more functions or a part thereof. The "module" may be mechanically or electronically implemented. For example, the "module" according to the present disclosure may include at least one of an Application-Specific Integrated Circuit (ASIC) chip, a Field-Programmable Gate Arrays (FPGA), and a programmable-logic device for performing operations which has been known or are to be developed hereinafter. According to various embodiments, at least some of the devices (for example, modules or functions thereof) or the method (for example, operations) according to the present disclosure may be implemented by a command stored in a computer-readable storage medium in a programming module form. The instruction, when executed by a processor (e.g., the processor 120), may cause the one or more processors to execute the function corresponding to the instruction. The computer-readable recoding media may be, for example, the memory 130.
FIG. 4 illustrates the configuration of an electronic device and a wearable device according to various embodiments.
Referring to FIG. 4, an electronic device 410 may include a web API manager 411, a wearable manager 412, a data analysis unit 413, a data integration unit 414, and a health DB 415.
The web API manager 411 may receive health related data, which is collected from a wearable device 3 440 or a wearable device 4 450, using a protocol released in the form of a web application programming interface (API). The wearable manager 412 may receive the health related data, which is collected from a wearable device 1 420 or a wearable device 2 430, using the communication protocol defined for the wearable device 1 420 or the wearable device 2 430. Here, the received data may be health related data (e.g., number of steps, cycling, swimming, sleep, etc.) measured or collected by the wearable device 1 420 to the wearable device 4 450.
The received data may be stored in a health database (DB) 415. The received data is measured or collected by one or more wearable devices; the data may be provided individually for each device, or may be provided as one piece of integrated data. The health DB 415 may store data received for each device, or may integrate data received from a plurality of devices and store the integrated data. The data stored in the health DB 415 may be synchronized with a cloud 460.
The data analysis unit 413 may analyze the received data. For example, the data analysis unit 413 may analyze the received data and classify the same depending on activity types. Here, the activity type may include a first activity type for the number of steps, a second activity type for an activity (e.g., workout), and a third activity type for non-activity. The non-activity may mean a stationary state such as sleeping, sitting, and the like without walking or activity (e.g., cycling, swimming, etc.). Alternatively, the non-activity may mean that there is no detection.
For example, the wearable device 1 420 may collect data for the number of steps and activities, the wearable device 2 430 may collect data for the number of steps, the wearable device 3 440 may collect data for the number of steps and activities, and the wearable device 4 450 may collect data on the number of steps. The data analysis unit 413 may classify the data depending on the activity types in order to integrate the data received from the wearable device 1 420 to the wearable device 4 450 into one piece of data.
The data integration unit 414 may integrate the data classified depending on activity types. For example, the data integration unit 414 may integrate respective data for the number of steps, an activity, and a non-activity. The data integration unit 414 may store the integrated data in the health DB 415.
According to various embodiments, the wearable device 1 420 may include a step number counting unit 421 for measuring the number of steps, and a workout measurement unit 422 for measuring activity. The wearable device 2 430 may include a step number counting unit 431 for measuring the number of steps. According to various embodiments, the wearable device 1 420 to the wearable device 4 450 may have different unit times for storing the measured data. For example, the wearable device 1 420 and the wearable device 3 440 may store data in a unit of 5 minutes, and the wearable device 2 430 and the wearable device 4 450 may store data in a unit of 10 minutes.
FIG. 5 illustrates an example of a health related data flow according to various embodiments.
Referring to FIG. 5, data 540, which is collected from an electronic device 510, a wearable device 520, and an application 530, respectively, may be stored in the health DB 415. The data analysis unit 413 may perform an analysis of data (indicated by reference numeral 550) stored in the health DB 415. The data integration unit 414 may perform integration of data based on the result of analysis. The data integration unit 414 may apply an algorithm 560 to the integrated data to correct the data. The corrected data 570 is for the number of steps, activity, and non-activity, and may be provided through a user interface.
The electronic device described below may be at least one of the electronic device 101 of FIG. 1, the electronic device 201 of FIG. 2, or the electronic device 410 of FIG. 4. In the following, for the convenience of explanation, an electronic device is described as the electronic device 101 of FIG. 1, but the electronic device is not limited to the description thereof.
An electronic device according to various embodiments includes: a housing; a display exposed through a part of the housing; a first motion sensor disposed within the housing and configured to detect the movement of the housing; a wireless communication circuit disposed within the housing; a processor disposed within the housing and electrically connected to the display, the first motion sensor, and the wireless communication circuit; and a memory electrically connected to the processor, wherein the memory stores instructions which, when executed by a processor, cause the processor to perform operations including: generating a wireless communication channel with an external electronic device including a second motion sensor, using the wireless communication circuit; monitoring the movement of the housing using the first motion sensor so as to generate first data for a first time period; receiving second data acquired for the first time period through the wireless communication channel, using the second motion sensor; calculating, as a value for the first time period, a value smaller than the sum of a first value based on the first data and a second value based on the second data; and displaying the calculated value through a user interface displayed on the display.
According to an embodiment, the instructions cause the processor to display the calculated value after the first time period through the user interface when the electronic device and the external electronic device are being worn or carried by a user.
An electronic device according to various embodiments includes: a housing; a display exposed through a part of the housing; a motion sensor disposed within the housing and configured to detect the movement of the housing; a wireless communication circuit disposed within the housing; a processor disposed within the housing and electrically connected to the display, the motion sensor, and the wireless communication circuit; and a memory electrically connected to the processor, wherein the memory stores instructions which, when executed by a processor, cause the processor to perform operations including: monitoring the movement of the housing using the motion sensor so as to generate first data for a first time period; determining a first attribute of the movement during a first session of the first time period, using a first portion of the first data; determining a second attribute of the movement during a second session of the first time period, using a second portion of the first data; selecting one of the first attribute and the second attribute; and displaying at least one of an image, text, or a symbol representing the selected attribute through a user interface displayed on the display.
The instructions according to an embodiment cause the processor to display, through the user interface, a map associated with locations where the housing has been located for the first time period, and to display at least one of the image, the text, or the symbol on the map in a superposed manner.
The instructions according to an embodiment may cause the processor to receive second data acquired for the first time period from an external electronic device including a motion sensor through the wireless communication circuit; calculate, as a value for the first time period, a value smaller than the sum of a first value based on the first data and a second value based on the second data; and display the calculated value through a user interface displayed on the display.
An electronic device according to various embodiments includes a memory, a display, a communication interface, and a processor functionally connected to the memory, the display, or the communication interface, wherein the processor may be configured to acquire, through the communication interface, health related data collected from an external device, correct the acquired data per unit time, analyze the corrected data to extract activity information, store the extracted activity information in the memory, and display, on the display, a user interface including the activity information in response to a user request.
The processor according to an embodiment may be configured to correct the acquired data based on a time unit for storing data of the electronic device.
The processor according to an embodiment may be configured to analyze data acquired from the external device and data acquired using a sensor module of the electronic device, so as to classify the data depending on activity types, and integrate the data depending on the activity types.
The processor according to an embodiment may be configured to integrate data based on a priority of at least one of time, an activity type, a workout type, and a device.
The processor according to an embodiment may be configured to assign different weights to at least one of the activity type, the workout type, and the device, and correct the integrated data based on the weights.
The processor according to an embodiment may be configured to classify the number of steps for each device per unit time, determine the maximum number of steps per unit time, and calculate the integrated number of steps based on the determined maximum number of steps, so as to integrate data for the number of steps.
The processor according to an embodiment may be configured to determine a workout type based on the result of data analysis, determine the start and end of the workout type, and integrate activity information for each workout type based on the priority.
The processor according to an embodiment may be configured to align inactive intervals by time, and integrate non-active periods that do not include the activity information into one session, so as to process the integrated session as non-activity information.
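A brief sketch of this non-activity integration, provided only as an illustration: inactive intervals are sorted by time, and adjacent or overlapping intervals are merged into a single session. The interval representation and the function name are assumptions of the example, not part of the disclosed method.

```python
def merge_inactive_intervals(intervals):
    """Merge adjacent or overlapping inactive intervals into single sessions.

    intervals: list of (start, end) tuples in minutes, in any order.
    """
    merged = []
    for start, end in sorted(intervals):
        if merged and start <= merged[-1][1]:
            # Overlaps or touches the previous session: extend it.
            merged[-1] = (merged[-1][0], max(merged[-1][1], end))
        else:
            merged.append((start, end))
    return merged

print(merge_inactive_intervals([(120, 180), (0, 60), (50, 90)]))
# [(0, 90), (120, 180)]
```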
The processor according to an embodiment may be configured to extract location information on the activity information, calculate an active area for activity information based on the extracted location information, calculate a distance between two adjacent active areas, and correct an icon in the active area based on the calculated distance.
The processor according to an embodiment may be configured to determine that the icon overlap condition is satisfied when the calculated distance is less than a reference distance, and determine at least one of the icons included in the two adjacent active areas.
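The icon overlap decision may be pictured with the following minimal sketch, which compares the distance between the centers of two active areas against a reference distance. The use of planar Euclidean distance (rather than geodesic distance on map coordinates) and the function name are assumptions of the example.

```python
import math

def icons_overlap(center_a, center_b, reference_distance):
    """Return True if two active-area icons would overlap on the map.

    center_a, center_b: (x, y) coordinates of the active-area centers.
    reference_distance: minimum separation needed to draw both icons.
    """
    return math.dist(center_a, center_b) < reference_distance

# If the condition is satisfied, only one representative icon might be drawn.
print(icons_overlap((0.0, 0.0), (3.0, 4.0), reference_distance=10.0))  # True
```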
FIG. 6 illustrates an operation method for the electronic device according to various embodiments.
FIG. 6 illustrates an operation of extracting activity information using one piece of data. Referring to FIG. 6, in operation 601, the electronic device 101 (e.g., the processor 120) may acquire health related data. The processor 120 may receive health related data from the external device (or an external electronic device) (e.g., one of the electronic device 102, the electronic device 104, and the wearable device 1 420 to the wearable device 4 450) through the communication interface 170. Alternatively, the electronic device 101 may autonomously acquire health related data. The electronic device 101 may acquire health related data using various sensor modules (e.g., the sensor module 240 of FIG. 2). The health related data may be data collected or measured by the external device or the electronic device 101, for example, the number of steps, running, cycling, swimming, sleeping, resting, and the like. The acquired data may be stored in the memory 130. In operation 603, the electronic device 101 (for example, the processor 120) may correct the acquired data. For example, the electronic device 101 may store data in a unit of one minute, the wearable device 1 420 and the wearable device 3 440 may store data in a unit of five minutes, and the wearable device 2 430 and the wearable device 4 450 may store data in a unit of ten minutes. Since the unit time for storing data is different for each device, the processor 120 may divide the received data according to the unit time for data storage of the electronic device 101. For example, the processor 120 may divide the received data by a unit of one minute. When, in operation 601, the electronic device 101 has autonomously acquired the data rather than receiving it from the external device, operation 603 may not be performed. In addition, these unit times are only examples; the unit time for storing data in the electronic device 101 or the external device may be 30 seconds, 3 minutes, 5 minutes, or the like.
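As an illustration of the correction in operation 603, the following sketch divides step counts stored in a coarser unit time into one-minute units. The even-split strategy and the function name are assumptions of the example only; a real implementation could distribute the counts differently.

```python
def split_to_minutes(records, unit_minutes):
    """Divide step counts stored per `unit_minutes` into 1-minute buckets.

    records: list of (start_minute, steps) tuples as stored by a device.
    The even split is an illustrative assumption, not the disclosed method.
    """
    per_minute = {}
    for start, steps in records:
        base, remainder = divmod(steps, unit_minutes)
        for i in range(unit_minutes):
            # Spread the remainder over the first few minutes.
            per_minute[start + i] = base + (1 if i < remainder else 0)
    return per_minute

# Example: a device that stores steps in 5-minute units.
device_records = [(0, 520), (5, 0), (10, 123)]
print(split_to_minutes(device_records, 5))
```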
In operation 605, the electronic device 101 (for example, the processor 120) may analyze the corrected data. For example, various clustering or pattern recognition technologies may be utilized for data analysis methods. The processor 120 may analyze the corrected data using various technologies, so as to extract an activity that is determined to be meaningful to a user. According to medical opinion, a person may be healthy if he or she can walk an average of 100 or more steps over 10 minutes. The processor 120 may analyze the received data using density-based spatial clustering of applications with noise (DBSCAN) among various clustering technologies based on the medical opinion.
According to various embodiments, the processor 120 may search for, from data divided by one minute, a cluster under the condition that the minPts of the DBSCAN is 10 or more steps and the eps is one minute, and determine whether a user's activity is walking or running based on the number of steps within the corresponding cluster. For reference, the DBSCAN has two variables: the minPts is the minimum number of objects to be included in one cluster, and the eps means the distance between objects. Here, the object is the number of steps, and the distance between the objects may be one minute, which is the unit of storing data of the electronic device 101. The processor 120 may include, in one cluster, objects having 10 or more steps from the data divided by one minute. Once the initial condition for inclusion in one cluster has been achieved and data generation (e.g., generation of the number of steps) is continuously maintained, the processor 120 may maintain a state where data is included in the cluster until a termination condition occurs. For example, the initial condition may be 10 or more steps per minute, or an average of 100 or more steps for 10 minutes. The termination condition may correspond to a case where there are no detected steps or fewer than 10 steps per minute. Data analysis of the number of steps will be described in detail with reference to FIGS. 7A and 7B below.
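The cluster extraction described above may be pictured with the following minimal sketch, which keeps only the per-minute initial condition (10 or more steps) and the termination condition (fewer than 10 steps). The simplified one-dimensional treatment of DBSCAN (minPts of 10 steps, eps of one minute) and the helper name are assumptions of the example.

```python
def extract_step_clusters(steps_per_minute, min_steps=10):
    """Group consecutive minutes with >= min_steps into clusters (workouts).

    steps_per_minute: list of step counts, one entry per minute.
    Returns a list of (start_minute, end_minute, total_steps) tuples.
    """
    clusters = []
    start, total = None, 0
    for minute, steps in enumerate(steps_per_minute):
        if steps >= min_steps:
            if start is None:           # initial condition satisfied
                start, total = minute, 0
            total += steps
        elif start is not None:         # termination condition
            clusters.append((start, minute - 1, total))
            start, total = None, 0
    if start is not None:
        clusters.append((start, len(steps_per_minute) - 1, total))
    return clusters

# Minutes 2-5 form one cluster; minute 8 forms another.
print(extract_step_clusters([0, 3, 40, 120, 95, 60, 0, 2, 110, 0]))
```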
In operation 607, the electronic device 101 (for example, the processor 120) may extract activity information based on the result of analysis. According to an embodiment, when the analyzed data is used for classifying walking and running, the processor 120 may compare the sum of the walking with the sum of the running so as to extract the activity that has the larger value as activity information. For example, when the sum of the walking is 1000 and the sum of the running is 400 in one cluster, the processor 120 may extract a walking time and a total number of steps as the activity information. Alternatively, when the sum of the walking is 500 and the sum of the running is 2000, the processor 120 may extract a running time and a total running distance as activity information. According to various embodiments, when the electronic device is configured to acquire location information or when the user has allowed location information to be acquired, location information on the walking distance or the running distance may be extracted as the activity information.
According to the embodiment, when the analyzed data is not used for classifying walking and running, the processor 120 may determine the workout type by using the number of steps per unit time. For example, the processor 120 may determine the workout type as 'running' when the number of steps is 150 or more per minute, and as 'walking' when the number of steps is less than 150 per minute. The processor 120 may compare the determined sum of the walking with the sum of the running, so as to extract the activity that has the larger value as activity information.
According to an embodiment, the processor 120 may analyze the frequency of steps per unit time (e.g., one minute) so as to extract activity information. The processor 120 may calculate speed information using the GNSS module 227, or may determine whether the activity corresponds to an outdoor activity using an ultraviolet sensor. For example, the processor 120 may calculate the moving speed over 10 minutes based on a change in location information when the number of steps generated for 10 minutes is equal to or greater than a predetermined number of steps (e.g., 1000 steps). When the calculated moving speed is equal to or higher than a predetermined speed (e.g., 20 km/h), the processor 120 may extract the cycling, the moving speed, the moving distance, and the like as activity information. Alternatively, when the calculated moving speed is within a bicycle speed range (e.g., 20 km/h to 50 km/h), the processor 120 may extract the cycling, the moving speed, the moving distance, and the like as activity information. For example, when the number of steps is equal to or greater than a predetermined number of steps (e.g., 100 steps) or is generated in a predetermined pattern, the processor 120 may recognize the activity as an outdoor activity if the sensor value of the ultraviolet sensor is a value that can be collected outdoors.
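The workout-type decision in the preceding paragraphs may be sketched as follows. The 150 steps-per-minute boundary between walking and running and the 20 km/h to 50 km/h bicycle speed range follow the description above; the function signature and the fallback to non-activity are assumptions of the example.

```python
def classify_workout(steps_per_minute, avg_speed_kmh=None):
    """Classify a one-minute sample using step rate and optional GNSS speed."""
    if avg_speed_kmh is not None and 20.0 <= avg_speed_kmh <= 50.0:
        return "cycling"        # within the assumed bicycle speed range
    if steps_per_minute >= 150:
        return "running"
    if steps_per_minute >= 10:
        return "walking"
    return "non-activity"

print(classify_workout(160))                      # running
print(classify_workout(40, avg_speed_kmh=25.0))   # cycling
```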
In operation 609, the electronic device 101 (for example, the processor 120) may store the extracted activity information. The processor 120 may store the extracted activity information in the memory 130. The activity information represents information such as walking or running, a total number of steps, a total running distance, and the like, and may be provided through a user interface in a form that a user may easily distinguish.
FIGS. 7A and 7B illustrate an example of data analysis according to various embodiments.
Referring to FIGS. 7A and 7B, the electronic device 101 (for example, the processor 120) may determine activity information based on the number of steps per unit time. For example, the processor 120 may classify health related data in a unit of one minute, and include, in a workout 1 710, an object which has generated 10 or more steps per minute. The workout 1 710 may include objects (for example, 721 and 722) walking an average of 100 or more steps over ten minutes and an object 723 walking 10 or more steps per minute. For example, if the initial condition is achieved once and the number of steps is continuously generated, the processor 120 may include, in the cluster 1 720, the number of steps generated until the termination condition occurs. When the objects (for example, 721 and 722) which have generated an average of 100 or more steps over 10 minutes are included in the cluster 1 720, the processor 120 may also include the object 723, which has generated 10 or more steps per minute, in the cluster 1 720.
According to an embodiment, when an average of 100 or more steps has been generated during a predetermined time unit (e.g., one minute, five minutes, or 10 minutes), the processor 120 may determine that the initial condition for inclusion in one workout (or cluster) is satisfied. Here, the termination condition may correspond to a case where there are no detected steps or fewer than 10 steps per minute. For example, the processor 120 may not include, in the workout 1 710, objects (for example, 724, 725, and 726) in which there are no detected steps, or the object 727 that has generated fewer than 10 steps.
According to various embodiments, the processor 120 may classify health related data in a unit of one minute, and if the number of steps per minute is less than a predetermined number of steps (e.g., 10 steps), the processor 120 may determine that the initial condition is not satisfied, and may not include the data in a workout 2 750. For example, since the objects (for example, 767 and 768) do not generate any steps per minute and thus do not satisfy the initial condition for inclusion in the cluster, the objects 767 and 768 may not be included in the workout 2 750. The processor 120 may include, in the cluster 2 760, a case where the number of steps per minute is equal to or greater than a predetermined number of steps (e.g., 10 steps), or where the average number of steps over 10 minutes is equal to or greater than a predetermined number of steps (e.g., an average of 100 steps). When the objects (for example, 761 and 762), which have generated an average of 100 or more steps over 10 minutes, are included in the cluster 2 760, the processor 120 may also include the objects (for example, 763, 764, and 765), which have generated 10 or more steps per minute, in the cluster 2 760. The processor 120 may determine that the object 766 walking fewer than 10 steps corresponds to the termination condition, and may not include the object 766 in the cluster 2 760.
FIG. 8 illustrates a data integration method of an electronic device according to various embodiments.
FIG. 8 illustrates an operation of extracting activity information using one or more pieces of data. Referring to FIG. 8, in operation 801, the electronic device 101 (e.g., the processor 120) may collect health related data from one or more devices. The processor 120 may receive health related data from the external device (or external electronic device) (e.g., one of the electronic device 102, the electronic device 104, and the wearable device 1 420 to the wearable device 4 450) through the communication interface 170. Alternatively, the electronic device 101 may acquire health related data using various sensor modules (e.g., the sensor module 240 of FIG. 2).
According to various embodiments, the electronic device 101 may include a first motion sensor in the housing (or body). The housing may be understood as a frame (or case) that accommodates components of the electronic device 101 (e.g., the processor 120, the memory 130, etc.). The first motion sensor may be a sensor (e.g., the sensor module 240) that measures the number of steps, activity information, and non-activity information. The processor 120 may monitor the movement of the housing using the first motion sensor so as to generate first data for a first time period. In addition, the external device may include a second motion sensor. The second motion sensor may be a sensor (e.g., the step number counting unit 421 and the workout measurement unit 422) that measures the number of steps, activity information, and non-activity information of the external device. The processor 120 may generate a wireless communication channel (e.g., a communication protocol associated with the external device) with the external device using the communication interface 170 (or referred to as a "wireless communication circuit"). The processor 120 may receive the second data acquired for the first time period through the wireless communication channel, and calculate, as a value for the first time period, a value smaller than the sum of a first value based on the first data and a second value based on the second data.
In operation 803, the electronic device 101 (for example, the processor 120) may analyze the collected data and classify the same depending on activity types. For example, the processor 120 may correct the data as shown in operation 603 of FIG. 6 before the data analysis is performed. Since the unit time for storing data is different for each device, a pre-processing operation is required to process data received from a plurality of devices. The processor 120 may divide the received data according to a unit time (e.g., one minute) for storing data of the electronic device 101. The processor 120 may analyze the corrected data and classify the corrected data depending on the activity types. For example, the activity types may be classified into three categories: the number of steps (e.g., a first activity type), activity information (e.g., a second activity type), and non-activity information (e.g., a third activity type). Hereinafter, the three classifications of activity types are described as examples, but the activity types are not limited thereto.
According to various embodiments, the classifications of data depending on the activity types may be for easily integrating data with different characteristics. For example, the processor 120 may classify the number of steps, activity information, and non-activity information for each device.
For example, the number of steps has the characteristic of being constantly accumulated by a user, and its start and end may not be clear. This is because, if movement is detected even during sleeping or sitting, it may be determined that a number of steps has been taken by the user. In addition, due to various issues such as hardware constraints and data accuracy of external devices, the unit times for storage are very diverse, such as 1 day, 10 minutes, 5 minutes, and the like. Therefore, in order to provide a more accurate and meaningful number of steps to the user, it may be required to correct the number of steps detected by each of the plurality of external devices.
The activity information may be a record of activities having a clear start and end, such as running, walking, swimming, cycling, yoga, and the like. Such activity information may be recorded by a user, or may be automatically recognized and stored by an external device or the electronic device 101. However, the activity information may be detected in duplicate when collected from a plurality of devices.
The non-activity information may be a record that represents a state where a user is inactive, such as sleeping, resting (e.g., sitting), and the like. Such non-activity information may leave an empty space in which there is no detection, because the recognition rate differs for each device. When such empty spaces are integrated, some erroneous recognition of activities, that is, a case where a sleeping state or a sitting situation is falsely recognized as walking, may be excluded, and more accurate information may be provided to the user.
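To make the classification in operation 803 concrete, the following sketch tags each raw record with one of the three activity types discussed above. The record format and the mapping rules are assumptions of the example, not the disclosed classification logic.

```python
WORKOUTS = {"running", "walking", "cycling", "swimming", "yoga"}

def classify_record(record):
    """Map a raw per-device record to one of the three activity types.

    record: dict with optional keys "steps" and "label".
    """
    if record.get("label") in WORKOUTS:
        return "activity"        # second activity type
    if record.get("steps", 0) > 0:
        return "steps"           # first activity type
    return "non-activity"        # third activity type

records = [{"steps": 120}, {"label": "swimming"}, {"steps": 0}]
print([classify_record(r) for r in records])  # ['steps', 'activity', 'non-activity']
```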
In operation 805, the electronic device 101 (for example, the processor 120) may integrate data for each activity type. For example, the processor 120 may list the number of steps, activity information, or non-activity information in a time sequence, and may integrate data of the same activity type into one piece of data. According to various embodiments, the processor 120 may integrate the data based on the priority of at least one of time, an activity type, a workout type, and a device. The priority may be configured by the user or configured in the electronic device 101 by a default value. For example, when time has the higher priority, the processor 120 may integrate data based on the activity type that occurred first. Alternatively, when the activity type has the higher priority, the processor 120 may integrate the data based on the start and end of at least one of the number of steps, the activity information, or the non-activity information, and may then integrate data for the remaining activity types. For example, when the activity information has the higher priority, and the activity information overlaps the number of steps or the non-activity information, the processor 120 may integrate data on the activity information based on the start and end of the activity information, and then integrate data on the number of steps or the non-activity information.
Alternatively, when the workout type has the higher priority, the processor 120 may integrate the data based on the start and end of at least one of walking, running, cycling, and swimming, and may then integrate the data for the remaining activity types. For example, when cycling has the higher priority, the processor 120 may integrate data for cycling based on the start and end of the cycling, and then integrate data on the remaining activity information, number of steps, or non-activity information. Alternatively, when the device has the higher priority, the processor 120 may integrate at least one of the number of steps, activity information, or non-activity information of a device having a lower priority, based on the start and end of a device having a higher priority.
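One possible illustration of the priority-based integration in the two preceding paragraphs is the following sketch, in which overlapping records are resolved in favor of the record whose activity type has the higher priority. The interval representation and the priority table are assumptions of the example; a practical implementation could trim overlapping intervals rather than discard the lower-priority record entirely.

```python
PRIORITY = {"activity": 0, "non-activity": 1, "steps": 2}  # lower value = higher priority (assumed)

def integrate_by_priority(records):
    """Keep a record only if no higher-priority record overlaps it in time.

    records: list of dicts with "type", "start", and "end" (minutes).
    """
    kept = []
    for rec in sorted(records, key=lambda r: PRIORITY[r["type"]]):
        overlaps = any(rec["start"] < k["end"] and k["start"] < rec["end"]
                       for k in kept)
        if not overlaps:
            kept.append(rec)
    return sorted(kept, key=lambda r: r["start"])

records = [
    {"type": "steps", "start": 0, "end": 30},
    {"type": "activity", "start": 10, "end": 25},     # wins over the overlapping steps record
    {"type": "non-activity", "start": 40, "end": 60},
]
print(integrate_by_priority(records))
```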
A method for integrating data for the number of steps, the activity information, and the non-activity information will be described in detail with reference to the following drawings.
In operation 807, the electronic device 101 (for example, the processor 120) may correct the integrated data based on weights. According to various embodiments, the processor 120 may assign different weights to at least one of an activity type, a workout type, and a device. For example, when a weight is assigned to an activity type, since it may be determined that the activity information is more meaningful to a user than the number of steps or the non-activity information, the processor 120 may assign a higher weight to the activity information. For example, when performing data correction, the processor 120 may configure higher weights in the sequence of activity information, non-activity information, and number of steps. Since the number of steps is always automatically counted and the error recognition rate is high due to technical limitations, the lowest weight may be assigned to the number of steps. Alternatively, with respect to assigning weights to workout types, more accurate activity information may be provided to the user by assigning a higher weight to cycling, swimming, and the like, which have clearer start and end times than walking or running. Alternatively, with respect to assigning weights to the device, more accurate activity information may be provided to the user by assigning a higher weight to the electronic device 101 than to the external device.
According to various embodiments, the processor 120 may assign different weights to each of an activity type, a workout type, and a device, and may correct data by comprehensively considering each weight. For example, the processor 120 may configure different weights for workout types or devices depending on the activity types. When the second activity type, that is, the activity information, has a higher weight, the processor 120 may assign a higher weight to the external device than to the electronic device 101. Alternatively, the processor 120 may configure a different weight for each device according to the workout type. When the workout type is swimming, the processor 120 may assign a higher weight to the external device than to the electronic device 101, and when the workout type is cycling, the processor 120 may assign a higher weight to the electronic device 101 than to the external device. This method may be intended to provide more meaningful and more accurate information to the user.
According to various embodiments, the processor 120 may integrate data based on the priorities, but may also correct the integrated data based on the weights. That is, when integrating data according to the priority in operation 805, the processor 120 may skip operation 807. Alternatively, when integrating data according to the priority in operation 805, the processor 120 may still perform operation 807. According to various embodiments, the weight may be configured by the user or configured in the electronic device 101 by a default value. According to various embodiments, the processor 120 may assign a different weight to each priority, and may correct data by comprehensively considering each weight. For example, the weight configured for the data correction may be proportional or inversely proportional to the priority configured for the data integration. For example, when the priority configured for the data integration is high, the weight configured for the data correction may also be high. Alternatively, when the priority configured for the data integration is high, the weight configured for the data correction may be low.
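The weight-based correction of operation 807 may be pictured as a weighted combination of values reported for the same interval, with higher weights for the activity type, workout type, or device that is considered more reliable. The specific weight values, the record format, and the use of a weighted average in the sketch below are assumptions of the example.

```python
# Illustrative weights only; per the description they may be user-configured or defaults.
TYPE_WEIGHT = {"activity": 0.5, "non-activity": 0.3, "steps": 0.2}
DEVICE_WEIGHT = {"electronic_device": 0.6, "external_device": 0.4}

def corrected_value(samples):
    """Weighted average of values reported by several sources for the same interval.

    samples: list of dicts with "value", "type", and "device".
    """
    weighted_sum = 0.0
    total_weight = 0.0
    for s in samples:
        w = TYPE_WEIGHT[s["type"]] * DEVICE_WEIGHT[s["device"]]
        weighted_sum += w * s["value"]
        total_weight += w
    return weighted_sum / total_weight if total_weight else 0.0

samples = [
    {"value": 520, "type": "steps", "device": "electronic_device"},
    {"value": 480, "type": "steps", "device": "external_device"},
]
print(round(corrected_value(samples)))  # closer to the more heavily weighted device
```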
In operation 809, the electronic device 101 (for example, the processor 120) may provide a user interface based on the corrected data. The user interface may provide activity information on the corrected data along with activity information for each device.
According to various embodiments, the processor 120 may generate first data (e.g., number of steps, cycling, swimming, non-activity, etc.) for a first time period by using a motion sensor (e.g., sensor module 240) provided in the electronic device 101, determine a first attribute (e.g., the number of steps) of the movement of the electronic device 101 during a first session of the first time period using a first portion of the first data, determine a second attribute (e.g., activity information) of the movement of the electronic device 101 during a second session of the first time period by using a second portion of the first data, select one of the first attribute or the second attribute, and display at least one of an image, text, or a symbol representing the selected attribute through a user interface displayed on the display 160.
When an input selecting a health-related application is received from a user in order to view health-related data, the processor 120 may provide the user interface through the selected application. Alternatively, when each device is connected (or paired) with the electronic device 101, the processor 120 may execute an application associated with the connected device to provide the user interface. The processor 120 may attach a tag (e.g., an auto tag) to the activity information on the corrected data so as to allow a user to easily recognize the corrected data. Various embodiments of the user interface will be described in detail with reference to the following drawings.
FIG. 9 illustrates a method for integrating the number of steps by an electronic device according to various embodiments.
FIG. 9 may be a drawing that embodies the data integration operation 805 of FIG. 8. That is, FIG. 9 illustrates the operation of integrating the number of steps for the first activity type. Referring to FIG. 9, in operation 901, the electronic device 101 (e.g., the processor 120) may classify the number of steps for each device per unit time. Each device may have a different unit time for storing data according to the performance of its hardware or software. For example, the number of steps may be stored in each device in various units of time, such as 1 minute, 5 minutes, 10 minutes, and the like. Thus, in order to integrate the number of steps collected from devices having different storage units, the data needs to be classified into units of a predetermined size. To this end, the processor 120 may classify the number of steps for each device according to the unit time in which the electronic device 101 stores data.
Although the number of steps a user has taken may be integrated in various ways, a method which minimizes the error while not letting the user perceive a reduction in the number of steps may be used. For example, the max method may be used as the method of integrating the number of steps. Since a reduction in the number of steps may be a factor that hinders the user's experience, the processor 120 may integrate the number of steps using the max method so as to indicate a value that is always equal to or greater than the number of steps checked on each of the plurality of devices. For reference, when the number of steps is integrated into the average value of a plurality of devices, or when other integration methods are used, the integrated number of steps may be smaller than the number of steps measured by a single device, and thus the max method may be more suitable.
According to various embodiments, the processor 120 may calculate, as the value for a first time period, a value which is less than the sum of a first value based on first data autonomously generated for the first time period and a second value based on second data measured by the external device. In other words, although an embodiment of integrating the number of steps using the max method is described below, the number of steps may be integrated using a method other than the max method.
In operation 903, the electronic device 101 (for example, the processor 120) may assign an index per unit time. For example, since a day is 24 hours, a total of 1,440 indices may be assigned when the index is assigned in a unit of one minute. Since indices start from 0, the processor 120 may assign indices from 0 to 1,439.
In operation 905, the electronic device 101 (for example, the processor 120) may determine the maximum number of steps per unit time based on the index. For example, the processor 120 may determine the maximum number of steps per unit time using Equation 1.
[Equation 1]
combined[x] = max_i(source_i[x])
Here, source_i may refer to a device (e.g., a wearable device or the electronic device 101) that has acquired the number of steps, x may refer to an index, and combined[x] may refer to the maximum number of steps for the unit time corresponding to the index. If i is 0, source_i may refer to the first device that has acquired the number of steps, if i is 1, it may refer to the second device that has acquired the number of steps, and so on for each subsequent device.
Referring to Equation 1, the processor 120 may determine, as the number of steps corresponding to a unit time, the maximum among the numbers of steps measured in that unit time by each of the plurality of devices.
In operation 907, the electronic device 101 (for example, the processor 120) may calculate the integrated number of steps based on the maximum number of steps. For example, the processor 120 may calculate, as the integrated number of steps for the time during which the maximum numbers of steps were determined, the sum of the maximum numbers of steps per unit time. For example, the processor 120 may integrate the number of steps measured by the wearable device 1 420 and the number of steps measured by the wearable device 3 440. Alternatively, the processor 120 may integrate the number of steps measured by the wearable device 1 420, the number of steps measured by the electronic device 101 itself, and the number of steps measured by the application installed in the electronic device 101.
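For illustration, a minimal sketch of operations 901 to 907 and Equation 1 is given below, assuming that each device's step counts have already been classified into per-minute indices; the data structures, function name, and toy values are hypothetical.

```python
from collections import defaultdict

def integrate_steps(sources):
    """Max-method integration: combined[x] = max_i(source_i[x]), summed over indices.

    `sources` is a list of dicts, one per device, mapping a minute index
    (0..1439) to a step count -- a simplified stand-in for the per-device
    classification performed in operation 901.
    """
    combined = defaultdict(int)
    for source in sources:
        for index, steps in source.items():
            combined[index] = max(combined[index], steps)  # Equation 1 (operation 905)
    return sum(combined.values())                          # operation 907

# Toy usage with two devices reporting per-minute counts
device_a = {0: 100, 2: 110, 3: 120}
device_b = {0: 20, 1: 20, 2: 20, 3: 20}
print(integrate_steps([device_a, device_b]))  # 100 + 20 + 110 + 120 = 350
```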
FIG. 10 illustrates an example of integrating the number of steps according to various embodiments.
Referring to FIG. 10, reference numeral '1040' represents the number of steps measured by each device for a predetermined period of time (e.g., 09:00 to 09:10). The electronic device 1010 may store the number of steps in units of one minute, acquiring a total of 443 steps 1016 over the ten minutes. Since the electronic device 1010 stores the number of steps in units of one minute, the processor 120 may acquire, as the number of steps, 100 steps 1011 at 09:01, zero at 09:02, 110 steps 1012 at 09:03, 120 steps 1013 at 09:04, 110 steps 1014 at 09:05, zero at 09:06, 3 steps 1015 at 09:07, and zero from 09:08 to 09:10. The wearable device 1020 may store the number of steps in units of ten minutes, acquiring a total of 200 steps 1027 over the ten minutes; that is, the wearable device 1020 may acquire, as the number of steps, 200 steps 1025 from 09:00 to 09:10. The application 1030 installed in the electronic device 1010 may store the number of steps in units of five minutes, acquiring a total of 50 steps 1037 over the ten minutes. Since the application 1030 stores the number of steps in units of five minutes, it may acquire, as the number of steps, 50 steps 1031 from 09:00 to 09:05, and zero from 09:06 to 09:10.
According to various embodiments, when the number of steps measured by the wearable device 1020 or the application 1030 is divided into units of one minute, the processor 120 may not accurately know the time points at which the steps occurred, so the total number of steps may be divided evenly over the storage unit time (e.g., divided by 10 for a 10-minute unit) to obtain an average value per minute.
Reference numeral '1050' indicates the classification of the number of steps measured by each device into units of one minute for a predetermined time (e.g., from 09:00 to 09:10). Since the wearable device 1020 stores the number of steps in units of 10 minutes, the number of steps per minute may be calculated by dividing the total number of steps 1027 by 10. The number of steps per minute of the wearable device 1020 may thus be 20 steps 1021. That is, the processor 120 may acquire the number of steps per minute of the wearable device 1020 as 20 steps 1021 at 09:01, 20 steps 1022 at 09:02, and 20 steps for each minute from 09:03 to 09:10. Since the application 1030 stores the number of steps in units of five minutes, the number of steps per minute may be calculated by dividing 50 steps 1031, which is the number of steps for the first five-minute unit (e.g., from 09:00 to 09:05), by five, and dividing 0 steps, which is the number of steps for the remaining five-minute unit (e.g., from 09:06 to 09:10), by five. The number of steps per minute of the application 1030 may be 10 steps 1035 from 09:00 to 09:05, and zero 1036 from 09:06 to 09:10. In other words, the processor 120 may acquire the number of steps of the application 1030 per minute as 10 steps 1035 from 09:00 to 09:05, and zero 1036 from 09:06 to 09:10.
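A short sketch of this per-minute classification is shown below; the even distribution over the unit and the handling of any remainder are assumptions made for illustration, and the function name is hypothetical.

```python
def spread_to_minutes(total_steps, bin_start_index, bin_minutes):
    """Evenly distribute a step total stored at a coarser granularity into
    per-minute bins, as done for the wearable device (10-minute unit) and the
    application (5-minute unit) in FIG. 10. Remainder handling is an assumption."""
    per_minute, remainder = divmod(total_steps, bin_minutes)
    bins = {}
    for offset in range(bin_minutes):
        bins[bin_start_index + offset] = per_minute + (1 if offset < remainder else 0)
    return bins

# 200 steps stored for the 10-minute window starting at 09:00 (minute index 540 of the day)
print(spread_to_minutes(200, 540, 10))  # 20 steps per minute from index 540 to 549
```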
The processor 120 may determine the maximum among the numbers of steps per unit time as the number of steps for that unit time. The processor 120 may perform the integration 1060 of the number of steps of the plurality of devices by applying Equation 1. For example, at 09:01, the number of steps of the electronic device 1010 is 100 steps 1011, the number of steps of the wearable device 1020 is 20 steps 1021, and the number of steps of the application 1030 is 10 steps 1035, so the maximum number of steps may be 100 steps 1011, which is the number of steps of the electronic device 1010. In this case, the processor 120 may determine the number of steps for the unit time of 09:01 as 100 steps 1061, which is the maximum number of steps.
Similarly, at 09:02, the number of steps of the electronic device 1010 is zero, the number of steps of the wearable device 1020 is 20 steps 1022, and the number of steps of the application 1030 is 10 steps 1032, so the maximum number of steps may be 20 steps 1022, which is the number of steps of the wearable device 1020. In this case, the processor 120 may determine the number of steps for the unit time of 09:02 as 20 steps 1062, which is the maximum number of steps. The processor 120 may calculate the maximum number of steps for each minute from 09:03 to 09:10 in the same manner, and calculate the integrated number of steps as 550 steps 1070, obtained by summing all the calculated maximum numbers of steps. It can be seen that 550 steps 1070, the integrated number of steps, is greater than the number of steps measured by any individual source (e.g., 443 steps 1016, 200 steps 1027, and 50 steps 1037). When the number of steps is integrated using the max method, the user is less likely to experience a reduction in the displayed number of steps.
According to various embodiments, the processor 120 may calculate, as a value (e.g., the integrated number of steps) for a first time period, a value (e.g., 550 steps 1070) which is smaller than the sum of a first value (e.g., 443 steps 1016) based on the first data (e.g., the number of steps measured by the electronic device 1010) acquired for the first time period (e.g., from 09:00 to 09:10) and a second value (e.g., 200 steps 1027) based on the second data (e.g., the number of steps measured by the wearable device 1020).
FIG. 11 illustrates a method for integrating activity information by an electronic device according to various embodiments.
FIG. 11 may be a drawing that embodies the data integration operation 805 of FIG. 8. That is, FIG. 11 illustrates the operation of integrating the activity information on the second activity type. Referring to FIG. 11, in operation 1101, the electronic device 101 (e.g., the processor 120) may determine a workout type based on the result of data analysis. The workout type may be obtained by classifying various workouts, such as walking, running, cycling, swimming, yoga, and the like. According to various embodiments, the processor 120 may determine, on the basis of the result of data analysis, that the workout type is 'running' when the number of steps measured for one minute is equal to or greater than a first predetermined number of steps (e.g., 150 steps), and determine that the workout type is 'walking' when the number of steps measured for one minute is less than the first predetermined number of steps. The processor 120 may compare the sum of the steps classified as walking with the sum of the steps classified as running, and determine the workout having the larger value as the workout type. For example, if the sum of the walking is 1,000 and the sum of the running is 400, the processor 120 may determine the workout type as 'walking'. The first predetermined number of steps may be configured by the user or configured in the electronic device 101 by default.
According to various embodiments, the processor 120 may calculate a movement speed based on a change in location information over 10 minutes when the number of steps taken during a predetermined period of time is equal to or greater than a second predetermined number of steps (e.g., 1,000 steps). When the calculated moving speed is equal to or higher than a predetermined speed (e.g., 20 km/h), the processor 120 may identify that the movement corresponds to cycling, and determine the workout type as 'cycling'. In addition, when the calculated moving speed is less than the predetermined speed (e.g., 20 km/h), the processor 120 may identify that the movement corresponds to running, and determine the workout type as 'running'. The predetermined speed may be configured by the user or configured in the electronic device 101 by default.
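A minimal sketch of this classification is given below, assuming per-minute step counts and an optional speed estimate are already available; the thresholds are taken from the examples above, while the function name and the exact decision order are assumptions.

```python
def classify_workout(steps_per_minute, total_steps, speed_kmh=None,
                     running_threshold=150, step_threshold=1000, cycling_speed=20.0):
    """Rough workout-type classification following the thresholds described above.
    The decision order and fallback behaviour are assumptions made for illustration."""
    if total_steps >= step_threshold and speed_kmh is not None:
        # Enough steps in the period: decide between cycling and running by speed.
        return "cycling" if speed_kmh >= cycling_speed else "running"
    # Otherwise compare the step sums classified per minute as running vs. walking.
    running_sum = sum(s for s in steps_per_minute if s >= running_threshold)
    walking_sum = sum(s for s in steps_per_minute if 0 < s < running_threshold)
    return "running" if running_sum > walking_sum else "walking"

print(classify_workout([160, 170, 40], total_steps=370))               # 'running'
print(classify_workout([100] * 12, total_steps=1200, speed_kmh=25.0))  # 'cycling'
```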
In operation 1103, the electronic device 101 (for example, the processor 120) may identify the start and end of the workout type. The processor 120 determines a workout type using data detected from a plurality of devices, but the start time and the end time of the workout type may be different for each device. The processor 120 may check the start time and the end time of the workout type measured by each device. For example, the start time of cycling measured by the wearable device 1020 may be 09:30, while the start time autonomously measured by the electronic device 101 may be 09:20. Alternatively, the end time of the cycling measured by the wearable device 1020 may be 10:30, while the end time autonomously measured by the electronic device 101 may be 10:20. In this case, since the start time and the end time are different for one workout type, the start time and the end time need to be matched for the data integration.

In operation 1105, the electronic device 101 may check the priority. The priority may be associated with at least one of time, an activity type, a workout type, and a device. The priority may be configured by the user or configured in the electronic device 101 as a default value, and then stored in the memory 130. The processor 120 may check whether the priority associated with the activity information is stored in the memory 130.
In operation 1107, the electronic device 101 (for example, the processor 120) may integrate activity information for each workout type based on the priority. For example, when a higher priority is given to time, the processor 120 may integrate activity information based on the start time and the end time of the workout type that occurred first. Alternatively, when a higher priority is given to the workout type, the processor 120 may integrate activity information based on the start time and the end time of the workout type that has the higher priority. For example, when the priority is higher in the sequence of cycling, walking, running, and swimming, and the workout times of cycling and walking partially overlap, the processor 120 may integrate the activity information on the cycling based on the start time and the end time of the cycling. The processor 120 may determine the workout time of the walking so as not to overlap with the workout time of the cycling, and integrate the activity information on the walking accordingly.
When the priority is given to the device, the priority may be higher in the sequence of the external device, the electronic device 101, and an application. For example, the processor 120 may first integrate the activity information based on the workout time of the workout type measured by the external device, and then integrate the activity information using the workout time of the workout type measured by the electronic device 101.
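The priority-based integration of operation 1107 might be sketched as follows; the interval representation, the trimming of lower-priority sessions against already-kept sessions, and the priority map are assumptions for this illustration, not a definitive implementation.

```python
def integrate_by_priority(sessions, priority):
    """Resolve overlapping workout sessions by keeping higher-priority sessions
    intact and trimming lower-priority ones (a simplified reading of operation 1107).
    `sessions` are (workout_type, start, end) tuples and `priority` maps a workout
    type to a rank (lower number = higher priority)."""
    ordered = sorted(sessions, key=lambda s: priority.get(s[0], len(priority)))
    integrated = []
    for workout, start, end in ordered:
        for _, kept_start, kept_end in integrated:
            # Trim this session so it does not overlap an already-kept session.
            if start < kept_end and end > kept_start:
                if start >= kept_start:
                    start = max(start, kept_end)
                else:
                    end = min(end, kept_start)
        if start < end:
            integrated.append((workout, start, end))
    return sorted(integrated, key=lambda s: s[1])

# Cycling has a higher priority than walking; the walking session is trimmed.
priority = {"cycling": 0, "walking": 1, "running": 2, "swimming": 3}
sessions = [("walking", 9.0, 10.0), ("cycling", 9.5, 11.0)]
print(integrate_by_priority(sessions, priority))
# [('walking', 9.0, 9.5), ('cycling', 9.5, 11.0)]
```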
FIG. 12 illustrates an example of integrating activity information according to various embodiments.
Referring to FIG. 12, the processor 120 may integrate activity information based on the start time of the workout type. For example, when it is recognized that the cycling 1211 has started first, as a result of analysis of the data measured by the application 1210, the processor 120 may integrate the activity information 1241 on the cycling based on the start time t1 of the cycling 1211. The application 1210 may refer to an application installed in the electronic device 1220. The processor 120 may check the end time t3 of the cycling 1211 and determine whether the end time t3 of the cycling 1211 overlaps with other activity information.
Since the processor 120 integrates the activity information based on time, when the activity information (e.g., first activity information) that has started first overlaps other activity information (e.g., second activity information), the processor 120 may integrate the second activity information based on the end time of the first activity information. For example, it can be seen that the end time t3 of the cycling 1211 overlaps the start time t2 of the swimming 1 1221 measured by the electronic device 1220. In this case, the processor 120 may integrate the activity information 1241 on the cycling until the end time t3 of the cycling 1211, and integrate the activity information on the swimming after the end time t3 of the cycling 1211.
For example, a user may configure the swimming start time on the electronic device 1220 before beginning to swim. Alternatively, a user may configure an expected swimming time (e.g., 30 minutes, one hour, etc.) on the electronic device 1220. In this case, the swimming time actually taken by the user may be different from the swimming workout time measured by the electronic device 1220. In addition, a user may not swim with the electronic device 1220 attached to the body, whereas, in general, the user may swim while wearing the wearable device 1230. For example, the start time t2 of the swimming 1 1221 measured by the electronic device 1220 may be different from the start time t4 of the swimming 2 1231 measured by the wearable device 1230. In addition, the end time t5 of the swimming 1 1221 measured by the electronic device 1220 may be different from the end time t6 of the swimming 2 1231 measured by the wearable device 1230.
In consideration of the above description, when integrating activity information on the swimming, the processor 120 may integrate the activity information 1242 on the swimming based on data on the swimming 1 1221 measured by the electronic device 1220 and data on the swimming 2 1231 measured by the wearable device 1230. The processor 120 may integrate the activity information 1242 on the swimming based on the workout time t2 to t5 of the swimming 1 1221, measured by the electronic device 1220, after the end time t3 of the cycling 1211, and the workout time t4 to t6 of the swimming 2 1231 measured by the wearable device 1230. In this case, activity information acquired from different devices is processed as one continuous activity, so that the user may intuitively identify the activity information.
According to various embodiments, when integrating activity information, the activity information may be integrated by assigning different priorities to each device. Unlike cycling, it can be seen that two devices have measured the swimming. The processor 120 may assign different priorities to each device, so as to integrate the activity information. For example, in FIG. 12, the wearable device 1230 may have a higher priority than the electronic device 1220. The processor 120 may assign a higher priority to the workout time t4 to t6 of the swimming 2 1231 measured by the wearable device 1230 than the workout time t2 to t5 of the swimming 1 1221 measured by the electronic device 1220, so as to integrate the activity information 1242 on the swimming.
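As an illustration only, merging two measurements of the same workout into one continuous session might look like the following sketch, in which the union of the reported time ranges is kept and the session is attributed to the highest-priority device; this simplified reading and the names used are assumptions.

```python
def merge_same_workout(entries, device_priority):
    """Merge overlapping measurements of the same workout from several devices
    into one continuous session (a simplified reading of FIG. 12).
    `entries` are (device, start, end) tuples; the merged session spans the union
    of the reported ranges and is attributed to the highest-priority device."""
    ranked = sorted(entries, key=lambda e: device_priority.get(e[0], len(device_priority)))
    start = min(e[1] for e in ranked)
    end = max(e[2] for e in ranked)
    return {"device": ranked[0][0], "start": start, "end": end}

# The wearable device has a higher priority than the electronic device.
priority = {"wearable": 0, "electronic": 1}
swim = [("electronic", 9.5, 10.5), ("wearable", 9.7, 10.8)]
print(merge_same_workout(swim, priority))
# {'device': 'wearable', 'start': 9.5, 'end': 10.8}
```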
FIG. 13 illustrates a method for integrating non-activity information by an electronic device according to various embodiments.
FIG. 13 may be a drawing that embodies the data integration operation 805 of FIG. 8. That is, FIG. 13 illustrates the operation of integrating non-activity information on a third activity type. Referring to FIG. 13, in operation 1301, the electronic device 101 (e.g., the processor 120) may align inactive intervals in time order. An inactive interval may mean a stationary state, such as sleeping, sitting, and the like, without walking or other activity (e.g., cycling, swimming, etc.). Alternatively, an inactive interval may mean that there is no detection. The processor 120 may align the inactive intervals by time.
In operation 1303, the electronic device 101 (e.g., the processor 120) may determine whether activity information is included between inactive intervals. For example, a small number of steps may be taken by the user during a break. In this case, since the steps are taken when the user moves in the middle of resting, for example to go to the restroom, the activity information may be included between inactive intervals.
When the activity information is included between the inactive intervals, operation 1309 may be performed, and when the activity information is not included therebetween, operation 1305 may be performed.
In operation 1305, the electronic device 101 (for example, the processor 120) may integrate the inactive intervals into one session. The inactive intervals may be integrated through distance-based clustering. For example, when inactive intervals are continuously detected, the processor 120 may integrate the continuously detected inactive intervals into one session.
In operation 1307, the electronic device 101 (for example, the processor 120) may process the integrated session as non-activity information. The processor 120 may store the processed non-activity information in the memory 130.
In operation 1309, the electronic device 101 (e.g., the processor 120) may determine whether the activity information included between the inactive intervals exceeds reference activity information. The reference activity information may be configured by the user or configured in the electronic device 101 as a default value, and then stored in the memory 130. When inactive intervals are continuously detected, activity information for a very short time period detected in the middle is highly likely to have been wrongly recognized. Accordingly, the reference activity information may be used for processing a small movement between inactive intervals as an error; for example, the reference activity information may be 10 steps or less, 5 minutes or less, and so on. The processor 120 may proceed to operation 1305 when the activity information included between the inactive intervals is less than or equal to the reference activity information.
When the activity information included between the inactive intervals exceeds the reference activity information, the processor 120 may perform an operation 1311, and when the activity information included between the inactive intervals is less than or equal to the reference activity information, the processor 120 may perform the operation 1305.
In operation 1311, the electronic device 101 (e.g., processor 120) may perform an activity information integration process when the activity information included between inactive intervals exceeds the reference activity information. The activity information integration process may be an operation described in FIG. 11.
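A minimal sketch of this non-activity integration, assuming the inactive intervals are already sorted and the amount of activity between neighbouring intervals is known, is shown below; the data representation and the 10-step reference value are assumptions.

```python
def integrate_inactivity(intervals, activity_between, reference_steps=10):
    """Simplified reading of FIG. 13: merge consecutive inactive intervals into one
    session unless meaningful activity lies between them. `intervals` are (start, end)
    tuples sorted by time; `activity_between[i]` is the number of steps detected
    between interval i and interval i + 1 (the threshold is an assumption)."""
    sessions = []
    current_start, current_end = intervals[0]
    for i in range(1, len(intervals)):
        start, end = intervals[i]
        if activity_between[i - 1] <= reference_steps:
            # Treat the small movement as an error and keep extending the session.
            current_end = end
        else:
            sessions.append((current_start, current_end))
            current_start, current_end = start, end
    sessions.append((current_start, current_end))
    return sessions

# Sleep until 07:00, 6 steps of movement, then sitting until about 07:55 -> one session.
print(integrate_inactivity([(6.0, 7.0), (7.1, 7.92)], [6]))  # [(6.0, 7.92)]
```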
FIGS. 14A and 14B illustrate an example of integrating non-activity information according to various embodiments.
Referring to FIGS. 14A and 14B, as shown in FIG. 14A, the inactive intervals may be aligned in a time sequence. When there is no data detection, the processor 120 may process the case as an 'unknown interval' such as an object 1411 and an object 1412. The unknown interval may refer to an interval in which there is no data detection in an external device, the electronic device 101, or an application, which measure health related data. The processor 120 may analyze data collected in one or more devices, and when there is no measured data, as a result of analysis, the processor 120 may process the case as an 'unknown interval'. The processor 120 may analyze the data measured by the electronic device 101 as sleep 1420. In addition, the processor 120 may analyze the data measured by the external device as sitting 1430. The processor 120 may recognize the sleep 1420 measured by the electronic device 101 as an inactive interval (e.g., first inactive interval), and recognize the sitting 1430 measured by the external device as an inactive interval (e.g., second inactive interval). When there is no data detection between data measured by the electronic device 101 and the data measured by the external device, the processor 120 may process the case as an unknown interval 1413.
The processor 120 may integrate inactive intervals, as shown in FIG. 14B. For example, the processor 120 may determine that the unknown interval 1413 between the first inactive interval (e.g., sleep 1420) and the second inactive interval (e.g., sitting 1430) forms a continuous inactive interval. The processor 120 may use a minimum distance of 10 minutes and determine a total of 55 minutes as inactivity information. The processor 120 may integrate the interval from the first inactive interval (e.g., sleep 1420) to the second inactive interval (e.g., sitting 1430) into one session. The processor 120 may process the integrated session, which includes the first inactive interval (e.g., sleep 1420), the unknown interval 1413, and the second inactive interval (e.g., sitting 1430), as inactivity information (stationary) 1440.
FIG. 15 illustrates an example of integrating various activity type data according to various embodiments.
Referring to FIG. 15, the processor 120 may integrate number of steps 1510, inactivity information 1520, and activity information 1530 according to each activity type, as shown in FIG. 9 to FIG. 14B (indicated by reference numeral 1540), and correct the integrated data by considering the priority (or weight) of the activity type. Alternatively, the processor 120 may integrate the number of steps 1510, inactivity information 1520, and activity information 1530 by considering the priority (or weight) of the activity type.
The processor 120 may integrate the number of steps 1 (Step 1, 1511) from t1 to t3 (e.g., 9 minutes), integrate the number of steps 2 (Step 2, 1512) from t5 to t8 (e.g., 9 minutes), and integrate the number of steps 3 (Step 3, 1513) from t9 to t11 (e.g., 30 minutes). In addition, the processor 120 may integrate inactivity information 1521 from t6 to t10. In addition, the processor 120 may integrate activity information on the cycling 1531 from t2 to t4, and integrate activity information on the swimming 1532 from t4 to t7.
According to various embodiments, the priority may be higher in the order of the activity information 1530, the inactivity information 1520, and the number of steps 1510. The processor 120 may configure a higher priority to the activity information 1530 because it may be determined that the activity information 1530 is more meaningful to a user than the number of steps 1510 or the inactivity information 1520. For reference, since the number of steps 1510 has a high error recognition rate, the processor 120 may configure the lowest priority to the number of steps 1510.
According to various embodiments, since the priority of the activity information 1530 is higher than that of the number of steps 1510, when the number of steps 1 1511 overlaps the activity information on the cycling 1531, the processor 120 may correct the number of steps 1 1511 based on the activity information on the cycling 1531. For example, when the start time t2 of the cycling 1531 falls between the start time t1 and the end time t3 of the number of steps 1 1511, the processor 120 may correct the end time t3 of the number of steps 1 1511 to be the start time t2 of the cycling 1531. That is, the corrected number of steps (S) 1541 may be corrected so as to be generated during the time from t1 to t2. In addition, the activity information 1542 on the cycling 1531 may be corrected so as to be generated during the time from t2 to t4.
For the same activity type (e.g., the activity information 1530), the processor 120 may correct the workout time based on at least one of a time, a workout type, and a device. For example, the processor 120 may correct the activity information 1543 on the swimming 1532 after the correction of the activity information 1542 on the cycling 1531, which started first in time, has been completed. When the end time t7 of the swimming 1532, the start time t5 of the number of steps 2 1512, and the start time t6 of the inactive interval 1521 overlap with one another, the processor 120 may place a priority on the swimming 1532 based on the priority of the activity type, and correct the number of steps 2 1512 and the inactive interval 1521 to follow the end time t7 of the swimming 1532. Accordingly, the activity information 1543 on the swimming 1532 may be corrected so as to be generated during the time from t4 to t7.
Here, since the error recognition rate of the number of steps 1510 is high, the processor 120 may correct the number of steps 2 1512 based on the start time t6 of the inactive interval 1521 when the start time t5 of the number of steps 2 1512 and the start time t6 of the inactive interval 1521 overlap. In this case, the number of steps 2 1512 may not be included in the integrated data. That is, when the number of steps 2 1512 is temporarily generated during the inactive interval 1521, the processor 120 may process the number of steps 2 1512 as an error.
To this end, the processor 120 may determine whether the number of steps 2 1512 that has been generated from the start time of the inactive interval 1521 exceeds the reference activity information. The reference activity information may be used for processing a small movement between inactive intervals as an error; for example, the reference activity information may be 10 steps or less, 5 minutes or less, and so on. When the number of steps 2 1512 generated in the inactive interval 1521 is less than or equal to the reference activity information, the processor 120 may process the number of steps 2 1512 as an error and may not include it in the integrated data. That is, the processor 120 may ignore the number of steps 2 1512 and may not include it in the integrated data.
In addition, when the end time t10 of the inactive interval 1521 overlaps the start time t9 of the number of steps 3 1513, the processor 120 may correct the number of steps 3 1513 based on the end time t10 of the inactive interval 1521. In this case, the processor 120 may determine whether the number of steps 3 1513 that has been generated from the start time t9 of the number of steps 3 1513 to the end time t10 of the inactive interval 1521 exceeds the reference activity information. The processor 120 may process a part of the number of steps 3 1513 as an error when the number of steps 3 1513 that has been generated until the end time t10 of the inactive interval 1521 is less than or equal to the reference activity information. Therefore, the inactivity information 1544 may be corrected so as to be generated during the time from t7 to t10. In this case, the corrected number of steps 1545 may be corrected so as to be generated during the time from t10 to t11.
Accordingly, the processor 120 may store the corrected data 1540 in the memory 130.
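For illustration, the correction of a single step interval against higher-priority activity information and inactive intervals, as in FIG. 15, might be sketched as follows; the interval representation, the helper name, and the reference value of 10 steps are assumptions.

```python
def correct_step_interval(step_start, step_end, step_count,
                          activity_intervals, inactive_intervals, reference_steps=10):
    """Sketch of the FIG. 15 correction for one step interval: trim it where it overlaps
    higher-priority activity information, and drop it entirely when it falls inside an
    inactive interval with fewer steps than the reference (assumed to be 10)."""
    # Trim against activity information (higher priority than the number of steps).
    for act_start, act_end in activity_intervals:
        if step_start < act_start < step_end:
            step_end = act_start            # e.g. t3 corrected to t2 in FIG. 15
        elif step_start < act_end < step_end:
            step_start = act_end
    # Discard small bursts of steps that occur entirely during an inactive interval.
    for ina_start, ina_end in inactive_intervals:
        if ina_start <= step_start and step_end <= ina_end and step_count <= reference_steps:
            return None                     # processed as an error
    return (step_start, step_end) if step_start < step_end else None

print(correct_step_interval(1.0, 3.0, 400, activity_intervals=[(2.0, 4.0)],
                            inactive_intervals=[]))            # (1.0, 2.0)
print(correct_step_interval(6.0, 6.2, 8, activity_intervals=[],
                            inactive_intervals=[(5.5, 7.0)]))  # None
```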
FIG. 16 illustrates an example of a user interface for activity information according to various embodiments.
A user interface of FIG. 16 may be a user interface provided in operation 809 of FIG. 8. Referring to FIG. 16, the processor 120 may display a first user interface 1610 based on the corrected data. As an example, the first user interface 1610 may include a map image 1611 displaying activity areas for walking and activity information, integrated information 1612 on consumed calories, moved distance, and the longest period of active time, and activity information on running 1613, walking 1 1614, and walking 2 1615 in a time sequence. For the corrected data, the processor 120 may include a tag (e.g., auto) in the activity information on the running 1613, the walking 1 1614, and the walking 2 1615. Alternatively, when the data is corrected based on the priority, the processor 120 may shade the item color for the corrected data. For example, the processor 120 may display the item colors of the corrected data and the uncorrected data differently.
According to various embodiments, the first user interface 1610 may display at least one of an image, text, or a symbol associated with the corrected data so as to be overlaid on a map associated with locations occupied during a predetermined time period. For example, the first user interface 1610 may display, on the map, data associated with the activity information as icons, text, and the like.
According to various embodiments, the processor 120 may display a second user interface 1620 based on the corrected data. When a user selects one of the activity information on the running 1613, walking 1 1614, and walking 2 1615 through the first user interface 1610, the processor 120 may display the second user interface 1620. For example, when the user selects activity information on the running 1613 through the first user interface 1610, the processor 120 may display the second user interface 1620 that includes brief information 1621 indicating the time and distance for the running, a map image 1622 indicating the running area on the map, and a graph 1623 indicating the running speed.
According to various embodiments, the processor 120 may display a third user interface 1630 based on the corrected data. When the user selects one of the activity information on the running 1613, walking 1 1614, and walking 2 1615 through the first user interface 1610, the processor 120 may display the third user interface 1630. For example, when the user selects activity information on the running 1613 through the first user interface 1610, the processor 120 may display the third user interface 1630 that includes a duration 1631 and distance 1632 for the running, an amount of calories consumed by the running 1633, speed information 1634, pace information 1635, weather information 1636, and a photo button 1637.
The speed information 1634 may include an average speed and a maximum speed according to the time of the running. The pace information 1635 may include an average stride (pace) and a maximum pace according to the distance of the running. The weather information 1636 may include at least one of a weather icon, temperature, a weather type (e.g., clear, cloudy, rain, etc.), humidity, and wind direction. When the user selects a photo button 1637, the processor 120 may execute the camera. When the camera is executed, a camera application is executed and a preview image photographed by the camera may be displayed on the screen of the display 160.
FIG. 17 illustrates an example of a user interface for sharing activity information according to various embodiments.
Referring to FIG. 17, the processor 120 may display a first user interface 1710 for sharing the activity information. For example, when an input selecting a health-related application is received from a user, the processor 120 may display the first user interface 1710 through the selected application. The health-related application may be installed in the electronic device 101 by default, even without a user request. Alternatively, the health-related application may be installed in the electronic device 101 at the request of the user. The first user interface 1710 may be the first screen shown after executing the application, or may be provided when the user selects 'share' in order to share activity information. For example, when the activity information on the walking 1 1614 is selected through the first user interface 1610 of FIG. 16, the processor 120 may display the first user interface 1710.
The first user interface 1710 may include an image 1711, an image add button 1712, a photo button 1713, a chart view button 1714, and a share button 1715. The image 1711 may have been photographed by a user or already registered in an application. When the image add button 1712 is selected, a gallery application including the photographed photos may be executed. Alternatively, when the image add button 1712 is selected, various pop-up menus such as (1) camera execution, (2) gallery execution, and (3) cancellation may be displayed. When the photo button 1713 is selected, a camera application is executed and a preview image photographed by the camera may be displayed on the screen of the display 160. The chart view button 1714 may display detailed information including a map image, a moved distance, a speed, and the like for the activity information displayed on the image 1711. When the share button 1715 is selected, a menu for sharing the activity information displayed on the image 1711 with other users may be displayed. The menu for sharing may be a list of applications (e.g., message, e-mail, etc.) for sharing or a list of partner information (e.g., name, phone number, etc.) included in the contact list.
The processor 120 may display a second user interface 1720 for sharing the activity information. For example, when activity information on the running 1613 is selected through the first user interface 1610 of FIG. 16, the processor 120 may display the second user interface 1720. The second user interface 1720 may include a map image 1721, activity information 1722, a menu list 1723 such as photo, rewards, map view, and chart view, and a share button 1724. The map image 1721 may be a map that designates activity information on the running 1613 as an activity area on the map. The activity information 1722 may include the distance and time for the running 1613. When an item included in the menu list 1723 is selected, a screen for the corresponding item may be displayed. When the share button 1724 is selected, a menu for sharing the activity information displayed on the map image 1721 with other users may be displayed, similarly to the share button 1715.
FIG. 18 illustrates an example of a user interface for configuring a recognition priority of a wearable device according to various embodiments.
Referring to FIG. 18, when currently being connected with a device, the processor 120 may display a first user interface 1810 for configuring a recognition priority. The first user interface 1810 may include a recognition on/off item 1811 for configuring such that the currently connected device is preferentially recognized, a connection guide message 1812 generated according to turning on the on/off item 1811, and a location on/off item 1813 for configuring location information on the currently connected device. When the location on/off item 1813 is turned on, the processor 120 may collect location information on the currently connected device.
When currently not being connected with a device, the processor 120 may display a second user interface 1820 for configuring a recognition priority. The second user interface 1820 may include a recognition on/off item 1821 and a location on/off item 1822 for one of the currently unconnected devices (e.g., the first device) or for a device (e.g., the first device) selected by a user from a list of devices. The second user interface 1820 may be configured such that the recognition on/off item 1821 is ON and the location on/off item 1822 is OFF. In order to protect the personal information of the user, the location on/off item 1822 may generally be turned OFF.
When the second user interface 1820 receives an input of selecting OFF of the recognition on/off item 1821 by a user, the processor 120 may display a third user interface 1830. The third user interface 1830 may be configured such that a recognition on/off item 1831 is OFF and a location on/off item 1822 is OFF. For example, when the touch input to the recognition on/off item 1821 by the user or the user input for changing the ON state configured in the recognition on/off item 1821 to the OFF state is generated, the processor 120 may display the third user interface 1830. The user input for changing the ON state to the OFF state may be a drag input for moving the touched location to the OFF direction after touching ON.
FIG. 19 illustrates an example of a user interface for configuring location information according to various embodiments.
Referring to FIG. 19, the processor 120 may display a first user interface 1910. The first user interface 1910 may be similar to the second user interface 1820 of FIG. 18. For example, the first user interface 1910 may be configured such that a recognition on/off item 1911 is ON and a location on/off item 1912 is OFF. When the first user interface 1910 receives an input of selecting ON of the location on/off item 1912 by a user, the processor 120 may display a second user interface 1920.
The second user interface 1920 may be configured such that a pop-up message 1921 is displayed over the first user interface 1910. The pop-up message 1921 may include a notification message indicating entrance to the setting menu of the electronic device 101 in order to change the setting for providing the location information, a location setting 1922, a cancel button 1923, and a setting button 1924. The location setting 1922 may be provided in the form of a check box. When the check box is checked (e.g., selected), the location setting 1922 is activated, and when the check box is not checked (e.g., not selected), the location setting 1922 may be deactivated.
When the setting button 1924 is selected after the location setting 1922 is selected, the processor 120 may display a third user interface 1930. The third user interface 1930 may be used for the permission settings of applications of the electronic device 101. The third user interface 1930 may display various items (e.g., contacts, location, phone, etc.) associated with the permission settings of an application. Generally, the location permission may be in an OFF state in order to protect the personal information of the user. That is, the location setting item 1931 of the third user interface 1930 may be in an OFF state.
When the location setting item 1931 is selected through the third user interface 1930, the processor 120 may provide a fourth user interface 1940. For example, when the touch input to the location setting item 1931 by the user or the user input for changing the OFF state configured in the location setting item 1931 to the ON state is generated, the processor 120 may display the fourth user interface 1940. The location setting item 1941 in the fourth user interface 1940 may be in an ON state.
When the user changes the location setting item 1941 and then selects OK, the processor 120 may provide a user interface 1950. The user interface 1950 may correspond to the first user interface 1910 in which the location on/off item 1951 is configured to ON. When the location on/off item 1951 is turned on, the processor 120 may collect location information on the device associated with the user interface 1950.
FIG. 20 illustrates a method for displaying a user interface by an electronic device according to various embodiments.
Referring to FIG. 20, in operation 2001, the electronic device 101 (e.g., the processor 120) may detect an event for displaying activity information. The event may be at least one of an application execution event displaying health related data, an event for selecting a map image through the application, and an event for selecting activity information through the application. The processor 120 may provide a user interface or user experience associated with the activity information in response to the detected event. For example, the processor 120 may display the user interface (e.g., 1610 and 1620) of FIG. 16 in response to the detected event.
In operation 2003, the electronic device 101 (for example, the processor 120) may determine whether the location information for the activity information exists. When the location information associated with the activity information exists, the processor 120 may display an activity area for activity information based on the location information. However, when the location information associated with the activity information does not exist, the processor 120 may not display an activity area associated with the activity information on the map.
The processor 120 may perform an operation 2005 when the location information on the activity information exists, and perform an operation 2015 when the location information on the activity information does not exist.
In operation 2005, the electronic device 101 (for example, the processor 120) may calculate the activity area associated with the activity information. The processor 120 may or may not provide location information for each device. In addition, even when the location information is provided, the sampling rate for the location information may be different for each device. Accordingly, the processor 120 may represent the activity area in a circular form in order to provide a user interface. For example, when one piece of location information exists for one piece of activity information, the processor 120 may calculate an activity area having a predetermined radius with reference to location information. The predetermined radius may be configured by the user or configured to the electronic device 101 by a default value. When two pieces of location information exist for one piece of activity information, the processor 120 may calculate the center point using two end points.
The processor 120 may calculate the center point using the average method as shown in Equation 2.
[Equation 2]
Center(a, b) = ((ax + bx) / 2, (ay + by) / 2)
Center (a, b) may be the coordinates of the center point, ax and ay are the coordinates of the first end point, and bx and by are the coordinates of the second end point.
The processor 120 may calculate an activity area for activity information extracted from each device.
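A short sketch of this activity-area calculation, using the midpoint of Equation 2, is shown below; treating half the distance between the end points as the radius and the default radius value are assumptions made for illustration.

```python
import math

def activity_area(points, default_radius=0.001):
    """Build a circular activity area from location samples (Equation 2 style).
    With one sample, the centre is the sample itself and a default radius is used;
    with two or more, the centre is the midpoint of the two end points and the
    radius is half the distance between them (the radius choice is an assumption)."""
    if len(points) == 1:
        return points[0], default_radius
    (ax, ay), (bx, by) = points[0], points[-1]
    center = ((ax + bx) / 2.0, (ay + by) / 2.0)      # Equation 2
    radius = math.hypot(ax - bx, ay - by) / 2.0
    return center, radius

print(activity_area([(37.50, 127.00), (37.52, 127.02)]))
# ((37.51, 127.01), ~0.0141)
```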
In operation 2007, the electronic device 101 (for example, the processor 120) may determine whether there are multiple activity areas. For example, the processor 120 may provide activity information detected by one or more devices, and when there are multiple activity areas, the activity areas may overlap with each other.
When there are multiple activity areas, the processor 120 may perform operation 2009, and when there is only one activity area, the processor 120 may perform operation 2015.
In operation 2009, the electronic device 101 (for example, the processor 120) may calculate the distance between activity areas adjacent to each other. For example, the processor 120 may calculate the distance between the activity areas using the distance between center points of activity areas.
The processor 120 may calculate the distance between two activity areas as shown in Equation 3.
[Equation 3]
dist(a, b) = sqrt((ax - bx)^2 + (ay - by)^2)
dist(a, b) may be the distance between the two activity areas, ax and ay may be the center coordinates of the first activity area, and bx and by may be the center coordinates of the second activity area.
The processor 120 may calculate the distance between two activity areas using the Euclidean distance of Equation 3, and determine whether the two activity areas overlap with each other, using the calculated distance.
In operation 2011, the electronic device 101 (for example, the processor 120) may determine whether the calculated distance is less than a reference distance. The reference distance may be configured by the user or the electronic device 101.
The processor 120 may determine whether the two activity areas overlap with each other as shown in Equation 4.
[Equation 4]
dist(a, b) < α × (radius of icon1 + radius of icon2)
dist(a, b) may be the distance between the two activity areas, α may be the reference distance (coefficient), radius of icon1 may be the radius of the icon of the first activity area, and radius of icon2 may be the radius of the icon of the second activity area.
For example, when the condition obtained by Equation 4 is true, the processor 120 may perform an overlap control process. For example, when the reference distance (e.g., alpha (α)) is 1, an overlap control process between two activity areas may be performed. When the reference distance is greater than 1, the two activity areas may not overlap with each other, and when the reference distance is less than 1, the two activity areas may overlap by a predetermined size or more.
According to various embodiments, the processor 120 may configure the reference distance based on a map ratio of a map image where the active area is to be displayed, and based on the size of the activity area. For example, when the map ratio is increased or decreased, the size of the activity area may be increased or decreased. Alternatively, when the map ratio is increased or decreased, the size of the activity area may be decreased or increased. The map ratio and the size of the activity area may be proportional or inversely proportional to each other. According to various embodiments, when the size of the activity area is changed according to the map ratio, the processor 120 may adjust the reference distance according to the map ratio and the size of the activity area.
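A minimal sketch of the overlap check combining Equations 3 and 4 follows; expressing the reference distance as α times the sum of the two icon radii is an assumption inferred from the description above, and the function name is hypothetical.

```python
import math

def areas_overlap(center_a, radius_a, center_b, radius_b, alpha=1.0):
    """Treat two activity areas as overlapping when the distance between their
    centres (Equation 3) is smaller than the reference distance, taken here as
    alpha * (radius_a + radius_b) -- an assumed reading of Equation 4."""
    dist = math.hypot(center_a[0] - center_b[0], center_a[1] - center_b[1])  # Equation 3
    return dist < alpha * (radius_a + radius_b)                              # Equation 4

print(areas_overlap((0.0, 0.0), 1.0, (1.5, 0.0), 1.0))              # True
print(areas_overlap((0.0, 0.0), 1.0, (1.5, 0.0), 1.0, alpha=0.5))   # False
```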
The processor 120 may perform an operation 2013 when the calculated distance is less than the reference distance, and may perform an operation 2015 when the calculated distance is equal to or greater than the reference distance.
In operation 2013, the electronic device 101 (for example, the processor 120) may correct an icon within the activity area.
According to various embodiments, the processor 120 may place priorities on the activity areas in order to prevent, as much as possible, the activity areas of activity information extracted from the plurality of devices from overlapping with each other. The priority may be configured for at least one of a radius of the activity area, a workout type, and a device. The priority may be configured by the user or by the electronic device 101. According to the priority, a device that can be processed in duplicate and a device that cannot be processed in duplicate may be distinguished and displayed. For example, the device that cannot be processed in duplicate may be displayed shaded. To this end, the processor 120 may perform the following operations.
For example, when the radius of the activity area has the priority, the processor 120 may configure the priority in descending order of the magnitude of the radius of the activity area, maintain icons in the activity area having a higher priority, and delete icons in an activity area having a lower priority. When the workout type has the priority, the processor 120 may maintain the icon in the activity area of a workout type having a higher priority, and delete the icon in the activity area of a workout type having a lower priority. When a device has the priority, the processor 120 may maintain icons in the activity area based on the activity information extracted from a device having a higher priority, and delete icons in the activity area extracted from a device having a lower priority.
In operation 2015, the electronic device 101 (for example, the processor 120) may display a user interface for the activity information. When operation 2015 is performed without operation 2013 having been performed, the processor 120 may display the user interface for the activity information without correcting icons in the activity areas associated with the activity information. When operation 2013 is performed and then operation 2015 is performed, the processor 120 may display a user interface for the activity information in which the icons have been corrected. For example, the processor 120 may display the user interface (e.g., 1610 and 1620) of FIG. 16 as the user interface for the activity information.
FIG. 21 illustrates an example of calculating an activity area according to various embodiments.
Referring to FIG. 21, the processor 120 may calculate an activity area based on location information. When a plurality of pieces of location information (e.g., 2111, 2112, 2113, and 2114) exist for the activity information, the processor 120 may calculate the center point using the two end points (e.g., 2111 and 2114). The processor 120 may calculate the activity area 2110 using the calculated center point. In addition, when one piece of location information (e.g., 2121) exists for one piece of activity information, the processor 120 may calculate an activity area 2120 having a predetermined radius with reference to the location information. In addition, when two pieces of location information exist, the processor 120 may calculate the center point using the two pieces of location information (e.g., 2131 and 2132). The processor 120 may calculate the activity area 2130 using the calculated center point.
FIG. 22A to FIG. 24B illustrate examples of correcting an icon in an activity area based on a threshold according to various embodiments.
FIGS. 22A to 22D show an example of correcting an icon in an activity area when the threshold is small.
FIG. 22A shows activity areas (e.g., 2211 and 2213) for activity information before the overlapped icons are corrected. Referring to FIG. 22A, a first icon 2212 may be displayed in a first activity area 2211, and a second icon 2214 may be displayed in a second activity area 2213. The processor 120 may perform an icon overlapping process by applying different overlapping ratios to the activity areas shown in FIG. 22A. FIG. 22B shows an example in which the icon overlapping process is performed with a low overlapping ratio. FIG. 22C shows an example in which the icon overlapping process is performed with a medium overlapping ratio. FIG. 22D shows an example in which the icon overlapping process is performed with a high overlapping ratio. Referring to FIG. 22B to FIG. 22D, the first icon 2212 included in the first activity area 2211 may be displayed, and the second icon 2214 displayed in the second activity area 2213 may be deleted. That is, when the threshold is small, it can be seen that the second icon 2214 displayed in the second activity area 2213 is deleted even if the overlapping ratio is configured differently.
FIGS. 23A to 23D show another example of correcting an icon in an activity area when the threshold is small.
FIG. 23A shows activity areas for activity information before the overlapped icons are corrected. Referring to FIG. 23A, a first icon 2312 may be displayed in a first activity area 2311, and a second icon 2314 may be displayed in a second activity area 2313. Icons may also be displayed in a third activity area 2315 and a fourth activity area 2316, respectively. The processor 120 may perform an icon overlapping process by applying different overlapping ratios to the activity areas as shown in FIG. 23A. FIG. 23B shows an example in which the icon overlapping process is performed by lowering the overlapping ratio. When the overlapping ratio is configured to be low, as shown in FIG. 23B, the first icon 2312 may be displayed in the first activity area 2311, and the second icon 2314 may be displayed in the second activity area 2313, as shown in FIG. 23A. Icons may also be displayed in the third activity area 2315 and the fourth activity area 2316, respectively.
FIG. 23C shows an example in which the icon overlapping process is performed with a medium overlapping ratio, and FIG. 23D an example in which it is performed with a high overlapping ratio. Referring to FIG. 23C and FIG. 23D, the first icon 2312 displayed in the first activity area 2311 may be deleted, the second icon 2314 may be displayed in the second activity area 2313, and icons may also be displayed in the third activity area 2315 and the fourth activity area 2316, respectively.
Therefore, if the threshold is small, the icon may not be deleted even if the icon overlapping process is performed.
FIGS. 24A and 24B illustrate an example of correcting an icon in an activity area when the threshold is high.
FIG. 24A shows the activity areas for activity information before the overlapped icons are corrected. Referring to FIG. 24A, a first icon 2412 may be displayed in a first activity area 2411, and a second icon 2414 may be displayed in a second activity area 2413. The processor 120 may perform the icon overlapping process with an increased threshold. Referring to FIG. 24B, the first icon 2412 may be displayed in the first activity area 2411, and the second icon 2414 in the second activity area 2413 may be deleted.
Accordingly, when the icon overlapping process is performed with a large threshold, icons that should not be deleted may be deleted.
According to various embodiments, when performing the icon overlapping process, the processor 120 may weight icons by importance (e.g., the priority of the workout type) and by the physical quantity of the activity information, so as to maintain important icons and delete unimportant icons.
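As one possible reading of this weighting, the following sketch scores each icon from an assumed workout-type priority table and the physical quantity of its activity information, and returns the lower-scored icon of an overlapping pair as the one to delete. The names, priorities, and weights are illustrative assumptions, not values from the disclosure.

```python
# Illustrative workout-type priorities (assumptions, not from the disclosure).
WORKOUT_PRIORITY = {"cycling": 3, "running": 2, "walking": 1}

def icon_score(workout_type: str, physical_quantity: float,
               w_type: float = 1000.0, w_quantity: float = 1.0) -> float:
    """Score an icon from its workout-type priority and its physical quantity."""
    return w_type * WORKOUT_PRIORITY.get(workout_type, 0) + w_quantity * physical_quantity

def icon_to_delete(icon_a, icon_b):
    """Of two overlapping icons (workout_type, physical_quantity), return the less important one."""
    return icon_a if icon_score(*icon_a) <= icon_score(*icon_b) else icon_b

print(icon_to_delete(("walking", 1200.0), ("cycling", 800.0)))  # ('walking', 1200.0)
```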
FIG. 25A to FIG. 26C illustrate examples of processing overlapped icons according to various embodiments.
FIGS. 25A to 25C illustrate an example of processing overlapped icons using a greedy set cover. The greedy set cover (Chvatal, V., "A Greedy Heuristic for the Set-Covering Problem," Mathematics of Operations Research, Vol. 4, No. 3, 1979, pp. 233-235) is one of the methods that may be used to process overlapped icons.
Referring to FIGS. 25A to 25C, the processor 120 may display an icon in each of a first activity area 2510, a second activity area 2520, a third activity area 2530, a fourth activity area 2540, and a fifth activity area 2550, as shown in FIG. 25A. For example, if the icons are not prioritized, the processor 120 may display the activity areas in time sequence, as shown in FIG. 25A. The icons of the activity areas may be numbered 1, 2, 3, 4, and 5 in order. The initial set {} is empty and becomes {1} and then {1, 2} in order, and icons 1 and 2 may satisfy the overlapping condition.
The overlapping condition is as described in operation 2111 above: when the distance between the center points of the first activity area 2510 and the second activity area 2520, which include icons 1 and 2, is smaller than the reference distance, the processor 120 may determine that the overlapping condition is satisfied. Since icons 1 and 2 satisfy the overlapping condition, the processor 120 may delete icon 2, that is, delete the icon included in the second activity area 2520, as shown in FIG. 25B.
Then, since icon 2 has been deleted, when adding icon 3 to the set {1}, the processor 120 may configure the set {1, 3} and determine the overlapping condition between icons 1 and 3. Since the overlapping condition is not satisfied between icons 1 and 3, the processor 120 may configure a set {1, 3, 4} by successively adding icon 4. Icons 3 and 4, included in the third activity area 2530 and the fourth activity area 2540, which are two adjacent activity areas, may satisfy the overlapping condition. In this case, the processor 120 may delete the icon included in the fourth activity area 2540, as shown in FIG. 25C. Since icon 4 is deleted, the processor 120 may configure a set {1, 3, 5} by adding icon 5 to the set {1, 3}, and determine the overlapping condition between icons 3 and 5. Since icons 3 and 5 do not satisfy the overlapping condition, the processor 120 may terminate the icon overlapping process.
In FIG. 25C, in which the icon overlapping process has been completed, only the icons for the first activity area 2510, the third activity area 2530, and the fifth activity area 2550 may be maintained, and the icons for the second activity area 2520 and the fourth activity area 2540 may be deleted. In operation 1215, the processor 120 may display a user interface that includes the activity information as shown in FIG. 25C.
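A minimal sketch of this greedy pass is shown below, assuming each icon is summarized by its activity-area center and that "overlap" means the center distance to an already kept icon is below a reference distance. The data, field names, and reference distance are illustrative assumptions, chosen so the result matches FIG. 25C (icons 1, 3, and 5 kept).

```python
from math import hypot
from typing import Dict, List

REFERENCE_DISTANCE = 100.0  # assumed threshold for the overlapping condition

def greedy_icon_filter(icons: List[Dict]) -> List[Dict]:
    """Visit icons in order; drop a candidate that overlaps an already kept icon."""
    kept: List[Dict] = []
    for icon in icons:  # visited in time (or priority) order
        overlaps = any(
            hypot(icon["cx"] - k["cx"], icon["cy"] - k["cy"]) < REFERENCE_DISTANCE
            for k in kept
        )
        if not overlaps:
            kept.append(icon)
    return kept

# Example roughly matching FIGS. 25A-25C: icons 2 and 4 sit close to icons 1 and 3.
icons = [
    {"id": 1, "cx": 0,   "cy": 0},
    {"id": 2, "cx": 60,  "cy": 0},   # overlaps icon 1 -> dropped
    {"id": 3, "cx": 200, "cy": 0},
    {"id": 4, "cx": 250, "cy": 0},   # overlaps icon 3 -> dropped
    {"id": 5, "cx": 400, "cy": 0},
]
print([i["id"] for i in greedy_icon_filter(icons)])  # [1, 3, 5]
```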
The processor 120 according to various embodiments may assign a priority to at least one of a radius of an activity area, a workout type, and a device.
FIGS. 26A to 26C illustrate an example of processing overlapped icons by placing a priority on workout types.
Referring to FIGS. 26A to 26C, the processor 120 may display an icon in each of a first activity area 2611, a second activity area 2612, a third activity area 2613, and a fourth activity area 2614, as shown in FIG. 26A. For example, the first activity area 2611 and the fourth activity area 2614 may correspond to activity information whose workout type is 'cycling', and the second activity area 2612 and the third activity area 2613 may correspond to activity information whose workout type is 'walking'.
The processor 120 may configure the set in order of descending priority and apply the greedy set cover to it. The icons of the activity areas may be numbered 1, 2, 3, and 4 in order. The initial set {} is empty and becomes {1} and then {1, 4} in order of descending priority, and icons 1 and 4 may not satisfy the overlapping condition. That is, since the icons of the first activity area 2611 and the fourth activity area 2614 do not overlap each other, the processor 120 may successively add icon 2 to configure a set {1, 4, 2}. Icons 4 and 2, included in the fourth activity area 2614 and the second activity area 2612, which are two adjacent activity areas, may satisfy the overlapping condition. In this case, the processor 120 may delete the icon included in the second activity area 2612, as shown in FIG. 26B. Based on the priority, icon 2 is deleted while icon 4 is retained.
Since icon 2 is deleted, the processor 120 may configure a set {1, 4, 3} by adding icon 3 to the set {1, 4}, and determine the overlapping condition between icons 4 and 3. Since icons 4 and 3 satisfy the overlapping condition, the processor 120 may delete the icon included in the third activity area 2613, as shown in FIG. 26C. Based on the priority, icon 3 is deleted while icon 4 is retained. Since no activity area remains to be added to the set after the above process, the processor 120 may terminate the icon overlapping process. In FIG. 26C, in which the icon overlapping process has been completed, only the icons for the first activity area 2611 and the fourth activity area 2614 may be maintained, and the icons for the second activity area 2612 and the third activity area 2613 may be deleted.
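The priority-ordered variant only changes the order in which icons are visited: sorting by an assumed workout-type priority before the same greedy pass keeps the 'cycling' icons and drops the overlapping 'walking' icons, as in FIG. 26C. The coordinates, priorities, and threshold in this sketch are assumptions for illustration.

```python
from math import hypot

WORKOUT_PRIORITY = {"cycling": 2, "walking": 1}  # illustrative priorities
REFERENCE_DISTANCE = 100.0                       # assumed threshold

icons = [
    {"id": 1, "type": "cycling", "cx": 0,   "cy": 0},
    {"id": 2, "type": "walking", "cx": 320, "cy": 0},  # overlaps icon 4 -> dropped
    {"id": 3, "type": "walking", "cx": 380, "cy": 0},  # overlaps icon 4 -> dropped
    {"id": 4, "type": "cycling", "cx": 350, "cy": 0},
]

kept = []
for icon in sorted(icons, key=lambda i: WORKOUT_PRIORITY[i["type"]], reverse=True):
    # Keep the candidate only if it stays clear of every icon already kept.
    if all(hypot(icon["cx"] - k["cx"], icon["cy"] - k["cy"]) >= REFERENCE_DISTANCE
           for k in kept):
        kept.append(icon)

print(sorted(i["id"] for i in kept))  # [1, 4]
```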
In this manner, the processor 120 may perform the icon overlapping process using a greedy set cover and a threshold, so as to provide a user with activity areas in which icons do not overlap.
An operation method for an electronic device according to various embodiments may include operations of: acquiring health related data; correcting the acquired data per unit time; extracting activity information by analyzing the corrected data; and displaying a user interface including the activity information in response to a user request.
The operation of correcting the acquired data may include an operation of correcting the acquired data based on a time unit for storing data of the electronic device.
The operation of extracting the activity information may include an operation of: analyzing data acquired from an external device and data acquired by using the sensor module of the electronic device, so as to classify the data depending on an activity type; and integrating the data depending on the activity type.
The data integration operation may include an operation of integrating data based on a priority of at least one of a time, an activity type, a workout type, and a device.
The operation method may further include operations of: assigning different weights to at least one of the activity type, the workout type, and the device; and correcting the integrated data based on the weights.
The operation method may further include operations of: classifying the number of steps for each device per unit time; determining the maximum number of steps per unit time; and calculating the integrated number of steps based on the determined maximum number of steps.
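A hedged sketch of this step-count integration is shown below: per unit time (here, per minute) the maximum count reported by any device is taken, and those maxima are summed to obtain the integrated number of steps. The device names and step counts are illustrative assumptions.

```python
# Per-device step counts per minute (illustrative data).
per_minute_steps = {
    "phone": {"09:00": 60, "09:01": 0,  "09:02": 45},
    "watch": {"09:00": 55, "09:01": 70, "09:02": 40},
}

# For each minute, take the maximum across devices; then sum the maxima.
minutes = sorted({m for counts in per_minute_steps.values() for m in counts})
integrated = sum(
    max(counts.get(m, 0) for counts in per_minute_steps.values())
    for m in minutes
)
print(integrated)  # 60 + 70 + 45 = 175
```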
The operation method may further include operations of: determining a workout type based on the result of data analysis; determining the start and end of the workout type; and integrating activity information for each workout type based on the priority.
The operation method may further include operations of: aligning inactive intervals by time; and integrating inactive intervals that do not include activity information into one session, so as to process the integrated session as non-activity information.
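As an illustration of folding inactive intervals into one session, the following sketch merges time-sorted intervals that touch or overlap; representing an interval as a (start, end) pair in minutes is an assumption for this example.

```python
def merge_inactive(intervals):
    """Merge time-sorted inactive intervals that touch or overlap into single sessions."""
    merged = []
    for start, end in sorted(intervals):
        if merged and start <= merged[-1][1]:  # contiguous with the previous session
            merged[-1] = (merged[-1][0], max(merged[-1][1], end))
        else:
            merged.append((start, end))
    return merged

print(merge_inactive([(0, 30), (30, 90), (200, 240)]))  # [(0, 90), (200, 240)]
```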
The displaying operation may include operations of: extracting location information for the activity information; calculating an activity area for the activity information based on the extracted location information; calculating a distance between two adjacent activity areas; correcting an icon in an activity area based on the calculated distance; and displaying a user interface including the corrected icon.
The operation of correcting the icon may include operations of: determining that the icon overlap condition is satisfied when the calculated distance is less than a reference distance; and deleting at least one of the icons included in the two adjacent activity areas.
A computer-readable recording medium can include a hard disk, a floppy disk, magnetic media (e.g., a magnetic tape), optical media (e.g., a Compact Disc Read-Only Memory (CD-ROM) and/or a Digital Versatile Disc (DVD)), magneto-optical media (e.g., a floptical disk), an internal memory, etc. An instruction can include code made by a compiler or code executable by an interpreter. A module or a program module according to various exemplary embodiments can include at least one of the aforementioned constituent elements, can omit some of them, or can further include another constituent element. Operations carried out by a module, a program module, or another constituent element according to various exemplary embodiments can be executed in a sequential, parallel, repeated, or heuristic manner, or at least some operations can be executed in a different order or omitted, or another operation can be added.
Although the present disclosure has been described with an exemplary embodiment, various changes and modifications may be suggested to one skilled in the art. It is intended that the present disclosure encompass such changes and modifications as fall within the scope of the appended claims.

Claims (15)

  1. An electronic device comprising:
    a housing;
    a display exposed through a part of the housing;
    a first motion sensor disposed within the housing, and configured to detect movement of the housing;
    a wireless communication circuit disposed within the housing;
    a processor disposed within the housing and electrically connected to the display, the first motion sensor, and the wireless communication circuit; and
    a memory electrically connected to the processor,
    wherein the memory stores instructions which, when executed by the processor, cause the processor to:
    generate a wireless communication channel with an external electronic device including a second motion sensor, using the wireless communication circuit;
    monitor the movement of the housing using the first motion sensor so as to generate first data for a first time period;
    receive second data acquired for the first time period through the wireless communication channel, using the second motion sensor;
    calculate, as a value for the first time period, a value smaller than a sum of a first value based on the first data and a second value based on the second data; and
    display the calculated value through a user interface displayed on the display.
  2. The electronic device of claim 1, wherein the instructions cause the processor to display, through the user interface, the calculated value after the first time period when the electronic device and the external electronic device are being worn or carried by a user.
  3. The electronic device of claim 1, wherein the instructions cause the processor to:
    determine a first attribute of the movement of the housing during a first session of the first time period using a first portion of the first data;
    determine a second attribute of the movement of the housing during a second session of the first time period using a second portion of the first data;
    select one of the first attribute and the second attribute; and
    display at least one of an image, a text, or a symbol representing the selected attribute through a user interface displayed on the display.
  4. The electronic device of claim 3, wherein the instructions cause the processor to:
    display, through the user interface, a map associated with locations where the housing has been located for the first time period, and
    display at least one of the image, the text, or the symbol on the map in a superposed manner.
  5. The electronic device of claim 3, wherein the instructions cause the processor to:
    receive second data acquired for the first time period from an external electronic device including a motion sensor through the wireless communication circuit;
    calculate, as a value for the first time period, a value smaller than the sum of a first value based on the first data and a second value based on the second data; and
    display the calculated value through a user interface displayed on the display.
  6. The electronic device of claim 1, wherein the instructions cause the processor to: integrate the first data and the second data based on a priority of at least one of time, an activity type, a workout type, or a device.
  7. The electronic device of claim 6, wherein the instructions cause the processor to: assign different weights to at least one of an activity type, a workout type, or a device; and
    correct the integrated data based on the weights.
  8. The electronic device of claim 1, wherein the instructions cause the processor to: classify a number of steps for each device per unit time;
    determine the maximum number of steps per unit time; and
    calculate an integrated number of steps based on the determined maximum number of steps, so as to integrate data for the number of steps.
  9. The electronic device of claim 1, wherein the instructions cause the processor to: determine workout types based on a result of data analysis;
    determine a start and end of the workout type; and
    integrate activity information for each workout type based on a priority.
  10. The electronic device of claim 1, wherein the instructions cause the processor to:
    extract location information on the activity information;
    calculate an active area for activity information based on the extracted location information;
    calculate a distance between two adjacent active areas; and
    correct an icon in the active area based on the calculated distance.
  11. An operation method for an electronic device, the method comprising:
    acquiring health related data;
    correcting the acquired health related data per unit time;
    extracting activity information by analyzing the corrected health related data; and
    displaying a user interface including the activity information in response to a user request.
  12. The method of claim 11, wherein correcting of the acquired health related data comprises correcting the acquired health related data based on a time unit for storing data of the electronic device.
  13. The method of claim 11, wherein extracting of the activity information comprises:
    analyzing data acquired from an external device and data acquired by using a sensor module of the electronic device, so as to classify the data depending on an activity type; and
    integrating the data depending on the activity type.
  14. The method of claim 13, wherein integrating of the data comprises integrating the data based on a priority of at least one of a time, an activity type, a workout type, or a device.
  15. The method of claim 13, further comprising:
    assigning different weights to at least one of an activity type, a workout type, or a device; and
    correcting the integrated data based on the weights.
PCT/KR2017/001752 2016-02-19 2017-02-17 Method for integrating and providing collected data from multiple devices and electronic device for implementing same WO2017142341A1 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
EP17753502.8A EP3403166A4 (en) 2016-02-19 2017-02-17 Method for integrating and providing collected data from multiple devices and electronic device for implementing same
CN201780012097.XA CN108701495B (en) 2016-02-19 2017-02-17 Method for integrating and providing data collected from a plurality of devices and electronic device for implementing the method
MYPI2018702903A MY193558A (en) 2016-02-19 2017-02-17 Method for integrating and providing collected data from multiple devices and electronic device for implementing same

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020160019535A KR102446811B1 (en) 2016-02-19 2016-02-19 Method for combining and providing colltected data from plural devices and electronic device for the same
KR10-2016-0019535 2016-02-19

Publications (1)

Publication Number Publication Date
WO2017142341A1 true WO2017142341A1 (en) 2017-08-24

Family

ID=59626132

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2017/001752 WO2017142341A1 (en) 2016-02-19 2017-02-17 Method for integrating and providing collected data from multiple devices and electronic device for implementing same

Country Status (6)

Country Link
US (1) US10796803B2 (en)
EP (1) EP3403166A4 (en)
KR (1) KR102446811B1 (en)
CN (1) CN108701495B (en)
MY (1) MY193558A (en)
WO (1) WO2017142341A1 (en)

Families Citing this family (48)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11126321B2 (en) 2007-09-04 2021-09-21 Apple Inc. Application menu user interface
US9026928B2 (en) * 2012-06-06 2015-05-05 Apple Inc. Graphical user interface layout
WO2014143776A2 (en) 2013-03-15 2014-09-18 Bodhi Technology Ventures Llc Providing remote interactions with host device using a wireless device
US20160019360A1 (en) 2013-12-04 2016-01-21 Apple Inc. Wellness aggregator
US12080421B2 (en) 2013-12-04 2024-09-03 Apple Inc. Wellness aggregator
US10270898B2 (en) 2014-05-30 2019-04-23 Apple Inc. Wellness aggregator
EP3147747A1 (en) 2014-06-27 2017-03-29 Apple Inc. Manipulation of calendar application in device with touch screen
EP3195098B1 (en) 2014-07-21 2024-10-23 Apple Inc. Remote user interface
KR102319896B1 (en) 2014-08-02 2021-11-02 애플 인크. Context-specific user interfaces
CN111180039B (en) 2014-09-02 2023-10-24 苹果公司 Physical activity and fitness monitor
WO2016036552A1 (en) 2014-09-02 2016-03-10 Apple Inc. User interactions for a mapping application
CN115695632B (en) 2014-09-02 2024-10-01 苹果公司 Electronic device, computer storage medium, and method of operating electronic device
WO2016144385A1 (en) 2015-03-08 2016-09-15 Apple Inc. Sharing user-configurable graphical constructs
EP3337583B1 (en) 2015-08-20 2024-01-17 Apple Inc. Exercise-based watch face
EP3451926A4 (en) * 2016-05-02 2019-12-04 Dexcom, Inc. System and method for providing alerts optimized for a user
DK201770423A1 (en) 2016-06-11 2018-01-15 Apple Inc Activity and workout updates
US10873786B2 (en) 2016-06-12 2020-12-22 Apple Inc. Recording and broadcasting application visual output
US11216119B2 (en) 2016-06-12 2022-01-04 Apple Inc. Displaying a predetermined view of an application
US10736543B2 (en) 2016-09-22 2020-08-11 Apple Inc. Workout monitor interface
US10845955B2 (en) * 2017-05-15 2020-11-24 Apple Inc. Displaying a scrollable list of affordances associated with physical activities
US20190130707A1 (en) * 2017-10-31 2019-05-02 Ecolink Intelligent Technology, Inc. Event notification using an intelligent digital assistant
KR102506666B1 (en) * 2018-02-19 2023-03-07 삼성전자주식회사 Microwave, display device and cooking system including the same
DK201870599A1 (en) 2018-03-12 2019-10-16 Apple Inc. User interfaces for health monitoring
DK201870378A1 (en) 2018-05-07 2020-01-13 Apple Inc. Displaying user interfaces associated with physical activities
US11317833B2 (en) 2018-05-07 2022-05-03 Apple Inc. Displaying user interfaces associated with physical activities
DK180171B1 (en) 2018-05-07 2020-07-14 Apple Inc USER INTERFACES FOR SHARING CONTEXTUALLY RELEVANT MEDIA CONTENT
CN109213030A (en) * 2018-08-09 2019-01-15 北京云迹科技有限公司 Industrial personal computer integrating device for robot
CN109256201A (en) * 2018-08-28 2019-01-22 上海联影医疗科技有限公司 Scan monitoring method, device, computer equipment and storage medium
CN109405232B (en) * 2018-09-04 2019-08-30 重庆工业职业技术学院 Based on infrared temperature sensing and the dynamic air-conditioning Automatic adjustment method of human body
US10953307B2 (en) 2018-09-28 2021-03-23 Apple Inc. Swim tracking and notifications for wearable devices
JP6888605B2 (en) * 2018-12-19 2021-06-16 カシオ計算機株式会社 Training discrimination device, training discrimination method and training discrimination program
CN111382765B (en) * 2018-12-29 2023-07-04 中国移动通信集团四川有限公司 Complaint hot spot area clustering method, device, equipment and medium
CN109725280B (en) * 2019-01-17 2021-05-18 乐清市启程电气有限公司 Watt-hour meter field parameter extracting platform
US11586158B1 (en) 2019-03-05 2023-02-21 Etellimetrix, LLC Wireless sensor system and related methods
CN110263230B (en) * 2019-04-25 2021-04-06 北京科技大学 Data cleaning method and device based on density clustering
US11863700B2 (en) 2019-05-06 2024-01-02 Apple Inc. Providing user interfaces based on use contexts and managing playback of media
DK201970532A1 (en) 2019-05-06 2021-05-03 Apple Inc Activity trends and workouts
JP7297940B2 (en) 2019-06-01 2023-06-26 アップル インコーポレイテッド Multimodal activity tracking user interface
CN110686699A (en) * 2019-08-19 2020-01-14 苏宁智能终端有限公司 Step counting method and device
US11639944B2 (en) * 2019-08-26 2023-05-02 Apple Inc. Methods and apparatus for detecting individual health related events
DK202070613A1 (en) 2020-02-14 2021-10-15 Apple Inc User interfaces for workout content
CN113555132B (en) * 2020-04-24 2024-09-17 华为技术有限公司 Multi-source data processing method, electronic device, and computer-readable storage medium
EP4323992A1 (en) 2021-05-15 2024-02-21 Apple Inc. User interfaces for group workouts
KR20230023172A (en) * 2021-08-10 2023-02-17 삼성전자주식회사 Electronic device displaying measured data and method therefor
US11984222B2 (en) * 2021-09-02 2024-05-14 Safety Shield Products, LLC System and method for sharing health data
US11896871B2 (en) 2022-06-05 2024-02-13 Apple Inc. User interfaces for physical activity information
US11977729B2 (en) 2022-06-05 2024-05-07 Apple Inc. Physical activity information user interfaces
CN115101169B (en) * 2022-07-29 2023-03-21 北京欧应科技有限公司 Method, apparatus, and medium for implementing a training action

Family Cites Families (27)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6013007A (en) * 1998-03-26 2000-01-11 Liquid Spark, Llc Athlete's GPS-based performance monitor
US6997852B2 (en) * 1999-07-08 2006-02-14 Icon Ip, Inc. Methods and systems for controlling an exercise apparatus using a portable remote device
US7489979B2 (en) * 2005-01-27 2009-02-10 Outland Research, Llc System, method and computer program product for rejecting or deferring the playing of a media file retrieved by an automated process
US8740751B2 (en) * 2005-07-25 2014-06-03 Nike, Inc. Interfaces and systems for displaying athletic performance information on electronic devices
US7698061B2 (en) * 2005-09-23 2010-04-13 Scenera Technologies, Llc System and method for selecting and presenting a route to a user
US7827000B2 (en) * 2006-03-03 2010-11-02 Garmin Switzerland Gmbh Method and apparatus for estimating a motion parameter
US9297709B2 (en) * 2013-03-15 2016-03-29 Nike, Inc. System and method for analyzing athletic activity
WO2009152456A2 (en) * 2008-06-13 2009-12-17 Nike, Inc. Footwear having sensor system
US9409052B2 (en) * 2008-10-03 2016-08-09 Adidas Ag Program products, methods, and systems for providing location-aware fitness monitoring services
CN101579238B (en) * 2009-06-15 2012-12-19 吴健康 Human motion capture three dimensional playback system and method thereof
KR20110074024A (en) * 2009-12-24 2011-06-30 삼성전자주식회사 Multimedia apparatus
BR112013003183A2 (en) * 2010-08-09 2016-05-17 Nike International Ltd fitness monitoring using a mobile device
US8615377B1 (en) * 2010-09-30 2013-12-24 Fitbit, Inc. Methods and systems for processing social interactive data and sharing of tracked activity associated with locations
US8762101B2 (en) 2010-09-30 2014-06-24 Fitbit, Inc. Methods and systems for identification of event data having combined activity and location information of portable monitoring devices
US8775120B2 (en) * 2010-09-30 2014-07-08 Fitbit, Inc. Method of data synthesis
KR101787848B1 (en) * 2012-06-04 2017-10-18 나이키 이노베이트 씨.브이. Combinatory score having a fitness sub-score and an athleticism sub-score
US9500464B2 (en) * 2013-03-12 2016-11-22 Adidas Ag Methods of determining performance information for individuals and sports objects
US9125015B2 (en) 2013-06-28 2015-09-01 Facebook, Inc. User activity tracking system and device
US9723381B2 (en) * 2013-12-23 2017-08-01 Nike, Inc. Athletic monitoring system having automatic pausing of media content
US9037199B1 (en) 2014-02-13 2015-05-19 Google Inc. Detecting transitions between physical activity
US10001386B2 (en) 2014-04-03 2018-06-19 Apple Inc. Automatic track selection for calibration of pedometer devices
US9672482B2 (en) * 2014-06-11 2017-06-06 Palo Alto Research Center Incorporated System and method for automatic objective reporting via wearable sensors
KR101584458B1 (en) * 2014-12-05 2016-01-14 경희대학교 산학협력단 Apparatus and method for real-time recognition of activity and posture based on combined sensor data
KR102113201B1 (en) * 2015-02-27 2020-05-20 삼성전자주식회사 Method for Performing Function and Electronic Device supporting the same
KR102345610B1 (en) * 2015-02-27 2021-12-30 삼성전자주식회사 Apparatus and method for providing of screen mirroring service
KR102344045B1 (en) * 2015-04-21 2021-12-28 삼성전자주식회사 Electronic apparatus for displaying screen and method for controlling thereof
KR20170017407A (en) * 2015-08-06 2017-02-15 삼성전자주식회사 Apparatus and method for providing notification

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020151810A1 (en) * 2001-04-16 2002-10-17 Acumen, Inc. Wrist-based fitness monitoring devices
US20040046692A1 (en) 2002-09-05 2004-03-11 Robson Jack D. Physical training system
US20110152637A1 (en) * 2008-05-14 2011-06-23 Kateraas Espen D Physical activity monitor and data collection unit
US20140278139A1 (en) 2010-09-30 2014-09-18 Fitbit, Inc. Multimode sensor devices
US20130325392A1 (en) * 2011-07-11 2013-12-05 Ntt Docomo Inc. Mobile terminal and continuous movement detection method
US20150042468A1 (en) * 2013-08-07 2015-02-12 Nike, Inc. Activity recognition with activity reminders
WO2014207294A1 (en) 2013-09-13 2014-12-31 Polar Electro Oy System for monitoring physical activity
US20150100245A1 (en) * 2013-10-09 2015-04-09 LEDO Network, Inc. Systems, methods, applications for smart sensing, motion activity monitoring, and motion activity pattern recognition

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
FREEDSON ET AL.: "Objective Monitoring of Physical Activity Using Motion Sensors and Heart Rate", RESEARCH QUARTERLY FOR EXERCISE AND SPORT, vol. 71, no. sup2, 1 June 2000 (2000-06-01), pages 21 - 29, XP055548538, doi:10.1080/02701367.2000.11082782
See also references of EP3403166A4

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109036538A (en) * 2018-07-24 2018-12-18 上海常仁信息科技有限公司 A kind of blood pressure detecting system based on robot

Also Published As

Publication number Publication date
KR20170097888A (en) 2017-08-29
CN108701495A (en) 2018-10-23
KR102446811B1 (en) 2022-09-23
US10796803B2 (en) 2020-10-06
US20170239524A1 (en) 2017-08-24
EP3403166A1 (en) 2018-11-21
CN108701495B (en) 2021-11-09
MY193558A (en) 2022-10-19
EP3403166A4 (en) 2019-03-27

Similar Documents

Publication Publication Date Title
WO2017142341A1 (en) Method for integrating and providing collected data from multiple devices and electronic device for implementing same
WO2018131775A1 (en) Electronic device and method of operation thereof
WO2018117739A1 (en) Method and apparatus for determining abnormal state of battery
WO2017116024A1 (en) Electronic device having flexible display and method for operating the electronic device
WO2018070716A1 (en) Electronic device having plurality of fingerprint sensing modes and method for controlling the same
WO2018021739A1 (en) Method for providing video content and electronic device for supporting the same
WO2018097549A1 (en) Method for processing various inputs, and electronic device and server for the same
WO2017057939A1 (en) Method for processing job information and electronic device supporting same
WO2019027255A1 (en) Electronic device for determining biometric information and method of operating same
WO2017209502A1 (en) Method for controlling connection between electronic device and charging device, and device for providing same
WO2018135841A1 (en) Message generation method and wearable electronic device for supporting the same
WO2016137221A1 (en) Electronic device and image display method thereof
WO2017131467A1 (en) Apparatus and method for determining location of electronic device
WO2018174581A1 (en) Method and device for controlling white balance function of electronic device
WO2017069480A1 (en) Screen outputting method and electronic device supporting the same
WO2017003218A1 (en) Method for controlling multiple batteries and electronic device for implementing same
WO2017135645A1 (en) User interfacing method and electronic device for performing the same
WO2017135634A1 (en) Electronic device for processing and providing data and operating method thereof
WO2017095203A2 (en) Electronic device and method for displaying a notification object
WO2016186418A1 (en) Sensor information using method and electronic device using the same
WO2018194313A1 (en) Motion detection method and electronic device supporting the same
WO2018147658A1 (en) Method for providing activity information of other related to activity pattern of user and electronic device thereof
WO2017209446A1 (en) Electronic device and information processing system including the same
WO2018236082A1 (en) Method for determining data of barometer sensor using data obtained from motion sensor and electronic device for the same
WO2017052097A1 (en) Activity information providing method and electronic device supporting the same

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17753502

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 2017753502

Country of ref document: EP

NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: 2017753502

Country of ref document: EP

Effective date: 20180814