CN105637448A - Contextualizing sensor, service and device data with mobile devices


Info

Publication number
CN105637448A
Authority
CN
China
Prior art keywords
information
user
electronic equipment
data
wearable device
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201480057095.9A
Other languages
Chinese (zh)
Inventor
P.J.德赛
B.A.罗特勒
D.米洛塞斯基
朴垠映
G.克里什纳
J.奥尔森
M.博格
M.比奇
W.云
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Samsung Electronics Co Ltd
Original Assignee
Samsung Electronics Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US 14/449,091 (published as US 2015/0046828 A1)
Application filed by Samsung Electronics Co Ltd
Publication of CN105637448A

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04883 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q 30/00 Commerce
    • G06Q 30/02 Marketing; Price estimation or determination; Fundraising
    • G06Q 30/0241 Advertisements
    • G06Q 30/0251 Targeted advertisements
    • G06Q 30/0269 Targeted advertisements based on user profile or attribute
    • G06Q 30/0271 Personalized advertisement
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q 10/00 Administration; Management
    • G06Q 10/10 Office automation; Time management
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q 30/00 Commerce
    • G06Q 30/02 Marketing; Price estimation or determination; Fundraising
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q 30/00 Commerce
    • G06Q 30/02 Marketing; Price estimation or determination; Fundraising
    • G06Q 30/0201 Market modelling; Market analysis; Collecting market data
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q 30/00 Commerce
    • G06Q 30/02 Marketing; Price estimation or determination; Fundraising
    • G06Q 30/0241 Advertisements
    • G06Q 30/0251 Targeted advertisements
    • G06Q 30/0255 Targeted advertisements based on user history
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q 30/00 Commerce
    • G06Q 30/02 Marketing; Price estimation or determination; Fundraising
    • G06Q 30/0241 Advertisements
    • G06Q 30/0251 Targeted advertisements
    • G06Q 30/0267 Wireless devices
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 67/00 Network arrangements or protocols for supporting network services or applications
    • H04L 67/01 Protocols
    • H04L 67/04 Protocols specially adapted for terminals or networks with limited capabilities; specially adapted for terminal portability
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 67/00 Network arrangements or protocols for supporting network services or applications
    • H04L 67/01 Protocols
    • H04L 67/10 Protocols in which an application is distributed across nodes in the network
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 67/00 Network arrangements or protocols for supporting network services or applications
    • H04L 67/01 Protocols
    • H04L 67/12 Protocols specially adapted for proprietary or special-purpose networking environments, e.g. medical networks, sensor networks, networks in vehicles or remote metering networks
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 67/00 Network arrangements or protocols for supporting network services or applications
    • H04L 67/01 Protocols
    • H04L 67/12 Protocols specially adapted for proprietary or special-purpose networking environments, e.g. medical networks, sensor networks, networks in vehicles or remote metering networks
    • H04L 67/125 Protocols specially adapted for proprietary or special-purpose networking environments, e.g. medical networks, sensor networks, networks in vehicles or remote metering networks involving control of end-device applications over a network
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 67/00 Network arrangements or protocols for supporting network services or applications
    • H04L 67/50 Network services
    • H04L 67/53 Network services using third party service providers
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 67/00 Network arrangements or protocols for supporting network services or applications
    • H04L 67/50 Network services
    • H04L 67/535 Tracking the activity of the user
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 67/00 Network arrangements or protocols for supporting network services or applications
    • H04L 67/50 Network services
    • H04L 67/55 Push-based network services
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M 1/00 Substation equipment, e.g. for use by subscribers
    • H04M 1/72 Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M 1/724 User interfaces specially adapted for cordless or mobile telephones
    • H04M 1/72448 User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M 1/00 Substation equipment, e.g. for use by subscribers
    • H04M 1/72 Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M 1/724 User interfaces specially adapted for cordless or mobile telephones
    • H04M 1/72448 User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions
    • H04M 1/72454 User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions according to context-related or environment-related conditions
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M 2250/00 Details of telephonic subscriber devices
    • H04M 2250/12 Details of telephonic subscriber devices including a sensor for measuring a physical value, e.g. temperature or motion

Landscapes

  • Engineering & Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • Theoretical Computer Science (AREA)
  • Strategic Management (AREA)
  • Finance (AREA)
  • Development Economics (AREA)
  • Accounting & Taxation (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Signal Processing (AREA)
  • Economics (AREA)
  • General Business, Economics & Management (AREA)
  • Marketing (AREA)
  • Human Computer Interaction (AREA)
  • Game Theory and Decision Science (AREA)
  • Data Mining & Analysis (AREA)
  • General Health & Medical Sciences (AREA)
  • Human Resources & Organizations (AREA)
  • Medical Informatics (AREA)
  • Computing Systems (AREA)
  • Health & Medical Sciences (AREA)
  • Computer Hardware Design (AREA)
  • Operations Research (AREA)
  • Quality & Reliability (AREA)
  • Tourism & Hospitality (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A method and system for contextualizing and presenting user data. The method includes collecting information comprising service activity data and sensor data from one or more electronic devices. The information is organized based on associated time for the collected information. One or more of content information and service information of potential interest are provided to the one or more electronic devices based on one or more of user context and user activity.

Description

Contextualizing sensor, service and device data with mobile devices
Technical field
One or more embodiments relate generally to collecting, contextualizing and presenting user activity data, and in particular to collecting sensor and service activity information, archiving the information, contextualizing the information, presenting organized user activity data, and suggesting content and services.
Background art
As many individuals carry mobile electronic devices (e.g., smartphones), information such as photos, appointments and life events (e.g., walks, gatherings, the birth of a child, birthdays, parties, etc.) may be manually entered and organized by the user for later access.
Summary of the invention
According to the present invention, a method for contextualizing and presenting user data is provided, including: collecting information comprising service activity data and sensor data from one or more electronic devices; organizing the information based on an associated time for the collected information; and providing one or more of content information and service information of potential interest to the one or more electronic devices based on one or more of user context and user activity.
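For illustration only, the following Python sketch mirrors the collect / organize / suggest flow described in this summary. The data model (LifeEvent), the device read_events() interface and the toy suggestion rules are assumptions made for the example, not details taken from the disclosed embodiments.

```python
from dataclasses import dataclass, field
from datetime import datetime

# Hypothetical record: one item of service activity data or sensor data,
# keyed by its associated time.
@dataclass
class LifeEvent:
    timestamp: datetime
    source: str                  # e.g. "smartphone", "wearable", "travel-service"
    kind: str                    # e.g. "sensor", "service-activity"
    payload: dict = field(default_factory=dict)

def collect(devices) -> list[LifeEvent]:
    """Gather service activity data and sensor data from each device."""
    events = []
    for device in devices:
        events.extend(device.read_events())   # assumed device interface
    return events

def organize_by_time(events: list[LifeEvent]) -> list[LifeEvent]:
    """Organize the collected information based on its associated time."""
    return sorted(events, key=lambda e: e.timestamp)

def suggest(events: list[LifeEvent], user_context: str) -> list[str]:
    """Return content/service suggestions of potential interest based on
    user context and recent user activity (toy rules only)."""
    suggestions = []
    if user_context == "walking":
        suggestions.append("fitness-tracking service")
    if any(e.kind == "service-activity" and e.payload.get("category") == "travel"
           for e in events):
        suggestions.append("travel recommendation content")
    return suggestions
```

In this sketch, organize_by_time(collect(devices)) yields the time-ordered record that a timeline view could then present.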
Brief description of the drawings
For a more complete understanding of the nature and advantages of the embodiments, as well as a preferred mode of use, reference should be made to the following detailed description read in conjunction with the accompanying drawings, in which:
Fig. 1 shows a schematic view of a communication system, according to an embodiment.
Fig. 2 shows a block diagram of an architecture for a system including a server and one or more electronic devices, according to an example embodiment.
Fig. 3 shows an example system environment, according to an embodiment.
Fig. 4 shows an example of organizing data into an archive, according to an embodiment.
Fig. 5 shows an example timeline view, according to an embodiment.
Fig. 6 shows example commands for gesture navigation, according to an embodiment.
Figs. 7A-D show examples of expanding events on a timeline graphical user interface (GUI), according to an embodiment.
Fig. 8 shows an example of tagging an event, according to an embodiment.
Fig. 9 shows an example of interest panel detail views, according to an embodiment.
Fig. 10 shows an example of service and device management, according to an embodiment.
Figs. 11A-D show examples of service management for application/service discovery, according to an embodiment.
Figs. 12A-D show examples of service management for application/service streams, according to an embodiment.
Figs. 13A-D show examples of service management for application/service user interests, according to an embodiment.
Fig. 14 shows an example overview of pattern detection, according to an embodiment.
Fig. 15 shows an example process for aggregating/collecting and displaying user data, according to an embodiment.
Fig. 16 shows an example process for service management by an electronic device, according to an embodiment.
Fig. 17 shows an example timeline and slides, according to an embodiment.
Fig. 18 shows an example process information architecture, according to an embodiment.
Fig. 19 shows example active tasks, according to an embodiment.
Fig. 20 shows an example of timeline logic with arriving slides and active tasks, according to an embodiment.
Figs. 21A-B show example detailed timelines, according to an embodiment.
Figs. 22A-B show examples of timeline logic with example slide categories, according to an embodiment.
Fig. 23 shows an example of timeline push notification slide categories, according to an embodiment.
Fig. 24 shows an example of timeline pull notifications, according to an embodiment.
Fig. 25 shows an example process for routing arriving slides, according to an embodiment.
Fig. 26 shows an example wearable device block diagram, according to an embodiment.
Fig. 27 shows example notification functions, according to an embodiment.
Fig. 28 shows example input gestures for interacting with a timeline, according to an embodiment.
Fig. 29 shows an example process for creating slides, according to an embodiment.
Fig. 30 shows an example of slide generation using templates, according to an embodiment.
Fig. 31 shows an example of voice commands based on the context of a displayed slide, according to an embodiment.
Fig. 32 shows an example block diagram of a wearable device and a host device/smartphone, according to an embodiment.
Fig. 33 shows an example process for receiving commands on a wearable device, according to an embodiment.
Fig. 34 shows an example process for movement/motion-based gestures of a wearable device, according to an embodiment.
Fig. 35 shows example intelligent alarms using haptic elements, according to an embodiment.
Fig. 36 shows an example process for recording customized haptic patterns, according to an embodiment.
Fig. 37 shows an example process for a wearable device receiving a haptic recording, according to an embodiment.
Fig. 38 shows an example illustration of haptic recording, according to an embodiment.
Fig. 39 shows an example single-axis force sensor for recording haptic input, according to an embodiment.
Fig. 40 shows an example touch screen for haptic input, according to an embodiment.
Fig. 41 shows an example block diagram of a wearable device system, according to an embodiment.
Fig. 42 shows a block diagram of a process for contextualizing and presenting user data, according to an embodiment.
Fig. 43 shows a high-level block diagram of an information processing system including a computing system implementing one or more embodiments.
Best mode for carrying out the invention
One or more embodiments relate generally to collecting and contextualizing user activity data and presenting it. In one embodiment, a method includes collecting information comprising service activity data and sensor data from one or more electronic devices. The information may be organized based on an associated time for the collected information. Additionally, one or more of content information and service information of potential interest may be presented to the one or more electronic devices based on one or more of user context and user activity.
The method may also include filtering the organized information based on one or more selected filters.
The user context may be determined based on one or more of location information, movement information and user activity.
The organized information may be presented in a particular chronological order on a graphical timeline.
Providing one or more of content and services of potential interest to the one or more electronic devices may include providing one or more of alerts, suggestions, events and communications.
The content information and service information may be subscribable by a user for the one or more electronic devices.
The organized information may be automatically transferred to the one or more electronic devices.
Based on a user action, service activity data, sensor data and content may be captured as a tagged event.
Sensor data and service activity data from the one or more electronic devices may be provided to one or more of a cloud-based system and a network system for determining the user context, and the user context may be provided to the one or more electronic devices for one or more of controlling mode activation on the one or more electronic devices and notification.
The organized information may be continuously provided and may include life event information collected on a timeline, and the life event information may be stored on one or more of: a cloud-based system, a network system and the one or more electronic devices.
The one or more electronic devices may include mobile electronic devices, and the mobile electronic devices may include one or more of: a mobile phone, a wearable computing device, a tablet device and a mobile computing device.
In one embodiment, a system is provided that includes an activity module for collecting information comprising service activity data and sensor data. An organization module may also be included, configured to organize the information based on an associated time for the collected information. An information analyzer module may provide one or more of content information and service information of potential interest to one or more electronic devices based on one or more of user context and user activity.
The organization module may provide filtering of the organized information based on one or more selected filters.
The user context may be determined by the information analyzer module based on one or more of location information, movement information and user activity, and the organized information may be presented in a particular chronological order on a graphical timeline on the one or more electronic devices.
The one or more of content information and service information of potential interest may include one or more of: alerts, suggestions, events and communications.
The content information and service information may be subscribable by a user for the one or more electronic devices.
The one or more electronic devices may include multiple haptic elements for providing haptic signals.
In response to receiving a recognized user action on the one or more electronic devices, service activity data, sensor data and content may be captured as a tagged event.
Sensor data and service activity data from the one or more electronic devices may be provided to the information analyzer module, which may run on one or more of a cloud-based system and a network system, for determining the user context, and the user context may be provided to the one or more electronic devices for one or more of controlling mode activation on the one or more electronic devices and notification.
The organized information may be continuously presented and may include life event information collected on a timeline, and the life event information may be stored on one or more of: a cloud-based system, a network system and the one or more electronic devices.
The one or more electronic devices may include mobile electronic devices, and the mobile electronic devices may include one or more of: a mobile phone, a wearable computing device, a tablet device and a mobile computing device.
In one embodiment, a non-transitory computer-readable medium has instructions that, when executed on a computer, perform a method including collecting information comprising service activity data and sensor data from one or more electronic devices. The information may be organized based on an associated time for the collected information. Additionally, one or more of content information and service information of potential interest may be provided to the one or more electronic devices based on one or more of user context and user activity.
The non-transitory computer-readable medium may also include filtering the organized information based on one or more selected filters, and the user context may be determined based on one or more of location information, movement information and user activity.
The organized information may be presented in a particular chronological order on a graphical timeline, and providing the one or more of content information and service information of potential interest to the one or more electronic devices may include providing one or more of alerts, suggestions, events and communications.
The content information and service information may be subscribable by a user for the one or more electronic devices, the organized information may be dynamically delivered to the one or more electronic devices, and, based on a user action, service activity data, sensor data and content may be captured as a tagged event.
Sensor data and service activity data from the one or more electronic devices may be provided to one or more of a cloud-based system and a network system for determining the user context, and the user context may be provided to the one or more electronic devices for one or more of controlling mode activation on the one or more electronic devices and notification.
The organized information may be continuously presented and may include life event information collected on a timeline, and the life event information may be stored on one or more of: a cloud-based system, a network system and the one or more electronic devices.
The one or more electronic devices may include mobile electronic devices, and the mobile electronic devices may include one or more of: a mobile phone, a wearable computing device, a tablet device and a mobile computing device.
In one embodiment, a graphical user interface (GUI) displayed on a display of an electronic device includes one or more timeline events related to information comprising service activity data and sensor data collected from at least the electronic device. The GUI may also include one or more of content information of potential interest to a user and selectable service categories, based on one or more of user context and user activity associated with the one or more timeline events.
One or more icons may be selectable for displaying one or more categories associated with the one or more timeline events, and one or more of suggested content information and service information of interest to the user may be provided on the GUI.
In one embodiment, a display architecture for an electronic device includes a timeline that includes multiple content elements and one or more content elements of potential user interest. In one embodiment, the multiple time-based elements include one or more of event information, communication information and contextual alert information, and the multiple time-based elements are displayed in a particular chronological order. In one embodiment, the multiple time-based elements are expandable to provide extended information based on a received recognized user action.
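A minimal sketch of such time-based elements follows, assuming a simple object model; the three element kinds and the expand() behavior are illustrative only and not prescribed by the embodiments.

```python
from dataclasses import dataclass, field
from datetime import datetime

# Hypothetical time-based timeline element: an event, a communication, or a
# contextual alert, shown in chronological order and expandable on a
# recognized user action (e.g., a tap).
@dataclass
class TimelineElement:
    timestamp: datetime
    kind: str                      # "event" | "communication" | "context-alert"
    title: str
    details: dict = field(default_factory=dict)
    expanded: bool = False

    def expand(self) -> dict:
        """Return the extended information shown after a recognized user action."""
        self.expanded = True
        return {"title": self.title, **self.details}

def render_timeline(elements: list[TimelineElement]) -> None:
    """Print elements in chronological order; expanded ones show their details."""
    for e in sorted(elements, key=lambda x: x.timestamp):
        line = f"{e.timestamp:%H:%M} [{e.kind}] {e.title}"
        if e.expanded:
            line += f"  {e.details}"
        print(line)
```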
In one embodiment, a wearable electronic device includes a processor, a memory coupled to the processor, a curved display and one or more sensors. In one embodiment, the sensors provide sensor data to an analyzer module, the analyzer module determines context information, and a timeline module of the wearable electronic device provides one or more of content information and service information of potential interest using the determined context information and additional context information determined from one or more of additional sensor data and service activity data received from a paired host electronic device. In one embodiment, the timeline module organizes a content timeline interface on the curved display.
These and other aspects and advantages of one or more embodiments will become apparent from the following detailed description, which, taken in conjunction with the accompanying drawings, illustrates by way of example the principles of the one or more embodiments.
Detailed description of the invention
The following description is made for the purpose of illustrating the general principles of one or more embodiments and is not meant to limit the inventive concepts claimed herein. Further, particular features described herein can be used in combination with other described features in each of the various possible combinations and permutations. Unless otherwise specifically defined herein, all terms are to be given their broadest possible interpretation, including meanings implied from the specification as well as meanings understood by those skilled in the art and/or as defined in dictionaries, treatises, etc.
Embodiment relates to from one or more electronic equipments (such as, the mobile electronic device of such as smart mobile phone, wearable device, tablet device, camera etc.) collecting sensor and service action message, by archival of information, by information venation (contextualize) and provide/present the user activity data of tissue and the content information of suggestion and information on services. In one embodiment, the method includes collecting, from one or more electronic equipments, the information including service action data and sensing data. Can based on the time that the is associated organizational information of collected information. One or more based in user environment and User Activity, one or more in the content information of potential interest and information on services can be provided to one or more electronic equipment, as described in this.
" life events " of the individual caught from the ecosystem of electronic equipment is collected and is organized into the timeline life daily record of event data by one or more embodiments, and it can be filtered by the special interests region of various " camera lenses ", filter or individual. In one embodiment, the life events caught is wider in scope, deeper on abundant in content degree. In one embodiment, collect and organize from individual's ecosystem (such as, the electronic equipment that user uses, such as smart mobile phone, wearable device, tablet device, intelligent television equipment, other computing equipment etc.) in various services (such as, third party's service, based on the service etc. of cloud) and life activity event in other electronic equipment.
In one embodiment, life data (e.g., user activity from the devices in use, sensor data from the devices in use, third-party services, cloud-based services, etc.) are captured by combining sensor data from mobile electronic devices (e.g., smartphones) and wearable electronic devices with service activity (e.g., use of services such as travel suggestion services, information provision services, restaurant recommendation services, review services, finance services, guide services, etc.), and may be automatically and dynamically visualized as panel GUIs based on areas of interest specified by the user. One or more embodiments provide a large collection of patterns by which activity events (e.g., walking, driving, flying, riding in motorized vehicles such as transport services like buses, trains, etc.) may be regularly organized. These embodiments need not rely solely on sensor data from a handheld device, but may also use sensor information from a paired wearable device.
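The embodiments do not specify a particular pattern-detection algorithm; the toy classifier below only illustrates the idea of combining a phone-side speed estimate with a wearable-side step cadence. The thresholds and names are assumptions made for this example.

```python
def classify_activity(speed_mps: float, steps_per_min: float) -> str:
    """Toy activity-pattern classifier over combined phone + wearable readings.

    speed_mps      -- speed estimate, e.g. from the phone's location sensors
    steps_per_min  -- step cadence, e.g. from the wearable's accelerometer
    Thresholds are illustrative only.
    """
    if speed_mps > 60:
        return "flying"
    if speed_mps > 8 and steps_per_min < 30:
        return "driving or transit (bus/train)"
    if steps_per_min >= 30 and speed_mps < 4:
        return "walking"
    return "stationary"

# Example: readings consistent with a walk around the city
print(classify_activity(speed_mps=1.4, steps_per_min=105))   # -> "walking"
```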
One or more embodiments serve as an underlying service for the wearable device, which may take the form of an application that helps manage how the user sees different types of content and through which touch points on the GUI they are seen. These embodiments may provide a unique consolidated view for the electronic device that aggregates a variety of life events, placing events from services used (e.g., service activity data) and user activity (e.g., sensor data, electronic device activity data) within a bigger-picture pattern. By using sensor information to supplement service information and content information/data (e.g., text, photos, links, video, audio, etc.), embodiments may aggregate a variety of information into a single view.
One or more embodiments show insights about the user's life based on the user's actual activity, allowing users to understand themselves. One embodiment provides a central touch point for managing services and how they are experienced. One or more embodiments provide methods for suggesting different types of services (i.e., provided by third parties, provided by cloud-based services, etc.) and content that the user of the electronic device is likely to subscribe to (i.e., that may suit, or be of potential interest to, the contextualized user). In one example embodiment, based on different types of user input, the user may see service recommendations based on user activity, such as where the user has checked in (locations, establishments, etc.) and which activities (e.g., various activity patterns) the user is engaged in.
Fig. 1 is a schematic view of a communication system 10, according to an embodiment. Communication system 10 may include a communication device that initiates an outgoing communication operation (transmission device 12) and a communication network 110, which the transmission device 12 may use to initiate and conduct communication operations with other communication devices within the communication network 110. For example, communication system 10 may include a communication device that receives the communication operation from the transmission device 12 (receiving device 11). Although communication system 10 may include multiple transmission devices 12 and receiving devices 11, only one of each is shown in Fig. 1 to simplify the drawing.
Any suitable circuitry, device, system or combination of these (e.g., a wireless communications infrastructure including communication towers and telecommunications servers) operative to create a communication network may be used to create communication network 110. Communication network 110 may be capable of providing communications using any suitable communication protocol. In some embodiments, for example, communication network 110 may support traditional telephone lines, cable television, Wi-Fi (e.g., an IEEE 802.11 protocol), Bluetooth®, radio frequency systems (e.g., 900 MHz, 2.4 GHz and 5.6 GHz communication systems), infrared, other relatively localized wireless communication protocols, or any combination thereof. In some embodiments, communication network 110 may support protocols used by wireless and cellular phones and personal e-mail devices. Such protocols may include, for example, GSM, GSM plus EDGE, CDMA, quadband and other cellular protocols. As another example, long-range communication protocols may include Wi-Fi and protocols for placing or receiving calls using VOIP, LAN, WAN or other TCP-IP based communication protocols. When located within communication network 110, the transmission device 12 and the receiving device 11 may communicate over a bidirectional communication path such as path 13, or over two unidirectional communication paths. Both the transmission device 12 and the receiving device 11 may be capable of initiating a communication operation and receiving an initiated communication operation.
The transmission device 12 and the receiving device 11 may include any suitable device for sending and receiving communication operations. For example, the transmission device 12 and the receiving device 11 may include mobile telephone devices, television systems, cameras, camcorders, devices with audio/video capabilities, tablets, wearable devices, and any other device capable of communicating wirelessly (with or without the aid of a wireless-enabling accessory system) or via wired pathways (e.g., using traditional telephone wires). The communication operations may include any suitable form of communication, including for example voice communications (e.g., telephone calls), data communications (e.g., e-mails, text messages, media messages), video communication, or combinations of these (e.g., video conferences).
Fig. 2 shows a functional block diagram of an architecture system 100 that may be used to provide, for one or more electronic devices 120 and a wearable device 140, services or applications for collecting sensor and service activity information, archiving the information, contextualizing the information, and presenting organized user activity data along with suggested content and services. Both the transmission device 12 and the receiving device 11 may include some or all of the features of the electronic device 120 and/or the wearable device 140. In one embodiment, the electronic device 120 and the wearable device 140 may communicate with one another, synchronize data, information, content, etc. with one another, and provide complementary or similar features.
In one embodiment, the electronic device 120 may include a display 121, a microphone 122, an audio output 123, an input mechanism 124, communications circuitry 125, control circuitry 126, applications 1-N 127, a camera module 128, a Bluetooth® module 129, a Wi-Fi module 130, sensors 1 to N 131 (N being a positive integer), an activity module 132, an organization module 133, and any other suitable components. In one embodiment, the applications 1-N 127 are provided and may be obtained from a cloud or server 150, a communication network 110, etc., where N is a positive integer equal to or greater than 1. In one embodiment, the system 100 includes a context-aware query application that works in combination with a cloud-based or server-based subscription service to collect evidence and contextual information, query the evidence and contextual information, and present answers to the query on the display 121 in response to a query request. In one embodiment, the wearable device 140 may include some or all of the features, components and modules of the electronic device 120.
In one embodiment, all of the applications employed by the audio output 123, the display 121, the input mechanism 124, the communications circuitry 125 and the microphone 122 may be interconnected and managed by the control circuitry 126. In one example, a handheld music player capable of transmitting music to other tuning devices may be incorporated into the electronic device 120 and the wearable device 140.
In one embodiment, the audio output 123 may include any suitable audio component for providing audio to a user of the electronic device 120 and the wearable device 140. For example, the audio output 123 may include one or more speakers (e.g., mono or stereo speakers) built into the electronic device 120. In some embodiments, the audio output 123 may include an audio component that is remotely coupled to the electronic device 120 or the wearable device 140. For example, the audio output 123 may include a headset, headphones or earbuds that may be coupled to a communication device with a wire (e.g., coupled to the electronic device 120/wearable device 140 with a jack) or wirelessly (e.g., Bluetooth® headphones or a Bluetooth® headset).
In one embodiment, the display 121 may include any suitable screen or projection system for providing a display visible to the user. For example, the display 121 may include a screen (e.g., an LCD screen) that is incorporated into the electronic device 120 or the wearable device 140. As another example, the display 121 may include a movable display or a projection system (e.g., a video projector) for providing a display of content on a surface remote from the electronic device 120 or the wearable device 140. The display 121 may be operative to display content (e.g., information regarding communication operations or information regarding available media selections) under the direction of the control circuitry 126.
In one embodiment, the input mechanism 124 may be any suitable mechanism or user interface for providing user inputs or instructions to the electronic device 120 or the wearable device 140. The input mechanism 124 may take a variety of forms, such as a button, keypad, dial, click wheel, or touch screen. The input mechanism 124 may include a multi-touch screen.
In one embodiment, the communications circuitry 125 may be any suitable communications circuitry operative to connect to a communication network (e.g., communication network 110, Fig. 1) and to transmit communication operations and media from the electronic device 120 or the wearable device 140 to other devices within the communication network. The communications circuitry 125 may be operative to interface with the communication network using any suitable communication protocol such as, for example, Wi-Fi (e.g., an IEEE 802.11 protocol), Bluetooth®, radio frequency systems (e.g., 900 MHz, 2.4 GHz and 5.6 GHz communication systems), infrared, GSM, GSM plus EDGE, CDMA, quadband and other cellular protocols, VOIP, TCP-IP, or any other suitable protocol.
In some embodiments, the communications circuitry 125 may be operative to create a communication network using any suitable communication protocol. For example, the communications circuitry 125 may create a short-range communication network using a short-range communication protocol to connect to other communication devices. For example, the communications circuitry 125 may be operative to create a local communication network using the Bluetooth® protocol to couple the electronic device 120 with a Bluetooth® headset.
In one embodiment, the control circuitry 126 may be operative to control the operations and performance of the electronic device 120 or the wearable device 140. For example, the control circuitry 126 may include a processor, a bus (e.g., for sending instructions to the other components of the electronic device 120 or the wearable device 140), memory, storage, or any other suitable component for controlling the operations of the electronic device 120 or the wearable device 140. In some embodiments, a processor may drive the display and process inputs received from the user interface. The memory and storage may include, for example, cache, Flash memory, ROM, and/or RAM/DRAM. In some embodiments, memory may be specifically dedicated to storing firmware (e.g., for device applications such as an operating system, user interface functions, and processor functions). In some embodiments, memory may be operative to store information related to other devices with which the electronic device 120 or the wearable device 140 performs communication operations (e.g., saving contact information related to communication operations or storing information related to different media types and media items selected by the user).
In one embodiment, the control circuitry 126 may be operative to perform the operations of one or more applications implemented on the electronic device 120 or the wearable device 140. Any suitable number or type of applications may be implemented. Although the following discussion will enumerate different applications, it will be understood that some or all of the applications may be combined into one or more applications. For example, the electronic device 120 and the wearable device 140 may include an automatic speech recognition (ASR) application, a dialog application, a map application, a media application (e.g., QuickTime, MobileMusic.app or MobileVideo.app, etc.), social networking applications, an Internet browsing application, and the like. In some embodiments, the electronic device 120 and the wearable device 140 may include one or more applications operative to perform communication operations. For example, the electronic device 120 and the wearable device 140 may include a messaging application, a mail application, a voicemail application, an instant messaging application (e.g., for chatting), a videoconferencing application, a fax application, or any other suitable application for performing any suitable communication operation.
In some embodiments, the electronic device 120 and the wearable device 140 may include a microphone 122. For example, the electronic device 120 and the wearable device 140 may include the microphone 122 to allow the user to transmit audio (e.g., voice audio) for speech control and navigation of the applications 1-N 127 during a communication operation, as a means of establishing a communication operation, or as an alternative to using a physical user interface. The microphone 122 may be incorporated into the electronic device 120 and the wearable device 140, or may be remotely coupled to the electronic device 120 and the wearable device 140. For example, the microphone 122 may be incorporated into wired headphones, incorporated into a wireless headset, incorporated into a remote control device, etc.
In one embodiment, the camera module 128 includes one or more camera devices that include functionality for capturing still and video images, editing functionality, and communication interoperability for sending, sharing, etc. photos/videos and the like.
In one embodiment, the Bluetooth® module 129 includes processes and/or programs for processing Bluetooth® information, and may include a receiver, transmitter, transceiver, etc.
In one embodiment, the electronic device 120 and the wearable device 140 may include multiple sensors 1 to N 131, such as an accelerometer, a gyroscope, a microphone, a temperature sensor, a light sensor, a barometer, a magnetometer, a compass, a radio frequency (RF) identification sensor, etc. In one embodiment, the multiple sensors 1-N 131 provide information to the activity module 132.
In one embodiment, the electronic device 120 and the wearable device 140 may include any other component suitable for performing a communication operation. For example, the electronic device 120 and the wearable device 140 may include a power supply, ports or interfaces for coupling to a host device, a secondary input mechanism (e.g., an ON/OFF switch), or any other suitable component.
Fig. 3 illustrates an example system 300, according to an embodiment. In one embodiment, block 310 illustrates collecting data and understanding the collected data. Block 320 illustrates presenting data (e.g., life data) to an electronic device, such as the electronic device 120 (Fig. 2) and the wearable device 140. Block 330 illustrates archiving the collected data to a LifeHub (i.e., a cloud-based system/server, network, storage device, etc.). In one embodiment, system 300 shows an overview of how the user's data (e.g., life data) progresses through system 300, where system 300 uses three aspects: collection and understanding in block 310, presentation in block 320, and archiving in block 330.
In block 310, the collection and understanding process gathers data (e.g., life data) from user activity on the user devices (e.g., the electronic device 120 and/or the wearable device 140) and other devices in the user's device ecosystem, and from third-party service information. In one embodiment, the data may be collected by the activity module 132 (Fig. 2) of the electronic device 120 and/or the wearable device 140. Service activity information may include information about what the user viewed, read, searched for, watched, etc. For example, if the user is using a travel service (e.g., a travel guide service/application, a travel recommendation service/application, etc.), the service activity information may include: hotels/motels viewed, cities reviewed, airlines, dates, car-rental information, reviews read, search criteria entered (e.g., price, ratings, dates, etc.), notes left, ratings given, etc. In one embodiment, the collected data may be analyzed in the cloud/server 150. In one embodiment, the collection and analysis may be managed from a user-facing touch point on a mobile device (e.g., the electronic device 120, the wearable device 140, etc.). In one embodiment, the management may include service integration and device integration, as described herein below.
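By way of illustration, the sketch below builds the kind of service activity record described above for a travel service; the field names and helper function are assumptions, not part of the disclosure.

```python
from datetime import datetime

def record_service_activity(service: str, action: str, details: dict) -> dict:
    """Build one service-activity record of the kind collected in block 310.

    service -- e.g. "travel-guide"
    action  -- e.g. "viewed-hotel", "read-review", "searched"
    details -- free-form fields such as city, dates, price range, rating
    """
    return {
        "timestamp": datetime.now().isoformat(),
        "kind": "service-activity",
        "service": service,
        "action": action,
        "details": details,
    }

# Example records for a travel service session (illustrative data only)
activity_log = [
    record_service_activity("travel-guide", "viewed-hotel",
                            {"hotel": "Ritz Carlton"}),
    record_service_activity("travel-guide", "searched",
                            {"price_max": 250, "rating_min": 4}),
    record_service_activity("travel-guide", "read-review",
                            {"hotel": "Ritz Carlton", "rating": 5}),
]
```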
In one embodiment, the processing in system 300 may intelligently deliver appropriate data (e.g., life data) to the user through a wearable device (e.g., the wearable device 140) or a mobile device (e.g., the electronic device 120). These devices may include the device ecosystem and other devices. The presentation in block 320 may take the form of alerts, suggestions, events, communications, etc., which may be handled in the form of slides, cards, time-based units of data or content, objects, etc., via graphics, text, sound, speech, vibration, light, etc. The data, including its presentation form, may be delivered by various communication interface methods, for example, Bluetooth®, near field communication (NFC), Wi-Fi, cellular, broadband, etc.
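A rough sketch of routing such a presentation unit (a slide or card) to a device over an available communication interface is shown below; the preference ordering and the callable-per-interface shape are assumptions made for the example.

```python
# Hypothetical routing of a presentation unit (slide/card) to a device.
# The interface names follow the ones listed above; the preference order is
# an assumption, not something the embodiments specify.
PREFERRED_INTERFACES = ["bluetooth", "nfc", "wifi", "cellular", "broadband"]

def deliver_slide(slide: dict, device_interfaces: dict) -> str:
    """Send a slide over the first available interface.

    device_interfaces maps interface name -> callable that transmits the slide.
    Returns the interface actually used, or raises if none is available.
    """
    for name in PREFERRED_INTERFACES:
        send = device_interfaces.get(name)
        if send is not None:
            send(slide)
            return name
    raise RuntimeError("no available interface to reach the device")

# Example: a wearable reachable over Bluetooth only
used = deliver_slide(
    {"kind": "alert", "text": "Leave now to reach your next appointment"},
    {"bluetooth": lambda s: print("sent over bluetooth:", s["text"])},
)
```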
In one embodiment, the archiving process in block 330 may use the data from third parties and from user activity, and present the data to the user and interact with the user. In one embodiment, this process may compile and process the data, then generate a panel with a timeline presentation (as shown in block 330) or a panel focused on an interest, allowing the user to review their activities. The data may be archived/saved in the cloud/server 150, on the electronic device 120 (and/or the wearable device 140), or in any combination thereof.
Fig. 4 illustrates an example 400 of organizing data into an archive, according to an embodiment. In one embodiment, processing the data into the archive timeline format 420 may occur in the cloud 150, outside of the electronic device 120 and the wearable device 140. Alternatively, the electronic device 120 may process the data and generate the archive, or any combination of one or more of the electronic device 120, the wearable device 140 and the cloud 150 may process the data and generate the archive. As shown, data are collected from activity services 410, the electronic device 120 (e.g., data, content, sensor data, etc.) and the wearable device 140 (e.g., data, content, sensor data, etc.).
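As one way to picture the archive timeline format, the sketch below groups collected records by calendar day; the day-based grouping rule is an assumption for illustration.

```python
from collections import defaultdict
from datetime import datetime

def archive_by_day(events: list[dict]) -> dict:
    """Group collected records into a day-keyed archive, newest first within
    each day, approximating the timeline format 420 described above."""
    archive = defaultdict(list)
    for event in events:
        day = datetime.fromisoformat(event["timestamp"]).date().isoformat()
        archive[day].append(event)
    for day in archive:
        archive[day].sort(key=lambda e: e["timestamp"], reverse=True)
    return dict(archive)

# Example: events from the phone, the wearable and an activity service
events = [
    {"timestamp": "2014-08-01T07:30:00", "source": "wearable", "title": "Good morning"},
    {"timestamp": "2014-08-01T12:10:00", "source": "service",  "title": "Lunch check-in"},
    {"timestamp": "2014-07-31T21:00:00", "source": "phone",    "title": "Photos captured"},
]
print(archive_by_day(events))
```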
Fig. 5 illustrates an example timeline view 450, according to an embodiment. In one embodiment, the view 450 of the timeline 420 includes an example journal or populated timeline view. The archive of the user's daily activities may be organized on the timeline 420. As described above, the archive is populated with activities or places the user has actually interacted with, providing a consolidated view of the user's life data. In one embodiment, the action bar at the top of the timeline 420 provides navigation to the home/timeline view or to interest-specific views, as described below.
In one example embodiment, the header indicates the current date being viewed and includes an image captured by the user or sourced from a third party based on user activity or location. In one example, the context is a pattern (e.g., walking). In one embodiment, the "now" or current life event being recorded is always expanded to display extra information, such as the event header, progress, and any media consumed or captured (e.g., music listened to, pictures captured, books read, etc.). In the example embodiment shown in view 450, the user is walking around the city.
In one embodiment, past events include events recorded earlier on the current date. In the example embodiment shown in view 450, the user interacted with two events while at the Ritz Carlton. Any of these events may be selected and expanded to see deeper information (as described below). Alternatively, other context, such as location, may be used. In one embodiment, achievement events of the wearable device 140 are highlighted on the timeline with different icons or symbols. In one example, the user may continue to scroll down the timeline 420 to life events from previous dates. Alternatively, once the bottom of the timeline 420 is reached, more content is automatically loaded into the view 450, allowing continued viewing.
Fig. 6 illustrates example commands 600 for gesture navigation, according to an embodiment. As shown, the example user-facing timeline touch point 620 may be navigated by interpreting gesture inputs 610 from the user. In one example embodiment, such inputs may be interpreted as scrolling, moving between areas of interest, expanding, etc. In one embodiment, multi-finger gestures such as pinching in or pinching out may provide navigation across category layers. In one example embodiment, from a single-day display view, a pinch gesture may zoom out to a broader view, and again to a month view. Similarly, the opposite action (e.g., a multi-finger zoom-in gesture) may zoom back in from any of the broader or month views.
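A small sketch of stepping the timeline between zoom levels on such gestures follows; the intermediate week level and the gesture names are assumptions, since the embodiments only describe zooming between a single-day view, broader views and a month view.

```python
# Hypothetical zoom levels for the timeline, ordered from narrow to broad.
ZOOM_LEVELS = ["day", "week", "month"]

def apply_gesture(current_level: str, gesture: str) -> str:
    """Step the timeline view on a multi-finger gesture.

    gesture -- "pinch" zooms out toward broader views (day -> week -> month),
               "spread" zooms back in.
    """
    i = ZOOM_LEVELS.index(current_level)
    if gesture == "pinch":
        i = min(i + 1, len(ZOOM_LEVELS) - 1)
    elif gesture == "spread":
        i = max(i - 1, 0)
    return ZOOM_LEVELS[i]

# Example: two pinch gestures move from the day view to the month view
level = "day"
for g in ("pinch", "pinch"):
    level = apply_gesture(level, g)
print(level)   # -> "month"
```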
Figs. 7A-D illustrate examples 710, 711, 712 and 713, respectively, of expanding events (e.g., slide/time-based units) on the timeline GUI, according to an embodiment. In one embodiment, examples 710-713 illustrate how the details of archived events may be displayed on the timeline. In one example embodiment, such expansion may show extra details relevant to the event, such as recorded and analyzed sensor data, application/service/content recommendations, etc. Receiving a recognized input (e.g., a momentary press, a tap touch, etc.) or activation at the user-facing touch point for any life data event on the timeline may expand the event so that its detailed content can be viewed. In one embodiment, example 710 shows the result of a recognized received input or activation command on a "good morning" event. In example 711, the good morning event is shown in an expanded view. In example 712, the timeline is scrolled down via a recognized input or activation command, and another event is expanded via a recognized received input or activation of the touch point. In example 713, the expanded event is displayed.
Fig. 8 illustrates an example 800 of tagging an event, according to an embodiment. In one example embodiment, the wearable device 140 (Fig. 2) may have a predetermined user action or gesture (e.g., squeezing the band) that, when received, registers a user-tagged event. In one embodiment, system 300 (Fig. 3) may detect the gesture from the user on the paired wearable device 140. For example, the user may squeeze 810 the wearable device 140 to initiate tagging. In one embodiment, tagging captures various data points as a single event 820, such as location, pictures or other images, friends or family nearby, other events occurring at the same location, etc. System 300 may determine the data points to be incorporated into the event by contextual correlation, such as pictures taken during the activity, activity data (time spent, distance traveled, steps taken, etc.), the location of the activity, etc. In one embodiment, the tagged event may be archived on the timeline 420 (Fig. 4) and appear as a highlighted event 830 (e.g., via a particular color, symbol, icon, animated symbol/color/icon, etc.).
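The sketch below illustrates one way such contextual correlation might bundle nearby data points into a single tagged event; the 30-minute window and the field names are assumptions.

```python
from datetime import datetime, timedelta

def tag_event(squeeze_time: datetime, recent_points: list[dict],
              window_minutes: int = 30) -> dict:
    """Bundle contextually correlated data points into one tagged event.

    recent_points  -- records such as photos, location fixes and activity
                      metrics, each carrying a "timestamp" field.
    window_minutes -- correlation window around the squeeze (assumed value).
    """
    window = timedelta(minutes=window_minutes)
    correlated = [p for p in recent_points
                  if abs(datetime.fromisoformat(p["timestamp"]) - squeeze_time) <= window]
    return {
        "kind": "tagged-event",
        "timestamp": squeeze_time.isoformat(),
        "highlighted": True,      # shown with a distinct icon/color on the timeline
        "data_points": correlated,
    }

# Example: a squeeze during a walk gathers the photo and step count taken nearby
event = tag_event(
    datetime(2014, 8, 1, 17, 45),
    [{"timestamp": "2014-08-01T17:40:00", "type": "photo"},
     {"timestamp": "2014-08-01T17:44:00", "type": "steps", "count": 4200},
     {"timestamp": "2014-08-01T09:00:00", "type": "photo"}],
)
print(len(event["data_points"]))   # -> 2
```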
Fig. 9 illustrates an example 900 of interest panel detail views, according to an embodiment. In one embodiment, examples 910, 911 and 912 show example detail views of GUI panels that the user may navigate to through the timeline 420 (Fig. 4). A panel detail view may allow the user to review aggregated information about a particular interest. In one example embodiment, a particular interest may be selectable from the user interface of the timeline 420 by selecting the appropriate icon, link, symbol, etc. In one example, interests may include finance, fitness, travel, etc. The user may select a finance symbol or icon on the timeline 420, as shown in example view 910. In example 911, a finance interest view is shown, which may display an aggregated budget to the user. In one example embodiment, the budget may be customized for various time periods (e.g., day, week, month, custom period, etc.). In one embodiment, the panel may show a graphical breakdown or a spending list, or any other finance-related topic.
In an example embodiment, in exemplary view 912, show body-building panel based on user's selection to body-building icon or symbol. In one embodiment, body-building view can include performing movable details, about various activities tolerance (step number that such as, carries out, the distance of covering, the time of cost, burning calorie etc.), user is towards the progress etc. of target. In other example embodiment, it is possible to show travelling details based on travelling icon or symbol, it can show the place etc. that user or this locality or distance have accessed. In one embodiment, interest types can be extendible or customizable. Such as, by belonging to specific interest, such as on foot, golf, exploration, motion, hobby etc., interest types can comprise display or in detail to the data of further grain size category.
Figure 10 illustrates the example 1000 of the service according to embodiment and equipment control. In one embodiment, user provides management service and equipment in the face of touch point, as described further on this. In one example, once select the sidebar icon on (such as, touch, rap) example timeline 1010 or symbol, administration view 1011 turns on, and the difference that its display can be managed by user services and equipment.
Figs. 11A-D illustrate example illustrations 1110, 1120, 1130 and 1140 of service management for application/service discovery according to an embodiment. The illustrated examples show exemplary embodiments for enabling discovery of relevant applications or services. In one embodiment, the timeline 420 (Fig. 4) GUI may display recommendations for services to be incorporated into the aforementioned virtual dashboard stream. The recommendations may be separated into multiple categories. In one example, a category may contain personal recommendations based on context (e.g., user activity, existing applications/services, location, etc.). In another example, a category may contain the most popular applications/services that can be added to the stream. In yet another example, a third category may include new and noteworthy applications/services. These categories may display applications in various formats, including a sampled format similar to how the application/service would be shown in the timeline, a grid view, a list view, etc.
In one embodiment, after a category is selected, a service or application may show preview details and additional information about the service or application. In one embodiment, if the application or service is already installed, service management may simply integrate the application into the virtual dashboard. In one embodiment, example 1110 shows the user touching a drawer on the timeline 420 GUI to open the drawer. The drawer may contain quick actions. In an example embodiment, one area provides the user with access to actions such as Discover, Device Manager, etc. In one embodiment, tapping "Discover" takes the user to a new screen (e.g., transitioning from example 1110 to example 1120).
In one embodiment, example 1120 shows a "Discover" screen containing recommendations for streams that may be grouped into multiple categories, such as "For You", "Popular" and "What's New". In one embodiment, app icons/symbols are formatted similarly to how they appear in the stream view, allowing the user to "sample" a stream. In one embodiment, the user may tap an "add" button on the right to add a stream. As in the example shown, the categories may be relevant to the user, similar to the examples provided above.
In one embodiment, example 1120 shows that the user may tap a tab to go directly to that tab, or swipe between tabs one by one. As described above, categories may display applications in various formats. In example 1130, the Popular tab shows available streams in a grid format and provides a preview when an icon or symbol is tapped. In example 1140, the "What's New" tab shows available services or applications in a list format, where each list item has a short description and an "add" button.
Figs. 12A-D illustrate examples 1210, 1220, 1230 and 1240 of service management for application/service streams according to an embodiment. In one embodiment, examples 1210-1240 show how the user may edit the virtual dashboard or streams. The user-facing touch point may provide the user with options to enable or disable applications shown on the virtual dashboard. The touch point may also allow the user to select which details of an application are shown on the virtual dashboard, and on which associated devices in the device ecosystem (e.g., electronic device 120, wearable device 140, etc.).
In one embodiment, in example 1210, a recognized input or activation (e.g., a momentary force, an applied force moving across the touch point/a drag, etc.) is received on the drawer icon. Alternatively, the drawer icon may be a full-width toolbar that invokes an options menu. In example 1220, an options menu may be displayed with options such as "Edit my streams", "Edit my interests", etc. In one example, "Edit my streams" in example 1220 is selected based on a received and recognized action (e.g., a momentary force on the touch point, received and recognized user input, etc.). In example 1230 (the streams screen), after the selection to edit streams, the user may be presented with a conventional list of services. In an example embodiment, the user may tap a switch to toggle a service on or off. In one embodiment, the features/content provided at this level may be pre-canned. Alternatively, when a recognized input, touch-point command or activation indicating selection of a list item is received (e.g., the user taps the touch point), list item details may be displayed. In one embodiment, the displayed items may include a region that allows each item to be "grabbed" and dragged to reorder the list (e.g., top takes priority). In example 1230, the grabbable region is located on the left side of each item.
In one embodiment, example view 1240 shows a detail view of a single stream and allows the user to customize that stream. In an example embodiment, the user may select which features/content they wish to see and on which devices (e.g., Fig. 2, electronic device 120, wearable device 140). In one embodiment, features/content that cannot be turned off are shown as non-interactive.
Figs. 13A-D illustrate examples 1310, 1320, 1330 and 1340 of service management for application/service user interests according to an embodiment. One or more embodiments provide management of user interests on the timeline 420 (Fig. 4). In one embodiment, user interest categories may be added, deleted, reordered, modified, etc. Alternatively, the user may also customize what is shown in a virtual interest panel (e.g., which associated applications/services are shown, and with which details). Additionally, management may include the user feedback used for calibration, as mentioned above.
In one embodiment, in example 1310, a received and recognized input (e.g., a momentary force, an applied force moving across the touch point, etc.) is applied to the drawer icon or symbol (e.g., a received tap or directional swipe). Alternatively, an icon or symbol in a full-width toolbar may be used to invoke the options menu. In one embodiment, in example 1320 the options menu appears, containing: Edit my streams, Edit my interests, etc. In an example embodiment, as shown in example 1320, the user-selectable "Edit my interests" option is selected based on the received and recognized input. In one embodiment, in example 1330, a display appears that includes a list of interests (which the user previously selected during first use). In one embodiment, based on received and recognized input, interests may be reordered, deleted and added. In an example embodiment, the user may reorder interests based on preference, swipe to delete an interest, tap a "+" symbol to add an interest, etc.
In one embodiment, in example 1340, a detail view of a single stream allows the user to customize that stream. In one embodiment, the user may select which features/content they wish to see and on which devices (e.g., electronic device 120, wearable device 140, etc.). In one embodiment, features/content that cannot be turned off are shown but are not operable. In an example embodiment, the selector may be grayed out, or another similar display may indicate that the feature is locked.
Fig. 14 shows an example overview of mode detection according to an embodiment. In one embodiment, the overview shows an example user mode detection system 1400. In one embodiment, the system 1400 uses a wearable device 140 (e.g., a paired wristband) with a host device such as the electronic device 120. In one embodiment, the wearable device 140 may provide onboard sensor data 1440 (e.g., accelerometer, gyroscope, magnetometer, etc.) to the electronic device 120. In one embodiment, the data may be provided over various communication interface methods, e.g., Wi-Fi, NFC, cellular, etc. In one embodiment, the electronic device 120 may aggregate the wearable device 140 data with data from its own internal sensors, such as time, location (via GPS, cellular triangulation, beacons or other similar methods), accelerometer, gyroscope, magnetometer, etc. In one embodiment, a collection of this aggregated data 1430 may be provided for analysis to a context discovery system 1410 in the cloud 150.
In one embodiment, the context discovery system 1410 may be located in the cloud 150 or on another network. In one embodiment, the context discovery system 1410 may receive the data 1430 via various communication interface methods. In one embodiment, the context discovery system 1410 may include a context determination engine algorithm that analyzes the received data 1430 together with data from a learning data set 1420, or analyzes the received data 1430 after being trained on data from the learning data set 1420. In an example embodiment, the algorithm may be a machine learning algorithm that can be calibrated with user feedback. In one embodiment, the learning data set 1420 may include initial general data about various modes compiled from various sources. New data may be added to the learning data set in response to provided feedback, for better mode determination. In one embodiment, the context discovery system 1410 then produces analyzed data output 1435 indicating the user's mode, which is provided back to the electronic device 120.
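As a highly simplified, non-authoritative sketch of the mode-determination step, the following replaces the machine-learning engine described above with a trivial threshold classifier; the SensorSample and UserMode names and the threshold values are invented for illustration only.

```kotlin
// Aggregated sample combining wearable and host sensor data (field names are assumptions).
data class SensorSample(val accelMagnitude: Double, val speedMps: Double)

enum class UserMode { STATIONARY, WALKING, RUNNING, DRIVING }

// Placeholder for the context determination engine: a real system would use a model
// trained on the learning data set and calibrated with user feedback. Assumes a
// non-empty window of samples.
fun determineMode(samples: List<SensorSample>): UserMode {
    val avgSpeed = samples.map { it.speedMps }.average()
    val avgAccel = samples.map { it.accelMagnitude }.average()
    return when {
        avgSpeed > 8.0 -> UserMode.DRIVING
        avgAccel > 2.5 && avgSpeed > 2.0 -> UserMode.RUNNING
        avgSpeed > 0.5 -> UserMode.WALKING
        else -> UserMode.STATIONARY
    }
}
```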
In one embodiment, the mode 1445 may be provided back to the wearable device 140 via the smartphone, the determined mode 1445 may be used in a Life Hub application (e.g., Fig. 2, activity module 132) or a life-logging application (e.g., organization module 133), or it may even be used to throttle messages pushed to the wearable device 140 based on context. In an example embodiment, if the user is engaged in an activity such as driving or cycling, the electronic device 120 may receive the mode 1445 and prevent messages from being sent to the wearable device 140, or provide a non-intrusive notification, so the user is not distracted. In one embodiment, this essentially takes the user's activity into account rather than relying on another method such as geofencing. In an example embodiment, another example may include automatically activating a pedometer mode and showing the distance traveled if the user is detected to be running.
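A minimal sketch of the context-based message throttling mentioned above; the Activity and Message types, and the rule that only unimportant messages are held while driving or cycling, are assumptions for illustration.

```kotlin
enum class Activity { IDLE, WALKING, RUNNING, DRIVING, CYCLING }

data class Message(val text: String, val important: Boolean)

// Sketch: while the user is driving or cycling, hold non-urgent messages back from the
// wearable (returning null) instead of pushing them; urgent ones are still delivered and
// could instead trigger a non-intrusive notification.
fun routeMessage(activity: Activity, msg: Message): Message? =
    if ((activity == Activity.DRIVING || activity == Activity.CYCLING) && !msg.important) null
    else msg

fun main() {
    println(routeMessage(Activity.DRIVING, Message("New coupon nearby", important = false))) // null
}
```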
Fig. 15 shows an example process 1500 for aggregating/collecting and displaying user data according to an embodiment. In one embodiment, in block 1501, the process 1500 starts (e.g., automatically, manually, etc.). In block 1510, the activity module 132 (Fig. 2) receives third-party service data (e.g., from the electronic device 120 and/or the wearable device 140). In block 1520, the activity module 132 receives user activity data (e.g., from the electronic device 120 and/or the wearable device 140). In block 1530, the collected data is provided to one or more connected devices (e.g., electronic device 120 and/or wearable device 140) for display to the user. In block 1540, user interaction data is received by the activity module 132.
In block 1550, related data is identified and associated with interest categories (e.g., by the context discovery system 1410 (Fig. 14)). In block 1560, the related data is searched and integrated into events (e.g., by the context discovery system 1410 or the organization module 133). In block 1570, a virtual dashboard of events is generated in reverse chronological order (e.g., by the organization module 133). In block 1580, virtual dashboards for interest categories are generated using the events that include the associated related data. In one embodiment, in block 1590, the timeline 420 (Fig. 4) GUI is used to display one or more virtual dashboards. In block 1592, the process 1500 ends.
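For illustration, a short sketch of blocks 1570-1580: one dashboard of all events in reverse chronological order, plus one per interest category. The Event type and the use of a simple timestamp sort are assumptions, not the disclosed implementation.

```kotlin
// Hypothetical event record produced by the identification/integration steps (blocks 1550-1560).
data class Event(val timestampMs: Long, val interest: String, val summary: String)

// Block 1570 equivalent: a dashboard of events in reverse chronological order.
fun buildTimelineDashboard(events: List<Event>): List<Event> =
    events.sortedByDescending { it.timestampMs }

// Block 1580 equivalent: one dashboard per interest category, each newest-first.
fun buildInterestDashboards(events: List<Event>): Map<String, List<Event>> =
    events.groupBy { it.interest }
        .mapValues { (_, evts) -> evts.sortedByDescending { it.timestampMs } }
```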
Fig. 16 shows an example process 1600 of service management by an electronic device according to an embodiment. In one embodiment, the process 1600 starts at start block 1601. At block 1610, it is determined whether the process 1600 is searching for applications. If the process 1600 is searching for applications, the process 1600 proceeds to block 1611, where relevant applications to suggest are determined based on the user context. If the process 1600 is not searching for applications, the process 1600 proceeds to block 1620, where it is determined whether dashboard applications are to be edited. If it is determined that dashboard applications are to be edited, the process 1600 proceeds to block 1621, where a list of the associated applications and current state details is displayed. If it is determined that dashboard applications are not to be edited, the process 1600 proceeds to block 1630, where it is determined whether interest categories are to be edited. If it is determined that interest categories are not to be edited, the process 1600 proceeds to block 1641.
After block 1611, the process 1600 proceeds to block 1612, where the context-based suggestions are displayed in one or more categories. In block 1613, a user selection of one or more applications to be associated with the virtual dashboard is received. In block 1614, the one or more applications are downloaded to the electronic device (e.g., Fig. 2, electronic device 120). In block 1615, the downloaded applications are associated with the virtual dashboard.
In block 1622, user modifications are received. In block 1623, the associated applications are modified according to the received input.
If it is determined that interest categories are to be edited, then in block 1631, a list of interest categories and the associated applications for each category is displayed. In block 1632, user modifications for the categories and the associated applications are received. In block 1633, the categories and/or the associated applications are modified according to the received input.
After block 1633, block 1623 or block 1615, the process 1600 proceeds to block 1641 and ends.
Fig. 17 illustrates an example 1700 of a timeline overview 1710 and slide/time-based units 1730 and 1740 according to an embodiment. In one embodiment, the wearable device 140 (Fig. 2) may include a wristband-type device. In an example embodiment, the wrist-worn device may include a strap forming a bracelet-like structure. In an example embodiment, the bracelet-like structure may be circular or elliptical in shape to fit the user's wrist.
In one embodiment, the wearable device 140 may include a curved organic light-emitting diode (OLED) touch screen or a similar type of display screen. In an example embodiment, the OLED screen may curve in a convex manner, conforming to the curvature of the bracelet structure. In one embodiment, the wearable device 140 may also include a processor, memory, a communication interface, a power supply, etc., as described above. Alternatively, the wearable device may include the components described below in Fig. 42.
In one embodiment, the timeline overview 1710 includes data examples (shown as slides, or data/content-based time units) arranged in three general categories: past, now (present) and future (suggestions). Past examples may include previously notified or recorded events, such as those seen on the left side of the timeline overview 1710. Present examples may include slides 1730 for the current time, the weather, arriving items relevant to the user, or suggestions 1740. In one example, an arriving slide (a data or content-based time unit) 1730 may be a current life event (e.g., a fitness record, a payment, etc.), an arriving communication (e.g., an SMS text message, a phone call, etc.), or a personal alert (e.g., a sports score, current traffic, an alert or emergency, etc.). Future examples may include relevant, useful suggestions and predictions. In one embodiment, the predictions or suggestions may be based on the user profile or the user's previous actions/preferences. In one example, a suggestion slide 1740 may include a coupon offered near a planned location, a recommendation around a location, an upcoming activity, a notice of a delayed airline flight, etc.
In one embodiment, arriving slides 1730 may be categorized as push or pull notifications, which is described in greater detail below. In one embodiment, timeline navigation 1720 is provided through a touch-based interface (or voice commands, motion or movement recognition, etc.). Various user stimuli or gestures may be received and interpreted as navigation commands. In an example embodiment, horizontal gestures or swipes may be used to navigate horizontally left and right, a tap may show the date, an upward or vertical swipe may bring up an action menu, etc.
Fig. 18 illustrates an example information architecture 1800 according to an embodiment. In one embodiment, the example architecture 1800 illustrates the information architecture of the timeline that the user experiences through timeline navigation 1810. In one embodiment, past slides (date or content-based time units) 1811 may be stored in an accessible archive for a predetermined period, or stored under other conditions, before being deleted. In an example embodiment, such a condition may include the size of the cache used to store past slides. In one embodiment, the present slides include recent notifications (slides, or data/content-based time units) 1812, the home/time slide 1813 and active tasks.
In one embodiment, input 1820 from the user (voice input 1821, payment 1822, check-in 1823, touch gestures, etc.) may be received and reflected in the recent notifications 1812. In one embodiment, external input 1830 from the device ecosystem 1831 or from third-party services 1832 may be received by the timeline logic 1840 provided by the host device. In one embodiment, the recent notifications 1812 may also send data in communication with the timeline logic 1840 indicating user actions (e.g., dismissing or canceling a notification). In one embodiment, recent notifications 1812 may persist until the user views them, and may then be moved to the past 1811 archive or removed from the wearable device 140 (Fig. 2).
In one embodiment, the timeline logic 1840 may insert new slides to the left of the most recent notification slides 1812 as they arrive, e.g., further away from the home slide 1813, and to the right of any active tasks. Alternatively, there may be exceptions in which an arriving slide is placed directly to the right of an active task.
In one embodiment, the home slide 1813 may be a default slide that displays the time (or possibly other user-configurable information). In one embodiment, various modes 1850 may be accessed from the home slide 1813, such as fitness 1851, alarms 1852, settings 1853, etc.
In one embodiment, suggestion 1814 (future) slides/time-based units may interact with the timeline logic 1840 similarly to the recent notifications 1812 described above. In one embodiment, suggestions 1814 may be context-sensitive and based on time, location, user interests, the user's schedule/calendar, etc.
Fig. 19 illustrates example active tasks 1900 according to an embodiment. In an example embodiment, two active tasks are shown: music remote 1910 and navigation 1920, each with its own separate set of rules. In one embodiment, active tasks 1900 are unlike the other kinds of slides in the timeline (e.g., Fig. 4, timeline 420). In one embodiment, an active task 1900 remains readily available and may temporarily replace the displayed home slide 1813 until the task is completed or dismissed.
Fig. 20 shows an example 2000 of the timeline logic with an arriving slide 2030 and active tasks 2010 according to an embodiment. In one embodiment, a new slide/time-based unit 2030 enters to the left of the active task slides 2010, and becomes a past slide in the timeline 2020 when it is replaced by new content. In one embodiment, the music remote 2040 active task slide is active when headphones are connected. In one embodiment, the navigation 2050 slide is active when the user has requested turn-by-turn navigation. In one embodiment, the home slide 2060 may be permanently fixed in the timeline 2020. In one embodiment, the home slide 2060 may be temporarily replaced as the visible slide by an active task, as described above.
Figs. 21A and 21B show an example detailed timeline 2110 according to an embodiment. In one embodiment, a more detailed explanation of past notifications, present/recent notifications, arriving notifications and suggestions is provided. In one embodiment, the timeline 2110 shows an example of the touch- or gesture-based user experience of interacting with slides/time-based units. In one embodiment, the user's experience through the timeline 2110 may include the feature that navigation on the wearable device 140 (Fig. 2) promotes use of the host device (e.g., electronic device 120). In one embodiment, if the user navigates to a second layer of information (e.g., from a notification, an expanded event or a slide/time-based unit), the application on the paired host device may be opened to the corresponding screen for more complex user input.
The second column from the left of Fig. 21A shows an example vocabulary (e.g., symbols, icons, etc.) of user actions. In one embodiment, such user actions facilitate the limited-input interaction of the wearable device 140. In one embodiment, the timeline 2100 shows recent slides 2120, the home slide 2130 and suggestion slides 2140.
In one embodiment, the timeline user experience may include a suggestion engine that learns the user's preferences. In one embodiment, the suggestion engine may be trained initially through initial categories selected by the user, and then self-calibrate or delete provided suggestions based on feedback from the user acting on those suggestions. In one embodiment, the engine may also provide new suggestions to replace old ones, or do so when the user deletes a suggestion.
Figs. 22A and 22B show example slide/time-based unit categories 2200 for the timeline logic according to an embodiment. In one embodiment, the example categories also indicate how long a slide (or card) may be stored on the wearable device 140 (Fig. 2) after the event has passed. In one embodiment, the timeline slides 2110 show event slides, alarm slides, communication slides, "now" slides 2210, "always" slides (e.g., the home slide) and suggestion slides 2140.
Fig. 23 shows an example of timeline push notification slide categories 2300 according to an embodiment. In one embodiment, the timeline logic designates the event 2310, communication 2320 and context alert 2330 categories as push notifications. In one example, the duration of an event 2310 slide is either a predetermined number of days (e.g., two days), or until a selected maximum number of slides is reached or the user dismisses it, whichever comes first. In an example embodiment, for communications 2320, the duration of a slide is as follows: it remains in the timeline until it is responded to, viewed or dismissed from the electronic device 120 (Fig. 2), or it remains in the timeline for a predetermined number of days (e.g., two days) or until the supported maximum number of slides is reached. In an example embodiment, for context alerts 2330, the duration of a slide is as follows: it remains in the timeline until it is no longer relevant (e.g., when the user is no longer at the same location, or when a condition or the time changes).
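A brief sketch of these retention rules; the two-day window and the 50-slide cap are example values standing in for the "predetermined" limits mentioned above, and the field names are assumptions.

```kotlin
import java.util.concurrent.TimeUnit

// Push-notification slide categories from Fig. 23.
enum class PushCategory { EVENT, COMMUNICATION, CONTEXT_ALERT }

data class PushSlide(
    val category: PushCategory,
    val createdMs: Long,
    val dismissed: Boolean = false,
    val respondedOrViewed: Boolean = false,
    val stillRelevant: Boolean = true
)

fun shouldExpire(slide: PushSlide, nowMs: Long, slideCount: Int, maxSlides: Int = 50): Boolean {
    val twoDaysMs = TimeUnit.DAYS.toMillis(2)
    val tooOld = nowMs - slide.createdMs > twoDaysMs
    val overLimit = slideCount > maxSlides
    return when (slide.category) {
        PushCategory.EVENT -> slide.dismissed || tooOld || overLimit
        PushCategory.COMMUNICATION -> slide.respondedOrViewed || slide.dismissed || tooOld || overLimit
        PushCategory.CONTEXT_ALERT -> !slide.stillRelevant
    }
}
```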
Fig. 24 shows an example of timeline pull notifications 2400 according to an embodiment. In one embodiment, suggestion slides 2410 are treated as pull notifications and are provided when requested by the user via a swipe from the home screen (e.g., a leftward swipe). In one embodiment, the user does not need to explicitly subscribe to a service to receive suggestions 2410 from it. Suggestions may be based on time, location and user interests. In one embodiment, the initial user interest categories may be defined in a settings app, which may be located in a wearable companion app on the electronic device 120 or on the wearable device 140 (at a future stage, user interests may be calibrated automatically through use). In one embodiment, examples of suggestions 2410 include: location-based coupons; popular recommendations for food, places, entertainment and events; suggested fitness or lifestyle goals; transit updates during non-commuting times; and later events, such as projected weather or scheduled events, etc.
In one embodiment, when the user indicates that they want to receive suggestions (e.g., via a leftward swipe), a predetermined number of suggestions may be preloaded (e.g., three, as shown in this example). In one example, if the user continues to swipe left, additional suggestions 2410 (where applicable) may be loaded dynamically. In one embodiment, the suggestions 2410 are refreshed as the user changes location or at particular times of day. In one example, a coffee shop may be suggested in the morning, and a movie may be suggested in the evening.
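A minimal sketch of the preload/dynamic-load behaviour described above; SuggestionFeed, the fetch callback and the preload count of three are hypothetical names and defaults, not part of the disclosure.

```kotlin
// Sketch: preload a few suggestions on the first leftward swipe, then fetch more on demand.
class SuggestionFeed(
    private val fetch: (count: Int) -> List<String>,
    private val preloadCount: Int = 3
) {
    private val loaded = mutableListOf<String>()
    private var cursor = -1

    // Called when the user swipes left from the home slide.
    fun nextSuggestion(): String? {
        if (loaded.isEmpty()) loaded += fetch(preloadCount)   // preload on first request
        if (cursor + 1 >= loaded.size) loaded += fetch(1)     // dynamic load on further swipes
        cursor += 1
        return loaded.getOrNull(cursor)
    }

    // Called when location or time of day changes, so stale suggestions are replaced.
    fun refresh() { loaded.clear(); cursor = -1 }
}
```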
Fig. 25 shows an example process 2500 for routing an arriving slide according to an embodiment. In one embodiment, the process 2500 starts at start block 2501. In block 2510, a timeline slide is received from the paired device (e.g., Fig. 2, electronic device 120). In block 2520, the timeline logic determines whether the received timeline slide is a requested suggestion. If the received timeline slide is a requested suggestion, the process 2500 proceeds to block 2540. In block 2540, the suggestion slide is placed in the timeline to the right of the home slide or of the most recent suggestion slide.
At block 2550, it is determined whether the user has dismissed the slide or whether the slide is no longer relevant. If the user has not dismissed the slide and the slide is still relevant, the process 2500 proceeds to block 2572. If the user dismisses the slide or the slide is no longer relevant, the process 2500 proceeds to block 2560, where the slide is deleted. The process 2500 then proceeds to block 2572 and ends. In block 2521, the slide is placed in the timeline to the left of the home slide or of an active slide. At block 2522, it is determined whether the slide is a notification-type slide. At block 2530, it is determined whether the slide's duration has been reached. If the duration has been reached, the process 2500 proceeds to block 2560, where the slide is deleted. If the duration has not been reached, the process 2500 proceeds to block 2531, where the slide is placed in the past slide archive. The process 2500 then proceeds to block 2572 and ends.
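For illustration, a rough sketch of the routing decisions in process 2500; the type names and helper functions are assumptions, and the placement values simply mirror the left/right arrangement described above.

```kotlin
// Hypothetical representation of a slide arriving from the paired device.
data class ArrivingSlide(val id: String, val isRequestedSuggestion: Boolean, val isNotification: Boolean)

sealed class Placement {
    object RightOfHome : Placement()  // requested suggestions (block 2540)
    object LeftOfHome : Placement()   // other slides (block 2521)
}

fun placeSlide(slide: ArrivingSlide): Placement =
    if (slide.isRequestedSuggestion) Placement.RightOfHome else Placement.LeftOfHome

// Later housekeeping (blocks 2530/2531/2550/2560): delete when dismissed, irrelevant or expired,
// otherwise archive notification-type slides to the past.
fun housekeeping(slide: ArrivingSlide, durationReached: Boolean, dismissedOrIrrelevant: Boolean): String =
    when {
        dismissedOrIrrelevant -> "delete"
        slide.isNotification && durationReached -> "delete"
        slide.isNotification -> "archive to past"
        else -> "keep"
    }
```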
Fig. 26 shows an example block diagram of the wearable device 140 according to an embodiment. In one embodiment, the wearable device 140 includes a processor 2610, memory 2620, a touch screen 2630 (e.g., OLED), a communication interface 2640, a microphone 2665, a timeline logic module 2670, and optionally an LED module 2650 and an actuator module 2660. In one embodiment, the timeline logic module includes a suggestion module 2671, a notification module 2672 and a user input module 2673.
In one embodiment, the modules in the wearable device 140 may be instructions stored in the memory and executed by the processor 2610. In one embodiment, the communication interface 2640 may be configured to connect to the host device (e.g., electronic device 120) via various communication methods (e.g., LTE, Wi-Fi, etc.). In one embodiment, the optional LED module 2650 may be monochrome or multi-color, and the actuator module 2660 may include one or more actuators. Alternatively, the wearable device 140 may be configured to use the optional LED module 2650 and the actuator module 2660, respectively, to convey unobtrusive notifications through specific pre-programmed displays or vibrations.
In one embodiment, the timeline logic module 2670 may be the overall logic and framework controlling how timeline slides are organized in the past, now and suggestion sections. The timeline logic module 2670 accomplishes this by controlling, per slide category, the rules for how long a slide persists and how the user interacts with it. In one embodiment, the timeline logic module 2670 may or may not include submodules, such as the suggestion module 2671, the notification module 2672 or the user input module 2673.
In one embodiment, the suggestion module 2671 may provide suggestions based on context, such as user preferences, location, etc. Alternatively, the suggestion module 2671 may include a suggestion engine that calibrates to and learns the user's preferences through the user's interactions with suggestion slides. In one embodiment, the suggestion module 2671 may remove old or no-longer-relevant suggestion slides and replace them with new, more relevant suggestions.
In one embodiment, the notification module 2672 may control the throttling and display of notifications. In one embodiment, the notification module 2672 may have general rules for all notifications, as described below. In one embodiment, the notification module 2672 may also distinguish between two types of notifications: important and unimportant. In an example embodiment, an important notification may be shown on the display immediately, and may be accompanied by a vibration from the actuator module 2660 and/or activation of the LED module 2650. In one embodiment, the screen may remain off based on user preference, and an important notification may be conveyed by vibration and LED activation. In one embodiment, an unimportant notification may only activate the LED module 2650. In one embodiment, other combinations may be used to convey and distinguish important and unimportant notifications. In one embodiment, the wearable device 140 also includes any of the other modules described with reference to the wearable device 140 shown in Fig. 2.
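A minimal sketch of this important/unimportant rule; the Alert type and the screenAllowed flag are assumptions used only to make the behaviour concrete.

```kotlin
// Output channels chosen for a notification: screen, vibration (actuator module) and LED.
data class Alert(val showOnScreen: Boolean, val vibrate: Boolean, val led: Boolean)

// Important notifications may appear on screen (unless the user prefers the screen off)
// and are accompanied by vibration and the LED; unimportant ones only light the LED.
fun buildAlert(important: Boolean, screenAllowed: Boolean): Alert =
    if (important) Alert(showOnScreen = screenAllowed, vibrate = true, led = true)
    else Alert(showOnScreen = false, vibrate = false, led = true)
```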
Fig. 27 shows example notification functionality 2700 according to an embodiment. In one embodiment, notifications include important notifications 2710 and unimportant notifications 2720. The user input module 2673 may recognize user gestures on the touch screen 2630, sensed user motion, or physical button presses when the user interacts with a slide. In an example embodiment, when the user activates the touch screen 2630 after a new notification, that notification is visible on the touch screen 2630. In one embodiment, the LED of the LED module 2650 is then turned off, representing a "read" state. In one embodiment, if content is being viewed on the wearable device 140 when a notification arrives, the touch screen 2630 remains unchanged (to avoid interruption), but the user is alerted by the LED of the LED module 2650 and, if the message is important, also by a vibration from the actuator module 2660. In one embodiment, the touch screen 2630 of the wearable device 140 turns off after a specified number of idle seconds (e.g., 15 seconds, etc.), or after another time period (e.g., 5 seconds) if the user's arm is lowered.
Fig. 28 shows example input gestures 2800 for interacting with the timeline framework according to an embodiment. In one embodiment, the user may swipe left or right on the timeline 2810 and 2820 to navigate the timeline and suggestions. In one embodiment, a tap gesture 2825 on a slide shows additional details 2830. In one embodiment, another tap 2825 cycles back to the initial state. In one embodiment, an upward swipe 2826 on a slide presents actions 2840.
Fig. 29 shows an example process 2900 for creating a slide according to an embodiment. In one embodiment, the process 2900 starts at start block 2901. In block 2910, third-party data including text, images or unique actions is received. In block 2920, the image is prepared for display on the wearable device (e.g., Fig. 2, Fig. 26, wearable device 140). In block 2930, the text is placed in the designated template fields. In block 2940, a dynamic slide is generated for any unique actions. In block 2950, the slide is provided to the wearable device. In block 2960, an interaction response is received from the user. In block 2970, the user response is provided to the third party. The process 2900 proceeds to end block 2982.
Fig. 30 shows an example of slide generation 3000 using a template according to an embodiment. In one embodiment, timeline slides provide a data-to-interaction model. In one embodiment, the model allows a third-party service to create slides and interact with the user without consuming vast resources. The third-party service may provide data as part of the external input 1830 (Fig. 18). In one embodiment, the third-party data may include text, images, image pointers (e.g., URLs) or unique actions. In an example embodiment, such third-party data may be delivered through a third-party application, through an API, or by other similar means (e.g., HTTP). In one embodiment, through the use of templates, the third-party data may be converted, either by logic on the wearable device 140 (Fig. 2, Fig. 26), by the host device (e.g., electronic device 120), or even in the cloud 150 (Fig. 2), into a slide, card or other presentation form appropriate for the particular device (e.g., based on screen size or device type), for display on the wearable device 140.
In one embodiment, the data-to-interaction model may detect the target device and determine what presentation form (e.g., slide/card, appropriate size, etc.) to use for display. In one embodiment, an image may be prepared through feature detection and cropping to fit the display according to preset design rules. For example, a design rule may indicate which part of a picture is the subject related to the focus of the display (e.g., an airplane, a face, etc.).
In one embodiment, the template may include designated locations (e.g., a preset image area, text fields, design elements, etc.). Thus, an image may be inserted in the background, and appropriate text may be provided in the various fields (e.g., primary or secondary fields). The third-party data may also include data that may be incorporated in additional layers. Additional layers may be prepared through the use of detail or action slides. Some actions may be default actions included on all slides (such as dismiss, bookmark, etc.). In one embodiment, unique actions provided by the third-party service may be placed on a dynamic slide generated from the template. A unique action may be specific to the slide generated by the third party. For example, the unique action shown on the example slide in Fig. 30 may be an indication that the user has spotted the aircraft. The dynamic slide may be accessible from the default action slide.
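A short sketch of this template-based assembly; the ThirdPartyData and Slide field names, the Template contract and the default action list are illustrative assumptions rather than the disclosed format.

```kotlin
// Hypothetical third-party payload and template-filled slide.
data class ThirdPartyData(
    val primaryText: String,
    val secondaryText: String? = null,
    val imageUrl: String? = null,
    val uniqueActions: List<String> = emptyList()
)

data class TemplateSlide(
    val backgroundImageUrl: String?,
    val primaryField: String,
    val secondaryField: String?,
    val actions: List<String>
)

fun buildSlide(data: ThirdPartyData): TemplateSlide {
    val defaultActions = listOf("dismiss", "bookmark")
    return TemplateSlide(
        backgroundImageUrl = data.imageUrl,            // image dropped into the template background
        primaryField = data.primaryText,               // text routed into the designated fields
        secondaryField = data.secondaryText,
        actions = defaultActions + data.uniqueActions  // unique actions land on the dynamic action layer
    )
}
```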
In one embodiment, the prepared slide may be provided to the wearable device 140, where the timeline logic module 2670 (Fig. 26) directs its display. In one embodiment, a user response may be received from the interaction. The result may be provided back to the third party via a method similar to how the third-party data was originally provided, e.g., a third-party application, an API, or other means (e.g., HTTP).
Fig. 31 shows an example 3100 of contextual voice commands based on the displayed slide according to an embodiment. In one embodiment, the wearable device 140 uses, for example, a long-press gesture 3110 from any slide 3120 to bring up a voice prompt 3130. Such a long press may be a long touch detected on the touch screen or a press of a physical button. In one embodiment, general voice commands 3140 and slide-specific voice commands 3150 are interpreted into actions. In one embodiment, a combination of gesture interaction and voice commands on the wearable device 140 (e.g., a wristband) is used for navigating the event-based framework. In an example embodiment, such a mix of voice commands and gesture input may include registering specific gestures, via internal sensors (e.g., accelerometer, gyroscope, etc.), to trigger the voice prompt 3130 for user input.
In one embodiment, the combined voice and gesture interaction, together with visual cues, provides a conversational interaction and improves the user experience. Additionally, the limited gesture/touch-based input is greatly supplemented by voice commands for actions in the event-based system, such as searching for a specific slide/card, quickly filtering and sorting, etc. In one embodiment, the figure depicts contextual voice commands based on the slide shown on the touch screen (e.g., slide-specific voice commands 3150) as well as general voice commands 3140 available from any display.
In an example embodiment, while any slide is shown, the user may perform a long press 3110 or actuate a hard button to activate the voice command function. In other embodiments, the voice command function may be triggered by a touch gesture or by user motion recognized via embedded sensors. In an example embodiment, the wearable device 140 may be configured to trigger voice input if the user turns their wrist over and raises the wristband while speaking to it, or if the user performs a short, sharp shake/motion of the wrist.
In one embodiment, the wearable device 140 shows a visual cue on the screen to notify the user that it is ready to accept a verbal command. In another example embodiment, the wearable device 140 may include a speaker to provide an audio prompt, or, if the wearable device is placed on a base station or docking station, the base station may include a speaker for providing the audio prompt. In one embodiment, the wearable device 140 provides a haptic notification (e.g., a specific vibration sequence) to notify the user that it is in listening mode.
In one embodiment, the user speaks a verbal command from a preset list recognizable by the device. In one embodiment, example general voice commands 3140 are shown in example 3100. In one embodiment, a command may be general (and thus usable from any slide) or context-sensitive and applied to the specific slide being shown. In one embodiment, in specific situations, a general command 3140 may be contextually related to the currently displayed slide. In an example embodiment, if a location slide is shown, the command "check-in" may check in at that location. Additionally, if a slide includes a large list of content, a command may be used to select specific content on the slide.
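As a rough sketch of how such commands might be dispatched, the following prefers a slide-specific handler over a general one; the command strings, DisplayedSlide type and handler maps are assumptions for illustration only.

```kotlin
// A displayed slide may carry its own context-sensitive command handlers.
data class DisplayedSlide(val kind: String, val specificCommands: Map<String, () -> String>)

// General commands are available from any slide.
val generalCommands: Map<String, () -> String> = mapOf(
    "go to home" to { "navigated to home slide" },
    "show suggestions" to { "navigated to suggestions" }
)

fun dispatch(command: String, slide: DisplayedSlide): String {
    val handler = slide.specificCommands[command] ?: generalCommands[command]
    return handler?.invoke() ?: "command not recognized; ask for clarification"
}

fun main() {
    val locationSlide = DisplayedSlide("location", mapOf("check in" to { "checked in at this location" }))
    println(dispatch("check in", locationSlide))   // slide-specific command
    println(dispatch("go to home", locationSlide)) // general command
}
```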
In one embodiment, the wearable device 140 may provide a system response requesting clarification or more information, and wait for the user's response. In an example embodiment, this may result from the wearable device 140 not understanding the user's command, recognizing the command as invalid/not among the preset commands, or the command requiring further user input. In one embodiment, once the entire command is ready to run, the wearable device 140 may allow the user to confirm and then perform the action. In one embodiment, the wearable device 140 may request confirmation that the command is ready for execution.
In one embodiment, the user may also interact with the wearable device 140 by activating the touch screen simultaneously or concurrently with a voice command. In an example embodiment, the user may use a finger swipe to scroll up or down to review a command. Other gestures may be used to clear a command (e.g., tapping the screen on a virtual clear button), or the user may touch/tap a virtual confirm button to accept the command. In other embodiments, physical buttons may be used. In an example embodiment, the user may dismiss/clear voice commands and other actions by pressing a physical button or switch (e.g., a home button).
In one embodiment, the wearable device 140 registers motion gestures via onboard sensors (e.g., gyroscope, accelerometer, etc.) in addition to finger gestures on the touch screen. In an example embodiment, a registered motion or gesture may be used to cancel or clear a command (e.g., shaking the wearable device 140 once). In other example embodiments, tilting the wrist may be used for scrolling, rotating the wrist with a clockwise motion may move to the next slide, and rotating counterclockwise may navigate to the previous slide. In one embodiment, there may be contextual motion gestures recognized by certain kinds of slides.
In one embodiment, the wearable device 140 may adopt an app-less approach, where the main display of information consists of cards or slides, as opposed to applications. One or more embodiments allow the user to navigate the event-based system architecture without requiring the user to page through every slide. In an example embodiment, the user may ask for a specific slide (e.g., "show 6:00 this morning"), and that slide may be displayed on the screen. Such a command may also recall slides that are no longer stored in the archive on the wearable device 140. In one embodiment, some commands may present choices, and the choices may be presented on the display and navigated via the slide selection mechanism. In an example embodiment, the voice command "check in" may result in a display allowing or asking the user to select among various places at which to check in.
In one embodiment, a card-based navigation may be used that allows easy access to related events through quickly filtered and sorted displays of interest. In an example embodiment, the command "what was I doing at 3:00 yesterday afternoon?" may produce a display of the subset of relevant cards around the indicated time. In one embodiment, the wearable device 140 may show a visible notification indicating the number of slides included in the subset or the criteria used. If the number of slides in the subset exceeds a predetermined threshold (e.g., 10 or more cards), the wristband may ask the user whether they wish to filter or sort further. In one embodiment, the user may use touch input to navigate the subset of cards, or use voice commands to further filter or sort the subset (e.g., "sort by relevance", "show the results first", etc.).
In one embodiment, another embodiment may include voice commands for performing actions in third-party services on the paired device (e.g., Fig. 2, electronic device 120). In an example embodiment, the user may check in at a location, and the check-in may be reflected through a third-party application, without having to open the third-party service on the paired device. Another example embodiment includes a social update command, allowing the user to update their status on a social network, e.g., posting the updates or status renewals mentioned above.
In one embodiment, voice commands (e.g., general voice commands 3140 and slide-specific voice commands 3150) may be processed by the host device to which the wearable device 140 is paired. In one embodiment, the command is transmitted to the host device. Alternatively, the host device may provide the command to the cloud 150 (Fig. 2) to assist in interpreting it. In one embodiment, some commands may be handled exclusively by the wearable device 140, for example, "go to" commands, general actions, etc.
In one embodiment, while the wearable device 140 primarily interacts with external devices or servers through the host device, in certain embodiments the wearable device 140 may have direct communication connections with other devices in the user's device ecosystem, such as a television, tablet, headset, etc. In one embodiment, other examples of devices may include thermostats (e.g., Nest), scales, cameras or other connected devices on the network. In one embodiment, such control may include activating or controlling a device, or helping various devices communicate with one another.
In one embodiment, the wearable device 140 may recognize predetermined motion gestures that trigger specific listening conditions, i.e., a search for, or filtering of, slides of a particular category or type. For example, the device may recognize the sign-language motion for "suggestion" and may restrict the search to suggestion-category cards. In one embodiment, the microphone may be used, via voice commands on the wearable device 140, for sleep tracking. Such monitoring may also use various other sensors included in the wearable device 140, including an accelerometer, gyroscope, photodetector, etc. Data about light, sound and motion can provide a more accurate analysis for determining when the user falls asleep and wakes up, as well as other details of sleep patterns.
Fig. 32 shows an example block diagram 3200 of the wearable device 140 and the host device (e.g., electronic device 120) according to an embodiment. In one embodiment, the voice command module 3210 carried on the wearable device 140 may be configured to receive input from the touch display 2630, the microphone 2665, the sensors 3230 and the communication module 2640, and to provide output to the touch display 2630 for prompts/confirmations, or to the communication module 2640 for relaying commands to the host device (e.g., electronic device 120), as described above. In one embodiment, the voice command module 3210 may include a gesture recognition module 3220 to process touch or motion input from the touch display 2630 or the sensors 3230, respectively.
In one embodiment, a voice command processing module 3240 carried on the host device (e.g., electronic device 120) may process commands for execution and provide instructions to the voice command module 3210 on the wearable device 140 through the communication modules (e.g., communication modules 2640 and 125). In one embodiment, such a voice command processing module 3240 may include an application programmed to work with the wearable device 140, or may be a background process transparent to the user.
In one embodiment, the voice command processing module 3240 on the host device (e.g., electronic device 120) may simply process the audio or speech data transmitted from the wearable device 140 and provide the processed data in the form of command instructions for execution by the voice command module 3210 on the wearable device 140. In one embodiment, the voice command processing module 3240 may include a navigation command recognition submodule 3250, which may perform various functions, such as identifying cards no longer available on the wearable device 140 and supplying them to the wearable device 140 together with the processed command.
Fig. 33 shows an example process 3300 for receiving a command on a wearable device (e.g., Fig. 2, Fig. 26, Fig. 32, wearable device 140) according to an embodiment. In one embodiment, at any point in the process 3300, the user may interact with the touch screen to scroll and view the command. In one embodiment, in the process 3300, the user may cancel by pressing a physical button or by using a specific cancellation touch/motion gesture. In one embodiment, when instructed to do so, the user may also provide confirmation to accept the command by tapping the screen.
In one embodiment, the process 3300 starts at start block 3301. In block 3310, an instruction to enter listening mode is received by the wearable device (e.g., Fig. 2, 26, 32, wearable device 140). In block 3320, the wearable device prompts the user for a voice command. In block 3330, the wearable device receives an audio/voice command from the user. In block 3340, it is determined whether the received voice command is valid. If the voice command is determined to be invalid, the process 3300 proceeds to block 3335, where the user is alerted by the wearable device of the invalid received command.
If the voice command is determined to be valid, the process 3300 proceeds to block 3350, where it is determined whether clarification is needed for the received voice command. If clarification of the voice command is needed, the process 3300 proceeds to block 3355. In block 3355, the user is prompted by the wearable device for clarification.
In block 3356, the wearable device receives the clarification from the user via another voice command. If it is determined that no clarification of the voice command is needed, the process 3300 proceeds to block 3360. In block 3360, the wearable device prepares the command for execution and requests confirmation. In block 3370, the wearable device receives the confirmation. In block 3380, the process 3300 executes the command, or the command is sent to the wearable device for execution. The process 3300 then proceeds to block 3392 and ends.
Fig. 34 shows an example process 3400 for motion/movement-based gestures on a wearable device according to an embodiment. In one embodiment, the process 3400 receives, on the wearable device (e.g., Fig. 2, 26, 32, wearable device 140), a command incorporating a movement-based gesture, where such a movement-based gesture includes the wearable device (e.g., a wristband) detecting a predetermined movement or motion of the wearable device 140 in response to the user's arm motion. In one embodiment, at any point in the process 3400, the user may interact with the touch screen to scroll and view the command. In another embodiment, scrolling may be accomplished through recognized motion gestures, such as rotating the wrist or other gestures that tilt or pan the wearable device. In one embodiment, the user may also cancel a voice command by various methods, which may restart the process 3400 from the point of the cancelled command, i.e., prompting for the most recently cancelled command. Additionally, after a prompt is shown, if no voice command or other input is received within a predetermined interval (e.g., an idle period), the process may time out and cancel automatically.
In one embodiment, the process 3400 starts at start block 3401. In block 3410, the wearable device receives a motion gesture instructing it to enter listening mode. In block 3411, the wearable device displays a visual cue for a voice command. In block 3412, the wearable device receives from the user an audio/voice command for navigating the event-based framework. In block 3413, the audio/speech is provided to the wearable device (or to the cloud 150 or the host device (e.g., electronic device 120)) for processing.
In block 3414, the processed command is received. In block 3420, it is determined whether the voice command is valid. If it is determined that the voice command is not valid, the process 3400 proceeds to block 3415, where a visual indication of the invalid command is displayed. In block 3430, it is determined whether the received voice command requires clarification. If it is determined that clarification is needed, the process 3400 proceeds to block 3435, where the wearable device prompts the user for clarification.
In block 3436, the wearable device receives the voice clarification. In block 3437, the audio/speech is provided to the wearable device for processing. In block 3438, the processed command is received. If it is determined that no clarification is needed, the process 3400 proceeds to optional block 3440. In optional block 3440, the command is prepared for execution, and a request for confirmation is also prepared. In optional block 3450, the confirmation is received. In optional block 3460, the command is executed or sent to the wearable device for execution. The process 3400 then proceeds to end block 3472.
Fig. 35 shows an example 3500 of a smart alarm wearable device 3510 that uses haptic units 3540 according to an embodiment. In one embodiment, a haptic array or multiple haptic units 3540 may be embedded within the wearable device 3510 (e.g., a wristband). In one embodiment, the array may be customized by the user so that different portions of the haptic units 3540 (e.g., portion 3550, portion 3545, or all of the haptic units 3540) produce unique notifications that circulate around the band. In one example, a circulating notification may be presented as a chasing pattern around the haptic array that the user feels as motion moving around the wrist.
In one embodiment, different portions of the band of the wearable device 3510 may vibrate in a pattern, for example, clockwise or counterclockwise around the wrist. Other patterns may include a rotating pattern, in which opposite sides of the band pulse simultaneously (e.g., haptic units 3550) and then the next set of opposing haptic motor units vibrates (e.g., haptic units 3545). In an example embodiment, the top and bottom vibrate simultaneously, then both sides, and so on. In an example embodiment, the haptic units 3550 of the smart alarm wearable device 3510 vibrate on opposite sides for an alarm. In another example embodiment, the haptic units 3545 of the smart alarm wearable device 3510 vibrate at four points on the band for an alarm. In one embodiment, the haptic units 3540 of the smart alarm wearable device 3510 vibrate in a rotation around the band.
In one embodiment, the pulsing of the haptic units 3540 may be localized so that the user only feels one segment of the band pulsing at a time. This may be accomplished by using the adjacent haptic unit 3540 motors to suppress vibration in other parts of the band.
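A minimal sketch of a "chasing" pattern that activates one actuator at a time around the band; the HapticDriver interface, the actuator count and the step duration are assumptions rather than the disclosed hardware interface.

```kotlin
// Hypothetical driver abstraction over the actuator module.
interface HapticDriver {
    fun pulse(actuatorIndex: Int, amplitude: Double, durationMs: Long)
}

// Circulates a single pulse around the band so the user feels motion around the wrist.
fun playChasePattern(driver: HapticDriver, actuatorCount: Int = 8, laps: Int = 2, stepMs: Long = 120) {
    repeat(laps) {
        for (i in 0 until actuatorCount) {
            // Only one segment pulses at a time; other segments stay quiet (localized sensation).
            driver.pulse(i, amplitude = 1.0, durationMs = stepMs)
            Thread.sleep(stepMs)
        }
    }
}
```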
In one embodiment, in addition to customized circulating notifications, the wearable device may have a haptic language, where specific vibration pulses or patterns of pulses have specific meanings. In one embodiment, a vibration pattern or pulse may be used to indicate a new state of the wearable device 3510. In an example embodiment, a unique haptic pattern may distinguish an important notification or incoming call when it is received, identify the sender of a message, and so on.
In one embodiment, the wearable device 3510 may include materials that are more conducive to letting the user feel the effect of the haptic array. Such a material may be softer, to enhance the localized sensation. In one embodiment, a harder material may be used for a more uniform vibration sensation or a blend of the vibrations generated by the haptic array. In one embodiment, the interior of the wearable device 3510 may be customized, as shown in wearable device 3520, with different types of materials (e.g., softer, harder, more flexible, etc.).
In one embodiment, as indicated above, the haptic feedback array may be customized with specific patterns or programs. Programming may use input from physical force-resistive sensors or from a touch interface. In one embodiment, using either of the mentioned input methods, the wearable device 3510 initiates and records a haptic pattern. In another embodiment, the wearable device 3510 may be configured to receive non-verbal messages, i.e., haptic reproductions from a specific person, such as a clasp of the wrist (conveyed through slow surrounding pressure, vibration, etc.). In one embodiment, the non-verbal message may be a unique vibration or pattern. In an example embodiment, the user may squeeze their wearable device 3510, causing a pre-programmed unique vibration to be sent to a preselected recipient, for example, squeezing the band to send a special notification to a family member. In one embodiment, the custom vibration pattern may be accompanied by a displayed text message, image or special slide.
In one embodiment, various methods may be used to record haptic patterns. In one embodiment, multidimensional haptic patterns may be recorded, including array position, amplitude, phase, frequency, etc. In one embodiment, the components of such a pattern may be recorded separately or interpreted from user input. In one embodiment, an alternative method may use a touch screen with a GUI in which touch input positions correspond to the various actuators. In an example embodiment, the touch screen may map the x and y axes and the force input correspondingly onto the array of haptic actuators. In one embodiment, a multidimensional model algorithm or module may be used to compile the user input into a haptic pattern (e.g., using array position, amplitude, phase, frequency, etc.). Another embodiment may contemplate performing haptic pattern recording using a recording program on a device separate from the wearable device 3510 (e.g., electronic device 120). In this embodiment, preset patterns may be used, or the program may use intelligent algorithms to help the user easily create haptic patterns.
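For illustration, a small sketch of mapping touch-screen input onto the actuator array, with the x position mapped around the band to an actuator index and the touch force mapped to amplitude; the sample/step types, screen width parameter and actuator count are assumptions.

```kotlin
// Hypothetical touch sample captured during haptic recording.
data class TouchSample(val x: Float, val force: Float, val timestampMs: Long)

// One step of a compiled haptic pattern.
data class HapticStep(val actuatorIndex: Int, val amplitude: Double, val timestampMs: Long)

fun compilePattern(samples: List<TouchSample>, screenWidthPx: Float, actuatorCount: Int = 8): List<HapticStep> =
    samples.map { s ->
        // x position wraps around the band to an actuator index; force becomes amplitude in [0, 1].
        val index = ((s.x / screenWidthPx) * actuatorCount).toInt().coerceIn(0, actuatorCount - 1)
        HapticStep(
            actuatorIndex = index,
            amplitude = s.force.toDouble().coerceIn(0.0, 1.0),
            timestampMs = s.timestampMs
        )
    }
```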
Figure 36 shows an example process 3600 for recording a customized haptic pattern according to an embodiment. In one embodiment, process 3600 can execute on an external device (such as electronic device 120, cloud 150, etc.) and be provided to a wearable device (such as wearable device 140 or 3510 of Figs. 2, 26, 32, 35). In one embodiment, the flow receives an input indicating initiation of a haptic input recording mode. In one embodiment, initiation can include displaying a GUI or other UI and accepting an input command for the customized recording. In one embodiment, the recording mode for receiving haptic input continues until a predetermined limit or time is reached, or until no input is detected for a specified number of seconds (e.g., an idle period). In one embodiment, the haptic recording is then processed. Processing can include applying an algorithm that compiles the haptic input into a unique pattern. In an example embodiment, the algorithm can convert a single force input over a period of time into a unique pattern including changes in amplitude, frequency, and position (e.g., around the wristband). In one embodiment, processing can include applying one or more filters to convert the input into a rich playback experience by enhancing or creatively altering characteristics of the haptic input. In an example embodiment, a filter can smooth the haptic sample or apply a fade effect to the input. The processed recording can then be sent or delivered to a recipient. Transmission can occur over various communication interface methods, such as WiFi, cellular, HTTP, etc. In one embodiment, transmission of the processed recording can include transmitting a small message that is routed to a cloud back end, directed to a phone, and then routed to the wearable device.
In one embodiment, human interaction with the wearable device is provided in block 3610. In block 3620, recording of the haptic input is initiated. In block 3630, a haptic sample is recorded. Block 3640 determines whether the recording limit has been reached, or whether no input has been received for a specific amount of time (e.g., in seconds). If the recording limit has not been reached and input has been received, process 3600 returns to block 3630. If the recording limit has been reached or no input has been received for the specific amount of time, process 3600 proceeds to block 3660. In block 3660, the haptic recording is processed. In block 3670, the haptic recording is sent to a recipient. In one embodiment, process 3600 then returns to block 3610 and repeats, flows into a process shown below, or ends.
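The overall shape of this flow can be sketched as a simple loop that records samples until a time limit or idle timeout is reached, then processes and sends the recording. The following Python sketch is illustrative only; read_force_sensor(), process_recording(), and send_to_recipient() are invented stubs, and the limits are arbitrary.

    # Hypothetical recording loop mirroring blocks 3620-3670; all stubs and limits are invented.
    import time

    RECORD_LIMIT_S = 10.0   # stop after this many seconds of recording
    IDLE_TIMEOUT_S = 2.0    # stop if no input is detected for this long

    def read_force_sensor():
        return 0.0  # stub: would return the current force reading

    def process_recording(samples):
        return samples  # stub: would compile samples into a unique haptic pattern

    def send_to_recipient(recording):
        print("sending", len(recording), "samples")  # stub: would transmit the processed recording

    def record_haptic_input():
        samples, start = [], time.time()
        last_input = start
        while True:
            now = time.time()
            if now - start > RECORD_LIMIT_S or now - last_input > IDLE_TIMEOUT_S:
                break  # block 3640: record limit reached or idle timeout
            force = read_force_sensor()
            if force > 0.05:            # treat very small readings as "no input"
                last_input = now
            samples.append((now - start, force))   # block 3630: record a haptic sample
            time.sleep(0.02)                       # ~50 Hz sampling, chosen arbitrarily
        send_to_recipient(process_recording(samples))  # blocks 3660 and 3670

    if __name__ == "__main__":
        record_haptic_input()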
Figure 37 shows an example process 3700 in which a wearable device (e.g., wearable device 140 or 3510 of Figs. 2, 26, 32, 35) receives and plays a haptic recording according to an embodiment. In one embodiment, the arriving recording 3710 can be pre-processed in block 3720 to ensure that it can be played on the wearable device, i.e., to ensure proper formatting and that nothing was lost or corrupted in transmission. In one embodiment, the recording can then be played on the wearable device in block 3730, allowing the user to experience the created recording.
In one embodiment, recording, processing, and playback can all occur on a single device. In this embodiment, transmission may not be necessary. In one embodiment, the pre-processing in block 3720 can also be omitted. In one embodiment, a filter block can be employed. In one embodiment, a filter block can be employed to smooth the signal. Other filters can be used to creatively add effects and convert a single input into a rich playback experience. In an example embodiment, a filter can be used to alternately fade and intensify the recording as it travels around the wearable device band.
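Two filters of the kind mentioned above might look like the following sketch: a moving-average smoother and a filter that alternately softens and boosts amplitude depending on position around the band. The frame format and gain values are assumptions for illustration only.

    # Hypothetical playback filters; frame format (actuator, amplitude, ms) and gains are invented.
    def smooth_amplitudes(frames, window=3):
        """Moving-average smoothing over the amplitude component of each frame."""
        amps = [a for _, a, _ in frames]
        out = []
        for i, (actuator, _a, ms) in enumerate(frames):
            lo, hi = max(0, i - window // 2), min(len(amps), i + window // 2 + 1)
            out.append((actuator, sum(amps[lo:hi]) / (hi - lo), ms))
        return out

    def fade_around_band(frames):
        """Alternately fade and intensify frames depending on actuator position around the band."""
        out = []
        for actuator, amplitude, ms in frames:
            gain = 0.5 if actuator % 2 == 0 else 1.2
            out.append((actuator, min(1.0, amplitude * gain), ms))
        return out

    if __name__ == "__main__":
        pattern = [(0, 0.2, 50), (1, 0.9, 50), (2, 0.4, 50), (3, 1.0, 50)]
        print(fade_around_band(smooth_amplitudes(pattern)))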
Figure 38 shows an example illustration 3800 of a haptic recording according to an embodiment. In one embodiment, example illustration 3800 shows an example haptic recording of force over time. In one embodiment, other variables can be employed to allow creation of customized haptic patterns. In one embodiment, illustration 3800 shows a simplified haptic recording in which the haptic value depends only on force, but the recording could also be a complex mixture of frequency, amplitude, and position. In one embodiment, the haptic recording can also be filtered with different filters that enhance or creatively alter characteristics of the signal.
Figure 39 shows an example 3900 of a single-axis force sensor 3910 of a wearable device 3920 (similar to wearable device 140 or 3510 of Figs. 2, 26, 32, 35) for recording haptic input 3930 according to an embodiment. In one embodiment, the touch sensor 3910 may identify a single type of input, for instance, the force of a finger 3940 on the sensor. In an example embodiment, with a single haptic input 3930, the haptic recording can be shown as a plot of force over time similar to illustration 3800 of Figure 38.
Figure 40 shows an example 4000 of a touch screen 4020 for haptic input on a wearable device 4010 (similar to wearable device 140 of Figs. 2, 26, 32, wearable device 3510 of Fig. 35, or wearable device 3920 of Fig. 39) according to an embodiment. In one embodiment, various ways are employed to identify haptic input. In an example embodiment, one type of identified haptic input can be the force 4030 of the user's finger on the sensor. In an example embodiment, another type of haptic input 4040 can use both the touch screen 4020 and the force 4030 on the sensor. For this haptic input, in addition to the force 4030, the x and y positions on the touch screen 4020 can be identified. This can allow a freeform approach in which an algorithm can take the positions and synthesize a haptic signal. In an example embodiment, a third type of haptic input 4050 can be performed using only a GUI on the touch screen 4020. This input type can include using GUI buttons displayed for different signals, tones, or effects. In one embodiment, the GUI can include additional combinations of buttons and a trackpad mixed together for haptic input.
Figure 41 shows an example block diagram of a wearable device 140 system 4100 according to an embodiment. In one embodiment, the touch screen 2630, force sensor 4110, and haptic array 4130 can perform the functions described above. In one embodiment, the communication interface module 2640 can connect with other devices through various communication interface methods, for instance, allowing data to be transmitted or received over NFC, WiFi, cellular, etc. In one embodiment, the haptic pattern module 4120 can control the initiation of haptic input, the playback of recordings, and the haptic input on the haptic array 4130. In an example embodiment, the haptic pattern module 4120 can also perform the processing of the recorded input as described above. In this embodiment, the haptic pattern module 4120 can include algorithms for creatively synthesizing haptic signals, converting position and force into haptic signals played around the wearable device 140. In one embodiment, the haptic pattern module 4120 can also send haptic patterns to other devices through the communication interface module 2640, or receive haptic patterns to play on the wearable device 140.
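For the send/receive role of such a module, one simple, hypothetical approach is to serialize a pattern into a small message and validate it on arrival before playback. The JSON wire format and the PatternModule class below are assumptions, not the disclosed protocol.

    # Hypothetical serialization and validation of haptic patterns for transfer between devices.
    import json

    class PatternModule:
        def __init__(self, num_actuators=8):
            self.num_actuators = num_actuators

        def serialize(self, frames):
            """Pack (actuator, amplitude, ms) frames into a small JSON message for transmission."""
            return json.dumps({"version": 1, "frames": frames})

        def deserialize(self, message):
            """Unpack and sanity-check an incoming pattern before it is played on the haptic array."""
            data = json.loads(message)
            frames = []
            for actuator, amplitude, ms in data["frames"]:
                if 0 <= actuator < self.num_actuators and 0.0 <= amplitude <= 1.0 and ms > 0:
                    frames.append((actuator, amplitude, ms))
            return frames

    if __name__ == "__main__":
        module = PatternModule()
        wire = module.serialize([(0, 0.8, 100), (4, 1.0, 100)])
        print(module.deserialize(wire))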
Figure 42 shows a block diagram of a process 4200 for contextualizing and presenting user data according to an embodiment. In one embodiment, in block 4210, the process includes collecting information including service action data and sensor data from one or more electronic devices. Block 4220 provides organizing the information based on a time associated with the collected information. In block 4230, based on one or more of user context and user activity, one or more of content information of potential interest and service information are provided to the one or more electronic devices.
In one embodiment, process 4200 can include filtering the organized information based on one or more selected filters. In one example, the user context is determined based on one or more of location information, movement information, and user activity. The organized information can be presented in a particular chronological order on a graphical timeline. In an example embodiment, providing the one or more of content and services of potential interest to the one or more electronic devices includes providing one or more of alerts, suggestions, events, and communications.
In one example, the content information and service information are subscribable by the user for the one or more electronic devices. In one embodiment, the organized information is dynamically delivered to the one or more electronic devices. In one example, based on a user action, the service action data, sensor data, and content can be captured as a tagged event. The sensor data and service action data from the one or more electronic devices can be provided to one or more of a cloud-based system and a network system for determining the user context. In one embodiment, the user context is provided for one or more of controlling notifications on the one or more electronic devices and mode activation of the one or more electronic devices.
In one example, the organized information is provided continuously and includes life event information collected on the timeline. The life event information can be stored on one or more of a cloud-based system, a network system, and the one or more electronic devices. In one embodiment, the one or more electronic devices include a mobile electronic device, and the mobile electronic device includes one or more of: a mobile phone, a wearable computing device, a tablet device, and a mobile computing device.
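The collect, organize-by-time, filter, and suggest steps of blocks 4210-4230 can be sketched as a small pipeline over event records. The sketch below is illustrative only; the event fields and the toy suggestion rule are assumptions and not the disclosed analysis.

    # Hypothetical contextualization pipeline; event fields and the suggestion rule are invented.
    from datetime import datetime

    def collect(events):
        """events: list of dicts with 'time', 'kind' ('service' or 'sensor'), and 'data'."""
        return list(events)

    def organize_by_time(events):
        return sorted(events, key=lambda e: e["time"])

    def apply_filters(events, kinds=None):
        return [e for e in events if kinds is None or e["kind"] in kinds]

    def suggest(events):
        """Toy rule: if recent sensor data shows many steps, suggest fitness-related content."""
        steps = sum(e["data"].get("steps", 0) for e in events if e["kind"] == "sensor")
        return ["workout summary card"] if steps > 1000 else ["news digest card"]

    if __name__ == "__main__":
        raw = [
            {"time": datetime(2014, 10, 10, 9, 0), "kind": "sensor", "data": {"steps": 1500}},
            {"time": datetime(2014, 10, 10, 8, 0), "kind": "service", "data": {"app": "email"}},
        ]
        timeline = organize_by_time(collect(raw))
        print([e["time"].isoformat() for e in timeline])
        print(suggest(apply_filters(timeline, kinds={"sensor"})))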
Figure 43 is a high-level block diagram showing an information processing system comprising a computing system 500 implementing one or more embodiments. The system 500 includes one or more processors 511 (e.g., ASIC, CPU, etc.), and can also include an electronic display device 512 (for displaying graphics, text, and other data), main memory 513 (e.g., random access memory (RAM), cache devices, etc.), a storage device 514 (e.g., a hard disk drive), a removable storage device 515 (e.g., a removable storage drive, removable memory module, magnetic tape drive, optical disk drive, or computer-readable medium having computer software and/or data stored therein), a user interface device 516 (e.g., keyboard, touch screen, keypad, pointing device), and a communication interface 517 (e.g., modem, wireless transceiver (such as Wi-Fi or cellular), network interface (such as an Ethernet card), communications port, or PCMCIA slot and card).
The communication interface 517 allows software and data to be transferred between the computer system and external devices via the Internet 550, a mobile electronic device 551, a server 552, a network 553, etc. The system 500 further includes a communications infrastructure 518 (e.g., a communication bus, crossbar, or network) to which the aforementioned devices/modules 511 through 517 are connected.
Information transferred via communication interface 517 may be in the form of signals such as electronic, electromagnetic, optical, or other signals capable of being received by communication interface 517 via a communication link that carries the signals, and may be implemented using wire or cable, fiber optics, a telephone line, a cellular phone link, a radio frequency (RF) link, and/or other communication channels.
In an implementation of the one or more embodiments in a mobile wireless device (such as a mobile phone, smartphone, tablet, mobile computing device, wearable device, etc.), the system 500 further includes an image capture device 520, such as the camera 128 (Fig. 2), and an audio capture device 519, such as the microphone 122 (Fig. 2). The system 500 may also include application modules such as an MMS module 521, an SMS module 522, an e-mail module 523, a social network interface (SNI) module 524, an audio/video (AV) player 525, a web browser 526, an image capture module 527, etc.
In one embodiment, the system 500 includes a life data module 530 and the components of the block diagram 100 (Fig. 2), where the life data module 530 may implement processing similar to that of the timeline system 300 described with respect to Fig. 3. In one embodiment, the life data module 530 may implement the systems 300 (Fig. 3), 400 (Fig. 4), 1400 (Fig. 14), 1800 (Fig. 18), 3200 (Fig. 32), 3500 (Fig. 35), and 4100 (Fig. 41) and the flowcharts 1500 (Fig. 15), 1600 (Fig. 16), 2500 (Fig. 25), 2900 (Fig. 29), 3300 (Fig. 33), 3400 (Fig. 34), and 3600 (Fig. 36). In one embodiment, the life data module 530 and an operating system 529 may be implemented as executable code residing in the memory of the system 500. In another embodiment, the life data module 530 may be provided in hardware, firmware, etc.
As is known to those skilled in the art, the exemplary architecture described above can be implemented in many ways, such as program instructions for execution by a processor, as software modules, microcode, as a computer program product on computer-readable media, as analog/logic circuits, as application-specific integrated circuits, as firmware, as consumer electronic devices, AV devices, wireless/wired transmitters, wireless/wired receivers, networks, multimedia devices, etc. Further, embodiments of the described architecture can take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment containing both hardware and software elements.
One or more embodiments have been described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to one or more embodiments. Each block of such illustrations/diagrams, or combinations thereof, can be implemented by computer program instructions. The computer program instructions, when provided to a processor, produce a machine such that the instructions, which execute via the processor, create means for implementing the functions/operations specified in the flowchart and/or block diagram. Each block in the flowchart/block diagrams may represent a hardware and/or software module or logic implementing one or more embodiments. In alternative implementations, the functions noted in the blocks may occur out of the order noted in the figures, may occur concurrently, etc.
The terms "computer program medium", "computer usable medium", "computer readable medium", and "computer program product" are used to generally refer to media such as main memory, secondary memory, a removable storage drive, and a hard disk installed in a hard disk drive. These computer program products are means for providing software to the computer system. The computer readable medium allows the computer system to read data, instructions, messages or message packets, and other computer readable information from the computer readable medium. The computer readable medium, for example, may include non-volatile memory, such as a floppy disk, ROM, flash memory, disk drive memory, a CD-ROM, and other permanent storage. It is useful, for example, for transporting information, such as data and computer instructions, between computer systems. Computer program instructions may be stored in a computer readable medium that can direct a computer, other programmable data processing apparatus, or other devices to function in a particular manner, such that the instructions stored in the computer readable medium produce an article of manufacture including instructions that implement the function/act specified in the flowchart and/or block diagram block or blocks.
Computer program instructions representing the block diagrams and/or flowcharts herein may be loaded onto a computer, programmable data processing apparatus, or processing devices to cause a series of operations to be performed thereon to produce a computer-implemented process. Computer programs (i.e., computer control logic) are stored in main memory and/or secondary memory. Computer programs may also be received via the communication interface. Such computer programs, when executed, enable the computer system to perform the features of the embodiments as discussed herein. In particular, the computer programs, when executed, enable the processor and/or multi-core processor to perform the features of the computer system. Such computer programs represent controllers of the computer system. A computer program product comprises a tangible storage medium readable by the computer system and storing instructions for execution by the computer system to perform a method of one or more embodiments.
Although the embodiments have been described with reference to certain versions thereof, other versions are possible. Therefore, the spirit and scope of the appended claims should not be limited to the description of the preferred versions contained herein.

Claims (32)

1. A method for contextualizing and presenting user data, comprising:
collecting information including service action data and sensor data from one or more electronic devices;
organizing the information based on a time associated with the collected information; and
providing, based on one or more of user context and user activity, one or more of content information of potential interest and service information to the one or more electronic devices.
2. The method of claim 1, further comprising:
filtering the organized information based on one or more selected filters.
3. The method of claim 2, wherein the user context is determined based on one or more of location information, movement information, and user activity.
4. The method of claim 3, wherein the organized information is presented in a particular chronological order on a graphical timeline.
5. The method of claim 3, wherein providing the one or more of content and services of potential interest to the one or more electronic devices comprises providing one or more of alerts, suggestions, events, and communications.
6. The method of claim 5, wherein the content information and the service information are subscribable by a user for the one or more electronic devices.
7. The method of claim 5, wherein the organized information is dynamically delivered to the one or more electronic devices.
8. The method of claim 1, wherein the service action data, the sensor data, and content are captured as a tagged event based on a user action.
9. The method of claim 1, wherein the sensor data and the service action data from the one or more electronic devices are provided to one or more of a cloud-based system and a network system for determining the user context, and wherein the user context is provided for one or more of controlling notifications on the one or more electronic devices and mode activation of the one or more electronic devices.
10. The method of claim 1, wherein the organized information is provided continuously and includes life event information collected on a timeline, and wherein the life event information is stored on one or more of: a cloud-based system, a network system, and the one or more electronic devices.
11. The method of claim 1, wherein the one or more electronic devices comprise a mobile electronic device, and the mobile electronic device comprises one or more of: a mobile phone, a wearable computing device, a tablet device, and a mobile computing device.
12. A system, comprising:
an activity module configured to collect information including service action data and sensor data;
an organization module configured to organize the information based on a time associated with the collected information; and
an information analyzer module configured to provide, based on one or more of user context and user activity, one or more of content information of potential interest and service information to one or more electronic devices.
13. The system of claim 12, wherein the organization module provides filtering of the organized information based on one or more selected filters.
14. The system of claim 13, wherein:
the user context is determined by the information analyzer module based on one or more of location information, movement information, and user activity; and
the organized information is presented in a particular chronological order on a graphical timeline on the one or more electronic devices.
15. The system of claim 14, wherein the one or more of content information of potential interest and service information comprise one or more of: alerts, suggestions, events, and communications.
16. The system of claim 15, wherein the content information and the service information are subscribable by a user for the one or more electronic devices.
17. The system of claim 12, wherein the one or more electronic devices include a plurality of haptic units for providing haptic signals.
18. The system of claim 12, wherein the service action data, the sensor data, and content are captured as a tagged event in response to a recognized user action received on the one or more electronic devices.
19. The system of claim 12, wherein the sensor data and the service action data from the one or more electronic devices are provided to the information analyzer module running on one or more of a cloud-based system and a network system for determining the user context, and wherein the user context is provided for one or more of controlling notifications on the one or more electronic devices and mode activation of the one or more electronic devices.
20. The system of claim 12, wherein the organized information is presented continuously and includes life event information collected on a timeline, and wherein the life event information is stored on one or more of: a cloud-based system, a network system, and the one or more electronic devices.
21. The system of claim 12, wherein the one or more electronic devices comprise a mobile electronic device, and the mobile electronic device comprises one or more of: a mobile phone, a wearable computing device, a tablet device, and a mobile computing device.
22. A non-transitory computer-readable medium having instructions that, when executed on a computer, perform a method comprising:
collecting information including service action data and sensor data from one or more electronic devices;
organizing the information based on a time associated with the collected information; and
providing, based on one or more of user context and user activity, one or more of content information of potential interest and service information to the one or more electronic devices.
23. The medium of claim 22, the method further comprising:
filtering the organized information based on one or more selected filters;
wherein the user context is determined based on one or more of location information, movement information, and user activity.
24. The medium of claim 23, wherein:
the organized information is presented in a particular chronological order on a graphical timeline; and
providing the one or more of content and service information of potential interest to the one or more electronic devices comprises providing one or more of alerts, suggestions, events, and communications.
25. The medium of claim 24, wherein:
the content information and the service information are subscribable by a user for the one or more electronic devices;
the organized information is dynamically delivered to the one or more electronic devices; and
the service action data, the sensor data, and content are captured as a tagged event based on a user action.
26. The medium of claim 22, wherein the sensor data and the service action data from the one or more electronic devices are provided to one or more of a cloud-based system and a network system for determining the user context, and wherein the user context is provided for one or more of controlling notifications on the one or more electronic devices and mode activation of the one or more electronic devices.
27. The medium of claim 22, wherein the organized information is presented continuously and includes life event information collected on a timeline, and wherein the life event information is stored on one or more of: a cloud-based system, a network system, and the one or more electronic devices.
28. The medium of claim 22, wherein the one or more electronic devices comprise a mobile electronic device, and the mobile electronic device comprises one or more of: a mobile phone, a wearable computing device, a tablet device, and a mobile computing device.
29. A graphical user interface (GUI) displayed on a display of an electronic device, comprising:
one or more timeline events related to information including service action data and sensor data collected from at least the electronic device; and
one or more of content information of potential interest to a user and a selectable service type, based on one or more of user context and user activity associated with the one or more timeline events.
30. The GUI of claim 29, wherein:
one or more icons are selectable for displaying one or more categories associated with the one or more timeline events; and
one or more of suggested content information and service information of interest to the user are provided on the GUI.
31. A display architecture for an electronic device, comprising:
a timeline including a plurality of time-based units and one or more content units of potential user interest,
wherein the plurality of time-based units include one or more of event information, communication information, and contextual alert information, and the plurality of time-based units are displayed in a particular chronological order, and
wherein the plurality of time-based units are expandable to provide expanded information based on a received recognized user action.
32. A wearable electronic device, comprising:
a processor;
a memory coupled to the processor;
a curved display; and
one or more sensors that provide sensor data to an analyzer module, wherein the analyzer module determines contextual information and, using additional contextual information determined based on the sensor data and one or more of service action data and additional sensor data received from a paired host electronic device, provides one or more of content information of potential interest and service information to a timeline module of the wearable electronic device, and wherein the timeline module organizes a content timeline interface on the curved display.
CN201480057095.9A 2013-10-17 2014-10-10 Contextualizing sensor, service and device data with mobile devices Pending CN105637448A (en)

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
US201361892037P 2013-10-17 2013-10-17
US61/892,037 2013-10-17
US14/449,091 2014-07-31
US14/449,091 US20150046828A1 (en) 2013-08-08 2014-07-31 Contextualizing sensor, service and device data with mobile devices
PCT/KR2014/009517 WO2015056928A1 (en) 2013-10-17 2014-10-10 Contextualizing sensor, service and device data with mobile devices

Publications (1)

Publication Number Publication Date
CN105637448A true CN105637448A (en) 2016-06-01

Family

ID=52828317

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201480057095.9A Pending CN105637448A (en) 2013-10-17 2014-10-10 Contextualizing sensor, service and device data with mobile devices

Country Status (4)

Country Link
EP (1) EP3058437A4 (en)
KR (1) KR101817661B1 (en)
CN (1) CN105637448A (en)
WO (1) WO2015056928A1 (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102403062B1 (en) * 2015-05-13 2022-05-27 삼성전자주식회사 Device and method for performing communication service
CN109952610B (en) * 2016-11-07 2021-01-08 斯纳普公司 Selective identification and ordering of image modifiers
US9805306B1 (en) 2016-11-23 2017-10-31 Accenture Global Solutions Limited Cognitive robotics analyzer
US11395254B2 (en) * 2018-01-16 2022-07-19 Maarten Van Laere Cellular alerting add-on

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7606582B2 (en) * 2005-12-13 2009-10-20 Yahoo! Inc. System and method for populating a geo-coding database
EP2486532A4 (en) * 2009-10-05 2013-08-21 Callspace Inc Contextualized telephony message management
US8909623B2 (en) * 2010-06-29 2014-12-09 Demand Media, Inc. System and method for evaluating search queries to identify titles for content production
US20120326873A1 (en) * 2011-06-10 2012-12-27 Aliphcom Activity attainment method and apparatus for a wellness application using data from a data-capable band
US20130198694A1 (en) * 2011-06-10 2013-08-01 Aliphcom Determinative processes for wearable devices

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1664819A (en) * 2004-03-02 2005-09-07 微软公司 Principles and methods for personalizing newsfeeds via an analysis of information dynamics
CN103154954A (en) * 2010-08-09 2013-06-12 耐克国际有限公司 Monitoring fitness using a mobile device
CN103180017A (en) * 2010-08-11 2013-06-26 耐克国际有限公司 Athletic activity user experience and environment
US20120284256A1 (en) * 2011-05-06 2012-11-08 Microsoft Corporation Location-aware application searching
CN102880672A (en) * 2011-09-09 2013-01-16 微软公司 Adaptive recommendation system
US20130073995A1 (en) * 2011-09-21 2013-03-21 Serkan Piantino Selecting Social Networking System User Information for Display Via a Timeline Interface
CN103078885A (en) * 2011-10-31 2013-05-01 李宗诚 ICT (Information and Communications Technology) network butt-joint technology of user terminal market configuration system of internet
CN103188274A (en) * 2011-11-02 2013-07-03 李宗诚 ICT network connecting technology for Internet user terminal coordination configuration system
CN103164509A (en) * 2011-12-19 2013-06-19 吉菲斯股份有限公司 Computer-implemented method for selectively displaying content to a user of a social network, computer system and computer readable medium thereof
CN102982135A (en) * 2012-11-16 2013-03-20 北京百度网讯科技有限公司 Method and device used for providing presented information

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109891855A * 2016-08-22 2019-06-14 西安姆贝拉有限公司 Method and apparatus for sensor and/or actuator data processing on a server
CN106843391A * 2017-01-13 2017-06-13 深圳市合智智能科技有限公司 Intelligent haptic wearable system based on multidimensional sensing
CN110291422A * 2017-02-22 2019-09-27 索尼公司 Information processing device, information processing method, and program
CN110291422B * 2017-02-22 2023-11-24 索尼公司 Information processing device, information processing method, and program
CN110463192A * 2017-04-10 2019-11-15 开利公司 Monitoring station with synchronized playback of detected events
CN113505157A * 2021-07-08 2021-10-15 深圳市研强物联技术有限公司 IoT cloud-based wearable device pairing method and system
CN113505157B * 2021-07-08 2023-10-20 深圳市研强物联技术有限公司 Wearable device pairing method and system based on an internet of things (IoT) cloud

Also Published As

Publication number Publication date
EP3058437A4 (en) 2017-06-07
EP3058437A1 (en) 2016-08-24
KR20160058158A (en) 2016-05-24
WO2015056928A1 (en) 2015-04-23
KR101817661B1 (en) 2018-02-21

Similar Documents

Publication Publication Date Title
CN105637448A Contextualizing sensor, service and device data with mobile devices
US20150046828A1 Contextualizing sensor, service and device data with mobile devices
CN105025173B Know profile switching on mobile computing device
US9568891B2 Multi-media wireless watch
US11361016B2 System for providing life log service and method of providing the service
CN106599722B Intelligent terminal, and application permission control method, device and server therefor
CN103218387B Method and apparatus for integrated management of content in a portable terminal
CN104272709B Method and apparatus for determining inferred context
CN108093126A Intelligent digital assistant for refusing incoming calls
CN107430724A Activity-triggered
EP2811400B1 Method for executing program and electronic device thereof
CN107924311A Customized computing experience based on contextual signals
CN107005612A Digital assistant alert system
CN106775615A Method and apparatus for notification message management
CN107077292A Clipboard information providing method and device
CN110147467A Text description generation method and device, mobile terminal, and storage medium
CN105409197A Apparatus and methods for providing a persistent companion device
CN109683714A Multimedia resource management method, apparatus and storage medium
KR102384311B1 Device for managing user information based on images and method thereof
WO2019119325A1 Control method and device
CN104035995A Method and device for generating group tags
KR20160035564A Data processing method for an electronic device and electronic device for performing the same
CN108959320A Method and apparatus for previewing video search results
CN108334651A Method, apparatus and storage medium for collecting user-side data to fulfill preset needs
CN108476258A Method for controlling an object with an electronic device, and electronic device

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20160601