KR20170096774A - Activity-centric contextual modes of operation for electronic devices - Google Patents

Activity-centric contextual modes of operation for electronic devices

Info

Publication number
KR20170096774A
KR20170096774A (application KR1020160018446A)
Authority
KR
South Korea
Prior art keywords
user
behavior
electronic device
user behavior
action
Prior art date
Application number
KR1020160018446A
Other languages
Korean (ko)
Inventor
이준영
Original Assignee
이준영
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 이준영 filed Critical 이준영
Priority to KR1020160018446A priority Critical patent/KR20170096774A/en
Publication of KR20170096774A publication Critical patent/KR20170096774A/en


Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING; COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 — Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 — Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/017 — Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G06F 3/048 — Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 — Interaction techniques based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/04817 — Interaction techniques using icons
    • G06F 3/0482 — Interaction with lists of selectable items, e.g. menus
    • G06F 3/0484 — Interaction techniques for the control of specific functions or operations, e.g. selecting or manipulating an object or an image, setting a parameter value or selecting a range
    • G06F 3/0487 — Interaction techniques using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/16 — Sound input; Sound output
    • G06K — RECOGNITION OF DATA; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K 7/00 — Methods or arrangements for sensing record carriers, e.g. for reading patterns
    • G06K 7/10 — Sensing by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation
    • G06K 7/10009 — Sensing by radiation using wavelengths larger than 0.1 mm, e.g. radio-waves or microwaves
    • G06K 7/14 — Sensing using light without selection of wavelength, e.g. sensing reflected white light
    • G06K 7/1404 — Methods for optical code recognition
    • G06K 7/1408 — Methods specifically adapted for the type of code
    • G06K 7/1417 — 2D bar codes
    • H — ELECTRICITY
    • H04 — ELECTRIC COMMUNICATION TECHNIQUE
    • H04L — TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 29/00 — Arrangements, apparatus, circuits or systems, not covered by a single one of groups H04L1/00 - H04L27/00
    • H04L 29/02 — Communication control; Communication processing
    • H04L 29/06 — Communication control; Communication processing characterised by a protocol
    • H04L 29/08 — Transmission control procedure, e.g. data link level control procedure
    • H04W — WIRELESS COMMUNICATION NETWORKS
    • H04W 4/00 — Services specially adapted for wireless communication networks; Facilities therefor
    • H04W 4/02 — Services making use of location information

Abstract

The present invention relates to user-behavior-based context-aware operating modes for electronic devices. Observation of how people use electronic devices shows that users change the functions of their devices whenever their behavior changes. For example, when a user wants to sleep, the user turns off the TV and the lights, sets an alarm, and puts the smartphone into vibrate or silent mode. All of these functional changes are tied to the single user behavior "sleeping", yet carrying them out requires many clicks, pinches, and swipes on buttons, touch screens, and other interfaces, which is a cumbersome task for many users. The system and method of the present invention reduce this user-interface complexity: when a new user behavior is detected, the required functional changes are applied automatically according to an operating mode predefined for that behavior. The present invention thus provides a user interface and user experience that are simple and intuitive, suited to modern, feature-rich electronic devices.

Description

[0001] The present invention relates to electronic devices, and more particularly to a system and method that provide a new concept of electronic device control and a new user interface method, in order to resolve the complexity caused by the growing number of functions in electronic devices.

Today, mobile devices and computers have an app-centric user interface: the screen consists of app icons and function buttons. For example, to listen to music, the user taps the audio app icon and then taps further buttons to select the desired playlist and start playback. This style of user interface, "clicking app icons and buttons", was introduced by Apple in 1984 with the Macintosh 128K, together with the graphical user interface (GUI) and the mouse, and has been the typical computing environment ever since.

Groundbreaking technologies such as the touch screen and touch-gesture recognition introduced with Apple's iPhone (launched in 2007) have been added to this paradigm since 1984, but the basics of "clicking app icons and buttons" have not changed. New techniques such as taps, swipes, and pinches have been developed, but all of them are merely ways to make "clicking app icons and buttons" easier. Likewise, the representative user interface of a general electronic appliance is a power switch plus function buttons, which can be described as "clicking the power switch and buttons". The typical user interfaces of mobile devices and general electronic devices therefore both amount to "clicking power and function buttons".

With the recent development of smart devices, computing and networking functions have been added to many electronic devices, so that beyond the functions of each individual device, they participate in "collective" functions with other devices connected to the network. For example, smartphones are linked to home appliances, home automation systems, automotive electronics, and other networked electronic devices. While this trend provides more functionality, it increases the complexity of the user interface. In most cases, different systems and methods are used to interact with and control the other electronic devices, requiring more icons, switches, and buttons, which increases complexity and erodes the value of interconnection and control.

The existing user interface can be described as "function-based": the user continuously controls the functions of mobile devices and electronic devices according to his or her needs by clicking the power switch and buttons. However, the functional complexity of a single electronic device, let alone several electronic devices connected to a network, exposes the limitations of a function-oriented user interface. Speech recognition and gesture recognition are gaining popularity thanks to technological advances and ease of use, but they too remain convenience features layered on top of "clicking the power switch and buttons".

In the past, increased functionality has meant increased complexity, and a new user interface has been the usual solution. It is now time for a new user interface that goes beyond "clicking app icons and buttons" and "clicking the power switch and buttons". The present invention improves the user interface system and method applied to mobile devices and general electronic devices, providing a user experience that resolves their functional complexity.

A new user interface is required to resolve the functional complexity of mobile devices and electronic devices. Complexity requires simplification, and simplification requires a common denominator. The present invention seeks to simplify the functional complexity of electronic devices by applying context, more precisely the user's context (user context), as that common denominator.

Context here means the circumstances or situation surrounding the user. Until now, user context has largely been understood as location awareness: as mobile devices spread, the user's location information became the most common form of user context.

The present invention uses the user's behavior and/or the user's intention to take a behavior as the user context, and accordingly defines this context as user-behavior-based. (Hereinafter, the user's behavior and/or the intention to take that behavior are abbreviated as the user's behavior.) The present invention uses the user's behavior as the common denominator for simplifying the functional complexity of electronic devices. This approach assumes that changes in the functions of electronic devices accompany changes in user behavior. For example, while cooking, a user may want to use the smartphone hands-free, turn off all lights except the kitchen light, and listen to classical music; while watching television, the same user may want the smartphone in vibrate mode, the living-room lighting dimmed, and the music stopped. Whenever the user performs a different behavior, the user wants different functions or responses from the electronic devices to match that change in behavior.

In other words, complex functional changes in electronic devices arise from changes in user behavior. This one-to-many relationship, one behavior change mapping to many functional changes, is the key to simplifying complexity. The one-to-many relationship between a user's behavior and the functions of electronic devices is defined as an "operating mode". Each operating mode contains the functional changes of the electronic devices required for the corresponding change in user behavior. Thus, in the present invention, each user behavior has an associated operating mode (defined herein as the user-behavior-based context-aware operating mode of the electronic device).

In the present invention, to use a user's behavior as the user context (the user-behavior-based context defined herein), information about the behavior is collected following the journalistic 5W1H principle (who, what, when, where, why, and how). The detailed elements of a user behavior can therefore include all or some of the user or user group (who), the object (what), the time (when), the location (where), the intent or reason (why), and the manner (how), and can be defined using data structures. A standardized data structure may be needed to collect, store, access, and communicate information about user behavior within or between electronic devices.
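By way of illustration only (the patent does not prescribe concrete field names or types), such a standardized user-behavior record could be sketched as a Python dataclass; all names here are hypothetical:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class UserBehavior:
    """Standardized record of a user behavior (5W1H elements).

    Illustrative sketch only: the patent text only requires that all
    or some of these elements be representable in a data structure.
    """
    who: str                    # user or user group
    what: str                   # the behavior itself, e.g. "jogging"
    when: Optional[str] = None  # time or time range
    where: Optional[str] = None # location
    why: Optional[str] = None   # intent/reason for the behavior
    how: Optional[str] = None   # manner or means

# A behavior as it might be detected or entered by the user:
jogging = UserBehavior(who="user-1", what="jogging",
                       where="park", why="exercise")
```

Because every element except `what` is optional, the same structure can hold both a fully specified behavior and a sparsely sensed one, which matches the "all or some" wording above.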

When user behavior is defined as the user-behavior-based context in this way, the functional changes of electronic devices corresponding to a user behavior are defined as the "user-behavior-based context-aware operating mode" of the present invention. Operating modes may include functions and settings for both software and hardware. For example, a smartphone's operating mode may include a playlist and volume setting for the audio app, or a ringtone setting of the smartphone itself; the operating mode of a smart home's lighting control system may include a brightness setting. Thus, when the user's behavior changes from "cooking" to "watching TV", the smartphone automatically switches to vibrate mode and stops the audio app, and the lighting control system dims the living-room lighting.
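As a hedged sketch of the "watching TV" example above (device names, setting keys, and values are all hypothetical), an operating mode can be represented as a mapping from devices to the settings they should assume:

```python
# Hypothetical operating mode for the user behavior "watching TV":
# one behavior fans out to setting changes on several devices.
WATCHING_TV_MODE = {
    "smartphone": {"ring_mode": "vibrate", "audio_app": "stop"},
    "lighting":   {"living_room_brightness": 0.3},
}

def apply_mode(mode):
    """Flatten a mode into the (device, setting, value) changes it implies."""
    changes = []
    for device, settings in mode.items():
        for setting, value in settings.items():
            changes.append((device, setting, value))
    return changes

changes = apply_mode(WATCHING_TV_MODE)
```

A real implementation would dispatch each change to the device's driver or network API; here the flattened list simply makes the one-to-many fan-out explicit.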

The user may configure in advance, for each user behavior, how the electronic devices should operate in the corresponding context-aware operating mode. Then, each time the user switches to a preset behavior, the electronic devices switch to the preset functions and settings. The present invention therefore simplifies not only the mobile experience but the experience across all electronic devices: users can control smartphones, wearable devices, home appliances, home automation devices, automotive electronics, and other electronic devices simply by changing their behavior.

Electronic devices are becoming increasingly smart, using sensors to monitor their environment and behave differently under different environmental conditions. The present invention treats the user's new behavior and/or the user's intention to take a new behavior as the key environmental condition. The electronic devices of the present invention thus detect a new user behavior, or the intention to take one, and operate differently according to it. For each user behavior, one or more associated user-behavior-based context-aware operating modes are defined, specifying how the electronic devices should behave for that behavior. The present invention also provides a method for implementing such context-aware operating modes across multiple electronic devices on a network: when a new user behavior is detected on one electronic device, that device notifies the other devices on the network, and each of those devices switches to the operating mode it associates with the new user behavior.
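The networked propagation just described can be sketched as a simple broadcast, where the detecting device announces the new behavior and each peer looks up its own associated mode. This is an illustrative minimal model only (class and attribute names are hypothetical; a real system would use a protocol such as WiFi or Bluetooth as listed later for the communication unit):

```python
class Device:
    """A networked device holding its own behavior -> settings modes."""
    def __init__(self, name, modes):
        self.name = name
        self.modes = modes      # behavior -> settings dict for this device
        self.current = None     # settings of the active operating mode

    def on_behavior(self, behavior):
        # Switch to the operating mode associated with the new behavior,
        # if this device defines one; otherwise ignore the announcement.
        if behavior in self.modes:
            self.current = self.modes[behavior]

class Network:
    """Broadcast channel linking the devices."""
    def __init__(self):
        self.devices = []

    def attach(self, device):
        self.devices.append(device)

    def announce(self, behavior):
        # The detecting device informs all devices of the new behavior.
        for d in self.devices:
            d.on_behavior(behavior)

net = Network()
phone = Device("smartphone", {"sleeping": {"ring_mode": "silent"}})
lamp = Device("lighting", {"sleeping": {"power": "off"}})
net.attach(phone)
net.attach(lamp)
net.announce("sleeping")   # e.g. the phone detected the user going to bed
```

Note that each device keeps its own behavior-to-mode table, so only the behavior name travels over the network, not the settings themselves.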

Electronic devices are flooded with functions. This functional complexity ultimately harms the user experience and limits what those devices can achieve. Smartphones, for example, have changed daily life with their vast functionality, yet many people struggle to use it and find them harder to operate than traditional flip phones. The present invention defines systems and methods that simplify user interaction with electronic devices. Once user-behavior-based context-aware operating modes are defined for the different user behaviors, the electronic devices of the present invention can change their functions automatically whenever a new user behavior is detected. User interaction is thereby simplified to the level of selecting a new user behavior, and further to the level at which the devices detect the user behavior automatically. As technologies such as human motion recognition and speech recognition advance, the home automation system of the present invention could, for instance, infer from the user putting on running shoes that the user intends to jog and ask whether the user is ready to do so, so that the user's interaction is simplified to answering "Yes" or "No".

FIG. 1 illustrates an example of a user experience scenario with conventional electronic devices.
FIG. 2 illustrates the one-to-many association between the user behavior "jogging" described in user scenario 100 of FIG. 1 and the functions and settings of electronic devices.
FIG. 3 illustrates the one-to-many association between the user behavior "sleeping" described in user scenario 100 of FIG. 1 and the functions and settings of electronic devices.
FIG. 4 shows an embodiment of an electronic device:
402 input unit
404 output unit
406 control unit
408 graphics module
410 communication bus
412 memory
414 storage
416 communication module
418 user-behavior-based context-aware operating mode control module
FIG. 5 illustrates various embodiments of the electronic device 400.
FIG. 6 illustrates various embodiments of a method of sensing user behavior.
FIG. 7 illustrates a data structure of a user behavior according to an embodiment of the present invention.
FIG. 8 illustrates a data structure of a context-aware operating mode according to an embodiment of the present invention.
FIG. 9 shows the association between user behaviors and context-aware operating modes according to an embodiment of the present invention.
FIG. 10 illustrates a user experience scenario in which user-behavior-based context-aware operating modes are applied according to an embodiment of the present invention.
FIG. 11 shows an architecture embodiment of a conventional smartphone.
FIG. 12 illustrates an architecture of a smartphone according to an embodiment of the present invention.
FIGS. 13A and 13B are a flow diagram (parts 1/2 and 2/2).
FIG. 14 illustrates a home screen according to an exemplary embodiment of the present invention.
FIG. 15 shows an initial screen according to an embodiment of the present invention.
FIG. 16 shows a new user behavior editing screen according to an embodiment of the present invention.
FIG. 17 illustrates an automatic detection editing screen according to an embodiment of the present invention.
FIG. 18 shows an application activation editing screen according to an embodiment of the present invention.
FIG. 19 illustrates a sensor input editing screen according to an embodiment of the present invention.
FIG. 20 illustrates a Bluetooth input editing screen according to an embodiment of the present invention.
FIG. 21 illustrates a user-behavior-based context-aware operating mode editing screen according to an embodiment of the present invention.
FIG. 22 illustrates an application activation editing screen according to an embodiment of the present invention.
FIG. 23 shows a main device setting editing screen.
FIG. 24 illustrates a peripheral editing screen according to an embodiment of the present invention.
FIG. 25 illustrates a peripheral setting editing screen according to an embodiment of the present invention.
FIG. 26 illustrates a new user behavior start confirmation screen according to an embodiment of the present invention.
FIG. 27 shows a setting editing screen according to an embodiment of the present invention.
FIG. 28 illustrates a new user behavior detection confirmation screen according to an embodiment of the present invention.
FIG. 29 shows a new user behavior notification confirmation screen according to an embodiment of the present invention.
FIG. 30 shows a context-aware operating mode member notification screen according to an embodiment of the present invention.

A system and method for user-behavior-based context-aware operating modes of one or more electronic devices is described below with reference to FIGS. 1 to 30.

FIG. 1 shows a user scenario 100 illustrating one embodiment of the user experience with electronic devices when the user jogs and then sleeps. Scenario 100 begins with the intention to take a new behavior, "the user decides to jog", as a first step 102. Through steps 104 and 106, the user launches an app that tracks workouts and selects "jogging" as the workout type. Through steps 108, 110 and 112, the user selects a music playlist suitable for jogging and sets the ring/silent switch so that incoming calls can be heard while jogging. Through step 114, the user turns off all the lights before leaving home. Now the user is ready to jog. The user presses the start button of the workout-tracking app in step 116 and jogs in step 118. In step 120, after jogging, the user presses the end button of the app and saves the tracking information. Through steps 122, 124, 126 and 128, the user enters the house, turns on the lights and the boiler, takes a shower, and turns off the boiler after the shower.

Steps 102 through 128 relate to the user's jogging behavior, during which the user interacts with three different electronic devices: a smartphone, a boiler, and a lighting device. Each of these interactions adjusts the functions and settings of the three devices to suit the user's jogging behavior.

Steps 130 through 136 illustrate adjusting the functions and settings of the electronic devices for the new user behavior "sleeping", which the user takes up after the shower. For the user behavior "sleeping", in step 132 the user turns on the audio, selects a music playlist suitable for listening before going to bed, and sets the audio's timer to turn it off after 30 minutes. Through steps 134 and 136, the user sets the timer of the lighting device to 30 minutes and goes to sleep. In steps 130 through 136 the user interacts with two electronic devices, the lighting device and the audio.

Exemplary scenario 100 illustrates the user experience with conventional electronic devices. It involves two user behaviors, jogging and sleeping; four electronic devices, namely a smartphone, a boiler, a lighting device, and an audio; and the adjustment of various functions and settings of those devices. Whenever the user's behavior changes, the user turns the electronic devices on or off or adjusts their functions and settings to suit the new behavior. As future electronic devices become smarter and more networked, users will enjoy more functionality, but the user interface will grow more complex, since the user must continually adjust functions and settings as behavior changes.

The present invention provides a solution to this user-interface complexity. To resolve it, the present invention uses user behavior as a common denominator for classifying the complex functions and settings of electronic devices, grouping together only those functions and settings needed for the current user behavior. FIGS. 2 and 3 illustrate embodiments of the associations 200 and 300 between electronic devices and the functions and settings related to the user behaviors "jogging" and "sleeping" of exemplary user scenario 100. As shown in FIG. 2, the user behavior "jogging" 202 is related to six functions and settings 206, 208, 210, 212, 216 and 220 of three electronic devices: the smartphone 204, the lighting device 214, and the boiler 218. FIG. 3 shows that the user behavior "sleeping" 302 is related to two functions and settings 304 and 308 of two electronic devices, the lighting device 214 and the audio 306.

As shown in FIGS. 2 and 3, the one-to-many relationship between a user behavior and the electronic devices and their functions and settings is a key element of the present invention. The present invention exploits this one-to-many relationship to resolve complexity and simplify the user experience. When the user behavior "jogging" is detected, the six functions and settings 206, 208, 210, 212, 216 and 220 can be adjusted automatically on the three electronic devices 204, 214 and 218. When the user behavior "sleeping" is detected, the two functions and settings 304 and 308 can be adjusted automatically on the electronic devices 214 and 306.

Note that the present invention adjusts functions and settings because the user wants different operations from the electronic devices as the user's behavior changes, as shown in exemplary scenario 100. Changes in user behavior are therefore the root cause of the many changes in the functions and settings of electronic devices. The present invention provides a system and method for automatically detecting the user's behavior and adjusting the functions and settings of the electronic devices according to the detected behavior. The predefined, coordinated set of functions and settings of the electronic devices is defined as an "operating mode", and each distinct user behavior is associated with its own operating mode. Since the present invention defines user behavior as the user context, the operating mode for each user behavior is called the "user-behavior-based context-aware operating mode". This operating mode contains information about the mode or state the electronic devices should assume when a change in user behavior is detected. Adjustments to functions and settings include turning on or off, and granting access to, one or more functions, apps, and components of the electronic devices.

Functions include input functions (e.g. microphone), output functions (e.g. audio output), communication functions (e.g. Bluetooth), graphics functions (e.g. display brightness control), or combinations of the above. For example, a user-behavior-based context-aware operating mode for a user behavior called "secret meeting" may include turning off the microphone to prevent recording, or turning off the camera function to prevent photography.

The user-behavior-based context-aware operating mode may also change the priority and availability of one or more components of the electronic devices that are accessible to the user. A component may include media elements (e.g. music, videos), communication elements (e.g. email, text messages, contacts), or other components (e.g. favorite links in an Internet browser). For example, a user-behavior-based context-aware operating mode called "at home" may deactivate business-related or corporate-confidential emails, contacts, favorite links, and the like.
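The "at home" example above can be sketched as a component filter (a minimal illustration only; the tag names and mode representation are hypothetical, not taken from the patent):

```python
# Hypothetical "at home" mode: components tagged as work-related or
# confidential are deactivated while this mode is active.
AT_HOME_MODE = {"hidden_tags": {"work", "confidential"}}

def visible_components(components, mode):
    """Keep only the components whose tags the active mode does not hide."""
    hidden = mode["hidden_tags"]
    return [c for c in components if not (c["tags"] & hidden)]

inbox = [
    {"id": "mail-1", "tags": {"work"}},      # hidden at home
    {"id": "mail-2", "tags": {"personal"}},  # still visible
]
at_home_inbox = visible_components(inbox, AT_HOME_MODE)
```

The same filter could be applied to contacts or browser favorites, so one mode definition governs the availability of components across apps.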

FIG. 4 illustrates an electronic device 400 that includes one or more user-behavior-based context-aware operating modes in accordance with an embodiment of the present invention. The electronic device 400 may be a music player, a video player, a game console, a personal computer, a printer, a smartphone, a tablet device, a smart watch, a wearable device, a digital personal assistant, a home automation device, an automotive electronic device, a user interface device such as a kiosk, or a combination of the above. In some cases the electronic device 400 may provide only a simple function such as vacuum cleaning; in other cases it may combine various functions, for example playing music as well as vacuum cleaning.

The electronic device 400 may be portable: it may be carried in a hand, worn on the body, implanted in a human body, or take other forms that the user can use anywhere. The electronic device 400 may also be static and non-portable, like a television or an air conditioner. In addition, the electronic device 400 may be neither portable nor static but movable, as an electronic device included in a moving apparatus such as a car navigation device, a vehicle driving control device, or an airplane seat.

The electronic device 400 may include an input unit 402, an output unit 404, a control unit 406, a graphics unit 408, a communication bus 410, a memory unit 412, a storage unit 414, a communication unit 416, and a user behavior based context recognition control module 418. The input unit 402 may include a touch interface, a GPS sensor, a microphone, a camera, a neural sensor, and other sensing means that can sense human behavior or the intention to take an action. The output unit 404 includes a display, a speaker, and other means of providing information or media to the user. The electronic device 400 may include an operating system or applications. The operating system or applications can control the functions and settings of the electronic device 400 while running on the control unit 406, and may be stored in the memory unit 412 or the storage unit 414. The graphics unit 408 may include a means of providing system, software, or other visual information or media to the user. The communication unit 416 may operate with a communication network such as Wi-Fi, Ethernet, Bluetooth, NFC, infrared, cellular, or other communication protocols, or any combination thereof. The user behavior based context recognition control module 418 may be implemented in software, or in other cases in hardware, firmware, or a combination thereof. The user behavior based context recognition control module 418 may receive information from other components of the electronic device 400, such as the input unit 402, the control unit 406, and the communication unit 416, and utilize it to detect new user behavior.

Systems and methods implementing the user behavior based context aware operating mode may comprise a single device such as the electronic device 400, a single device of another type, or a plurality of devices combining electronic devices 400 or other types of devices. FIG. 5 illustrates, through an example 500, various types of electronic devices corresponding to an embodiment of the user behavior based context aware operating mode. In FIG. 5, devices such as the smartphone 502, the wearable device 506, the home appliance 508, the home automation device 510, the automotive electronic device 512, and the user interface device 514 such as a kiosk are connected through the network 504. The network 504 may be of a wired or wireless type such as Wi-Fi, Bluetooth, Ethernet, TCP/IP, GSM, CDMA, or other communication protocols. The electronic devices of FIG. 5 can communicate with each other to share information on the user's current behavior and the corresponding user behavior based context aware operating modes. Every electronic device in FIG. 5 can detect new user behavior and communicate information about the user behavior, or the corresponding context aware operating mode, to the other electronic devices connected to the network 504. In the example of FIG. 5, the smartphone 502 is called the main device and the other types of electronic devices are called peripheral devices. Any of the electronic devices of the present invention can be a main device or a peripheral device. To facilitate understanding of the user behavior based context aware operating mode across a plurality of electronic devices, the electronic device that the user mainly operates is called the main device, and the remaining electronic devices on the network are called peripheral devices.
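The main-device/peripheral relationship described above can be sketched in code. The following is a minimal illustration, not the patent's implementation; the class and method names (`MainDevice`, `broadcast`, `on_behavior`) and the behavior ID are hypothetical.

```python
# Sketch: a main device notifying networked peripherals of a newly
# detected user behavior so each can apply its own operating mode.

class Peripheral:
    def __init__(self, name):
        self.name = name
        self.current_behavior = None

    def on_behavior(self, behavior_id):
        # Each peripheral would apply its own context aware operating
        # mode here; we only record the notification for illustration.
        self.current_behavior = behavior_id

class MainDevice:
    def __init__(self):
        self.peripherals = []

    def connect(self, peripheral):
        self.peripherals.append(peripheral)

    def broadcast(self, behavior_id):
        # Share the detected user behavior over the network (FIG. 5).
        for peripheral in self.peripherals:
            peripheral.on_behavior(behavior_id)

main = MainDevice()
light = Peripheral("lighting controller")
main.connect(light)
main.broadcast("SPZ002")        # hypothetical ID for "jogging"
print(light.current_behavior)   # SPZ002
```

In a real system the broadcast would travel over Wi-Fi, Bluetooth, or another of the listed protocols; the in-process call here only mirrors the information flow.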

The present invention provides a system and method for detecting a user's behavior as the user context. Detection of user behavior may rely on the user's explicit input or on implicit inference. In other words, the user can manually input the user behavior, or the electronic devices can analyze the information available to them to estimate the user behavior. FIG. 6 lists methods for detecting user behavior in an embodiment 600.

As shown in FIG. 6, user behavior detection 602 is divided into a passive case 604 and an automatic case 612. Passive detection occurs when the user selects the user behavior from a user behavior list as in case 606, when the user performs tagging with NFC, a QR code, RFID, or another method corresponding to a predefined user behavior as in case 608, or when the user operates a particular app corresponding to a predefined user behavior as in case 610. Passive detection involves a direct action expressing the user's intention to take the new user behavior. Conversely, automatic detection estimates user behavior from available implicit information. Automatic detection may occur when the current time implies a specific user behavior, when the user's location implies a specific user behavior as in case 616, when the operation of a specific electronic device implies a specific user behavior as in case 618, or when proximity to a particular user implies a specific user behavior as in case 620.

A user's explicit input method may be an internal input using an input of an electronic device (e.g., a touch screen of a smart phone) or an external input using an input of another networked electronic device (e.g., And an input unit of a wearable device such as a smart watch that has been provided.

Automatic detection can be performed by location-based sensors, tagging methods, calendar entries, or other sensing means (e.g., detecting the "work" behavior by GPS near the office, the "coffee break" behavior by a Starbucks Wi-Fi connection, the "shopping" behavior by a tagging method, the "meeting" behavior at the scheduled time of a calendar entry for the meeting, the "wake-up" behavior at the time of a clock alarm input, and the "driving" behavior by Bluetooth connectivity with the vehicle). As technology continues to evolve, motion analysis, nervous-system analysis, or other forms of detecting user behavior or the intention to act can be utilized to detect user behavior automatically.
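The automatic detection examples above amount to matching implicit context signals against predefined criteria. A hedged sketch follows; the function, the signal keys (`gps_zone`, `wifi_ssid`, `bluetooth`), and the criteria table are hypothetical illustrations of the idea, not the patent's detection logic.

```python
# Sketch: infer a user behavior by matching available context signals
# against per-behavior automatic detection criteria.

def detect_behavior(context, criteria):
    """Return the first behavior whose conditions all match the context."""
    for behavior, conditions in criteria.items():
        if all(context.get(key) == value for key, value in conditions.items()):
            return behavior
    return None  # no criteria matched; stay in the current state

# Hypothetical criteria echoing the examples in the text.
CRITERIA = {
    "work":         {"gps_zone": "office"},
    "coffee break": {"wifi_ssid": "Starbucks"},
    "driving":      {"bluetooth": "MY_CAR"},
}

print(detect_behavior({"bluetooth": "MY_CAR"}, CRITERIA))  # driving
print(detect_behavior({"wifi_ssid": "home"}, CRITERIA))    # None
```

A production system would also weigh multiple simultaneous matches (cf. the mode priority 820 described later), which this sketch omits.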

User behavior detection 602 may be performed at any time during the life cycle of a new user behavior. For example, a new user behavior can be detected before, at the start of, during, at the end of, or after the user behavior.

When a new user behavior is detected, information on the user behavior can be collected by utilizing the six questions known as the 5W1H principle (who, when, where, what, why, how). In other words, when the system and method of the present invention detect a user behavior, the time (when), place (where), user or user group (who), related objects (what), intention (why), and other contextual information (how) of the behavior can be detected and recorded.

Deriving the user context from user behavior using the 5W1H principle is a major advance over existing ways of deriving user context. Since smartphones were introduced, location awareness has been the most common form of user context information. However, as the categories and functions of electronic devices have multiplied, deriving the user's context from location awareness alone has reached its limits. Various technologies such as acceleration sensors, motion recognition, and image analysis have been developed to derive user context beyond location awareness, but they lacked a framework for defining a complete, integrated user context. By defining the user context through user behavior, using the 5W1H principle as a framework, the present invention gathers information about the user's current context and grasps the complete situation, just as the 5W1H principle makes a complete account possible in language.

To detect the time of the user behavior (the "when" of the 5W1H principle), the relative or absolute time of the user's behavior can be measured by a timer or the clock function of the electronic devices. To detect the user's location (the "where"), location-aware technologies such as GPS, Bluetooth, Wi-Fi, NFC tags, or combinations of the above can be used. To detect the user or user group (the "who"), user identification information stored in the electronic devices, NFC tags, RFID chips, bar codes, or biometric technologies such as facial recognition, fingerprint recognition, and iris recognition can be used. The object information (the "what") includes information about all objects related to the user's behavior (for example, for the "jogging" user behavior, what the user is carrying or what the user is eating). The user's intention (the "why") includes explicit input that confirms the user's intent, or assumptions deduced from implicit information: when the user takes the user behavior "shower", the user may explicitly enter the intent "after jogging", or the electronic device may infer the intention "after jogging" from the previous user behavior "jogging". The "how" of the 5W1H principle can include various contextual information about the user or the environment related to the user behavior (e.g., the user's mood or the weather). The system and method of the present invention may use some or all of the 5W1H elements according to the needs of user context awareness.

Profiles or definitions related to user behavior may use a standardized format, such as the data structure 700 of FIG. 7, to store and access information between various electronic devices in memory or storage and to ensure portability and compatibility. The data structure may include a user behavior ID 702, a user behavior name 704, a user behavior description 706, a start time 708, an end time 710, a location 712 in longitude and latitude, a user group 714, an associated object 716, an intention 718, and the like. The data structure 700 may be stored in the electronic device 400 (e.g., in the memory unit 412 or the storage unit 414). In addition, some or all of the data structure 700 may be located in another external system or device that communicates with the electronic device 400.
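The behavior profile of FIG. 7 can be sketched as a simple portable record. The field names below follow the reference numerals in the text, but the class itself and the sample values are hypothetical, offered only to make the structure concrete.

```python
# Sketch of the FIG. 7 user behavior profile as a portable record.
from dataclasses import dataclass, asdict

@dataclass
class UserBehavior:
    behavior_id: str      # user behavior ID 702
    name: str             # user behavior name 704
    description: str      # user behavior description 706
    start_time: str       # start time 708
    end_time: str         # end time 710
    location: tuple       # location 712 (longitude, latitude)
    user_group: str       # user group 714
    associated_object: str  # associated object 716
    intention: str        # intention 718

# Hypothetical instance for the "jogging" behavior of scenario 100.
jogging = UserBehavior("SPZ002", "jogging", "morning run",
                       "06:30", "07:15", (127.0, 37.5),
                       "self", "running shoes", "exercise")
print(asdict(jogging)["name"])   # jogging
```

Converting the record to a dictionary with `asdict` suggests how such a profile could be serialized for exchange between devices, supporting the portability goal stated above.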

When a new user behavior is detected as the user context together with its 5W1H data, changes are applied to the functions and settings of the electronic devices in accordance with the detected user behavior. This change in the functions and settings of electronic devices is defined in the present invention as the "user behavior based context aware operation mode". The "user behavior based context aware operation mode" is the operating mode corresponding to one user behavior defined as the user's context. The context aware operation mode is defined in advance for each user behavior, and the mode corresponding to a detected user behavior is accessed from the memory unit 412 or the storage unit 414 and applied to the electronic devices. FIG. 8 shows an exemplary data structure 800 for the user behavior based context aware operating modes of the scenario 100 shown in FIG. 1. In another embodiment, in addition to specifying the context aware operation mode corresponding to a user behavior, the profile or definition may include information about applications that need to be downloaded or synchronized in order to apply the context aware operation mode to the electronic device 400.

The data structure of the context aware operating mode includes a mode ID 802, a user behavior ID 804, an electronic device ID 806, an electronic device name 808, a mode name 810, a mode description 812, a mode owner 814, a personal or public mode identifier 816, electronic device functions and settings 818, a mode priority 820, and the like. Each mode ID 802 has a corresponding user behavior ID 804; when a new user behavior corresponding to that user behavior ID is detected, the context aware operating mode of the corresponding mode ID 802 is applied to the electronic device corresponding to the electronic device ID 806. For example, when a new user behavior corresponding to the user behavior ID SPZ002 is detected, as in row 822 of FIG. 8, the user behavior based context aware operation mode corresponding to the mode ID STP021 is applied to the smartphone corresponding to the electronic device ID 734066. The mode name 810 and the mode description 812 define the name and description of the user behavior based context aware operation mode. The mode owner 814 and the personal or public mode identifier 816 define the owner of the mode and whether the mode can be shared for public use. The electronic device functions and settings 818 define the functions and settings that need to be changed on the electronic device 808; the change includes turning on, turning off, and granting or restricting access to one or more functions, apps, and components of the electronic device 808. The mode priority 820 defines the priority of the mode relative to other modes.
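Like the behavior profile, the FIG. 8 mode entry can be sketched as a record. The field names mirror the reference numerals above; the class and the sample values (taken loosely from the row 822 example) are hypothetical illustrations, not the patent's storage format.

```python
# Sketch of one FIG. 8 mode entry as a record.
from dataclasses import dataclass

@dataclass
class OperatingMode:
    mode_id: str        # mode ID 802
    behavior_id: str    # user behavior ID 804
    device_id: str      # electronic device ID 806
    device_name: str    # electronic device name 808
    mode_name: str      # mode name 810
    description: str    # mode description 812
    owner: str          # mode owner 814
    public: bool        # personal or public mode identifier 816
    settings: dict      # electronic device functions and settings 818
    priority: int       # mode priority 820

# Hypothetical entry echoing row 822: behavior SPZ002 ("jogging")
# applied to the smartphone with device ID 734066.
mode = OperatingMode("STP021", "SPZ002", "734066", "smartphone",
                     "jogging mode", "smartphone settings for jogging",
                     "user1", False,
                     {"tracking_app": "on", "ringtone_volume": 7}, 1)
print(mode.settings["ringtone_volume"])   # 7
```

The `settings` dictionary stands in for the functions-and-settings column 818; a real implementation would need a richer schema for on/off, access, and app-specific parameters.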

Rows 822, 824, 826, 828, and 830 of FIG. 8 show examples of user behavior based context aware operating modes for the scenario 100 of FIG. 1. In the scenario 100, the user needs to change the functions and settings of the smartphone for the user behavior "jogging". In the present invention, these changes are predefined as the smartphone functions and settings needed for the user behavior "jogging" and are automatically applied to the smartphone when the user behavior "jogging" is detected. As defined in row 822, when the user behavior "jogging" with user behavior ID SPZ002 is detected, the functions and settings defined in column 818 of row 822 are applied automatically: the exercise tracking app is turned on, the playlist for jogging is selected, and the ringtone is turned on at volume level 7. Each row defines one user behavior based context aware operating mode for one electronic device in accordance with the corresponding user behavior. Thus, in the present example, the user behavior "jogging" involves three electronic devices and therefore has three related user behavior based context aware operating modes, as in rows 822, 824, and 826. Likewise, the two user behavior based context aware operating modes of rows 828 and 830, corresponding to the user behavior "sleep", are applied when the user behavior "sleep" is detected.

In another embodiment, a user-specific user behavior based context aware operating mode may be defined for one user behavior. The mode owner 814 identifies the original creator of the mode. The personal or public mode identifier 816 defines whether the user behavior based context aware operation mode is for personal use or for public use. The user may define all or only a portion of a user behavior based context aware operating mode and "publish" it to a network server so that other users can use this specialized mode; conversely, the user can use context aware operating modes that other users have "published".

FIG. 9 shows an embodiment 900 of the correlation between a user behavior and its context aware operation modes for the example of the user behavior "jogging" shown in the scenario 100 of FIG. 1. FIG. 9 shows that the user behavior "jogging" defined in row 722 of FIG. 7 is associated with the three different user behavior based context aware operation modes defined in rows 822, 824, and 826 of FIG. 8. Therefore, when the user behavior "jogging" defined in row 722 is detected, the user behavior based context aware operating modes defined in rows 822, 824, and 826 are applied to the corresponding electronic devices.
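The one-to-many association of FIG. 9 reduces to a lookup: one detected behavior retrieves every stored mode keyed to it. The sketch below is illustrative; the mode IDs and device names are hypothetical stand-ins for the FIG. 8 rows.

```python
# Sketch: retrieving all operating modes associated with one behavior,
# as in the FIG. 9 correlation diagram. Data is hypothetical.

MODES = [  # (mode_id, behavior_id, device)
    ("STP021", "SPZ002", "smartphone"),
    ("STP022", "SPZ002", "lighting controller"),
    ("STP023", "SPZ002", "boiler"),
    ("STP031", "SPZ005", "audio"),
]

def modes_for(behavior_id):
    """All modes to apply when the given behavior is detected."""
    return [m for m in MODES if m[1] == behavior_id]

# "jogging" (SPZ002) involves three devices, hence three modes.
print(len(modes_for("SPZ002")))   # 3
```

This mirrors the text: detecting "jogging" yields the three modes of rows 822, 824, and 826, one per involved device.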

FIG. 10 shows an exemplary embodiment scenario 1000 of the user experience under the user behavior based context aware operation mode of the present invention. The scenario 1000 shows the user experience with electronic devices when the user jogs and sleeps as in the scenario 100 of FIG. 1. The scenario 1000 begins with the user's intent for the new behavior in a first step 1002, "the user decides to jog." Next, instead of activating the exercise tracking app and selecting "jogging" from the exercise type list of the app, as in steps 104 and 106 of FIG. 1, the user simply selects the user behavior "jogging". This is the difference in user interface between the present invention and the conventional manner of use: in the present invention the user's main interaction is selecting a user behavior, whereas in the conventional manner the user's main interaction is with the apps. When the user behavior is detected, the user behavior based context aware operating mode is applied automatically, as in steps 1006, 1008, 1014, 1020, and 1022. The user behavior based context aware operation mode of the main device (smartphone) activates the exercise tracking app, selects the music playlist preset for jogging, and adjusts the volume to a predetermined setting, as in steps 1006 and 1008. As in step 1010, the main device (smartphone) transmits the new user behavior to the peripherals over the network so that the peripherals can apply their own user behavior based context aware operating modes corresponding to the user behavior "jogging". As the user leaves home, the user presses the start button of the exercise tracking app as in step 1012. As a peripheral device, the lighting controller's user behavior based context aware operation mode turns off the lighting when the user starts jogging, as in step 1014, and turns the lighting on again when the user finishes jogging, as in step 1020.
When the jogging ends at step 1016, the user presses the end button of the exercise tracking app as in step 1018 and stores the exercise tracking information of the jog. As another peripheral device, the boiler's user behavior based context aware operation mode turns on the boiler when the user finishes jogging, as in step 1022, waits while the user showers in step 1024, and then turns the boiler off after a predetermined period of time, as in step 1026.

In the same way as for the user behavior "jogging", when the user decides to go to bed as in step 1028, the user only has to select the new user behavior "sleep" on the main device (smartphone) as in step 1030. The main device (smartphone) sends the new user behavior to the peripherals over the network as in step 1032. As a result, the audio and lighting controllers, as peripherals, apply their predefined settings (such as playing a predefined playlist) as in steps 1034 and 1036, and stop operating after a predefined time period. Finally, the user goes to sleep as in step 1038.

In the user behavior based context aware operating mode of scenario 1000, the user does not have to deal with the complexity of the functions. The user's main interaction is only to select "jogging" and "sleep" from the user behavior list in steps 1004 and 1030, respectively. When the user behavior based context aware mode controller detects the user behaviors "jogging" and "sleep", the main device and peripherals automatically apply the user behavior based context aware operating modes matching "jogging" and "sleep". Compared with the scenario 1000 of the present invention, the scenario 100 of the existing method requires far more user interaction with the functions and settings of the apps and electronic devices. In scenario 100, the user continually turns on, turns off, or changes the respective functions and settings of the electronic devices each time the user's behavior changes. In scenario 1000, the user interaction is limited and intuitive: the user simply selects the user behavior, and most of the required changes are applied automatically as specified.

The present invention shifts the user's interaction from individual functions and settings to the selection of user behaviors. FIG. 11 shows an exemplary smartphone architecture 1100 of the conventional manner. The existing architecture of FIG. 11 consists of four basic layers: a hardware layer 1102, an operating system layer 1104, an application layer including applications 1106, 1108, and 1110, and a user interface layer 1118. For example, if the user changes behavior and wishes to change the functions 1112, 1114, and 1116 of applications 1106, 1108, and 1110, the user changes them directly through the user interface layer 1118. The user interface is therefore directly tied to the functions of the applications.

FIG. 12 illustrates an exemplary smartphone architecture 1200 of the present invention. The new architecture of the present invention comprises five layers: a hardware layer 1202, an operating system layer 1204, an application layer including applications 1206, 1208, and 1210, a user behavior based context aware operating mode control layer 1218, and a user interface layer 1222. For example, if the user changes behavior and wishes to change the functions 1212, 1214, and 1216 of applications 1206, 1208, and 1210, the user only selects the user behavior 1220, and the user behavior based context aware operating mode control layer 1218 automatically applies the mode for the user behavior 1220, which alters the functions 1212, 1214, and 1216 of applications 1206, 1208, and 1210. Therefore, unlike the existing method shown in FIG. 11, the user interface of the present invention is directly associated with the user behavior based context aware operation mode and only indirectly with the functions of the applications.
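The control layer of FIG. 12 can be sketched as an intermediary between behavior selection and application functions. The classes, the rule table, and the app names below are hypothetical; this is only an illustration of the layering, under the assumption that "changing a function" reduces here to activating or deactivating apps.

```python
# Sketch of the FIG. 12 layering: the user interface selects a
# behavior; the control layer (1218), not the user, adjusts each
# application's state.

class App:
    def __init__(self, name):
        self.name = name
        self.active = False

class ModeController:
    """Stand-in for the context aware operating mode control layer."""
    def __init__(self, apps, rules):
        self.apps = {app.name: app for app in apps}
        self.rules = rules  # behavior -> names of apps to activate

    def select_behavior(self, behavior):
        # The only call the user interface layer needs to make.
        for app in self.apps.values():
            app.active = app.name in self.rules.get(behavior, ())

apps = [App("navigation"), App("exercise tracker")]
ctrl = ModeController(apps, {"driving": ["navigation"]})
ctrl.select_behavior("driving")
print(apps[0].active, apps[1].active)   # True False
```

Note that the user interface touches only `select_behavior`, matching the claim that it is directly associated with the mode and only indirectly with application functions.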

FIGS. 13A and 13B show a flowchart 1300, in two parts, according to an embodiment of the present invention. The flowchart 1300 illustrates an exemplary procedure for using the present invention. In step 1302, the user sets the user behavior based context aware operating mode and the automatic detection criteria corresponding to one user behavior. The user behavior based context aware operation mode can be edited and configured on the electronic devices; it can also be fetched, using other electronic devices or any available communication method, from a repository connected to the network and then edited further. Once edited, the user behavior based context aware operating mode may be stored in the electronic device, in a repository on the network, or both. In step 1302, the user can also set the automatic detection criteria that match the corresponding user behavior, which are used in step 1310 to detect new user behavior automatically.

Once the user behavior based context aware operating mode and the automatic detection criteria are set in step 1302, the electronic device waits in a standby state, as in step 1304, until a new user behavior is detected. User behavior may be detected in a number of ways, as in steps 1308, 1310, and 1312. As in step 1308, the user can manually select a user behavior directly from the preset user behavior list; such manual selection is the user's explicit expression of a new user behavior or of the intention to take it. As in step 1310, the electronic devices of the present invention can automatically detect new user behavior by monitoring the automatic detection criteria set in step 1302. If the preset criteria are met, the electronic device may run a confirmation procedure with the user for the new user behavior, as in step 1320.

Peripheral devices may give notice of a new user behavior, as in step 1312. The memory unit 412 or the storage unit 414 is then scanned for that user behavior, as in step 1314, to check whether a preset user behavior based context aware operation mode exists. If no such mode exists in the memory unit 412 or the storage unit 414, the user may create and configure a new user behavior based context aware operation mode, as in step 1318.

When the user behavior based context aware operation mode for the new user behavior has been accessed in the memory unit 412 or the storage unit 414, or newly configured in step 1318, the procedure moves to the user confirmation step 1320, as in step 1316. After the user confirmation step 1320, the existing user behavior based context aware operating mode is backed up to the memory unit 412, the storage unit 414, a repository on the network, or a combination thereof, as in step 1322, for future needs. As in step 1324, the user behavior based context aware operating mode is applied to the electronic device; that is, the predefined changes to functions and settings are applied. Finally, the new user behavior is communicated to other devices, as in step 1326, and the electronic device returns to the standby state of step 1304.
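The flowchart 1300 loop can be summarized in a loose sketch: receive a behavior, look up (or create) its mode, back up the current state, apply the mode, and report the behavior for notification. All names and the store layout are hypothetical, and the confirmation step 1320 is omitted for brevity.

```python
# Loose sketch of one pass through flowchart 1300 (FIGS. 13A/13B).

def run_once(behavior, store, device_state, backups):
    mode = store.get(behavior)            # step 1314: scan for a mode
    if mode is None:
        mode = {"settings": {}}           # step 1318: user creates one
        store[behavior] = mode
    backups.append(dict(device_state))    # step 1322: back up old mode
    device_state.update(mode["settings"]) # step 1324: apply the mode
    return behavior                       # step 1326: notify peers

store = {"jogging": {"settings": {"tracking_app": "on"}}}
state, backups = {"tracking_app": "off"}, []
run_once("jogging", store, state, backups)
print(state["tracking_app"])   # on
```

The backup list supports the "for future needs" restoration mentioned above: the previous state (`tracking_app: off`) survives the mode change.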

FIGS. 14 through 30 show smartphone screenshots of an embodiment of the present invention. These screenshots can be used to understand the procedures for using and configuring the present invention together with the flowchart 1300 of FIGS. 13A and 13B. They provide a description detailed enough for a person of ordinary skill in the art of electronic device operating systems to make and use the invention without difficulty. While it may be difficult for general programmers to create prototypes of the invention because no application programming interface (API) for the operating system and functions of the electronic device has been published, the operating system programmers of an electronic device manufacturer can readily produce the present invention.

FIG. 14 shows an example home screen of the present invention. In this embodiment, the home screen includes an "Activity" icon 1402, which, when pressed, moves to the initial screen of the present invention shown in FIG. 15. In another embodiment, the electronic device may invoke the user behavior based context aware operating mode in a different manner (for example, via a separate physical button, or by pressing an existing button for a certain period of time or longer). FIG. 15 shows a title label 1502, an "Add new user behavior" button 1504, an existing user behavior list, a user behavior selection slider 1506, and a settings button 1508.

Step 1302 of FIG. 13, which sets the user behavior based context aware operating mode and the automatic detection criteria, is described in the screenshots of one embodiment in FIGS. 15 to 25. If the "Add new user behavior" button 1504 is pressed, the display moves to the new user behavior editing screen 1600 of FIG. 16. The new user behavior editing screen 1600 includes a user behavior name editing cell 1606, a user behavior description editing cell 1608, an "Add new automatic detection condition" button 1610, an existing automatic detection condition cell 1612, an "Add user behavior based context aware operation mode" button 1614, and a current user behavior based context aware operating mode cell 1616. When the editing cells 1606 and 1608 are touched, a keyboard is brought up for text editing. When the "Cancel" button 1602 is pressed, the user returns to the initial screen 1500. When the user completes editing the new user behavior, the automatic detection condition, and the user behavior based context aware operation mode, the user presses the "Done" button 1604 to save the new user behavior and its mode and moves to the initial screen 1500.

The user may add a new automatic detection condition through the "Add new automatic detection condition" button 1610; such conditions are utilized in the automatic detection of step 1310 of FIG. 13. The automatic detection condition edit window 1700 of FIG. 17 shows examples of the automatic detection conditions of this embodiment, which is configured to detect new user behavior automatically in three ways: by a predefined application activation 1704, by a predefined sensor input 1706, and by a predefined calendar entry 1708. In the example of screenshot 1700, when the navigation application or the predefined Bluetooth network is detected, as shown in cells 1714 and 1718, the new user behavior "Driving" is detected automatically. As shown by the disabled switch 1720, automatic detection of the user behavior "Driving" by a calendar entry is not allowed in this example. When the "Cancel" button 1702 is pressed, the user returns to the new user behavior edit window 1600 of FIG. 16. When the user finishes editing the new automatic detection condition, the user can save it and move to the new user behavior edit window 1600 by pressing the "Done" button 1710.
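The screen 1700 configuration amounts to three detection channels per behavior, each of which can be enabled or disabled individually (cf. switch 1720). The sketch below is a hypothetical rendering of that configuration; the dictionary layout, channel names, and values are assumptions, not the patent's data format.

```python
# Sketch: per-behavior automatic detection configuration with three
# channels (app activation 1704, sensor input 1706, calendar entry
# 1708), each independently enabled, echoing screen 1700.

detection_config = {
    "Driving": {
        "app_activation": {"enabled": True,  "apps": ["navigation"]},
        "sensor_input":   {"enabled": True,  "bluetooth": ["MY_CAR"]},
        "calendar_entry": {"enabled": False, "keywords": []},  # switch 1720 off
    }
}

def triggers(behavior, channel, value):
    """True if the observed value triggers detection on that channel."""
    ch = detection_config[behavior][channel]
    if not ch["enabled"]:
        return False
    return any(value in v for v in ch.values() if isinstance(v, list))

print(triggers("Driving", "sensor_input", "MY_CAR"))     # True
print(triggers("Driving", "calendar_entry", "meeting"))  # False
```

Disabling a channel, as the example does for calendar entries, suppresses detection on that channel without deleting its stored values, which matches the on/off switch shown in the screenshot.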

Activating the navigation app, as shown in the example of the app activation cell 1714, triggers automatic detection of the user behavior "Driving". The user may go to the exemplary app activation edit screen 1800 of FIG. 18 via the "Add application activation" button 1712 to add more apps to the app activation cell 1714. In this example, only the navigation cell 1806 is selected from the two selectable applications, an exercise tracker and a navigation app, and is marked with a check mark 1808. When the selection is complete, the user can press the "Done" button 1804 to save the selection and move to the screen 1700; if the user wishes to cancel the selection and move to the screen 1700, the user may press the "Cancel" button.

As shown in the example of the sensor input cell 1718, automatic detection of the user behavior "Driving" occurs by a preset Bluetooth input. The user can add an automatic detection condition based on a sensor input via the "Add sensor input" button 1716 and move to the exemplary sensor input editing window 1900 of FIG. 19, which shows how to add a sensor input as an automatic detection condition. In this example, cell 1906 shows that a Bluetooth network called "MY_CAR", found in the user's vehicle, has been added to the automatic detection conditions. The arrow button 1908 moves to the exemplary Bluetooth input edit window 2000 of FIG. 20, which shows all discovered Bluetooth networks. In this example, the user selects the Bluetooth network named "MY_CAR" as shown in cell 2006. The information button 2008 provides detailed information about the Bluetooth network of the corresponding cell. When the selection of the Bluetooth network as an automatic detection condition is complete, the user can press the "Done" button 2004 to save the selection and move to the screen 1900; if the user wishes to cancel the selection and return to the screen 1900, the user presses the "Cancel" button. Other sensor inputs may also be used as automatic detection conditions, such as NFC, QR codes, Wi-Fi, Bluetooth, other tagging technologies, other network technologies, or combinations of the above. Upon completing the selection of all sensor inputs, the user may press the "Done" button 1904 to save the selections and move to the screen 1700; if the user wishes to cancel and return to the screen 1700, the user can press the "Cancel" button 1902.

The current user behavior based context aware operation mode cell 1616 shows an exemplary user behavior based context aware operation mode for the user behavior "Driving": the navigation app, the main device settings, and the garage gate control are defined as its components. Thus, if the exemplary user behavior "Driving" is detected, the predefined functions of the navigation app, the main device settings, and the garage gate control are applied automatically, as shown in cell 1616.

To edit the user behavior based context aware operation mode, the user may use the "Add user behavior based context aware operation mode" button 1614 of FIG. 16, which moves to the exemplary user behavior based context aware operation mode edit window 2100 of FIG. 21. In this exemplary smartphone embodiment, a change to the functions of the electronic devices may automatically activate an app as shown at 2106, change the device settings of the smartphone as shown at 2110, or change the settings of a peripheral device as shown at 2114.

In the example of screen 2100, the navigation app is defined as one of the user behavior based context aware operating modes. Therefore, when the user behavior "driving" is detected, the navigation app runs automatically. To add more apps to operate for the user behavior "driving", the "Add App" button 2108 may be used. The "Add App" button 2108 moves to the example app operation edit window 2200 of FIG. 22, where all addable apps are listed. In the example of FIG. 22, the addable apps are an exercise tracker app and a navigation app, and the navigation app is selected, as indicated by the check mark 2206. When the selection is complete, the user can press the "Done" button 2204 to save the selection and return to screen 2100, or press the "Cancel" button 2202 to discard the selection and return to screen 2100.

The user behavior based context aware operating mode may define the device settings of the main device, as at 2110, and the predefined main device settings are applied when the exemplary user behavior "driving" is detected. In the example, pressing arrow 2112 moves to the example main device setting edit window 2300 of FIG. 23, where the user can change the settings of the main device to match the user behavior "driving". As shown on screen 2300, the main device settings of the user behavior based context aware operating mode include Wi-Fi, Bluetooth network, cellular data, device sounds, and privacy. Once the main device settings are defined in the user behavior based context aware operation mode, the user can press the "Done" button 2304 to save the settings and return to screen 2100, or press the "Cancel" button 2302 to discard them and return to screen 2100.
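One natural way to realize the main device settings of FIG. 23 is to save the current settings before applying the mode's values, so that they can be restored when the behavior ends. The sketch below illustrates this; the setting names and values, and the save-and-restore behavior itself, are assumptions for illustration.

```python
# Sketch of applying the main device settings of FIG. 23 (Wi-Fi,
# Bluetooth, cellular data, sounds, ...) when "driving" starts, and
# restoring the previous settings when the behavior ends.

DRIVING_SETTINGS = {"wifi": False, "bluetooth": True,
                    "cellular_data": True, "sounds": "navigation_only"}

def apply_settings(current, mode_settings):
    """Overlay mode settings on current ones; return (new, saved-previous)."""
    previous = {k: current[k] for k in mode_settings if k in current}
    new = dict(current)
    new.update(mode_settings)
    return new, previous

def restore_settings(current, previous):
    """Undo a mode's settings by re-applying the saved previous values."""
    new = dict(current)
    new.update(previous)
    return new
```

Applying `DRIVING_SETTINGS` and then restoring the saved values returns the device to exactly its pre-driving configuration.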

The user behavior based context aware operating mode may also include peripherals, and predefined settings are applied to the peripherals when the exemplary user behavior "driving" is sensed. In the case of the exemplary user behavior "driving", as shown in FIG. 21, the garage gate control was selected as a peripheral device to operate when the user behavior "driving" is detected. To define additional peripheral settings for the user behavior "driving", the "Add Peripheral" button 2116 may be used. The "Add Peripheral" button 2116 moves to the example peripheral edit window 2400 of FIG. 24, allowing the user to edit or add a peripheral to the user behavior based context aware operating mode. In the example of FIG. 24, the garage gate control was selected as a peripheral in the user behavior based context aware operating mode for the user behavior "driving". The "Detail information" arrow 2406 moves to the exemplary peripheral setting edit window 2500 of FIG. 25, allowing the user to edit the functions and settings of the garage gate control for the user behavior "driving". In this example, the garage gate control can open or close the garage gate in conjunction with the navigation of the smartphone, which is the user's main device: the garage gate can be opened when navigation is turned on around the user's home, or when the user arrives at home as the destination of the navigation. The screen 2400 may list other peripherals (such as vacuum cleaners, smart TVs, lighting control systems, etc.) that are connectable nearby or over a network (Wi-Fi, Bluetooth, NFC, etc.).
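The garage gate rule of FIG. 25 amounts to a proximity check tied to the navigation state. The sketch below makes that rule concrete; the home coordinates, the 150 m threshold, and the function names are illustrative assumptions, and the distance formula is a short-range approximation rather than anything specified by the disclosure.

```python
# Sketch of the garage gate rule of FIG. 25: open the gate when the
# navigation app is active near home, or when home is the navigation
# destination and the user has arrived nearby.

import math

HOME = (37.50, 127.03)   # hypothetical home coordinates (lat, lon)
NEAR_METERS = 150.0      # illustrative "around home" radius

def distance_m(a, b):
    """Equirectangular approximation; adequate for short distances."""
    mean_lat = math.radians((a[0] + b[0]) / 2)
    dx = math.radians(b[1] - a[1]) * math.cos(mean_lat) * 6_371_000
    dy = math.radians(b[0] - a[0]) * 6_371_000
    return math.hypot(dx, dy)

def should_open_gate(nav_on, position, destination):
    """Decide whether the garage gate peripheral should open."""
    if not nav_on:
        return False
    near_home = distance_m(position, HOME) <= NEAR_METERS
    arriving_home = destination == HOME and near_home
    return near_home or arriving_home
```

With navigation on and the vehicle within the radius of home, the gate opens; with navigation off, or far from home, it stays closed.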

Figures 16 through 25 show how the user behavior based context aware operation mode is defined in step 1302 of FIG. 13. Once a user behavior based context aware operation mode corresponding to a user behavior is defined, the electronic device waits for detection, as in step 1304 of FIG. 13: the user behavior may be sensed by the user's direct input, as in step 1308, or may be received as a notification from another electronic device, as in step 1312.
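The detection flow of FIG. 13 can be summarized as a small dispatch: a behavior arrives from one of three sources, is confirmed by the user, and is either applied or rejected, with a warning when no mode exists. The sketch below is an illustrative reading of that flow; the set of known modes and the return labels are assumptions.

```python
# Sketch of the detection flow of FIG. 13: a behavior may arrive by
# direct input (step 1308), auto sensing (step 1310), or notification
# from another device (step 1312); it is applied only after user
# confirmation (step 1320), and a behavior with no defined mode raises
# the absence warning of FIG. 30 (step 1318).

KNOWN_MODES = {"driving", "jogging"}
SOURCES = ("direct_input", "auto_sense", "notification")

def handle_detection(behavior, source, user_confirms):
    if source not in SOURCES:
        raise ValueError(f"unknown detection source: {source}")
    if behavior not in KNOWN_MODES:
        return "warn_no_mode"                       # step 1318 / window 3000
    return "apply_mode" if user_confirms else "cancel"  # step 1320
```

For example, an auto-sensed `"driving"` that the user confirms is applied, while a notified behavior with no defined mode triggers the absence warning regardless of confirmation.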

FIG. 26 shows how a new user behavior is detected by the user's direct input in step 1308. In this example, the user can manually select the new user behavior "jogging" by pressing the slider button 1506 on the initial screen 1500 of the present invention, which moves to the new user behavior start confirmation window 2600 of FIG. 26. The window 2600 of the present invention accesses and displays the user behavior based context aware operating mode and the auto-sensing condition corresponding to the user behavior "jogging", as in step 1316. The user may modify the auto-sensing condition using button 2604 and modify the user behavior based context aware operation mode using button 2606. Button 2604 moves to an auto-sensing condition edit window like window 1700 of FIG. 17, so that the user can modify the auto-sensing condition as in step 1302. Button 2606 moves to a user behavior based context aware operation mode edit window like window 2100 of FIG. 21, allowing the user to modify the user behavior based context aware operation mode as in step 1302. Once the user has confirmed the auto-sensing condition and the user behavior based context aware operation mode, the user confirms the new user behavior, as in step 1320 of FIG. 13B, by pressing the "Start activity" button 2608, or presses the "Cancel" button 2602 to cancel and return to the initial screen 1500 of FIG. 15.

FIG. 28 shows an exemplary new user behavior detection confirmation window 2800. If a predetermined auto-sensing condition is satisfied, as in step 1310 of FIG. 13A, the electronic device of the present invention displays the new user behavior detection confirmation window 2800 of FIG. 28. The new user behavior detection confirmation window 2800 includes a notice 2802 that the new activity "jogging" has been detected, a user behavior description 2804, user behavior based context aware operation mode information 2806, a "Cancel" button 2808, and a "Start activity" button 2810. The user can use the "Cancel" button 2808 to ignore the automatic detection alert and return to the previous window, or accept the alert, as in step 1320 of FIG. 13B, using the "Start activity" button 2810 to initiate the new user behavior.

FIGS. 29 and 30 show an exemplary new user behavior notification window and an exemplary user behavior based context aware operation mode absence warning window, respectively. If a peripheral announces a new user behavior, as in step 1312 of FIG. 13A, the electronic device of the present invention displays the new user behavior notification window 2900 of FIG. 29 when the notified user behavior is known to the main device, and displays the user behavior based context aware operation mode absence warning window 3000 of FIG. 30 when the notified user behavior is not known to the main device, as in step 1318.

If the notified user behavior is known to the main device, the main device displays a notice 2902 that the new activity "jogging" has been announced, a user behavior description 2904, user behavior based context aware operation mode information 2906, a "Cancel" button 2908, and a "Start activity" button 2910. The user can use the "Cancel" button 2908 to ignore the notification alert and return to the previous window, or accept the notification alert using the "Start activity" button 2910, as in step 1320 of FIG. 13B, to initiate the new user behavior.

If the notified user behavior is not known to the main device, the main device displays a warning 3002 that the new activity "jogging" was notified but no user behavior based context aware operation mode is available for it. The main device may display a user behavior based context aware operation mode edit button 3004 and a cancel button 3006 together. The user behavior based context aware operation mode edit button 3004 moves to a new user behavior edit window like window 1600 of FIG. 16, where the user can define the new user behavior. The user can press the cancel button 3006 to ignore the notification warning and return to the previous window.

On the initial screen 1500 of the present invention, the settings button 1508 can be used to move to the exemplary settings edit window 2700 of FIG. 27. As the exemplary settings of window 2700 illustrate, the settings of the present invention may include an automatic detection switch for enabling/disabling automatic detection, a confirm activity changes switch for enabling/disabling user confirmation, a calendar input switch for enabling/disabling calendar input, a notification switch for enabling/disabling new user behavior announcements from peripherals, and the like.
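The global switches of window 2700 can be represented as a simple settings record consulted before any automatic behavior change fires. The record below is an illustrative sketch; the field names and defaults are assumptions, not from the disclosure.

```python
# Sketch of the global settings of window 2700 as a record of on/off
# switches consulted by the detection flow.

from dataclasses import dataclass

@dataclass
class ContextModeSettings:
    automatic_detection: bool = True       # master switch for auto sensing
    confirm_activity_changes: bool = True  # require user confirmation
    calendar_input: bool = False           # take behaviors from the calendar
    peripheral_notifications: bool = True  # accept announcements from peripherals

def auto_detection_enabled(settings: ContextModeSettings) -> bool:
    """Automatic detection fires only when the master switch is on."""
    return settings.automatic_detection
```

Turning the master switch off suppresses automatic detection while leaving manual selection via the slider button available.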

Claims (20)

  1. A method of managing various context aware operating modes in various user contexts for various electronic devices, such as mobile devices, wearable devices, smart TVs, home appliances, home automation devices, building automation devices, automotive electronic devices, and robots, the method comprising:
    A. identifying the user's behavior, or the user's intention to take an action, either manually by the user's input through an input unit of the electronic device, or automatically by a behavior-sensing method; and
    B. in accordance with the user behavior identified in step A, applying one or more associated operating modes to one or more electronic devices, wherein an operating mode includes changing one or more functions of an electronic device to predetermined setting values,
    whereby altering the functions of the electronic device in the context of the detected user behavior provides a "contextual" user experience that dramatically improves and simplifies the user experience of the electronic device.
  2. The method of claim 1, wherein, in identifying the user's behavior or the user's intention to take an action, information on the user behavior is collected, stored in a data structure organized by the 5W1H rule (who, when, where, what, how, why), and accessed.
  3. The method of claim 1, wherein the user's input through an input unit of the electronic device includes the user's input through the input unit of a wearable device, such as a smart watch, smart necklace, smart ring, or smart jewelry.
  4. The method of claim 1, wherein the behavior-sensing method includes tagging methods for detecting user behavior, such as NFC, QR codes, barcodes, RFID, and other tagging technologies, in order to automatically identify the user's behavior or the user's intention to take an action.
  5. The method of claim 1, wherein the behavior-sensing method includes analyzing the user's speech and movement, such as by voice recognition, motion/gesture recognition, and other artificial intelligence techniques, in order to automatically identify the user's behavior or the user's intention to take an action.
  6. The method of claim 1, wherein, in the behavior-sensing method, the user's behavior or the user's intention to take an action is identified by comparing data from the sensors, input units, or communication modules of the electronic device with predefined reference values, and wherein the predefined reference values are edited, stored, shared, and/or accessed in a storage device (memory, storage, etc.) of an electronic device or a storage device on a network using a predefined data structure.
  7. The method of claim 1, wherein, for the predetermined setting values, the function settings of the electronic device are edited, stored, shared, and/or accessed in a storage device (memory, storage, etc.) of an electronic device or a storage device on a network using a predefined data structure.
  8. The method of claim 1, wherein the functions of the electronic device include access to the input units, output units, communication modules, storage devices, memory, media assets, and/or software components of the electronic device.
  9. The method of claim 1, wherein the electronic device is connected to a network and communicates, edits, stores, and accesses, via the network, the context aware operating mode associated with the user's behavior or the user's intention to take an action.
  10. The method of claim 1, wherein the user's behavior or the user's intention to take an action is included as an input item of a calendar function included in the electronic device.
  11. A system for managing various context aware operating modes in various user contexts for various electronic devices, such as mobile devices, wearable devices, smart TVs, home appliances, home automation devices, building automation devices, automotive electronic devices, and robots, the system comprising:
    A. a part that identifies the user's behavior, or the user's intention to take an action, either manually by the user's input through an input unit of the electronic device, or automatically by a behavior-sensing method; and
    B. a part that, in accordance with the user behavior identified by part A, applies one or more associated operating modes to one or more electronic devices, wherein an operating mode includes changing one or more functions of an electronic device to predetermined setting values,
    whereby altering the functions of the electronic device in the context of the detected user behavior provides a "contextual" user experience that dramatically improves and simplifies the user experience of the electronic device.
  12. The system of claim 11, further comprising a part that, in identifying the user's behavior or the user's intention to take an action, collects information on the user behavior, stores it in a data structure organized by the 5W1H rule (who, when, where, what, how, why), and accesses it.
  13. The system of claim 11, further comprising a part that utilizes, as the user's input through an input unit of the electronic device, the user's input through the input unit of a wearable device, such as a smart watch, smart necklace, smart ring, or smart jewelry.
  14. The system of claim 11, further comprising tagging units for detecting user behavior, such as NFC, QR codes, barcodes, RFID, and other tagging technologies, in order to automatically identify the user's behavior or the user's intention to take an action.
  15. The system of claim 11, further comprising a part that analyzes the user's speech and movement, such as by voice recognition, motion/gesture recognition, and other artificial intelligence techniques, in order to automatically identify the user's behavior or the user's intention to take an action.
  16. The system of claim 11, wherein, in the behavior-sensing method, the user's behavior or the user's intention to take an action is identified by comparing data from the sensors, input units, or communication modules of the electronic device with predefined reference values, and further comprising a part that edits, stores, shares, and/or accesses the predefined reference values in a storage device (memory, storage, etc.) of an electronic device or a storage device on a network using a predefined data structure.
  17. The system of claim 11, further comprising a part that, for the predetermined setting values, edits, stores, shares, and/or accesses the function settings of the electronic device in a storage device (memory, storage, etc.) of an electronic device or a storage device on a network using a predefined data structure.
  18. The system of claim 11, wherein the functions of the electronic device include access to the input units, output units, communication modules, storage devices, memory, media assets, and/or software components of the electronic device.
  19. The system of claim 11, wherein the electronic device is connected to a network and communicates, edits, stores, and accesses, via the network, the context aware operating mode associated with the user's behavior or the user's intention to take an action.
  20. The system of claim 11, wherein the user's behavior or the user's intention to take an action is included as an input item of a calendar function included in the electronic device.
KR1020160018446A 2016-02-17 2016-02-17 Activity-centric contextual modes of operation for electronic devices KR20170096774A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020160018446A KR20170096774A (en) 2016-02-17 2016-02-17 Activity-centric contextual modes of operation for electronic devices

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020160018446A KR20170096774A (en) 2016-02-17 2016-02-17 Activity-centric contextual modes of operation for electronic devices
PCT/KR2016/002099 WO2017142116A1 (en) 2016-02-17 2016-03-02 Activity-centric contextual modes of operation for electronic devices

Publications (1)

Publication Number Publication Date
KR20170096774A true KR20170096774A (en) 2017-08-25

Family

ID=59625271

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020160018446A KR20170096774A (en) 2016-02-17 2016-02-17 Activity-centric contextual modes of operation for electronic devices

Country Status (2)

Country Link
KR (1) KR20170096774A (en)
WO (1) WO2017142116A1 (en)

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5444346B2 (en) * 2009-06-29 2014-03-19 真旭 徳山 Workflow processing program, information processing apparatus, and workflow processing method
US10346849B2 (en) * 2011-07-12 2019-07-09 Ca, Inc. Communicating personalized messages using quick response (QR) codes
JP5915341B2 (en) * 2012-04-06 2016-05-11 ソニー株式会社 Information processing apparatus, information processing method, and computer program
US9066326B2 (en) * 2013-03-14 2015-06-23 Google Technology Holdings LLC Automatic user notification, with quick response (QR) code generation following failed NFC device pairing
US9356710B2 (en) * 2014-04-08 2016-05-31 Mastercard International Incorporated Methods and apparatus for consumer testing of an NFC device

Also Published As

Publication number Publication date
WO2017142116A1 (en) 2017-08-24

Similar Documents

Publication Publication Date Title
AU2015202943B2 (en) Reducing the need for manual start/end-pointing and trigger phrases
JP6391232B2 (en) Mobile terminal and mobile terminal control method
US6848104B1 (en) Clustering of task-associated objects for effecting tasks among a system and its environmental devices
DE112011103728B4 (en) Automatic profile change on a mobile computing device
ES2643176T3 (en) Method and apparatus for providing independent view activity reports that respond to a tactile gesture
RU2628558C2 (en) Method and smart terminal handling device
TWI605394B (en) Method and system for detecting a notification event on a mobile device,and non-transitory computer-readable medium
US9661105B2 (en) Virtual assistant system to enable actionable messaging
TWI519969B (en) Intelligent assistant for home automation
US9271111B2 (en) Response endpoint selection
CN104503688B (en) The control method and device of intelligent hardware devices
KR101390103B1 (en) Controlling image and mobile terminal
US8497796B2 (en) Methods and apparatus for controlling one or more electronic devices based on the location of a user
EP2830321A1 (en) Display apparatus and method for providing personalized service thereof
US20130339850A1 (en) Interactive input device
KR20170088982A (en) Device arbitration for listening devices
US20140033298A1 (en) User terminal apparatus and control method thereof
Shafer et al. Interaction issues in context-aware intelligent environments
Jang et al. Ubi-UCAM: a unified context-aware application model
CN104735057B (en) Share the method and device of equipment control
CN109739469A (en) The context aware service provision method and equipment of user apparatus
US9703778B2 (en) Method of remotely controlling external services and selectively sharing control of the same
US20150128050A1 (en) User interface for internet of everything environment
KR101276846B1 (en) Method and apparatus for streaming control of media data
US9811870B2 (en) Information processing method, apparatus and payment system

Legal Events

Date Code Title Description
A201 Request for examination
E902 Notification of reason for refusal
E601 Decision to refuse application