KR20140096956A - Method for executing function of device, and device thereof - Google Patents

Method for executing function of device, and device thereof

Info

Publication number
KR20140096956A
KR20140096956A (application number KR1020130084384A)
Authority
KR
South Korea
Prior art keywords
device
information
function
processor
state
Prior art date
Application number
KR1020130084384A
Other languages
Korean (ko)
Inventor
류종현
박용국
채한주
최원영
강정관
김남훈
홍현수
Original Assignee
삼성전자주식회사 (Samsung Electronics Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority to KR1020130010102
Priority to KR20130010102
Application filed by 삼성전자주식회사 (Samsung Electronics Co., Ltd.)
Priority claimed from PCT/KR2014/000773 external-priority patent/WO2014119894A1/en
Priority claimed from US14/167,226 external-priority patent/US10540013B2/en
Publication of KR20140096956A publication Critical patent/KR20140096956A/en

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING; COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011: Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013: Eye tracking input arrangements
    • G06F3/017: Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G06F3/03: Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033: Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0346: Pointing devices displaced or positioned by the user, with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • G06F3/041: Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/14: Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G06F2200/00: Indexing scheme relating to G06F1/04 - G06F1/32
    • G06F2200/16: Indexing scheme relating to G06F1/16 - G06F1/18
    • G06F2200/163: Indexing scheme relating to constructional details of the computer
    • G06F2200/1637: Sensing arrangement for detection of housing movement or orientation, e.g. for controlling scrolling or cursor movement on the display of an handheld computer
    • G09: EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G: ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00: Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators

Abstract

The present invention relates to a method, a device, and a recording medium for executing a predetermined function of a device more quickly and easily based on information about the motion of the device. A device according to a preferred embodiment of the present invention includes a motion detection unit for detecting a motion of the device in a standby mode state of the device, a storage unit for storing motion information based on the information about the motion and at least one piece of function information corresponding to the motion information, and a control unit for controlling the device to execute, in the standby mode state, the function corresponding to the motion information of the device, using the information about the motion, the motion information, and the at least one piece of function information.

Description

METHOD FOR EXECUTING FUNCTION OF DEVICE, AND DEVICE THEREOF

Field of the Invention: The present invention relates to the execution of functions of a device, and more particularly, to a method of executing a function of a device based on movement of the device, and to a device executing the method.

As the functions of portable devices such as smartphones become smarter, the applications, services, and content that users can use through such devices are increasing, and the functions that the devices can execute are diversifying.

As a result, accessibility to applications, services, and content, or to functions executable by the device, may be impaired. In particular, accessibility to applications, services, and content, and/or to functions executable by the device, may degrade in the standby mode of the device.

SUMMARY OF THE INVENTION It is an object of the present invention to provide a method for executing a function of a device based on movement of the device in a standby mode of the device, and a device and a recording medium therefor.

It is another object of the present invention to provide a method for executing a function of a device based on device movement and status information about the device in a standby mode of the device, and a device and a recording medium therefor.

It is another object of the present invention to provide a method of executing a function of a device based on an operation mode state of the device and a motion of the device, and a device and a recording medium therefor.

It is another object of the present invention to provide a method for executing a function of a device on the basis of an operation mode state of the device, a motion of the device, and status information about the device, and a device and a recording medium therefor.

According to an aspect of the present invention, there is provided a device including: a motion detection unit for detecting a motion of the device in a standby mode state of the device; a storage unit for storing motion information based on the information about the motion and at least one piece of function information corresponding to the motion information; and a control unit for controlling the device to execute, in the standby mode state, the function corresponding to the motion information of the device, using the information about the motion, the motion information, and the at least one piece of function information.

The standby mode state of the device may include at least one of an idle state of an application processor included in the device, a deactivation state of a function related to the touch screen included in the device, and a screen lock state of the device.

The function related to the touch screen may include at least one of a touch sensing function of the touch screen and a display function of the touch screen.

The standby mode state of the device may include a state in which the components of the device other than the motion detection unit, the storage unit, and the control unit are deactivated, a state in which no power is consumed by those other components, or a state in which power is consumed only by the motion detection unit, the storage unit, and the control unit.

The control unit may control the device so that a gateway screen is displayed before executing the function.

The gateway screen may include notification information for notifying the execution of the function and selection information for selecting an execution mode for the function.

When there are a plurality of functions corresponding to the motion information of the device, the gateway screen may include selection information for selecting an execution mode for each of the plurality of functions.

The device may further include a context information detector for detecting at least one piece of context information about the device. In this case, the storage unit stores mapping information relating the motion information, the at least one piece of function information, and the detected at least one piece of context information, and the function executed by the control unit is based on the at least one piece of context information, the information about the motion of the device, and the mapping information.

The at least one piece of context information may include at least one of current time information, location information of the device, schedule information stored in the device, and log information of the device.

According to another aspect of the present invention, there is provided a method of executing a function of a device, the method including: detecting information about a movement of the device in a standby mode state of the device; detecting motion information based on the information about the movement; detecting at least one piece of function information corresponding to the detected motion information; and executing a function based on the detected at least one piece of function information.

According to a preferred embodiment of the present invention, there is also provided a computer-readable recording medium having recorded thereon one or more programs including instructions for executing a method of executing a function of a device; the method recorded on the medium is performed in the same manner as the method of executing a function of a device described above.

BRIEF DESCRIPTION OF THE DRAWINGS

FIG. 1 is an example of a functional block diagram of a device according to a preferred embodiment of the present invention.
FIG. 2 is a view for explaining an example of mapping information regarding an operation mode of a device, motion information of the device, and function information of the device according to a preferred embodiment of the present invention.
FIG. 3 is a diagram for explaining an example of motion information of a device.
FIGS. 4A to 4J are diagrams for explaining examples of a function of a device executed based on motion information of the device and the operation mode of the device.
FIG. 5 is a diagram for explaining operations between a sensing unit based on a seamless sensing platform (SSP) and a processor.
FIG. 6A is a flowchart illustrating a method of executing a function of a device according to an exemplary embodiment of the present invention.
FIG. 6B is a flowchart illustrating a method of executing a function of a device according to another embodiment of the present invention.
FIG. 7 is an operational flowchart for explaining a process that can be performed in step S604 of FIG. 6A or step S609 of FIG. 6B.
FIG. 8 is a diagram for explaining an example in which a predetermined function is executed by the device in steps S701 and S702 of FIG. 7.
FIG. 9 is an operational flowchart for explaining a process that can be performed in step S604 of FIG. 6A or step S609 of FIG. 6B.
FIG. 10 is a diagram for explaining an example in which a predetermined function is executed by the device in steps S901 and S902 of FIG. 9.
FIG. 11 is a functional block diagram of a device according to another preferred embodiment of the present invention.
FIG. 12 is an example of classifying the programs and/or instruction sets stored in the storage unit of FIG. 11 into modules.
FIGS. 13A and 13B are operational flowcharts of a method of executing a function of a device according to another preferred embodiment of the present invention.
FIG. 14 is a functional block diagram of a device according to another preferred embodiment of the present invention.
FIG. 15 is a flowchart illustrating a method of executing a function of a device according to another preferred embodiment of the present invention.
FIGS. 16A and 16B are flowcharts illustrating a method of executing a function of a device according to another preferred embodiment of the present invention.
FIG. 17 is a view for explaining an example of a function of the device executed according to the method shown in FIGS. 16A and 16B.
FIGS. 18A to 18F are illustrations of a gateway screen.
FIGS. 19A and 19B are flowcharts for explaining a method of executing a function of a device according to another preferred embodiment of the present invention.
FIG. 20 is an exemplary diagram of a screen for explaining a case of executing a function of a device according to the method shown in FIG. 19.
FIG. 21 is a functional block diagram of a device according to another preferred embodiment of the present invention.

The present invention is capable of various modifications and may take various embodiments; specific embodiments are illustrated in the drawings and described in detail in the detailed description. It should be understood, however, that the invention is not intended to be limited to the particular embodiments, but includes all modifications, equivalents, and alternatives falling within its spirit and scope. Hereinafter, the present invention will be described in detail with reference to the accompanying drawings.

The terms first, second, and the like may be used to describe various elements, but the elements should not be limited by these terms. These terms are used only to distinguish one component from another.

The terminology used in this application is used only to describe specific embodiments and is not intended to limit the invention. In certain cases, a term may have been arbitrarily selected by the applicant; in such cases, its meaning is described in detail in the corresponding part of the description. Therefore, the terms used in the present invention should be defined based on their meaning and on the overall content of the present invention, not simply on their names.

Singular expressions include plural expressions unless the context clearly dictates otherwise. In the present application, terms such as "comprises" or "having" are intended to specify the presence of the features, numbers, steps, operations, elements, components, or combinations thereof described in the specification, and do not preclude the presence or addition of one or more other features, numbers, steps, operations, elements, components, or combinations thereof.

Input information referred to throughout the specification is user input information and may be touch-based input information. The touch-based input information may include gesture-based input information of the user, for example, a tap, a long tap (long touch), a touch and hold, a touch and drag, a double tap, a drag, panning, a flick, a drag and drop, and a sweep, but is not limited thereto.

The input information is not limited to the touch-based input information described above. For example, the input information may include movement-based input information or vision-based input information.

The motion-based input information may be based on a user's motion-based gesture of the device (e.g., shaking the device, rotating the device, or lifting the device). For example, as described in the embodiments below, a motion-based gesture of turning the device upside down with respect to the direction of gravity may be set as motion-based input information.

The vision-based input information may be based on information recognized by analyzing an input image acquired by a camera, without contact with the device. For example, as described in the following embodiments, information obtained by recognizing the face or the eyes of the user included in an input image acquired by the camera may be set as vision-based input information indicating an activation request.

DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS: Hereinafter, embodiments of the present invention will be described in detail with reference to the accompanying drawings. In the drawings, the same or corresponding components are denoted by the same reference numerals.

FIG. 1 is a functional block diagram of a device 100 according to a preferred embodiment of the present invention. That is, FIG. 1 is a block diagram of a device 100 that performs a predetermined function based on motion information, which is in turn based on information about the movement of the device 100, and on the operation mode state of the device 100.

Referring to FIG. 1, the device 100 includes, but is not limited to, a sensing unit 101, a storage unit 102, a processor 103, and an information input/output unit 104. That is, the device 100 may include more or fewer components than those shown in FIG. 1.

For example, the device 100 may further include a component capable of detecting at least one piece of context information regarding the device 100, as shown in FIG. 14 and described below. The device 100 may also store at least one piece of context information about the device 100 by exchanging data among the above-described component capable of detecting the context information, the processor 103, and the storage unit 102. The at least one piece of context information will be described in more detail later with reference to FIG. 14. Information on the operation mode state of the device 100 may be included in the context information regarding the device 100.

The operational mode state of the device 100 may include, but is not limited to, a standby mode state and an active mode state.

The standby mode state of the device 100 may include a black screen state of the device 100, an idle state of an application processor included in the device 100, a deactivation state of a function related to the touch screen, a screen lock setting state of the device 100, and the like.

The deactivation state of the function related to the touch screen may include at least one of a deactivation state of the touch sensing function of the touch screen and a deactivation state of the display function of the touch screen. The touch sensing function deactivation state may indicate the off state of the touch sensing function of the device 100. The display function deactivation state of the touch screen may indicate the black screen state of the device 100.

The standby mode state of the device 100 may include an inactive state of components other than the sensing unit 101, the storage unit 102, and the processor 103 included in the device 100. The standby mode state of the device 100 may also include a deactivation state of the functions of the processor 103 other than the functions based on the interfaces with the sensing unit 101 and the storage unit 102.

The standby mode state of the device 100 may include a low power state in which only the sensing unit 101, the storage unit 102, and the processor 103 included in the device 100 are operated. That is, the standby mode state of the device 100 may include a low power state in which power is consumed only by the processor 103, the sensing unit 101, and the storage unit 102. The low power state may be, for example, a state in which only several milliwatts of power (or several milliamperes of current) are consumed, but the low power state is not limited thereto.

The standby mode state of the device 100 may also include a low power state in which no power is consumed by the components other than the sensing unit 101, the storage unit 102, and the processor 103 included in the device 100. The standby mode state of the device 100 may include a state in which less power is consumed than in the active mode state of the device 100.

The processor 103 may include a coprocessor capable of performing functions based on an interface between the processor 103 and the sensing unit 101 and an interface between the processor 103 and the storage unit 102. In this case, the above-described low power state may be referred to as a state in which power is consumed only by the coprocessor, the sensing unit 101, and the storage unit 102. The coprocessor may be, for example, a microcontroller unit operating at a low clock rate.
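As a rough illustration of this division of labor (a sketch only, using hypothetical names such as read_clockwise_rotation and wake_application_processor that do not come from the patent or any particular platform), the following Python fragment models a coprocessor-style loop that keeps only the sensing path active in the standby mode and hands off to the application processor once a registered rotation is recognized:

    import time

    REGISTERED_ROTATIONS = (90, 180, 270)   # clockwise rotations mapped to functions

    def read_clockwise_rotation():
        # Hypothetical sensing-unit read: degrees rotated clockwise since the last call.
        return 0.0

    def wake_application_processor(motion_info):
        # Hypothetical hand-off to the (otherwise inactive) application processor.
        print("waking application processor for:", motion_info)

    def coprocessor_loop(iterations=3, poll_interval_s=0.05):
        # Only this loop, the sensors, and storage are assumed to draw power in standby.
        accumulated = 0.0
        for _ in range(iterations):
            accumulated += read_clockwise_rotation()
            for target in REGISTERED_ROTATIONS:
                if abs(accumulated - target) < 10:   # 10-degree tolerance, chosen arbitrarily
                    wake_application_processor("clockwise %d degree rotation" % target)
                    accumulated = 0.0
            time.sleep(poll_interval_s)

    coprocessor_loop()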

The standby mode state of the device 100 may also include an activation state of the application processor included in the device 100. That is, when the screen lock setting state of the device 100 operates while the application processor is active and the screen lock setting state is defined as a standby mode state of the device 100, the standby mode state may include an activation state of the application processor.

An application processor may be included in the processor 103. The inclusion of an application processor in the processor 103 may indicate that the processor 103 includes both an application processor and the above-described coprocessor. In the case where the application processor and the coprocessor are included in the processor 103, the standby mode state of the device 100 described above may be a state in which the coprocessor included in the processor 103 is active and the application processor is inactive. The standby mode state of the device 100 is not limited to those described above.

The device 100 may be, for example, a smartphone, a smart TV, a personal computer (PC), a desktop PC, a notebook computer, a smart board, a tablet PC, a mobile device, a handheld device or handheld computer, a media player, an electronic book terminal, a personal digital assistant (PDA), a digital camera capable of sensing motion of the device, a digital consumer electronics (CE) device capable of sensing motion of the device, or the like.

For example, the device 100 may be a wearable device. The wearable device may take the form of, for example, a watch, glasses, a belt (e.g., a waist belt or hair band), various ornaments (e.g., a ring, a bracelet, an anklet, or a hairpin), earphones, a helmet, various body protectors (e.g., a knee protector or elbow protector), shoes, gloves, clothing, a hat, an aid for the disabled, and the like. A device wearable by a user includes a communication function and a data processing function. Devices worn by a user are not limited to those described above.

The sensing unit 101 detects a movement of the device 100. The movement of the device 100 may take various forms, including, for example, rotation-based movement together with its rotation direction (e.g., clockwise, counterclockwise, or about the ±z-axis), linear-direction-based movement such as lifting the device 100 vertically while it is held horizontally, and the movement distance of the device 100 according to the linear-direction-based movement. The movement of the device 100 may also include shaking-based movement of the device 100 as described above.

The sensing unit 101 includes at least one sensor for detecting the movement of the device 100. That is, the sensing unit 101 may include a gyro sensor capable of sensing rotation-based movement of the device 100 and an accelerometer sensor capable of sensing linear-direction-based movement of the device 100 and its movement distance.

For example, the sensing unit 101 may further include a magnetic field sensor capable of sensing the rotation direction of the device 100, an orientation sensor capable of sensing the tilting direction of the device 100, a sensor that senses position information of the device 100, a gravity sensor that senses the direction of gravity acting on the device 100, and a rotation speed sensor that senses the number of revolutions of the device 100. The sensors that may further be included in the sensing unit 101 are not limited to those described above.

The gyro sensor may be composed of three gyro sensors so as to sense the rotational angular velocity about each of the three axes (x-axis, y-axis, z-axis) of the device 100. In this case, the sensing unit 101 may convert the sensed x-axis rotation angle (roll), y-axis rotation angle (pitch), and z-axis rotation angle (yaw, horizontal rotation) of the device 100, together with the rotation direction, into electrical signals and output them.

The acceleration sensor may be configured to sense the acceleration variation along each of three axes (x-axis, y-axis, z-axis) or two axes (x-axis, y-axis) of the device 100. In this case, the sensing unit 101 may convert the result of sensing the linear acceleration of the device 100 and the tilt angle in each axis direction into an electrical signal and output it. The electrical signal representing the sensed result output from the sensing unit 101 is transmitted to the processor 103.

The sensing unit 101 may be referred to as a motion detection unit that detects the motion of the device 100. The signal output from the sensing unit 101 may be referred to as a sensing value regarding the motion of the device 100 or as information about the motion of the device 100.
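For a concrete picture of how such an electrical signal might become a usable sensing value, the short Python sketch below integrates hypothetical z-axis angular-velocity samples (degrees per second) into a yaw angle and reports the rotation direction; the sample data, the sign convention, and the function name are illustrative assumptions, not the patent's implementation:

    def yaw_from_gyro(samples_dps, dt):
        # Integrate z-axis angular-velocity samples (deg/s) taken every dt seconds.
        angle = sum(rate * dt for rate in samples_dps)
        # Assumed convention: positive rates mean clockwise rotation.
        direction = "clockwise" if angle >= 0 else "counterclockwise"
        return angle, direction

    # 0.5 s of samples at 100 Hz, each reporting 180 deg/s, is roughly a 90-degree turn.
    samples = [180.0] * 50
    print(yaw_from_gyro(samples, dt=0.01))   # (90.0, 'clockwise')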

The storage unit 102 stores at least one program and data related to the program. The at least one program stored in the storage unit 102 may include a program capable of controlling the device to detect the motion information of the device 100 based on the information about the motion of the device 100, to detect at least one piece of function information corresponding to the motion information, and to execute a function based on the detected function information. The data related to the program may include mapping information relating the above-described information about the motion of the device 100, the motion information of the device 100, and the function information of the device 100.

FIG. 2 is an example of a table for explaining mapping information regarding the operation mode state information of the device 100, the motion information of the device 100, and the function information of the device 100. The example table shown in FIG. 2 does not include information explaining the mapping relationship between the information about the motion of the device 100 and the motion information of the device 100. However, motion information of the device, such as a 90° clockwise rotation, a 180° clockwise rotation, or a 270° clockwise rotation, is determined based on the information about the motion of the device 100 output from the sensing unit 101.

The data related to the program stored in the storage unit 102 may include table information as shown in FIG. 2. Alternatively, the data related to the program stored in the storage unit 102 may omit the information related to the active mode state of the device 100 and include only the mapping information relating the information about the movement of the device 100 in the standby mode state, the motion information of the device 100, and the function information of the device 100.

Referring to FIG. 2, the operation mode state of the device 100 may include a standby mode state and an active mode state. The motion information of the device 100 may include information indicating a 90° clockwise rotation, information indicating a 180° clockwise rotation, and information indicating a 270° clockwise rotation.
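The mapping information of FIG. 2 can be pictured as a small lookup table keyed on the operation mode state and the motion information. The Python sketch below encodes the example pairings mentioned in this description (time check, quick note, and byte information viewer in the standby mode state; browser secret mode, application switching, and voice recorder in the active mode state); the data structure itself is an illustrative assumption, not the format actually stored in the storage unit 102:

    # (operation mode state, motion information) -> function information
    FUNCTION_MAP = {
        ("standby", "clockwise 90"):  "time check",
        ("standby", "clockwise 180"): "quick note",              # or universal queue / queue list
        ("standby", "clockwise 270"): "byte information viewer",
        ("active",  "clockwise 90"):  "browser secret mode",     # while the web browsing function runs
        ("active",  "clockwise 180"): "switch to another application",
        ("active",  "clockwise 270"): "voice recorder",
    }

    def lookup_function(mode, motion):
        # Returns None when no function is mapped to this mode/motion pair.
        return FUNCTION_MAP.get((mode, motion))

    print(lookup_function("standby", "clockwise 90"))   # time check
    print(lookup_function("active", "clockwise 270"))   # voice recorder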

The above-described standby mode state can be referred to as a sleep mode state, a deactivation mode state, or an idle state, but is not limited thereto. The above-described active mode state may be referred to as a running mode state, but is not limited thereto.

FIGS. 3A and 3B are views for explaining rotation-based motion information of the device 100. That is, FIG. 3A is an example describing motion information defined with respect to a state in which the device 100 is placed vertically. The motion information shown in FIG. 3A includes motion information for a 90° clockwise rotation into a landscape orientation, motion information for a 270° clockwise rotation into a landscape orientation, and upside-down motion information according to a 180° rotation.

The rotation-based motion information of the device 100 is not limited to the 90° rotation, 180° rotation, and 270° rotation shown in FIG. 3A. For example, the rotation-based motion information of the device 100 may be set based on information about movement along a rotation of 90° or less in the clockwise direction. That is, the rotation-based motion information of the device 100 can be set based on information about the motion of rotating 45° in the clockwise direction.

The rotation-based motion information of the device 100 may include motion information based on information about the motion that occurs as the device 100 rotates 90° counterclockwise. The rotation-based motion information of the device 100 may also include both motion information based on the motion generated as the device 100 rotates 90° clockwise and motion information based on the motion generated as the device 100 rotates 90° counterclockwise.

The rotation-based motion information of the device 100 may include motion information based on a motion of turning the device upside down (a 180° rotation) in the +z-axis direction with respect to the state in which the device 100 is placed vertically, and motion information based on a motion of turning the device upside down (a 180° rotation) in the -z-axis direction. The rotation-based motion information of the device 100 may also include motion information based on a motion of turning the device 100 upside down (a 180° rotation) about the ±z-axis with respect to a state in which the device 100 is placed horizontally.

Meanwhile, the standby mode state of the device 100 may be changed to the active mode state according to the motion information detected in the standby mode of the device 100.

The information input/output unit 104 may be in an inactive state in the standby mode of the device 100. For example, when the touch screen is included in the information input/output unit 104 and the standby mode of the device 100 is defined as the black screen state of the touch screen, the disabled state of the touch-screen-related functions, or the screen lock setting state of the device 100, the information input/output unit 104 may be inactive while the device 100 is in the standby mode.

The active mode state of FIG. 2 may include an activation state in which the sensing unit 101, the storage unit 102, the processor 103, and the information input/output unit 104 included in the device 100 are all activated. The active mode state of FIG. 2 may also include a state in which at least one other component is activated in addition to the sensing unit 101, the storage unit 102, the processor 103, and the information input/output unit 104 included in the device 100.

The active mode state of FIG. 2 may include a state in which power is consumed by the sensing unit 101, the storage unit 102, the processor 103, and the information input/output unit 104 included in the device 100. The active mode state of FIG. 2 may also include a state in which power is consumed by at least one other component in addition to the sensing unit 101, the storage unit 102, the processor 103, and the information input/output unit 104 included in the device 100.

The active mode state of FIG. 2 may include an active state of an application processor included in device 100. The activation state of the application processor may indicate a state where power is consumed by the application processor.

The active mode state of FIG. 2 may include a state in which more power is consumed than the power consumed by the device 100 in the standby mode state of FIG. 2.

The active mode state of FIG. 2 may include a state in which at least one of the applications, services, and content that are set in the device 100 or downloadable from the outside by the device 100 may be executed or requested to be executed.

The state in which at least one of an application, a service, and content is being executed may include a multitasking state. The state in which execution of at least one of an application, a service, and content can be requested may be a state in which an icon, a screenshot, or a user interface through which execution can be requested is displayed, or can be displayed, on the information input/output unit 104.

The relationship between the operation mode state information of the device 100, the motion information of the device 100, and the function information of the device 100 will now be described in more detail with reference to FIG. 2.

That is, when the motion information of the device 100 corresponds to information representing a 90° clockwise rotation and the operation mode state of the device 100 is the standby mode state, the predetermined function that can be executed by the device 100 is a time check function.

FIGS. 4A to 4J are examples for explaining a predetermined function of the device 100 executed in accordance with the operation mode state of the device 100 and the motion information of the device 100.

Referring to FIG. 4A, when the operation mode of the device 100 is the standby mode and a black screen is being displayed (401) through the information input/output unit 104, and a sensing value (information about the motion) indicating a clockwise rotation of the device 100 is received from the sensing unit 101, the processor 103 detects the motion information of the device 100 from the information stored in the storage unit 102 according to the received sensing value. The motion information detected at this time is motion information indicating a 90° clockwise rotation.

The processor 103 detects the function information from the storage unit 102 using the motion information of the device 100 detected in the standby mode of the device 100. Detecting information from the storage unit 102 may be referred to as an information read or an information search, but is not limited thereto.

Referring to FIG. 2, when the motion information of the device 100 indicates a 90° clockwise rotation and the operation mode of the device 100 is the standby mode, the function information detected from the storage unit 102 is time check function information. In accordance with the detected function information, the processor 103 controls the device 100 so that the time check function is executed. Accordingly, the black screen of the information input/output unit 104 is changed to the screen 402 including the time information.
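The sequence just described (sensing value in, motion information detected, function information read from storage, function executed) can be strung together as in the minimal Python sketch below; the rounding tolerance and the single table entry are assumptions made only to keep the example self-contained:

    STORED_MAP = {("standby", 90): "time check"}   # a one-entry stand-in for the stored table

    def detect_motion_info(sensing_value_degrees):
        # Snap the sensed rotation to the nearest stored rotation step (90/180/270).
        for step in (90, 180, 270):
            if abs(sensing_value_degrees - step) <= 15:
                return step
        return None

    def handle_sensing_value(sensing_value_degrees, operation_mode="standby"):
        motion_info = detect_motion_info(sensing_value_degrees)
        if motion_info is None:
            return None
        function_info = STORED_MAP.get((operation_mode, motion_info))
        if function_info == "time check":
            # Stand-in for replacing the black screen with screen 402 (time information).
            return "display first screen with current time"
        return function_info

    print(handle_sensing_value(92))   # display first screen with current time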

The screen provided on the black screen in accordance with the execution of the time check function may be referred to as a first screen. The first screen may represent, but is not limited to, the screen initially provided on the black screen. For example, the first screen may be the screen that is first provided on the black screen and may present various information on which the user may dwell for a certain time or more. The various information may include, for example, the gateway-related information to be described later, but is not limited thereto.

When the first screen is provided, the application processor included in the device 100 may be in an inactive state or in an active state. This can be determined according to the information provided on the first screen. For example, if the information provided via the first screen can be provided while the application processor included in the device 100 is deactivated, the application processor may be in a deactivated state. If the information provided via the first screen is provided in the active state of the application processor included in the device 100, the application processor may be in an active state.

When the screen 402 including the above-described time information is displayed, the application processor included in the device 100 may be in an inactive state. The application processor may be included in the processor 103 as described above and set to the inactive state or the active state. However, the application processor may also be provided outside the processor 103.

Referring to FIG. 2, the predetermined function that can be executed by the device 100 when the operation mode state of the device 100 is the standby mode and the motion information of the device 100 indicates a 180° clockwise rotation is a quick note function. FIG. 4B shows an example of the corresponding screen.

Referring to FIG. 4B, when the operation mode of the device 100 is the standby mode and a black screen is being displayed (403) through the information input/output unit 104, and a sensing value indicating a 180° clockwise rotation is received from the sensing unit 101, the processor 103 recognizes the motion information of the device 100 as a 180° clockwise rotation. The recognition of motion information according to the sensing value by the processor 103 may be based on an operation of detecting the motion information from the storage unit 102 using the received sensing value. Thus, motion information recognition by the processor 103 may be referred to as motion information detection, but is not limited thereto.

The processor 103 may detect the operation mode state of the device 100 after detecting the motion information of the device 100. The processor 103 can then detect, from the storage unit 102, the predetermined function information corresponding to the detected motion information of the device 100 and the detected operation mode state of the device 100.

Referring to FIG. 2, when the operation mode state of the device 100 is the standby mode and the motion information of the device 100 is detected to indicate a 180° clockwise rotation, the detected function information is quick note function information. Accordingly, the processor 103 executes the quick note function, and the screen of the information input/output unit 104 can be changed from the black screen to the note screen 404 as shown in FIG. 4B.

The recording icon and the camera icon included in the note screen 404 shown in FIG. 4B can be used for the quick note function. That is, when a user's command selecting the voice recording icon is input on the screen 404, the recorded content can be displayed on the note screen. For example, when the user's voice signal "I am going to school at 8 am" is input, the processor 103 converts the input voice signal into text information and displays the converted text information on the note screen. As a result, the user can see the message "I am going to school at 8 am" on the note screen.

For this purpose, the processor 103 may include a function of converting the audio signal received through the information input / output unit 104 into textual information that can be displayed. At this time, the font of the text information to be displayed can be set in advance. The text information to be displayed may be stored in the storage unit 102 according to a user's storage request.
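A minimal sketch of this flow is shown below in Python; speech_to_text is a hypothetical stand-in for whatever speech-recognition step the processor 103 would actually use, and the class layout is an assumption made for illustration, not a structure taken from the patent:

    def speech_to_text(audio_signal):
        # Hypothetical recognition step; a real device would call an actual recognizer here.
        return "I am going to school at 8 am"

    class QuickNote:
        def __init__(self, font="preset-font"):
            self.font = font        # the display font can be set in advance
            self.lines = []

        def add_voice_memo(self, audio_signal):
            text = speech_to_text(audio_signal)
            self.lines.append(text)                 # shown on the note screen in the preset font
            return text

        def save(self, storage_unit):
            storage_unit.append(list(self.lines))   # stored only on the user's save request

    storage_unit = []
    note = QuickNote()
    note.add_voice_memo(audio_signal=b"...")
    note.save(storage_unit)
    print(storage_unit)   # [['I am going to school at 8 am']]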

The user can take notes on the note screen using a stylus pen, but can also use the recording function to capture what the user wants to note. Accordingly, the user can jot down ideas more quickly and can use the quick note function of the device 100 without a stylus pen.

In addition, when the stylus pen is attached to the device 100, the quick note function of the device 100 can be used without removing the stylus pen. When the recording icon is selected, the processor 103 can display the execution screen corresponding to the recording application through the information input / output unit 104 while executing the recording application.

When a command to select a camera icon is input on the screen 404, the captured image can be displayed on the quick note screen using the camera. For this purpose, the processor 103 may perform a function of superimposing the captured image on the quick note screen through the camera.

For example, the processor 103 can divide the display between a screen according to the execution of the camera application and the quick note screen. For example, an image captured through the camera may be displayed on the left side, and the note screen may be displayed on the right side so that a message related to the captured image can be entered through the note screen. The areas in which the captured image and the note screen are displayed are not limited thereto. The captured image may be a still image or a moving image.

The sizes of the area in which the captured image is displayed and of the quick note screen can be set in advance. Either only the image captured according to the user's command can be stored in the storage unit 102, or the captured image and the message input through the note screen can be stored together in the storage unit 102. Accordingly, the user can quickly capture and store a desired image using the quick note function, or store a related message together with the captured image.

Once the captured image is saved, execution of the camera application may be terminated automatically, or the camera application may be terminated upon user request; this behavior depends on the configuration of the device 100. For the recording function and the capturing function, the information input/output unit 104 may include a microphone and a camera.

When the corresponding application is executed using the recording icon or the camera icon included in the quick note screen 404 described above, the screen 404 may be defined as a gateway screen for executing the recording application or the camera application.

The gateway screen may include notification information indicating the execution of the function before execution of the predetermined function. The gateway screen may also include selection information for selecting an execution mode for at least one function. The selection information can be defined as the recording icon and the camera icon described above. The gateway screen will be described later in more detail with reference to FIG. 17 and FIGS. 18A to 18F.
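One possible way to represent such a gateway screen in software is sketched below; the field names and the icon-to-action mapping are assumptions chosen for illustration, not structures defined by the patent:

    from dataclasses import dataclass, field

    @dataclass
    class GatewayScreen:
        notification: str                                  # notification information about the function
        selections: dict = field(default_factory=dict)     # selection information: icon -> execution mode

    quick_note_gateway = GatewayScreen(
        notification="Quick note will be executed",
        selections={"recording icon": "launch recording application",
                    "camera icon": "launch camera application"},
    )

    def on_icon_selected(screen, icon):
        return screen.selections.get(icon, "no execution mode mapped")

    print(on_icon_selected(quick_note_gateway, "camera icon"))   # launch camera application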

When the quick note function indicates a function of quickly executing the note application installed in the device 100, the quick note function can be based on the note application installed in the device 100.

In the quick note function, the recording icon and the camera icon can be selected at the same time. In this case, the captured image and the text information of the audio signal corresponding to the captured image can be displayed together on the note screen. Selecting the recording icon and the camera icon simultaneously can be performed by multi-touching the two icons. The simultaneous selection may also be made according to a setting of the device 100 in which the camera icon is automatically selected when the recording icon is selected, or a setting in which the recording icon is automatically selected when the camera icon is selected.

When the operation mode state of the device 100 is the standby mode and the motion information based on the information about the motion of the device 100 indicates a 180° clockwise rotation, the predetermined function information may be universal queue function information. This corresponds to the example shown in FIG. 4C.

FIG. 4C shows that, when the operation mode of the device 100 is the standby mode, the black screen 405 is displayed on the information input/output unit 104, and the motion information based on the information about the motion of the device 100 indicates a 180° clockwise rotation, the universal queue function is executed.

That is, the predetermined function information read from the storage unit 102 by the processor 103 is the universal queue function information, and the processor 103 controls the device 100 so that the screen of the information input/output unit 104 is changed from the black screen to the information screen 406. The universal queue may be stored in the storage unit 102.

The predetermined function information that can be executed by the device 100 when the operation mode state of the device 100 is the standby mode and the motion information of the device 100 indicates a 180° clockwise rotation may also be queue list display function information.

When the predetermined function information is the queue list display function information, as the queue list display function is executed by the processor 103, the screen of the information input/output unit 104 is changed from the black screen 407 to the queue list screen 408, as shown in FIG. 4D. The queue list is stored in the storage unit 102, and information based on the queue list can be provided from the storage unit 102 or from a server (not shown). Screen 409 in FIG. 4D is a screen for downloading, from a server (not shown), the information of a queue selected from the queue list.

In FIG. 2, when the operation mode state of the device 100 is the standby mode and the motion information of the device 100 indicates a 270° clockwise rotation, the predetermined function information that can be executed by the device 100 is byte information viewer function information.

The byte information viewer function is a function that displays necessary information according to a screen or environment set in the device 100. For example, the information that can be provided using the byte information viewer function may include ticket information, security card information, billing barcode information, memo information, coupon information, and the like previously designated by the user, but is not limited thereto.

Using the byte information viewer function, the user can quickly check, for example by swiping, information left on the clipboard while using a specific application, or information placed in the byte information viewer through a specific interaction such as a screen capture.

FIG. 4E is an example of a screen for executing the byte information viewer function. That is, when the operation mode of the device 100 is the standby mode and the black screen is displayed (410) through the information input/output unit 104, and a sensing value indicating a clockwise rotation of the device 100 is received from the sensing unit 101, the processor 103 recognizes the corresponding motion information of the device 100.

When the motion information is recognized, the processor 103 detects predetermined function information from the storage unit 102 using the detected operation mode state of the device 100 and the motion information of the device 100. Referring to FIG. 2, the predetermined function information detected from the storage unit 102 is byte information viewer function information. The processor 103 executes the byte information viewer function, and the screen of the information input/output unit 104 is changed from the black screen to the preset information screen 411.

Referring to FIG. 2, when the device 100 is in the active mode, the motion information of the device 100 indicates a 90° clockwise rotation, and the function currently being performed by the device 100 is a web browsing function, the predetermined function information that can be executed by the device 100 is browser secret mode execution function information. FIG. 4F is an example of a corresponding screen.

Referring to FIG. 4F, when the operation mode state of the device 100 indicates that the web browsing function is being executed (412) and a 90° clockwise rotation of the device 100 is detected based on the sensing value received from the sensing unit 101, the processor 103 detects predetermined function information from the storage unit 102 using the motion information of the device 100 and the operation mode state of the device 100. The detected predetermined function information is browser secret mode execution function information. Accordingly, the processor 103 executes the browser secret mode. In the browser secret mode, log information such as the user's subsequent search history, surfing history, and login information is not recorded in the device 100.

When a sensing value indicating a 90° counterclockwise rotation of the device 100 is received from the sensing unit 101 while the secret mode is being executed (that is, when a sensing value indicating a return to the reference orientation is received), the processor 103 can turn off the secret mode of web browsing. Both the secret mode execution function and the secret mode release function performed by the processor 103 involve reading the motion information from the storage unit 102 using the received sensing value and then reading the predetermined function information from the storage unit 102 using the read motion information and the operation mode state of the device 100.

In FIG. 2, if the operation mode state of the device 100 indicates the active mode state, the motion information of the device 100 indicates a 180° clockwise rotation, and the device 100 is currently executing an application, the predetermined function that can be executed by the device 100 is a function of switching to another application. FIG. 4G shows an example of the corresponding screen.

Referring to FIG. 4G, when the operation mode of the device 100 indicates that a specific social network service (SNS) is being executed (414) and a sensing value indicating that the device 100 has rotated 180° clockwise is received, the processor 103 detects the motion information of the device 100 from the storage unit 102 using the sensing value. The processor 103 then detects predetermined function information from the storage unit 102 using the detected motion information of the device 100 and the operation mode state of the device 100. The predetermined function information detected in the case of FIG. 4G is the function information for switching to another application.

Thus, the processor 103 performs the function of switching to another application (415, 416). The other application described above may include, but is not limited to, a preconfigured application, the application that was executed immediately before, or another application being multitasked. If there are multiple applications being multitasked, the other application may be the most recently executed one.

Upon application switching, the processor 103 may switch the screen immediately, or may provide an animation effect as shown in screens 415 and 416 of FIG. 4G. Screens 415 and 416 of FIG. 4G show an accordion effect in which a heavy stone is displayed at the lower side of the device 100; when a sensing value indicating that the device 100 has rotated 180° clockwise is received from the sensing unit 101, the processor 103 displays a screen in which the original application screen is gradually compressed while the heavy stone gradually falls.

The animation effect is not limited to the accordion effect described above. For example, the animation effect may include various effects, such as an effect in which a blurred image becomes clear, an effect in which the image moves from top to bottom, and an effect that rotates the image by the same angle as the rotation angle of the device 100. These various animation effects can be set by the user in advance. Before being set by the user, each animation effect can be displayed in advance in a demonstration form.

Referring to FIG. 4H, when the operation mode state of the device 100 indicates that a particular application (e.g., Facebook) is being executed (417) and it is determined that the device 100 has been rotated 180° clockwise, the device switches to a predefined application (e.g., KakaoTalk) (418). If the device 100 is rotated 180° clockwise again (419) while the predefined application (e.g., KakaoTalk) is running, the device switches back to the particular application (e.g., Facebook) that was previously executed (420). The motion information of the device 100, the operation mode state information of the device 100, and the predetermined function information stored in the storage unit 102 can be set so that the application executed by the device 100 is switched in this way. The rotation of the device 100 in screen 419 can also be set to a 180° counterclockwise rotation.

The predetermined function information of the device 100 according to the predetermined motion information of the device 100 may also be set to a function of switching to another service, other content, or another application.

When the operation mode state of the device 100 indicates that a service or an application is being executed and a plurality of accounts are set for the currently executed service or application, the predetermined function information can be set to a function of changing the account currently in use to another account.

In FIG. 2, if the device 100 is in the active mode state and is displaying a specific list, and the motion information of the device 100 indicates a 180° clockwise rotation, the predetermined function may be a list sorting order change function as shown in FIG. 4I. That is, if the motion information of the device 100 indicates a 180° clockwise rotation while a list sorted in ascending order is displayed as in screen 421 of FIG. 4I, the processor 103 can change the display to the list screen 422 sorted in descending order.

In FIG. 2, if the operation mode state of the device 100 is the active mode state and the motion information of the device 100 indicates a 270° clockwise rotation, a predetermined function can be set to be performed regardless of the function currently being performed by the device 100. For example, when the device 100 is displaying a home screen and a sensing value indicating that the device 100 has been rotated 270° clockwise is received, the device 100 can execute the voice recorder function according to the stored predetermined function information.

The predetermined motion information of the device 100 and the predetermined function information of the device 100 according to the operation mode state of the device 100 are not limited to those defined in FIG. 2. For example, the predetermined motion information of the device 100 may be defined only as upside-down motion information (a 180-degree clockwise rotation of the device 100). Alternatively, the predetermined motion information of the device 100 may be defined only as motion information 423 in which the device 100, held horizontally, is lifted in the vertical direction, as shown in FIG. 4J. The predetermined function information corresponding to the motion 423 shown in FIG. 4J may match one of the functions defined in FIG. 2, but the matched function information is not limited thereto.

The predetermined motion information of the device 100 is not limited to what has been described above. For example, the predetermined motion information of the device 100 may include motion information in which the device 100 held in the horizontal direction is lifted in the vertical direction, motion information in which the device 100 is moved in the vertical direction, motion information in which the device 100 is moved in the horizontal direction, and motion information in which the device 100 is moved by a predetermined distance to the left and/or right. The predetermined distance may be defined as a distance greater than or equal to the minimum distance at which the sensing unit 101 can recognize that the device 100 has been moved to the left or right. The minimum distance may be determined according to the sensors included in the sensing unit 101.

FIG. 4J illustrates a case where the operation mode state of the device 100 is the standby mode state and a sensing value indicating that the device 100, held in the horizontal direction, has been lifted in the vertical direction is generated by the sensing unit 101. For example, the sensing value indicating that the device 100 held horizontally has been lifted vertically may be set as a case where the x-axis variation is less than 30, the y-axis variation is more than 90, and the acceleration variation is greater than T, where T may be an integer; however, the sensing value is not limited thereto. That is, the sensing value may be determined according to the sensors included in the sensing unit 101.
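
The threshold test in the example above can be sketched as follows; the sensed-delta container and the constant T are assumptions made only for illustration, since the actual sensing values depend on the sensors included in the sensing unit 101.

```kotlin
// Assumed container for per-axis variation and acceleration change.
data class SensedDelta(val dx: Float, val dy: Float, val dAccel: Float)

const val T = 1.5f  // assumed acceleration-variation threshold (an integer in the example)

// x-axis variation below 30, y-axis variation above 90, acceleration change above T.
fun isVerticalLiftWhileHorizontal(d: SensedDelta): Boolean =
    d.dx < 30f && d.dy > 90f && d.dAccel > T
```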

The at least one program stored in the storage unit 102 may include an application execution program, a service execution program, a content storage program, and a content playback program.

The processor 103 executes at least one program stored in the storage unit 102 as described above and can recognize the motion information of the device 100 using the information sensed by the sensing unit 101 and the information stored in the storage unit 102.

The processor 103 may detect the operation mode state of the device 100. When information on the operation mode state of the device 100 is stored in a flag register of the processor 103 or in a predetermined area of the storage unit 102, the processor 103 can determine the operation mode state of the device 100 by reading the information from the flag register or from the predetermined area of the storage unit 102. Determining the operation mode state of the device 100 may be referred to as detecting the operation mode state of the device 100. The processor 103 may include a RAM that contains the above-described flag register.

Processor 103 may be referred to as one or more processors as it controls the overall operation of device 100. The processor 103 may control the operation of the sensing unit 101, the storage unit 102, and the information input / output unit 104. The processor 103 may be referred to as a controller, a microprocessor, a digital signal processor, or the like.

As shown in FIG. 5, the processor 103 may include a low power processor 510 and an application processor 520 for controlling the sensing unit 101 based on a Seamless Sensing Platform (SSP). The low power processor 510 may be defined as a coprocessor, and the application processor 520 may be defined as a main processor. FIG. 5 is a diagram for explaining the relationship between the SSP-based sensing unit 101 and the processor 103.

The low power processor 510 shown in FIG. 5 may be configured as a low power MCU that operates in the standby mode state of the device 100 mentioned in FIG. 1. The low power processor 510 may include a sensor hub 511 and an SSP manager 512. First to n-th sensors 501_1 to 501_n may be attached to the sensor hub 511. The first to n-th sensors 501_1 to 501_n are included in the sensing unit 101. The SSP manager 512 may be included in the framework of the application processor 520.

The sensor hub 511 can receive sensed values from the first to n-th sensors 501_1 to 501_n. The first to n-th sensors 501_1 to 501_n and the sensor hub 511 can transmit and receive sensed values based on communication such as Bluetooth Low Energy; however, the communication between the first to n-th sensors 501_1 to 501_n and the sensor hub 511 is not limited to low-power Bluetooth communication.

When the operation mode state of the device 100 indicates the standby mode state, the application processor 520 may also be set to the standby mode state. When a situation occurs in which the application processor 520, while in the standby mode state, needs to be changed to the active mode state, the sensor hub 511 communicates with the SSP manager 512 using an SSP-based data communication protocol. The case where the application processor 520 is to be changed from the standby mode state to the active mode state can be determined according to the predetermined function information to be executed by the device 100 as mentioned in the above embodiments.

The operation between the sensor hub 511 and the SSP manager 512 is as follows. The sensor hub 511 transmits to the SSP manager 512 an interrupt signal indicating that there is data to be transmitted (513). The SSP manager 512 transmits to the sensor hub 511 a signal requesting the type and length of the data to be transmitted by the sensor hub 511 (514). The sensor hub 511 transmits the type and length of the data to be transmitted to the SSP manager 512 (515).

The SSP manager 512 then sends a start-to-read message to the sensor hub 511 (516). Upon receiving the start-to-read message, the sensor hub 511 processes the sensed value into a predetermined packet and transmits it to the SSP manager 512 (517). The sensed value transmitted from the sensor hub 511 to the SSP manager 512 may be described as being transmitted as an electrical signal.
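
The exchange in operations 513 to 517 can be summarized, purely as an illustrative sketch with hypothetical message types, as follows; this is not an actual SSP API.

```kotlin
// Hypothetical message types for the sensor hub / SSP manager hand-shake.
sealed class SspMessage {
    object Interrupt : SspMessage()                                          // 513: data pending
    object RequestTypeAndLength : SspMessage()                               // 514
    data class TypeAndLength(val type: Int, val length: Int) : SspMessage()  // 515
    object StartToRead : SspMessage()                                        // 516
    data class Packet(val payload: ByteArray) : SspMessage()                 // 517
}

class SensorHub(private val sensedValue: ByteArray) {
    fun handle(msg: SspMessage): SspMessage? = when (msg) {
        SspMessage.RequestTypeAndLength ->
            SspMessage.TypeAndLength(type = 1, length = sensedValue.size)
        SspMessage.StartToRead ->
            SspMessage.Packet(sensedValue)   // sensed value framed as a predetermined packet
        else -> null
    }
}

// The manager reacts to the interrupt (513) by asking what the hub wants to send,
// then issues the start-to-read message and receives the packet.
fun sspManagerReads(hub: SensorHub): ByteArray? {
    val meta = hub.handle(SspMessage.RequestTypeAndLength) as? SspMessage.TypeAndLength ?: return null
    val packet = hub.handle(SspMessage.StartToRead) as? SspMessage.Packet ?: return null
    return packet.payload.copyOf(meta.length)
}
```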

In the standby mode state of the device 100, the SSP manager 512 detects motion information from the information stored in the storage unit 102 using the sensed value received from the sensor hub 511, as mentioned above. The SSP manager 512 can then detect predetermined function information of the device 100 based on the detected motion information and control the function of the device 100 based on the detected function information.

As mentioned in FIG. 1, the SSP manager 512 detects the motion information from the information stored in the storage unit 102 using the sensed value received from the sensor hub 511, and checks the operation mode state of the device 100. The SSP manager 512 then detects the predetermined function information of the device 100 from the storage unit 102 using the checked operation mode state and the detected motion information, and can control the function of the device 100 accordingly.

The information input / output unit 104 may include a display unit capable of outputting screen information according to the execution of the predetermined function by the processor 103. The display unit may include a touch screen to receive touch-based input information of the user. In the standby mode state of the device 100 as described above, the information input / output unit 104 may be in a black screen state, in a state in which power is not consumed, in a state in which the touch-screen related function is disabled, or in a state in which the display function is inactive, but is not limited thereto.

FIG. 6A is a flowchart illustrating a method of performing a function of a device according to an exemplary embodiment of the present invention. FIG. 6A shows a case where a predetermined function is executed according to the operation mode state of the device 100 when the motion information of the device 100 corresponds to predetermined motion information. The flowchart of FIG. 6A can be performed by the processor 103 of the device 100.

In step S601, the processor 103 receives information about the motion of the device 100. The information about the motion of the device 100 can be received from the sensors included in the sensing unit 101 mentioned in FIG. 1, on the basis of the Seamless Sensing Platform mentioned in FIG. 5.

Based on the information about the motion of the device 100, the processor 103 detects predetermined motion information (S602). When the motion information of the device 100 is detected, the processor 103 detects information about the operation mode state of the device 100 (S603). Detecting information about the operation mode state of the device 100 may be referred to as checking the operation mode state of the device 100.

If the operation mode state of the device 100 is the active mode state, the information about the operation mode state of the device 100 may include information indicating what state the device 100 is currently in. For example, the information regarding the operation mode state of the device 100 may include, but is not limited to, information indicating that an application is running and which application is running. Detecting the operation mode state of the device 100 in step S603 may be performed in the manner described above with respect to the processor 103 in FIG. 1.

In step S604, the processor 103 executes a predetermined function based on the information on the operation mode state of the device 100 and the motion information of the device 100. The predetermined function can be detected from the storage unit 102 and executed as described with reference to FIG. 2 and FIGS. 4A to 4J.
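
One way to picture steps S601 to S604 is the following self-contained sketch, in which a mapping of (operation mode state, motion) pairs to functions stands in for the information kept in the storage unit 102; the names and example table entries are illustrative assumptions only, drawn loosely from the examples above.

```kotlin
enum class DeviceMotion { ROTATE_180_CW, ROTATE_270_CW, LIFT_VERTICAL }
enum class DeviceModeState { STANDBY, ACTIVE_HOME, ACTIVE_LIST }

// Illustrative counterpart of the mapping kept in the storage unit (cf. FIG. 2).
val predeterminedFunctions: Map<Pair<DeviceModeState, DeviceMotion>, String> = mapOf(
    (DeviceModeState.STANDBY to DeviceMotion.ROTATE_180_CW) to "launchFrontCamera",
    (DeviceModeState.ACTIVE_LIST to DeviceMotion.ROTATE_180_CW) to "reverseListSortOrder",
    (DeviceModeState.ACTIVE_HOME to DeviceMotion.ROTATE_270_CW) to "startVoiceRecorder"
)

fun onSensingValue(
    detectPredeterminedMotion: () -> DeviceMotion?,   // S601 + S602
    readOperationModeState: () -> DeviceModeState,    // S603
    execute: (String) -> Unit                         // S604
) {
    val motion = detectPredeterminedMotion() ?: return        // ignore motions that are not predetermined
    val state = readOperationModeState()                      // check the operation mode state
    predeterminedFunctions[state to motion]?.let(execute)     // run the mapped function, if any
}
```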

FIG. 6B is a flowchart illustrating a method of performing a function of a device according to another embodiment of the present invention. FIG. 6B shows a case where the operation mode state of the device 100 is the standby mode state. The flowchart of FIG. 6B may be performed by the processor 103 of the device 100.

Referring to FIG. 6B, when the operation mode state of the device 100 is the standby mode state, the processor 103 receives information on the motion of the device 100 from the sensing unit 101. In this case, the processor 103 may be the low power processor 510 of FIG. 5. Therefore, the processor 103 can receive the information on the motion of the device 100, generated as the device 100 moves, through the data communication protocol based on the Seamless Sensing Platform (SSP).

In the standby mode state of the device 100 (S605), the processor 103 receives information about the motion of the device 100 from the sensing unit 101 (S606). The processor 103 detects motion information from the storage unit 102 using the received motion information (S607). When the motion information is detected, the processor 103 detects at least one piece of function information corresponding to the detected motion information from the storage unit 102 (S608).

Depending on the structure in which the motion information and the at least one piece of function information are stored in the storage unit 102, the motion information detection and the function information detection described above may be regarded as a single detection of at least one piece of function information. For example, when the processor 103 detects at least one piece of function information directly from the storage unit 102 using the motion information received from the sensing unit 101, the two detection steps described above may be referred to collectively as the detection of at least one piece of function information.

The processor 103 controls the device 100 to execute a function based on the at least one piece of function information detected from the storage unit 102 (S609). The operation flowchart shown in FIG. 6B can be performed as described with reference to FIGS. 4A to 4E above. In step S609, before executing the function, the processor 103 may display a gateway screen as shown in FIG. 17 and FIGS. 18A to 18F, which will be described later.

Fig. 7 is a flow chart for explaining a process that may be included in step S604 of Fig. 6A or step S609 of Fig. 6B described above.

FIG. 7 relates to a process for executing a predetermined function of the device 100 in accordance with a result of determining whether the user is looking at the device 100.

That is, in step S701, the processor 103 determines whether the user is looking at the device 100 before executing the predetermined function. Whether the user is looking at the device 100 may be determined depending on whether the user's face is included in an image obtained using a camera included in the device 100, or depending on whether the user's face included in the obtained image is directed toward the device 100; however, the determination method is not limited to these examples.

The processor 103 may determine whether the user's face is included in the acquired image using a face region extraction technique. The processor 103 may determine whether the face of the user included in the acquired image is directed toward the device 100 using a feature value detection method. The feature value detection method detects feature values such as the distance between the user's eyes, the thickness of the nose, the height and shape of the cheekbones, and the ratio of the width and height of the forehead. The processor 103 may use the detected feature values to estimate the direction in which the face of the user in the image is oriented, and thereby determine whether the user is looking at the device 100.

Instead of the above-described method of recognizing the face of the user, it is possible to detect the pupil of the user from the obtained image, estimate the movement of the pupil, and determine whether the user is looking at the device 100.
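
The check of step S701 might be organized as in the sketch below, where the face-region extraction, the feature-value based direction estimate, and the pupil tracking are represented by hypothetical helper interfaces rather than a specific vision library.

```kotlin
// Hypothetical frame and estimator abstractions; not a specific vision API.
class Frame(val pixels: ByteArray)

interface GazeEstimator {
    fun containsFace(frame: Frame): Boolean        // face-region extraction
    fun faceTowardDevice(frame: Frame): Boolean    // feature-value based direction estimate
    fun pupilTowardDevice(frame: Frame): Boolean   // alternative: pupil-movement estimate
}

// Step S701: decide whether the user is looking at the device before executing the function.
fun isUserLookingAtDevice(frame: Frame, estimator: GazeEstimator, usePupil: Boolean = false): Boolean {
    if (!estimator.containsFace(frame)) return false
    return if (usePupil) estimator.pupilTowardDevice(frame) else estimator.faceTowardDevice(frame)
}
```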

If it is determined in step S701 that the user is viewing the device 100, the processor 103 controls the device 100 to execute the function based on the predetermined function information (S702).

If it is determined in step S701 that the user is not looking at the device 100, the processor 103 controls the device 100 not to execute the function based on the predetermined function information (S703).

The processor 103 may determine whether the user is looking at the device 100 using the face region extraction and feature value detection described above, or using the pupil motion estimation method. At least one associated application program for this purpose may be stored in the storage unit 102 and provided to the processor 103 at the request of the processor 103. The processor 103 may execute the associated application program provided from the storage unit 102 to determine whether the user is looking at the device 100 as described above.

FIG. 8 shows an example of a screen in which a predetermined function is activated by the device 100 according to steps S701 and S702 of FIG. 7. That is, when the operation mode state of the device 100 is the standby mode state and the screen is in the black screen state as shown in screen 810 of FIG. 8, and it is recognized from the sensing value that the device 100 has been rotated 180 degrees clockwise, the processor 103 activates the function of the front camera 821. As shown in screen 820, the processor 103 temporarily switches the operation mode state of the device 100 to the active mode state in order to execute the application related to the front camera 821, while the information input / output unit 104 can maintain the black screen state. Alternatively, the processor 103 may output a screen showing the executed camera mode through the information input / output unit 104.

The processor 103 may determine whether the user is looking at the device 100 by applying the face recognition or pupil motion estimation described above to the image 822 obtained using the front camera 821. If it is determined that the user is looking at the device 100, the processor 103 performs the predetermined function that can be performed according to the operation mode state of the device 100 and the motion information of the device 100 (830).

FIG. 9 is an operational flowchart for explaining another process that may be included in step S604 of FIG. 6A or step S609 of FIG. 6B described above. FIG. 9 relates to a process in which the predetermined function is executed by the device 100 upon receiving preset gesture-based information of the user corresponding to an execution request for the predetermined function.

That is, in step S901, the processor 103 determines whether gesture-based information of the user corresponding to an execution request for the predetermined function has been received before executing the predetermined function. The gesture-based information of the user may be predefined as a two-point touch on the touch screen of the device 100 (for example, a touch by the thumb of each hand) while the device 100 is held horizontally with both hands, as shown in FIG. 10. Alternatively, when the device 100 is held horizontally in one hand, it may be predefined as a single touch (for example, one touch using the thumb of the hand holding the device 100).

The state of the device 100 is not limited to the horizontally held state described above. For example, the state of the device 100 may include a vertically held state. The state in which the device 100 is held horizontally may indicate a state in which the bezel surface of the device 100 corresponding to the height information included in the size information of the device 100 is approximately perpendicular to the ground. The state in which the device 100 is held vertically may indicate a state in which the bezel surface of the device 100 corresponding to the width information included in the size information of the device 100 is approximately perpendicular to the ground. The state in which the bezel surface of the device 100 is approximately perpendicular to the ground can be defined with a sufficient error range taken into account.

The above-described two-point touch and single touch can be recognized using information about a predicted touch area stored in advance. The information on the predicted touch area may include an error range. The information on the predicted touch area can be set in advance using the size information of the device 100 and the size information of the user's hand. The size information of the user's hand may vary according to the user's body size and can be determined using a scanned image of the user's hand. The size information of the device 100 may be taken from the specification information of the device 100 stored in advance.

The information on the predicted touch area can also be set through a touch area registration process performed by the user. For example, the user can reproduce the corresponding grip situation with the device 100, register the area touched in that situation as the predicted touch area, and then use it. The user can set a tolerable error range when registering the touch area on the screen.
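
A minimal sketch of matching reported touches against the predicted touch areas, assuming simple rectangular areas and a single error margin, might look as follows; in the description above these areas are derived from the device size, the hand size, or the registration step.

```kotlin
// Hypothetical geometry types for the predicted touch areas and reported touches.
data class PredictedArea(val left: Float, val top: Float, val right: Float, val bottom: Float)
data class TouchPoint(val x: Float, val y: Float)

fun matchesPredictedArea(touch: TouchPoint, area: PredictedArea, errorMargin: Float): Boolean =
    touch.x in (area.left - errorMargin)..(area.right + errorMargin) &&
    touch.y in (area.top - errorMargin)..(area.bottom + errorMargin)

// Two-handed grip: each predicted thumb area must receive at least one of the reported touches.
fun isTwoThumbTouch(
    touches: List<TouchPoint>,
    leftArea: PredictedArea,
    rightArea: PredictedArea,
    margin: Float
): Boolean =
    touches.any { matchesPredictedArea(it, leftArea, margin) } &&
    touches.any { matchesPredictedArea(it, rightArea, margin) }
```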

The gesture-based information of the user may be stored in the storage unit 102 so as to be mapped to information on a corresponding operation mode state of the device 100 and motion information of the device 100.

The gesture-based information of the user stored in the storage unit 102 may include at least coordinate information of a touch surface. The coordinate information of the touch surface may include, based on the size of the device 100, coordinate information of the touch surfaces predicted to receive the two thumb-based touches of both hands when the user holds the device 100 with both hands, coordinate information of the touch surface expected to receive a single thumb-based touch of the right hand when the user holds the device 100 with the right hand, and similar coordinate information for other expected touches, but is not limited thereto. The coordinate information of the touch surface may include coordinate information indicating a two-dimensional screen area. If it is determined in step S901 that an execution request for the predetermined function has been received based on the above-described gesture of the user, the processor 103 executes the predetermined function (S902).

If it is determined in step S901 that an execution request for the predetermined function has not been received based on the above-described gesture of the user, the processor 103 does not execute the predetermined function (S903).

FIG. 10 shows an example of a screen in which a predetermined function is executed by the device 100 according to steps S901 and S902 of FIG. 9. Referring to FIG. 10, when the operation mode state of the device 100 is the standby mode state and the screen is in the black screen state, and a sensing value 1010 indicating that the device 100 has moved in the vertical direction by a predetermined distance or more is received, the processor 103 recognizes the motion information of the device 100.

After recognizing the motion information of the device 100 and then recognizing that a multi-touch based on the thumbs of both hands holding the device 100 has occurred (1020), the processor 103 executes the predetermined function determined according to the multi-touch and the motion information of the device 100 (1030). In the case of FIG. 10, the predetermined function is to resume the application that was being executed just before the operation mode state of the device 100 was set to the standby mode state.

FIG. 11 is a functional block diagram of a device 1100 according to another preferred embodiment of the present invention. Referring to FIG. 11, the device 1100 includes an information input unit 1101, a sensing unit 1102, a touch screen 1103, a camera 1104, an audio input unit 1105, an audio output unit 1106, a storage unit 1107, a wireless communication unit 1108, a wired communication unit 1109, a processor 1110, and a power supply unit 1114. The configuration of the device 1100 is not limited to that shown in FIG. 11; that is, the device 1100 may include more or fewer components than those shown. For example, the device 1100 may not include the wired communication unit 1109.

The device 1100 illustrated in FIG. 11 may be configured to perform a predetermined function based on the motion of the device 1100 in the standby mode state of the device 1100 by using the first processor 1111. The first processor 1111 may be a low-power processor as described above, and the second processor 1112 may be defined as an application processor as described above, but they are not limited thereto. The relationship between the first processor 1111 and the second processor 1112 will be described in more detail below with the corresponding components.

The standby mode state of the device 1100 may include the low power consumption state referred to in FIG. 1, which is a power consumption state based on the sensing unit 1102, the first processor 1111, and the storage unit 1107. The standby mode state of the device 1100 may include a state in which power is not consumed by components of the device 1100 other than the sensing unit 1102, the first processor 1111, and the storage unit 1107. The standby mode state of the device 1100 may include a deactivated state of the AP 1113 of the second processor 1112, and a black screen state of the touch screen 1103. The standby mode state of the device 1100 may include at least one of a disabled state of the function related to the touch screen 1103 and a disabled state of the touch sensing function of the touch screen 1103. The standby mode state of the device 1100 is not limited to the above-described states.

The information input unit 1101 may be in a state in which it does not consume power in the standby mode state of the device 1100, and may be in an inactive state in the standby mode state of the device 1100. The information input unit 1101 can receive input data for controlling the operation of the device 1100; for example, a power on / off command of the device 1100 can be input. The information input unit 1101 may include a key pad, a dome switch, a jog wheel, a jog switch, a hardware button, a hot key, a touch panel, and the like.

The sensing unit 1102 may be active in both the standby mode state and the active mode state of the device 1100. The sensing unit 1102 is configured as described above with reference to the sensing unit 101 of FIG. 1, and outputs a signal obtained by sensing the movement of the device 1100. The sensing unit 1102 may be referred to as a motion detection unit that detects the motion of the device 1100.

The sensing unit 1102 can output a signal obtained by sensing the position of the device 1100, the presence or absence of the user, the orientation of the device 1100, acceleration or deceleration of the device 1100, and the like. The sensing unit 1102 may further include a proximity sensor and a motion sensor. A proximity sensor is a sensor that detects, without mechanical contact, the presence of an object approaching a preset detection surface or an object existing nearby, using the force of an electromagnetic field or infrared rays. Examples of proximity sensors include transmissive photoelectric sensors, direct-reflection photoelectric sensors, mirror-reflection photoelectric sensors, high-frequency oscillation proximity sensors, capacitive proximity sensors, magnetic proximity sensors, and infrared proximity sensors.

The sensing unit 1102 may output a signal obtained by sensing a sensor-based gesture of the user. The sensing unit 1102 may include the first to n-th sensors 501_1 to 501_n, like the sensing unit 101 shown in FIG. 5. The sensing unit 1102 can operate in both the standby mode state and the active mode state of the device 1100.

In the standby mode state of the device 1100, the touch screen 1103 may be in the black screen state as described above, may be in a state in which it does not consume power, and may be in a disabled state of the function related to the touch screen 1103 and / or a disabled state of the touch sensing function of the touch screen 1103. The touch screen 1103 can output a screen or information indicating the screen lock setting state of the device 1100 in the standby mode state of the device 1100. The touch screen 1103 may be in an inactive state in the standby mode state of the device 1100.

The touch screen 1103 may be formed by a resistive (pressure-sensitive) method or a capacitive method, but is not limited thereto. The touch screen 1103 may receive input information of the user according to the touch-based gesture of the user described above. The input information of the user according to the user's touch-based gesture can be defined by various combinations of the number of touches, the touch pattern, the touch area, and the touch intensity.

The touch screen 1103 may include various sensors for sensing a touch or a proximity touch on the touch screen 1103. A sensor provided on the touch screen 1103 can generate a signal obtained by sensing the touch-based gestures or patterns of the user. The proximity sensor for the touch screen 1103 may be the same as the proximity sensor included in the sensing unit 1102.

An example of a sensor for sensing a touch on the touch screen 1103 is a tactile sensor. The tactile sensor can detect various information such as the roughness of the contact surface, the rigidity of the contact object, and the temperature of the contact point. The proximity sensor refers to a sensor that detects, without mechanical contact, the presence of an object approaching a predetermined detection surface or an object in the vicinity thereof, using the force of an electromagnetic field or infrared rays. Examples of proximity sensors include transmissive photoelectric sensors, direct-reflection photoelectric sensors, mirror-reflection photoelectric sensors, high-frequency oscillation proximity sensors, capacitive proximity sensors, magnetic proximity sensors, and infrared proximity sensors.

A touch on the touch screen 1103 refers to a case where a pointer contacts the touch panel. A proximity touch on the touch screen 1103 refers to a case where the pointer does not actually contact the touch panel but approaches within a predetermined distance from the touch panel. The pointer is a tool for touching, or approaching, a specific portion of the touch screen 1103. Examples of the pointer include, but are not limited to, a stylus pen and a finger.

The touch screen 1103 displays information output by the device 1100. For example, the touch screen 1103 may display a screen responding to a gesture of the user or a touch pattern sensed through sensors provided on the touch screen 1103. The touch screen 1103 may display a screen responding to the control data input through the information input unit 1101 or to the input information of the user. The touch screen 1103 can display a screen responding to a signal sensed by the sensing unit 1102. The touch screen 1103 can display at least one of the screens shown in FIGS. 4A to 4J.

The touch screen 1103 may be referred to as an input and output device. The screen displayed on the touch screen 1103 includes a GUI (Graphic User Interface) screen based on a UI (User Interface).

The touch screen 1103 may be implemented as a liquid crystal display, a thin-film-transistor liquid crystal display, an organic light-emitting diode display, a flexible display, a three-dimensional (3D) display, an active-matrix organic light-emitting diode (AMOLED) display, and the like. The touch screen 1103 may be referred to as a display. The device 1100 may include two or more touch screens 1103. In the case where the device 1100 includes two touch screens 1103, the touch screens 1103 can be configured in a dual touch mode. The dual touch mode refers to a front-side touch mode of the device 1100 and a rear-side touch mode of the device 1100, but is not limited thereto.

The camera 1104 processes an image frame such as a still image or a moving image obtained by an image sensor (or an optical sensor) in a video communication mode or a photographing mode. The image frame processed by the camera 1104 can be displayed on the touch screen 1103. The image frame processed by the camera 1104 may be stored in the storage unit 1107 or transmitted to the outside via the wireless communication unit 1108 or the wired communication unit 1109.

The camera 1104 may be in a state in which it does not consume power in the standby mode state of the device 1100, and may be in an inactive state in the standby mode state of the device 1100. More than one camera 1104 may be provided depending on the configuration of the device 1100; that is, the device 1100 may include a front camera and a rear camera. The front camera can be operated in the standby mode state of the device 1100 to recognize the face of the user or to obtain an image for estimating the movement of the user's pupil, as shown in FIG. 8. Estimating the movement of the user's pupil may be referred to as tracking the movement of the user's pupil. The camera 1104 can also be used as an input device for recognizing a spatial gesture of the user.

The audio input unit 1105 may be in a state in which the device 1100 does not consume power in the standby mode. The audio input unit 1105 may be in the inactive state of the device 1100 in the standby mode. The audio input unit 1105 receives an external sound signal in a communication mode, a recording mode, a voice recognition mode, or the like, converts it into electrical voice data, and transmits the voice data to the processor 1110. The audio input unit 1105 may be formed of, for example, a microphone. The audio input unit 1105 may include a function based on various noise removal algorithms for eliminating noise generated in receiving an external sound signal.

The audio input unit 1105 can be activated when the recording icon is selected as described above with reference to FIG. 4B. An external sound signal input through the audio input unit 1105 may be stored in the storage unit 1107 through the processor 1110, or may be transmitted to the outside through the processor 1110 and the wireless communication unit 1108 or through the processor 1110 and the wired communication unit 1109.

The audio output unit 1106 may be in a state in which it does not consume power in the standby mode state of the device 1100, and may be in an inactive state in the standby mode state. The audio output unit 1106 outputs an audio signal received from the outside or read from the storage unit 1107 in a communication mode, an audio reproduction mode, or the like. The audio output unit 1106 may be constituted by a speaker. When content is reproduced and the reproduced content includes an audio signal, the audio output unit 1106 outputs the audio signal included in the reproduced content. The audio input unit 1105 and the audio output unit 1106 may be integrally formed as a headset.

The storage unit 1107 can operate in both the standby mode state and the active mode state of the device 1100, and may be in an active state in both states. The storage unit 1107 may store at least one program and / or instruction set and resources configured to be executable by the processor 1110 described later.

The at least one program described above may include at least one program for implementing a method of performing a function of the device 1100 according to a preferred embodiment of the present invention. The at least one program may also include an operating system program of the device 1100, application programs related to various functions (or services) performed by the device 1100, programs for driving hardware components included in the device 1100, an application program for controlling at least one external device of the device 1100, and the like.

The external device described above may include an accessory of the device 1100. An accessory refers to a device that is functionally controlled by the device 1100 as an application program associated with the accessory is executed by the device 1100; however, the external device is not limited thereto.

The storage unit 1107 may store at least one program capable of determining whether a value sensed by the sensing unit 1102 corresponds to predetermined motion information of the device 1100, together with data related to the at least one program. The first processor 1111 loads the at least one program stored in the storage unit 1107 and executes the loaded program to determine whether the received sensing value corresponds to predetermined motion information of the device 1100.

That is, when the program is executed, the first processor 1111 receives the sensing value from the sensing unit 1102 and can detect, from the storage unit 1107, the predetermined motion information corresponding to the received sensing value.

At least one program for determining whether the sensing value corresponds to predetermined motion information of the device 1100 may be stored in advance in the first processor 1111. The first processor 1111 may include a memory such as a random access memory (RAM) to store the at least one program described above.

The resources stored in the storage unit 1107 may include information in which sensing values, predetermined motion information of the device 1100, and predetermined function information are mapped to one another, as described above with reference to FIG. 2. The resources stored in the storage unit 1107 may also include, but are not limited to, information related to the device 1100, information necessary for operating an application program set in the device 1100, and information necessary for executing a program for driving the above-described hardware components.

The information about the device 1100 may include, but is not limited to, user information of the device 1100. The storage unit 1107 may store the information referred to in the descriptions of FIGS. 7 and 9 above.

The storage unit 1107 may include at least one type of storage medium such as a flash memory type, a hard disk type, a multimedia card micro type, a card type memory (for example, SD or XD memory), a read-only memory (ROM), an electronically erasable programmable read-only memory (EEPROM), a programmable read-only memory (PROM), or another non-volatile memory, but is not limited thereto.

At least one program and / or instruction set stored in the storage unit 1107 may be classified into a plurality of modules according to functions.

FIG. 12 shows an example in which the programs and / or instruction sets stored in the storage unit 1107 are classified into modules. Referring to FIG. 12, the storage unit 1107 includes, but is not limited to, an operating system 1201, a wireless communication module 1202, a wired communication module 1203, a graphics module 1204, a global positioning system (GPS) module 1205, a user interface (UI) module 1206, a sensing module 1207, a contact and movement module 1208, a power module 1209, and an application database (DB) module 1210.

The application DB module 1210 includes, but is not limited to, a device function execution module 1211, a camera module 1212, a voice recorder module 1213, a web browsing module 1214, and a queue management module 1215 according to a preferred embodiment of the present invention. For example, the application DB module 1210 may further include various application modules such as an email module, a social network service (SNS) module, a video conferencing module, an image management module, a browsing module, a calendar module, and a widget module.

The operating system 1201 can control and manage the general functions of the device 1100. The operating system 1201 may include software components that enable communication between hardware and software components within the device 1100.

The wireless communication module 1202 may enable communication with at least one external device through the wireless communication unit 1108. The wireless communication module 1202 may include a software component for processing data received from at least one external device via the wireless communication unit 1108 and data transmitted to the at least one external device. The wireless communication module 1202 may enable wireless communication with a server (not shown) or a repeater (not shown) through the wireless communication unit 1108, but the wireless communication counterpart is not limited thereto.

The wired communication module 1203 can enable communication between the wired communication unit 1109, which is composed of elements such as a universal serial bus (USB) port, and at least one external device (not shown). The wired communication module 1203 may include a software component for processing data transmitted to and received from the at least one external device via the wired communication unit 1109.

The graphics module 1204 includes a software component for adjusting the brightness of and rendering the graphics displayed on the touch screen 1103, and a software component for providing a virtual keyboard (or soft keyboard) for inputting text in the application modules of the application DB module 1210.

The GPS module 1205 may include a software component that determines the location of the device 1100 and provides the determined location information to an application that provides location based services. The UI module 1206 may include a software component that provides a UI required for an application providing UI information based on the touch screen 1103.

The sensing module 1207 may include a software component that evaluates the sensing value received from the sensing unit 1102 and, based on the result, provides the sensing value to a specific application included in the application DB module 1210. For example, if the sensing value received from the sensing unit 1102 is information on the motion of the device 1100, the sensing module 1207 transmits the received sensing value to the device function execution module 1211.

The device function execution module 1211 detects the motion information of the device 1100 corresponding to the received sensing value from the storage unit 1107. When the motion information of the device 1100 corresponding to the received sensing value is detected, the device function execution module 1211 can check the operation mode state of the device 1100 using the information stored in the storage unit 1107 or in a flag register (not shown) included in the first processor 1111. The timing of checking the operation mode state of the device 1100 is not limited to this; for example, the operation mode state of the device 1100 may be checked before the sensing value described above is received.

The device function execution module 1211 detects the predetermined function information mapped to the checked operation mode state and the detected motion information of the device 1100, and can execute a function corresponding to the detected predetermined function information.

The device function execution module 1211 can also operate without checking the operation mode state of the device 1100.

That is, the device function execution module 1211 can be executed in the standby mode state of the device 1100 as shown in FIG. 6B, and can receive the sensing value in the standby mode state of the device 1100. Receiving the sensing value in the standby mode state of the device 1100 may be performed in the manner described with reference to FIG. 5.

When the sensing value is received in the standby mode state of the device 1100, the device function execution module 1211 detects the motion information of the device 1100 from the storage unit 1107 using the received sensing value. Using the detected motion information of the device 1100, the device function execution module 1211 detects the predetermined function information mapped to the detected motion information from the storage unit 1107, and can execute the function corresponding to the detected predetermined function information.

When the device 1100 is in the active mode state, the device function execution module 1211 detects the motion information and the predetermined function information of the device 1100 from the storage unit 1107 according to the received sensing value, and can execute the function based on the detected function information.

The contact and movement module 1208 may include a software component that senses touch contact on the touch screen 1103 and provides the result of tracking the contact-based movement to a particular application included in the application DB module 1210. For example, when a touch contact on the touch screen 1103 as shown in FIG. 10 is detected, the contact and movement module 1208 transmits information about the sensed touch contact to the device function execution module 1211.

The power module 1209 includes a software component that works with the operating system 1201 to control the power supply to the hardware components in the device 1100 and to control a power saving mode for the power supplied to the touch screen 1103.

Among the programs and / or instruction sets shown in FIG. 12, the storage unit 1107 may not store the modules included in the application DB module 1210 themselves; instead, it may store only location information of each application module, such as a URL (Uniform Resource Locator), and display information that can represent the application module.

In such a case, the processor 1110 can use the corresponding program and / or instruction set by connecting to an external device having the application DB via the wireless communication unit 1108 or the wired communication unit 1109. The external device in this case includes, but is not limited to, a cloud server or a device having an application database.

When only the location information of the application module and the display information that can represent the application module are stored in the storage unit 1107, and a selection signal of the user based on the application module information displayed through the touch screen 1103 is received, the processor 1110 can retrieve the information stored in the external device through the wireless communication unit 1108 or the wired communication unit 1109 using the location information of the application module selected by the user.

The storage unit 1107 may store the information stored in the storage unit 102 shown in FIG. 1. The storage unit 1107 may further store schedule information of the user and log information of the device 1100. The schedule information and the log information of the device 1100 stored in the storage unit 1107 may be provided to the first processor 1111 as status information on the device 1100. The first processor 1111 may request the status information on the device 1100 from the storage unit 1107, or may request the schedule information and the log information from the storage unit 1107.

The wireless communication unit 1108 may be in a state in which the device 1100 does not consume power in the standby mode. The wireless communication unit 1108 may be in the inactive state of the device 1100 in the standby mode. The wireless communication unit 1108 may be controlled by the first processor 1111 in the standby mode of the device 1100 and set to the active state.

The wireless communication unit 1108 can exchange data with an external device via a wireless network such as wireless Internet, a wireless intranet, a wireless telephone network, a wireless LAN, Wi-Fi, Wi-Fi Direct (WFD), 4G Long Term Evolution (LTE), Bluetooth, Infrared Data Association (IrDA), Radio Frequency Identification (RFID), Ultra Wideband (UWB), or Zigbee communication.

The wireless communication unit 1108 may include at least one of a broadcast receiving module, a mobile communication module, a wireless Internet module, a wired Internet module, a short distance communication module, and a location information module, but is not limited thereto.

The wired communication unit 1109 may be in a state in which no power is consumed in the standby mode state of the device 1100, and may be in an inactive state in the standby mode state of the device 1100. The wired communication unit 1109 can exchange data with an external device via a wired network such as the wired Internet. The wired communication unit 1109 can also transmit and receive data to and from an external device (not shown) using a plug-and-play interface such as a USB (Universal Serial Bus) port (not shown).

The power supply unit 1114 supplies power to the hardware components included in the device 1100. The power supply unit 1114 includes at least one power source such as a battery or an AC power source. Alternatively, the device 1100 may not include the power supply unit 1114 and may instead include a connection unit (not shown) that can be connected to an external power supply (not shown). In the standby mode state of the device 1100, the power supply unit 1114 may supply power to the sensing unit 1102, the first processor 1111, and the storage unit 1107, and may not supply power to the other components.

The processor 1110 may be referred to as one or more processors that control the overall operation of the device 1100. Using the operating system 1201 and the various modules 1202 to 1215 stored in the storage unit 1107, the processor 1110 can control the information input unit 1101, the sensing unit 1102, the touch screen 1103, the camera 1104, the audio input unit 1105, the audio output unit 1106, the storage unit 1107, the wireless communication unit 1108, the wired communication unit 1109, and the power supply unit 1114. The processor 1110 may be referred to as a controller, a microprocessor, a digital signal processor, or the like.

The processor 1110 can provide a user interface based on the information input unit 1101, the sensing unit 1102, the touch screen 1103, the camera 1104, and the audio input unit 1105, using the operating system 1201 and the UI module 1206.

The processor 1110 can execute at least one program related to a method for executing a function of the device 1100 according to a preferred embodiment of the present invention, thereby performing the operations shown in FIGS. 6A, 6B, and 13A or 13B.

The processor 1110 can read the program from the storage unit 1107 and execute it, or can download the program from an external device connected through the wireless communication unit 1108 or the wired communication unit 1109 and execute it. The external device may be referred to as an application providing server or an application market server, and may include, but is not limited to, a cloud server or a device capable of communicating in the vicinity of the device 1100. The processor 1110 may be understood to include an interface function between the processor 1110 and the various hardware components within the device 1100.

The processor 1110 includes a first processor 1111 and a second processor 1112. The first processor 1111 may correspond to the low power processor 510 of FIG. 5 and the second processor 1112 may correspond to the application processor 520 of FIG. 5, but they are not limited thereto. The second processor 1112 shown in FIG. 11 may include an AP 1113, because the second processor 1112 may also include an additional processor such as a Communication Processor (CP). A communication processor is a processor that controls communication infrastructure-based operations.

The standby mode state of the second processor 1112 may indicate the standby mode state of the device 1100. The standby mode state of the second processor 1112 may indicate the standby mode state of the AP 1113.

In the standby mode state of the device 1100, the first processor 1111 can operate as described above. When the AP 1113 of the second processor 1112 needs to be woken up in order to execute a function based on the predetermined function information detected by the first processor 1111, the first processor 1111 can transmit an active mode state setting request signal to the second processor 1112. The first processor 1111 may be configured as an MCU (Micro Control Unit) as described above.

The first processor 1111 may be connected to the wireless communication unit 1108 and can receive status information on the device 1100, such as position information and time information of the device 1100, from the outside. For this purpose, the first processor 1111 can set the wireless communication unit 1108, which is in the inactive state in the standby mode state of the device 1100, to the active mode state. The active mode setting of the wireless communication unit 1108 may be temporary.

FIGS. 13A and 13B illustrate a method of executing a function of a device according to another preferred embodiment of the present invention, based on the relationship among the sensing unit 1102, the first processor 1111, and the second processor 1112 shown in FIG. 11. The second processor 1112 shown in FIGS. 13A and 13B may be referred to as the AP 1113 of FIG. 11.

In step S1301, the sensing unit 1102 transmits the sensed value to the first processor 1111. In step S1302, the first processor 1111 receives the sensing value. Steps S1301 and S1302 of FIG. 13A may be regarded as a step in which the first processor 1111 continuously receives the sensing value from the sensing unit 1102 in the standby mode state of the device 1100.

In step S1303, the first processor 1111 detects the motion information of the device 1100 using the received sensing value.

When the motion information of the device 1100 is detected, the first processor 1111 checks whether the operation mode state of the device 1100 is the standby mode state (S1304). The check of the operation mode state of the device 1100 in step S1304 may be performed by detecting information on the operation mode state stored in a storage location such as a flag register (not shown) included in the first processor 1111, as mentioned with reference to FIGS. 1 and 2, or information on the operation mode state stored in a predetermined area of the storage unit 1107.

The information on the operation mode state stored in the flag register or in the storage unit 1107 can indicate, using the values '0' and '1', whether the device 1100 is in the standby mode state or the active mode state. In the case of the active mode state, additional information that further specifies the operation mode state of the device 1100 may be included.

The additional information may include, for example, information indicating that the device 1100 is currently browsing, or information indicating that Facebook is running. The additional information may be stored in the above-mentioned predetermined area of the storage unit 1107. If the operation mode state of the device 1100 is the active mode state and the above-mentioned additional information exists, the operation mode state of the device 1100 can be detected in more detail using the additional information. The predetermined area of the storage unit 1107 is an area in which the recorded data can be changed and is accessible by both the first processor 1111 and the second processor 1112.
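
The flag-plus-additional-information bookkeeping described above can be pictured with the following sketch; the exact layout of the shared area is an assumption made for illustration.

```kotlin
// Illustrative layout of the shared operation-mode area; both processors can read and update it.
class OperationModeArea {
    @Volatile var flag: Int = 0                   // '0' = standby mode, '1' = active mode
    @Volatile var additionalInfo: String? = null  // e.g. "browsing" or "Facebook" in the active mode

    fun describe(): String = when {
        flag == 0 -> "standby mode"
        additionalInfo != null -> "active mode: $additionalInfo"
        else -> "active mode"
    }
}
```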

If the operation mode state of the device 1100 is not the standby mode state, this indicates that the second processor 1112 is in the active mode state. If the second processor 1112 is in the active mode state and the detected predetermined function information is to be executed by the second processor 1112, the first processor 1111 transmits the detected motion information to the second processor 1112 (S1305).

The second processor 1112 detects predetermined function information from the storage unit 1107 using the received motion information (S1306).

Before detecting the predetermined function information, the second processor 1112 can detect detailed operation mode state information using the additional information on the operation mode state of the device 1100 stored in the storage unit 1107 as described above. The additional information may also be stored in a temporary storage location included in the second processor 1112. When the additional information is stored in the temporary storage location included in the second processor 1112 and motion information is received from the first processor 1111, the second processor 1112 can use the additional information to detect information about the detailed operation mode state of the device 1100. The temporary storage location of the second processor 1112 may be a storage medium such as a RAM.

The second processor 1112 determines whether the subject that is to execute the predetermined function based on the detected predetermined function information is the second processor 1112 (S1307). This determination may be performed using function-specific metadata, but is not limited thereto. For example, when the power level (or power consumption level) necessary for executing the predetermined function, as included in the metadata of the predetermined function, is equal to or greater than a predetermined reference value, the second processor 1112 can determine that the subject to execute the predetermined function is the second processor 1112. The predetermined reference value may be determined according to the level of power consumed by the device 1100 when only the first processor 1111 is operated. The function-specific metadata may be stored in the storage unit 1107.

The second processor 1112 can determine the subject to execute a predetermined function by using the information about the subject executing the predetermined function stored in the storage unit 1107. [ That is, the information on the execution subject mapped to the predetermined function information stored in the storage unit 1107 (information indicating the first processor 1111, information indicating the second processor 1112) ) Can determine a subject to execute a predetermined function.
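
As a hedged illustration of the determination in step S1307, the following Kotlin sketch combines the two criteria described above: a per-function power level compared against a reference value, and an optional execution-subject mapping stored with the function information. The names and the reference value are assumptions for this example, not part of the disclosed embodiments.

    // Hypothetical sketch of choosing the execution subject (step S1307).
    enum class Executor { FIRST_PROCESSOR, SECOND_PROCESSOR }

    data class FunctionInfo(
        val name: String,
        val powerLevel: Int,                 // power needed to execute, from the per-function meta data
        val mappedExecutor: Executor? = null // optional mapping stored with the function information
    )

    // Assumed reference value; in the text it reflects the power consumed when only the
    // first (low-power) processor is operated.
    const val POWER_REFERENCE = 10

    fun chooseExecutor(info: FunctionInfo): Executor =
        info.mappedExecutor                            // an explicit mapping, if present, decides
            ?: if (info.powerLevel >= POWER_REFERENCE) // heavy functions go to the second processor
                Executor.SECOND_PROCESSOR
            else
                Executor.FIRST_PROCESSOR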

When it is determined by the second processor 1112 that the subject to perform a predetermined function is the second processor 1112, the second processor 1112 executes the predetermined function described above (S1308).

If the subject to execute the predetermined function is not the second processor 1112 but the first processor 1111, the second processor 1112 requests the first processor 1111 to execute the predetermined function, and the second processor 1112 is switched to the standby mode state (S1310). Accordingly, the first processor 1111 executes the predetermined function (S1311). When the second processor 1112 transitions from the active mode state to the standby mode state, the power consumption state of the device 1100 may be a low power consumption state, as in the standby mode state of the device 1100 described above. That is, the power consumption state of the device 1100 may be a power consumption state based on the sensing unit 1101, the first processor 1111, and the storage unit 1107.

On the other hand, when the operation mode state of the device 1100 is the standby mode state in step S1304, the method proceeds to step S1312.

The first processor 1111 detects predetermined function information from the storage unit 1107 using the detected motion information (S1312). If the predetermined function information is detected, the first processor 1111 determines whether the subject to execute the predetermined function according to the detected predetermined function information is the first processor 1111 (S1313). The determination of the subject to execute the predetermined function may be performed in the manner mentioned in step S1307 described above. That is, when the power level required for executing the predetermined function, included in the meta data of the predetermined function information, is smaller than the predetermined reference value, the subject to execute the predetermined function information can be determined to be the first processor 1111.

The method of determining the subject to execute the predetermined function is not limited to the above. For example, as described above, the first processor 1111 can determine the subject to execute the predetermined function by using the information about the execution subject mapped to the predetermined function information stored in the storage unit 1107.

As a result of the determination, if the subject to execute the predetermined function is the first processor 1111, the first processor 1111 executes the predetermined function while maintaining the operation mode state of the device 1100 in the standby mode state (S1314).

If it is determined in step S1313 that the subject to execute the predetermined function is not the first processor 1111, the first processor 1111 transmits an active mode state setting request signal to the second processor 1112 (S1315). Accordingly, the second processor 1112 switches from the standby mode state to the active mode state (S1316).

When the second processor 1112 is switched to the active mode state, the second processor 1112 notifies the first processor 1111 of the transition to the active mode state (S1317). Accordingly, the flag information indicating the operation mode state of the device 1100, stored in the first processor 1111 or the storage unit 1107, is updated to the active mode state. The first processor 1111 transmits a predetermined function execution request signal to the second processor 1112 (S1318), and the second processor 1112 executes the predetermined function (S1319).
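
Reusing the hypothetical types from the two sketches above, the following Kotlin sketch outlines the standby-mode flow of steps S1312 to S1319: the first processor either executes the function itself or wakes the second processor, updates the mode flag, and requests execution. It is illustrative only, under assumed names, and is not the patented implementation.

    // Illustrative outline of steps S1312-S1319 (assumed names; builds on the sketches above).
    interface FunctionExecutor { fun execute(info: FunctionInfo) }

    class FirstProcessor : FunctionExecutor {
        override fun execute(info: FunctionInfo) = println("low-power processor executes ${info.name}")
    }

    class SecondProcessor : FunctionExecutor {
        fun wakeUp() = println("second processor switched to the active mode state")
        override fun execute(info: FunctionInfo) = println("application processor executes ${info.name}")
    }

    fun onMotionInStandby(first: FirstProcessor, second: SecondProcessor, store: ModeStore, info: FunctionInfo) {
        if (chooseExecutor(info) == Executor.FIRST_PROCESSOR) {
            first.execute(info)                           // S1314: run while the device stays in standby
        } else {
            second.wakeUp()                               // S1315-S1316: activation request and mode switch
            store.write(ModeState(OperationMode.ACTIVE))  // S1317: flag updated to the active mode state
            second.execute(info)                          // S1318-S1319: execution request and execution
        }
    }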

FIG. 14 is a functional block diagram of a device according to another preferred embodiment of the present invention. FIG. 14 is an example of executing a function of a device on the basis of information about the motion of the device and situation information about the device. The function execution of the device to be described later can be performed as at least one of the screen examples shown in FIGS. 4A to 4J.

Referring to FIG. 14, the device 1400 includes, but is not limited to, a sensing unit 1410, a situation information detection unit 1420, a storage unit 1430, an information input/output unit 1440, and a processor 1450. That is, as mentioned in FIG. 1, the device 1400 may include fewer or more components than those shown in FIG. 14.

The situation information detection unit 1420 may not be included in the device 1400, for example, when the processor 1450 is configured to detect situation information about the device 1400. The device 1400 may include a situation information receiving unit instead of the situation information detection unit 1420 when it receives situation information, such as position information and time information of the device 1400, from the outside. The situation information receiving unit may be configured as a component capable of communication, like the wireless communication unit 1108 shown in FIG. 11, but is not limited thereto. The outside may include a GPS (Global Positioning System) server.

The sensing unit 1410 senses the motion of the device 1400, like the sensing unit 101 of FIG. 1, on the basis of the SSP of FIG. 5 described above, and transmits the sensed value to the processor 1450. The transmitted sensing value may be referred to as information about the motion of the device 1400. The sensing unit 1410 may be operated in both the standby mode state and the active mode state of the device 1400.

The situation information detection unit 1420 is configured to detect current time information, current position information of the device 1400, and the operation mode state of the device 1400, but is not limited thereto. That is, the current situation information on the device 1400 can be detected by further referring to the schedule information stored in the storage unit 1430 and the log information of the device 1400.

The situation information detection unit 1420 may also operate based on a seamless situation information detection platform. The situation information detection unit 1420 and the processor 1450 may be connected so as to transmit situation information in a manner similar to the data communication method between the sensor hub 511 and the SSP manager 512 shown in FIG. 5.

In this case, the situation information detection unit 1420 may be configured to operate under the control of the low-power processor when detecting situation information such as time information, and under the control of the application processor when detecting situation information other than the time information. To this end, the situation information detection unit 1420 may include independent components according to the detectable situation information. For example, the situation information detection unit 1420 may independently include a component that detects time information and a component that detects position information.

If the situation information detection unit 1420 includes separate components as described above, then in the standby mode state of the device 1400, some of the components may be in an active state and the remaining components may be in an inactive state. The active state of some components of the situation information detection unit 1420 may indicate a power consumption state. The inactive state of the components other than the some components of the situation information detection unit 1420 may indicate a state in which power is not consumed by those components.

The some components described above may be components controlled by the low-power processor, for example, components that detect situation information regarding the above-described time information. The components other than the some components may be components controlled by the above-described application processor and may include, for example, components for detecting the above-described position information; however, the components included in the situation information detection unit 1420 are not limited to those described above.

The situation information that can be detected by the situation information detection unit 1420 can be detected as follows, but is not limited thereto.

That is, when predetermined motion information for the device 1400 is recognized, the situation information detection unit 1420 can detect whether the operation mode state of the device 1400 is the standby mode state or the active mode state. The situation information detection unit 1420 can also detect whether the device 1400 is located indoors or outdoors. Detecting whether the device 1400 is located indoors or outdoors can be performed using triangulation based on GPS reception information or information received from a plurality of APs (Access Points).

When the device 1400 is located indoors, the situation information detection unit 1420 can detect indoor information (e.g., information such as an office or a house) through communication with a plurality of indoor APs. When the device 1400 is located outdoors, it is possible to detect location information (for example, information such as the A train station or in front of the B movie theater) from the GPS information or information received from a plurality of APs.
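
The indoor/outdoor decision is not specified in detail in the text; the Kotlin sketch below shows one plausible heuristic under assumed thresholds (GPS fix quality versus the number of visible APs). It is an assumption for illustration, not the method disclosed above.

    // Hedged sketch: inferring indoors/outdoors from GPS fix quality and visible access points.
    data class GpsFix(val satellitesUsed: Int, val accuracyMeters: Double)
    data class ApScan(val visibleAccessPoints: List<String>)

    fun isIndoors(fix: GpsFix?, scan: ApScan): Boolean {
        val weakGps = fix == null || fix.satellitesUsed < 4 || fix.accuracyMeters > 50.0 // assumed thresholds
        val enoughApsForTriangulation = scan.visibleAccessPoints.size >= 3               // assumed threshold
        return weakGps && enoughApsForTriangulation
    }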

The situation information detection unit 1420 can detect the current time information based on GPS-based time information or a system clock signal in the device 1400.

The situation information detection unit 1420 can detect, using the detected time information and the location information of the device 1400, related schedule information and/or log information among the schedule information and/or log information stored in the storage unit 1430. The situation information detection unit 1420 can infer situation information using the detected schedule information and/or log information.

When the predetermined motion information for the device 1400 is recognized, the situation information detection unit 1420 can detect situation information indicating that the user of the device 1400 is walking or has stopped and that the operation mode state of the device 1400 is the standby mode state. In order to detect such situation information, the situation information detection unit 1420 may include a function of reading the information stored in the storage unit 1430 and/or the information stored in the processor 1450, and a function of detecting the moving speed of the device 1400. In order to perform the function of detecting the moving speed of the device 1400, the situation information detection unit 1420 may hold in advance the reference speed information necessary for detecting the moving speed, or may use the reference speed information stored in the storage unit 1430.
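
A minimal Kotlin sketch of the walking/stopped check follows, assuming a particular reference speed value; the text only states that reference speed information is held in advance or read from the storage unit 1430.

    // Illustrative only: compare the measured moving speed against an assumed reference speed.
    const val REFERENCE_SPEED_MPS = 0.5   // assumed value; not specified in the embodiments

    fun userMotionState(movingSpeedMps: Double): String =
        if (movingSpeedMps >= REFERENCE_SPEED_MPS) "walking" else "stopped"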

When the predetermined motion information for the device 1400 is recognized, the situation information detection unit 1420 can detect situation information indicating that the user is holding the device 1400 in front of the B movie theater, that the operation mode state of the device 1400 is the standby mode state, and that, based on the schedule information, the time information, and the log information of the device 1400, pre-payment information exists for a time adjacent to the current time at the B movie theater. To this end, the situation information detection unit 1420 may include a function of detecting the moving speed of the device 1400, a function of reading necessary information from the storage unit 1430 and/or the processor 1450, and a function of acquiring position information and time information of the device 1400.

When the predetermined motion information for the device 1400 is recognized, the situation information detection unit 1420 can detect situation information indicating that the user is holding the device 1400 in the A train station, that the operation mode state of the device 1400 is the standby mode state, and that, based on the schedule information, the time information, and the log information of the device 1400, a train ticket departing from the A train station has been reserved. To this end, the situation information detection unit 1420 may include a function of detecting the moving speed of the device 1400, a function of reading necessary information from the storage unit 1430 and/or the processor 1450, and a function of acquiring position information and time information of the device 1400.

When the predetermined motion information for the device 1400 is recognized, the situation information detection unit 1420 can detect situation information indicating that the user is holding the device 1400, that the operation mode state of the device 1400 is the active mode state, and that the device 1400 is performing web browsing. For this, the situation information detection unit 1420 may include a function of detecting the moving speed of the device 1400, a function of reading necessary information from the storage unit 1430 and/or the processor 1450, and the like.

When the predetermined motion information for the device 1400 is recognized, the situation information detection unit 1420 can detect, as the situation information, that the user is holding the device 1400 at exit 7 of the Gangnam station, that the operation mode state of the device 1400 is the standby mode state, and that the user has an appointment at a time adjacent to the current time. To this end, the situation information detection unit 1420 may include a function of detecting position and time information, a function of reading necessary information from the storage unit 1430 and/or the processor 1450, a function of checking the schedule information of the device 1400, and the like.

The situation information detection unit 1420 can also be configured to detect only the time information and the position information as the situation information. In this case, the processor 1450 can detect the current situation regarding the device 1400 in more detail by referring to the information about the operation mode state of the device 1400, the schedule information, and the log information stored in the storage unit 1430 on the basis of the received time information and location information.

Meanwhile, the storage unit 1430 may be configured like the storage unit 102 shown in FIG. 1 and may store information and programs. The information input/output unit 1440 may be configured the same as the information input/output unit 104 shown in FIG. 1.

When the processor 1450 operates as a low-power processor (or coprocessor), like the processor 103 of FIG. 1, the standby mode state of the device 1400 may be a power consumption state or an activation state based on the sensing unit 1410, some or all of the components included in the situation information detection unit 1420, the storage unit 1430, and the processor 1450.

The standby mode state of the device 1400 may also include a state in which power is not consumed by the components included in the device 1400 other than the sensing unit 1410, some or all of the components included in the situation information detection unit 1420, the storage unit 1430, and the processor 1450.

When the processor 1450 operates as the above-described low-power processor, the standby mode state of the device 1400 may include a deactivation state of the functions related to the touch screen included in the information input/output unit 1440 (the touch sensing function and the display function of the touch screen), as referred to in FIG. 1 described above, and a screen lock setting state of the device 1400.

The processor 1450 may include the low-power processor and the application processor discussed above. When the processor 1450 includes an application processor, the standby mode state of the device 1400 may include an active state of the low-power processor included in the processor 1450 and an inactive state of the application processor. The standby mode state of the device 1400 may include a power consumption state based on the low-power processor included in the processor 1450 when the processor 1450 includes an application processor. The standby mode state of the device 1400 may also include a state in which power is not consumed by the application processor included in the processor 1450.

When the processor 1450 includes a low-power processor and an application processor, it may operate according to the operation flowcharts shown in FIGS. 15, 16A, 16B, 19A, and 19B below. The operation of the processor 1450 will be described based on the operation flowcharts shown in FIGS. 15, 16A, 16B, 19A, and 19B.

FIG. 15 is a flowchart illustrating a method of executing a function of a device, according to another preferred embodiment of the present invention. Referring to FIG. 15, in step S1501, the processor 1450 receives a sensing value from the sensing unit 1410. Reception of the sensing value from the sensing unit 1410 may be based on the SSP as mentioned in FIG. 5.

The processor 1450 detects predetermined motion information from the storage unit 1430 based on the received sensing value (S1502). The detection of the predetermined motion information stored in the storage unit 1430 may take into account an error range of the received sensing value. For example, when the above-described predetermined motion information indicates a 180° rotation, the received sensing value may be represented by an electrical signal corresponding to 180° ± α, where α may have an integer value equal to or greater than zero.
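
The tolerance-based match can be sketched as follows in Kotlin; the function name and the example values are illustrative assumptions rather than part of the disclosed embodiments.

    // Sketch of the error-range match in step S1502: a stored 180° rotation matches 180° ± alpha.
    fun matchesStoredRotation(sensedDegrees: Double, storedDegrees: Int, alpha: Int): Boolean {
        require(alpha >= 0) { "alpha is an integer value equal to or greater than zero" }
        return sensedDegrees >= storedDegrees - alpha && sensedDegrees <= storedDegrees + alpha
    }

    fun main() {
        println(matchesStoredRotation(177.5, 180, 5))  // true: within 180° ± 5
        println(matchesStoredRotation(170.0, 180, 5))  // false: outside the error range
    }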

When the predetermined motion information is detected, the processor 1450 receives the situation information from the situation information detection unit 1420 (S1503). Receiving the situation information from the situation information detection unit 1420 may be performed by transmitting a situation information request signal from the processor 1450 to the situation information detection unit 1420.

The processor 1450 executes the function of the device based on the received situation information about the device 1400 and the predetermined motion information (S1504).

When the time information and the position information are received from the situation information detection unit 1420, the processor 1450 can detect the current situation information regarding the device 1400 by using the received time information and position information together with the schedule information, the log information, and the information about the operation mode state of the device 1400 stored in the storage unit 1430. In this case, step S1503 may be referred to as a situation information detection step, and the situation information used in step S1504 is the above-described current situation information regarding the device 1400.
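
Reusing the OperationMode type from the earlier sketch, the following Kotlin fragment illustrates one possible form of this inference step: received time and location are combined with stored schedule entries and the operation-mode flag. The one-hour window used for "a time adjacent to the current time" and all names are assumptions for illustration.

    // Hedged sketch of combining time, location, stored schedule entries, and the mode flag.
    data class Situation(val mode: OperationMode, val nearbyScheduleEntries: List<String>)

    fun detectSituation(
        currentTimeMillis: Long,
        location: String,
        schedule: Map<String, List<Pair<Long, String>>>,  // location -> (time, schedule entry)
        mode: OperationMode,
        windowMillis: Long = 60 * 60 * 1000L              // assumed meaning of "adjacent to the current time"
    ): Situation {
        val nearby = schedule[location].orEmpty()
            .filter { (time, _) -> kotlin.math.abs(time - currentTimeMillis) <= windowMillis }
            .map { (_, entry) -> entry }
        return Situation(mode, nearby)
    }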

FIG. 15 may be performed based on the standby mode state of the device 1400 or based on the active mode state of the device 1400. Accordingly, the function executed in step S1504 may differ depending on the operation mode state of the device 1400.

FIG. 16A is an operation flowchart for explaining a method of executing a function of a device, according to another preferred embodiment of the present invention. FIG. 16A is an example in which a function of executing an embedded gateway in the device 1400 is added to the operation flowchart of FIG. 15 when the operation mode state of the device 1400 indicates the standby mode state. If the predetermined function executed by the device 1400 is configured as a service, the above-mentioned gateway may be referred to as a service gateway.

The processor 1450 receives the sensing value from the sensing unit 1410 (S1601). The received sensing value may be referred to as information regarding the motion of the device 1400. Using the received sensing value, the processor 1450 detects predetermined motion information of the device 1400 (S1602). Upon receiving the situation information from the situation information detection unit 1420 (S1603), the processor 1450 checks the operation mode state of the device 1400. If the operation mode state of the device 1400 is the standby mode state (S1604), the processor 1450 reads the embedded gateway information from the storage unit 1430 and executes the embedded gateway screen (S1605). The processor 1450 then detects predetermined function information based on the predetermined motion information and the situation information, and executes a predetermined function according to the detected predetermined function information (S1606).

If the operation mode state of the device 1400 is not the standby mode state in step S1604, the processor 1450 does not execute the embedded gateway screen; instead, it detects predetermined function information based on the situation information and the motion information of the device 1400, and executes a predetermined function according to the detected predetermined function information (S1607).
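
The branch of FIG. 16A can be summarized by the following hedged Kotlin sketch; the callback names are assumptions made for this example.

    // Illustrative summary of S1604-S1607: show the gateway screen only in the standby mode state.
    fun handleDetectedMotion(isStandbyMode: Boolean, showGatewayScreen: () -> Unit, executeFunction: () -> Unit) {
        if (isStandbyMode) {
            showGatewayScreen()  // S1605: read the embedded gateway information and display the screen
        }
        executeFunction()        // S1606 or S1607: run the function chosen from motion and situation information
    }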

FIG. 16B is an operation flowchart for explaining a method of executing a function of a device, according to another preferred embodiment of the present invention, when the operation mode state of the device 1400 is the standby mode state.

Referring to FIG. 16B, when the operation mode state of the device 1400 is the standby mode state (S1608), the processor 1450 receives the sensing value from the sensing unit 1410 (S1609). The received sensing value may be referred to as information regarding the motion of the device 1400.

The processor 1450 detects predetermined motion information of the device 1400 using the received sensing value (S1610). Upon receiving the situation information from the situation information detection unit 1420 (S1611), the processor 1450 executes the embedded gateway screen using the received situation information and the detected motion information (S1612). The processor 1450 detects predetermined function information based on the predetermined motion information and the situation information, and executes a predetermined function according to the detected predetermined function information (S1613).

FIG. 17 is an example of a screen for explaining the embedded gateway screen when predetermined motion information of the device 1400 is detected and the operation mode state of the device 1400 is the standby mode state.

That is, when the operation mode state of the device 1400 is the standby mode state or a black screen state as shown in screen 1710, and the device 1400 is rotated clockwise by 90 degrees, the device 1400 displays a gateway screen. The gateway screen can be displayed for a very short time; for example, it may be displayed for about 5 seconds, but is not limited thereto. The time for which the gateway screen is displayed may be set during configuration of the device 1400.
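
A brief, configurable display of the gateway screen could be sketched as below in Kotlin; the parameter name and the blocking sleep are simplifications assumed for illustration only.

    // Illustrative only: show the gateway screen for a configurable time (e.g., about 5 seconds).
    fun showGatewayBriefly(displayMillis: Long, show: () -> Unit, dismiss: () -> Unit) {
        show()
        Thread.sleep(displayMillis)  // e.g., 5_000 ms, as in the example above; set during device configuration
        dismiss()
    }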

As shown in FIG. 17, the gateway screen may be executed by the coprocessor when the processor 1450 includes a coprocessor (e.g., a low-power processor) and a main processor (e.g., an application processor). Accordingly, when a predetermined function to be executed is to be executed by the coprocessor and the main processor is in the standby mode state, the device 1400 executes the predetermined function by the coprocessor while maintaining the standby state of the main processor (screen 1730).

The embedded gateway screen may be provided in various forms as shown in Figs. 18A to 18F.

That is, the gateway screen may include the manufacturer identification information (e.g., a logo) of the device 1400 and a message asking the user to unlock the device, as shown in FIG. 18A.

The gateway screen may include the manufacturer identification information of the device 1400 and information for selecting a predetermined executable application, as shown in FIG. 18B. FIG. 18B shows an example in which the information for selecting a predetermined executable application is shown as an icon. However, the information for selecting a predetermined executable application is not limited to the icon described above. For example, the information for selecting a predetermined executable application may be provided in text form.

When four icons are provided as shown in FIG. 18B, this indicates that there are four applications corresponding to the predetermined function information executable according to the motion information and the situation information of the device 1400. Accordingly, the user can select at least one of the four executable applications using the icons.

The gateway screen may include the manufacturer identification information of the device 1400 and information with which the user can select whether to execute a predetermined function, as shown in FIG. 18C. In the case of FIG. 18C, the information for selecting whether to execute a predetermined function is provided in the form of an image in a content/advertisement area.

The information displayed through the content/advertisement area may be, but is not limited to, a partial image (e.g., a thumbnail or a representative image) or a reduced-size image representing a predetermined executable function. The user can select the execution of the predetermined function based on a touch on the content/advertisement area. That is, the user can request the execution of the predetermined function by touching the content/advertisement area. The information displayed through the content/advertisement area corresponds to the predetermined function information determined according to the motion information and the situation information of the device 1400, and may be stored in the storage unit 1430 in advance.

The gateway screen shown in FIG. 18D includes the manufacturer identification information of the device 1400, a lock (or screen lock setting state) release message, and an icon for selecting at least one executable application. The gateway screen shown in FIG. 18E includes the manufacturer identification information of the device 1400, a lock release message, and information based on the content/advertisement area with which the user can select whether the device 1400 executes the predetermined function. The gateway screen shown in FIG. 18F includes the manufacturer identification information of the device 1400, a lock release message, information (the content/advertisement area) with which the user can select whether to execute the predetermined function, and information (an icon) for selecting at least one executable application.

The gateway screen is not limited to FIGS. 18A to 18F. For example, the gateway screen may not include the "unlock" message shown in FIGS. 18A, 18D, 18E, and 18F.

The gateway screens shown in FIGS. 18A to 18F can also be applied to the device 100 shown in FIG. 1 or the device 1100 shown in FIG. 11. When applied to FIGS. 1 and 11, the gateway screens shown in FIGS. 18A to 18F can be regarded as being provided according to the operation mode state of the device and the motion information of the device, without considering the situation information described above.

FIG. 19A is a flowchart for explaining a method of executing a function of a device, according to another preferred embodiment of the present invention. FIG. 19A shows the operation flowchart of FIG. 16A further including a function for selecting whether to execute the function through the embedded gateway screen. Steps S1901 to S1904 in FIG. 19A are the same as steps S1601 to S1604 in FIG. 16A, and a description thereof is therefore omitted below to avoid redundancy.

If the operation mode state of the device 1400 is in the standby mode in step S1904, the processor 1450 reads and executes the embedded gateway from the storage unit 1430 (S1905). The embedded gateway screen to be executed at this time may correspond to one of the screens of Figs. 18B to 18F.

Thus, the manufacturer identification information of the device 1400 and information for selecting whether to execute the function are output through the information input/output unit 1440. In particular, when a plurality of executable functions are provided, a plurality of pieces of information for selecting whether to execute each function, as shown in one of FIGS. 18B, 18D, and 18F, can be displayed. The information for selecting whether to execute the function corresponds to predetermined function information determined based on the predetermined motion information of the device, the detected situation information, and the information on the operation mode state of the device 1400.

When an execution request for a predetermined function is received in step S1906, the processor 1450 executes the predetermined function for which execution is requested (S1907). However, if the function execution is not requested in step S1906, the processor 1450 returns to the operating state before the motion information of the device 1400 was recognized.

When the operation mode state of the device 1400 is the standby mode state, it can be determined that the execution of the function is not requested when no execution request is received for a predetermined time or more. The predetermined time may be managed by the processor 1450, or may be stored in the storage unit 1430 and managed.
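
One hedged way to realize the timeout rule above is sketched below in Kotlin; the polling interval and the function names are assumptions for illustration.

    // Sketch of the predetermined-time rule: no execution request within the timeout means "not requested".
    fun awaitExecutionRequest(timeoutMillis: Long, requestReceived: () -> Boolean): Boolean {
        val deadline = System.currentTimeMillis() + timeoutMillis
        while (System.currentTimeMillis() < deadline) {
            if (requestReceived()) return true  // execution requested: execute the function (S1907)
            Thread.sleep(50)                    // assumed polling interval
        }
        return false                            // not requested: return to the previous operating state
    }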

In step S1904, if the operation mode state of the device 1400 is not the standby mode state (that is, the operation mode state of the device 1400 is the active mode state), the processor 1450 outputs information for selecting whether to execute the function (S1909). The information output at this time may be output in the form of a pop-up window, but is not limited thereto. Further, the output information is information for selecting whether to execute a predetermined function determined according to the operation mode state of the device 1400, the motion information, and the situation information.

If a function execution request is received based on the information displayed on the device 1400, the processor 1450 executes the requested function in step S1907. If a function execution request is not received based on the displayed information, the processor 1450 returns to the operating state prior to the recognition of the motion information of the device 1400.

When the device 1400 is in the active mode state and the information for selecting whether to execute the function is displayed as described above, it can be determined that a function execution request is not received if another function or menu is requested or selected, or if execution is not requested for a certain period of time as described above.

FIG. 19B is a flowchart illustrating a method of executing a function of a device, according to another preferred embodiment of the present invention. FIG. 19B shows the operation flowchart of FIG. 16B further including a function for selecting whether to execute the function through the embedded gateway screen. Steps S1910 to S1913 in FIG. 19B are performed similarly to steps S1608 to S1611 in FIG. 16B, and steps S1914 to S1917 are performed similarly to steps S1905 to S1908 in FIG. 19A.

FIG. 20 illustrates an example in which the device 1400 is requested to execute a predetermined function through an embedded gateway screen that includes, as shown in FIG. 18F, information (e.g., an icon) for selecting at least one executable application and information (e.g., a content/advertisement area) with which the user can select whether to execute a predetermined function.

That is, when upside-down motion information and situation information of the device 1400 are detected while the operation mode state of the device 1400 is the standby mode state and a black screen state (screen 2010), the screen 2020 of FIG. 20 is displayed according to the predetermined function information determined based on the motion information and the situation information of the device 1400.

As the content/advertisement area is selected on the screen 2020 of FIG. 20, a screen 2030 including the train schedule information is displayed. At this time, the train image displayed on the screen 2030, or an image obtained by reducing its size, may be displayed in the content/advertisement area, or the information may be provided in the form of text.

The above-described step S1504 of FIG. 15, steps S1606 and S1607 of FIG. 16A, step S1613 of FIG. 16B, step S1907 of FIG. 19A, and step S1917 of FIG. 19B may be modified to include the processes shown in FIGS. 7 and 9.

FIG. 21 is a functional block diagram of a device according to another preferred embodiment of the present invention. FIG. 21 is an example in which the situation information detection unit 1420 included in the functional block diagram of the device 1400 of FIG. 14 is added to the functional block diagram of FIG. 11. Accordingly, the processor 2111 may operate like the processor 1450 of FIG. 14.

The processor 2111 includes a first processor 2112 and a second processor 2113 as shown in FIG. 21. The first processor 2112 can receive the sensing value from the sensing unit 2102 based on the SSP mentioned in FIG. 5, and can receive the situation information from the situation information detection unit 2103 based on the seamless situation information detection platform mentioned in FIG. 14.

The touch screen 2104, the camera 2105, the audio input unit 2106, the audio output unit 2107, the wireless communication unit 2109, the wired communication unit 2110, and the power supply unit 2115 shown in FIG. 21 are configured like the touch screen 1103, the camera 1104, the audio input unit 1105, the audio output unit 1106, the wireless communication unit 1108, the wired communication unit 1109, and the power supply unit 1114 shown in FIG. 11, respectively.

The storage unit 2108 shown in FIG. 21 is configured similarly to the storage unit 1430 shown in FIG. 14 and the storage unit 1107 shown in FIG. 11, and can store the information and the at least one program stored in the storage unit 1430 and the storage unit 1107.

The standby mode state of the device 2100 may include the states described in FIGS. 1, 11, and 14. That is, the standby mode state of the device 2100 may include a power consumption state based on the sensing unit 2102, some components of the situation information detection unit 2103, the first processor 2112, and the storage unit 2108. The standby mode state of the device 2100 may include a power consumption state based on the sensing unit 2102, the situation information detection unit 2103, the first processor 2112, and the storage unit 2108. The standby mode state of the device 2100 may include a state in which power is not consumed by the components included in the device 2100 other than the sensing unit 2102, the some components of the situation information detection unit 2103, the first processor 2112, and the storage unit 2108. The standby mode state of the device 2100 may also include a state in which power is not consumed by the components included in the device 2100 other than the sensing unit 2102, the situation information detection unit 2103, the first processor 2112, and the storage unit 2108.

The standby mode state of the device 2100 may include at least one of a deactivation state of a function related to the touch screen 2104, as referred to in FIG. 11, and a screen lock setting state of the device 2100.

One or more programs including instructions for executing a method of performing a function of a device according to embodiments of the present invention may be recorded as computer-readable code on a computer-readable recording medium. A computer-readable recording medium includes all kinds of storage devices in which data that can be read by a computer system is stored. Examples of the computer-readable recording medium include ROM, RAM, CD-ROM, magnetic tape, floppy disks, optical data storage devices, and the like. The computer-readable recording medium may also be distributed over networked computer systems so that the computer-readable code is stored and executed in a distributed manner.

The present invention has been described with reference to the preferred embodiments. It will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the spirit and scope of the invention as defined by the appended claims. Therefore, the disclosed embodiments should be considered in an illustrative rather than a restrictive sense. The scope of the present invention is defined by the appended claims rather than by the foregoing description, and all differences within the scope of equivalents thereof should be construed as being included in the present invention.

Claims (27)

  1. A device comprising:
    a motion detection unit for detecting a motion of the device in a standby mode of the device;
    a storage unit for storing motion information based on the motion and at least one piece of function information corresponding to the motion information; and
    a control unit for controlling the device to execute a function corresponding to the motion information of the device in the standby mode by using the detected motion, the motion information, and the at least one piece of function information.
  2. The device of claim 1,
    wherein the standby mode of the device comprises at least one of a state in which power is consumed by the motion detection unit, the storage unit, and the processor, and a state in which an operation of the device based on the motion detection unit, the storage unit, and the processor is being performed.
  3. The device of claim 1,
    wherein the standby mode of the device comprises at least one of a state in which power is not consumed by other components included in the device other than the motion detection unit, the storage unit, and the processor, and a state in which an operation of the device by the other components is not performed.
  4. The device of claim 1,
    wherein the standby mode of the device comprises an idle state of an application processor included in the device.
  5. The device of claim 1,
    wherein the standby mode of the device comprises at least one of a deactivation state of a function related to a touch screen included in the device and a screen lock setting state of the device.
  6. The device of claim 1, wherein the function related to the touch screen comprises:
    at least one of a touch sensing function of the touch screen and a display function of the touch screen.
  7. The device of claim 1,
    wherein the standby mode of the device comprises at least one of an idle state of an application processor included in the device, a deactivation state of a function related to a touch screen included in the device, and a screen lock setting state of the device.
  8. The device of claim 1,
    wherein the control unit controls the device so that a gateway screen is displayed before executing the function.
  9. The device of claim 8,
    wherein the gateway screen comprises notification information indicating the execution of the function.
  10. The device of claim 8,
    wherein the gateway screen comprises selection information for selecting an execution mode for the function.
  11. The device of claim 8,
    wherein, when there are a plurality of functions corresponding to the motion information of the device, the gateway screen comprises selection information for selecting an execution mode for each of the plurality of functions.
  12. The device of claim 1,
    further comprising a situation information detection unit for detecting at least one piece of situation information about the device,
    wherein the storage unit stores mapping information relating the at least one piece of situation information about the device to the motion information and the at least one piece of function information, and
    wherein the function executed by the control unit is determined based on the at least one piece of situation information detected by the situation information detection unit, the information about the motion of the device, and the mapping information.
  13. The device of claim 12, wherein the at least one piece of situation information comprises:
    current time information, location information of the device, schedule information stored in the device, and log information of the device.
  14. A method of executing a function of a device, the method comprising:
    detecting a motion of the device in a standby mode of the device;
    detecting motion information based on the motion;
    detecting at least one piece of function information corresponding to the detected motion information; and
    executing a function based on the detected at least one piece of function information.
  15. The method of claim 14,
    wherein the standby mode of the device comprises at least one of a state in which power is consumed by the processor and a state in which an operation of the device based on the motion detection unit, the storage unit, and the processor is being performed.
  16. The method of claim 14,
    wherein the standby mode of the device comprises at least one of a state in which power is not consumed by other components included in the device other than the motion detection unit, the storage unit, and the processor, and a state in which an operation of the device by the other components is not performed.
  17. The method of claim 14,
    wherein the standby mode of the device comprises an idle state of an application processor included in the device.
  18. The method of claim 14,
    wherein the standby mode of the device comprises at least one of a deactivation state of a function related to a touch screen included in the device and a screen lock setting state of the device.
  19. The method of claim 14, wherein the function related to the touch screen comprises:
    at least one of a touch sensing function of the touch screen and a display function of the touch screen.
  20. The method of claim 14,
    wherein the standby mode of the device comprises at least one of an idle state of an application processor included in the device, a deactivation state of a function related to a touch screen included in the device, and a screen lock setting state of the device.
  21. The method of claim 14,
    further comprising displaying a gateway screen before the executing of the function.
  22. The method of claim 21,
    wherein the gateway screen comprises notification information indicating the execution of the function.
  23. The method of claim 21,
    wherein the gateway screen comprises selection information for selecting an execution mode for the function.
  24. The method of claim 21,
    wherein, when there are a plurality of functions corresponding to the motion information of the device, the gateway screen comprises selection information for selecting an execution mode for each of the plurality of functions.
  25. The method of claim 14,
    further comprising detecting at least one piece of situation information about the device,
    wherein the detecting of the function information detects the at least one piece of function information by using the detected motion information and the detected at least one piece of situation information.
  26. The method of claim 25, wherein the at least one piece of situation information comprises:
    current time information, location information of the device, schedule information stored in the device, and log information of the device.
  27. A computer-readable recording medium having recorded thereon one or more programs including instructions for executing the method of performing a function of a device as claimed in any one of claims 14 to 26.
KR1020130084384A 2013-01-29 2013-07-17 Method for executing function of device, and device thereof KR20140096956A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
KR1020130010102 2013-01-29
KR20130010102 2013-01-29

Applications Claiming Priority (11)

Application Number Priority Date Filing Date Title
IN331CH2014 IN2014CH00331A (en) 2013-01-29 2014-01-27
RU2015136861A RU2635246C2 (en) 2013-01-29 2014-01-28 Method of performing device function and device for execution of method
PCT/KR2014/000773 WO2014119894A1 (en) 2013-01-29 2014-01-28 Method of performing function of device and device for performing the method
RU2017136528A RU2017136528A (en) 2013-01-29 2014-01-28 Method for performing the device function and the device for performing the method
AU2014213152A AU2014213152B2 (en) 2013-01-29 2014-01-28 Method of performing function of device and device for performing the method
CN201410043726.3A CN103970441B (en) 2013-01-29 2014-01-29 Execute the method and apparatus for carrying out the process of the function of equipment
US14/167,226 US10540013B2 (en) 2013-01-29 2014-01-29 Method of performing function of device and device for performing the method
CN201810947170.9A CN109284001A (en) 2013-01-29 2014-01-29 Execute the method and apparatus for carrying out the process of the function of equipment
JP2014014143A JP6545432B2 (en) 2013-01-29 2014-01-29 Device function execution method and device therefor
EP14153011.3A EP2759922A3 (en) 2013-01-29 2014-01-29 Method of performing a function of a device based on motion of the device and device for performing the method
AU2016235039A AU2016235039B2 (en) 2013-01-29 2016-10-03 Method of performing function of device and device for performing the method

Publications (1)

Publication Number Publication Date
KR20140096956A true KR20140096956A (en) 2014-08-06

Family

ID=51744696

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020130084384A KR20140096956A (en) 2013-01-29 2013-07-17 Method for executing function of device, and device thereof

Country Status (4)

Country Link
KR (1) KR20140096956A (en)
AU (2) AU2014213152B2 (en)
IN (1) IN2014CH00331A (en)
RU (2) RU2017136528A (en)

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060139328A1 (en) * 2004-12-29 2006-06-29 Nina Maki Mobile communications terminal and a method therefor
US9086779B2 (en) * 2005-12-22 2015-07-21 Core Wireless Licensing S.A.R.L. Input device
US8896529B2 (en) * 2007-08-01 2014-11-25 Nokia Corporation Apparatus, methods, and computer program products providing context-dependent gesture recognition
KR101506488B1 (en) * 2008-04-04 2015-03-27 엘지전자 주식회사 Mobile terminal using proximity sensor and control method thereof
US7873849B2 (en) * 2009-09-02 2011-01-18 Apple Inc. Motion sensor data processing using various power management modes
KR101672212B1 (en) * 2010-06-15 2016-11-04 엘지전자 주식회사 Mobile terminal and operation method thereof
KR102006740B1 (en) * 2010-10-20 2019-08-02 삼성전자 주식회사 Method and apparatus for displaying screen in mobile terminal
KR101855250B1 (en) * 2010-11-03 2018-05-09 삼성전자 주식회사 Touch Control Method And Portable Device supporting the same

Also Published As

Publication number Publication date
AU2016235039B2 (en) 2017-11-09
AU2014213152B2 (en) 2016-07-07
AU2014213152A1 (en) 2015-07-02
IN2014CH00331A (en) 2015-04-03
RU2017136528A (en) 2019-02-08
AU2016235039A1 (en) 2016-10-27
RU2635246C2 (en) 2017-11-09
RU2015136861A (en) 2017-03-03

Similar Documents

Publication Publication Date Title
KR101685363B1 (en) Mobile terminal and operation method thereof
US9733752B2 (en) Mobile terminal and control method thereof
US8391697B2 (en) Mobile terminal and method of controlling the operation of the mobile terminal
US9081496B2 (en) Mobile terminal and method of controlling operation of the mobile terminal
JP6328797B2 (en) Transition from using one device to using another device
US9256283B2 (en) Mobile terminal and method of controlling operation thereof
US8745490B2 (en) Mobile terminal capable of controlling various operations using a multi-fingerprint-touch input and method of controlling the operation of the mobile terminal
KR101629645B1 (en) Mobile Terminal and Operation method thereof
US9588609B2 (en) Mobile terminal and method of controlling the operation of the mobile terminal
KR20100003587A (en) Controlling a mobile terminal
KR20140147557A (en) Mobile terminal and method for detecting a gesture to control functions
US8774869B2 (en) Mobile terminal and control method thereof
KR101672212B1 (en) Mobile terminal and operation method thereof
CN105830422B (en) Foldable electronic and its interface alternation method
KR20110080348A (en) Mobile terminal, mobile terminal system and operation control method thereof
EP2431851A2 (en) Mobile terminal and method for controlling operation of the mobile terminal
KR101643869B1 (en) Operating a Mobile Termianl with a Vibration Module
TWI625646B (en) Method, electronic device and non-transitory computer-readable storage medium for managing alerts on reduced-size user interfaces
US8827811B2 (en) Mobile terminal capable of providing multiplayer game and operating method of the mobile terminal
KR20140079110A (en) Mobile terminal and operation method thereof
KR101933289B1 (en) Devices and methods for a ring computing device
KR20120001476A (en) Mobile terminal and operation control method thereof
CN102238282B (en) Mobile terminal capable of providing multiplayer game and operating method thereof
EP2874051A1 (en) Mobile terminal and control method thereof
AU2015312634B2 (en) Electronic device with bent display and method for controlling thereof

Legal Events

Date Code Title Description
A201 Request for examination
E902 Notification of reason for refusal