CN104008628A - Prompting method and equipment
- Publication number: CN104008628A
- Application number: CN201310058892.6A
- Authority: CN (China)
- Prior art keywords
- electronic equipment
- user
- characteristic information
- behavior characteristic
- head
- Prior art date
- Legal status: Pending (assumed; not a legal conclusion)
Landscapes
- Telephone Function (AREA)
Abstract
The invention relates to the field of electronic devices, and in particular to a prompting method and device. The method is applied to a first electronic device and comprises: obtaining user behavior characteristic information; judging whether the user behavior characteristic information satisfies a first preset condition, and obtaining a first judgment result; and outputting prompt information when the first judgment result shows that the user behavior characteristic information satisfies the first preset condition. With the method provided by the embodiments of the invention, user behavior characteristic information can be acquired, and when analysis of that information indicates that the user is in a fatigued state, alarm prompt information is output so as to avoid danger.
Description
Technical field
The present invention relates to the field of electronic devices, and in particular to a prompting method and device.
Background art
When a user is in a fatigued state, failing to remind the user in time tends to create hidden dangers. For example, when a user is driving and, because of fatigue, the user's eyes are not looking at the road ahead, a serious safety hazard may arise. In the prior art there is no method or device that can detect a user's fatigued state and output an alarm prompt.
Summary of the invention
To solve the above technical problem, embodiments of the present invention provide a prompting method and device that can detect user behavior characteristics and output prompt information when analysis of those characteristics indicates that the user is in a fatigued state. The technical solution is as follows:
According to a first aspect of the embodiments of the present invention, a prompting method is disclosed. The method is applied to a first electronic device and comprises:
obtaining user behavior characteristic information;
judging whether the user behavior characteristic information satisfies a first preset condition, and obtaining a first judgment result; and
outputting prompt information when the first judgment result shows that the user behavior characteristic information satisfies the first preset condition.
In combination with the first aspect, in a first possible implementation, the first electronic device has a first sensor, and obtaining user behavior characteristic information comprises:
utilizing the first sensor to obtain a first signal, the first signal being used to characterize the user's head behavior characteristic information.
In combination with the first aspect, in a second possible implementation, the first electronic device has a data connection with a second electronic device, the second electronic device has a second sensor, and obtaining user behavior characteristic information comprises:
receiving a second signal sent by the second electronic device, the second signal being collected by the second electronic device using the second sensor and being used to characterize the user's head behavior characteristic information.
In combination with the first or second possible implementation of the first aspect, in a third possible implementation, judging whether the user behavior characteristic information satisfies the first preset condition and obtaining the first judgment result comprises:
determining that the first preset condition is satisfied when analysis of the first signal shows that the user's head is in a first state.
In combination with the first aspect, in a fourth possible implementation, the first electronic device has a first image capture module, and obtaining user behavior characteristic information comprises:
utilizing the first image capture module to collect the user's head behavior characteristic information and/or eye behavior characteristic information, and obtaining a head behavior characteristic recognition result and/or an eye behavior characteristic recognition result of the user.
In combination with the first aspect, in a fifth possible implementation, the first electronic device has a data connection with a second electronic device, the second electronic device has a second image capture module, and obtaining user behavior characteristic information comprises:
receiving the user's head behavior characteristic recognition result and/or eye behavior characteristic recognition result sent by the second electronic device, the recognition result being obtained by the second electronic device collecting the user's head behavior characteristic information and/or eye behavior characteristic information with the second image capture module and performing image recognition on the collected information.
In combination with the fourth or fifth possible implementation of the first aspect, in a sixth possible implementation, judging whether the user behavior characteristic information satisfies the first preset condition and obtaining the first judgment result comprises:
analyzing the head behavior characteristic recognition result and/or the eye behavior characteristic recognition result; and
determining that the first preset condition is satisfied when the user's head is judged to be in a first state or the user's eye behavior is judged to show that the user is in a fatigued state.
In combination with the first aspect, in a seventh possible implementation, outputting prompt information comprises:
outputting voice prompt information through a voice output module; and/or
outputting vibration prompt information through a vibration module; and/or
outputting text or picture prompt information through a display module.
According to a second aspect of the embodiments of the present invention, a prompting method is disclosed. The method is applied to a second electronic device that has a data connection with a first electronic device, and comprises:
collecting user behavior characteristic information; and
sending the collected user behavior characteristic information to the first electronic device, so that the first electronic device obtains the user behavior characteristic information, judges whether the user behavior characteristic information satisfies a first preset condition to obtain a first judgment result, and outputs prompt information when the first judgment result shows that the user behavior characteristic information satisfies the first preset condition.
In combination with the second aspect, in an eighth possible implementation, the second electronic device has a second sensor, and collecting user behavior characteristic information comprises:
utilizing the second sensor to collect a second signal, the second signal being used to characterize the user's head behavior characteristic information.
In combination with the second aspect or the eighth possible implementation, in a ninth possible implementation, the second electronic device has a fixing module that maintains the relative position of the second electronic device and the user's head when the user wears the second electronic device, and the second electronic device is different from the first electronic device.
In combination with the second aspect, in a tenth possible implementation, the second electronic device has a second image capture module, and collecting user behavior characteristic information comprises:
utilizing the second image capture module to collect the user's head behavior characteristic information and/or eye behavior characteristic information, and obtaining a head behavior characteristic recognition result and/or an eye behavior characteristic recognition result of the user.
According to a third aspect of the embodiments of the present invention, a first electronic device is disclosed, comprising:
an acquisition module, configured to obtain user behavior characteristic information;
a judgment module, configured to receive the user behavior characteristic information sent by the acquisition module, judge whether the user behavior characteristic information satisfies a first preset condition, and obtain a first judgment result; and
an output module, configured to receive the first judgment result sent by the judgment module and output prompt information when the first judgment result shows that the user behavior characteristic information satisfies the first preset condition.
In combination with the third aspect, in an eleventh possible implementation, the first electronic device has a first sensor, and the acquisition module is specifically configured to:
utilize the first sensor to obtain a first signal, the first signal being used to characterize the user's head behavior characteristic information.
In combination with the third aspect, in a twelfth possible implementation, the first electronic device has a data connection with a second electronic device, the second electronic device has a second sensor, and the acquisition module is specifically configured to:
receive a second signal sent by the second electronic device, the second signal being collected by the second electronic device using the second sensor and being used to characterize the user's head behavior characteristic information.
In combination with the third aspect, in a thirteenth possible implementation, the first electronic device has a first image capture module, and the acquisition module is specifically configured to:
utilize the first image capture module to collect the user's head behavior characteristic information and/or eye behavior characteristic information, and obtain a head behavior characteristic recognition result and/or an eye behavior characteristic recognition result of the user.
In combination with the third aspect, in a fourteenth possible implementation, the first electronic device has a data connection with a second electronic device, the second electronic device has a second image capture module, and the acquisition module is specifically configured to:
receive the user's head behavior characteristic recognition result and/or eye behavior characteristic recognition result sent by the second electronic device, the recognition result being obtained by the second electronic device collecting the user's head behavior characteristic information and/or eye behavior characteristic information with the second image capture module and performing image recognition on the collected information.
In combination with the third aspect, in a fifteenth possible implementation, the first electronic device has a fixing module that maintains the relative position of the first electronic device and the user's head when the user wears the first electronic device.
According to a fourth aspect of the embodiments of the present invention, a second electronic device is disclosed. The second electronic device has a data connection with a first electronic device and comprises:
a collection module, configured to collect user behavior characteristic information; and
a sending module, configured to send the collected user behavior characteristic information to the first electronic device, so that the first electronic device obtains the user behavior characteristic information, judges whether the user behavior characteristic information satisfies a first preset condition to obtain a first judgment result, and outputs prompt information when the first judgment result shows that the user behavior characteristic information satisfies the first preset condition.
In combination with the fourth aspect, in a sixteenth possible implementation, the second electronic device has a fixing module that maintains the relative position of the second electronic device and the user's head when the user wears the second electronic device, and the second electronic device is different from the first electronic device.
A beneficial effect of one aspect of the embodiments of the present invention is as follows: user behavior characteristic information is obtained, whether it satisfies a first preset condition is judged to obtain a first judgment result, and prompt information is output when the first judgment result shows that the user behavior characteristic information satisfies the first preset condition. With the method provided by the embodiments of the present invention, user behavior characteristic information can be collected, and when analysis of that information indicates that the user is in a fatigued state, alarm prompt information is output so as to avoid danger.
Brief description of the drawings
To describe the technical solutions in the embodiments of the present invention or in the prior art more clearly, the accompanying drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below show only some embodiments recorded in the present invention, and a person of ordinary skill in the art may obtain other drawings from them without creative effort.
Fig. 1 is a schematic diagram of a first embodiment of the prompting method provided by an embodiment of the present invention;
Fig. 2 is a schematic diagram of a second embodiment of the prompting method provided by an embodiment of the present invention;
Fig. 3 is a schematic diagram of a third embodiment of the prompting method provided by an embodiment of the present invention;
Fig. 4 is a schematic diagram of a fourth embodiment of the prompting method provided by an embodiment of the present invention;
Fig. 5 is a schematic diagram of a fifth embodiment of the prompting method provided by an embodiment of the present invention;
Fig. 6 is a schematic diagram of a sixth embodiment of the prompting method provided by an embodiment of the present invention;
Fig. 7 is a schematic diagram of a first electronic device provided by an embodiment of the present invention;
Fig. 8 is a schematic diagram of a second electronic device provided by an embodiment of the present invention.
Detailed description of the embodiments
Embodiments of the present invention provide a prompting method and device that can detect user behavior characteristics and output prompt information when analysis of those characteristics indicates that the user is in a fatigued state.
To enable those skilled in the art to better understand the technical solutions of the present invention, the technical solutions in the embodiments of the present invention are described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only a part of the embodiments of the present invention rather than all of them. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present invention without creative effort shall fall within the protection scope of the present invention.
Referring to Fig. 1, which is a flowchart of the first embodiment of the prompting method provided by the present invention.
A prompting method is applied to a first electronic device. The first electronic device includes, but is not limited to, a desktop computer, a laptop computer, a mobile terminal (including a smartphone, a feature phone, and various tablet computers), a vehicle-mounted terminal, and the like.
S101: obtain user behavior characteristic information.
User behavior characteristic information is information that characterizes the user's current state. In the embodiments of the present invention, the purpose of obtaining user behavior characteristic information is to judge, from the obtained information, whether the user is currently in a fatigued state. In a specific implementation, the user behavior characteristic information may include head behavior characteristic information and/or eye behavior characteristic information. When the user behavior characteristic information is head behavior characteristic information, it may include information such as whether the user's head is lowered, raised, or held level in a forward-looking position. When the user behavior characteristic information is eye behavior characteristic information, it may include information such as whether the user's eyes are open or closed and whether the pupils are dilated or contracted. Of course, those skilled in the art will appreciate that these are merely exemplary descriptions and are not to be regarded as limiting the present invention. The user behavior characteristic information may also include other information, and other ways of obtaining it that a person skilled in the art can derive without creative effort all fall within the protection scope of the present invention.
It should be noted that there are also multiple ways of obtaining the user behavior characteristic information. In a specific implementation, the first electronic device may collect the user behavior characteristic information directly. Alternatively, the first electronic device may obtain it through an external path, for example from another electronic device or apparatus in communication connection with the first electronic device. When the first electronic device obtains the user behavior characteristic information through an external path, the other electronic device or apparatus may be an independent device with a processor that has a collection module for collecting the user behavior characteristic information. Of course, the first electronic device may also obtain the user behavior characteristic information through an external collection module connected to it in communication; in that case the external collection module may be a module without a processor that can still perform the collection function.
S102: judge whether the user behavior characteristic information satisfies a first preset condition, and obtain a first judgment result.
The first preset condition may be that analysis of the user behavior characteristic information shows that the user is in a fatigued state. In a specific implementation, when the user behavior characteristic information is head behavior characteristic information, the first preset condition may be that the user's head is judged to be in a first state: the user's head is in a second state when it is level, and in the first state when it is not level. For example, it may be preset that when the angle between the user's head and the horizontal plane exceeds a first preset threshold, the user's head is judged to be in the first state and the first preset condition is determined to be satisfied. The first preset threshold may be set to 20 degrees or 30 degrees; the specific value may be set by the system or the user as required and is not limited here. Alternatively, it may be preset that when the change in the angle between the user's head and the horizontal plane exceeds a second preset threshold, the user's head is judged to be in the first state and the first preset condition is determined to be satisfied. The specific judgment method is flexible and can be set as required.
When the user behavior characteristic information is eye behavior characteristic information, the first preset condition may be that the user's eye behavior is judged to show that the user is in a fatigued state. In a normal state, a person's pupil size and the degree to which the eyes are open (for example, the distance between the upper and lower eyelids) lie within certain numerical ranges. When a person is fatigued, the pupils may dilate and their size changes, and the distance between the upper and lower eyelids decreases. The present invention uses exactly this eye behavior characteristic information to judge whether the user is in a fatigued state. In a specific implementation, the first electronic device may collect the user's eye behavior characteristic information in the normal state through training and learning, and compare the currently collected eye behavior characteristic information with the normal-state information to judge whether the user is fatigued. For example, it may be set that the first preset condition is determined to be satisfied when the user's pupil size exceeds a third preset threshold, or when, with the eyes open, the distance between the upper and lower eyelids is less than a fourth preset threshold.
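As an illustration only (the patent gives no formulas or feature definitions), a minimal sketch of the baseline comparison just described; the specific features, the learned baseline values, and the threshold factors below are all assumptions:

```python
from dataclasses import dataclass

@dataclass
class EyeFeatures:
    pupil_size: float   # e.g. pupil diameter measured from the eye image, in pixels
    eyelid_gap: float   # distance between upper and lower eyelids, in pixels

def eyes_show_fatigue(current: EyeFeatures,
                      baseline: EyeFeatures,
                      pupil_factor: float = 1.3,
                      eyelid_factor: float = 0.6) -> bool:
    """Compare the currently collected eye features with a baseline learned while
    the user was in a normal state; a clearly dilated pupil (third preset threshold)
    or a clearly narrowed eyelid opening (fourth preset threshold) is taken to mean
    the first preset condition is satisfied."""
    third_threshold = baseline.pupil_size * pupil_factor
    fourth_threshold = baseline.eyelid_gap * eyelid_factor
    return current.pupil_size > third_threshold or current.eyelid_gap < fourth_threshold

baseline = EyeFeatures(pupil_size=30.0, eyelid_gap=12.0)       # learned in the normal state
print(eyes_show_fatigue(EyeFeatures(42.0, 11.0), baseline))    # True: pupil clearly dilated
print(eyes_show_fatigue(EyeFeatures(31.0, 11.5), baseline))    # False: within normal range
```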
S103: output prompt information when the first judgment result shows that the user behavior characteristic information satisfies the first preset condition.
In a specific implementation, the prompt information may be output by:
outputting voice prompt information through a voice output module; and/or
outputting vibration prompt information through a vibration module; and/or
outputting text or picture prompt information through a display module.
In the first embodiment of the present invention, user behavior characteristic information is obtained, whether it satisfies a first preset condition is judged to obtain a first judgment result, and prompt information is output when the first judgment result shows that the first preset condition is satisfied. With the method provided in this embodiment, user behavior characteristic information can be collected, and when analysis of that information indicates that the user is in a fatigued state, alarm prompt information is output so as to avoid danger.
Several specific implementations of the present invention are introduced in detail below with reference to the accompanying drawings. In the second and third embodiments, the first electronic device collects the user behavior characteristic information directly. In the fourth and fifth embodiments, the first electronic device obtains the user behavior characteristic information through an external path.
Referring to Fig. 2, which is a schematic diagram of the second embodiment of the prompting method provided by an embodiment of the present invention.
In the second embodiment, the first electronic device has a first sensor used to collect a first signal that characterizes the user's head behavior characteristic information. The first sensor may be an acceleration sensor, an angular velocity sensor (gyroscope), or a gravity sensor (G-sensor).
S201: utilize the first sensor to obtain the first signal, the first signal being used to characterize the user's head behavior characteristic information.
In a specific implementation, the first electronic device has a first sensor used to collect the first signal. When the first sensor is a gravity sensor or an acceleration sensor, it can sense the change in the pulling force or gravity acting on the first electronic device, measure the corresponding change in acceleration, and derive the tilt angle of the electronic device relative to the horizontal plane from that change. When the first sensor is an angular velocity sensor (gyroscope), it can measure the tilt angle of the electronic device relative to the horizontal plane directly. In a specific implementation, the first electronic device may have a fixing module that maintains the relative position of the first electronic device and the user's head when the user wears it; for example, the first electronic device may take the form of a glasses-type or head-mounted electronic device. When the tilt angle of the user's head changes, the first sensor can output the first signal.
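For illustration, a minimal sketch of how the tilt angle described above might be derived from a 3-axis accelerometer reading; the axis convention and the sample values are assumptions, not part of the disclosure:

```python
import math

def tilt_angle_deg(ax: float, ay: float, az: float) -> float:
    """Estimate the device's tilt relative to the horizontal plane, in degrees,
    from a 3-axis accelerometer reading. Assumes the z axis points straight up
    when the wearer's head is level, so gravity then lies entirely on z."""
    g = math.sqrt(ax * ax + ay * ay + az * az)
    if g == 0:
        raise ValueError("accelerometer reported zero magnitude")
    return math.degrees(math.acos(max(-1.0, min(1.0, az / g))))

print(round(tilt_angle_deg(0.1, 0.0, 9.8), 1))   # head roughly level  -> ~0.6
print(round(tilt_angle_deg(0.0, 6.9, 6.9), 1))   # head bowed forward -> 45.0
```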
S202: judge whether the first signal satisfies the first preset condition.
In a specific implementation, the first electronic device has a processing module for analyzing the first signal and judging whether it satisfies the first preset condition.
S203: when analysis of the first signal shows that the user's head is in the first state, determine that the first preset condition is satisfied.
In a specific implementation, the user's head is in the second state when it is level and in the first state when it is not level. It may also be preset that the user's head is judged to be in the first state, and the first preset condition determined to be satisfied, when the angle between the user's head and the horizontal plane exceeds the first preset threshold. The first preset threshold may be set to 20 degrees or 30 degrees; the specific value may be set by the system or the user as required and is not limited here. Alternatively, it may be preset that the user's head is judged to be in the first state, and the first preset condition determined to be satisfied, when the change in the angle between the user's head and the horizontal plane exceeds the second preset threshold. The specific judgment method is flexible and can be set as required.
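A possible sketch of the two judgment variants just mentioned (absolute angle versus change in angle); the threshold values and the sliding-window length are assumed, since the patent leaves them to the system or the user:

```python
from collections import deque

FIRST_THRESHOLD_DEG = 30.0    # absolute tilt that counts as the first state (assumed)
SECOND_THRESHOLD_DEG = 20.0   # change in tilt over the window that also counts (assumed)

class HeadStateJudge:
    """Keeps a short history of tilt angles and applies the two preset conditions
    described above: absolute angle, or change in angle over recent samples."""

    def __init__(self, window: int = 10):
        self.history = deque(maxlen=window)

    def in_first_state(self, tilt_deg: float) -> bool:
        self.history.append(tilt_deg)
        exceeds_absolute = tilt_deg > FIRST_THRESHOLD_DEG
        exceeds_change = (max(self.history) - min(self.history)) > SECOND_THRESHOLD_DEG
        return exceeds_absolute or exceeds_change

judge = HeadStateJudge()
print(judge.in_first_state(5.0), judge.in_first_state(35.0))   # False True
```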
S204: the first electronic device outputs prompt information.
In a specific implementation, when the first electronic device has a voice output module, it may output voice prompt information through the voice output module; when it has a vibration module, it may output vibration prompt information through the vibration module; and when it has a display module, it may output text or picture prompt information through the display module.
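A rough sketch of routing the prompt through whichever output modules are present, matching the "and/or" combination above; the module objects and their methods are placeholders, not an API defined by the patent:

```python
class _StubModule:
    """Stand-in for a real voice, vibration, or display module (placeholder only)."""
    def __init__(self, name: str):
        self.name = name
    def emit(self, payload: str) -> None:
        print(f"[{self.name}] {payload}")

def output_prompt(voice=None, vibration=None, display=None) -> None:
    """Route the alarm through whichever output modules the device has."""
    if voice:
        voice.emit("spoken warning: fatigue detected, please take a break")
    if vibration:
        vibration.emit("vibrate for 500 ms")
    if display:
        display.emit("text/picture warning: fatigue detected")

# A device with a voice module and a display module but no vibration module.
output_prompt(voice=_StubModule("voice"), display=_StubModule("display"))
```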
In the second embodiment of the present invention, the first electronic device can collect user behavior characteristic information directly through its own first sensor and judge whether to output prompt information by analyzing that information. In particular, in this embodiment the user behavior characteristic information collected by the first electronic device is head behavior characteristic information, so the first electronic device may also have a fixing module for keeping the relative position of the first electronic device and the user's head fixed. The first electronic device can then be worn directly on the user's head, conveniently collect the user's head behavior characteristic information, and output alarm prompt information to avoid danger when analysis shows that the user is in a fatigued state.
Referring to Fig. 3, which is a schematic diagram of the third embodiment of the prompting method provided by an embodiment of the present invention.
In the third embodiment, the first electronic device has a first image capture module, which may specifically be a camera.
S301: utilize the first image capture module to collect the user's head behavior characteristic information and/or eye behavior characteristic information.
The first image capture module is used to collect user behavior characteristic information. In a specific implementation, it can collect the user's head behavior characteristic information and/or eye behavior characteristic information.
Preferably, the first electronic device may have a fixing module that maintains the relative position of the first electronic device and the user's head when the user wears it; for example, the first electronic device may take the form of a glasses-type or head-mounted electronic device. Further, the first electronic device may have a second fixing part for fixing the image capture module so that the image capture module is located in the user's viewing area and faces the user's eyes.
Of course, the first electronic device may also be an ordinary non-head-mounted electronic device, such as a desktop computer, a laptop computer, or a mobile terminal (including a smartphone, a feature phone, and various tablet computers), that has an image capture module and can conveniently collect user behavior characteristic information.
S302: obtain the user's head behavior characteristic recognition result and/or eye behavior characteristic recognition result.
The first image capture module can perform recognition on the collected image to obtain the user's head behavior characteristic recognition result and/or eye behavior characteristic recognition result.
S303: analyze the head behavior characteristic recognition result and/or the eye behavior characteristic recognition result.
In a specific implementation, the first electronic device has a processing module for analyzing the head behavior characteristic recognition result and/or the eye behavior characteristic recognition result and judging whether the recognition result satisfies the first preset condition.
In a specific implementation, when the user behavior characteristic information collected by the first image capture module is head behavior characteristic information, whether the user's head is in the first state can be judged by analyzing the head behavior characteristic recognition result. The user's head is in the second state when it is level and in the first state when it is not level. For example, when the user's eyes look straight ahead, the user's head is level and the ratio of the user's hair to face in the image lies within a first preset range. When the user bows or raises the head, the hair-to-face ratio falls outside the first preset range; when analysis of the head behavior characteristic recognition result shows that the ratio is outside the first preset range, the user's head is judged to be in the first state and the first preset condition is determined to be satisfied.
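Purely as an illustration of the hair-to-face ratio test described above (the patent does not say how hair and face regions are segmented, and the range below is assumed):

```python
FIRST_PRESET_RANGE = (0.25, 0.45)   # assumed hair-to-face pixel ratio while looking straight ahead

def head_in_first_state(hair_pixels: int, face_pixels: int) -> bool:
    """Return True when the hair-to-face ratio falls outside the range observed
    while the user looks straight ahead, i.e. the head is bowed or raised.
    Segmenting hair and face pixels is left to the upstream image-recognition step."""
    if face_pixels == 0:
        return True   # no face visible is treated as the first state
    ratio = hair_pixels / face_pixels
    return not (FIRST_PRESET_RANGE[0] <= ratio <= FIRST_PRESET_RANGE[1])

print(head_in_first_state(3000, 9000))   # ratio 0.33 -> False, head level
print(head_in_first_state(7000, 9000))   # ratio 0.78 -> True, head bowed
```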
In a specific implementation, when the user behavior characteristic information collected by the first image capture module is eye behavior characteristic information, whether the user is in a fatigued state can be judged by analyzing the user's eye behavior characteristic recognition result. For example, the first electronic device may collect the user's eye behavior characteristic information in the normal state through training and learning, and compare the currently collected eye behavior characteristic information with the normal-state information to judge whether the user is fatigued. For instance, whether the user's pupil size is within the normal range can be judged from the eye behavior characteristic recognition result, and it can be set that the first preset condition is determined to be satisfied when the pupil size exceeds the third preset threshold. Likewise, whether the distance between the user's upper and lower eyelids is within the normal range can be judged from the recognition result, and it can be set that the first preset condition is determined to be satisfied when, with the eyes open, that distance is less than the fourth preset threshold.
Of course, those skilled in the art will appreciate that the head behavior characteristic recognition result and the eye behavior characteristic recognition result may also be analyzed in combination, as appropriate, to judge whether the first preset condition is satisfied.
S304: when the user's head is judged to be in the first state or the user's eye behavior is judged to show that the user is in a fatigued state, determine that the first preset condition is satisfied.
S305: the first electronic device outputs prompt information.
In a specific implementation, when the first electronic device has a voice output module, it may output voice prompt information through the voice output module; when it has a vibration module, it may output vibration prompt information through the vibration module; and when it has a display module, it may output text or picture prompt information through the display module.
Further, after the first electronic device outputs the prompt information, it may also generate a first control instruction to control the first electronic device to switch from a first operating mode to a second operating mode. For example, when the first electronic device collects user behavior characteristic information through the image capture module and judges that the user is in a fatigued state, it may further generate a control instruction controlling the first electronic device to switch from the normal operating state to a standby mode, or from the normal operating state to a power-off mode. In this way resources are saved, achieving the goal of saving power and energy.
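A minimal sketch of the optional first control instruction; the mode names and the choice of standby as the second operating mode are assumptions:

```python
from enum import Enum, auto

class Mode(Enum):
    NORMAL = auto()
    STANDBY = auto()
    SHUTDOWN = auto()

def first_control_instruction(fatigued: bool, current: Mode) -> Mode:
    """After the prompt has been output, optionally switch the device from its
    first operating mode to a second one; standby is used here, and shutdown is
    the other option the text mentions."""
    if fatigued and current is Mode.NORMAL:
        return Mode.STANDBY
    return current

print(first_control_instruction(True, Mode.NORMAL))   # Mode.STANDBY
```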
In the third embodiment of the present invention, the first electronic device can collect user behavior characteristic information directly through its own first image capture module, obtain the user's head behavior characteristic recognition result and/or eye behavior characteristic recognition result through image recognition, and judge whether to output prompt information by analyzing the recognition result. In particular, in this embodiment the user behavior characteristic information collected by the first electronic device may include both head behavior characteristic information and eye behavior characteristic information, so the analysis is very flexible.
Referring to Fig. 4, which is a schematic diagram of the fourth embodiment of the prompting method provided by an embodiment of the present invention.
In the fourth embodiment, the first electronic device is a device that does not have a sensor, or does not have the sensor described above, but it has a communication module and can receive signals or data sent by an external device, apparatus or module so as to obtain user behavior characteristic information. In this embodiment, the external device, apparatus or module in communication connection with the first electronic device is referred to as the second electronic device. Those skilled in the art will appreciate that the second electronic device may be an independent device with a processor or a module without a processor.
S401: receive a second signal sent by the second electronic device.
The second signal is collected by the second electronic device using the second sensor and is used to characterize the user's head behavior characteristic information. In this embodiment, both the first electronic device and the second electronic device have communication modules; preferably, the communication module is a wireless communication module. The second electronic device has a second sensor used to collect the second signal, which characterizes the user's head behavior characteristic information. The second sensor may be an acceleration sensor, an angular velocity sensor (gyroscope), or a gravity sensor (G-sensor).
In a specific implementation, when the second sensor is a gravity sensor or an acceleration sensor, it can sense the change in the pulling force or gravity acting on the second electronic device, measure the corresponding change in acceleration, and derive the tilt angle of the electronic device relative to the horizontal plane from that change. When the second sensor is an angular velocity sensor (gyroscope), it can measure the tilt angle of the electronic device relative to the horizontal plane directly. In a specific implementation, the second electronic device may have a fixing module that maintains the relative position of the second electronic device and the user's head when the user wears it; for example, the second electronic device may take the form of a glasses-type or head-mounted electronic device. When the tilt angle of the user's head changes, the second sensor can output the second signal, which is then sent to the first electronic device.
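As one possible reading of the communication step, a sketch of the first electronic device receiving the second signal; the patent only requires a (preferably wireless) communication module, so the TCP transport, port, framing, and JSON payload below are all assumptions:

```python
import json
import socket

def receive_second_signal(host: str = "0.0.0.0", port: int = 9100) -> dict:
    """Accept one connection from the second electronic device and read a single
    newline-terminated JSON message carrying the second signal."""
    with socket.create_server((host, port)) as server:
        conn, _addr = server.accept()
        with conn:
            raw = conn.makefile("rb").readline()
    return json.loads(raw)   # e.g. {"tilt_deg": 38.5}

# signal = receive_second_signal()
# if signal["tilt_deg"] > 30.0:
#     print("head in first state -> output prompt information")
```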
S402: judge whether the second signal satisfies the first preset condition.
In a specific implementation, the first electronic device has a processing module for analyzing the second signal and judging whether it satisfies the first preset condition.
S403: when analysis of the second signal shows that the user's head is in the first state, determine that the first preset condition is satisfied.
The specific analysis method is the same as that introduced in the second embodiment of the present invention and is not repeated here.
S404: the first electronic device outputs prompt information.
In a specific implementation, when the first electronic device has a voice output module, it may output voice prompt information through the voice output module; when it has a vibration module, it may output vibration prompt information through the vibration module; and when it has a display module, it may output text or picture prompt information through the display module.
In the fourth embodiment of the present invention, the first electronic device has a communication module and can receive a second signal sent by an external device, apparatus or module, the second signal characterizing the user's head behavior characteristic information. The first electronic device judges whether to output prompt information by analyzing the user behavior characteristic information. Because the first electronic device itself does not need a sensor in this embodiment, it only needs to receive the sensor signal sent by the external device, apparatus or module to obtain the user behavior characteristic information and judge whether the user is in a fatigued state, so the alarm function can be realized at lower cost.
Referring to Fig. 5, which is a schematic diagram of the fifth embodiment of the prompting method provided by an embodiment of the present invention.
In the fifth embodiment, the first electronic device may be a device that does not have an image capture module. It has a communication module and can receive signals or data sent by an external device, apparatus or module so as to obtain user behavior characteristic information. In this embodiment, the external device, apparatus or module in communication connection with the first electronic device is referred to as the second electronic device, and the second electronic device has a second image capture module. Those skilled in the art will appreciate that the second electronic device may be an independent device with a processor or an image capture module without a processor.
S501: receive the user's head behavior characteristic recognition result and/or eye behavior characteristic recognition result sent by the second electronic device.
In this embodiment, both the first electronic device and the second electronic device have communication modules; preferably, the communication module is a wireless communication module. The second electronic device has a second image capture module, and the user's head behavior characteristic recognition result and/or eye behavior characteristic recognition result is obtained by the second electronic device collecting the user's head behavior characteristic information and/or eye behavior characteristic information with the second image capture module and performing image recognition on the collected information.
S502: analyze the head behavior characteristic recognition result and/or the eye behavior characteristic recognition result.
In a specific implementation, the first electronic device has a processing module for analyzing the head behavior characteristic recognition result and/or the eye behavior characteristic recognition result and judging whether the recognition result satisfies the first preset condition. The specific process is the same as in the third embodiment and is not repeated here.
S503: when the user's head is judged to be in the first state or the user's eye behavior is judged to show that the user is in a fatigued state, determine that the first preset condition is satisfied.
S504: the first electronic device outputs prompt information.
In the fifth embodiment of the present invention, the first electronic device has a communication module, can receive the user's head behavior characteristic recognition result and/or eye behavior characteristic recognition result sent by an external device, apparatus or module, and judges whether to output prompt information by analyzing the received recognition result. Because the first electronic device itself does not need an image capture module in this embodiment, it only needs to receive the recognition result sent by the external device, apparatus or module to judge whether the user is in a fatigued state, so the alarm function can be realized at lower cost.
Referring to Fig. 6, which is a schematic diagram of the sixth embodiment of the prompting method provided by an embodiment of the present invention.
In the sixth embodiment, the second electronic device has a data connection with the first electronic device and is used to collect user behavior characteristic information. The second electronic device has a communication module through which it can send the collected user behavior characteristic information to the first electronic device, so that the first electronic device outputs prompt information when analysis of that information shows that the user is in a fatigued state. Those skilled in the art will appreciate that in this embodiment the second electronic device may be an independent device with a processor, or a "collection module" without a processor that can collect user behavior characteristic information and send the collected data through the communication module.
A prompting method applied to a second electronic device that has a data connection with a first electronic device comprises:
S601: collect user behavior characteristic information.
In a specific implementation, the second electronic device has a second sensor, and collecting user behavior characteristic information comprises utilizing the second sensor to collect a second signal that characterizes the user's head behavior characteristic information. The second sensor may be an acceleration sensor, an angular velocity sensor (gyroscope), or a gravity sensor (G-sensor). When the second sensor is a gravity sensor or an acceleration sensor, it can sense the change in the pulling force or gravity acting on the second electronic device, measure the corresponding change in acceleration, and derive the tilt angle of the electronic device relative to the horizontal plane from that change. When the second sensor is an angular velocity sensor (gyroscope), it can measure the tilt angle of the electronic device relative to the horizontal plane directly. In a specific implementation, the second electronic device may have a fixing module that maintains the relative position of the second electronic device and the user's head when the user wears it; for example, the second electronic device may take the form of a glasses-type or head-mounted electronic device. When the tilt angle of the user's head changes, the second sensor can output the second signal.
Alternatively, the second electronic device has a second image capture module, and collecting user behavior characteristic information comprises utilizing the second image capture module to collect the user's head behavior characteristic information and/or eye behavior characteristic information and obtaining the user's head behavior characteristic recognition result and/or eye behavior characteristic recognition result. Preferably, the second electronic device may have a fixing module that maintains the relative position of the second electronic device and the user's head when the user wears it; for example, the second electronic device may take the form of a glasses-type or head-mounted electronic device. Further, the second electronic device may have a second fixing part for fixing the image capture module so that the image capture module is located in the user's viewing area and faces the user's eyes.
S602: send the collected user behavior characteristic information to the first electronic device, so that the first electronic device obtains the user behavior characteristic information, judges whether it satisfies a first preset condition to obtain a first judgment result, and outputs prompt information when the first judgment result shows that the user behavior characteristic information satisfies the first preset condition.
The second electronic device has a communication module through which it can establish a communication connection with the first electronic device and send the collected user behavior characteristic information to the first electronic device. Specifically, the second electronic device sends the collected second signal to the first electronic device, or sends the obtained head behavior characteristic recognition result and/or eye behavior characteristic recognition result to the first electronic device. The process by which the first electronic device analyzes and judges the received user behavior characteristic information is described in the fourth and fifth embodiments.
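A counterpart sketch for the sending side of the second electronic device, mirroring the receiving sketch in the fourth embodiment; the address, port, payload format, and the read_tilt_from_second_sensor helper are hypothetical:

```python
import json
import socket

def send_user_behavior(sample: dict, host: str = "192.168.0.10", port: int = 9100) -> None:
    """Send one collected sample (a second signal or a recognition result) to the
    first electronic device as a newline-terminated JSON message."""
    with socket.create_connection((host, port)) as conn:
        conn.sendall((json.dumps(sample) + "\n").encode("utf-8"))

# Example: forward a single head-tilt reading from the worn device.
# send_user_behavior({"tilt_deg": read_tilt_from_second_sensor()})
```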
In the sixth embodiment of the present invention, when the second electronic device collects user behavior characteristic information, it sends the collected information to the first electronic device through the communication module, so that the first electronic device judges whether the information satisfies the first preset condition, obtains the first judgment result, and outputs prompt information when the first judgment result shows that the first preset condition is satisfied. Because the second electronic device can be a module without a processor that only needs to collect user behavior characteristic information, cost is saved. For example, a sensor or an image capture module can be added to an ordinary head-worn device (such as glasses or a Bluetooth headset) and given a communication function, which achieves the object of the present invention while saving cost. In particular, a glasses-type or head-mounted second electronic device with a fixing module can conveniently collect user behavior characteristic information and is easy to carry.
Referring to Fig. 7, which is a schematic diagram of the first electronic device provided by an embodiment of the present invention.
A first electronic device comprises:
an acquisition module 701, configured to obtain user behavior characteristic information;
a judgment module 702, configured to receive the user behavior characteristic information sent by the acquisition module, judge whether the user behavior characteristic information satisfies a first preset condition, and obtain a first judgment result; and
an output module 703, configured to receive the first judgment result sent by the judgment module and output prompt information when the first judgment result shows that the user behavior characteristic information satisfies the first preset condition.
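A rough sketch of how these three modules might cooperate inside the first electronic device; the class and method names and the threshold are illustrative only, not taken from the patent:

```python
class AcquisitionModule:
    def __init__(self, read_tilt):
        self.read_tilt = read_tilt                 # callable returning head tilt in degrees
    def acquire(self) -> dict:
        return {"tilt_deg": self.read_tilt()}

class JudgmentModule:
    def __init__(self, first_threshold_deg: float = 30.0):
        self.first_threshold_deg = first_threshold_deg
    def judge(self, info: dict) -> bool:           # the "first judgment result"
        return info["tilt_deg"] > self.first_threshold_deg

class OutputModule:
    def output(self) -> None:
        print("prompt information: fatigue detected")

class FirstElectronicDevice:
    def __init__(self, acquisition, judgment, output):
        self.acquisition, self.judgment, self.output = acquisition, judgment, output
    def tick(self) -> None:
        info = self.acquisition.acquire()
        if self.judgment.judge(info):
            self.output.output()

device = FirstElectronicDevice(AcquisitionModule(lambda: 42.0), JudgmentModule(), OutputModule())
device.tick()   # prints the prompt because 42 degrees exceeds the assumed threshold
```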
In one embodiment, described the first electronic equipment has first sensor, described acquisition module specifically for: utilize described first sensor to obtain first signal, described first signal is for characterizing user's head behavior characteristic information.
Further, described the first electronic equipment has first communication module, for being connected with the second electronic equipment data, described the second electronic equipment has the second sensor, described acquisition module specifically for: receive described second electronic equipment send secondary signal, described secondary signal utilizes described the second sensor to gather by described the second electronic equipment, and described secondary signal is for characterizing user's head behavior characteristic information.
Further, described the second electronic equipment has stuck-module, described stuck-module, in the time that user wears described the second electronic equipment, maintains the relative position relation of described the second electronic equipment and described user's head, and described the second electronic equipment is different from described the first electronic equipment.
Further, described judgement institute module specifically for: when when analyzing head that first signal judge user in the first state, definite meet first pre-conditioned.
Further, described the first electronic equipment has stuck-module, and described stuck-module, in the time that user wears described the first electronic equipment, maintains the relative position relation of described the first electronic equipment and described user's head.
In another embodiment, described the first electronic equipment has the first image capture module, described acquisition module specifically for: utilize described the first image capture module collection user's head behavior characteristic information and/or eye behavior characteristic information, obtain user's head behavioural characteristic recognition result and/or eye behavioural characteristic recognition result.
Further, described the first electronic equipment is connected with the second electronic equipment data, described the second electronic equipment has the second image capture module, described acquisition module specifically for: receive user's head behavioural characteristic recognition result and/or the eye behavioural characteristic recognition result that described the second electronic equipment sends, described user's head behavioural characteristic recognition result and/or eye behavioural characteristic recognition result utilize described the second image capture module collection user's head behavior characteristic information and/or eye behavior characteristic information the user's to collection head behavior characteristic information and/or eye behavior characteristic information to carry out image recognition by described the second electronic equipment and obtain.
Described judge module is specifically for analyzing described head behavioural characteristic recognition result and/or eye behavioural characteristic recognition result; When the head that judges user is in the time that the first state or the eye behavior that judges user show user in fatigue state, determine meet first pre-conditioned.
Further, described output module comprises:
Voice output module, for output language information;
Vibration module, for exporting vibration prompt information;
Display module, for exporting text or picture cues information.
Referring to Fig. 8, the second electronic equipment schematic diagram providing for the embodiment of the present invention.
A kind of the second electronic equipment, described the second electronic equipment has second communication module, for being connected with the first electronic equipment data, comprising:
Acquisition module 801, for gathering user behavior characteristic information.
Sending module 802, for the described user behavior characteristic information gathering is sent to the first electronic equipment, to make described the first electronic equipment obtain described user behavior characteristic information, judge whether described user behavior characteristic information meets first pre-conditioned, obtain the first judged result, when showing described user behavior characteristic information, described the first judged result meets described first when pre-conditioned, output information.
Further, the second electronic device has a second sensor, and the acquisition module is specifically configured to collect a second signal using the second sensor, the second signal being used to characterize the user's head behavior characteristic information.
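On the second-device side, a hedged sketch of collecting the second signal and pushing it to the first electronic device over the data connection might look as follows; read_head_pitch, the address, and the JSON framing are illustrative assumptions rather than parts of the patented design. It is the sending counterpart of the receiving sketch given earlier for the first communication module.

```python
import json
import socket

# Hedged sketch of the second (head-worn) device: sample the second sensor
# and push the second signal to the first device over the data connection.
def send_second_signal(read_head_pitch, host="192.168.0.10", port=9000):
    sample = {"type": "head_behavior", "pitch_deg": read_head_pitch()}
    with socket.create_connection((host, port)) as conn:
        conn.sendall(json.dumps(sample).encode("utf-8"))

# Example: send_second_signal(lambda: 35.0) would report a 35-degree pitch.
```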
Further, the second electronic device has a fixing module; when the user wears the second electronic device, the fixing module maintains the relative position relationship between the second electronic device and the user's head. The second electronic device is different from the first electronic device.
Further, the second electronic device has a second image capture module, and the acquisition module is specifically configured to collect the user's head behavior characteristic information and/or eye behavior characteristic information using the second image capture module, and to obtain the user's head behavior characteristic recognition result and/or eye behavior characteristic recognition result.
It should be noted that relational terms such as "first" and "second" are used herein only to distinguish one entity or operation from another, and do not necessarily require or imply any actual relationship or order between those entities or operations. Moreover, the terms "comprise", "include", or any other variant thereof are intended to cover non-exclusive inclusion, so that a process, method, article, or device comprising a series of elements includes not only those elements but also other elements not expressly listed, or elements inherent to such a process, method, article, or device. Without further limitation, an element defined by the statement "comprising a ..." does not exclude the presence of other identical elements in the process, method, article, or device that comprises that element.
The present invention may be described in the general context of computer-executable instructions, such as program modules. Generally, program modules include routines, programs, objects, components, and data structures that perform particular tasks or implement particular abstract data types. The present invention may also be practiced in distributed computing environments, in which tasks are performed by remote processing devices connected through a communication network. In a distributed computing environment, program modules may be located in both local and remote computer storage media, including memory storage devices.
The embodiments in this specification are described in a progressive manner; for identical or similar parts the embodiments may be referred to one another, and each embodiment focuses on its differences from the others. In particular, the device embodiments are described relatively briefly because they are substantially similar to the method embodiments; for relevant details, refer to the description of the method embodiments. The device embodiments described above are merely illustrative: the modules described as separate components may or may not be physically separate, and the components shown as modules may or may not be physical modules; they may be located in one place or distributed over multiple network modules. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of the embodiments. Those of ordinary skill in the art can understand and implement this without creative effort.
The above are only specific embodiments of the present invention. It should be pointed out that those skilled in the art may make improvements and modifications without departing from the principles of the present invention, and such improvements and modifications shall also fall within the protection scope of the present invention.
Claims (20)
1. A prompting method, characterized in that the method is applied to a first electronic device and comprises:
obtaining user behavior characteristic information;
judging whether the user behavior characteristic information meets a first preset condition, and obtaining a first judgment result; and
outputting prompt information when the first judgment result shows that the user behavior characteristic information meets the first preset condition.
2. The method according to claim 1, characterized in that the first electronic device has a first sensor, and obtaining the user behavior characteristic information comprises:
obtaining a first signal using the first sensor, the first signal being used to characterize the user's head behavior characteristic information.
3. The method according to claim 1, characterized in that the first electronic device is connected to a second electronic device by a data connection, the second electronic device has a second sensor, and obtaining the user behavior characteristic information comprises:
receiving a second signal sent by the second electronic device, the second signal being collected by the second electronic device using the second sensor and being used to characterize the user's head behavior characteristic information.
4. The method according to any one of claims 1 to 3, characterized in that judging whether the user behavior characteristic information meets the first preset condition and obtaining the first judgment result comprises:
determining that the first preset condition is met when analysis of the first signal indicates that the user's head is in a first state.
5. The method according to claim 1, characterized in that the first electronic device has a first image capture module, and obtaining the user behavior characteristic information comprises:
collecting the user's head behavior characteristic information and/or eye behavior characteristic information using the first image capture module, and obtaining the user's head behavior characteristic recognition result and/or eye behavior characteristic recognition result.
6. The method according to claim 1, characterized in that the first electronic device is connected to a second electronic device by a data connection, the second electronic device has a second image capture module, and obtaining the user behavior characteristic information comprises:
receiving the user's head behavior characteristic recognition result and/or eye behavior characteristic recognition result sent by the second electronic device, the recognition result being obtained by the second electronic device collecting the user's head behavior characteristic information and/or eye behavior characteristic information with the second image capture module and performing image recognition on the collected information.
7. The method according to claim 5 or 6, characterized in that judging whether the user behavior characteristic information meets the first preset condition and obtaining the first judgment result comprises:
analyzing the head behavior characteristic recognition result and/or the eye behavior characteristic recognition result; and
determining that the first preset condition is met when it is judged that the user's head is in a first state or that the user's eye behavior shows that the user is in a fatigue state.
8. The method according to claim 1, characterized in that outputting the prompt information comprises:
outputting voice prompt information through a voice output module; and/or
outputting vibration prompt information through a vibration module; and/or
outputting text or picture prompt information through a display module.
9. A prompting method, characterized in that the method is applied to a second electronic device, the second electronic device is connected to a first electronic device by a data connection, and the method comprises:
collecting user behavior characteristic information; and
sending the collected user behavior characteristic information to the first electronic device, so that the first electronic device obtains the user behavior characteristic information, judges whether the user behavior characteristic information meets a first preset condition to obtain a first judgment result, and outputs prompt information when the first judgment result shows that the user behavior characteristic information meets the first preset condition.
10. The method according to claim 9, characterized in that the second electronic device has a second sensor, and collecting the user behavior characteristic information comprises:
collecting a second signal using the second sensor, the second signal being used to characterize the user's head behavior characteristic information.
11. The method according to claim 9 or 10, characterized in that the second electronic device has a fixing module, the fixing module maintains the relative position relationship between the second electronic device and the user's head when the user wears the second electronic device, and the second electronic device is different from the first electronic device.
12. The method according to claim 9, characterized in that the second electronic device has a second image capture module, and collecting the user behavior characteristic information comprises:
collecting the user's head behavior characteristic information and/or eye behavior characteristic information using the second image capture module, and obtaining the user's head behavior characteristic recognition result and/or eye behavior characteristic recognition result.
13. A first electronic device, characterized in that the first electronic device comprises:
an acquisition module for obtaining user behavior characteristic information;
a judgment module for receiving the user behavior characteristic information sent by the acquisition module, judging whether the user behavior characteristic information meets a first preset condition, and obtaining a first judgment result; and
an output module for receiving the first judgment result sent by the judgment module, and outputting prompt information when the first judgment result shows that the user behavior characteristic information meets the first preset condition.
14. The first electronic device according to claim 13, characterized in that the first electronic device has a first sensor, and the acquisition module is specifically configured to:
obtain a first signal using the first sensor, the first signal being used to characterize the user's head behavior characteristic information.
15. The first electronic device according to claim 13, characterized in that the first electronic device is connected to a second electronic device by a data connection, the second electronic device has a second sensor, and the acquisition module is specifically configured to:
receive a second signal sent by the second electronic device, the second signal being collected by the second electronic device using the second sensor and being used to characterize the user's head behavior characteristic information.
16. The first electronic device according to claim 13, characterized in that the first electronic device has a first image capture module, and the acquisition module is specifically configured to:
collect the user's head behavior characteristic information and/or eye behavior characteristic information using the first image capture module, and obtain the user's head behavior characteristic recognition result and/or eye behavior characteristic recognition result.
17. The first electronic device according to claim 13, characterized in that the first electronic device is connected to a second electronic device by a data connection, the second electronic device has a second image capture module, and the acquisition module is specifically configured to:
receive the user's head behavior characteristic recognition result and/or eye behavior characteristic recognition result sent by the second electronic device, the recognition result being obtained by the second electronic device collecting the user's head behavior characteristic information and/or eye behavior characteristic information with the second image capture module and performing image recognition on the collected information.
18. The first electronic device according to claim 13, characterized in that the first electronic device has a fixing module, and the fixing module maintains the relative position relationship between the first electronic device and the user's head when the user wears the first electronic device.
19. A second electronic device, characterized in that the second electronic device is connected to a first electronic device by a data connection and comprises:
an acquisition module for collecting user behavior characteristic information; and
a sending module for sending the collected user behavior characteristic information to the first electronic device, so that the first electronic device obtains the user behavior characteristic information, judges whether the user behavior characteristic information meets a first preset condition to obtain a first judgment result, and outputs prompt information when the first judgment result shows that the user behavior characteristic information meets the first preset condition.
20. The second electronic device according to claim 19, characterized in that the second electronic device has a fixing module, the fixing module maintains the relative position relationship between the second electronic device and the user's head when the user wears the second electronic device, and the second electronic device is different from the first electronic device.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201310058892.6A CN104008628A (en) | 2013-02-25 | 2013-02-25 | Prompting method and equipment |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201310058892.6A CN104008628A (en) | 2013-02-25 | 2013-02-25 | Prompting method and equipment |
Publications (1)
Publication Number | Publication Date |
---|---|
CN104008628A true CN104008628A (en) | 2014-08-27 |
Family
ID=51369263
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201310058892.6A Pending CN104008628A (en) | 2013-02-25 | 2013-02-25 | Prompting method and equipment |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN104008628A (en) |
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104333654A (en) * | 2014-10-14 | 2015-02-04 | 京东方科技集团股份有限公司 | Danger warning method, danger warning device and portable electronic equipment |
CN104574817A (en) * | 2014-12-25 | 2015-04-29 | 清华大学苏州汽车研究院(吴江) | Machine vision-based fatigue driving pre-warning system suitable for smart phone |
CN105282717A (en) * | 2015-06-29 | 2016-01-27 | 维沃移动通信有限公司 | Prompting method and mobile terminal |
CN107317916A (en) * | 2017-05-26 | 2017-11-03 | 广东欧珀移动通信有限公司 | Application control method and related product |
WO2018014598A1 (en) * | 2016-07-18 | 2018-01-25 | 中兴通讯股份有限公司 | Method and terminal for alerting fatigued driver |
CN109448455A (en) * | 2018-12-20 | 2019-03-08 | 广东小天才科技有限公司 | Recitation method for real-time error correction and family education equipment |
CN112308914A (en) * | 2020-03-06 | 2021-02-02 | 北京字节跳动网络技术有限公司 | Method, apparatus, device and medium for processing information |
CN115886816A (en) * | 2022-11-15 | 2023-04-04 | 立讯精密科技(南京)有限公司 | Fatigue detection method based on VR/AR equipment and VR/AR equipment |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN2087790U (en) * | 1990-07-25 | 1991-10-30 | 邱水文 | Glasses with sleepiness warning device |
CN201060334Y (en) * | 2007-01-26 | 2008-05-14 | 深圳大学 | Anti-doze glasses |
CN201210338Y (en) * | 2008-06-13 | 2009-03-18 | 王钦兵 | Head wearing intelligent prompter |
CN201628825U (en) * | 2009-12-28 | 2010-11-10 | 金陵科技学院 | Anti-doze eyeglasses |
CN102201148A (en) * | 2011-05-25 | 2011-09-28 | 北京航空航天大学 | Driver fatigue detecting method and system based on vision |
CN102752458A (en) * | 2012-07-19 | 2012-10-24 | 北京理工大学 | Driver fatigue detection mobile phone and unit |
2013-02-25: application CN201310058892.6A filed (CN); published as CN104008628A; status: Pending
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN2087790U (en) * | 1990-07-25 | 1991-10-30 | 邱水文 | Glasses with sleepiness warning device |
CN201060334Y (en) * | 2007-01-26 | 2008-05-14 | 深圳大学 | Anti-doze glasses |
CN201210338Y (en) * | 2008-06-13 | 2009-03-18 | 王钦兵 | Head wearing intelligent prompter |
CN201628825U (en) * | 2009-12-28 | 2010-11-10 | 金陵科技学院 | Anti-doze eyeglasses |
CN102201148A (en) * | 2011-05-25 | 2011-09-28 | 北京航空航天大学 | Driver fatigue detecting method and system based on vision |
CN102752458A (en) * | 2012-07-19 | 2012-10-24 | 北京理工大学 | Driver fatigue detection mobile phone and unit |
Cited By (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104333654A (en) * | 2014-10-14 | 2015-02-04 | 京东方科技集团股份有限公司 | Danger warning method, danger warning device and portable electronic equipment |
US9928710B2 (en) | 2014-10-14 | 2018-03-27 | Boe Technology Group Co., Ltd. | Danger alerting method and device, portable electronic apparatus |
CN104574817A (en) * | 2014-12-25 | 2015-04-29 | 清华大学苏州汽车研究院(吴江) | Machine vision-based fatigue driving pre-warning system suitable for smart phone |
CN105282717B (en) * | 2015-06-29 | 2019-12-03 | 维沃移动通信有限公司 | Based reminding method and mobile terminal |
CN105282717A (en) * | 2015-06-29 | 2016-01-27 | 维沃移动通信有限公司 | Prompting method and mobile terminal |
WO2018014598A1 (en) * | 2016-07-18 | 2018-01-25 | 中兴通讯股份有限公司 | Method and terminal for alerting fatigued driver |
US10725820B2 (en) | 2017-05-26 | 2020-07-28 | Guangdong Oppo Mobile Telecommunications Corp., Ltd. | Application control method and mobile terminal |
CN107317916B (en) * | 2017-05-26 | 2019-09-10 | Oppo广东移动通信有限公司 | Application control method and Related product |
CN107317916A (en) * | 2017-05-26 | 2017-11-03 | 广东欧珀移动通信有限公司 | Application control method and related product |
CN109448455A (en) * | 2018-12-20 | 2019-03-08 | 广东小天才科技有限公司 | Recitation method for real-time error correction and family education equipment |
CN112308914A (en) * | 2020-03-06 | 2021-02-02 | 北京字节跳动网络技术有限公司 | Method, apparatus, device and medium for processing information |
CN115886816A (en) * | 2022-11-15 | 2023-04-04 | 立讯精密科技(南京)有限公司 | Fatigue detection method based on VR/AR equipment and VR/AR equipment |
CN115886816B (en) * | 2022-11-15 | 2024-05-10 | 立讯精密科技(南京)有限公司 | VR/AR equipment and fatigue detection method based on same |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN104008628A (en) | Prompting method and equipment | |
EP3180675B1 (en) | Identifying gestures using motion data | |
US9437157B2 (en) | Image processing apparatus and image processing method | |
US20150062032A1 (en) | Mobile communication terminal, screen adjusting method and storage medium thereof | |
WO2017101047A1 (en) | Charging control method and device, power adapter, and mobile terminal | |
CN110969981A (en) | Screen display parameter adjusting method and electronic equipment | |
CN108988421B (en) | Battery charging method, charging circuit and terminal | |
CN111899545B (en) | Driving reminding method and device, storage medium and mobile terminal | |
CN109962514B (en) | Charging method and mobile terminal | |
EP4113778A1 (en) | Circuit control apparatus and method | |
CN108446207B (en) | Method, device and system for evaluating disaster tolerance capability of computer system | |
CN108012026B (en) | Eyesight protection method and mobile terminal | |
EP3503332A1 (en) | Electric charging protection method, terminal, and charger | |
CN109525837B (en) | Image generation method and mobile terminal | |
EP4096207A1 (en) | Mobile terminal, method for detecting image capturing mode, and storage medium | |
CN206181168U (en) | Terminal equipment and mobile terminal | |
CN108984145B (en) | Brightness adjusting method and electronic equipment | |
CN110825223A (en) | Control method and intelligent glasses | |
CN111182215B (en) | Power supply device of separable camera module | |
CN109755997A (en) | A kind of charging method and terminal device | |
CN108810284B (en) | Mode switching method of camera and electronic equipment | |
CN108595352B (en) | Protection method and device for mobile terminal | |
CN108448177B (en) | Charging method and terminal | |
CN107967086B (en) | Icon arrangement method and device for mobile terminal and mobile terminal | |
CN106557168A (en) | Intelligent glasses and its control method, control device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
RJ01 | Rejection of invention patent application after publication | ||
RJ01 | Rejection of invention patent application after publication |
Application publication date: 20140827 |