US20130190043A1 - Portable device including mouth detection to initiate speech recognition and/or voice commands - Google Patents
- Publication number
- US20130190043A1 (U.S. application Ser. No. 13/691,351)
- Authority
- US
- United States
- Prior art keywords
- mouth
- phone
- voice
- detected
- picture
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
- G06F3/0416—Control or interface arrangements specially adapted for digitisers
- G06F3/0425—Digitisers characterised by opto-electronic transducing means using a single imaging device, e.g. a video camera, for tracking the absolute position of one or more objects with respect to an imaged reference surface
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/04817—GUI interaction using icons
- G06F3/0482—Interaction with lists of selectable items, e.g. menus
- G06F3/0484—GUI techniques for the control of specific functions or operations, e.g. selecting or manipulating an object or setting a parameter value
- G06F3/04883—Touch-screen or digitiser gesture techniques for inputting data by handwriting, e.g. gesture or text
- H04M1/0202—Portable telephone sets, e.g. cordless phones, mobile phones or bar type handsets
- H04M1/247—Telephone sets including user guidance or feature selection means facilitating their use
- H04M1/2477—Telephone sets including means for selecting a function from a menu display
- H04M1/72454—User interfaces for mobile telephones adapting functionality according to context-related or environment-related conditions
- H04M2250/22—Telephonic subscriber devices including a touch pad, a touch sensor or a touch detector
- H04M2250/52—Telephonic subscriber devices including functional features of a camera
Abstract
Embodiments generally relate to portable electronic devices such as a phone with a camera and touchscreen. In one embodiment, a method includes detecting a voice and checking if an image of a mouth is detected by using the camera. An embodiment also includes activating a voice recognition application on a phone if both the voice and the mouth are detected.
Description
- This application claims priority from U.S. Provisional Patent Application Ser. No. 61/590,284, entitled “USER INTERFACE USING DEVICE AWARENESS”, filed on Jan. 24, 2012, which is hereby incorporated by reference as if set forth in full in this document for all purposes.
- Many conventional computing devices such as computers, tablets, game consoles, televisions, monitors, phones, etc., include a touchscreen. A touchscreen enables a user to interact directly with displayed objects on the touchscreen by touching the objects with a hand, finger, stylus, or other item. Such displayed objects may include controls that control functions on a phone. Using the touchscreen, the user can activate controls by touching corresponding objects on the touchscreen. For example, the user can touch an object such as a button on the touchscreen to activate a voice recognition application on the phone. The user can touch the touchscreen and swipe up and down to scroll a page up and down on the touchscreen.
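As a rough illustration of the touch-to-control mapping just described, a minimal dispatch might look like the sketch below. The class and function names are hypothetical, for illustration only, and are not drawn from this disclosure:

```python
# Sketch: map a touch point on the touchscreen to the displayed object
# (control) whose bounds contain it, then activate that control.

class Control:
    """A displayed object with a rectangular hit area, e.g. a button."""

    def __init__(self, name, x, y, w, h):
        self.name = name
        self.x, self.y, self.w, self.h = x, y, w, h
        self.activated = False

    def contains(self, tx, ty):
        # True when the touch point falls inside this control's bounds.
        return (self.x <= tx < self.x + self.w
                and self.y <= ty < self.y + self.h)

    def activate(self):
        self.activated = True


def handle_touch(controls, tx, ty):
    """Activate and return the first control containing the touch point,
    or return None when the touch lands on no control."""
    for control in controls:
        if control.contains(tx, ty):
            control.activate()
            return control
    return None
```

For example, a touch inside the bounds of a "voice recognition" button would activate that control, exactly as the paragraph above describes for a conventional touchscreen phone.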
- Embodiments generally relate to a phone. In one embodiment, a method includes detecting a voice and checking if a mouth is detected. The method also includes activating a voice recognition application or voice command input on a phone if both the voice and the mouth are detected.
- One embodiment provides a method comprising: detecting a voice; checking if a mouth is detected; and activating a voice recognition application on a phone if both the voice and the mouth are detected.
- FIG. 1 illustrates a diagram of a phone that is held up to the mouth of a user, where the user is talking into the phone, according to one embodiment.
- FIG. 2 illustrates a block diagram of a phone, which may be used to implement the embodiments described herein.
- FIG. 3 illustrates an example simplified flow diagram for enhancing phone functionality based on detection of a mouth of a user, according to one embodiment.
- Embodiments described herein enhance phone functionality based on detection of a mouth of a user. In one embodiment, if a phone detects both a voice and a mouth, the phone automatically activates a voice recognition application on the phone. In other words, if a user holds the phone up to the user's mouth and talks, the phone automatically interprets what the user is saying without the user needing to manually activate the voice recognition application.
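The condition just described reduces to a simple conjunction of two detections. The sketch below is illustrative only; the function name and boolean inputs are assumptions standing in for the phone's audio and camera pipelines:

```python
def should_activate_voice_recognition(voice_detected: bool,
                                      mouth_detected: bool) -> bool:
    # Activate only when both conditions hold, so that speech alone
    # (e.g., a nearby conversation picked up by the microphone) does
    # not trigger the application by itself.
    return voice_detected and mouth_detected
```

With this gate, holding the phone up to the mouth and speaking satisfies both conditions, while speech without a detected mouth leaves the application inactive.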
- FIG. 1 illustrates a diagram of a phone 100 that is held up to the mouth 102 of a user, where the user is talking into phone 100, according to one embodiment. In one embodiment, phone 100 includes a display screen 104 and a camera lens 106 of a camera. Camera lens 106 is configured to detect objects (e.g., mouth 102) that are within a predetermined distance from display screen 104. In one embodiment, camera lens 106 may be configured with a field of view 108 that can detect mouth 102 when it is within close proximity (e.g., 3 to 6 inches, or more) to display screen 104.
- In one embodiment, camera lens 106 may be a wide-angle lens that can capture an object that is anywhere in front of display screen 104. In one embodiment, camera lens 106 may be a transparent cover over an existing camera lens, where camera lens 106 alters the optics to achieve a wider field of view and closer focus. As an overlay, camera lens 106 may be a film or button placed over an existing lens to alter the optics. In one embodiment, if camera lens 106 overlays an existing camera lens, phone 100 corrects any distortions to an image that may occur. Camera lens 106 may be permanently fixed to phone 100 or temporarily fixed to phone 100. In one embodiment, camera lens 106 may be a permanent auxiliary lens on phone 100, which may be used by an existing camera or by a separate dedicated camera with the purpose of detecting a user finger.
- While camera lens 106 is shown in the upper center portion of phone 100, camera lens 106 may be located anywhere on the face of phone 100. One or more lenses or cameras may be used, placed and oriented on the device as desired.
- FIG. 2 illustrates a block diagram of a phone 100, which may be used to implement the embodiments described herein. In one embodiment, phone 100 may include a processor 202 and a memory 204. A phone aware application 206 may be stored on memory 204 or on any other suitable storage location or computer-readable medium. In one embodiment, memory 204 may be a volatile memory (e.g., random-access memory (RAM)) or a non-volatile memory (e.g., flash memory). Phone aware application 206 provides instructions that enable processor 202 to perform the functions described herein. In one embodiment, processor 202 may include logic circuitry (not shown).
- In one embodiment, phone 100 also includes a detection unit 210. In one embodiment, detection unit 210 may be a camera that includes an image sensor 212 and an aperture 214. Image sensor 212 captures images when image sensor 212 is exposed to light passing through camera lens 106 (FIG. 1). Aperture 214 regulates light passing through camera lens 106. In one embodiment, after detection unit 210 captures images, detection unit 210 may store the images in an image library 216 in memory 204.
- In other embodiments, phone 100 may not have all of the components listed and/or may have other components instead of, or in addition to, those listed above.
- The components of phone 100 shown in FIG. 2 may be implemented by one or more processors or any combination of hardware devices, as well as any combination of hardware, software, firmware, etc.
- While phone 100 is described as performing the steps as described in the embodiments herein, any suitable component or combination of components of phone 100 may perform the steps described.
- FIG. 3 illustrates an example simplified flow diagram for enhancing phone functionality based on detection of a mouth of a user, according to one embodiment. Referring to both FIGS. 1 and 3, a method is initiated in block 302, where phone 100 detects a voice. In one embodiment, the voice includes speech. In block 304, phone 100 checks if a mouth 102 is detected. In block 306, phone 100 activates a voice recognition application on the phone if both the voice and mouth 102 are detected. In one embodiment, a face is sufficient to determine that the user intends to speak into phone 100. In other words, phone 100 activates a voice recognition application on the phone if both the voice and a face are detected.
- In one embodiment, phone 100 activates a voice recognition application on the phone if both the voice and mouth 102 with moving lips are detected. In one embodiment, if phone 100 detects moving lips, phone 100 activates a lip reading application. In one embodiment, phone 100 may interpret commands from the user solely by voice recognition, solely by lip reading, or by a combination of both voice recognition and lip reading. - In one embodiment, to detect a mouth,
phone 100 takes a picture if the voice is detected. Phone 100 then determines if a mouth is in the picture. If the mouth is in the picture, phone 100 determines a distance between the mouth and the phone. In one embodiment, a mouth is determined to be detected if the mouth is within a predetermined distance from the phone. One or more pictures can be taken. Video can also be used.
- In one embodiment, the predetermined distance (e.g., 0 to 12 inches, or more, etc.) is set to a default distance that is set at the factory. In one embodiment, the user may modify the predetermined distance. The user may also modify the field of view 108 angle. A face or mouth 102 that is close to display screen 104 is indicative of the user intending to speak into phone 100. For example, if the user's mouth or face is within 12 inches from display screen 104, the user probably intends to speak into phone 100 to activate a control.
- In one embodiment, any detection device or sensor may be used to check for a mouth. For example, such a sensor can be an image sensor, a proximity sensor, a distance sensor, an accelerometer, an infrared sensor, an acoustic sensor, etc.
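One way to sketch the picture-based mouth check and the distance threshold described above, corresponding to blocks 302-306 of FIG. 3, is shown below. The callables and the 12-inch default are illustrative assumptions drawn from the examples in the text, not a definitive implementation of the disclosure:

```python
# Sketch of the FIG. 3 flow: detect a voice (block 302), check for a
# mouth within the predetermined distance (block 304), and activate
# voice recognition (block 306). The callables are hypothetical
# stand-ins for the phone's camera and vision pipeline.

DEFAULT_MAX_DISTANCE_INCHES = 12.0  # factory default; user-modifiable


def mouth_detected(take_picture, find_mouth, estimate_distance,
                   max_distance=DEFAULT_MAX_DISTANCE_INCHES):
    """Block 304: take a picture, look for a mouth in it, and require
    the mouth to be within the predetermined distance of the phone."""
    picture = take_picture()
    if not find_mouth(picture):
        return False
    return estimate_distance(picture) <= max_distance


def maybe_activate(voice_is_detected, take_picture, find_mouth,
                   estimate_distance, activate):
    """Blocks 302-306: activate voice recognition only when a voice is
    detected and a mouth is detected within range."""
    if voice_is_detected and mouth_detected(take_picture, find_mouth,
                                            estimate_distance):
        activate()
        return True
    return False
```

Note that the picture is taken only after a voice is detected, matching the order described in the text, so the camera pipeline runs only when there is already audio evidence that the user may be speaking.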
- Although the foregoing description refers to particular embodiments, these particular embodiments are merely illustrative, and not restrictive.
- Any suitable programming language may be used to implement the routines of particular embodiments including C, C++, Java, assembly language, etc. Different programming techniques may be employed such as procedural or object-oriented. The routines may execute on a single processing device or on multiple processors. Although the steps, operations, or computations may be presented in a specific order, the order may be changed in particular embodiments. In some particular embodiments, multiple steps shown as sequential in this specification may be performed at the same time.
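As one example of such a routine, the earlier-described option of interpreting commands by voice recognition, by lip reading, or by a combination of both could be sketched as a confidence-weighted choice. The (text, confidence) interface and the agreement rule below are assumptions for illustration, not part of the disclosure:

```python
def interpret_command(voice_result, lip_result):
    """Combine a voice-recognition result and a lip-reading result,
    each given as a (text, confidence) pair; either may be None.
    Agreement between the two modalities boosts confidence; otherwise
    the higher-confidence interpretation wins."""
    if voice_result and lip_result:
        v_text, v_conf = voice_result
        l_text, l_conf = lip_result
        if v_text == l_text:
            # Both modalities agree: cap the combined confidence at 1.0.
            return v_text, min(1.0, v_conf + l_conf)
        return (v_text, v_conf) if v_conf >= l_conf else (l_text, l_conf)
    # Only one modality produced a result (voice-only or lip-reading-only).
    return voice_result or lip_result
```

This mirrors the three modes in the text: voice only, lip reading only, or both together.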
- Particular embodiments may be implemented in a computer-readable storage medium (also referred to as a machine-readable storage medium) for use by or in connection with an instruction execution system, apparatus, system, or device. Particular embodiments may be implemented in the form of control logic in software or hardware or a combination of both. The control logic, when executed by one or more processors, may be operable to perform that which is described in particular embodiments.
- A “processor” includes any suitable hardware and/or software system, mechanism or component that processes data, signals or other information. A processor may include a system with a general-purpose central processing unit, multiple processing units, dedicated circuitry for achieving functionality, or other systems. Processing need not be limited to a geographic location, or have temporal limitations. For example, a processor may perform its functions in “real time,” “offline,” in a “batch mode,” etc. Portions of processing may be performed at different times and at different locations, by different (or the same) processing systems. A computer may be any processor in communication with a memory. The memory may be any suitable processor-readable storage medium, such as random-access memory (RAM), read-only memory (ROM), magnetic or optical disk, or other tangible media suitable for storing instructions for execution by the processor.
- Particular embodiments may be implemented by using a programmed general purpose digital computer, by using application specific integrated circuits, programmable logic devices, field programmable gate arrays, optical, chemical, biological, quantum or nanoengineered systems, components and mechanisms. In general, the functions of particular embodiments may be achieved by any means known in the art. Distributed, networked systems, components, and/or circuits may be used. Communication or transfer of data may be wired, wireless, or by any other means.
- It will also be appreciated that one or more of the elements depicted in the drawings/figures may also be implemented in a more separated or integrated manner, or even removed or rendered as inoperable in certain cases, as is useful in accordance with a particular application. It is also within the spirit and scope to implement a program or code that is stored in a machine-readable medium to permit a computer to perform any of the methods described above.
- As used in the description herein and throughout the claims that follow, “a”, “an”, and “the” include plural references unless the context clearly dictates otherwise. Also, as used in the description herein and throughout the claims that follow, the meaning of “in” includes “in” and “on” unless the context clearly dictates otherwise.
- While one or more implementations have been described by way of example and in terms of the specific embodiments, it is to be understood that the implementations are not limited to the disclosed embodiments. To the contrary, they are intended to cover various modifications and similar arrangements as would be apparent to those skilled in the art. Therefore, the scope of the appended claims should be accorded the broadest interpretation so as to encompass all such modifications and similar arrangements.
- Thus, while particular embodiments have been described herein, latitudes of modification, various changes, and substitutions are intended in the foregoing disclosures, and it will be appreciated that in some instances some features of particular embodiments may be employed without a corresponding use of other features without departing from the scope and spirit as set forth. Therefore, many modifications may be made to adapt a particular situation or material to the essential scope and spirit.
Claims (20)
1. A method comprising:
detecting a voice;
checking if a mouth is detected; and
activating a voice recognition application on a phone if both the voice and the mouth are detected.
2. The method of claim 1 , wherein the detecting of the mouth comprises detecting moving lips.
3. The method of claim 1 , further comprising:
detecting moving lips; and
activating a lip reading application.
4. The method of claim 1 , further comprising:
detecting moving lips;
activating a lip reading application; and
interpreting commands from a user by one or more of voice recognition and lip reading.
5. The method of claim 1 , wherein the detecting of the mouth comprises:
taking a picture if the voice is detected; and
determining if a mouth is in the picture.
6. The method of claim 1 , wherein the detecting of the mouth comprises:
taking a picture if the voice is detected;
determining if a mouth is in the picture; and
if the mouth is in the picture, determining a distance between the mouth and the phone, wherein the mouth is determined to be detected if the mouth is within a predetermined distance from the phone.
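The control flow recited in claims 1, 5, and 6 can be sketched as follows. This is an illustrative reading of the claim language only, not an implementation from the disclosure; the function and constant names (`should_activate_voice_recognition`, `MAX_MOUTH_DISTANCE_CM`, and the `MouthDetection` record) are hypothetical stand-ins for whatever device APIs would supply these signals.

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical value for the "predetermined distance" of claim 6.
MAX_MOUTH_DISTANCE_CM = 30.0


@dataclass
class MouthDetection:
    """Result of analyzing a picture taken when a voice is detected (claim 5)."""
    found: bool
    distance_cm: Optional[float] = None  # None if no distance estimate is available


def should_activate_voice_recognition(
    voice_detected: bool,
    mouth: MouthDetection,
) -> bool:
    """Claim 1: activate the voice recognition application only if both
    the voice and the mouth are detected.

    Claim 6 narrows mouth detection: the mouth counts as detected only
    when it lies within a predetermined distance from the phone.
    """
    if not voice_detected:
        return False
    if not mouth.found:
        return False
    # Distance gating per claim 6; skipped when no estimate exists.
    if mouth.distance_cm is not None and mouth.distance_cm > MAX_MOUTH_DISTANCE_CM:
        return False
    return True
```

Under this reading, a voice alone (e.g. a nearby conversation or a television) never triggers recognition; the camera-based mouth check acts as a second gate.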
7. A computer-readable storage medium carrying one or more sequences of instructions thereon, the instructions when executed by a processor cause the processor to:
detect a voice;
check if a mouth is detected; and
activate a voice recognition application on a phone if both the voice and the mouth are detected.
8. The computer-readable storage medium of claim 7 , wherein the instructions further cause the processor to detect moving lips.
9. The computer-readable storage medium of claim 7 , wherein the instructions further cause the processor to:
detect moving lips; and
activate a lip reading application.
10. The computer-readable storage medium of claim 7 , wherein the instructions further cause the processor to:
detect moving lips;
activate a lip reading application; and
interpret commands from a user by one or more of voice recognition and lip reading.
11. The computer-readable storage medium of claim 7 , wherein the instructions further cause the processor to:
take a picture if the voice is detected; and
determine if a mouth is in the picture.
12. The computer-readable storage medium of claim 7 , wherein the instructions further cause the processor to:
take a picture if the voice is detected;
determine if a mouth is in the picture; and
if the mouth is in the picture, determine a distance between the mouth and the phone, wherein the mouth is determined to be detected if the mouth is within a predetermined distance from the phone.
13. An apparatus comprising:
one or more processors; and
logic encoded in one or more tangible media for execution by the one or more processors, and when executed operable to:
detect a voice;
check if a mouth is detected; and
activate a voice recognition application on a phone if both the voice and the mouth are detected.
14. The apparatus of claim 13 , further comprising a sensor that checks for the mouth.
15. The apparatus of claim 13 , further comprising a sensor that checks for the mouth, wherein the sensor is one or more of an image sensor, a proximity sensor, a distance sensor, an accelerometer, and an acoustic sensor.
16. The apparatus of claim 13 , further comprising a camera that has a lens configured to detect objects within a predetermined distance from the phone.
17. The apparatus of claim 13 , wherein the logic when executed is further operable to detect moving lips.
18. The apparatus of claim 13 , wherein the logic when executed is further operable to:
detect moving lips; and
activate a lip reading application.
19. The apparatus of claim 13 , wherein the logic when executed is further operable to:
take a picture if the voice is detected; and
determine if a mouth is in the picture.
20. The apparatus of claim 13 , wherein the logic when executed is further operable to:
take a picture if the voice is detected;
determine if a mouth is in the picture; and
if the mouth is in the picture, determine a distance between the mouth and the phone, wherein the mouth is determined to be detected if the mouth is within a predetermined distance from the phone.
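Claims 4, 10, and 18 recite interpreting commands "by one or more of voice recognition and lip reading." A minimal sketch of one way such multimodal interpretation could combine the two sources is shown below; the `interpret_command` function, its tie-break toward the audio result, and the source labels are all hypothetical choices, not part of the claimed subject matter.

```python
from typing import Optional, Tuple


def interpret_command(
    voice_result: Optional[str],
    lip_result: Optional[str],
) -> Tuple[Optional[str], str]:
    """Combine voice recognition and lip reading into one command.

    Returns (command, source), where source records which modality
    (or combination) produced the command. Either input may be None
    when that recognizer produced no result.
    """
    if voice_result and lip_result:
        if voice_result == lip_result:
            # Both modalities agree: highest-confidence case.
            return voice_result, "voice+lips"
        # Disagreement: arbitrarily prefer the audio result here.
        return voice_result, "voice"
    if voice_result:
        return voice_result, "voice"
    if lip_result:
        # Lip reading alone, e.g. in a noisy environment.
        return lip_result, "lips"
    return None, "none"
```

A design like this lets lip reading serve as a fallback when audio is unusable (noise, whispering), which is consistent with the claims listing the modalities disjunctively rather than requiring both.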
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/691,351 US20130190043A1 (en) | 2012-01-24 | 2012-11-30 | Portable device including mouth detection to initiate speech recognition and/or voice commands |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201261590284P | 2012-01-24 | 2012-01-24 | |
US13/691,351 US20130190043A1 (en) | 2012-01-24 | 2012-11-30 | Portable device including mouth detection to initiate speech recognition and/or voice commands |
Publications (1)
Publication Number | Publication Date |
---|---|
US20130190043A1 true US20130190043A1 (en) | 2013-07-25 |
Family
ID=48796927
Family Applications (9)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/492,478 Expired - Fee Related US8863042B2 (en) | 2012-01-24 | 2012-06-08 | Handheld device with touch controls that reconfigure in response to the way a user operates the device |
US13/691,372 Expired - Fee Related US9124686B2 (en) | 2012-01-24 | 2012-11-30 | Portable device including automatic scrolling in response to a user's eye position and/or movement |
US13/691,332 Expired - Fee Related US8938274B2 (en) | 2012-01-24 | 2012-11-30 | User interface for a portable device including detecting absence of a finger near a touchscreen to prevent activating a control |
US13/691,351 Abandoned US20130190043A1 (en) | 2012-01-24 | 2012-11-30 | Portable device including mouth detection to initiate speech recognition and/or voice commands |
US13/691,364 Expired - Fee Related US9124685B2 (en) | 2012-01-24 | 2012-11-30 | Portable device including control actions dependent on a user looking at a touchscreen |
US13/691,339 Expired - Fee Related US8655416B2 (en) | 2012-01-24 | 2012-11-30 | User interface for a portable device including detecting close proximity of an ear to allow control of an audio application with a touchscreen |
US13/691,322 Expired - Fee Related US8938273B2 (en) | 2012-01-24 | 2012-11-30 | User interface for a portable device including detecting proximity of a finger near a touchscreen to prevent changing the display |
US14/479,712 Expired - Fee Related US9350841B2 (en) | 2012-01-24 | 2014-09-08 | Handheld device with reconfiguring touch controls |
US15/146,786 Expired - Fee Related US9626104B2 (en) | 2012-01-24 | 2016-05-04 | Thumb access area for one-handed touchscreen use |
Family Applications Before (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/492,478 Expired - Fee Related US8863042B2 (en) | 2012-01-24 | 2012-06-08 | Handheld device with touch controls that reconfigure in response to the way a user operates the device |
US13/691,372 Expired - Fee Related US9124686B2 (en) | 2012-01-24 | 2012-11-30 | Portable device including automatic scrolling in response to a user's eye position and/or movement |
US13/691,332 Expired - Fee Related US8938274B2 (en) | 2012-01-24 | 2012-11-30 | User interface for a portable device including detecting absence of a finger near a touchscreen to prevent activating a control |
Family Applications After (5)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/691,364 Expired - Fee Related US9124685B2 (en) | 2012-01-24 | 2012-11-30 | Portable device including control actions dependent on a user looking at a touchscreen |
US13/691,339 Expired - Fee Related US8655416B2 (en) | 2012-01-24 | 2012-11-30 | User interface for a portable device including detecting close proximity of an ear to allow control of an audio application with a touchscreen |
US13/691,322 Expired - Fee Related US8938273B2 (en) | 2012-01-24 | 2012-11-30 | User interface for a portable device including detecting proximity of a finger near a touchscreen to prevent changing the display |
US14/479,712 Expired - Fee Related US9350841B2 (en) | 2012-01-24 | 2014-09-08 | Handheld device with reconfiguring touch controls |
US15/146,786 Expired - Fee Related US9626104B2 (en) | 2012-01-24 | 2016-05-04 | Thumb access area for one-handed touchscreen use |
Country Status (1)
Country | Link |
---|---|
US (9) | US8863042B2 (en) |
Families Citing this family (63)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE102008051756A1 (en) * | 2007-11-12 | 2009-05-14 | Volkswagen Ag | Multimodal user interface of a driver assistance system for entering and presenting information |
JP5458783B2 (en) * | 2009-10-01 | 2014-04-02 | ソニー株式会社 | Information processing apparatus, information processing method, and program |
KR101743506B1 (en) * | 2011-04-01 | 2017-06-07 | 삼성전자주식회사 | Apparatus and method for sensing proximity using proximity sensor in portable terminal |
JP5885501B2 (en) * | 2011-12-28 | 2016-03-15 | キヤノン株式会社 | Electronic device, its control method, program, and recording medium |
DE112011106059T5 (en) | 2011-12-28 | 2014-09-18 | Intel Corporation | Dimming an ad in response to users |
US9870752B2 (en) * | 2011-12-28 | 2018-01-16 | Intel Corporation | Display dimming in response to user |
US8863042B2 (en) * | 2012-01-24 | 2014-10-14 | Charles J. Kulas | Handheld device with touch controls that reconfigure in response to the way a user operates the device |
KR101992676B1 (en) * | 2012-07-26 | 2019-06-25 | 삼성전자주식회사 | Method and apparatus for voice recognition using video recognition |
JP5668992B2 (en) * | 2012-10-15 | 2015-02-12 | 横河電機株式会社 | Electronic equipment with a resistive touch panel |
KR20140070745A (en) * | 2012-11-26 | 2014-06-11 | 삼성전자주식회사 | Display device and driving method thereof |
KR102013940B1 (en) * | 2012-12-24 | 2019-10-21 | 삼성전자주식회사 | Method for managing security for applications and an electronic device thereof |
CN103973971B (en) * | 2013-01-30 | 2017-10-13 | 奥林巴斯株式会社 | The control method of information equipment and information equipment |
US9094576B1 (en) | 2013-03-12 | 2015-07-28 | Amazon Technologies, Inc. | Rendered audiovisual communication |
US9134952B2 (en) * | 2013-04-03 | 2015-09-15 | Lg Electronics Inc. | Terminal and control method thereof |
US20140329564A1 (en) * | 2013-05-02 | 2014-11-06 | Nokia Corporation | User interface apparatus and associated methods |
US11199906B1 (en) | 2013-09-04 | 2021-12-14 | Amazon Technologies, Inc. | Global user input management |
EP3046020A4 (en) * | 2013-09-11 | 2017-04-26 | Yulong Computer Telecommunication Scientific (Shenzhen) Co. Ltd. | Display method for touchscreen and terminal |
KR102077678B1 (en) * | 2014-02-07 | 2020-02-14 | 엘지전자 주식회사 | Mobile terminal and method for controlling the same |
US9836084B2 (en) | 2014-03-28 | 2017-12-05 | Intel Corporation | Inferred undocking for hybrid tablet computer |
CN103870199B (en) * | 2014-03-31 | 2017-09-29 | 华为技术有限公司 | The recognition methods of user operation mode and handheld device in handheld device |
USD789340S1 (en) * | 2014-05-16 | 2017-06-13 | Samsung Electronics Co., Ltd. | Battery cover for electronic device |
US9766702B2 (en) | 2014-06-19 | 2017-09-19 | Apple Inc. | User detection by a computing device |
WO2016042404A1 (en) * | 2014-09-19 | 2016-03-24 | Cochlear Limited | Configuration of hearing prosthesis sound processor based on visual interaction with external device |
CN105517025A (en) * | 2014-09-22 | 2016-04-20 | 中兴通讯股份有限公司 | Terminal state processing method and terminal state processing device |
CN104267812B (en) * | 2014-09-22 | 2017-08-29 | 联想(北京)有限公司 | A kind of information processing method and electronic equipment |
US9796391B2 (en) * | 2014-10-13 | 2017-10-24 | Verizon Patent And Licensing Inc. | Distracted driver prevention systems and methods |
JP1523588S (en) * | 2014-10-23 | 2015-05-18 | ||
KR101632009B1 (en) | 2014-11-26 | 2016-06-21 | 엘지전자 주식회사 | Electronic device and control method for the electronic device |
US10444977B2 (en) * | 2014-12-05 | 2019-10-15 | Verizon Patent And Licensing Inc. | Cellphone manager |
TWI553502B (en) * | 2015-03-05 | 2016-10-11 | 緯創資通股份有限公司 | Protection method and computer system thereof for firewall apparatus disposed to application layer |
CN105094567B (en) * | 2015-08-20 | 2020-11-17 | Tcl科技集团股份有限公司 | Intelligent terminal operation implementation method and system based on gravity sensing |
EP3356918A1 (en) * | 2015-09-29 | 2018-08-08 | Telefonaktiebolaget LM Ericsson (publ) | Touchscreen device and method thereof |
US9804681B2 (en) * | 2015-11-10 | 2017-10-31 | Motorola Mobility Llc | Method and system for audible delivery of notifications partially presented on an always-on display |
CN105528577B (en) * | 2015-12-04 | 2019-02-12 | 深圳大学 | Recognition methods based on intelligent glasses |
KR20170084558A (en) * | 2016-01-12 | 2017-07-20 | 삼성전자주식회사 | Electronic Device and Operating Method Thereof |
CN106066694A (en) * | 2016-05-30 | 2016-11-02 | 维沃移动通信有限公司 | The control method of a kind of touch screen operation response and terminal |
JP2018037819A (en) * | 2016-08-31 | 2018-03-08 | 京セラ株式会社 | Electronic apparatus, control method, and program |
CN107968875A (en) * | 2016-10-20 | 2018-04-27 | 苏州乐聚堂电子科技有限公司 | Smart mobile phone and its control method |
KR102591413B1 (en) * | 2016-11-16 | 2023-10-19 | 엘지전자 주식회사 | Mobile terminal and method for controlling the same |
KR102655584B1 (en) * | 2017-01-02 | 2024-04-08 | 삼성전자주식회사 | Display apparatus and controlling method thereof |
US10496190B2 (en) | 2017-01-25 | 2019-12-03 | Microsoft Technology Licensing, Llc | Redrawing a user interface based on pen proximity |
US10254940B2 (en) | 2017-04-19 | 2019-04-09 | International Business Machines Corporation | Modifying device content to facilitate user interaction |
US10650702B2 (en) | 2017-07-10 | 2020-05-12 | Sony Corporation | Modifying display region for people with loss of peripheral vision |
US10805676B2 (en) | 2017-07-10 | 2020-10-13 | Sony Corporation | Modifying display region for people with macular degeneration |
US20190018478A1 (en) * | 2017-07-11 | 2019-01-17 | Sony Corporation | Sensing viewer direction of viewing to invoke accessibility menu in audio video device |
US10303427B2 (en) | 2017-07-11 | 2019-05-28 | Sony Corporation | Moving audio from center speaker to peripheral speaker of display device for macular degeneration accessibility |
US10845954B2 (en) | 2017-07-11 | 2020-11-24 | Sony Corporation | Presenting audio video display options as list or matrix |
US20190026120A1 (en) * | 2017-07-21 | 2019-01-24 | International Business Machines Corporation | Customizing mobile device operation based on touch points |
CN107517313A (en) * | 2017-08-22 | 2017-12-26 | 珠海市魅族科技有限公司 | Awakening method and device, terminal and readable storage medium storing program for executing |
US10509563B2 (en) | 2018-01-15 | 2019-12-17 | International Business Machines Corporation | Dynamic modification of displayed elements of obstructed region |
CN108717348B (en) * | 2018-05-16 | 2023-07-11 | 珠海格力电器股份有限公司 | Operation control method and mobile terminal |
CN109164950B (en) | 2018-07-04 | 2020-07-07 | 珠海格力电器股份有限公司 | Method, device, medium and equipment for setting system interface of mobile terminal |
JP7055721B2 (en) * | 2018-08-27 | 2022-04-18 | 京セラ株式会社 | Electronic devices with voice recognition functions, control methods and programs for those electronic devices |
US11151993B2 (en) * | 2018-12-28 | 2021-10-19 | Baidu Usa Llc | Activating voice commands of a smart display device based on a vision-based mechanism |
US11487425B2 (en) | 2019-01-17 | 2022-11-01 | International Business Machines Corporation | Single-hand wide-screen smart device management |
CN110244898A (en) * | 2019-04-24 | 2019-09-17 | 维沃移动通信有限公司 | The method and terminal device of controlling terminal equipment |
US11204675B2 (en) | 2019-09-06 | 2021-12-21 | Aptiv Technologies Limited | Adaptive input countermeasures on human machine interface |
US11145315B2 (en) * | 2019-10-16 | 2021-10-12 | Motorola Mobility Llc | Electronic device with trigger phrase bypass and corresponding systems and methods |
CN116209974A (en) | 2020-09-25 | 2023-06-02 | 苹果公司 | Method for navigating a user interface |
US11294459B1 (en) | 2020-10-05 | 2022-04-05 | Bank Of America Corporation | Dynamic enhanced security based on eye movement tracking |
WO2022186811A1 (en) * | 2021-03-01 | 2022-09-09 | Hewlett-Packard Development Company, L.P. | Transmit power controls |
CN114821006B (en) * | 2022-06-23 | 2022-09-20 | 盾钰(上海)互联网科技有限公司 | Twin state detection method and system based on interactive indirect reasoning |
CN116225209A (en) * | 2022-11-03 | 2023-06-06 | 溥畅(杭州)智能科技有限公司 | Man-machine interaction method and system based on eye movement tracking |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20020035475A1 (en) * | 2000-09-12 | 2002-03-21 | Pioneer Corporation | Voice recognition apparatus |
US20040267536A1 (en) * | 2003-06-27 | 2004-12-30 | Hershey John R. | Speech detection and enhancement using audio/video fusion |
US7219062B2 (en) * | 2002-01-30 | 2007-05-15 | Koninklijke Philips Electronics N.V. | Speech activity detection using acoustic and facial characteristics in an automatic speech recognition system |
US7587318B2 (en) * | 2002-09-12 | 2009-09-08 | Broadcom Corporation | Correlating video images of lip movements with audio signals to improve speech recognition |
US7860718B2 (en) * | 2005-12-08 | 2010-12-28 | Electronics And Telecommunications Research Institute | Apparatus and method for speech segment detection and system for speech recognition |
US20110257971A1 (en) * | 2010-04-14 | 2011-10-20 | T-Mobile Usa, Inc. | Camera-Assisted Noise Cancellation and Speech Recognition |
US8442820B2 (en) * | 2009-09-22 | 2013-05-14 | Hyundai Motor Company | Combined lip reading and voice recognition multimodal interface system |
Family Cites Families (39)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5500643A (en) * | 1993-08-26 | 1996-03-19 | Grant; Alan H. | One-hand prehensile keyboard |
US6519003B1 (en) * | 1998-03-26 | 2003-02-11 | Eastman Kodak Company | Camera with combination four-way directional and mode control interface |
TWI238348B (en) * | 2002-05-13 | 2005-08-21 | Kyocera Corp | Portable information terminal, display control device, display control method, and recording media |
US20040080487A1 (en) * | 2002-10-29 | 2004-04-29 | Griffin Jason T. | Electronic device having keyboard for thumb typing |
US20050151725A1 (en) * | 2003-12-30 | 2005-07-14 | Jennings Christopher P. | User interface device |
US20070040807A1 (en) * | 2004-01-30 | 2007-02-22 | Kleve Robert E | Keyboard |
WO2006003590A2 (en) * | 2004-06-29 | 2006-01-12 | Koninklijke Philips Electronics, N.V. | A method and device for preventing staining of a display device |
FR2878344B1 (en) * | 2004-11-22 | 2012-12-21 | Sionnest Laurent Guyot | DATA CONTROLLER AND INPUT DEVICE |
KR100622891B1 (en) * | 2004-12-09 | 2006-09-19 | 엘지전자 주식회사 | Portable communication terminal having function of optimizing receiver position using sensor of recognizing images and method thereof |
JP2006245799A (en) * | 2005-03-01 | 2006-09-14 | Nec Saitama Ltd | Electronic apparatus, and method of controlling alarm output and alarm output control program in apparatus |
US7649522B2 (en) * | 2005-10-11 | 2010-01-19 | Fish & Richardson P.C. | Human interface input acceleration system |
KR101391602B1 (en) * | 2007-05-29 | 2014-05-07 | 삼성전자주식회사 | Method and multimedia device for interacting using user interface based on touch screen |
EP2071441A1 (en) * | 2007-12-03 | 2009-06-17 | Semiconductor Energy Laboratory Co., Ltd. | Mobile phone |
TW200928887A (en) * | 2007-12-28 | 2009-07-01 | Htc Corp | Stylus and electronic device |
US8395584B2 (en) * | 2007-12-31 | 2013-03-12 | Sony Corporation | Mobile terminals including multiple user interfaces on different faces thereof configured to be used in tandem and related methods of operation |
US8423076B2 (en) * | 2008-02-01 | 2013-04-16 | Lg Electronics Inc. | User interface for a mobile device |
US8213389B2 (en) * | 2008-04-15 | 2012-07-03 | Apple Inc. | Location determination using formula |
JP4632102B2 (en) * | 2008-07-17 | 2011-02-16 | ソニー株式会社 | Information processing apparatus, information processing method, and information processing program |
US8131322B2 (en) * | 2008-09-19 | 2012-03-06 | Apple Inc. | Enabling speaker phone mode of a portable voice communications device having a built-in camera |
FR2936326B1 (en) * | 2008-09-22 | 2011-04-29 | Stantum | DEVICE FOR THE CONTROL OF ELECTRONIC APPARATUS BY HANDLING GRAPHIC OBJECTS ON A MULTICONTACT TOUCH SCREEN |
US8314832B2 (en) * | 2009-04-01 | 2012-11-20 | Microsoft Corporation | Systems and methods for generating stereoscopic images |
US20100271312A1 (en) * | 2009-04-22 | 2010-10-28 | Rachid Alameh | Menu Configuration System and Method for Display on an Electronic Device |
US8331966B2 (en) * | 2009-05-15 | 2012-12-11 | Apple Inc. | Content selection based on simulcast data |
GB0908456D0 (en) * | 2009-05-18 | 2009-06-24 | L P | Touch screen, related method of operation and systems |
SG175398A1 (en) * | 2009-06-16 | 2011-12-29 | Intel Corp | Adaptive virtual keyboard for handheld device |
US20130135223A1 (en) * | 2009-12-13 | 2013-05-30 | Ringbow Ltd. | Finger-worn input devices and methods of use |
US9367205B2 (en) * | 2010-02-19 | 2016-06-14 | Microsoft Technolgoy Licensing, Llc | Radial menus with bezel gestures |
US9310994B2 (en) * | 2010-02-19 | 2016-04-12 | Microsoft Technology Licensing, Llc | Use of bezel as an input mechanism |
US20110304695A1 (en) * | 2010-06-10 | 2011-12-15 | Lg Electronics Inc. | Mobile terminal and controlling method thereof |
US8836643B2 (en) * | 2010-06-10 | 2014-09-16 | Qualcomm Incorporated | Auto-morphing adaptive user interface device and methods |
US8954099B2 (en) * | 2010-06-16 | 2015-02-10 | Qualcomm Incorporated | Layout design of proximity sensors to enable shortcuts |
US9262002B2 (en) * | 2010-11-03 | 2016-02-16 | Qualcomm Incorporated | Force sensing touch screen |
US20120162078A1 (en) * | 2010-12-28 | 2012-06-28 | Bran Ferren | Adaptive virtual keyboard for handheld device |
US9805331B2 (en) * | 2011-02-09 | 2017-10-31 | Deutsche Telekom Ag | Smartphone-based asset management system |
US9405463B2 (en) * | 2011-11-25 | 2016-08-02 | Samsung Electronics Co., Ltd. | Device and method for gesturally changing object attributes |
US20130147701A1 (en) * | 2011-12-13 | 2013-06-13 | Research In Motion Limited | Methods and devices for identifying a gesture |
CN102591577A (en) * | 2011-12-28 | 2012-07-18 | 华为技术有限公司 | Method for displaying arc-shaped menu index and relevant device |
US8863042B2 (en) * | 2012-01-24 | 2014-10-14 | Charles J. Kulas | Handheld device with touch controls that reconfigure in response to the way a user operates the device |
US9304683B2 (en) * | 2012-10-10 | 2016-04-05 | Microsoft Technology Licensing, Llc | Arced or slanted soft input panels |
- 2012
  - 2012-06-08 US US13/492,478 patent/US8863042B2/en not_active Expired - Fee Related
  - 2012-11-30 US US13/691,372 patent/US9124686B2/en not_active Expired - Fee Related
  - 2012-11-30 US US13/691,332 patent/US8938274B2/en not_active Expired - Fee Related
  - 2012-11-30 US US13/691,351 patent/US20130190043A1/en not_active Abandoned
  - 2012-11-30 US US13/691,364 patent/US9124685B2/en not_active Expired - Fee Related
  - 2012-11-30 US US13/691,339 patent/US8655416B2/en not_active Expired - Fee Related
  - 2012-11-30 US US13/691,322 patent/US8938273B2/en not_active Expired - Fee Related
- 2014
  - 2014-09-08 US US14/479,712 patent/US9350841B2/en not_active Expired - Fee Related
- 2016
  - 2016-05-04 US US15/146,786 patent/US9626104B2/en not_active Expired - Fee Related
Cited By (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20130325484A1 (en) * | 2012-05-29 | 2013-12-05 | Samsung Electronics Co., Ltd. | Method and apparatus for executing voice command in electronic device |
US9619200B2 (en) * | 2012-05-29 | 2017-04-11 | Samsung Electronics Co., Ltd. | Method and apparatus for executing voice command in electronic device |
US11393472B2 (en) | 2012-05-29 | 2022-07-19 | Samsung Electronics Co., Ltd. | Method and apparatus for executing voice command in electronic device |
US10657967B2 (en) | 2012-05-29 | 2020-05-19 | Samsung Electronics Co., Ltd. | Method and apparatus for executing voice command in electronic device |
US20150264439A1 (en) * | 2012-10-28 | 2015-09-17 | Hillcrest Laboratories, Inc. | Context awareness for smart televisions |
US20160286031A1 (en) * | 2013-11-07 | 2016-09-29 | Huawei Device Co. Ltd. | Voice call establishing method and apparatus |
US9723129B2 (en) * | 2013-11-07 | 2017-08-01 | Huawei Device Co., Ltd. | Voice call establishing method and apparatus |
US11031012B2 (en) | 2017-03-23 | 2021-06-08 | Joyson Safety Systems Acquisition Llc | System and method of correlating mouth images to input commands |
US10748542B2 (en) | 2017-03-23 | 2020-08-18 | Joyson Safety Systems Acquisition Llc | System and method of correlating mouth images to input commands |
US10810413B2 (en) * | 2018-01-22 | 2020-10-20 | Beijing Baidu Netcom Science And Technology Co., Ltd. | Wakeup method, apparatus and device based on lip reading, and computer readable medium |
WO2019222759A1 (en) * | 2018-05-18 | 2019-11-21 | Synaptics Incorporated | Recurrent multimodal attention system based on expert gated networks |
US11687770B2 (en) | 2018-05-18 | 2023-06-27 | Synaptics Incorporated | Recurrent multimodal attention system based on expert gated networks |
CN112470463A (en) * | 2018-11-01 | 2021-03-09 | 惠普发展公司,有限责任合伙企业 | User voice based data file communication |
WO2020091794A1 (en) * | 2018-11-01 | 2020-05-07 | Hewlett-Packard Development Company, L.P. | User voice based data file communications |
US20210295825A1 (en) * | 2018-11-01 | 2021-09-23 | Hewlett-Packard Development Company, L.P. | User voice based data file communications |
Also Published As
Publication number | Publication date |
---|---|
US20130190055A1 (en) | 2013-07-25 |
US20130188081A1 (en) | 2013-07-25 |
US9124686B2 (en) | 2015-09-01 |
US8938273B2 (en) | 2015-01-20 |
US20130190054A1 (en) | 2013-07-25 |
US8863042B2 (en) | 2014-10-14 |
US9124685B2 (en) | 2015-09-01 |
US20140380185A1 (en) | 2014-12-25 |
US9626104B2 (en) | 2017-04-18 |
US8655416B2 (en) | 2014-02-18 |
US9350841B2 (en) | 2016-05-24 |
US20130190045A1 (en) | 2013-07-25 |
US20130190042A1 (en) | 2013-07-25 |
US20130190044A1 (en) | 2013-07-25 |
US20160246499A1 (en) | 2016-08-25 |
US8938274B2 (en) | 2015-01-20 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20130190043A1 (en) | Portable device including mouth detection to initiate speech recognition and/or voice commands | |
KR101772177B1 (en) | Method and apparatus for obtaining photograph | |
KR101636460B1 (en) | Electronic device and method for controlling the same | |
US9703403B2 (en) | Image display control apparatus and image display control method | |
KR102397034B1 (en) | Electronic device and image stabilization method thereof | |
EP3133527A1 (en) | Human face recognition method, apparatus and terminal | |
WO2017071050A1 (en) | Mistaken touch prevention method and device for terminal with touch screen | |
US11822632B2 (en) | Methods, mechanisms, and computer-readable storage media for unlocking applications on a mobile terminal with a sliding module | |
WO2017124899A1 (en) | Information processing method, apparatus and electronic device | |
US20130154947A1 (en) | Determining a preferred screen orientation based on known hand positions | |
US20150331491A1 (en) | System and method for gesture based touchscreen control of displays | |
JP2017533602A (en) | Switching between electronic device cameras | |
US10735578B2 (en) | Method for unlocking a mobile terminal, devices using the same, and computer-readable storage media encoding the same | |
US20130308835A1 (en) | Mobile Communication Device with Image Recognition and Method of Operation Therefor | |
CN105100590A (en) | Image display and photographing system, photographing device, and display device | |
EP2990905A1 (en) | Method and device for displaying image | |
US9225906B2 (en) | Electronic device having efficient mechanisms for self-portrait image capturing and method for controlling the same | |
US9148537B1 (en) | Facial cues as commands | |
US20150334658A1 (en) | Affecting device action based on a distance of a user's eyes | |
US9507429B1 (en) | Obscure cameras as input | |
EP3789849A1 (en) | Contactless gesture control method, apparatus and storage medium | |
KR20140134844A (en) | Method and device for photographing based on objects | |
US20160091986A1 (en) | Handheld device, motion operation method, and computer readable medium | |
KR102031283B1 (en) | Method for managing for image an electronic device thereof | |
US10212382B2 (en) | Image processing device, method for controlling image processing device, and computer-readable storage medium storing program |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |