CN108668077A - Camera control method, device, mobile terminal and computer-readable medium - Google Patents
- Publication number: CN108668077A (application number CN201810382858.7A)
- Authority
- CN
- China
- Prior art keywords
- mobile terminal
- user
- camera
- audio signal
- audio
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/62—Control of parameters via user interfaces
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10L—SPEECH ANALYSIS TECHNIQUES OR SPEECH SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING TECHNIQUES; SPEECH OR AUDIO CODING OR DECODING
- G10L15/00—Speech recognition
- G10L15/22—Procedures used during a speech recognition process, e.g. man-machine dialogue
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10L—SPEECH ANALYSIS TECHNIQUES OR SPEECH SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING TECHNIQUES; SPEECH OR AUDIO CODING OR DECODING
- G10L21/00—Speech or voice signal processing techniques to produce another audible or non-audible signal, e.g. visual or tactile, in order to modify its quality or its intelligibility
- G10L21/02—Speech enhancement, e.g. noise reduction or echo cancellation
- G10L21/0208—Noise filtering
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/61—Control of cameras or camera modules based on recognised objects
- H04N23/611—Control of cameras or camera modules based on recognised objects where the recognised objects include parts of the human body
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10L—SPEECH ANALYSIS TECHNIQUES OR SPEECH SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING TECHNIQUES; SPEECH OR AUDIO CODING OR DECODING
- G10L15/00—Speech recognition
- G10L15/22—Procedures used during a speech recognition process, e.g. man-machine dialogue
- G10L2015/223—Execution procedure of a spoken command
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Human Computer Interaction (AREA)
- Signal Processing (AREA)
- Computational Linguistics (AREA)
- Health & Medical Sciences (AREA)
- Audiology, Speech & Language Pathology (AREA)
- Physics & Mathematics (AREA)
- Acoustics & Sound (AREA)
- Quality & Reliability (AREA)
- Telephone Function (AREA)
Abstract
An embodiment of the present application provides a camera control method, an apparatus, a mobile terminal, and a computer-readable medium, belonging to the technical field of mobile terminals. The method includes: when the screen of the mobile terminal displays a photographing interface, acquiring an audio signal input by a user and collected by an audio collection device; determining a direction of the user relative to the mobile terminal according to the audio signal; and controlling the camera to rotate according to the direction so that the camera faces the user. As a result, the user needs no extra adjustment: simply by rotating the camera under voice control, the camera can be aimed at the user for shooting, which reduces operations and improves the user experience.
Description
Technical field
This application relates to the technical field of mobile terminals, and more particularly to a camera control method, an apparatus, a mobile terminal, and a computer-readable medium.
Background technology
At present, when shooting with a mobile terminal, there are situations in which the user is relatively far from the mobile terminal and cannot conveniently operate it. For example, when taking a selfie without a selfie stick and at some distance from the phone, the user cannot reach the mobile terminal and therefore cannot adjust its shooting angle, which makes for a very poor user experience.
Invention content
The present application proposes a camera control method, an apparatus, a mobile terminal, and a computer-readable medium to remedy the above defect.
In a first aspect, an embodiment of the present application provides a camera control method applied to a mobile terminal. The mobile terminal includes a terminal body, a camera, and an audio collection device; the camera is mounted on the terminal body and can be rotated under control. The method includes: when the screen of the mobile terminal displays a photographing interface, acquiring an audio signal input by a user and collected by the audio collection device; determining a direction of the user relative to the mobile terminal according to the audio signal; and controlling the camera to rotate according to the direction so that the camera faces the user.
In a second aspect, an embodiment of the present application further provides a camera control apparatus applied to a mobile terminal. The mobile terminal includes a terminal body and a camera; the camera is mounted on the terminal body and can be rotated under control. The apparatus includes an acquiring unit, a determination unit, and a control unit. The acquiring unit is configured to acquire, when the screen of the mobile terminal displays a photographing interface, an audio signal input by a user and collected by an audio collection device. The determination unit is configured to determine a direction of the user relative to the mobile terminal according to the audio signal. The control unit is configured to control the camera to rotate according to the direction so that the camera faces the user.
In a third aspect, an embodiment of the present application further provides a mobile terminal including a memory and a processor, the memory being coupled to the processor. The memory stores instructions that, when executed by the processor, cause the processor to perform the following operations: when the screen of the mobile terminal displays a photographing interface, acquiring an audio signal input by a user and collected by an audio collection device; determining a direction of the user relative to the mobile terminal according to the audio signal; and controlling the camera to rotate according to the direction so that the camera faces the user.
In a fourth aspect, an embodiment of the present application further provides a computer-readable medium carrying program code executable by a processor, the program code causing the processor to perform the above method.
The embodiments of the present application thus provide a camera control method, an apparatus, a mobile terminal, and a computer-readable medium capable of acquiring, when the screen of the mobile terminal displays a photographing interface, the audio signal input by the user and collected by the audio collection device, and then determining from that signal the direction of the user relative to the mobile terminal. Once the direction is determined, the camera is controlled to rotate toward it, so that the camera faces the user. The user therefore needs no extra adjustment: rotating the camera by voice control alone is enough to aim the camera at the user for shooting, which reduces operations and improves the user experience.
Other features and advantages of the embodiments of the present application will be set forth in the following description, will in part be apparent from the description, or may be learned by practicing the embodiments. The objectives and other advantages of the embodiments may be realized and attained by the structures particularly pointed out in the written description, the claims, and the accompanying drawings.
Description of the drawings
To explain the technical solutions in the embodiments of the present application more clearly, the drawings needed for describing the embodiments are briefly introduced below. Obviously, the drawings in the following description show only some embodiments of the present application; those skilled in the art may derive other drawings from them without creative effort.
Fig. 1 shows a structural schematic diagram of a first mobile terminal provided by an embodiment of the present application;
Fig. 2 shows a structural schematic diagram of a second mobile terminal provided by an embodiment of the present application;
Fig. 3 shows a structural schematic diagram of a third mobile terminal provided by an embodiment of the present application;
Fig. 4 shows a flowchart of the camera control method provided by the first embodiment of the present application;
Fig. 5 shows a schematic diagram of an image captured by a camera provided by an embodiment of the present application;
Fig. 6 shows a flowchart of the camera control method provided by the second embodiment of the present application;
Fig. 7 shows a flowchart of the camera control method provided by the third embodiment of the present application;
Fig. 8 shows a module block diagram of the camera control apparatus provided by the first embodiment of the present application;
Fig. 9 shows a module block diagram of the camera control apparatus provided by the second embodiment of the present application;
Fig. 10 shows a front schematic view of a mobile terminal provided by an embodiment of the present application;
Fig. 11 shows a block diagram of a mobile terminal for executing the camera control method according to an embodiment of the present application.
Specific implementation mode
The technical solutions in the embodiments of the present application are described below clearly and completely with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present application. The components of the embodiments, as generally described and illustrated in the drawings herein, may be arranged and designed in a variety of different configurations. Therefore, the following detailed description of the embodiments provided in the drawings is not intended to limit the scope of the claimed application but merely represents selected embodiments of it. All other embodiments obtained by those skilled in the art based on the embodiments of the present application without creative effort shall fall within the protection scope of this application.
It should be noted that similar reference numerals and letters denote similar items in the following drawings; once an item is defined in one drawing, it need not be further defined or explained in subsequent drawings. Meanwhile, in the description of the present application, the terms "first", "second", and so on are used only to distinguish descriptions and are not to be understood as indicating or implying relative importance.
In mobile terminals such as mobile phones and tablet computers, the display screen usually serves to display text, pictures, icons, video, and similar content. With the development of touch technology, more and more mobile terminals are fitted with touch display screens; when such a screen detects a touch operation such as a drag, click, double-click, or slide, it can respond to the user's touch operation.
As users demand ever higher clarity and fineness of the displayed content, more mobile terminals adopt larger touch display screens to achieve a full-screen display effect. However, when fitting a larger touch display screen, it is found that functional elements at the front of the mobile terminal, such as the front camera, the proximity light sensor, and the receiver, limit the area to which the touch display screen can be expanded.
A mobile terminal usually includes a front panel, a rear cover, and a frame. The front panel comprises an upper region, a middle screen region, and a lower key region. In general, the upper region carries functional elements such as the receiver sound outlet and the front camera, the middle screen region carries the touch display screen, and the lower key region carries one to three physical buttons. With the development of technology, the lower key region has gradually been eliminated, its physical buttons replaced by virtual keys on the touch display screen.
The functional elements in the upper region, such as the receiver sound outlet and the front camera, are too important to the phone's functions to be simply removed, so extending the displayable area of the touch display screen to cover the upper region is rather difficult. After some research, the inventors found that a functional module comprising the receiver, the image collector, the fill light, and the structured-light assembly can be hidden inside the terminal body of the mobile terminal and configured to slide out from the top of the terminal body, or alternatively these functional elements can be mounted so as to flip to the back of the phone, so that the display screen can be extended over the former upper region.
Illustratively, referring to Fig. 1, Fig. 1 shows a structural schematic diagram of a mobile terminal that can be used in the embodiments of the present application. The mobile terminal includes a terminal body 130, a camera 150, and a rotating assembly 140; the camera 150 is mounted on the rotating assembly 140. In addition, other functional elements of the mobile terminal may also be arranged on the rotating assembly; besides the above camera 150, these may include devices such as a proximity light sensor, a receiver, a range sensor, an ambient light sensor, a temperature sensor, and a pressure sensor.
The rotating assembly 140 is rotatably connected to the terminal body 130. In some implementations, driven by the rotating assembly 140, the camera 150 can rotate both horizontally and vertically. In the embodiments of the present application, the mobile terminal is provided with a drive component, which may be a motor, capable of driving the rotating assembly 140 to rotate. The rotation of the rotating assembly 140 may be two-dimensional or three-dimensional, and it allows the camera 150 to be used both as a front camera and as a rear camera.
The two-dimensional rotation may be rotation in the horizontal direction. As shown in Fig. 1, the drive component drives the rotating assembly 140 to rotate horizontally, where the horizontal direction means that, when the mobile terminal is held vertically, the rotation plane of the rotating assembly 140 is parallel to the top side of the mobile terminal. Specifically, the rotating assembly may include a shaft and a mounting part: one end of the shaft is mounted at the top of the mobile terminal, the other end is connected to the mounting part, the above functional elements are installed on the mounting part, and the shaft can drive the mounting part to rotate horizontally. As an implementation, the rotating assembly 140 can also move up and down, that is, move away from and toward the top of the mobile terminal, so that it can be housed inside the terminal body or at its back. For example, the rotating assembly may be slidably arranged inside the terminal body, whose top is open so that the rotating assembly can extend out through the top opening. Alternatively, the rotating assembly may be placed at the back of the mobile terminal: a groove is provided at the top of the back of the terminal body, the rotating assembly 140 is arranged in the groove, and it can extend out of or retract into the groove.
The two-dimensional rotation may also be rotation in the vertical direction. As shown in Fig. 2, the drive component drives the rotating assembly 140 to rotate vertically, meaning that, when the mobile terminal is held vertically, the rotation plane of the rotating assembly 140 is perpendicular to the top side of the mobile terminal. Specifically, the rotating assembly may include brackets and a mounting part: the brackets are mounted at the top of the mobile terminal and the mounting part is rotatably connected to them. As shown in Fig. 2, there are two brackets installed at the top of the terminal body, and the mounting part is rotatably installed on them, for example via a shaft mounted across the two brackets; under the action of the shaft, the mounting part can rotate in the vertical direction. The rotating assembly shown in Fig. 2 can likewise be housed inside the terminal body or at its back; for the specific structure, refer to the foregoing, which is not repeated here.
Furthermore the rotation on three-dimensional may include the rotation in horizontal direction and vertical direction, as shown in figure 3,
Drive component drives runner assembly 140 in vertically and horizontally upper rotation, and specifically, which can wrap
Holder and installation part are included, then holder is mounted on the top of mobile terminal, and installation part is connect with holder pivots, and holder also can
Enough rotations, rotation direction are horizontal direction, the rotation being achieved that as a result, on the three-dimensional of runner assembly.
Because the camera of a present-day mobile terminal is fixed and cannot rotate freely, there are occasions when the user is relatively far from the mobile terminal and cannot conveniently operate it. For example, when taking a selfie without a selfie stick and at some distance from the phone, the user cannot operate the mobile terminal and therefore cannot adjust its shooting angle, giving a very poor user experience.
Therefore, to remedy the above defect and improve the user's experience when taking pictures, an embodiment of the present application provides a camera control method. As shown in Fig. 4, the method includes steps S401 to S403.
S401: When the screen of the mobile terminal displays a photographing interface, acquire the audio signal input by the user and collected by the audio collection device.
An application running-state table is stored in the mobile terminal. The table contains the identifiers of all applications currently installed on the mobile terminal and the state corresponding to each identifier, for example, as shown in Table 1 below:
Table 1
Application identifier | State | Time point
APP1 | Foreground running state | 2017/11/3 13:20
APP2 | Background running state | 2017/11/4 14:10
APP3 | Not-running state | 2017/11/5 8:20
APP4 | Background running state | 2017/11/5 10:03
APP5 | Background running state | 2017/11/4 9:18
In Table 1 above, APP1 is an application identifier, which may be the application's name, package name, or any other content used to indicate its identity. The corresponding time point is when the application switched to the corresponding state; for example, the time point of APP1 in Table 1 indicates that at 13:20 on November 3, 2017, APP1 began running on the screen of the mobile terminal, that is, it switched to the foreground running state.
The state of an application includes the foreground running state, the background running state, and the not-running state. The foreground running state means that the application runs with an interface on the screen and the user can interact with it through that interface, for example by entering instructions or viewing information. The background running state means that the application runs in the system's resource manager but typically has no interface. The not-running state means that the application has not been started, i.e., it is neither in the foreground running state nor in the background running state.
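As an illustration only (not part of the claimed method), the running-state table of Table 1 can be modeled as a mapping from application identifier to state and switch time; the identifiers, entries, and helper function below are hypothetical:

```python
from datetime import datetime
from enum import Enum

class AppState(Enum):
    FOREGROUND = "foreground running"    # has an interface on the screen
    BACKGROUND = "background running"    # running, but with no interface
    NOT_RUNNING = "not running"          # not started at all

# Sketch of the running-state table of Table 1: identifier -> (state, switch time)
state_table = {
    "APP1": (AppState.FOREGROUND, datetime(2017, 11, 3, 13, 20)),
    "APP2": (AppState.BACKGROUND, datetime(2017, 11, 4, 14, 10)),
    "APP3": (AppState.NOT_RUNNING, datetime(2017, 11, 5, 8, 20)),
}

def foreground_app(table):
    # Return the identifier of the app the user currently sees, or None
    for app_id, (state, _switched_at) in table.items():
        if state is AppState.FOREGROUND:
            return app_id
    return None

print(foreground_app(state_table))  # → APP1
```

Looking up the foreground entry in such a table is how the terminal could decide whether the camera application is the one currently running on the screen.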
After determining whether the camera application is running on the screen, the interface currently displayed by the camera application can further be determined, for example whether the displayed interface is the photographing interface of the camera application, which contains a preview interface showing a preview image. The preview image may be the image collected by the front or the rear camera; if the camera can rotate under the drive of the rotating assembly, the preview image in the photographing interface may be the image collected by the camera shown in Figs. 1-3, and rotating the camera realizes the functions of both a front camera and a rear camera.
After determining that the interface displayed on the screen of the mobile terminal is the photographing interface, the audio signal input by the user and collected by the audio collection device is acquired. Specifically, when it is determined that the screen of the mobile terminal is displaying the photographing interface, the audio collection device on the mobile terminal is started and the sound it collects is acquired. The audio collection device may be a microphone assembly for receiving sound and converting it into an audio signal.
In addition, since the surroundings also contain noise and non-human sounds, the sound collected by the audio collection device needs to be filtered so that sounds other than the human voice are removed. Specifically, the frequency of sounds produced by humans differs from that of the surrounding ambient noise; in general, male voices are low and female voices are high. The specific frequencies are as follows: for the male voice, bass 82–392 Hz, reference range 64–523 Hz, mid-range 123–493 Hz, and treble 164–698 Hz; for the female voice, bass 82–392 Hz, reference range 160–1200 Hz, mid-range 123–493 Hz, and treble 220–1100 Hz. Therefore, sounds outside the frequency range of the human voice can be filtered out, for example with a Hamming filter, an adaptive filter, or a Kirchhoff filter that removes sounds outside 123–698 Hz, so that only the sound between 123 and 698 Hz is collected and the audio signal input by the user is obtained.
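A minimal sketch of this voice-band filtering step, assuming a standard digital biquad band-pass (the RBJ "cookbook" form, standing in for the Hamming/adaptive/Kirchhoff filters named above) tuned to the 123–698 Hz band; the sample rate and test-tone frequencies are illustrative:

```python
import math

def bandpass_biquad(fs, f_lo, f_hi):
    # RBJ cookbook band-pass (constant 0 dB peak gain) for the voice band
    f0 = math.sqrt(f_lo * f_hi)            # geometric centre frequency
    q = f0 / (f_hi - f_lo)
    w0 = 2 * math.pi * f0 / fs
    alpha = math.sin(w0) / (2 * q)
    a0 = 1 + alpha
    b = [alpha / a0, 0.0, -alpha / a0]
    a = [1.0, -2 * math.cos(w0) / a0, (1 - alpha) / a0]
    return b, a

def filt(b, a, x):
    # direct-form I biquad
    y, x1, x2, y1, y2 = [], 0.0, 0.0, 0.0, 0.0
    for s in x:
        out = b[0] * s + b[1] * x1 + b[2] * x2 - a[1] * y1 - a[2] * y2
        y.append(out)
        x2, x1 = x1, s
        y2, y1 = y1, out
    return y

def rms(x):
    return math.sqrt(sum(v * v for v in x) / len(x))

fs = 8000
b, a = bandpass_biquad(fs, 123.0, 698.0)
tone = lambda f, n=8000: [math.sin(2 * math.pi * f * i / fs) for i in range(n)]
in_band = rms(filt(b, a, tone(300.0))[2000:])   # voice-band tone passes
too_low = rms(filt(b, a, tone(50.0))[2000:])    # low-frequency rumble is attenuated
print(in_band > 2 * too_low)                    # → True
```

A single second-order section gives only a gentle roll-off; a real implementation would likely cascade sections or use an adaptive filter as the text suggests.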
S402: Determine the direction of the user relative to the mobile terminal according to the audio signal.
The mobile terminal is provided with one or more audio collection devices, and the manner of determining the direction between the user and the mobile terminal from the audio signal differs depending on their number.
As one implementation, there is a single audio collection device rotatably connected to the mobile terminal so that it can turn freely; specifically, the audio collection device is arranged on the above rotating assembly and rotates with it. The direction between the user and the mobile terminal is then determined as follows: the rotating assembly of the terminal is turned at interval angles and in changing directions so that the audio collection device rotates accordingly; the audio signal received at each position is recorded; and among all received audio signals, the target direction corresponding to the audio signal of maximum signal strength is obtained. The sound source can then be determined to lie in that target direction, which gives the direction between the user and the mobile terminal.
Specifically, the above is the sound-source localization method of steered beamforming based on maximum output power: using beamforming technology, the reception direction of the microphone array is adjusted to scan the entire reception space, and the direction of maximum energy is the direction of the sound source.
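The steered-beam scan can be sketched for a two-microphone case: for each candidate direction, the hypothesized inter-microphone delay is compensated, the aligned signals are summed, and the direction whose beam output has maximum energy is taken as the source direction. The sample rate, microphone spacing, and simulated source below are illustrative assumptions:

```python
import math

FS = 48_000   # sample rate (Hz); illustrative
C = 343.0     # speed of sound (m/s)
D = 0.2       # microphone spacing (m); illustrative

def delay_samples(angle_deg):
    # inter-microphone delay for a plane wave arriving angle_deg off broadside
    return round(FS * D * math.sin(math.radians(angle_deg)) / C)

def simulate(angle_deg, n=2048, freq=500.0):
    # mic2 hears the wavefront `lag` samples after mic1
    lag = delay_samples(angle_deg)
    base = [math.sin(2 * math.pi * freq * i / FS) for i in range(n + abs(lag) + 64)]
    return base[64:64 + n], base[64 - lag:64 - lag + n]

def beam_energy(mic1, mic2, lag):
    # delay-and-sum: advance mic2 by the hypothesised lag, sum, take energy
    e = 0.0
    for i in range(len(mic1)):
        j = i + lag
        if 0 <= j < len(mic2):
            s = mic1[i] + mic2[j]
            e += s * s
    return e

def scan(mic1, mic2, angles):
    # the direction of maximum beam output energy is taken as the source
    return max(angles, key=lambda a: beam_energy(mic1, mic2, delay_samples(a)))

mic1, mic2 = simulate(30.0)                 # simulated source at 30 degrees
print(scan(mic1, mic2, range(-60, 61, 5)))  # → 30
```

A real array would use more microphones and fractional-sample steering, but the maximum-output-power principle is the same.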
As another implementation, multiple audio collection devices may be arranged on the mobile terminal, each of which may be a microphone assembly. In one implementation they form a microphone array on the mobile terminal, that is, an arrangement of microphones: a system composed of a certain number of acoustic sensors (usually microphones) used to sample and process the spatial characteristics of the sound field.
In some embodiments, all the audio collection devices are arranged on the body of the mobile terminal; alternatively, all of them may be arranged on the rotating assembly, or some may be arranged on the rotating assembly and the rest on the terminal body. To improve the accuracy of locating the sound source, they may be arranged circumferentially on the terminal body; in addition, the audio collection devices on the rotating assembly may be placed one per face of the assembly, or at least on two opposite faces.
Specifically, when multiple audio collection devices are arranged, not only can the above sound-source localization method of steered beamforming based on maximum output power be used, but also a method based on the time difference of arrival.
The sound-source localization method based on the time difference of arrival works as follows: since sound waves propagate through air at a certain speed, they reach microphones placed at different locations with different phases. From the phase differences with which these microphones record the same sound, the time difference with which that sound reaches each pair of microphones can be calculated. If the time difference with which the sound emitted by a source reaches a pair of microphones has been obtained, and the microphones are suitably placed, the hyperboloids defined by these differences intersect in only one point, and that point is the desired position of the sound source.
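A sketch of the time-difference-of-arrival step for one microphone pair, assuming plain cross-correlation to find the delay and the far-field relation τ = D·sin(θ)/c to turn it into a direction; all constants and the simulated source are illustrative:

```python
import math
import random

FS = 48_000   # sample rate (Hz); illustrative
C = 343.0     # speed of sound (m/s)
D = 0.15      # spacing of the microphone pair (m); illustrative

def estimate_lag(x, y, max_lag):
    # Cross-correlation: the lag at which y best matches x is the sample
    # delay with which the same sound reached the second microphone.
    best_lag, best_r = 0, float("-inf")
    for lag in range(-max_lag, max_lag + 1):
        r = sum(x[i] * y[i + lag] for i in range(len(x)) if 0 <= i + lag < len(y))
        if r > best_r:
            best_lag, best_r = lag, r
    return best_lag

def lag_to_angle_deg(lag):
    # far-field model: tau = D*sin(theta)/C  =>  theta = asin(lag*C/(FS*D))
    return math.degrees(math.asin(lag * C / (FS * D)))

# Simulate a noise-like source reaching mic2 `true_lag` samples after mic1
random.seed(0)
true_lag = 7
src = [random.uniform(-1.0, 1.0) for _ in range(4096 + true_lag)]
mic1, mic2 = src[true_lag:], src[:4096]

max_lag = int(FS * D / C)       # largest physically possible lag for spacing D
est = estimate_lag(mic1, mic2, max_lag)
print(est)                      # → 7
```

With more than one pair, each estimated difference constrains the source to a hyperboloid, and intersecting them yields the source position as described above.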
S403: Control the camera to rotate according to the direction so that the camera faces the user.
After the user's direction relative to the mobile terminal is determined, the current position of the camera is determined, and the camera is then controlled to rotate based on its current position and the user's direction relative to the mobile terminal. Specifically, a datum point of the mobile terminal can be set; once the user's direction relative to the mobile terminal is known, so is the user's direction relative to that datum point, and once the camera's current position is determined, so is the orientation between the camera's current position and the datum point. For example, if the user is to the right of the datum point in the horizontal direction while the camera points above the datum point in the horizontal direction, the camera is controlled to rotate downward by a first distance to reach the datum point of the mobile terminal and then rotate horizontally by a second distance, after which it faces the user.
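The two-stage rotation above can be sketched as a pan/tilt computation relative to the datum point. This is a minimal sketch under assumed angle conventions (degrees measured from the datum, pan wrapped to the shortest direction); none of the names come from the patent:

```python
def rotation_to_user(cam_azimuth_deg, cam_elevation_deg,
                     user_azimuth_deg, user_elevation_deg=0.0):
    """Return the (pan, tilt) in degrees the camera must rotate through
    to go from its current orientation to the user's direction, both
    measured from the terminal's datum point."""
    pan = (user_azimuth_deg - cam_azimuth_deg + 180.0) % 360.0 - 180.0
    tilt = user_elevation_deg - cam_elevation_deg
    return pan, tilt

# Camera points 90 degrees above the datum's horizontal; the user is 30
# degrees to the right in the horizontal plane: tilt down 90, pan 30.
print(rotation_to_user(0.0, 90.0, 30.0))  # -> (30.0, -90.0)
```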
The camera's current position may be determined by acquiring its current rotation angle through an angle sensor: a sensor for detecting rotation direction and rotation angle is arranged on the rotating assembly. For example, an angle sensor or a Hall sensor is arranged on the rotation shaft of the rotating assembly to detect the shaft's rotation direction and rotation angle; since the camera rotates with the rotation shaft, detecting the shaft's rotation yields the angle through which the camera has turned.
In addition, in some embodiments the camera may be hidden inside the housing or at the back of the mobile terminal, blocked by the terminal body, in which case it is also necessary to detect whether the camera has been extended out of the housing. Specifically, after the user's direction relative to the mobile terminal is obtained, it is determined whether the camera has been extended out of the terminal body; if so, the camera is controlled to rotate according to the direction so that it faces the user.
One way to judge whether the camera has been extended out of the body of the mobile terminal is to arrange a distance sensor at the camera. While the camera is inside the terminal body, the sensor is blocked by the body and its reading stays at a low value; once the camera is extended out of the body, the distance value acquired by the sensor increases. It can therefore be judged whether the distance value acquired by the sensor exceeds a preset value: if it does, the camera is determined to have moved to the predetermined position; if it is less than or equal to the preset value, the camera is determined not to have moved there. The preset value is a value set by the user, used to indicate that when the camera has been extended out of the body, the sensor is no longer blocked by the terminal body.
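The threshold test just described is simple to sketch; the threshold value and units here are illustrative assumptions, not taken from the patent:

```python
def camera_extended(distance_mm, preset_mm=50.0):
    """The distance sensor at the camera reads a small value while the
    camera is occluded by the terminal body; once the camera slides out,
    the reading grows past the user-set preset value."""
    return distance_mm > preset_mm

print(camera_extended(12.0))   # -> False: camera still inside the body
print(camera_extended(220.0))  # -> True: camera has been extended
```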
In addition, while the camera is shooting toward the user, the image it captures is acquired and contour extraction is performed on it; from the extracted contour information it is determined whether the captured image contains a face image. If not, the camera is moved within the region facing the user until the captured image contains a face image, whose position is then tracked and shot. Specifically, the camera's movement is controlled according to the completeness and position of the facial contour in each captured photo that contains a face image. As shown in Fig. 5, at a first moment the facial contour in the image captured by the camera, Fig. 5(a), lies, for example, in the middle of the image; at the next moment, i.e. the second moment, Fig. 5(b), the facial contour lies, for example, at the right of the image with its right half not captured, so the camera is controlled to turn right.
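The recentering decision in the Fig. 5 example can be sketched from the face contour's horizontal position in the frame. A minimal sketch with assumed names and an assumed dead-zone tolerance, not the patent's algorithm:

```python
def correction_for_face(face_cx, frame_width, dead_zone=0.1):
    """Given the x-centre of the detected facial contour (pixels) and
    the frame width, return which way the camera should pan so the face
    moves toward the frame centre; dead_zone is the tolerated offset as
    a fraction of the frame width."""
    offset = (face_cx - frame_width / 2) / frame_width
    if offset > dead_zone:
        return "right"  # face sits right of centre, as in Fig. 5(b)
    if offset < -dead_zone:
        return "left"
    return "hold"       # face roughly centred, as in Fig. 5(a)

print(correction_for_face(960, 1280))  # -> "right"
```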
In addition, when there are multiple users in the surroundings of the mobile terminal, one sound source may be selected from the users' sound sources and the user of that sound source shot; alternatively, the direction of every sound source, i.e. every user's direction relative to the mobile terminal, may be obtained and each user shot in turn.
In the embodiment of the present application, the mobile terminal can control the camera to shoot one or more particular users among multiple users. Specifically, referring to Fig. 6, the embodiment of the present application provides a camera control method for improving the user's experience when taking pictures; the method includes steps S601 to S604.
S601: When the screen of the mobile terminal displays the photographing interface, obtain the audio signals input by multiple users and acquired by the audio collection device, as alternative audio signals.
Each audio signal input by a user in the current environment of the mobile terminal is acquired. Specifically, because the sound sources differ, the audio parameters of the audio signals from different sources differ, so different audio signals can be distinguished. The audio parameters may be parameters such as the frequency, amplitude and phase of the audio signal. After each user-input audio signal is obtained, it is parsed to obtain its corresponding audio parameters.
S602: Search the alternative audio signals for the audio signal that meets a preset condition, as the user-input audio signal acquired this time.
The preset condition is a condition set by the user according to actual use. For example, it may require the audio parameters of a signal to meet some criterion, such as the signal strength exceeding some value; the signal that does is taken as the user-input audio signal acquired this time, i.e. the target user's audio signal.
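The strength-based preset condition can be sketched as a filter-and-select over the candidate signals. The `(label, strength)` representation is a simplified stand-in for the parsed audio parameters; the names and values are illustrative:

```python
def pick_target_signal(candidates, min_strength):
    """From the alternative audio signals, keep those whose strength
    exceeds the preset value and return the strongest as the target
    user's signal, or None if no candidate qualifies."""
    qualified = [c for c in candidates if c[1] > min_strength]
    return max(qualified, key=lambda c: c[1], default=None)

signals = [("user_a", 0.32), ("user_b", 0.71), ("user_c", 0.18)]
print(pick_target_signal(signals, 0.3))  # -> ('user_b', 0.71)
```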
In addition, since each audio signal corresponds to a different voice, each audio signal can correspond to a different user identity, so which audio signals meet the preset condition can be looked up through the user identity corresponding to each signal. Specifically, according to a preset correspondence between audio signals and identity information, the identity information corresponding to each of the alternative audio signals is determined.
Different users record audio signals in the mobile terminal in advance, from which the mobile terminal establishes the preset correspondence between audio signals and identity information. For example, the correspondence is as shown in table 2 below:
Table 2

Identity information | The audio frequency parameter of audio signal |
ID1 | V1 |
ID2 | V2 |
ID3 | V3 |
Different audio parameters correspond to different identity information; the identity information in the correspondence may belong to frequent users of the mobile terminal or to users related to the mobile terminal's owner.
After the audio signals of the users in the mobile terminal's current environment are acquired by the audio collection device, the audio parameters of each signal are obtained, and the identity information corresponding to each acquired audio parameter is looked up in table 2 above. If no corresponding identity information is found in the correspondence for an audio signal, that signal is discarded.
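The lookup-and-discard step can be sketched as a dictionary lookup. The table contents mirror the ID1/ID2/ID3 example in the text, but the parameter fingerprints `V1`…`V3` being plain strings is a simplifying assumption:

```python
# Hypothetical Table 2: audio parameter fingerprint -> identity information.
VOICE_TABLE = {"V1": "ID1", "V2": "ID2", "V3": "ID3"}

def identify(signal_params, table=VOICE_TABLE):
    """Map each captured signal's audio parameter to identity
    information; signals whose parameter has no entry in the
    correspondence are discarded."""
    return {p: table[p] for p in signal_params if p in table}

print(identify(["V2", "V9", "V1"]))  # -> {'V2': 'ID2', 'V1': 'ID1'}
```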
Then, among the identity information found for all the audio signals, the identity information that meets a preset identity standard is searched for. The preset identity standard is a standard set according to identity priority or according to demand.
In some embodiments, taking the audio signal corresponding to the identity information that meets the preset identity standard, among all the identity information, as the user-input audio signal acquired this time is implemented as: taking, among all the identity information, the audio signal corresponding to the identity information that matches preset identity information as the user-input audio signal acquired this time.
The preset identity information is identity information set in the mobile terminal in advance. For example, each mobile terminal corresponds to an administrator, generally the owner of the mobile terminal, whose telephone number is usually the number of the SIM card in the terminal. A preset identity information list is then set in the mobile terminal; it stores the preset identity information, which is set by the administrator and exists in the correspondence shown in table 2 above. For example, table 2 contains three pieces of identity information, ID1, ID2 and ID3, and the preset identity information list stores ID1. That is, according to table 2, ID1, ID2 and ID3 can all be found from the collected audio signals, but only ID1 matches the preset identity information. Among the audio information of all users currently acquired by the audio collector, the audio information corresponding to identity information ID1 is then taken as the user-input audio signal acquired this time, so that the camera is turned toward the user with identity information ID1. Therefore, by setting the preset identity information according to their own needs, the user can make the camera track and shoot different users.
In other embodiments, taking the audio signal corresponding to the identity information that meets the preset identity standard, among all the identity information, as the user-input audio signal acquired this time is implemented as: obtaining the priority corresponding to each piece of identity information, and taking the audio signal corresponding to the identity information whose priority meets a preset level standard as the user-input audio signal acquired this time.
Specifically, the priority corresponding to each piece of identity information is preset in the mobile terminal. For example, it may be stored in a correspondence table of identity information and priority, which stores multiple pieces of identity information and the priority corresponding to each. As an implementation, the correspondence between identity information and priority may be stored together with the correspondence between audio signals and identity information, as shown in table 3 below:
Table 3
Identity information | The audio frequency parameter of audio signal | Priority |
ID1 | V1 | J1 |
ID2 | V2 | J2 |
ID3 | V3 | J3 |
As shown in table 3 above, J1, J2 and J3 denote different priorities; as an implementation, the smaller the number after "J", the higher the priority, so among J1, J2 and J3, J1 is highest, J2 second and J3 lowest. According to the correspondence between audio signals and identity information, the identity information corresponding to each user's acquired audio information is determined, and then the priority corresponding to each user's audio information.
The audio signal corresponding to the identity information whose priority meets the preset level standard is taken as the user-input audio signal acquired this time. The preset level standard can be set by the user according to demand; for example, it can be the lowest priority, the highest priority, or a specified priority.
In the embodiment of the present application, the preset level standard is the highest priority. Taking the audio signal corresponding to the identity information whose priority meets the preset level standard as the user-input audio signal acquired this time is then implemented as: taking the audio signal corresponding to the identity information with the highest priority as the user-input audio signal acquired this time, i.e. the target user's audio signal.
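The highest-priority selection over Table 3 can be sketched directly; the table contents follow the text's J1 > J2 > J3 convention, with smaller numbers meaning higher priority:

```python
# Hypothetical Table 3: identity -> (audio parameter, priority number);
# a smaller number after "J" means a higher priority, so 1 outranks 2 and 3.
PRIORITY_TABLE = {"ID1": ("V1", 1), "ID2": ("V2", 2), "ID3": ("V3", 3)}

def highest_priority_identity(identities, table=PRIORITY_TABLE):
    """Of the identities recognised in the captured audio, return the
    one with the highest priority; its audio signal becomes the target
    user's signal. Unknown identities are ignored."""
    known = [i for i in identities if i in table]
    return min(known, key=lambda i: table[i][1], default=None)

print(highest_priority_identity(["ID3", "ID2"]))  # -> 'ID2'
```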
S603: Determine the user's direction relative to the mobile terminal according to the audio signal.
After the target user's audio signal is determined, only the direction, relative to the mobile terminal, of the user of that audio signal is determined, and the audio signals of the other users are discarded.
As an implementation, after the target user's audio signal is determined, the other audio signals can be filtered out according to its audio parameters. For example, if the frequency range in the target user's audio parameters is A Hz to B Hz, where A and B are different values and A is less than B, a band-pass filter whose two cutoff frequencies are A Hz and B Hz is set. The band-pass filter then filters out the audio signals outside A Hz to B Hz, while the signals within A Hz to B Hz are further distinguished by their other parameters, such as amplitude, average power or region of frequency concentration.
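The band-pass step can be sketched with an FFT mask standing in for a real band-pass filter; this is a simplification of the patent's filter, and the sample rate and test tones are assumptions:

```python
import numpy as np

def isolate_band(audio, low_hz, high_hz, rate):
    """Keep only the spectral components between the two cutoff
    frequencies (the target user's A Hz and B Hz); everything outside
    the band is zeroed. An FFT-mask stand-in for a band-pass filter."""
    spectrum = np.fft.rfft(audio)
    freqs = np.fft.rfftfreq(len(audio), d=1.0 / rate)
    spectrum[(freqs < low_hz) | (freqs > high_hz)] = 0.0
    return np.fft.irfft(spectrum, n=len(audio))

# A 500 Hz target voice mixed with a 3 kHz interferer; keeping the
# 400-600 Hz band retains only the target component.
rate = 16000
t = np.arange(rate) / rate
mixed = np.sin(2 * np.pi * 500 * t) + np.sin(2 * np.pi * 3000 * t)
target = isolate_band(mixed, 400.0, 600.0, rate)
```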
S604: Control the camera to rotate according to the direction, so that the camera faces the user.
Illustratively, there are three people around the mobile terminal, all talking. The mobile terminal acquires the three users' audio signals, determines a target user's audio signal from among them, and the camera then automatically turns toward the identified target user.
It should be noted that the parts of the above steps not described in detail can refer to the previous embodiments and are not repeated here.
Referring to Fig. 7, the embodiment of the present application provides a camera control method for improving the user's experience when taking pictures; specifically, the method includes steps S701 to S706.
S701: When the screen of the mobile terminal displays the photographing interface, obtain the audio signals input by multiple users and acquired by the audio collection device, as alternative audio signals.
S702: Search the alternative audio signals for the audio signal that meets the preset condition, as the user-input audio signal acquired this time.
S703: Determine the user's direction relative to the mobile terminal according to the audio signal.
S704: Control the camera to rotate according to the direction, so that the camera faces the user.
S705: Determine the distance between the user and the mobile terminal.
S706: Adjust the parameters of the camera according to the distance.
After the camera is controlled to rotate according to the direction information so that it faces the user, the method further includes: determining the distance between the user and the mobile terminal, and adjusting the parameters of the camera according to the distance.
Specifically, a distance sensor is provided on the mobile terminal to obtain the distance between the user and the mobile terminal. The distance sensor may include a signal emitter and a signal receiver: the signal sent by the emitter is reflected by the user and then received by the receiver, and from the emission time and the reception time the signal's propagation time is determined, from which the distance between the user and the mobile terminal can be determined.
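The round-trip computation just described reduces to halving the propagation time. The choice of an ultrasonic emitter (and hence the speed of sound) is an assumption for illustration; the patent does not specify the signal type:

```python
SPEED_OF_SOUND_M_S = 343.0  # assuming an ultrasonic emitter/receiver pair

def user_distance(emit_time_s, receive_time_s, speed=SPEED_OF_SOUND_M_S):
    """The signal leaves the emitter, reflects off the user and returns
    to the receiver; half the round-trip propagation time times the
    propagation speed gives the user's distance."""
    return speed * (receive_time_s - emit_time_s) / 2.0

print(user_distance(0.000, 0.010))  # -> 1.715 (metres, for a 10 ms round trip)
```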
After the distance between the user and the mobile terminal is determined, adjusting the camera's parameters according to the distance may specifically include adjusting the focal length or the aperture size. For example, when the distance between the user and the mobile terminal exceeds a preset value, the camera's aperture is set to a small aperture, i.e. an aperture value less than a default aperture value, where the default aperture value is half the lens's maximum aperture value, so that the captured picture is clearer; when the distance is less than or equal to the preset value, the aperture is set to a large aperture, i.e. an aperture value greater than or equal to the default aperture value, to obtain a background-blurring effect around the user's image.
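The distance-to-aperture rule can be sketched as a threshold on the measured distance. The f-numbers and the 3 m preset below are illustrative assumptions (recall that a small aperture corresponds to a large f-number), not values from the patent:

```python
def f_number_for_distance(distance_m, preset_m=3.0):
    """Distant subjects get a small aperture (large f-number) so the
    whole scene stays sharp; near subjects get a large aperture (small
    f-number) to blur the background behind the user."""
    return 8.0 if distance_m > preset_m else 1.8

print(f_number_for_distance(5.0))  # -> 8.0 (small aperture, distant user)
print(f_number_for_distance(1.2))  # -> 1.8 (large aperture, bokeh)
```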
The process of determining the distance between the user and the mobile terminal and adjusting the camera's parameters according to the distance is not limited to being executed in the embodiment corresponding to Fig. 7; it can also be executed in the embodiment corresponding to Fig. 4, specifically after the camera is controlled to rotate according to the direction so that it faces the user.
It should be noted that the parts of the above steps not described in detail can refer to the previous embodiments and are not repeated here.
Referring to Fig. 8, the embodiment of the present application provides a camera control apparatus 800 for improving the user's experience when taking pictures. Specifically, the apparatus includes an acquiring unit 801, a determination unit 802 and a control unit 803.
The acquiring unit 801 is configured to obtain, when the screen of the mobile terminal displays the photographing interface, the user-input audio signal acquired by the audio collection device.
The determination unit 802 is configured to determine the user's direction relative to the mobile terminal according to the audio signal.
The control unit 803 is configured to control the camera to rotate according to the direction, so that the camera faces the user.
It is apparent to those skilled in the art that, for convenience and brevity of description, the specific working process of the apparatus and units described above can refer to the corresponding process in the foregoing method embodiments and is not described again here.
Referring to Fig. 9, the embodiment of the present application provides a camera control apparatus 900 for improving the user's experience when taking pictures. Specifically, the apparatus includes an acquiring unit 901, a determination unit 902, a control unit 903 and an adjustment unit 904.
The acquiring unit 901 is configured to obtain, when the screen of the mobile terminal displays the photographing interface, the user-input audio signal acquired by the audio collection device.
The determination unit 902 is configured to determine the user's direction relative to the mobile terminal according to the audio signal.
The control unit 903 is configured to control the camera to rotate according to the direction, so that the camera faces the user.
The adjustment unit 904 is configured to determine the distance between the user and the mobile terminal and adjust the parameters of the camera according to the distance.
It is apparent to those skilled in the art that, for convenience and brevity of description, the specific working process of the apparatus and units described above can refer to the corresponding process in the foregoing method embodiments and is not described again here.
Referring to Fig. 10, based on the above camera control method and apparatus, the embodiment of the present application further provides a mobile terminal 100; what is shown in Fig. 10 is the mobile terminal described in Figs. 1-3 above with the rotating assembly retracted into the terminal body or at its back. Specifically, the mobile terminal 100 includes an electronic body portion 10, which includes a housing 12 and a main display 120 arranged on the housing 12. The housing 12 can be made of metal, such as steel or aluminium alloy. In this embodiment, the main display 120 generally includes a display panel 111 and may also include circuitry for responding to touch operations on the display panel 111. The display panel 111 can be a liquid crystal display (Liquid Crystal Display, LCD) panel; in some embodiments, the display panel 111 is also a touch screen 109.
Referring to Fig. 11, in actual application scenarios the mobile terminal 100 can be used as a smartphone, in which case the electronic body portion 10 typically further includes one or more (only one is shown in the figure) processors 102, a memory 104, an RF (Radio Frequency) module 106, an audio circuit 110, a sensor 114, an input module 118 and a power module 122. Those skilled in the art will appreciate that the structure shown in Fig. 11 is only illustrative and does not limit the structure of the electronic body portion 10. For example, the electronic body portion 10 may include more or fewer components than shown in Fig. 11, or have a configuration different from that shown in Fig. 10.
Those skilled in the art will appreciate that, with respect to the processor 102, all the other components are peripherals, and the processor 102 is coupled to these peripherals through multiple peripheral interfaces 124. The peripheral interfaces 124 can be implemented based on the following standards: Universal Asynchronous Receiver/Transmitter (UART), General Purpose Input/Output (GPIO), Serial Peripheral Interface (SPI) and Inter-Integrated Circuit (I2C), but are not limited to these standards. In some examples, a peripheral interface 124 may comprise only a bus; in other examples it may also include other elements, such as one or more controllers, for example a display controller for connecting the display panel 111 or a storage controller for connecting the memory. These controllers may also be separated from the peripheral interface 124 and integrated in the processor 102 or in the corresponding peripheral.
The memory 104 can be used to store software programs and modules; the processor 102 executes various functional applications and performs data processing by running the software programs and modules stored in the memory 104. The memory 104 may include high-speed random access memory and may also include non-volatile memory, such as one or more magnetic storage devices, flash memory or other non-volatile solid-state memory. In some examples, the memory 104 may further include memory remote from the processor 102; such remote memory can be connected to the electronic body portion 10 or the main display 120 through a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks and combinations thereof.
The RF module 106 is used to receive and transmit electromagnetic waves, converting between electromagnetic waves and electrical signals so as to communicate with a communication network or other devices. The RF module 106 may include various existing circuit elements for performing these functions, such as an antenna, an RF transceiver, a digital signal processor, an encryption/decryption chip, a subscriber identity module (SIM) card and memory. The RF module 106 can communicate with various networks, such as the internet, intranets and wireless networks, or communicate with other devices through a wireless network. The wireless network may include a cellular telephone network, a wireless local area network or a metropolitan area network, and may use various communication standards, protocols and technologies, including but not limited to the Global System for Mobile Communication (GSM), Enhanced Data GSM Environment (EDGE), wideband code division multiple access (W-CDMA), code division multiple access (CDMA), time division multiple access (TDMA), Wireless Fidelity (WiFi) (such as the IEEE standards IEEE 802.11a, IEEE 802.11b, IEEE 802.11g and/or IEEE 802.11n), Voice over Internet Protocol (VoIP), Worldwide Interoperability for Microwave Access (Wi-Max), other protocols for mail, instant messaging and short messages, any other suitable communication protocol, and even protocols not yet developed.
The audio circuit 110, a receiver 101, a sound jack 103 and a microphone 105 jointly provide an audio interface between the user and the electronic body portion 10 or the main display 120. Specifically, the audio circuit 110 receives audio data from the processor 102, converts it into an electrical signal and transmits the electrical signal to the receiver 101, which converts it into sound waves audible to the human ear. The audio circuit 110 also receives electrical signals from the microphone 105, converts them into audio data, and transmits the data to the processor 102 for further processing, such as in network telephony. Audio data can be obtained from the memory 104 or through the RF module 106; in addition, audio data can be stored in the memory 104 or sent through the RF module 106.
The sensor 114 is arranged in the electronic body portion 10 or in the main display 120; examples of the sensor 114 include, but are not limited to, an optical sensor, a pressure sensor, an acceleration sensor 114F, a proximity sensor 114J and other sensors.
Specifically, the optical sensor may include an ambient light sensor, and a pressure sensor may also be provided. The pressure sensor can detect the pressure generated by pressing on the mobile terminal 100, that is, the pressure generated by contact or pressing between the user and the mobile terminal, for example between the user's ear and the mobile terminal. The pressure sensor can therefore be used to determine whether contact or pressing has occurred between the user and the mobile terminal 100, as well as the magnitude of the pressure.
Referring to Fig. 10, specifically in the embodiment shown there, the ambient light sensor and the pressure sensor are arranged adjacent to the display panel 111. Based on the ambient light sensor, the processor 102 can turn off the display output when an object comes close to the main display 120, for example when the electronic body portion 10 is moved to the ear.
As a motion sensor, the acceleration sensor 114F can detect the magnitude of acceleration in all directions (generally three axes) and, when static, the magnitude and direction of gravity. It can be used for applications that recognise the posture of the mobile terminal 100 (such as landscape/portrait switching, related games and magnetometer pose calibration) and for vibration-recognition functions (such as a pedometer or tap detection). In addition, the electronic body portion 10 can also be equipped with other sensors such as a gyroscope, barometer, hygrometer and thermometer, which are not described again here.
In this embodiment, the input module 118 may include the touch screen 109 arranged on the main display 120. The touch screen 109 collects touch operations by the user on or near it (for example, operations by the user with a finger, stylus or any other suitable object or accessory on or near the touch screen 109) and drives the corresponding connected device according to a preset program. Optionally, the touch screen 109 may include a touch detection device and a touch controller: the touch detection device detects the position of the user's touch and the signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into contact coordinates, sends them to the processor 102, and can receive and execute commands sent by the processor 102. Furthermore, the touch detection function of the touch screen 109 can be implemented using various types such as resistive, capacitive, infrared and surface acoustic wave.
The main display 120 is used to display information input by the user, information provided to the user, and the various graphical user interfaces of the electronic body portion 10, which can consist of graphics, text, icons, numbers, video and any combination thereof. In one example, the touch screen 109 may be arranged on the display panel 111 so as to form a whole with it.
The power module 122 is used to supply power to the processor 102 and the other components. Specifically, the power module 122 may include a power management system, one or more power sources (such as a battery or alternating current), a charging circuit, a power failure detection circuit, an inverter, a power status indicator, and any other components related to the generation, management and distribution of power in the electronic body portion 10 or the main display 120.
The mobile terminal 100 further includes a locator 119 for determining the physical location of the mobile terminal 100. In this embodiment, the locator 119 realises the positioning of the mobile terminal 100 using a positioning service, which should be understood as a technology or service that obtains the location information of the mobile terminal 100 (for example latitude and longitude coordinates) through a specific positioning technique and marks the position of the located object on an electronic map.
In conclusion the embodiment of the present application provide a kind of camera control method, device, mobile terminal and computer can
Medium is read, the user that when the screen display of the mobile terminal takes pictures interface, can obtain the audio collection device acquisition is defeated
The audio signal entered determines direction of the user relative to the mobile terminal further according to the audio signal acquired, is determining
After direction, control camera rotation so that camera is towards the direction, to make the camera towards the user.By
This, user does not need extra adjustment, it is only necessary to be rotated by voice control camera, it will be able to realize that camera is directed at user
Shooting reduces operation, improves user experience.
In the description of this specification, reference to the terms "one embodiment", "some embodiments", "example", "specific example" or "some examples" means that a specific feature, structure, material or characteristic described in conjunction with the embodiment or example is included in at least one embodiment or example of the present application. In this specification, schematic expressions of these terms do not necessarily refer to the same embodiment or example. Moreover, the specific features, structures, materials or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. In addition, where no contradiction arises, those skilled in the art can combine different embodiments or examples described in this specification and the features of different embodiments or examples.
In addition, the terms "first" and "second" are used for descriptive purposes only and are not to be understood as indicating or implying relative importance or implicitly indicating the number of the technical features indicated. Thus, a feature defined with "first" or "second" may explicitly or implicitly include at least one such feature. In the description of the present application, "plurality" means at least two, for example two or three, unless otherwise specifically defined.
Any process or method description in a flowchart, or otherwise described herein, may be understood as representing a module, segment, or portion of code that comprises one or more executable instructions for implementing a specific logical function or step of the process. The scope of the preferred embodiments of the present application includes other implementations, in which functions may be executed out of the order shown or discussed, including in a substantially simultaneous manner or in the reverse order depending on the functions involved, as should be understood by those skilled in the art to which the embodiments of the present application pertain.
The logic and/or steps represented in a flowchart or otherwise described herein may, for example, be considered an ordered list of executable instructions for implementing logical functions, and may be embodied in any computer-readable medium for use by, or in connection with, an instruction execution system, device, or apparatus (such as a computer-based system, a system including a processor, or another system that can fetch instructions from an instruction execution system, device, or apparatus and execute them). For the purposes of this specification, a "computer-readable medium" may be any means that can contain, store, communicate, propagate, or transmit a program for use by, or in connection with, an instruction execution system, device, or apparatus. More specific examples (a non-exhaustive list) of the computer-readable medium include: an electrical connection (an electronic device) having one or more wires, a portable computer diskette (a magnetic device), a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber device, and a portable compact disc read-only memory (CD-ROM). In addition, the computer-readable medium may even be paper or another suitable medium on which the program can be printed, because the program can be obtained electronically, for example by optically scanning the paper or other medium and then editing, interpreting, or otherwise processing it in a suitable manner if necessary, and then stored in a computer memory.
It should be appreciated that each part of the present application may be implemented in hardware, software, firmware, or a combination thereof. In the above embodiments, multiple steps or methods may be implemented in software or firmware that is stored in a memory and executed by a suitable instruction execution system. For example, if implemented in hardware, as in another embodiment, any one of the following technologies well known in the art, or a combination thereof, may be used: a discrete logic circuit having logic gate circuits for implementing logic functions on data signals, an application-specific integrated circuit having suitable combinational logic gate circuits, a programmable gate array (PGA), a field-programmable gate array (FPGA), and so on.
Those of ordinary skill in the art will appreciate that all or part of the steps carried by the methods of the above embodiments may be completed by a program instructing relevant hardware. The program may be stored in a computer-readable storage medium and, when executed, performs one of the steps of the method embodiments or a combination thereof. In addition, each functional unit in the embodiments of the present application may be integrated in one processing module, or each unit may exist physically alone, or two or more units may be integrated in one module. The above integrated module may be implemented in the form of hardware or in the form of a software functional module. If the integrated module is implemented in the form of a software functional module and sold or used as an independent product, it may also be stored in a computer-readable storage medium.
The storage medium mentioned above may be a read-only memory, a magnetic disk, an optical disc, or the like. Although the embodiments of the present application have been shown and described above, it should be understood that the above embodiments are exemplary and should not be construed as limiting the present application; those of ordinary skill in the art may make changes, modifications, replacements, and variations to the above embodiments within the scope of the present application.
Finally, it should be noted that the above embodiments are only intended to illustrate the technical solutions of the present application, not to limit them. Although the present application has been described in detail with reference to the foregoing embodiments, those skilled in the art should understand that they may still modify the technical solutions described in the foregoing embodiments, or equivalently replace some of the technical features therein; and such modifications or replacements do not cause the essence of the corresponding technical solutions to depart from the spirit and scope of the technical solutions of the embodiments of the present application.
Claims (10)
1. A camera control method, applied to a mobile terminal, the mobile terminal comprising a terminal body, a camera, and an audio collection device, the camera being mounted on the terminal body and being rotatable under control, the method comprising:
when the screen of the mobile terminal displays a photographing interface, obtaining an audio signal input by a user and collected by the audio collection device;
determining a direction of the user relative to the mobile terminal according to the audio signal;
controlling the camera to rotate according to the direction, so that the camera faces the user.
2. The method according to claim 1, wherein obtaining the audio signal input by the user and collected by the audio collection device when the screen of the mobile terminal displays the photographing interface comprises:
when the screen of the mobile terminal displays the photographing interface, obtaining a plurality of audio signals input by users and collected by the audio collection device, as alternative audio signals;
searching the alternative audio signals for an audio signal that meets a preset condition, as the currently obtained audio signal input by the user.
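The claim leaves the "preset condition" open. As a purely illustrative sketch (the threshold and function name `select_user_audio` are assumptions, not taken from the disclosure), one simple preset condition is a minimum signal energy, so that faint background speech is rejected:

```python
def select_user_audio(alternatives, min_energy=0.5):
    """Hypothetical preset condition: return the first alternative
    audio signal whose energy reaches the threshold, treating it as
    the user's input; return None if no candidate qualifies."""
    for signal in alternatives:
        energy = sum(x * x for x in signal)  # sum of squared samples
        if energy >= min_energy:
            return signal
    return None
```

Claims 3-5 describe richer conditions (identity matching, priorities) that would replace this energy test.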
3. The method according to claim 2, wherein searching the alternative audio signals for the audio signal that meets the preset condition, as the currently obtained audio signal input by the user, comprises:
determining, according to a preset correspondence between audio signals and identity information, the identity information corresponding to each audio signal among the alternative audio signals;
taking the audio signal corresponding to the identity information that meets a preset identity standard, among all the identity information, as the currently obtained audio signal input by the user.
4. The method according to claim 3, wherein taking the audio signal corresponding to the identity information that meets the preset identity standard, among all the identity information, as the currently obtained audio signal input by the user, comprises:
taking, among all the identity information, the audio signal corresponding to the identity information that matches preset identity information as the currently obtained audio signal input by the user.
5. The method according to claim 3, wherein taking the audio signal corresponding to the identity information that meets the preset identity standard, among all the identity information, as the currently obtained audio signal input by the user, comprises:
obtaining a priority corresponding to each piece of identity information among all the identity information;
taking the audio signal corresponding to the identity information whose priority meets a preset level standard as the currently obtained audio signal input by the user.
6. The method according to claim 1, wherein after controlling the camera to rotate according to the direction so that the camera faces the user, the method further comprises:
determining a distance between the user and the mobile terminal;
adjusting a parameter of the camera according to the distance.
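The claim does not say which parameter is adjusted or how. As an illustration under assumed thresholds (the cutoffs, zoom cap, and focus-mode names here are hypothetical, not from the disclosure), one plausible mapping keeps nearby subjects at 1x with a close-focus mode and zooms in proportionally on distant ones:

```python
def adjust_camera_params(distance_m):
    """Hypothetical distance-to-parameter mapping for claim 6:
    choose a focus mode and zoom factor from the user's distance (m)."""
    if distance_m < 0.5:
        return {"zoom": 1.0, "focus_mode": "macro"}
    if distance_m < 2.0:
        return {"zoom": 1.0, "focus_mode": "portrait"}
    # Beyond ~2 m, zoom in proportionally, capped at 4x.
    return {"zoom": min(4.0, distance_m / 2.0), "focus_mode": "auto"}
```

The distance itself could come, for example, from the relative loudness of the audio signal or from a depth sensor; the claim leaves this open.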
7. The method according to claim 1, wherein a plurality of audio collection devices are circumferentially arranged on the mobile terminal, and determining the direction of the user relative to the mobile terminal according to the audio signal comprises:
obtaining the audio signal collected by each audio collection device;
determining the direction of the user relative to the mobile terminal according to the audio signals collected by the plurality of audio collection devices.
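The claim does not name a localization algorithm. A standard technique for multi-microphone direction finding, shown here only as an assumed example, is time-difference-of-arrival (TDOA): the delay between two microphones fixes the angle of arrival for a far-field source (the function name and the two-microphone simplification are illustrative, not from the disclosure):

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air at ~20 degrees C

def tdoa_azimuth(delay_s, mic_spacing_m):
    """Estimate the angle of arrival in radians (0 = broadside to the
    microphone pair) from the inter-microphone arrival-time delay,
    assuming a far-field source."""
    # Path-length difference implied by the delay.
    path_diff = SPEED_OF_SOUND * delay_s
    # Clamp to the physically valid range before taking asin,
    # guarding against measurement noise.
    ratio = max(-1.0, min(1.0, path_diff / mic_spacing_m))
    return math.asin(ratio)
```

With more than two microphones arranged circumferentially, pairwise delay estimates (typically obtained by cross-correlating the signals) can be combined to resolve a full 360-degree direction.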
8. A camera control device, applied to a mobile terminal, the mobile terminal comprising a terminal body, a camera, and an audio collection device, the camera being mounted on the terminal body and being rotatable under control, the device comprising:
an acquiring unit, configured to obtain an audio signal input by a user and collected by the audio collection device when the screen of the mobile terminal displays a photographing interface;
a determination unit, configured to determine a direction of the user relative to the mobile terminal according to the audio signal;
a control unit, configured to control the camera to rotate according to the direction, so that the camera faces the user.
9. A mobile terminal, comprising a terminal body, a camera, and an audio collection device, the camera being mounted on the terminal body and being rotatable under control; the mobile terminal further comprising a memory and a processor, the memory being coupled to the processor, the memory storing instructions that, when executed by the processor, cause the processor to perform the following operations:
when the screen of the mobile terminal displays a photographing interface, obtaining an audio signal input by a user and collected by the audio collection device;
determining a direction of the user relative to the mobile terminal according to the audio signal;
controlling the camera to rotate according to the direction, so that the camera faces the user.
10. A computer-readable medium having program code executable by a processor, wherein the program code causes the processor to execute the method of any one of claims 1-7.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810382858.7A CN108668077A (en) | 2018-04-25 | 2018-04-25 | Camera control method, device, mobile terminal and computer-readable medium |
Publications (1)
Publication Number | Publication Date |
---|---|
CN108668077A true CN108668077A (en) | 2018-10-16 |
Family
ID=63781160
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201810382858.7A Pending CN108668077A (en) | 2018-04-25 | 2018-04-25 | Camera control method, device, mobile terminal and computer-readable medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN108668077A (en) |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104935819A (en) * | 2015-06-11 | 2015-09-23 | 广东欧珀移动通信有限公司 | Method for controlling camera to shoot and terminal |
CN104954673A (en) * | 2015-06-11 | 2015-09-30 | 广东欧珀移动通信有限公司 | Camera rotating control method and user terminal |
JP2017034570A (en) * | 2015-08-05 | 2017-02-09 | キヤノン株式会社 | Imaging apparatus |
CN107682634A (en) * | 2017-10-18 | 2018-02-09 | 维沃移动通信有限公司 | A kind of facial image acquisition methods and mobile terminal |
CN107800967A (en) * | 2017-10-30 | 2018-03-13 | 维沃移动通信有限公司 | A kind of image pickup method and mobile terminal |
Cited By (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109361792A (en) * | 2018-11-16 | 2019-02-19 | 维沃移动通信有限公司 | A kind of mobile terminal, control method and device |
CN109889717A (en) * | 2019-03-29 | 2019-06-14 | 维沃移动通信有限公司 | A kind of terminal and terminal control method, device |
CN110113541A (en) * | 2019-06-18 | 2019-08-09 | 华勤通讯技术有限公司 | The control method and system that the camera of smart machine rotates automatically |
CN110113541B (en) * | 2019-06-18 | 2021-01-29 | 华勤技术股份有限公司 | Control method and system for automatic rotation of camera of intelligent equipment |
CN112135035A (en) * | 2019-06-24 | 2020-12-25 | 北京小米移动软件有限公司 | Control method and device of image acquisition assembly and storage medium |
CN110570850A (en) * | 2019-07-30 | 2019-12-13 | 珠海格力电器股份有限公司 | Voice control method, device, computer equipment and storage medium |
CN110505401A (en) * | 2019-08-16 | 2019-11-26 | 维沃移动通信有限公司 | A kind of camera control method and electronic equipment |
CN110769154A (en) * | 2019-10-30 | 2020-02-07 | 维沃移动通信(杭州)有限公司 | Shooting method and electronic equipment |
CN110769154B (en) * | 2019-10-30 | 2021-05-18 | 维沃移动通信(杭州)有限公司 | Shooting method and electronic equipment |
CN110881909A (en) * | 2019-12-20 | 2020-03-17 | 小狗电器互联网科技(北京)股份有限公司 | Control method and device of sweeper |
CN111708383A (en) * | 2020-07-01 | 2020-09-25 | 海信视像科技股份有限公司 | Method for adjusting shooting angle of camera and display device |
WO2022001406A1 (en) * | 2020-07-01 | 2022-01-06 | 海信视像科技股份有限公司 | Display method and display device |
US12028617B2 (en) | 2020-07-01 | 2024-07-02 | Hisense Visual Technology Co., Ltd. | Display apparatus and processing method for display apparatus with camera |
CN112866772A (en) * | 2020-08-21 | 2021-05-28 | 海信视像科技股份有限公司 | Display device and sound image character positioning and tracking method |
CN112866772B (en) * | 2020-08-21 | 2022-08-12 | 海信视像科技股份有限公司 | Display device and sound image character positioning and tracking method |
CN113419652A (en) * | 2021-06-22 | 2021-09-21 | 中国联合网络通信集团有限公司 | Intelligent equipment interaction method, server and client |
CN116129157A (en) * | 2023-04-13 | 2023-05-16 | 深圳市夜行人科技有限公司 | Intelligent image processing method and system for warning camera based on extreme low light level |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN108668077A (en) | Camera control method, device, mobile terminal and computer-readable medium | |
CN109167894B (en) | Camera control method and device, mobile terminal and storage medium | |
CN108664190B (en) | Page display method, device, mobile terminal and storage medium | |
CN106412222B (en) | Mobile terminal and control method thereof | |
EP2400737B1 (en) | A method for providing an augmented reality display on a mobile device | |
EP2385687B1 (en) | Mobile terminal and control method thereof | |
KR102503945B1 (en) | Watch-type mobile terminal and method for controlling the same | |
CN108495045B (en) | Image capturing method, image capturing apparatus, electronic apparatus, and storage medium | |
CN111641794B (en) | Sound signal acquisition method and electronic equipment | |
CN108762859A (en) | Wallpaper displaying method, device, mobile terminal and storage medium | |
CN108307029A (en) | Mobile terminal and its operating method | |
JP7394879B2 (en) | Imaging method and terminal | |
CN107015776A (en) | Mobile terminal and its operating method | |
US10348970B2 (en) | Mobile terminal and method of operating the same | |
CN108989665A (en) | Image processing method, device, mobile terminal and computer-readable medium | |
CN108712602A (en) | Camera control method, device, mobile terminal and storage medium | |
CN108965691A (en) | Camera control method, device, mobile terminal and storage medium | |
KR101714207B1 (en) | Mobile terminal and method of controlling thereof | |
CN109218982A (en) | Sight spot information acquisition methods, device, mobile terminal and storage medium | |
CN108366163A (en) | Control method, device, mobile terminal and the computer-readable medium of camera applications | |
CN109104521A (en) | Bearing calibration, device, mobile terminal and the storage medium of proximity state | |
CN111464746B (en) | Photographing method and electronic equipment | |
CN109101119A (en) | terminal control method, device and mobile terminal | |
KR101657203B1 (en) | Mobile terminal and control method therof | |
CN108989666A (en) | Image pickup method, device, mobile terminal and computer-readable storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
RJ01 | Rejection of invention patent application after publication | ||
Application publication date: 2018-10-16