CN108536371A - A kind of method for displaying image and terminal - Google Patents
- Publication number: CN108536371A
- Application number: CN201810310804.XA
- Authority
- CN
- China
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04845—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/14—Systems for two-way working
- H04N7/141—Systems for two-way working between two video terminals, e.g. videophone
Abstract
The invention discloses an image display method and a terminal. The method includes: acquiring a first target object; obtaining a target image; and adjusting the display direction of the target image according to the deflection angle of a second target object in the target image relative to the first target object. By adjusting the display of the target image according to the deflection angle, the invention eliminates the display-angle deviation of the second target object relative to the first target object, thereby improving the image display effect.
Description
Technical field
Embodiments of the present invention relate to the field of communication technology, and in particular to an image display method and a terminal.
Background technology
With the development of terminal technology, terminals offer increasingly diverse functions, such as photographing, video calling, and face recognition. In application scenarios such as photographing, video calling, and face-recognition verification, a terminal may need to show an acquired target image in its display interface. At present, in the above scenarios, the terminal generally displays the obtained target image directly in its display interface, so the image display processing methods of the prior art suffer from poor flexibility.
Summary of the invention
The present invention provides an image display method and a terminal, to solve the problem of poor flexibility of image display methods in the prior art.
To solve the above technical problem, the invention is implemented as follows:
In a first aspect, an embodiment of the present invention provides an image display method applied to a terminal, wherein the method includes:
acquiring a first target object;
obtaining a target image;
adjusting the display direction of the target image according to the deflection angle of a second target object in the target image relative to the first target object.
In a second aspect, an embodiment of the present invention further provides a terminal, including:
a collection module, configured to acquire a first target object;
an acquisition module, configured to obtain a target image;
a first processing module, configured to adjust the display direction of the target image according to the deflection angle of a second target object in the target image relative to the first target object.
In a third aspect, an embodiment of the present invention further provides a terminal, including a processor, a memory, and a computer program stored in the memory and executable on the processor, where the computer program, when executed by the processor, implements the steps of the image display method described above.
In a fourth aspect, an embodiment of the present invention further provides a computer-readable storage medium storing a computer program, where the computer program, when executed by a processor, implements the steps of the image display method described above.
In the embodiments of the present invention, a first target object is acquired and a target image is obtained; the display direction of the target image is then adjusted according to the deflection angle of a second target object in the target image relative to the first target object. In this way, when the second target object has a display-angle deviation relative to the first target object, the display of the target image is adjusted to eliminate that deviation, which increases the flexibility of the image display method and improves the display effect.
Description of the drawings
Fig. 1 is a first flowchart of the image display method of an embodiment of the present invention;
Fig. 2 is a second flowchart of the image display method of an embodiment of the present invention;
Fig. 3 is a schematic diagram of the display before the target image is adjusted in an embodiment of the present invention;
Fig. 4 is a schematic diagram of the angular relationship between the first target direction and the second target direction in Fig. 3;
Fig. 5 is a schematic diagram of the display after the target image is adjusted in an embodiment of the present invention;
Fig. 6 is a schematic diagram of the angular relationship between the first target direction and the second target direction in Fig. 5;
Fig. 7 is a block diagram of the terminal of an embodiment of the present invention;
Fig. 8 is a hardware structure diagram of the mobile terminal of embodiments of the present invention.
Detailed description of the embodiments
The technical solutions in the embodiments of the present invention will be described clearly and completely below with reference to the drawings in the embodiments. Obviously, the described embodiments are only some, rather than all, of the embodiments of the present invention. All other embodiments obtained by those of ordinary skill in the art based on the embodiments of the present invention without creative effort shall fall within the protection scope of the present invention.
Referring to Fig. 1, an embodiment of the present invention provides an image display method applied to a terminal, wherein the method includes:
Step 11: acquire a first target object.
In this embodiment, acquiring the first target object specifically includes: obtaining first-target-object information collected by the terminal, and determining the first target object in that information. As an application scenario, during a video call the first-target-object information may be shown in a first image display window.
Specifically, the first-target-object information collected by the terminal may be obtained directly through the terminal's camera; the information includes but is not limited to: face image information, animal image information, fixed-object image information, and the like. Alternatively, a first video frame collected by the terminal's camera may be obtained, or the first-target-object information or the first video frame may be obtained from an image capture device (for example, an external camera).
Step 12: obtain a target image.
In this embodiment, the target image may be obtained in, but not limited to, the following ways:
Way one: receive a target image sent by another terminal, for example, a target image sent by the call peer with which the terminal has established a video call. Specifically, the target image sent by the call peer may be image information or a second video frame collected by the peer's camera, or image information or a second video frame collected by the peer through another image capture device; the invention is not limited in this respect.
Way two: obtain the target image over a network. For example, in a video-call scenario, the target image may also be obtained over a network and shown in a second image display window.
Way three: obtain a locally stored target image. For example, in a video-call scenario, a locally stored target image may be obtained and shown in the second image display window.
It should be noted that the second image display window includes one or at least two image display windows.
Specifically, after the target image is obtained, the method further includes: determining a second target object in the target image. The second target object includes but is not limited to: face image information, animal image information, fixed-object image information, and the like. The display direction of the target image relative to the first target object can then be determined according to the deflection angle of the second target object relative to the first target object.
Step 13: adjust the display direction of the target image according to the deflection angle of the second target object in the target image relative to the first target object.
Preferably, the first target object is a first face image, and the second target object is a second face image. In this embodiment, both target objects are illustrated as face images. As one implementation, step 13 includes:
determining a second deflection angle of the second target object relative to a target reference feature, and a first deflection angle of the first target object relative to the target reference feature.
The target reference feature includes but is not limited to: a reference direction, a preset image region, or a preset reference line (for example, a reference line parallel or perpendicular to the transverse/longitudinal direction of the terminal display).
Specifically, a second face image outline is shown in the second image display window of the terminal display interface, and the second deflection angle of the target image relative to the target reference feature is determined by comparing the target outline features of the second target object with the corresponding features of the second face image outline. Likewise, a first face image outline may be shown in the first image display window of the terminal display interface, and the first deflection angle of the first target object relative to the first face image outline is determined by comparing the target outline features of the first target object with the corresponding features of the first face image outline.
The deflection angle of the second target object in the target image relative to the first target object is then determined according to the first deflection angle and the second deflection angle.
Specifically, as one implementation, if the first deflection angle and the second deflection angle are deflections in the same direction relative to the target reference feature, the difference between them (or the absolute value of the difference) is taken as the deflection angle of the second target object relative to the first target object. If they are deflections in opposite directions relative to the target reference feature, their sum is taken as that deflection angle.
As another implementation, a deflection of a target object (the first or the second target object) clockwise relative to the target reference feature is marked as a positive angle (covering both the first and the second deflection angle), and a counterclockwise deflection is marked as a negative angle; the absolute value of the difference between the first deflection angle and the second deflection angle is then taken as the deflection angle of the second target object relative to the first target object.
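The two sign-convention cases above can be sketched in a few lines. This is only an illustrative reading of the patent text, not its implementation; the function name is hypothetical. With clockwise deflections marked positive and counterclockwise negative, a single signed difference reproduces both rules: same-direction deflections subtract, opposite-direction deflections add in magnitude.

```python
def relative_deflection(first_deg, second_deg):
    """Deflection of the second target object relative to the first.

    Sign convention from the text: clockwise relative to the target
    reference feature is positive, counterclockwise is negative.
    """
    # Same-direction case: e.g. 10 and 30 clockwise -> difference 20.
    # Opposite-direction case: e.g. -10 and 30 -> magnitudes add to 40.
    return second_deg - first_deg
```

The magnitude `abs(relative_deflection(...))` matches the "absolute value of the difference" wording; the sign additionally tells which way to rotate.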
This scheme compares the deflection angles of the first target object and the second target object relative to the target reference feature separately, so that the display of the first target object can be adjusted when it deflects relative to the target reference feature, or the target image sent by the call peer can be adjusted when it deflects relative to the target reference feature. The scheme therefore has higher flexibility.
The display direction of the target image is then adjusted according to the deflection angle.
Specifically, when the deflection angle exceeds a preset angle, the display direction of the target image is adjusted according to the deflection angle. Further, the display direction of the target image is adjusted to a target display direction according to the deflection angle, such that when the target image is shown in the target display direction, the deflection angle of the second target object in the target image relative to the first target object falls within a preset angle range, preferably -5° to 5°.
In this way, by adjusting the display of the target image so that the deflection angle of the second target object relative to the first target object falls within the preset range, a display effect consistent with the display direction of the first target object can be achieved. For example, during a video call, this avoids the situation where deflection or shaking of the call peer makes the target image it sends jitter when shown on the terminal side, causing dizziness and degrading the video-call experience.
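The threshold-gated adjustment described above can be sketched as follows; this is a minimal reading under the stated assumptions (preset angle 5°, rotation chosen to cancel the deflection entirely), and the function name is hypothetical, not from the patent.

```python
def adjusted_rotation(deflection_deg, preset_deg=5.0):
    """Rotation (degrees) to apply to the target image.

    No adjustment when the deflection is already within the preset
    angle range; otherwise rotate by the negative of the deflection,
    leaving the residual deviation at 0, inside -5..5 degrees.
    """
    if abs(deflection_deg) <= preset_deg:
        return 0.0
    return -deflection_deg
```

Returning the full negative deflection is one choice; any rotation leaving the residual within the preset range would satisfy the text.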
In the above scheme, a first target object is acquired and a target image is obtained; the display direction of the target image is then adjusted according to the deflection angle of the second target object in the target image relative to the first target object. When the second target object has a display-angle deviation relative to the first target object, the display of the target image is adjusted to eliminate that deviation, thereby improving the image display effect.
Referring to Fig. 2, an embodiment of the present invention further provides an image display method, including:
Step 21: acquire a first target object.
In this embodiment, acquiring the first target object specifically includes: obtaining first-target-object information collected by the terminal, and determining the first target object in that information. As an application scenario, during a video call the first-target-object information may be shown in a first image display window.
Specifically, the first-target-object information collected by the terminal may be obtained directly through the terminal's camera; the information includes but is not limited to: face image information, animal image information, fixed-object image information, and the like. Alternatively, a first video frame collected by the terminal's camera may be obtained, or the first-target-object information or the first video frame may be obtained from an image capture device (for example, an external camera).
Step 22: obtain a target image.
In this embodiment, the target image may be obtained in, but not limited to, the following ways:
Way one: receive a target image sent by another terminal, for example, a target image sent by the call peer with which the terminal has established a video call. Specifically, the target image sent by the call peer may be image information or a second video frame collected by the peer's camera, or image information or a second video frame collected by the peer through another image capture device; the invention is not limited in this respect.
Way two: obtain the target image over a network. For example, in a video-call scenario, the target image may also be obtained over a network and shown in a second image display window.
Way three: obtain a locally stored target image. For example, in a video-call scenario, a locally stored target image may be obtained and shown in the second image display window.
It should be noted that the second image display window includes one or at least two image display windows.
Specifically, after the target image is obtained, the method further includes: determining a second target object in the target image. The second target object includes but is not limited to: a face image, an animal image, a fixed-object image, and the like. The display direction of the target image relative to the first target object can then be determined according to the deflection angle of the second target object relative to the first target object.
Step 23: determine a first target direction of the first target object and a second target direction of the second target object.
Specifically, step 23 includes: determining the first target direction according to the position relationship between a first target feature and a second target feature of the first target object; and determining the second target direction according to the position relationship between the first target feature and the second target feature of the second target object.
In this embodiment, the first target object and the second target object are taken as face images. It should be noted that the first target feature and the second target feature are not one or more particular pixels of the first or second target object, but two different facial features in a face image, such as: the jaw, the forehead, the lips, the nose, the left eye, the right eye, the left eyebrow, the right eyebrow, the left ear, the right ear, and so on.
Specifically, when determining the first target direction according to the position relationship between the first and second target features of the first target object, or the second target direction according to the position relationship between the first and second target features of the second target object, one or more pixels of the first target feature may be chosen as first position information (for example, the geometric center of the feature), and one or more pixels of the second target feature as second position information; the direction from the first position information to the second position information in the first target object is then taken as the first target direction, and the direction from the first position information to the second position information in the second target object as the second target direction. Of course, the direction from the second position information to the first position information may instead be taken as the first target direction in the first target object and as the second target direction in the second target object.
In this scheme, the same facial features are chosen in the first target object and the second target object respectively, which ensures that the target directions determined from those features are comparable, and thus that the computed deflection angle is accurate.
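The "geometric center" choice of position information above can be sketched as a small helper; this is an illustrative sketch, not the patent's implementation, and the function names and the use of centroids for both features are assumptions.

```python
def centroid(points):
    """Geometric center of the chosen pixels of one facial feature."""
    xs = [x for x, _ in points]
    ys = [y for _, y in points]
    return (sum(xs) / len(xs), sum(ys) / len(ys))

def target_direction(first_feature_pts, second_feature_pts):
    """Vector from the first position information (e.g. jaw centroid)
    to the second position information (e.g. forehead centroid)."""
    x1, y1 = centroid(first_feature_pts)
    x2, y2 = centroid(second_feature_pts)
    return (x2 - x1, y2 - y1)
```

Applying the same pair of features in both video frames yields two directly comparable direction vectors, as the text requires.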
The scheme is described below with reference to a specific application scenario:
Fig. 3 shows an example of the display interface when a terminal conducts a video call. Preferably, the first target feature and the second target feature are the jaw and the forehead respectively; specifically, the first target feature is one of the jaw and the forehead of a face, and the second target feature is the other.
For example, in the first video frame 31 collected by the terminal, the first jaw 312 of the first target object 310 is taken as the first target feature and the first forehead 311 as the second target feature, and the direction from the first jaw 312 to the first forehead 311 is determined as the first target direction 313. In the second video frame 32 sent by the call peer, the second jaw 322 of the second target object 320 is taken as the first target feature and the second forehead 321 as the second target feature, and the direction from the second jaw 322 to the second forehead 321 is determined as the second target direction 323.
In this embodiment, choosing the jaw and the forehead as target features to determine the first target direction of the first target object and the second target direction of the second target object overcomes the influence of face-shape differences between users, ensures that the determined target directions are more accurate, and thus ensures the precision of the display adjustment of the target image.
Step 24: determine the deflection angle of the first target object relative to the second target object according to the angle between the first target direction and the second target direction.
In this embodiment, after the first target direction 313 of the first target object and the second target direction 323 of the second target object are determined, the starting point of the first target direction 313 is overlapped with the starting point of the second target direction 323, as in Fig. 4; the included angle M between the first target direction 313 and the second target direction 323 is determined, and the angle M is taken as the deflection angle of the first target object relative to the second target object. This way of determining the deflection angle is not only simple but also reduces error.
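Overlapping the starting points of the two direction vectors and measuring the included angle M is a standard two-argument arctangent computation; the sketch below is one possible reading, with a hypothetical function name and a chosen normalization, not the patent's own formula.

```python
import math

def included_angle_deg(d1, d2):
    """Signed included angle M from direction d1 to direction d2.

    Each direction is an (x, y) vector; the result is normalized
    to the half-open range [-180, 180) degrees.
    """
    a = math.degrees(math.atan2(d2[1], d2[0]) - math.atan2(d1[1], d1[0]))
    return (a + 180.0) % 360.0 - 180.0
```

Feeding in the first target direction 313 and the second target direction 323 would yield M directly, sign included, so the same value also tells which way to rotate in step 25.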
Step 25: adjust the display direction of the target image according to the deflection angle.
Specifically, step 25 includes: when the deflection angle exceeds a preset angle, adjusting the display direction of the target image according to the deflection angle. Further, the display direction of the target image is adjusted to a target display direction according to the deflection angle, such that when the target image is shown in the target display direction, the deflection angle of the second target object in the target image relative to the first target object falls within a preset angle range, preferably -5° to 5°.
As in Fig. 5, the second video frame 32 is rotated counterclockwise by (M-5)° to (M+5)° to reach the target position. Fig. 6 shows an example of the angular relationship between the rotated second target direction 323 and the first target direction 313. In this way, by adjusting the display of the target image so that the deflection angle of the second target object relative to the first target object falls within the preset range, a display effect consistent with the display direction of the first target object can be achieved; for example, it avoids the situation in a video call where deflection or shaking of the call peer makes the target image it sends jitter when shown on the terminal side, causing dizziness and degrading the video-call experience.
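Rotating the second video frame to its target position amounts, per pixel, to rotating coordinates about a center; the sketch below shows that geometric core under the assumption of a counterclockwise-positive angle. It is illustrative only; a real terminal would use its graphics pipeline rather than per-point math.

```python
import math

def rotate_point(x, y, cx, cy, deg):
    """Rotate (x, y) about the center (cx, cy) by deg degrees
    counterclockwise -- the per-pixel form of rotating the second
    video frame 32 toward the target display direction."""
    r = math.radians(deg)
    dx, dy = x - cx, y - cy
    return (cx + dx * math.cos(r) - dy * math.sin(r),
            cy + dx * math.sin(r) + dy * math.cos(r))
```

Applying this with any angle in the (M-5)° to (M+5)° range from the text would leave the residual deflection within the preset ±5° window.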
In the above scheme, a first target object is acquired and a target image is obtained; the display direction of the target image is then adjusted according to the deflection angle of the second target object in the target image relative to the first target object. When the second target object has a display-angle deviation relative to the first target object, the display of the target image is adjusted to eliminate that deviation, thereby improving the image display effect.
Referring to Fig. 7, an embodiment of the present invention further provides a terminal 700, including:
a collection module 710, configured to acquire a first target object;
an acquisition module 720, configured to obtain a target image;
a first processing module 730, configured to adjust the display direction of the target image according to the deflection angle of a second target object in the target image relative to the first target object.
The first target object is a first face image; the second target object is a second face image.
The terminal 700 further includes:
a determining module, configured to determine a first target direction of the first target object and a second target direction of the second target object;
a second processing module, configured to determine the deflection angle of the first target object relative to the second target object according to the angle between the first target direction and the second target direction.
The determining module includes:
a first determining unit, configured to determine the first target direction according to the position relationship between the first target feature and the second target feature of the first target object;
a second determining unit, configured to determine the second target direction according to the position relationship between the first target feature and the second target feature of the second target object.
The first target feature and the second target feature are the jaw and the forehead respectively.
The first processing module 730 includes:
a first processing unit, configured to adjust the display direction of the target image according to the deflection angle when the deflection angle exceeds a preset angle;
a second processing unit, configured to adjust the display direction of the target image to a target display direction according to the deflection angle, where, when the target image is shown in the target display direction, the deflection angle of the second target object in the target image relative to the first target object falls within a preset angle range.
The terminal provided in the embodiment of the present invention can implement each process implemented by the terminal in the method embodiments of Fig. 1 and Fig. 2; to avoid repetition, details are not repeated here.
The terminal 700 in the above scheme acquires a first target object, obtains a target image, and adjusts the display direction of the target image according to the deflection angle of the second target object in the target image relative to the first target object. When the second target object has a display-angle deviation relative to the first target object, the display of the target image is adjusted to eliminate that deviation, thereby improving the image display effect.
Fig. 8 is a hardware structure diagram of a mobile terminal implementing each embodiment of the present invention.
The mobile terminal 900 includes but is not limited to: a radio frequency unit 901, a network module 902, an audio output unit 903, an input unit 904, a sensor 905, a display unit 906, a user input unit 907, an interface unit 908, a memory 909, a processor 910, a power supply 911, and other components. Those skilled in the art will understand that the mobile terminal structure shown in Fig. 8 does not constitute a limitation on the mobile terminal; the mobile terminal may include more or fewer components than illustrated, combine certain components, or arrange components differently. In the embodiments of the present invention, mobile terminals include but are not limited to mobile phones, tablet computers, laptops, palmtop computers, vehicle-mounted terminals, wearable devices, pedometers, and the like.
The processor 910 is configured to acquire a first target object, obtain a target image, and adjust the display direction of the target image according to the deflection angle of a second target object in the target image relative to the first target object.
The mobile terminal 900 in the above scheme acquires a first target object, obtains a target image, and adjusts the display direction of the target image according to the deflection angle of the second target object in the target image relative to the first target object. When the second target object in the target image has a display-angle deviation relative to the first target object, the display of the target image is adjusted to eliminate that deviation, thereby improving the image display effect.
It should be understood that, in the embodiments of the present invention, the radio frequency unit 901 may be used to receive and send signals during information transmission and reception or during a call; specifically, downlink data from a base station is received and passed to the processor 910 for processing, and uplink data is sent to the base station. In general, the radio frequency unit 901 includes but is not limited to an antenna, at least one amplifier, a transceiver, a coupler, a low-noise amplifier, a duplexer, and the like. In addition, the radio frequency unit 901 may also communicate with a network and other devices through a wireless communication system.
The mobile terminal provides the user with wireless broadband internet access through the network module 902, for example helping the user to send and receive e-mails, browse web pages, and access streaming video.
The audio output unit 903 may convert audio data received by the radio frequency unit 901 or the network module 902, or stored in the memory 909, into an audio signal and output it as sound. Moreover, the audio output unit 903 may also provide audio output related to a specific function performed by the mobile terminal 900 (for example, a call-signal reception sound or a message reception sound). The audio output unit 903 includes a loudspeaker, a buzzer, a receiver, and the like.
The input unit 904 is used to receive audio or video signals. The input unit 904 may include a graphics processing unit (Graphics Processing Unit, GPU) 9041 and a microphone 9042. The graphics processor 9041 processes image data of still pictures or video obtained by an image capture apparatus (such as a camera) in a video capture mode or an image capture mode. The processed image frames may be displayed on the display unit 906, stored in the memory 909 (or another storage medium), or sent via the radio frequency unit 901 or the network module 902. The microphone 9042 can receive sound and process it into audio data; in a telephone call mode, the processed audio data may be converted into a format that can be sent to a mobile communication base station via the radio frequency unit 901 and output.
The mobile terminal 900 further includes at least one sensor 905, such as an optical sensor, a motion sensor, and other sensors. Specifically, the optical sensor includes an ambient light sensor and a proximity sensor. The ambient light sensor can adjust the brightness of the display panel 9061 according to the brightness of ambient light, and the proximity sensor can turn off the display panel 9061 and/or the backlight when the mobile terminal 900 is moved close to the ear. As one kind of motion sensor, an accelerometer can detect the magnitude of acceleration in each direction (generally three axes) and can detect the magnitude and direction of gravity when the terminal is static; it can be used to identify the posture of the mobile terminal (for example, landscape/portrait switching, related games, and magnetometer pose calibration) and for vibration-recognition functions (such as a pedometer or tap detection). The sensors 905 may further include a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer, an infrared sensor, and the like, which are not described in detail here.
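As an illustrative aside (not part of the claims), the landscape/portrait identification mentioned above can be sketched from the gravity components an accelerometer reports; the axis convention below (x toward the device's right edge, y toward its top edge, gravity pulling y negative in upright portrait) is an assumption for illustration:

```python
def device_posture(gx, gy):
    # Classify posture from the gravity components (m/s^2) along the
    # device's x and y axes; the dominant component tells us which
    # edge of the device currently points downward.
    if abs(gy) >= abs(gx):
        return "portrait" if gy <= 0 else "portrait-upside-down"
    return "landscape-left" if gx <= 0 else "landscape-right"
```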
The display unit 906 is configured to display information input by the user or information provided to the user. The display unit 906 may include a display panel 9061, which may be configured in the form of a liquid crystal display (Liquid Crystal Display, LCD), an organic light-emitting diode (Organic Light-Emitting Diode, OLED) display, or the like.
The user input unit 907 may be configured to receive input numeric or character information and to generate key signal input related to user settings and function control of the mobile terminal. Specifically, the user input unit 907 includes a touch panel 9071 and other input devices 9072. The touch panel 9071, also referred to as a touch screen, can collect touch operations performed by the user on or near it (for example, operations performed by the user on or near the touch panel 9071 using a finger, a stylus, or any other suitable object or accessory). The touch panel 9071 may include two parts: a touch detection apparatus and a touch controller. The touch detection apparatus detects the touch position of the user, detects the signal generated by the touch operation, and transmits the signal to the touch controller; the touch controller receives touch information from the touch detection apparatus, converts it into contact coordinates, sends them to the processor 910, and receives and executes commands sent by the processor 910. In addition, the touch panel 9071 may be implemented in multiple types, such as resistive, capacitive, infrared, and surface acoustic wave. Besides the touch panel 9071, the user input unit 907 may also include other input devices 9072. Specifically, the other input devices 9072 may include, but are not limited to, a physical keyboard, function keys (such as a volume control key and a switch key), a trackball, a mouse, and a joystick, which are not described in detail here.
Further, the touch panel 9071 may cover the display panel 9061. After detecting a touch operation on or near it, the touch panel 9071 transmits the operation to the processor 910 to determine the type of the touch event; the processor 910 then provides corresponding visual output on the display panel 9061 according to the type of the touch event. Although in FIG. 8 the touch panel 9071 and the display panel 9061 are two independent components implementing the input and output functions of the mobile terminal, in some embodiments the touch panel 9071 and the display panel 9061 may be integrated to implement those input and output functions; this is not specifically limited here.
The interface unit 908 is an interface through which an external apparatus is connected to the mobile terminal 900. For example, the external apparatus may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting an apparatus having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 908 may be configured to receive input (for example, data information or electric power) from an external apparatus and transmit the received input to one or more elements in the mobile terminal 900, or may be configured to transmit data between the mobile terminal 900 and an external apparatus.
The memory 909 may be configured to store software programs and various data. The memory 909 may mainly include a program storage area and a data storage area. The program storage area may store an operating system, an application required by at least one function (such as a sound playback function or an image playback function), and the like; the data storage area may store data created according to use of the mobile phone (such as audio data and a phone book). In addition, the memory 909 may include a high-speed random access memory, and may also include a non-volatile memory, for example, at least one magnetic disk storage device, a flash memory device, or another volatile solid-state storage device.
The processor 910 is the control center of the mobile terminal. It connects all parts of the entire mobile terminal through various interfaces and lines, and performs the various functions of the mobile terminal and processes data by running or executing the software programs and/or modules stored in the memory 909 and invoking the data stored in the memory 909, thereby monitoring the mobile terminal as a whole. The processor 910 may include one or more processing units. Preferably, the processor 910 may integrate an application processor and a modem processor, where the application processor mainly handles the operating system, user interface, applications, and the like, and the modem processor mainly handles wireless communication. It can be understood that the modem processor may alternatively not be integrated into the processor 910.
The mobile terminal 900 may further include a power supply 911 (such as a battery) that supplies power to each component. Preferably, the power supply 911 may be logically connected to the processor 910 through a power management system, so as to implement functions such as charge management, discharge management, and power consumption management through the power management system.
In addition, the mobile terminal 900 includes some function modules that are not shown, which are not described in detail here.
Preferably, an embodiment of the present invention further provides a mobile terminal, including a processor 910, a memory 909, and a computer program stored in the memory 909 and executable on the processor 910. When executed by the processor 910, the computer program implements each process of the foregoing image display method embodiments and can achieve the same technical effect; to avoid repetition, details are not described here again.
An embodiment of the present invention further provides a computer-readable storage medium storing a computer program. When executed by a processor, the computer program implements each process of the foregoing image display method embodiments and can achieve the same technical effect; to avoid repetition, details are not described here again. The computer-readable storage medium is, for example, a read-only memory (Read-Only Memory, ROM), a random access memory (Random Access Memory, RAM), a magnetic disk, or an optical disc.
It should be noted that, in this specification, the terms "include", "comprise", or any other variant thereof are intended to cover a non-exclusive inclusion, so that a process, method, article, or apparatus that includes a series of elements includes not only those elements but also other elements not expressly listed, or also includes elements inherent to such a process, method, article, or apparatus. In the absence of more restrictions, an element limited by the phrase "including a ..." does not exclude the existence of other identical elements in the process, method, article, or apparatus that includes that element.
Through the foregoing description of the embodiments, a person skilled in the art can clearly understand that the methods in the foregoing embodiments may be implemented by software plus a necessary general hardware platform, or certainly by hardware, though in many cases the former is the better implementation. Based on such an understanding, the technical solutions of the present invention essentially, or the part contributing to the prior art, may be embodied in the form of a software product. The computer software product is stored in a storage medium (such as a ROM/RAM, a magnetic disk, or an optical disc) and includes several instructions for instructing a terminal (which may be a mobile phone, a computer, a server, an air conditioner, a network device, or the like) to perform the methods described in the embodiments of the present invention.
The embodiments of the present invention have been described above with reference to the accompanying drawings, but the present invention is not limited to the foregoing specific implementations. The foregoing specific implementations are merely illustrative rather than restrictive. Inspired by the present invention, a person of ordinary skill in the art may devise many other forms without departing from the purpose of the present invention and the scope of protection of the claims, all of which fall within the protection of the present invention.
Claims (14)
1. An image display method, applied to a terminal, wherein the method comprises:
capturing a first target object;
obtaining a target image; and
adjusting a display direction of the target image according to a deflection angle of a second target object in the target image relative to the first target object.
2. The image display method according to claim 1, wherein the first target object is a first face image and the second target object is a second face image.
3. The image display method according to claim 1, wherein before the adjusting a display direction of the target image according to a deflection angle of a second target object in the target image relative to the first target object, the method further comprises:
determining a first target direction of the first target object and a second target direction of the second target object; and
determining the deflection angle of the second target object relative to the first target object according to an angle between the first target direction and the second target direction.
4. The image display method according to claim 3, wherein the determining a first target direction of the first target object and a second target direction of the second target object comprises:
determining the first target direction according to a position relationship between a first target feature and a second target feature in the first target object; and
determining the second target direction according to a position relationship between the first target feature and the second target feature in the second target object.
5. The image display method according to claim 4, wherein the first target feature and the second target feature are a jaw and a forehead, respectively.
6. The image display method according to claim 1, wherein the adjusting a display direction of the target image according to a deflection angle of a second target object in the target image relative to the first target object comprises:
adjusting the display direction of the target image according to the deflection angle when the deflection angle is greater than a preset angle.
7. The image display method according to claim 1, wherein the adjusting a display direction of the target image according to a deflection angle of a second target object in the target image relative to the first target object comprises:
adjusting, according to the deflection angle, the display direction of the target image to a target display direction;
wherein, when the target image is displayed in the target display direction, the deflection angle of the second target object in the target image relative to the first target object is within a preset angle range.
8. A terminal, comprising:
a capture module, configured to capture a first target object;
an obtaining module, configured to obtain a target image; and
a first processing module, configured to adjust a display direction of the target image according to a deflection angle of a second target object in the target image relative to the first target object.
9. The terminal according to claim 8, wherein the terminal further comprises:
a determining module, configured to determine a first target direction of the first target object and a second target direction of the second target object; and
a second processing module, configured to determine the deflection angle of the second target object relative to the first target object according to an angle between the first target direction and the second target direction.
10. The terminal according to claim 9, wherein the determining module comprises:
a first determination unit, configured to determine the first target direction according to a position relationship between a first target feature and a second target feature in the first target object; and
a second determination unit, configured to determine the second target direction according to a position relationship between the first target feature and the second target feature in the second target object.
11. The terminal according to claim 8, wherein the first processing module comprises:
a first processing unit, configured to adjust the display direction of the target image according to the deflection angle when the deflection angle is greater than a preset angle.
12. The terminal according to claim 8, wherein the first processing module comprises:
a second processing unit, configured to adjust, according to the deflection angle, the display direction of the target image to a target display direction;
wherein, when the target image is displayed in the target display direction, the deflection angle of the second target object in the target image relative to the first target object is within a preset angle range.
13. A terminal, comprising a processor, a memory, and a computer program stored in the memory and executable on the processor, wherein, when executed by the processor, the computer program implements the steps of the image display method according to any one of claims 1 to 7.
14. A computer-readable storage medium storing a computer program, wherein, when executed by a processor, the computer program implements the steps of the image display method according to any one of claims 1 to 7.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810310804.XA CN108536371A (en) | 2018-03-30 | 2018-03-30 | A kind of method for displaying image and terminal |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810310804.XA CN108536371A (en) | 2018-03-30 | 2018-03-30 | A kind of method for displaying image and terminal |
Publications (1)
Publication Number | Publication Date |
---|---|
CN108536371A true CN108536371A (en) | 2018-09-14 |
Family
ID=63483416
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201810310804.XA Pending CN108536371A (en) | 2018-03-30 | 2018-03-30 | A kind of method for displaying image and terminal |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN108536371A (en) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110807729A (en) * | 2019-10-30 | 2020-02-18 | 口碑(上海)信息技术有限公司 | Image data processing method and device |
CN110807729B (en) * | 2019-10-30 | 2023-06-23 | 口碑(上海)信息技术有限公司 | Image data processing method and device |
CN112437243A (en) * | 2020-11-20 | 2021-03-02 | 联想(北京)有限公司 | Output control method and device and electronic equipment |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106303029A (en) * | 2016-08-15 | 2017-01-04 | 广东欧珀移动通信有限公司 | The method of controlling rotation of a kind of picture, device and mobile terminal |
CN107357540A (en) * | 2017-06-30 | 2017-11-17 | 维沃移动通信有限公司 | The method of adjustment and mobile terminal of a kind of display direction |
CN107809594A (en) * | 2017-11-10 | 2018-03-16 | 维沃移动通信有限公司 | A kind of image pickup method and mobile terminal |
CN107833178A (en) * | 2017-11-24 | 2018-03-23 | 维沃移动通信有限公司 | A kind of image processing method, device and mobile terminal |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN107957839A (en) | A kind of display control method and mobile terminal | |
CN107682634A (en) | A kind of facial image acquisition methods and mobile terminal | |
CN110465080A (en) | Control method, apparatus, mobile terminal and the computer readable storage medium of vibration | |
CN107786811B (en) | A kind of photographic method and mobile terminal | |
CN109685915A (en) | A kind of image processing method, device and mobile terminal | |
CN111031253B (en) | Shooting method and electronic equipment | |
CN107843993A (en) | A kind of control method and mobile terminal of display screen visible angle | |
CN111031234B (en) | Image processing method and electronic equipment | |
CN107831891A (en) | A kind of brightness adjusting method and mobile terminal | |
CN109327668A (en) | A kind of method for processing video frequency and device | |
CN109065060A (en) | A kind of voice awakening method and terminal | |
CN109658886A (en) | A kind of control method and terminal of display screen | |
CN111314616A (en) | Image acquisition method, electronic device, medium and wearable device | |
CN110018805A (en) | A kind of display control method and mobile terminal | |
CN109542572A (en) | A kind of interface display method and mobile terminal | |
CN109688325A (en) | A kind of image display method and terminal device | |
CN109858447A (en) | A kind of information processing method and terminal | |
CN109443261A (en) | The acquisition methods and mobile terminal of Folding screen mobile terminal folding angles | |
CN109859718A (en) | Screen brightness regulation method and terminal device | |
CN109800606A (en) | A kind of display control method and mobile terminal | |
CN109729336A (en) | A kind of display methods and device of video image | |
CN108289186A (en) | A kind of video image method of adjustment, mobile terminal | |
CN108536371A (en) | A kind of method for displaying image and terminal | |
CN108632535A (en) | A kind of image processing method and mobile terminal | |
CN109063620A (en) | A kind of personal identification method and terminal device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| RJ01 | Rejection of invention patent application after publication | Application publication date: 20180914 |