CN103517060B - Display control method and device for a terminal device - Google Patents

Display control method and device for a terminal device

Info

Publication number
CN103517060B
CN103517060B (application CN201310396448.5A)
Authority
CN
China
Prior art keywords
observation
point
screen
eye
human eye
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201310396448.5A
Other languages
Chinese (zh)
Other versions
CN103517060A (en)
Inventor
夏璐
王佳佳
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Spreadtrum Communications Shanghai Co Ltd
Original Assignee
Spreadtrum Communications Shanghai Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Spreadtrum Communications Shanghai Co Ltd filed Critical Spreadtrum Communications Shanghai Co Ltd
Priority to CN201310396448.5A priority Critical patent/CN103517060B/en
Publication of CN103517060A publication Critical patent/CN103517060A/en
Application granted granted Critical
Publication of CN103517060B publication Critical patent/CN103517060B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Controls And Circuits For Display Device (AREA)

Abstract

A display control method and device for a terminal device. The method comprises: determining the actual position of a human eye; determining parameters of an observation point based on the actual position of the human eye, the parameters comprising at least one of a spatial position and a viewing angle; and updating, based on the parameters of the observation point, the 3D image displayed on the screen of the terminal device. The technical solution of the present invention achieves a 3D interface display with viewing-angle conversion.

Description

Display control method and device for a terminal device
Technical field
The present invention relates to the field of image display technology, and in particular to a display control method and device for a terminal device.
Background art
With the development of three-dimensional (3D) image display technology, terminal devices of various 3D types, such as 3D televisions, 3D tablets and 3D mobile phones, have appeared on the market. These 3D terminal devices display 3D images based on naked-eye (glasses-free) 3D display technology.
When a person looks at an object, the images formed on the retinas of the two eyes are not exactly the same: the left eye sees more of the left side of the object and the right eye sees more of the right side, and after the brain combines the two images a stereoscopic perception of the object is produced. Naked-eye 3D display technology exploits exactly this principle to present stereoscopic images, by fusing and displaying together two images taken from different angles, one mainly intended for the left eye and one for the right eye.
As is known, when shooting a 3D film, a left camera and a right camera are used to capture the different images that an object presents to the left eye and the right eye respectively; when the captured 3D content is displayed, the images shot by the two cameras are fused and displayed together. With parallax-barrier naked-eye 3D technology, the observer's left eye sees the image shot by the left camera and the right eye sees the image shot by the right camera, so the observer perceives a 3D display effect. However, when the observer watches such a 3D image, the displayed image does not change with the observer's viewing position or viewing angle: after the observer changes posture, the content of the displayed 3D image remains the same. The prior art has no scheme for realizing a 3D interface display whose viewing angle changes as the observer's position and angle of observation change.
For related art, reference may be made to the US patent application with publication number US2013207968A1.
Summary of the invention
The problem solved by the present invention is how to realize a 3D interface display with viewing-angle conversion.
To solve this problem, the present invention provides a display control method for a terminal device, the method comprising:
determining the actual position of a human eye;
determining parameters of an observation point based on the actual position of the human eye, the parameters comprising at least one of a spatial position and a viewing angle;
updating, based on the parameters of the observation point, the 3D image displayed on the screen of the terminal device.
Optionally, determining the actual position of the human eye comprises: determining the actual position of the human eye through recognition of the iris of the human eye.
Optionally, the observation point comprises a left observation point and a right observation point.
Optionally, determining the actual position of the human eye comprises:
obtaining the distance H1 between the actual position and the screen according to H1 = H0/(M1/M0), where H0 is the distance between a reference position and the screen, M0 is the major-axis length of the iris region measured when the human eye directly faces the screen at the reference position, and M1 is the major-axis length of the iris region measured when the human eye is at the actual position.
Optionally, determining the actual position of the human eye comprises:
when the human eye is at the actual position, measuring the distance A1 between the center point of the left-eye iris region and a longitudinal central axis, and measuring the distance A2 between the center point of the right-eye iris region and the longitudinal central axis, the left eye and the right eye being symmetric with respect to the longitudinal central axis when the human eye directly faces the screen at the reference position.
Optionally, determining the parameters of the observation point based on the actual position of the human eye comprises:
when the human eye is at the actual position, measuring the semi-minor-axis length W1 and the semi-major-axis length L1 of the left-eye iris region, and measuring the semi-minor-axis length W2 and the semi-major-axis length L2 of the right-eye iris region;
obtaining the viewing angle a1 of the left observation point according to a1 = arccos(W1/L1), the left eye and the right eye being symmetric with respect to the longitudinal central axis when the human eye directly faces the screen at the reference position;
obtaining the viewing angle a2 of the right observation point according to a2 = arccos(W2/L2).
Optionally, the distance between the reference position and the screen is a distance suitable for the human eye to observe the terminal device.
Optionally, the measuring refers to measuring an eye image captured by an imaging device, the imaging device being arranged above the terminal device at a position facing the human eye, and the longitudinal center line of the eye image being the longitudinal central axis.
Optionally, determining the parameters of the observation point based on the actual position of the human eye comprises:
determining the position Z1 of the left observation point in the z-axis direction according to Z1 = Z_L × (H1/H0), where Z_L is the position of the left observation point in the z-axis direction when the human eye directly faces the screen at the reference position, and the z-axis direction is the direction perpendicular to the screen;
determining the position Z2 of the right observation point in the z-axis direction according to Z2 = Z_R × (H1/H0), where Z_R is the position of the right observation point in the z-axis direction when the human eye directly faces the screen at the reference position.
Optionally, determining the parameters of the observation point based on the actual position of the human eye comprises:
determining the position X1 of the left observation point in the x-axis direction according to X1 = X_L × (A1/A0), where X_L is the position of the left observation point in the x-axis direction when the human eye directly faces the screen at the reference position, A0 is the distance between the center point of a single-eye iris region and the longitudinal central axis measured when the human eye directly faces the screen at the reference position, and the x-axis direction is the horizontal direction of the screen;
determining the position X2 of the right observation point in the x-axis direction according to X2 = X_R × (A2/A0), where X_R is the position of the right observation point in the x-axis direction when the human eye directly faces the screen at the reference position.
Optionally, the parameters further comprise an observation direction, and determining the parameters of the observation point based on the actual position of the human eye further comprises:
setting the spatial position of a focus point;
determining the direction from the left observation point to the focus point as the observation direction of the left observation point;
determining the direction from the right observation point to the focus point as the observation direction of the right observation point.
Optionally, the terminal device has a built-in lens corresponding to the left eye and a built-in lens corresponding to the right eye, and the observation point comprises a left observation point and a right observation point;
updating, based on the parameters of the observation point, the 3D image displayed on the screen comprises:
adjusting the lens corresponding to the left eye according to the parameters of the left observation point;
adjusting the lens corresponding to the right eye according to the parameters of the right observation point;
obtaining a 3D image from the image captured by the adjusted lens corresponding to the left eye and the image captured by the adjusted lens corresponding to the right eye;
displaying the obtained 3D image on the screen.
Optionally, the terminal device has a built-in lens array, and the observation point comprises a left observation point and a right observation point;
updating, based on the parameters of the observation point, the 3D image displayed on the screen comprises:
setting the lens in the lens array that matches the parameters of the left observation point as the left lens;
setting the lens in the lens array that matches the parameters of the right observation point as the right lens;
obtaining a 3D image from the image captured by the left lens and the image captured by the right lens;
displaying the obtained 3D image on the screen.
Optionally, the terminal device stores images corresponding to parameters of observation points, and the observation point comprises a left observation point and a right observation point;
updating, based on the parameters of the observation point, the 3D image displayed on the screen comprises:
obtaining a 3D image from the image corresponding to the parameters of the left observation point and the image corresponding to the parameters of the right observation point;
displaying the obtained 3D image on the screen.
The technical solution of the present invention also provides a display control device for a terminal device, comprising:
a position determination unit adapted to determine the actual position of a human eye;
a parameter determination unit adapted to determine parameters of an observation point based on the actual position of the human eye, the parameters comprising at least one of a spatial position and a viewing angle;
an updating unit adapted to update, based on the parameters of the observation point, the 3D image displayed on the screen of the terminal device.
Compared with the prior art, the technical solution of the present invention has the following advantages:
The spatial position and viewing-angle information of the observation point are determined from the actual position of the human eye, and the 3D image displayed on the screen of the terminal device is then updated based on that information. This achieves a 3D interface display with viewing-angle conversion, allowing the user to see different 3D images from different viewing angles. The method is simple and effective, gives the user a better 3D viewing effect, and improves the user experience.
Brief description of the drawings
Fig. 1 is a flowchart of the display control method for a terminal device provided by the technical solution of the present invention;
Fig. 2 is a flowchart of the display control method for a terminal device provided by embodiment one of the present invention;
Fig. 3 is a flowchart of the display control method for a terminal device provided by embodiment two of the present invention.
Detailed description of embodiments
As stated in the background section, in the prior art, when an observer watches a 3D image, the displayed 3D image does not change with the observer's viewing position or viewing angle: after the observer changes posture, the content of the displayed 3D image remains the same. The 3D interface shown on the screen is normally the scene in 3D space as observed from an observation point. The observation point is divided into a left observation point and a right observation point, corresponding to the left eye and the right eye of the human eye respectively. The observer watches with the left eye the image determined by the position and angle of the left observation point, and with the right eye the image determined by the position and angle of the right observation point. Since the images captured at the left and right observation points correspond to different left and right lenses, when the observer fuses the different images seen by the left eye and the right eye, a 3D display effect is perceived.
In 3D graphics, an observation point may also be called an imaging device (such as a camera) or an eye, and the position of the observation point is usually the position of the center of the lens or of the eye. The focal point of the lens, or the point the eye focuses on, is called the focus of the observation point. The direction of the line from the observation point to the focus is the main-axis direction of the observation point, also called the observation direction. If a virtual focus is set in a virtual scene, the position of the observation point can be set accordingly so that the focus of the observation point coincides with the virtual focus.
However, in the prior art the position and angle of the observation point are usually fixed: when the screen displays a 3D interface, the position and angle of the observation point do not change, so only a fixed-direction visual display can be realized, and the position and angle of the observation point cannot be adjusted as the observer's actual position or line of sight changes. Yet, assuming the screen position is unchanged, different spatial positions of the observation point should yield different observed scenes, and different viewing angles of the observation point should likewise yield different observed scenes. The position and angle of the observation point should be settable to any position in the spatial scene according to the observer's position and line of sight, so a fixed observation-point position and angle, as in the prior art, can no longer meet the requirements of 3D interface display.
In order to realize a 3D interface display with viewing-angle conversion, the technical solution of the present invention provides a display control method for a terminal device. The method determines the spatial position and viewing-angle information of the observation point from the actual position of the human eye, and then updates the 3D image displayed on the screen of the terminal device based on the spatial position and viewing-angle information of the observation point.
Fig. 1 is a flowchart of the display control method for a terminal device provided by the technical solution of the present invention. As shown in Fig. 1, step S101 is performed first: determining the actual position of the human eye.
For the 3D interface display to change with the observer's viewing position and angle, the current position of the observer's eyes must first be determined, i.e., the actual position of the human eye is determined.
Various human-eye detection and recognition techniques exist in the prior art, for example detection methods based on the eyeball or pupil, or detection methods based on structural features of the eye, as well as various eye-locating methods such as region division, edge extraction, gray-level projection, neural networks and template matching. The size of the pupil changes with lighting conditions, but the size of the iris is constant. Comparatively speaking, in images captured for eye detection, the size and shape of the iris region are affected mainly by the distance and angle of the eye relative to the camera, and are less affected by the environment. For example, if the iris region in a captured photo becomes smaller, the eye is moving away from the screen; otherwise it is approaching. Therefore, in the present invention, iris recognition technology may be used to determine the actual position of the human eye.
A front-facing camera provided on the terminal device can capture images of the observer's face in real time while the observer watches the screen. The captured images are then recognized using iris recognition technology, and the current actual position of the human eye is determined from the change of the observer's iris between the reference position and the current position. The reference position is a position where the observer directly faces the screen and the distance between the observer's eyes and the screen is suitable for observing the terminal device; for example, if the terminal device is a mobile phone, this suitable distance is 30 to 50 centimeters. "Directly facing the screen" means that, in the image captured by the front-facing camera of the terminal device, the positions of the left eye and the right eye are symmetric with respect to the longitudinal center line of the captured image; in the technical solution of the present invention, this longitudinal center line is called the longitudinal central axis.
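As a rough illustration of how the front-facing camera frame could feed the iris-based measurements used below, the following sketch locates the two eye regions with OpenCV's bundled Haar cascade and measures their offsets from the longitudinal central axis. The Haar-cascade approach, the function name eye_measurements, and the use of the eye-box width as a crude stand-in for the iris major axis are illustrative assumptions, not the implementation prescribed by the patent.

```python
# Sketch only: locate eye regions in a front-camera frame and take the
# measurements discussed above (offsets from the longitudinal central axis,
# plus a crude proxy for the iris major-axis length).
import cv2

def eye_measurements(frame_bgr):
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_eye.xml")
    boxes = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(boxes) < 2:
        return None  # both eyes must be visible
    # Keep the two largest detections and order them left/right in the image.
    boxes = sorted(boxes, key=lambda b: b[2] * b[3], reverse=True)[:2]
    left_box, right_box = sorted(boxes, key=lambda b: b[0])
    axis_x = frame_bgr.shape[1] / 2.0  # longitudinal central axis

    def center_x(b):
        return b[0] + b[2] / 2.0

    return {
        "A_left": abs(center_x(left_box) - axis_x),    # ~A1 in pixels
        "A_right": abs(center_x(right_box) - axis_x),  # ~A2 in pixels
        "M_left": float(left_box[2]),   # rough proxy for the iris major axis
        "M_right": float(right_box[2]),
    }
```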
In the process of determining the actual position of the human eye, quantities such as the distance between the eye and the screen and the distance by which the eye deviates from the longitudinal central axis can be determined.
Step S102 is performed: determining parameters of the observation point based on the actual position of the human eye, the parameters comprising at least one of a spatial position and a viewing angle.
The spatial position of the observation point can be represented by its coordinates in a spatial xyz coordinate system, where the x-axis is the horizontal direction of the screen with its positive direction pointing horizontally to the right of the screen; the y-axis is the vertical direction of the screen with its positive direction pointing straight up; and the z-axis is perpendicular to the screen (the xy plane) with its positive direction pointing into the screen. The origin of this coordinate system can be the center of the display screen. When an observer watches a 3D interface, the eyes mostly move forward, backward, left and right, so the embodiments of the present invention focus on determining the spatial position of the observation point along the x-axis and z-axis and do not consider the effect of changes along the y-axis on the 3D interface display. Those skilled in the art can determine the position of the observation point in the y-axis direction by analogy with the methods for the x-axis and z-axis directions provided in the embodiments.
In this step, the spatial position of the observation point can be determined from the eye-to-screen distance and the eye's deviation from the longitudinal central axis determined in step S101. For example, the position of the observation point in the z-axis direction can be adjusted according to the distance between the eye and the screen, and its position in the x-axis direction can be adjusted according to the eye's deviation from the longitudinal central axis.
The viewing angle of the observation point, i.e., its observation angle, can be understood as the angle between the observation direction (the orientation of the observation point) and each coordinate axis. It is generally assumed that when viewing a screen the eye moves only left and right; movement in other directions is not considered in this technical solution, so only the angle between the observation direction and the x direction is considered here.
When the eye moves left or right, the angle between the observation direction and the horizontal direction can be determined from the ratio of the semi-minor-axis length to the semi-major-axis length of the iris region. Once the focus point in space is determined, the viewing angle can also be obtained by detecting the observation direction.
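A minimal numeric sketch of this relationship, assuming the semi-axis lengths have already been measured in pixels: the foreshortening of the roughly circular iris into an ellipse gives the deflection angle as the arccosine of the axis ratio, as in formulas (6) and (7) below.

```python
import math

def viewing_angle_deg(semi_minor, semi_major):
    """Viewing angle from the foreshortened iris ellipse: a = arccos(W / L)."""
    ratio = max(0.0, min(1.0, semi_minor / semi_major))
    return math.degrees(math.acos(ratio))

# Example: a left-eye iris measured as W1 = 9.5 px, L1 = 11 px.
print(round(viewing_angle_deg(9.5, 11.0), 1))  # ~30.3 degrees
```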
Step S103 is performed: updating, based on the parameters of the observation point, the 3D image displayed on the screen of the terminal device.
In general, the data sources of the original 3D image data used to display a 3D interface can be divided into graphics-computed data sources and recorded data sources. A graphics-computed data source means that the terminal device can compute, from the lens information provided, the 3D image corresponding to that lens position; for example, in a 3D game, various 3D interfaces can be computed in real time. A recorded data source refers to data captured according to the information of each lens in a lens array. The lens information includes the lens position, the shooting angle of the lens and so on, and the lenses may include a lens corresponding to the left eye and a lens corresponding to the right eye.
Based on these different data sources of 3D image data, the display of the 3D image can be realized in different ways. For example, with a graphics-computed data source, the terminal device can be regarded as containing virtual lenses: different 3D images are computed for different lens information, the lenses are adjusted according to the parameters of the observation points, and the 3D image for display is then obtained from the images captured (computed) by the adjusted lenses. If the 3D image data comes from a recorded data source, the lens information matching the parameters of the observation points can be found in the lens array, and the data captured with that lens information, i.e., the recorded 3D image, is displayed on the screen. In addition, if the terminal device stores images corresponding to parameters of observation points, the 3D image corresponding to the parameters of the observation points can be displayed on the screen directly.
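The following dispatcher sketches these three cases side by side; the callables render_view and pick_recorded_view and the stored_views mapping are hypothetical placeholders, not interfaces defined by the patent.

```python
# Sketch of selecting a per-eye image from the three data-source types
# discussed above (computed, recorded, stored).
def frame_for_eye(source, eye_params, render_view, pick_recorded_view, stored_views):
    if source == "computed":
        # Graphics-computed source: render the scene for this lens pose.
        return render_view(position=eye_params["position"],
                           angle=eye_params.get("angle"))
    if source == "recorded":
        # Recorded source: pick the lens-array view whose lens information
        # best matches the observation-point parameters.
        return pick_recorded_view(eye_params)
    if source == "stored":
        # Pre-stored images keyed directly by observation-point parameters.
        return stored_views[eye_params["key"]]
    raise ValueError("unknown 3D image data source: %r" % source)
```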
By the above method of determining the spatial position and viewing-angle information of the observation point from the actual position of the human eye, and then updating the 3D image displayed on the screen of the terminal device based on that information, different 3D interfaces can be displayed according to differences in the spatial position, viewing angle or observation direction of the observation point, realizing a 3D interface display with viewing-angle conversion.
It should be understood that, since the determined actual position of the human eye is the relative position between the observation position and the screen of the terminal device, the technical solution of the present invention applies not only to the case where the screen moves (e.g., back and forth or left and right) while the observation position is unchanged, but also to the case where the screen is stationary while the observation position changes (e.g., the eye moves back and forth or rotates left and right), and equally to the case where both the screen and the observation position change. In addition, the observation point above may be a single observation point from which the 3D image is observed, or it may be divided into a left observation point and a right observation point whose observed images are fused into a 3D image. To make the above objectives, features and advantages of the present invention clearer, the technical solution is described in detail below, with reference to the accompanying drawings and embodiments, for the case where the observation point is divided into a left observation point and a right observation point.
Embodiment one
In this embodiment, the implementation process of determining the spatial position of the observation point from the actual position of the human eye, and then updating the 3D image displayed on the screen of the terminal device based on the spatial position of the observation point, is described.
Fig. 2 is a flowchart of the display control method for a terminal device provided by embodiment one of the present invention.
Step S201 is performed first: recognizing the iris of the human eye at the reference position and at the actual position respectively.
In this embodiment, the terminal device is a mobile phone by way of example. The mobile phone has a front-facing camera located at the center above the screen. The reference position is the position where the observer directly faces the phone screen and the observer's eyes are 30 centimeters from the screen. When the human eye directly faces the screen at the reference position, the position of the left observation point corresponding to the left eye is known to be (X_L, 0, Z_L), and the position of the right observation point corresponding to the right eye is (X_R, 0, Z_R).
The images captured by the front-facing camera at the reference position and at the actual position are recognized using iris recognition technology. For step S201, reference may be made to step S101.
Step S202 is performed: measuring the major-axis length of the iris region when the human eye is at the reference position and at the actual position respectively.
Based on the recognized eye images obtained in step S201 at the reference position and at the actual position, the major-axis length M0 of the iris region in the image obtained at the reference position and the major-axis length M1 of the iris region in the image obtained at the actual position are measured. The major-axis length is obtained here so that the change of the observation point's position in the z-axis direction can be determined from the size of the iris region, i.e., from the change of its major-axis length; the iris region here may be that of either the left eye or the right eye.
Step S203 is performed: obtaining the distance between the reference position and the screen and the distance between the actual position and the screen respectively.
In this embodiment, the distance H0 between the reference position and the screen may be 30 centimeters, as described in step S201. The distance H1 between the actual position and the screen can be obtained by formula (1):
H1 = H0/(M1/M0)    (1)
where H1 is the distance between the actual position and the screen, H0 is the distance between the reference position and the screen, M0 is the major-axis length of the iris region measured when the human eye directly faces the screen at the reference position, and M1 is the major-axis length of the iris region measured when the human eye is at the actual position.
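A minimal numeric sketch of formula (1), assuming the iris major axis is measured in pixels at the reference and actual positions:

```python
def distance_to_screen(H0_cm, M0_px, M1_px):
    """Formula (1): H1 = H0 / (M1 / M0). A smaller iris image (M1 < M0)
    means the eye has moved farther from the screen."""
    return H0_cm / (M1_px / M0_px)

# Example: at the 30 cm reference position the iris major axis spans 20 px;
# it now spans 15 px, so the eye is roughly 40 cm from the screen.
print(distance_to_screen(30.0, 20.0, 15.0))  # 40.0
```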
Step S204 is performed: measuring, with the human eye at the actual position, the distances between the center point of the left-eye iris region and a longitudinal central axis and between the center point of the right-eye iris region and the longitudinal central axis, and measuring the distance between the human eye and the longitudinal central axis at the reference position.
Based on the recognized eye images obtained in step S201 at the reference position and at the actual position, the distance A1 between the center point of the left-eye iris region and the longitudinal central axis at the actual position and the distance A2 between the center point of the right-eye iris region and the longitudinal central axis at the actual position are measured. Since the human eye at the reference position directly faces the screen, the positions of the left eye and the right eye are symmetric with respect to the longitudinal center line of the captured image, so the left eye and the right eye are equidistant from the longitudinal central axis at the reference position; therefore, when measuring the distance A0 between the human eye and the longitudinal central axis at the reference position, the distance between the center point of either the left-eye or the right-eye iris region and the longitudinal central axis may be measured.
Step S205 is performed: determining the position of the left observation point in the z-axis direction.
Based on the distances H0 and H1 between the reference position and the screen and between the actual position and the screen obtained in step S203, and the position Z_L of the left observation point in the z-axis direction when the human eye directly faces the screen at the reference position, the position Z1 of the left observation point in the z-axis direction is obtained by formula (2):
Z1 = Z_L × (H1/H0)    (2)
Step S206 is performed: determining the position of the right observation point in the z-axis direction.
Based on the distances H0 and H1 between the reference position and the screen and between the actual position and the screen obtained in step S203, and the position Z_R of the right observation point in the z-axis direction when the human eye directly faces the screen at the reference position, the position Z2 of the right observation point in the z-axis direction is obtained by formula (3):
Z2 = Z_R × (H1/H0)    (3)
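Formulas (2) and (3) amount to scaling the reference z positions by the ratio of the two eye-to-screen distances; a small sketch with hypothetical values:

```python
def z_positions(Z_L, Z_R, H0, H1):
    """Formulas (2) and (3): Z1 = Z_L * (H1/H0), Z2 = Z_R * (H1/H0)."""
    scale = H1 / H0
    return Z_L * scale, Z_R * scale

# Example: reference observation points at z = -10 (scene units), and the
# eye has moved from 30 cm to 40 cm away from the screen.
print(z_positions(-10.0, -10.0, 30.0, 40.0))  # (-13.33..., -13.33...)
```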
Step S207 is performed: determining the position of the left observation point in the x-axis direction.
Based on the distance A0 between the human eye and the longitudinal central axis at the reference position and the distance A1 between the center point of the left-eye iris region and the longitudinal central axis at the actual position, both measured in step S204, and the position X_L of the left observation point in the x-axis direction when the human eye directly faces the screen at the reference position, the position X1 of the left observation point in the x-axis direction is determined by formula (4):
X1 = X_L × (A1/A0)    (4)
Step S208 is performed: determining the position of the right observation point in the x-axis direction.
Based on the distance A0 between the human eye and the longitudinal central axis at the reference position and the distance A2 between the center point of the right-eye iris region and the longitudinal central axis at the actual position, both measured in step S204, and the position X_R of the right observation point in the x-axis direction when the human eye directly faces the screen at the reference position, the position X2 of the right observation point in the x-axis direction is determined by formula (5):
X2 = X_R × (A2/A0)    (5)
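Likewise, formulas (4) and (5) scale the reference x positions by how far each iris center now sits from the longitudinal central axis relative to the reference offset A0; a sketch with hypothetical pixel offsets:

```python
def x_positions(X_L, X_R, A0, A1, A2):
    """Formulas (4) and (5): X1 = X_L * (A1/A0), X2 = X_R * (A2/A0)."""
    return X_L * (A1 / A0), X_R * (A2 / A0)

# Example: at the reference position each iris center is 40 px from the
# axis; after a head shift the left iris is 60 px away and the right 20 px.
print(x_positions(-3.0, 3.0, 40.0, 60.0, 20.0))  # (-4.5, 1.5)
```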
Step S209 is performed: adjusting the lens corresponding to the left eye according to the position of the left observation point.
The lens corresponding to the left eye is adjusted according to the position Z1 of the left observation point in the z-axis direction determined in step S205 and the position X1 of the left observation point in the x-axis direction determined in step S207. The position of the lens corresponding to the left eye is (X1, 0, Z1).
Step S210 is performed: adjusting the lens corresponding to the right eye according to the position of the right observation point.
The lens corresponding to the right eye is adjusted according to the position Z2 of the right observation point in the z-axis direction determined in step S206 and the position X2 of the right observation point in the x-axis direction determined in step S208. The position of the lens corresponding to the right eye is (X2, 0, Z2).
Step S211 is performed: obtaining a 3D image from the adjusted lens corresponding to the left eye and the adjusted lens corresponding to the right eye.
Depending on the data source of the 3D image data, the 3D image of the lens corresponding to the left eye and the 3D image of the lens corresponding to the right eye are obtained according to the adjusted left-eye and right-eye lens information. Reference may be made to step S103.
Step S212 is performed: displaying the obtained 3D image on the screen.
The 3D image of the lens corresponding to the left eye and the 3D image of the lens corresponding to the right eye obtained in step S211 are displayed on the screen.
It should be noted that this embodiment covers the case where the eye-to-screen distance changes and the eye views the screen from the side: the position of the observation point in the z-axis direction is determined by calculating the actual distance between the eye and the screen, and its position in the x-axis direction is determined by calculating how far the left and right eyes deviate from the longitudinal central axis. It should be understood that, in other embodiments, when the eye-to-screen distance is unchanged but the eye views the screen from the side, only the position of the observation point in the x-axis direction may be determined; and when the eye directly faces the screen but its distance from the screen changes, only the position of the observation point in the z-axis direction may be determined.
With the method provided by this embodiment, the spatial position of the observation point can be determined from the actual position of the human eye, the lens information corresponding to the human eye can then be determined from the spatial position of the observation point, and the 3D image displayed on the screen of the terminal device can then be updated, realizing a 3D interface display with viewing-angle conversion.
Embodiment two
In this embodiment, the implementation process of determining the spatial position and viewing angle of the observation point from the actual position of the human eye, and then updating the 3D image displayed on the screen of the terminal device based on the spatial position and viewing angle of the observation point, is described.
Fig. 3 is a flowchart of the display control method for a terminal device provided by embodiment two of the present invention.
As shown in Fig. 3, step S301 is performed first. Step S301 corresponds to steps S201 to S204 in Fig. 2: it recognizes the iris of the human eye at the reference position and at the actual position respectively, and obtains the measured and calculated values used for determining the spatial position of the observation point. For details, reference may be made to steps S201 to S204.
Step S302 is performed: determining the spatial positions of the left observation point and the right observation point.
The positions of the left observation point in the x-axis and z-axis directions and the positions of the right observation point in the x-axis and z-axis directions are determined. Reference may be made to steps S205 to S208.
Step S303 is performed: measuring the semi-minor-axis and semi-major-axis lengths of the left-eye iris region and of the right-eye iris region when the human eye is at the actual position.
Based on the recognized eye images at the reference position and at the actual position, the semi-minor-axis length W1 and the semi-major-axis length L1 of the left-eye iris region at the actual position, and the semi-minor-axis length W2 and the semi-major-axis length L2 of the right-eye iris region at the actual position, are measured.
Step S304 is performed: determining the viewing angle of the left observation point.
When the eye moves left or right, the angle between the orientation of the observation point and the horizontal direction can be determined from the ratio of the semi-minor-axis length to the semi-major-axis length of the iris region.
Based on the semi-minor-axis length W1 and the semi-major-axis length L1 of the left-eye iris region at the actual position obtained in step S303, the viewing angle a1 of the left observation point can be determined by formula (6):
a1 = arccos(W1/L1)    (6)
Step S305 is performed: determining the viewing angle of the right observation point.
Based on the semi-minor-axis length W2 and the semi-major-axis length L2 of the right-eye iris region at the actual position obtained in step S303, the viewing angle a2 of the right observation point can be determined by formula (7):
a2 = arccos(W2/L2)    (7)
Step S306 is performed: adjusting the lens corresponding to the left eye according to the spatial position and viewing angle of the left observation point.
The lens corresponding to the left eye is adjusted according to the determined spatial position of the left observation point and the viewing angle of the left observation point determined in step S304. The lens corresponding to the left eye is placed at (X1, 0, Z1) and rotated so that the line from the lens to its focus (the observation direction) has a deflection angle a1 relative to the horizontal direction.
Step S307 is performed: adjusting the lens corresponding to the right eye according to the spatial position and viewing angle of the right observation point.
The lens corresponding to the right eye is adjusted according to the determined spatial position of the right observation point and the viewing angle of the right observation point determined in step S305. The lens corresponding to the right eye is placed at (X2, 0, Z2) and rotated so that the line from the lens to its focus (the observation direction) has a deflection angle a2 relative to the horizontal direction.
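A geometric sketch of this adjustment, assuming the rotation happens in the horizontal xz plane and taking the deflection as measured from the screen-facing (+z) direction, which is one possible reading of the horizontal deflection described above; the lens is represented simply as a position plus a unit observation-direction vector.

```python
import math

def place_lens(x, z, deflection_deg):
    """Position a virtual lens at (x, 0, z) and orient its observation
    direction in the xz plane, deflected by the given angle."""
    a = math.radians(deflection_deg)
    direction = (math.sin(a), 0.0, math.cos(a))  # unit vector in the xz plane
    return {"position": (x, 0.0, z), "direction": direction}

# Example: left lens at (X1, 0, Z1) = (-4.5, 0, -13.3), deflected by a1 = 30 deg.
left_lens = place_lens(-4.5, -13.3, 30.0)
print(left_lens["direction"])  # ~(0.5, 0.0, 0.866)
```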
Step S308 is performed: obtaining a 3D image from the adjusted lens corresponding to the left eye and the adjusted lens corresponding to the right eye.
Depending on the data source of the 3D image data, the 3D image of the lens corresponding to the left eye and the 3D image of the lens corresponding to the right eye are obtained according to the adjusted left-eye and right-eye lens information. Reference may be made to step S103.
Step S309 is performed: displaying the obtained 3D image on the screen.
The 3D image of the lens corresponding to the left eye and the 3D image of the lens corresponding to the right eye obtained in step S308 are displayed on the screen.
With the method provided by this embodiment, the spatial position and viewing angle of the observation point can be determined from the actual position of the human eye, the lens information can then be determined from that spatial position and viewing angle, and the position and orientation of the lens corresponding to the human eye can be effectively controlled, realizing a 3D interface display with viewing-angle conversion.
It should be noted that, in other embodiments, only the viewing angle of the observation point may be determined from the actual position of the human eye; the lens information corresponding to the human eye is then determined from the viewing angle of the observation point, which likewise realizes a 3D interface display with viewing-angle conversion.
In addition, in other embodiments, when the position of the focus needs to be changed, for example when the user wishes to move the focus from one point on the screen to another, the observation direction of the left observation point and the observation direction of the right observation point can be re-determined according to the repositioned focus after the spatial positions of the observation points have been determined. Specifically, the spatial focus corresponding to the repositioned focus on the screen is set first (i.e., the spatial position of the focus point is set); then the direction from the left observation point to the focus is determined as the observation direction of the left observation point, and the direction from the right observation point to the focus is determined as the observation direction of the right observation point.
Then, the lens corresponding to the left eye is adjusted according to the determined observation direction of the left observation point, and the lens corresponding to the right eye is adjusted according to the determined observation direction of the right observation point. Specifically, the spatial position of the lens corresponding to the left eye is kept unchanged and the lens is rotated about its center until its main axis coincides with the observation direction of the left observation point; likewise, the spatial position of the lens corresponding to the right eye is kept unchanged and the lens is rotated about its center until its main axis coincides with the observation direction of the right observation point.
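A small sketch of re-aiming each lens at a repositioned focus, with positions as plain 3D tuples; only the observation direction changes while the lens position stays fixed, matching the description above.

```python
import math

def aim_at_focus(lens_position, focus_position):
    """Unit observation direction from an observation point (lens center)
    toward the spatial focus; the lens position itself is left unchanged."""
    dx, dy, dz = (f - p for f, p in zip(focus_position, lens_position))
    norm = math.sqrt(dx * dx + dy * dy + dz * dz)
    return (dx / norm, dy / norm, dz / norm)

# Example: the user drags the focus to a point behind the screen center.
focus = (0.0, 0.0, 5.0)
print(aim_at_focus((-4.5, 0.0, -13.3), focus))  # new left observation direction
print(aim_at_focus((1.5, 0.0, -13.3), focus))   # new right observation direction
```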
Corresponding to the above display control method for a terminal device, the technical solution of the present invention also provides a display control device for a terminal device, comprising a position determination unit, a parameter determination unit and an updating unit.
The position determination unit is adapted to determine the actual position of a human eye.
The parameter determination unit is adapted to determine parameters of an observation point based on the actual position of the human eye, the parameters comprising at least one of a spatial position and a viewing angle.
The updating unit is adapted to update, based on the parameters of the observation point, the 3D image displayed on the screen of the terminal device.
In embodiments of the present invention, the position determination unit determines the actual position of the human eye through recognition of the iris of the human eye.
Specifically, the position determination unit may comprise an obtaining unit, or a first measuring unit, or both an obtaining unit and a first measuring unit.
The obtaining unit is adapted to obtain the distance H1 between the actual position and the screen according to H1 = H0/(M1/M0), where H0 is the distance between the reference position and the screen, M0 is the major-axis length of the iris region measured when the human eye directly faces the screen at the reference position, and M1 is the major-axis length of the iris region measured when the human eye is at the actual position.
The first measuring unit is adapted to measure, when the human eye is at the actual position, the distance A1 between the center point of the left-eye iris region and the longitudinal central axis and the distance A2 between the center point of the right-eye iris region and the longitudinal central axis, the left eye and the right eye being symmetric with respect to the longitudinal central axis when the human eye directly faces the screen at the reference position.
Correspondingly, the parameter determination unit may comprise a first position determining unit and a second position determining unit, or a third position determining unit and a fourth position determining unit, or the first, second, third and fourth position determining units.
The first position determining unit is adapted to determine the position Z1 of the left observation point in the z-axis direction according to Z1 = Z_L × (H1/H0), where Z_L is the position of the left observation point in the z-axis direction when the human eye directly faces the screen at the reference position, the z-axis direction is the direction perpendicular to the xy plane, the x-axis direction is the horizontal direction of the screen, and the y-axis direction is the vertical direction of the screen.
The second position determining unit is adapted to determine the position Z2 of the right observation point in the z-axis direction according to Z2 = Z_R × (H1/H0), where Z_R is the position of the right observation point in the z-axis direction when the human eye directly faces the screen at the reference position.
The third position determining unit is adapted to determine the position X1 of the left observation point in the x-axis direction according to X1 = X_L × (A1/A0), where X_L is the position of the left observation point in the x-axis direction when the human eye directly faces the screen at the reference position, A0 is the distance between the center point of a single-eye iris region and the longitudinal central axis measured when the human eye directly faces the screen at the reference position, and the x-axis direction is the horizontal direction of the screen.
The fourth position determining unit is adapted to determine the position X2 of the right observation point in the x-axis direction according to X2 = X_R × (A2/A0), where X_R is the position of the right observation point in the x-axis direction when the human eye directly faces the screen at the reference position.
Further, the parameter determination unit may also comprise a direction determining unit, which comprises: a focus setting unit adapted to set the spatial position of a focus point; a first direction determining unit adapted to determine the direction from the left observation point to the focus point as the observation direction of the left observation point; and a second direction determining unit adapted to determine the direction from the right observation point to the focus point as the observation direction of the right observation point.
The parameter determination unit may also comprise a second measuring unit, a first angle determining unit and a second angle determining unit.
The second measuring unit is adapted to measure, when the human eye is at the actual position, the semi-minor-axis length W1 and the semi-major-axis length L1 of the left-eye iris region, and the semi-minor-axis length W2 and the semi-major-axis length L2 of the right-eye iris region.
The first angle determining unit is adapted to obtain the viewing angle a1 of the left observation point according to a1 = arccos(W1/L1), the left eye and the right eye being symmetric with respect to the longitudinal central axis when the human eye directly faces the screen at the reference position.
The second angle determining unit is adapted to obtain the viewing angle a2 of the right observation point according to a2 = arccos(W2/L2).
In one embodiment of the present invention, the terminal device has a built-in lens corresponding to the left eye and a built-in lens corresponding to the right eye, and the observation point comprises a left observation point and a right observation point. The updating unit comprises: a first adjustment unit adapted to adjust the lens corresponding to the left eye according to the parameters of the left observation point; a second adjustment unit adapted to adjust the lens corresponding to the right eye according to the parameters of the right observation point; a first image obtaining unit adapted to obtain a 3D image from the image captured by the adjusted lens corresponding to the left eye and the image captured by the adjusted lens corresponding to the right eye; and a first display unit adapted to display the obtained 3D image on the screen.
In another embodiment of the present invention, the terminal device has a built-in lens array, and the observation point comprises a left observation point and a right observation point. The updating unit comprises: a first setting unit adapted to set the lens in the lens array that matches the parameters of the left observation point as the left lens; a second setting unit adapted to set the lens in the lens array that matches the parameters of the right observation point as the right lens; a second image obtaining unit adapted to obtain a 3D image from the image captured by the left lens and the image captured by the right lens; and a second display unit adapted to display the obtained 3D image on the screen.
In yet another embodiment of the present invention, the terminal device stores images corresponding to parameters of observation points, and the observation point comprises a left observation point and a right observation point. The updating unit comprises: a third obtaining unit adapted to obtain a 3D image from the image corresponding to the parameters of the left observation point and the image corresponding to the parameters of the right observation point; and a third display unit adapted to display the obtained 3D image on the screen.
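To make the unit decomposition above concrete, the following minimal sketch wires a position determination unit, a parameter determination unit and an updating unit together. All class and method names, and the numeric reference data, are illustrative assumptions rather than an implementation disclosed by the patent.

```python
from dataclasses import dataclass

@dataclass
class ObservationParams:
    position: tuple   # (x, y, z) of the observation point
    angle: float      # viewing angle in degrees (0.0 if unused)

class PositionDeterminationUnit:
    def determine(self, eye_image):
        # In a real device: iris recognition yielding H1, A1, A2, W, L, ...
        return {"H1": 40.0, "A1": 60.0, "A2": 20.0}

class ParameterDeterminationUnit:
    def determine(self, eye_position, reference):
        scale_z = eye_position["H1"] / reference["H0"]   # formulas (2)/(3)
        left = ObservationParams(
            (reference["XL"] * eye_position["A1"] / reference["A0"], 0.0,
             reference["ZL"] * scale_z), 0.0)
        right = ObservationParams(
            (reference["XR"] * eye_position["A2"] / reference["A0"], 0.0,
             reference["ZR"] * scale_z), 0.0)
        return left, right

class UpdatingUnit:
    def update(self, left, right):
        # Placeholder: a real unit would render/fetch the per-eye images and
        # refresh the naked-eye 3D screen with their fusion.
        return {"left_view": left, "right_view": right}

# Usage sketch with hypothetical reference data (H0 in cm, offsets in px):
reference = {"H0": 30.0, "A0": 40.0, "XL": -3.0, "XR": 3.0,
             "ZL": -10.0, "ZR": -10.0}
pos = PositionDeterminationUnit().determine(eye_image=None)
left, right = ParameterDeterminationUnit().determine(pos, reference)
print(UpdatingUnit().update(left, right))
```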
Although the present invention is disclosed as above, it is not limited thereto. Any person skilled in the art may make various changes and modifications without departing from the spirit and scope of the present invention, and therefore the protection scope of the present invention shall be subject to the scope defined by the claims.

Claims (20)

1. A display control method for a terminal device, characterized in that it comprises:
determining the actual position of a human eye;
determining parameters of an observation point based on the actual position of the human eye, the parameters comprising at least one of a spatial position and a viewing angle, and the observation point comprising a left observation point and a right observation point; wherein determining the spatial position of the observation point comprises: obtaining the major-axis length of the iris region of the human eye at a preset reference position and at the actual position, the distance between the reference position and the screen of the terminal device, and the distance between the actual position and the screen, to determine the respective positions of the left observation point and the right observation point in the z-axis direction; and measuring, with the human eye at the actual position and at the reference position respectively, the distances between the center point of the left-eye iris region and a longitudinal central axis and between the center point of the right-eye iris region and the longitudinal central axis, to determine the respective positions of the left observation point and the right observation point in the x-axis direction; and determining the viewing angle of the observation point comprises: obtaining the viewing angle a1 of the left observation point according to a1 = arccos(W1/L1), the left eye and the right eye being symmetric with respect to the longitudinal central axis when the human eye directly faces the screen at the reference position; and obtaining the viewing angle a2 of the right observation point according to a2 = arccos(W2/L2); where W1 is the semi-minor-axis length of the left-eye iris region, L1 is the semi-major-axis length of the left-eye iris region, W2 is the semi-minor-axis length of the right-eye iris region, L2 is the semi-major-axis length of the right-eye iris region, the x-axis is the horizontal direction of the screen, the y-axis is the vertical direction of the screen, and the z-axis is the direction perpendicular to the xy plane;
updating, based on the parameters of the observation point, the 3D image displayed on the screen of the terminal device.
2. The display control method for a terminal device according to claim 1, characterized in that determining the actual position of the human eye comprises: determining the actual position of the human eye through recognition of the iris of the human eye.
3. The display control method for a terminal device according to claim 1, characterized in that determining the actual position of the human eye comprises:
obtaining the distance H1 between the actual position and the screen according to H1 = H0/(M1/M0), where H0 is the distance between the reference position and the screen, M0 is the major-axis length of the iris region measured when the human eye directly faces the screen at the reference position, and M1 is the major-axis length of the iris region measured when the human eye is at the actual position.
4. The display control method for a terminal device according to claim 3, characterized in that the distance between the reference position and the screen is a distance suitable for the human eye to observe the terminal device.
5. The display control method for a terminal device according to claim 3, characterized in that the measuring refers to measuring an eye image captured by an imaging device, the imaging device being arranged above the terminal device at a position facing the human eye, and the longitudinal center line of the eye image being the longitudinal central axis.
6. the display control method of terminal equipment as claimed in claim 3, it is characterized in that, the parameter of the described physical location determination point of observation based on described human eye comprises:
According to Z1=Z l× (H1/H0) determines left point of observation position Z1 in the z-axis direction, Z lfor when human eye in described reference position just to left point of observation position in the z-axis direction described during described screen, described z-axis direction is the direction of vertical screen;
According to Z2=Z r× (H1/H0) determines right point of observation position Z2 in the z-axis direction, Z rfor when human eye in described reference position just to right point of observation position in the z-axis direction described during described screen.
7. The display control method of a terminal device according to claim 1, characterized in that determining the parameter of the observation point based on the physical position of the human eye comprises:
Determining the position X1 of the left observation point in the x-axis direction according to X1 = X_L × (A1/A0), where X_L is the position of the left observation point in the x-axis direction when the human eye is at the reference position and faces the screen, A1 is the distance between the center point of the left-eye iris region and the longitudinal central axis, A0 is the distance between the center point of a single-eye iris region and the longitudinal central axis measured when the human eye is at the reference position and faces the screen, and the x-axis direction is the horizontal direction of the screen;
Determining the position X2 of the right observation point in the x-axis direction according to X2 = X_R × (A2/A0), where A2 is the distance between the center point of the right-eye iris region and the longitudinal central axis, and X_R is the position of the right observation point in the x-axis direction when the human eye is at the reference position and faces the screen.
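The x-axis update in claim 7 likewise rescales the reference x positions by how far each iris center has drifted from the longitudinal central axis (A1/A0 and A2/A0). A minimal sketch with hypothetical values:

```python
def observation_point_x(x_ref: float, a_now: float, a_ref: float) -> float:
    """x-axis position of an observation point after lateral head movement.

    x_ref: the point's x position at the reference position.
    a_now: current distance (pixels) from the iris center to the image's
           longitudinal central axis (A1 or A2).
    a_ref: the same distance measured at the reference position (A0).
    """
    if a_ref == 0:
        raise ValueError("reference offset A0 must be non-zero")
    return x_ref * (a_now / a_ref)

# Hypothetical values: observation points 3.2 cm either side of the axis at the
# reference position, iris centers 80 px from the axis at calibration time.
X_L, X_R, A0 = -3.2, 3.2, 80.0
X1 = observation_point_x(X_L, a_now=96.0, a_ref=A0)   # left  -> -3.84
X2 = observation_point_x(X_R, a_now=64.0, a_ref=A0)   # right ->  2.56
```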
8. The display control method of a terminal device according to claim 6 or 7, characterized in that the parameter further comprises an observation direction, and determining the parameter of the observation point based on the physical position of the human eye further comprises:
Setting the spatial position of a focus point;
Determining the direction in which the left observation point points to the focus point as the observation direction of the left observation point;
Determining the direction in which the right observation point points to the focus point as the observation direction of the right observation point.
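In claim 8 the observation direction is simply the vector from each observation point toward a chosen focus point. A minimal sketch, assuming a focus at the screen center and hypothetical observation-point coordinates:

```python
import math

Point = tuple[float, float, float]

def observation_direction(point: Point, focus: Point) -> Point:
    """Unit vector pointing from an observation point toward the focus point."""
    dx, dy, dz = (f - p for f, p in zip(focus, point))
    norm = math.sqrt(dx * dx + dy * dy + dz * dz)
    if norm == 0:
        raise ValueError("observation point and focus coincide")
    return (dx / norm, dy / norm, dz / norm)

# Hypothetical scene: focus at the screen centre, observation points in front of it.
focus = (0.0, 0.0, 0.0)
left_dir = observation_direction((-3.84, 0.0, 37.5), focus)
right_dir = observation_direction((2.56, 0.0, 37.5), focus)
```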
9. The display control method of a terminal device according to claim 1, characterized in that the terminal device has a built-in camera lens corresponding to the left eye and a built-in camera lens corresponding to the right eye;
Updating the 3D image displayed on the screen based on the parameter of the observation point comprises:
Adjusting the camera lens corresponding to the left eye according to the parameter of the left observation point;
Adjusting the camera lens corresponding to the right eye according to the parameter of the right observation point;
Obtaining a 3D image from the image captured by the adjusted camera lens corresponding to the left eye and the image captured by the adjusted camera lens corresponding to the right eye;
Displaying the obtained 3D image on the screen.
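A heavily stubbed sketch of the update flow in claim 9; adjust_lens, capture, compose_3d, and show are placeholders for device-specific camera and display interfaces that the claim does not spell out, and the parameter values are hypothetical:

```python
# Hypothetical observation-point parameters (x, y, z in cm; angle in degrees),
# e.g. as produced by the formulas of claims 1, 3, 6 and 7 above.
left_params = {"position": (-3.84, 0.0, 37.5), "angle": 29.0}
right_params = {"position": (2.56, 0.0, 37.5), "angle": 33.6}

def adjust_lens(lens, position, angle):
    """Placeholder: aim/position one built-in camera lens per its observation point."""
    lens["position"], lens["angle"] = position, angle

def capture(lens):
    """Placeholder: return an image captured by the (adjusted) lens."""
    return {"captured_with": dict(lens)}

def compose_3d(left_image, right_image):
    """Placeholder: fuse the left-eye and right-eye views into one 3D frame."""
    return {"left": left_image, "right": right_image}

def show(frame):
    """Placeholder: push the composed 3D frame to the terminal device's screen."""
    print("displaying", frame)

left_lens, right_lens = {}, {}
adjust_lens(left_lens, **left_params)     # adjust the lens corresponding to the left eye
adjust_lens(right_lens, **right_params)   # adjust the lens corresponding to the right eye
show(compose_3d(capture(left_lens), capture(right_lens)))
```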
10. The display control method of a terminal device according to claim 1, characterized in that the terminal device has a built-in lens array;
Updating the 3D image displayed on the screen based on the parameter of the observation point comprises:
Setting, in the lens array, the camera lens matching the parameter of the left observation point as the left camera lens;
Setting, in the lens array, the camera lens matching the parameter of the right observation point as the right camera lens;
Obtaining a 3D image from the image captured by the left camera lens and the image captured by the right camera lens;
Displaying the obtained 3D image on the screen.
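A hypothetical sketch of the lens-array selection in claim 10, choosing the lens whose stored parameters lie nearest to each observation point's parameters (the array contents and the distance metric are assumptions):

```python
import math

# Hypothetical lens array: each entry records a lens's (x, z) placement in cm
# and its viewing angle in degrees.
LENS_ARRAY = [
    {"id": 0, "x": -4.0, "z": 35.0, "angle": 28.0},
    {"id": 1, "x": -2.0, "z": 35.0, "angle": 31.0},
    {"id": 2, "x":  2.0, "z": 35.0, "angle": 32.0},
    {"id": 3, "x":  4.0, "z": 35.0, "angle": 35.0},
]

def best_matching_lens(x, z, angle, array=LENS_ARRAY):
    """Return the lens whose stored parameters are closest to the observation point's."""
    return min(array,
               key=lambda lens: math.dist((lens["x"], lens["z"], lens["angle"]),
                                          (x, z, angle)))

left_lens = best_matching_lens(-3.84, 37.5, 29.0)   # used as the left camera lens
right_lens = best_matching_lens(2.56, 37.5, 33.6)   # used as the right camera lens
```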
11. The display control method of a terminal device according to claim 1, characterized in that images corresponding to parameters of observation points are stored in the terminal device;
Updating the 3D image displayed on the screen based on the parameter of the observation point comprises:
Obtaining a 3D image from the image corresponding to the parameter of the left observation point and the image corresponding to the parameter of the right observation point;
Displaying the obtained 3D image on the screen.
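A hypothetical sketch of the stored-image lookup in claim 11, keyed here by x position and viewing angle (the keys, file names, and nearest-key rule are assumptions):

```python
# Hypothetical pre-stored views, keyed by (x position in cm, viewing angle in degrees).
STORED_VIEWS = {
    (-4.0, 28.0): "view_left_28.png",
    (-2.0, 31.0): "view_left_31.png",
    (2.0, 32.0): "view_right_32.png",
    (4.0, 35.0): "view_right_35.png",
}

def stored_view(x, angle, views=STORED_VIEWS):
    """Return the stored image whose key lies nearest to the requested parameters."""
    key = min(views, key=lambda k: (k[0] - x) ** 2 + (k[1] - angle) ** 2)
    return views[key]

left_image = stored_view(-3.84, 29.0)    # image matching the left observation point
right_image = stored_view(2.56, 33.6)    # image matching the right observation point
# The two images are then fused into the 3D image shown on the screen.
```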
12. A display control apparatus of a terminal device, characterized by comprising:
A position determining unit, adapted to determine the physical position of the human eye;
A parameter determining unit, adapted to determine a parameter of an observation point based on the physical position of the human eye, the observation point comprising a left observation point and a right observation point, and the parameter comprising at least one of a spatial position and a viewing angle; the parameter determining unit comprises: a second measuring unit, adapted to measure, when the human eye is at the physical position, the semi-minor-axis length and the semi-major-axis length of the left-eye iris region and the semi-minor-axis length and the semi-major-axis length of the right-eye iris region;
A first angle determining unit, adapted to obtain the viewing angle a1 of the left observation point according to a1 = arccos(W1/L1), where W1 is the semi-minor-axis length of the left-eye iris region and L1 is the semi-major-axis length of the left-eye iris region;
A second angle determining unit, adapted to obtain the viewing angle a2 of the right observation point according to a2 = arccos(W2/L2), where W2 is the semi-minor-axis length of the right-eye iris region and L2 is the semi-major-axis length of the right-eye iris region;
A first position determining unit, adapted to determine the position of the left observation point in the z-axis direction according to the distances between the screen of the terminal device and a preset reference position and the physical position respectively, and the position of the left observation point in the z-axis direction when the human eye is at the reference position and faces the screen;
A second position determining unit, adapted to determine the position of the right observation point in the z-axis direction according to the distances between the screen of the terminal device and the reference position and the physical position respectively, and the position of the right observation point in the z-axis direction when the human eye is at the reference position and faces the screen;
A third position determining unit, adapted to determine the position of the left observation point in the x-axis direction according to the distance between the iris center and the longitudinal central axis when the human eye is at the reference position, the distance between the center point of the left-eye iris region and the longitudinal central axis when the human eye is at the physical position, and the position of the left observation point in the x-axis direction when the human eye is at the reference position and faces the screen;
A fourth position determining unit, adapted to determine the position of the right observation point in the x-axis direction according to the distance between the iris center and the longitudinal central axis when the human eye is at the reference position, the distance between the center point of the right-eye iris region and the longitudinal central axis when the human eye is at the physical position, and the position of the right observation point in the x-axis direction when the human eye is at the reference position and faces the screen; when the human eye is at the reference position, the left eye and the right eye are symmetric about the longitudinal central axis, the x-axis is the horizontal direction of the screen, the y-axis is the vertical direction of the screen, and the z-axis is perpendicular to the xy plane;
An updating unit, adapted to update the 3D image displayed on the screen of the terminal device based on the parameter of the observation point.
13. The display control apparatus of a terminal device according to claim 12, characterized in that the position determining unit determines the physical position of the human eye by recognizing the iris of the human eye.
14. The display control apparatus of a terminal device according to claim 12, characterized in that the position determining unit is adapted to obtain the distance H1 between the physical position and the screen according to H1 = H0/(M1/M0), where H0 is the distance between the reference position and the screen, M0 is the major-axis length of the iris region measured when the human eye is at the reference position and faces the screen, and M1 is the major-axis length of the iris region measured when the human eye is at the physical position.
15. The display control apparatus of a terminal device according to claim 14, characterized in that:
The first position determining unit is adapted to determine the position Z1 of the left observation point in the z-axis direction according to Z1 = Z_L × (H1/H0), where Z_L is the position of the left observation point in the z-axis direction when the human eye is at the reference position and faces the screen, and the z-axis direction is the direction perpendicular to the screen;
The second position determining unit is adapted to determine the position Z2 of the right observation point in the z-axis direction according to Z2 = Z_R × (H1/H0), where Z_R is the position of the right observation point in the z-axis direction when the human eye is at the reference position and faces the screen.
16. The display control apparatus of a terminal device according to claim 12, characterized in that:
The third position determining unit is adapted to determine the position X1 of the left observation point in the x-axis direction according to X1 = X_L × (A1/A0), where X_L is the position of the left observation point in the x-axis direction when the human eye is at the reference position and faces the screen, A1 is the distance between the center point of the left-eye iris region and the longitudinal central axis, A0 is the distance between the center point of a single-eye iris region and the longitudinal central axis measured when the human eye is at the reference position and faces the screen, and the x-axis direction is the horizontal direction of the screen;
The fourth position determining unit is adapted to determine the position X2 of the right observation point in the x-axis direction according to X2 = X_R × (A2/A0), where A2 is the distance between the center point of the right-eye iris region and the longitudinal central axis, and X_R is the position of the right observation point in the x-axis direction when the human eye is at the reference position and faces the screen.
17. The display control apparatus of a terminal device according to claim 15 or 16, characterized in that the parameter determining unit further comprises a direction determining unit, and the direction determining unit comprises:
A focus setting unit, adapted to set the spatial position of a focus point;
A first direction determining unit, adapted to determine the direction in which the left observation point points to the focus point as the observation direction of the left observation point;
A second direction determining unit, adapted to determine the direction in which the right observation point points to the focus point as the observation direction of the right observation point.
18. The display control apparatus of a terminal device according to claim 12, characterized in that the terminal device has a built-in camera lens corresponding to the left eye and a built-in camera lens corresponding to the right eye;
The updating unit comprises:
A first adjusting unit, adapted to adjust the camera lens corresponding to the left eye according to the parameter of the left observation point;
A second adjusting unit, adapted to adjust the camera lens corresponding to the right eye according to the parameter of the right observation point;
A first image obtaining unit, adapted to obtain a 3D image from the image captured by the adjusted camera lens corresponding to the left eye and the image captured by the adjusted camera lens corresponding to the right eye;
A first display unit, adapted to display the obtained 3D image on the screen.
19. The display control apparatus of a terminal device according to claim 12, characterized in that the terminal device has a built-in lens array;
The updating unit comprises:
A first setting unit, adapted to set, in the lens array, the camera lens matching the parameter of the left observation point as the left camera lens;
A second setting unit, adapted to set, in the lens array, the camera lens matching the parameter of the right observation point as the right camera lens;
A second image obtaining unit, adapted to obtain a 3D image from the image captured by the left camera lens and the image captured by the right camera lens;
A second display unit, adapted to display the obtained 3D image on the screen.
20. The display control apparatus of a terminal device according to claim 12, characterized in that images corresponding to parameters of observation points are stored in the terminal device;
The updating unit comprises:
A third obtaining unit, adapted to obtain a 3D image from the image corresponding to the parameter of the left observation point and the image corresponding to the parameter of the right observation point;
A third display unit, adapted to display the obtained 3D image on the screen.
CN201310396448.5A 2013-09-03 2013-09-03 A kind of display control method of terminal equipment and device Active CN103517060B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201310396448.5A CN103517060B (en) 2013-09-03 2013-09-03 A kind of display control method of terminal equipment and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201310396448.5A CN103517060B (en) 2013-09-03 2013-09-03 A kind of display control method of terminal equipment and device

Publications (2)

Publication Number Publication Date
CN103517060A CN103517060A (en) 2014-01-15
CN103517060B true CN103517060B (en) 2016-03-02

Family

ID=49898977

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201310396448.5A Active CN103517060B (en) 2013-09-03 2013-09-03 A kind of display control method of terminal equipment and device

Country Status (1)

Country Link
CN (1) CN103517060B (en)

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103760980A (en) * 2014-01-21 2014-04-30 Tcl集团股份有限公司 Display method, system and device for conducting dynamic adjustment according to positions of two eyes
CN104581113B (en) * 2014-12-03 2018-05-15 深圳市魔眼科技有限公司 Adaptive holographic display methods and holographic display based on viewing angle
CN105320264B (en) * 2015-05-25 2019-10-15 维沃移动通信有限公司 A kind of display methods and mobile terminal of mobile terminal
CN105120251A (en) * 2015-08-19 2015-12-02 京东方科技集团股份有限公司 3D scene display method and device
CN105597311B (en) * 2015-12-25 2019-07-12 网易(杭州)网络有限公司 Camera control method and device in 3d game
CN105867616B (en) * 2016-03-25 2019-11-05 北京酷云互动科技有限公司 Show the transform method and transformation system of picture
CN106325510B (en) * 2016-08-19 2019-09-24 联想(北京)有限公司 Information processing method and electronic equipment
CN108476316B (en) * 2016-09-30 2020-10-09 华为技术有限公司 3D display method and user terminal
CN109644259A (en) * 2017-06-21 2019-04-16 深圳市柔宇科技有限公司 3-dimensional image preprocess method, device and wear display equipment
CN112040316B (en) * 2020-08-26 2022-05-20 深圳创维-Rgb电子有限公司 Video image display method, device, multimedia equipment and storage medium

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1350750A (en) * 1999-05-17 2002-05-22 纽约大学 displayer and method for displaying
CN1512456A (en) * 2002-12-26 2004-07-14 联想(北京)有限公司 Method for displaying three-dimensional image
CN1853420A (en) * 2003-09-20 2006-10-25 皇家飞利浦电子股份有限公司 Improving image quality in an image display device
CN101051349A (en) * 2007-05-18 2007-10-10 北京中科虹霸科技有限公司 Multiple iris collecting device using active vision feedback
CN103136519A (en) * 2013-03-22 2013-06-05 中国移动通信集团江苏有限公司南京分公司 Sight tracking and positioning method based on iris recognition
CN103209332A (en) * 2012-12-05 2013-07-17 深圳市亿思达显示科技有限公司 Three-dimensional display device

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4469476B2 (en) * 2000-08-09 2010-05-26 パナソニック株式会社 Eye position detection method and eye position detection apparatus
JP2002305759A (en) * 2001-04-06 2002-10-18 Sanyo Electric Co Ltd Stereoscopic video image display device without eyeglasses
JP2010033305A (en) * 2008-07-29 2010-02-12 Hitachi Ltd Image information processing method and device
JP5699566B2 (en) * 2010-11-29 2015-04-15 ソニー株式会社 Information processing apparatus, information processing method, and program
KR20130008334A (en) * 2011-07-12 2013-01-22 삼성전자주식회사 Vergence control method for stereo-scopic image control and portable device supporting the same


Also Published As

Publication number Publication date
CN103517060A (en) 2014-01-15

Similar Documents

Publication Publication Date Title
CN103517060B (en) A kind of display control method of terminal equipment and device
JP6860488B2 (en) Mixed reality system
US10269139B2 (en) Computer program, head-mounted display device, and calibration method
US11145077B2 (en) Device and method for obtaining depth information from a scene
CN111194423A (en) Head mounted display tracking system
JP2017174125A (en) Information processing apparatus, information processing system, and information processing method
US20170213085A1 (en) See-through smart glasses and see-through method thereof
CN112655202B (en) Reduced bandwidth stereoscopic distortion correction for fisheye lenses of head-mounted displays
KR102532486B1 (en) Method and system for testing wearable devices
JP2018050179A (en) Information processing apparatus, image generation method and head mount display
CN106168855B (en) Portable MR glasses, mobile phone and MR glasses system
WO2020215960A1 (en) Method and device for determining area of gaze, and wearable device
CN103517061A (en) Method and device for display control of terminal device
CN108282650B (en) Naked eye three-dimensional display method, device and system and storage medium
JP6649010B2 (en) Information processing device
EP3402410B1 (en) Detection system
JP2017046233A (en) Display device, information processor, and control method of the same
US20170300121A1 (en) Input/output device, input/output program, and input/output method
JP6518645B2 (en) INFORMATION PROCESSING APPARATUS AND IMAGE GENERATION METHOD
CN107592520A (en) The imaging device and imaging method of AR equipment
TW201304505A (en) Three-dimensional image processing device and three-dimensional image processing method
JP6467039B2 (en) Information processing device
US20190089899A1 (en) Image processing device
WO2024185428A1 (en) Head-mounted display and image display method
CN117745982A (en) Method, device, system, electronic equipment and storage medium for recording video

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20170208

Address after: Room 32, building 3205F, No. 707, Zhang Yang Road, free trade zone, China (Shanghai)

Patentee after: Xin Xin Finance Leasing Co.,Ltd.

Address before: Zuchongzhi road in Pudong Zhangjiang hi tech park Shanghai 201203 Lane 2288 Pudong New Area Spreadtrum Center Building 1

Patentee before: SPREADTRUM COMMUNICATIONS (SHANGHAI) Co.,Ltd.

TR01 Transfer of patent right

Effective date of registration: 20170707

Address after: Room 2062, Wenstin administration apartment, No. 9 Financial Street B, Beijing, Xicheng District

Patentee after: Xin Xin finance leasing (Beijing) Co.,Ltd.

Address before: Room 707, building 32, Zhang Yang Road, free trade zone, Shanghai

Patentee before: Xin Xin Finance Leasing Co.,Ltd.

EE01 Entry into force of recordation of patent licensing contract

Application publication date: 20140115

Assignee: SPREADTRUM COMMUNICATIONS (SHANGHAI) Co.,Ltd.

Assignor: Xin Xin finance leasing (Beijing) Co.,Ltd.

Contract record no.: 2018990000163

Denomination of invention: Method and device for display control of terminal device

Granted publication date: 20160302

License type: Exclusive License

Record date: 20180626

TR01 Transfer of patent right

Effective date of registration: 20200306

Address after: 201203 Zuchongzhi Road, China (Shanghai) pilot Free Trade Zone, Pudong New Area, Shanghai 2288

Patentee after: SPREADTRUM COMMUNICATIONS (SHANGHAI) Co.,Ltd.

Address before: 100033 room 2062, Wenstin administrative apartments, 9 Financial Street B, Xicheng District, Beijing.

Patentee before: Xin Xin finance leasing (Beijing) Co.,Ltd.

TR01 Transfer of patent right

Effective date of registration: 20200602

Address after: 361012 unit 05, 8 / F, building D, Xiamen international shipping center, No.97 Xiangyu Road, Xiamen area, China (Fujian) free trade zone, Xiamen City, Fujian Province

Patentee after: Xinxin Finance Leasing (Xiamen) Co.,Ltd.

Address before: 201203 Zuchongzhi Road, China (Shanghai) pilot Free Trade Zone, Pudong New Area, Shanghai 2288

Patentee before: SPREADTRUM COMMUNICATIONS (SHANGHAI) Co.,Ltd.

EC01 Cancellation of recordation of patent licensing contract

Assignee: SPREADTRUM COMMUNICATIONS (SHANGHAI) Co.,Ltd.

Assignor: Xin Xin finance leasing (Beijing) Co.,Ltd.

Contract record no.: 2018990000163

Date of cancellation: 20210301

EE01 Entry into force of recordation of patent licensing contract

Application publication date: 20140115

Assignee: SPREADTRUM COMMUNICATIONS (SHANGHAI) Co.,Ltd.

Assignor: Xinxin Finance Leasing (Xiamen) Co.,Ltd.

Contract record no.: X2021110000010

Denomination of invention: A display control method and device for terminal equipment

Granted publication date: 20160302

License type: Exclusive License

Record date: 20210317

TR01 Transfer of patent right

Effective date of registration: 20230719

Address after: 201203 Shanghai city Zuchongzhi road Pudong New Area Zhangjiang hi tech park, Spreadtrum Center Building 1, Lane 2288

Patentee after: SPREADTRUM COMMUNICATIONS (SHANGHAI) Co.,Ltd.

Address before: 361012 unit 05, 8 / F, building D, Xiamen international shipping center, 97 Xiangyu Road, Xiamen area, China (Fujian) pilot Free Trade Zone, Xiamen City, Fujian Province

Patentee before: Xinxin Finance Leasing (Xiamen) Co.,Ltd.