CN106919246A - The display methods and device of a kind of application interface - Google Patents

The display method and device of a kind of application interface

Info

Publication number
CN106919246A
CN106919246A (application CN201510991496.8A)
Authority
CN
China
Prior art keywords
face
target object
image data
application interface
environment
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201510991496.8A
Other languages
Chinese (zh)
Inventor
陈新
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Qihoo Technology Co Ltd
Qizhi Software Beijing Co Ltd
Original Assignee
Beijing Qihoo Technology Co Ltd
Qizhi Software Beijing Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Qihoo Technology Co Ltd, Qizhi Software Beijing Co Ltd filed Critical Beijing Qihoo Technology Co Ltd
Priority to CN201510991496.8A priority Critical patent/CN106919246A/en
Publication of CN106919246A publication Critical patent/CN106919246A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

Embodiments of the invention provide a display method and device for an application interface. The method includes: collecting image data of the environment in which a display device is located; detecting a target object from the image data; measuring the actual position of the target object in the environment from the image data; and adjusting the application interface in the display device for display according to the actual position. Embodiments of the invention achieve adaptive adjustment of the application interface, meet the user's need to watch the display device from different positions, avoid restricting viewing to a fixed position, and facilitate other user operations.

Description

Application interface display method and device
Technical Field
The present invention relates to the field of computer processing technologies, and in particular, to a display method and a display apparatus for an application interface.
Background
With the rapid development of science and technology, in order to meet the further demands of users for quality of life, various display devices are rapidly growing and widely popularized.
A display device, which may also be referred to as a display, is a device that can output images or tactile information, such as a television display screen, a computer display, and the like.
Because a display device is an I/O (input/output) device that is generally connected to other equipment, and is usually large, it is typically fixed and secured at a certain position.
To obtain a better viewing effect, a user therefore usually watches from one fixed position, which is very inconvenient.
Disclosure of Invention
In view of the above, the present invention is proposed to provide a display method of an application interface and a corresponding display apparatus of an application interface that overcome or at least partially solve the above problems.
According to an aspect of the present invention, there is provided a display method of an application interface, including:
acquiring image data of an environment where display equipment is located;
detecting a target object from the image data;
measuring an actual position of the target object in the environment from the image data;
and adjusting an application interface in the display equipment to display according to the actual position.
Optionally, the step of acquiring image data of an environment in which the display device is located includes:
and calling a camera to acquire image data of the environment where the display equipment is located at certain intervals.
Optionally, the step of detecting a target object from the image data includes:
carrying out face detection on the image data;
when the face data are detected, determining that the face data represent the target object.
Optionally, the actual position comprises an actual distance from the display device;
the step of measuring the actual position of the target object in the environment from the image data comprises:
and calculating the distance of the face data in the image data as the actual distance between the target object and a display device in the environment.
Optionally, the step of calculating a distance of the face data in the image data as an actual distance between the target object and a display device in the environment includes:
when the face data is one, taking a distance calculated based on the face data in the image data as an actual distance between the target object and a display device in the environment;
or,
when the face data are multiple, calculating multiple distances of the face data in the image data;
calculating a target distance based on the plurality of distances;
setting the target distance as an actual distance between the target object and a display device in the environment.
Optionally, the actual position comprises an actual deflection angle with the display device;
the step of measuring the actual position of the target object in the environment from the image data comprises:
establishing a coordinate system based on the display device;
calculating face coordinates of the face data in the coordinate system;
and calculating the offset angle of the face data from the face coordinates, to serve as the actual deflection angle between the target object and the display device in the environment.
Optionally, the step of calculating the angle of the face data offset by the face coordinates includes:
when the face data is one, calculating the offset angle of the face data by adopting the face coordinates through a trigonometric function relationship;
or,
when the face data are multiple, searching the face data arranged at the leftmost side and the face data arranged at the rightmost side;
calculating the angle of the first offset by adopting the face coordinates in the leftmost face data through a trigonometric function relationship;
calculating a second offset angle by adopting the face coordinates in the rightmost face data through a trigonometric function relationship;
and averaging the angle of the first deflection and the angle of the second deflection to obtain the angle of the deviation of the plurality of face data.
Optionally, the step of adjusting the application interface in the display device to display according to the actual position includes:
adjusting the area of an application interface according to the actual distance between the target object and the display device in the environment;
wherein the area of the application interface is proportional to the distance.
Optionally, the step of adjusting the application interface in the display device to display according to the actual position includes:
adjusting the offset direction and offset position of the application interface according to the actual deflection angle between the target object and the display device in the environment;
offsetting the application interface according to the offset direction and offset angle;
the direction of the application interface offset is the same as the actual deflection angle, and the position of the application interface offset is in direct proportion to the actual deflection angle.
Optionally, the method further comprises:
the area outside the application interface is displayed as a designated color.
According to another aspect of the present invention, there is provided a display device of an application interface, including:
the image data acquisition module is suitable for acquiring image data of the environment where the display equipment is located;
a target object detection module adapted to detect a target object from the image data;
an actual position measurement module adapted to measure an actual position of the target object in the environment from the image data;
and the application interface adjusting module is suitable for adjusting the application interface in the display equipment to display according to the actual position.
Optionally, the image data acquisition module is further adapted to:
and calling a camera to acquire image data of the environment where the display equipment is located at certain intervals.
Optionally, the target object detection module is further adapted to:
carrying out face detection on the image data;
when the face data are detected, determining that the face data represent the target object.
Optionally, the actual position comprises an actual distance from the display device;
the actual position measurement module is further adapted to:
and calculating the distance of the face data in the image data as the actual distance between the target object and a display device in the environment.
Optionally, the actual position measurement module is further adapted to:
when the face data is one, taking a distance calculated based on the face data in the image data as an actual distance between the target object and a display device in the environment;
or,
when the face data are multiple, calculating multiple distances of the face data in the image data;
calculating a target distance based on the plurality of distances;
setting the target distance as an actual distance between the target object and a display device in the environment.
Optionally, the actual position comprises an actual deflection angle with the display device;
the actual position measurement module is further adapted to:
establishing a coordinate system based on the display device;
calculating face coordinates of the face data in the coordinate system;
and calculating the offset angle of the face data from the face coordinates, to serve as the actual deflection angle between the target object and the display device in the environment.
Optionally, the actual position measurement module is further adapted to:
when the face data is one, calculating the offset angle of the face data by adopting the face coordinates through a trigonometric function relationship;
or,
when the face data are multiple, searching the face data arranged at the leftmost side and the face data arranged at the rightmost side;
calculating the angle of the first offset by adopting the face coordinates in the leftmost face data through a trigonometric function relationship;
calculating a second offset angle by adopting the face coordinates in the rightmost face data through a trigonometric function relationship;
and averaging the angle of the first deflection and the angle of the second deflection to obtain the angle of the deviation of the plurality of face data.
Optionally, the application interface adjusting module is further adapted to:
adjusting the area of an application interface according to the actual distance between the target object and the display device in the environment;
wherein the area of the application interface is proportional to the distance.
Optionally, the application interface adjusting module is further adapted to:
adjusting the offset direction and offset position of the application interface according to the actual deflection angle between the target object and the display device in the environment;
offsetting the application interface according to the offset direction and offset angle;
the direction of the application interface offset is the same as the actual deflection angle, and the position of the application interface offset is in direct proportion to the actual deflection angle.
Optionally, the apparatus further comprises:
and the specified color display module is suitable for displaying the area outside the application interface as a specified color.
Embodiments of the invention collect image data of the environment in which the display device is located, detect a target object, and adjust the application interface in the display device for display based on the actual position of the target object in the environment. This achieves adaptive adjustment of the application interface, meets the user's need to watch the display device from different positions, avoids restricting viewing to a fixed position, and facilitates other user operations.
The foregoing description is only an overview of the technical solutions of the present invention, and the embodiments of the present invention are described below in order to make the technical means of the present invention more clearly understood and to make the above and other objects, features, and advantages of the present invention more clearly understandable.
Drawings
Various other advantages and benefits will become apparent to those of ordinary skill in the art upon reading the following detailed description of the preferred embodiments. The drawings are only for purposes of illustrating the preferred embodiments and are not to be construed as limiting the invention. Also, like reference numerals are used to refer to like parts throughout the drawings. In the drawings:
FIG. 1 is a flowchart illustrating steps of an embodiment of a method for displaying an application interface according to an embodiment of the present invention;
FIGS. 2A-2C illustrate exemplary measured positions of a target object in the environment, according to one embodiment of the invention; and
FIG. 3 is a block diagram illustrating an embodiment of a display apparatus for an application interface according to an embodiment of the present invention.
Detailed Description
Exemplary embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While exemplary embodiments of the present disclosure are shown in the drawings, it should be understood that the present disclosure may be embodied in various forms and should not be limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the disclosure to those skilled in the art.
Referring to fig. 1, a flowchart illustrating steps of an embodiment of a display method for an application interface according to an embodiment of the present invention is shown, which may specifically include the following steps:
step 101, collecting image data of an environment where display equipment is located;
In practical applications, the display device may receive a television signal (e.g., an analog signal); such a display terminal may also be called a television (TV), i.e., a television receiver, referring to the technology and equipment that transmit moving images by electronic means.
The tv may include an LCD (Liquid Crystal Display) tv, an LED (Light Emitting Diode) tv, a plasma tv, and the like, which is not limited in the embodiments of the present invention.
Further, the television may be a smart television, i.e., a fully open platform carrying an operating system such as Android, iOS, or Windows, on which the user can install and uninstall applications (i.e., third-party applications such as application programs and games) provided by third-party service providers to continuously extend the television's functions, and which can access the Internet via a network cable or a wireless network.
In addition, the display terminal may not receive a television signal (e.g., an analog signal), but may be connected to an electronic device such as a computer or a mobile phone, and perform input of information such as video or games through the electronic device such as the computer or the mobile phone, or perform other interactive operations.
Of course, the display device may also be other electronic devices, for example, a personal computer, a notebook computer, a video conference system, and the like, which is not limited in this embodiment of the present invention.
In the embodiment of the invention, the display equipment can be provided with a camera, and the camera can be called to collect the image data of the environment where the display equipment is located at intervals of a certain time so as to perform the self-adaptive adjustment of the application interface in real time.
Specifically, an optical image generated by a Scene (Scene) through a Lens (Lens) of a camera is projected onto the surface of an image sensing processor (Sensor), then converted into an electric signal, converted into a digital image signal after a/D (analog-to-digital conversion), and the digital image signal is compressed and converted into a specific image file format in a digital signal processing chip (DSP) or a coding library to obtain image data.
Step 102, detecting a target object through the image data;
the target object, i.e. the reference for adjusting the application interface, is typically a person.
In a specific implementation, face detection may be performed on the image data, and when the face data is detected, it is determined that the face data represents the target object.
The image data may be pre-processed prior to face recognition.
Because the acquired image data is a raw image that is limited by various conditions and subject to random interference, it often cannot be used directly; therefore, image preprocessing such as gray-scale correction and noise filtering may be performed in the early stage of image processing.
For the face image, the preprocessing process mainly includes light compensation, gray level transformation, histogram equalization, normalization, geometric correction, filtering, sharpening, and the like of the face image.
Face detection mainly determines whether a face is present, and generally comprises two stages:
the first stage is a machine learning process, that is, features related to a human face pattern are obtained through the learning of a large number of known human face patterns, and the features are saved.
The second stage is a matching process, namely, the image data to be detected is compared with the human face pattern characteristics for detection, if certain conditions are met, the human face is detected, and a result is output.
Taking a template matching method to perform face recognition as an example, the processing procedure may be as follows:
1. predefining or parameterizing a standard face template, wherein a simpler template is to regard the face as an ellipse, and the definition of the face template is realized by defining the parameters of the ellipse;
2. calculating correlation values between the image data and the standard face template, where the correlation values are a comprehensive description obtained by separately calculating the matching degree of the face contour, the eyes, the nose, and the mouth;
3. and determining whether the human face exists in the image according to the comparison between the correlation value and a preset threshold value.
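The three template-matching steps above can be sketched as follows. This is a minimal pure-Python illustration, not the patent's implementation: the elliptical template, the normalized cross-correlation measure, the threshold value 0.5, and all function names are assumptions.

```python
import math

def ellipse_template(h, w):
    """Step 1: a crude binary ellipse mask used as the standard face template."""
    cy, cx = (h - 1) / 2, (w - 1) / 2
    return [[1.0 if ((y - cy) / (h / 2)) ** 2 + ((x - cx) / (w / 2)) ** 2 <= 1.0 else 0.0
             for x in range(w)] for y in range(h)]

def correlation(patch, template):
    """Step 2: normalized cross-correlation between an image patch and the template."""
    flat_p = [v for row in patch for v in row]
    flat_t = [v for row in template for v in row]
    mp, mt = sum(flat_p) / len(flat_p), sum(flat_t) / len(flat_t)
    num = sum((p - mp) * (t - mt) for p, t in zip(flat_p, flat_t))
    den = math.sqrt(sum((p - mp) ** 2 for p in flat_p) *
                    sum((t - mt) ** 2 for t in flat_t))
    return num / den if den else 0.0

def is_face(patch, template, threshold=0.5):
    """Step 3: declare a face when the correlation exceeds a preset threshold."""
    return correlation(patch, template) >= threshold
```

A real system would of course use a learned detector rather than a single ellipse, but the structure — template, similarity score, threshold — is the same.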
In particular, the Android system provides an API (Application Programming Interface) for performing face recognition directly on a bitmap, consisting of the two classes android.media.FaceDetector and android.media.FaceDetector.Face.
Specifically, the base class ImageView is extended to MyImageView, and the bitmap file containing the faces to be detected should be in 565 (RGB_565) format to ensure that the API works normally.
Each detected face carries a confidence measure, which is defined in android.media.FaceDetector.Face.
A setFace() method can instantiate a FaceDetector object and call findFaces(); the results are stored in a faces array, and the midpoint of each face is passed to MyImageView.
Next, a setDisplayPoints() method is added to MyImageView to draw markers on the detected faces.
The API also returns other useful information, such as the eye distance, pose, and confidence; the eye distance can be used to locate the center point between the eyes.
Step 103, measuring the actual position of the target object in the environment according to the image data;
in the embodiment of the invention, the actual position of the target object in the environment is used as the basis for adjusting the application interface.
In an alternative embodiment of the invention, the actual position may comprise an actual distance from the display device; then, in an embodiment of the present invention, step 103 may include the following sub-steps:
a substep S11 of calculating a distance of the face data in the image data as an actual distance of the target object from a display device in the environment.
In the embodiment of the invention, the distance between the camera and the person represented by the face data can be calculated through the image data.
Since the camera is generally disposed in or immediately around the display device, the distance between the camera and the display device is small, generally within an acceptable error range.
To avoid adding additional hardware, the distance between the camera and the person represented by the face data can be considered as the distance between the display device and the person represented by the face data.
In particular implementations, the distance between the camera and the target object may be calculated by one or more of:
1. and (4) stereoscopic vision.
Stereoscopic vision imitates human three-dimensional perception: the same target object is observed from different viewpoints by a binocular or multi-view camera, two-dimensional images of the target object at different viewing angles are acquired, and the positional deviation of image pixels — the parallax — is calculated according to the triangulation principle to recover the three-dimensional information of the target object.
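As a sketch of the triangulation step just described, assuming a rectified pinhole stereo pair where depth follows Z = f·B/d (focal length f in pixels, baseline B in metres, disparity d in pixels); the function name and all numeric values are illustrative, not from the patent:

```python
def stereo_depth(focal_px, baseline_m, x_left_px, x_right_px):
    """Depth from binocular disparity under the pinhole model: Z = f * B / d.

    x_left_px / x_right_px are the horizontal image coordinates of the same
    scene point (e.g., a face midpoint) in the left and right views.
    """
    disparity = x_left_px - x_right_px
    if disparity <= 0:
        raise ValueError("disparity must be positive for a point in front of the rig")
    return focal_px * baseline_m / disparity
```

For example, with a 700 px focal length and a 10 cm baseline, a 35 px disparity corresponds to a viewer about 2 m from the camera.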
2. And (4) a motion distance measurement method.
Continuous two-dimensional images of the target object at different times or spatial positions are acquired with a camera, and the distance and other parameters of the target object are calculated from its temporal or spatial change across the two-dimensional image sequence.
3. And (5) monocular distance measurement.
The distance measurement method based on image processing in monocular distance measurement comprises the following steps: focus from Focus (DFF) and Defocus (DFD) ranging methods.
The focus-based ranging method shoots a series of images while adjusting the optical parameters, finds the sharpest image among them, and calculates the distance from that image's shooting parameters using the imaging principle of geometric optics.
The defocus-based ranging method relies on the principle that the more defocused an object is, the more blurred its image: from two or three frames of image data shot under different optical parameters, the diffusion parameter of the defocused point-spread function is determined, and depth is computed from the relation between this diffusion parameter and the distance of the target object.
Of course, the above calculation manner is only an example, and when the embodiment of the present invention is implemented, other calculation manners may be set according to actual situations, and the embodiment of the present invention is not limited thereto. In addition, besides the above calculation methods, those skilled in the art may also adopt other calculation methods according to actual needs, and the embodiment of the present invention is not limited thereto.
In practical applications, there may be one or more persons facing the display device, and therefore, there may be one or more face data detected in the image data.
As shown in fig. 2A, when there is one face data, the distance L calculated based on the face data in the image data is taken as the actual distance between the target object and the display device in the environment;
as shown in figs. 2B and 2C, when there are a plurality of face data, a plurality of distances L1, L2 of the face data in the image data are calculated;
a target distance is calculated based on the plurality of distances, e.g., the average of the plurality of distances, (L1 + L2)/2;
and the target distance is set as the actual distance between the target object and the display device in the environment, L = (L1 + L2)/2.
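The single-face and multi-face cases above reduce to a small helper. This is an illustrative sketch; as the text indicates, averaging is only one possible way to combine several per-face distances into a target distance:

```python
def actual_distance(face_distances):
    """Combine per-face distances into the actual viewer distance.

    One face: use its distance directly.
    Several faces: take the average, L = (L1 + ... + Ln) / n.
    """
    if not face_distances:
        raise ValueError("no face detected in the image data")
    return sum(face_distances) / len(face_distances)
```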
In another alternative embodiment of the invention, the actual position comprises an actual deflection angle to the display device; then, in an embodiment of the present invention, step 103 may include the following sub-steps:
a substep S21 of establishing a coordinate system based on the display device;
a substep S22 of calculating face coordinates of the face data in the coordinate system;
and a substep S23 of calculating the offset angle of the face data from the face coordinates, as the actual deflection angle between the target object and the display device in the environment.
In the embodiment of the invention, a rectangular coordinate system can be established by taking the position of the camera as an origin, the facing direction of the rectangular coordinate system is 0 degree, and in the rectangular coordinate system, the offset angle between the camera and the person represented by the face data can be calculated through image data.
Since the camera is generally disposed in or immediately around the display device, the distance between the camera and the display device is small, generally within an acceptable error range.
To avoid adding additional hardware, the angle of the offset between the camera and the person represented by the face data can be considered as the actual angle of the offset between the display device and the person represented by the face data.
In practical applications, there may be one or more persons facing the display device, and therefore, there may be one or more face data detected in the image data.
When the face data is one, the face coordinate (such as the midpoint of the face data) is adopted to calculate the offset angle of the face data through the trigonometric function relationship.
For example, as shown in FIG. 2A, the offset angle S can be calculated using the following trigonometric relationship:
tan S = X0/Y0
where (X0, Y0) are the face coordinates of the face data.
When the number of the face data is multiple, searching the face data arranged at the leftmost side and the face data arranged at the rightmost side;
calculating the angle of the first offset by using face coordinates (such as the midpoint of the face data) in the leftmost face data through a trigonometric function relationship;
calculating a second offset angle by using face coordinates (such as the midpoint of the face data) in the rightmost face data through a trigonometric function relationship;
and averaging the angle of the first deflection and the angle of the second deflection to obtain the angle of the deviation of the plurality of face data.
As shown in fig. 2B, if the leftmost face data and the rightmost face data are respectively located on the two sides of the camera's forward direction, the offset angle S is:
S = (S1 - S2)/2
where S1 is the angle of the first offset, for the leftmost face data, and S2 is the angle of the second offset, for the rightmost face data.
Further, the offset angles may be calculated through the following trigonometric function relationships:
tan S1 = X1/Y1
tan S2 = X2/Y2
where (X1, Y1) are the face coordinates of the leftmost face data, and (X2, Y2) are the face coordinates of the rightmost face data.
As shown in fig. 2C, if the leftmost face data and the rightmost face data are located on the same side of the camera's forward direction, the offset angle S is:
S = (S1 + S2)/2
where S1 is the angle of the first offset, for the leftmost face data, and S2 is the angle of the second offset, for the rightmost face data.
Further, the offset angles may be calculated through the following trigonometric function relationships:
tan S1 = X1/Y1
tan S2 = X2/Y2
where (X1, Y1) are the face coordinates of the leftmost face data, and (X2, Y2) are the face coordinates of the rightmost face data.
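The angle computation can be sketched as follows. One assumption worth making explicit: if the face x coordinates are kept signed (negative left of the camera axis, positive right), then averaging the signed angles of the leftmost and rightmost faces covers both the opposite-side case (where the patent writes (S1 - S2)/2 with unsigned angles) and the same-side case ((S1 + S2)/2). All names are illustrative.

```python
import math

def face_angle(x, y):
    """Offset angle of one face in degrees: tan S = x / y in the
    camera-centred frame, with y the forward (0-degree) direction."""
    return math.degrees(math.atan2(x, y))

def group_angle(faces):
    """Deflection angle for one or more faces, each given as (x, y).

    One face: its own angle. Several faces: the average of the signed
    angles of the leftmost and rightmost faces.
    """
    if len(faces) == 1:
        return face_angle(*faces[0])
    leftmost = min(faces, key=lambda f: f[0])
    rightmost = max(faces, key=lambda f: f[0])
    return (face_angle(*leftmost) + face_angle(*rightmost)) / 2.0
```

Using `atan2` rather than a bare division keeps the sign of the angle correct on both sides of the camera axis.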
For figs. 2B and 2C, calculating the actual distance L and the actual deflection angle S of the plurality of face data locates the center of gravity O of the plurality of persons.
And 104, adjusting an application interface in the display equipment to display according to the actual position.
In specific implementation, an application interface in the display device can be adaptively adjusted based on the actual position of the user, so that the user can conveniently watch the application interface.
The application interface may refer to the interface in which the display device displays its signal, and it generally fills the entire screen.
In one adjustment, the area of the application interface may be adjusted according to the actual distance of the target object from the display device in the environment.
Wherein the area of the application interface is proportional to the distance.
The closer the user is to the display device, the smaller the display area of the application interface, which avoids straining the user's eyes;
the farther the user is from the display device, the larger the display area of the application interface, which ensures that the user can see the content clearly.
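The distance-to-area rule above can be sketched as a scale factor that grows linearly with viewer distance. The reference distance, the scale bounds, and the function name are assumed values for illustration, not taken from the patent:

```python
def interface_scale(distance_m, reference_m=3.0, min_scale=0.3, max_scale=1.0):
    """Linear scale factor for the application interface, proportional
    to viewer distance.

    reference_m is the (assumed) distance at which the interface fills
    the screen; the result is clamped so the interface never vanishes
    or exceeds the screen.
    """
    scale = distance_m / reference_m
    return max(min_scale, min(max_scale, scale))
```

Applying the factor to both width and height makes the displayed area proportional to the square of the distance; applying it to the area directly would instead need a square-root on each dimension — either reading of "proportional" fits the patent's description.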
In another adjustment mode, the offset direction and offset position of the application interface are adjusted according to the actual deflection angle between the target object and the display device in the environment, and the application interface is offset in that direction by that amount, so that the user can see the content clearly.
The direction of the application interface deviation is the same as the actual deflection angle;
for example, as shown in fig. 2A and 2C, if the actual deflection angle is biased to the left, the designated area may be biased to the left of the screen from the initial position (e.g., the middle) of the display.
For another example, as shown in fig. 2B, if the actual deflection angle is shifted to the right, the designated area may be shifted to the right of the screen from the initial position (e.g., the middle) of the display.
Furthermore, the offset position of the application interface is proportional to the actual deflection angle, i.e., the larger the actual deflection angle, the farther the application interface is shifted.
It should be noted that once the offset position of the application interface reaches the edge of the screen, the offset stops.
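A minimal sketch of this offset rule, assuming a horizontal shift measured in pixels with a linear gain `px_per_degree` (the gain value is an assumption; the text only requires the offset to be proportional to the deflection angle and to stop at the screen edge):

```python
def offset_interface(angle_deg, screen_w, iface_w, px_per_degree=20.0):
    """Shift the application interface horizontally from the screen center
    in the same direction as the viewer's deflection angle, stopping at
    the screen edge.

    Returns the x coordinate of the interface's left edge in pixels.
    Negative angles shift left, positive angles shift right; the gain
    `px_per_degree` is an illustrative assumption.
    """
    center_left = (screen_w - iface_w) / 2.0      # initial (middle) position
    x = center_left + angle_deg * px_per_degree   # proportional shift
    return max(0.0, min(x, screen_w - iface_w))   # stop at the screen edges
```

With a 1920-pixel screen and a 960-pixel interface, a 10° deflection moves the left edge from 480 to 680 pixels, while a large angle simply pins the interface to the corresponding edge.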
Since the application interface may be scaled down and therefore does not necessarily occupy the full screen, the area outside the application interface may be displayed in a specified color, such as black, so that the user's attention is focused on the application interface.
The embodiment of the invention collects image data of the environment where the display device is located, detects a target object from that data, and adjusts the application interface in the display device for display based on the actual position of the target object in the environment, thereby realizing adaptive adjustment of the application interface, meeting the user's need to watch the display device from different positions, avoiding the limitation of watching from a fixed position, and facilitating other operations by the user.
For simplicity of explanation, the method embodiments are described as a series of acts or combinations, but those skilled in the art will appreciate that the embodiments are not limited by the order of acts described, as some steps may occur in other orders or concurrently with other steps in accordance with the embodiments of the invention. Further, those skilled in the art will appreciate that the embodiments described in the specification are presently preferred and that no particular act is required to implement the invention.
Referring to fig. 3, a block diagram of a display device of an application interface according to an embodiment of the present invention is shown, which may specifically include the following modules:
the image data acquisition module 301 is suitable for acquiring image data of the environment where the display device is located;
a target object detection module 302 adapted to detect a target object from the image data;
an actual position measurement module 303 adapted to measure an actual position of the target object in the environment from the image data;
and the application interface adjusting module 304 is adapted to adjust an application interface in the display device to display according to the actual position.
In an optional embodiment of the invention, the image data acquisition module 301 may be further adapted to:
and calling a camera to acquire image data of the environment where the display equipment is located at certain intervals.
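The periodic acquisition described above can be sketched as follows; the `capture` callable stands in for a real camera API (for example a wrapper around OpenCV's `VideoCapture.read`), and the fixed frame count is an assumption added for illustration:

```python
import time

def sample_environment(capture, interval_s, num_frames):
    """Call `capture()` every `interval_s` seconds, collecting image data
    of the environment where the display device is located.

    `capture` is any zero-argument callable returning one frame; in a real
    device it would wrap the camera driver. Returns the collected frames.
    """
    frames = []
    for _ in range(num_frames):
        frames.append(capture())   # one snapshot of the environment
        time.sleep(interval_s)     # wait out the sampling interval
    return frames
```

In practice the loop would run indefinitely, feeding each frame to the target object detection step rather than accumulating a list.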
In an optional embodiment of the invention, the target object detection module 302 may be further adapted to:
carrying out face detection on the image data;
when the face data are detected, determining that the face data represent the target object.
In an alternative embodiment of the invention, the actual position may comprise an actual distance from the display device;
the actual position measurement module 303 may be further adapted to:
and calculating the distance of the face data in the image data as the actual distance between the target object and a display device in the environment.
In an alternative embodiment of the present invention, the actual position measuring module 303 may be further adapted to:
when the face data is one, taking a distance calculated based on the face data in the image data as an actual distance between the target object and a display device in the environment;
or,
when the face data are multiple, calculating multiple distances of the face data in the image data;
calculating a target distance based on the plurality of distances;
setting the target distance as an actual distance between the target object and a display device in the environment.
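The rule for combining the several distances into one target distance is left open; the arithmetic mean is one plausible choice (a minimum or a weighted mean would also fit the text):

```python
def target_distance(distances):
    """Combine the distances of several detected faces into one target
    distance for the adjustment step.

    Uses the arithmetic mean; this is an assumption, since the text only
    says a target distance is calculated based on the plurality of
    distances.
    """
    if not distances:
        raise ValueError("no face distances supplied")
    return sum(distances) / len(distances)
```

For example, faces measured at 2 m and 4 m would give a 3 m target distance, which then drives the interface-area adjustment.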
In an alternative embodiment of the invention, the actual position may comprise an actual deflection angle to the display device;
the actual position measurement module 303 may be further adapted to:
establishing a coordinate system based on the display device;
calculating face coordinates of the face data in the coordinate system;
and calculating the offset angle of the face data from the face coordinates, to serve as the actual deflection angle between the target object and the display device in the environment.
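The trigonometric relationship can be made concrete as follows, assuming a display-centered coordinate system in which `face_x` is the lateral offset of the face and `face_z` its perpendicular distance to the screen (the axis convention is an assumption):

```python
import math

def face_deflection_angle(face_x, face_z):
    """Deflection angle of one face in a coordinate system whose origin
    is the display device.

    `face_x` is the lateral offset of the face from the screen center and
    `face_z` its perpendicular distance, in the same unit. The angle
    follows from the trigonometric relationship tan(angle) = x / z;
    positive angles mean the face sits to the right of center.
    """
    return math.degrees(math.atan2(face_x, face_z))
```

A face directly in front of the screen gives 0°, and one as far to the side as it is away from the screen gives 45°.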
In an alternative embodiment of the present invention, the actual position measuring module 303 may be further adapted to:
when the face data is one, calculating the offset angle of the face data by adopting the face coordinates through a trigonometric function relationship;
or,
when the face data are multiple, searching the face data arranged at the leftmost side and the face data arranged at the rightmost side;
calculating a first offset angle from the face coordinates in the leftmost face data through a trigonometric relationship;
calculating a second offset angle from the face coordinates in the rightmost face data through a trigonometric relationship;
and averaging the first offset angle and the second offset angle to obtain the offset angle of the plurality of face data.
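The multi-face rule above (take the leftmost and the rightmost face, compute each one's angle trigonometrically, and average the two) can be sketched as follows; the display-centered (x, z) coordinate convention is an assumption:

```python
import math

def group_deflection_angle(face_coords):
    """Deflection angle for several faces.

    `face_coords` is a list of (x, z) pairs in a display-centered
    coordinate system (x lateral, z perpendicular to the screen). Per the
    described rule, only the leftmost and rightmost faces contribute:
    each one's angle is computed via tan(angle) = x / z and the two
    angles are averaged.
    """
    leftmost = min(face_coords, key=lambda c: c[0])
    rightmost = max(face_coords, key=lambda c: c[0])
    first = math.degrees(math.atan2(leftmost[0], leftmost[1]))
    second = math.degrees(math.atan2(rightmost[0], rightmost[1]))
    return (first + second) / 2.0
```

Two faces placed symmetrically about the screen center therefore yield an angle of 0°, leaving the interface in the middle.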
In an optional embodiment of the present invention, the application interface adjusting module 304 may be further adapted to:
adjusting the area of an application interface according to the actual distance between the target object and the display device in the environment;
wherein the area of the application interface is proportional to the distance.
In an optional embodiment of the present invention, the application interface adjusting module 304 may be further adapted to:
adjusting the offset direction and the offset position of the application interface according to the actual deflection angle between the target object and the display device in the environment;
shifting the application interface according to the shifting direction and the shifting angle;
the direction of the application interface offset is the same as the actual deflection angle, and the position of the application interface offset is in direct proportion to the actual deflection angle.
In an optional embodiment of the present invention, the apparatus may further include the following module:
and the specified color display module is suitable for displaying the area outside the application interface as a specified color.
For the device embodiment, since it is basically similar to the method embodiment, the description is simple, and for the relevant points, refer to the partial description of the method embodiment.
The algorithms and displays presented herein are not inherently related to any particular computer, virtual machine, or other apparatus. Various general purpose systems may also be used with the teachings herein. The required structure for constructing such a system will be apparent from the description above. Moreover, the present invention is not directed to any particular programming language. It is appreciated that a variety of programming languages may be used to implement the teachings of the present invention as described herein, and any descriptions of specific languages are provided above to disclose the best mode of the invention.
In the description provided herein, numerous specific details are set forth. It is understood, however, that embodiments of the invention may be practiced without these specific details. In some instances, well-known methods, structures and techniques have not been shown in detail in order not to obscure an understanding of this description.
Similarly, it should be appreciated that in the foregoing description of exemplary embodiments of the invention, various features of the invention are sometimes grouped together in a single embodiment, figure, or description thereof for the purpose of streamlining the disclosure and aiding in the understanding of one or more of the various inventive aspects. However, the disclosed method should not be interpreted as reflecting an intention that: that the invention as claimed requires more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive aspects lie in less than all features of a single foregoing disclosed embodiment. Thus, the claims following the detailed description are hereby expressly incorporated into this detailed description, with each claim standing on its own as a separate embodiment of this invention.
Those skilled in the art will appreciate that the modules in the device in an embodiment may be adaptively changed and disposed in one or more devices different from the embodiment. The modules or units or components of the embodiments may be combined into one module or unit or component, and furthermore they may be divided into a plurality of sub-modules or sub-units or sub-components. All of the features disclosed in this specification (including any accompanying claims, abstract and drawings), and all of the processes or elements of any method or apparatus so disclosed, may be combined in any combination, except combinations where at least some of such features and/or processes or elements are mutually exclusive. Each feature disclosed in this specification (including any accompanying claims, abstract and drawings) may be replaced by alternative features serving the same, equivalent or similar purpose, unless expressly stated otherwise.
Furthermore, those skilled in the art will appreciate that while some embodiments described herein include some features included in other embodiments, rather than other features, combinations of features of different embodiments are meant to be within the scope of the invention and form different embodiments. For example, in the following claims, any of the claimed embodiments may be used in any combination.
The various component embodiments of the invention may be implemented in hardware, or in software modules running on one or more processors, or in a combination thereof. Those skilled in the art will appreciate that a microprocessor or Digital Signal Processor (DSP) may be used in practice to implement some or all of the functionality of some or all of the components in a display device of an application interface according to embodiments of the present invention. The present invention may also be embodied as apparatus or device programs (e.g., computer programs and computer program products) for performing a portion or all of the methods described herein. Such programs implementing the present invention may be stored on computer-readable media or may be in the form of one or more signals. Such a signal may be downloaded from an internet website or provided on a carrier signal or in any other form.
It should be noted that the above-mentioned embodiments illustrate rather than limit the invention, and that those skilled in the art will be able to design alternative embodiments without departing from the scope of the appended claims. In the claims, any reference signs placed between parentheses shall not be construed as limiting the claim. The word "comprising" does not exclude the presence of elements or steps not listed in a claim. The word "a" or "an" preceding an element does not exclude the presence of a plurality of such elements. The invention may be implemented by means of hardware comprising several distinct elements, and by means of a suitably programmed computer. In the unit claims enumerating several means, several of these means may be embodied by one and the same item of hardware. The usage of the words first, second and third, etcetera do not indicate any ordering. These words may be interpreted as names.
The embodiment of the invention discloses A1 and a display method of an application interface, which comprises the following steps:
acquiring image data of an environment where display equipment is located;
detecting a target object through the image data;
measuring an actual position of the target object in the environment from the image data;
and adjusting an application interface in the display equipment to display according to the actual position.
A2, the method of A1, the step of acquiring image data of an environment in which a display device is located comprising:
and calling a camera to acquire image data of the environment where the display equipment is located at certain intervals.
A3, the method of A1, wherein the step of detecting a target object through the image data comprises:
carrying out face detection on the image data;
when the face data are detected, determining that the face data represent the target object.
A4, the method as in A3, the actual position comprising an actual distance from a display device;
the step of measuring the actual position of the target object in the environment from the image data comprises:
and calculating the distance of the face data in the image data as the actual distance between the target object and a display device in the environment.
A5, the method of A4, the step of calculating the distance of the face data in the image data as the actual distance of the target object from a display device in the environment comprising:
when the face data is one, taking a distance calculated based on the face data in the image data as an actual distance between the target object and a display device in the environment;
or,
when the face data are multiple, calculating multiple distances of the face data in the image data;
calculating a target distance based on the plurality of distances;
setting the target distance as an actual distance between the target object and a display device in the environment.
A6, the method as in A3, the actual position comprising an actual deflection angle with a display device;
the step of measuring the actual position of the target object in the environment from the image data comprises:
establishing a coordinate system based on the display device;
calculating face coordinates of the face data in the coordinate system;
and calculating the offset angle of the face data from the face coordinates, to serve as the actual deflection angle between the target object and the display device in the environment.
A7, the method of A6, wherein the step of calculating the offset angle of the face data from the face coordinates comprises:
when the face data is one, calculating the offset angle of the face data by adopting the face coordinates through a trigonometric function relationship;
or,
when the face data are multiple, searching the face data arranged at the leftmost side and the face data arranged at the rightmost side;
calculating the angle of the first offset by adopting the face coordinates in the leftmost face data through a trigonometric function relationship;
calculating a second offset angle by adopting the face coordinates in the rightmost face data through a trigonometric function relationship;
and averaging the first offset angle and the second offset angle to obtain the offset angle of the plurality of face data.
A8, the method as in A4 or A5, wherein the step of adjusting the application interface in the display device to display according to the actual position comprises:
adjusting the area of an application interface according to the actual distance between the target object and the display device in the environment;
wherein the area of the application interface is proportional to the distance.
A9, the method as in A6 or A7, wherein the step of adjusting the application interface in the display device to display according to the actual position comprises:
adjusting the offset direction and the offset position of the application interface according to the actual deflection angle between the target object and the display device in the environment;
shifting the application interface according to the shifting direction and the shifting angle;
the direction of the application interface offset is the same as the actual deflection angle, and the position of the application interface offset is in direct proportion to the actual deflection angle.
A10, the method of any one of A1-A6, further comprising:
the area outside the application interface is displayed as a designated color.
The embodiment of the invention also discloses B11, a display device of an application interface, comprising:
the image data acquisition module is suitable for acquiring image data of the environment where the display equipment is located;
a target object detection module adapted to detect a target object from the image data;
an actual position measurement module adapted to measure an actual position of the target object in the environment from the image data;
and the application interface adjusting module is suitable for adjusting the application interface in the display equipment to display according to the actual position.
B12, the apparatus of B11, the image data acquisition module further adapted to:
and calling a camera to acquire image data of the environment where the display equipment is located at certain intervals.
B13, the apparatus of B11, the target object detection module further adapted to:
carrying out face detection on the image data;
when the face data are detected, determining that the face data represent the target object.
B14, the apparatus as described in B13, the actual position comprising the actual distance to the display device;
the actual position measurement module is further adapted to:
and calculating the distance of the face data in the image data as the actual distance between the target object and a display device in the environment.
B15, the apparatus of B14, the actual position measurement module further adapted to:
when the face data is one, taking a distance calculated based on the face data in the image data as an actual distance between the target object and a display device in the environment;
or,
when the face data are multiple, calculating multiple distances of the face data in the image data;
calculating a target distance based on the plurality of distances;
setting the target distance as an actual distance between the target object and a display device in the environment.
B16, the apparatus as described in B13, the actual position comprising an actual deflection angle with the display device;
the actual position measurement module is further adapted to:
establishing a coordinate system based on the display device;
calculating face coordinates of the face data in the coordinate system;
and calculating the offset angle of the face data from the face coordinates, to serve as the actual deflection angle between the target object and the display device in the environment.
B17, the apparatus of B16, the actual position measurement module further adapted to:
when the face data is one, calculating the offset angle of the face data by adopting the face coordinates through a trigonometric function relationship;
or,
when the face data are multiple, searching the face data arranged at the leftmost side and the face data arranged at the rightmost side;
calculating the angle of the first offset by adopting the face coordinates in the leftmost face data through a trigonometric function relationship;
calculating a second offset angle by adopting the face coordinates in the rightmost face data through a trigonometric function relationship;
and averaging the first offset angle and the second offset angle to obtain the offset angle of the plurality of face data.
B18, the apparatus of B14 or B15, the application interface adjustment module further adapted to:
adjusting the area of an application interface according to the actual distance between the target object and the display device in the environment;
wherein the area of the application interface is proportional to the distance.
B19, the apparatus of B16 or B17, the application interface adjustment module further adapted to:
adjusting the offset direction and the offset position of the application interface according to the actual deflection angle between the target object and the display device in the environment;
shifting the application interface according to the shifting direction and the shifting angle;
the direction of the application interface offset is the same as the actual deflection angle, and the position of the application interface offset is in direct proportion to the actual deflection angle.
The device of any one of B20, B11-B16, further comprising:
and the specified color display module is suitable for displaying the area outside the application interface as a specified color.

Claims (10)

1. A display method of an application interface comprises the following steps:
acquiring image data of an environment where display equipment is located;
detecting a target object through the image data;
measuring an actual position of the target object in the environment from the image data;
and adjusting an application interface in the display equipment to display according to the actual position.
2. The method of claim 1, wherein the step of acquiring image data of an environment in which the display device is located comprises:
and calling a camera to acquire image data of the environment where the display equipment is located at certain intervals.
3. The method of claim 1, wherein the step of detecting a target object through the image data comprises:
carrying out face detection on the image data;
when the face data are detected, determining that the face data represent the target object.
4. The method of claim 3, wherein the actual location comprises an actual distance from a display device;
the step of measuring the actual position of the target object in the environment from the image data comprises:
and calculating the distance of the face data in the image data as the actual distance between the target object and a display device in the environment.
5. The method of claim 4, wherein the step of calculating the distance of the face data in the image data as the actual distance of the target object from a display device in the environment comprises:
when the face data is one, taking a distance calculated based on the face data in the image data as an actual distance between the target object and a display device in the environment;
or,
when the face data are multiple, calculating multiple distances of the face data in the image data;
calculating a target distance based on the plurality of distances;
setting the target distance as an actual distance between the target object and a display device in the environment.
6. The method of claim 3, wherein the actual position comprises an actual deflection angle from a display device;
the step of measuring the actual position of the target object in the environment from the image data comprises:
establishing a coordinate system based on the display device;
calculating face coordinates of the face data in the coordinate system;
and calculating the offset angle of the face data from the face coordinates, to serve as the actual deflection angle between the target object and the display device in the environment.
7. The method of claim 6, wherein the step of calculating the offset angle of the face data from the face coordinates comprises:
when the face data is one, calculating the offset angle of the face data by adopting the face coordinates through a trigonometric function relationship;
or,
when the face data are multiple, searching the face data arranged at the leftmost side and the face data arranged at the rightmost side;
calculating the angle of the first offset by adopting the face coordinates in the leftmost face data through a trigonometric function relationship;
calculating a second offset angle by adopting the face coordinates in the rightmost face data through a trigonometric function relationship;
and averaging the first offset angle and the second offset angle to obtain the offset angle of the plurality of face data.
8. The method of claim 4 or 5, wherein the step of adjusting the application interface in the display device for display according to the actual position comprises:
adjusting the area of an application interface according to the actual distance between the target object and the display device in the environment;
wherein the area of the application interface is proportional to the distance.
9. The method of claim 6 or 7, wherein the step of adjusting the application interface in the display device for display according to the actual position comprises:
adjusting the offset direction and the offset position of the application interface according to the actual deflection angle between the target object and the display device in the environment;
shifting the application interface according to the shifting direction and the shifting angle;
the direction of the application interface offset is the same as the actual deflection angle, and the position of the application interface offset is in direct proportion to the actual deflection angle.
10. A display device for an application interface, comprising:
the image data acquisition module is suitable for acquiring image data of the environment where the display equipment is located;
a target object detection module adapted to detect a target object from the image data;
an actual position measurement module adapted to measure an actual position of the target object in the environment from the image data;
and the application interface adjusting module is suitable for adjusting the application interface in the display equipment to display according to the actual position.
CN201510991496.8A 2015-12-24 2015-12-24 The display methods and device of a kind of application interface Pending CN106919246A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510991496.8A CN106919246A (en) 2015-12-24 2015-12-24 The display methods and device of a kind of application interface

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201510991496.8A CN106919246A (en) 2015-12-24 2015-12-24 The display methods and device of a kind of application interface

Publications (1)

Publication Number Publication Date
CN106919246A true CN106919246A (en) 2017-07-04

Family

ID=59459036

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510991496.8A Pending CN106919246A (en) 2015-12-24 2015-12-24 The display methods and device of a kind of application interface

Country Status (1)

Country Link
CN (1) CN106919246A (en)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109596076A (en) * 2019-01-16 2019-04-09 余刚 Object area analysis institution
CN111754827A (en) * 2020-05-20 2020-10-09 四川科华天府科技有限公司 Presentation system based on AR interactive teaching equipment
US20220013080A1 (en) * 2018-10-29 2022-01-13 Goertek Inc. Directional display method and apparatus for audio device and audio device
CN114115776A (en) * 2021-11-16 2022-03-01 深圳Tcl新技术有限公司 Display control method, display control device, electronic equipment and storage medium
CN114760433A (en) * 2022-04-15 2022-07-15 维沃移动通信有限公司 Control method and device of video conference system, electronic equipment and storage medium
WO2024124480A1 (en) * 2022-12-15 2024-06-20 京东方科技集团股份有限公司 User interface display system and method, and computer device and storage medium

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101079972A (en) * 2006-05-24 2007-11-28 松下电器产业株式会社 Control method, control device and image system
CN102033549A (en) * 2009-09-30 2011-04-27 三星电子(中国)研发中心 Viewing angle adjusting device of display device
CN202121716U (en) * 2011-06-15 2012-01-18 康佳集团股份有限公司 Stereoscopic display device
CN102917232A (en) * 2012-10-23 2013-02-06 深圳创维-Rgb电子有限公司 Face recognition based 3D (three dimension) display self-adaptive adjusting method and face recognition based 3D display self-adaptive adjusting device
CN103500054A (en) * 2013-09-24 2014-01-08 广东明创软件科技有限公司 Method for automatically adjusting application interface and mobile terminal thereof
CN103677269A (en) * 2013-12-11 2014-03-26 小米科技有限责任公司 Method and device for displaying content and terminal device
CN105182983A (en) * 2015-10-22 2015-12-23 深圳创想未来机器人有限公司 Face real-time tracking method and face real-time tracking system based on mobile robot


Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20220013080A1 (en) * 2018-10-29 2022-01-13 Goertek Inc. Directional display method and apparatus for audio device and audio device
US11551633B2 * 2018-10-29 2023-01-10 Goertek Inc. Directional display method and apparatus for audio device and audio device
CN109596076A (en) * 2019-01-16 2019-04-09 余刚 Object area analysis institution
CN109596076B (en) * 2019-01-16 2019-07-23 国网江苏省电力有限公司宿迁供电分公司 Object area analysis institution
CN111754827A (en) * 2020-05-20 2020-10-09 四川科华天府科技有限公司 Presentation system based on AR interactive teaching equipment
CN114115776A (en) * 2021-11-16 2022-03-01 深圳Tcl新技术有限公司 Display control method, display control device, electronic equipment and storage medium
CN114760433A (en) * 2022-04-15 2022-07-15 维沃移动通信有限公司 Control method and device of video conference system, electronic equipment and storage medium
WO2024124480A1 (en) * 2022-12-15 2024-06-20 京东方科技集团股份有限公司 User interface display system and method, and computer device and storage medium

Similar Documents

Publication Publication Date Title
US11948282B2 (en) Image processing apparatus, image processing method, and storage medium for lighting processing on image using model data
CN106919246A (en) The display methods and device of a kind of application interface
US20160180510A1 (en) Method and system of geometric camera self-calibration quality assessment
US20190019299A1 (en) Adaptive stitching of frames in the process of creating a panoramic frame
CN108234858B (en) Image blurring processing method and device, storage medium and electronic equipment
US10063840B2 (en) Method and system of sub pixel accuracy 3D measurement using multiple images
JP2017022694A (en) Method and apparatus for displaying light field based image on user's device, and corresponding computer program product
CN111345029B (en) Target tracking method and device, movable platform and storage medium
US20150198439A1 (en) Generation of depth data based on spatial light pattern
US20130002814A1 (en) Method for automatically improving stereo images
JP2017520050A (en) Local adaptive histogram flattening
JPWO2018235163A1 (en) Calibration apparatus, calibration chart, chart pattern generation apparatus, and calibration method
US20150304625A1 (en) Image processing device, method, and recording medium
CN114339194A (en) Projection display method and device, projection equipment and computer readable storage medium
US9355436B2 (en) Method, system and computer program product for enhancing a depth map
IL275047B1 (en) Head-Mounted Display Device and Method Thereof
CN111968052B (en) Image processing method, image processing apparatus, and storage medium
CN110706283B (en) Calibration method and device for sight tracking, mobile terminal and storage medium
JP7378219B2 (en) Imaging device, image processing device, control method, and program
EP4093015A1 (en) Photographing method and apparatus, storage medium, and electronic device
US20190156511A1 (en) Region of interest image generating device
CN106454061B (en) Electronic device and image processing method
CN104205825B (en) Image processing apparatus and method and camera head
CN107436681A (en) Automatically adjust the mobile terminal and its method of the display size of word
CN114910052B (en) Distance measurement method, control method and device based on camera and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20170704
