CN107038746B - Information processing method and electronic equipment - Google Patents

Information processing method and electronic equipment

Info

Publication number
CN107038746B
CN107038746B (application CN201710189056.XA)
Authority
CN
China
Prior art keywords
angle
rendering
electronic equipment
rendering angle
electronic device
Prior art date
Legal status
Active
Application number
CN201710189056.XA
Other languages
Chinese (zh)
Other versions
CN107038746A (en)
Inventor
许枫
Current Assignee
Lenovo Beijing Ltd
Original Assignee
Lenovo Beijing Ltd
Priority date
Filing date
Publication date
Application filed by Lenovo Beijing Ltd filed Critical Lenovo Beijing Ltd
Priority to CN201710189056.XA priority Critical patent/CN107038746B/en
Publication of CN107038746A publication Critical patent/CN107038746A/en
Application granted granted Critical
Publication of CN107038746B publication Critical patent/CN107038746B/en

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06T — IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 15/00 — 3D [Three Dimensional] image rendering
    • G06T 15/005 — General purpose rendering architectures
    • G06T 2215/00 — Indexing scheme for image rendering
    • G06T 2215/16 — Using real world measurements to influence rendering

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Graphics (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The invention discloses an information processing method and an electronic device. The method comprises: detecting a spatial position relationship between the electronic device and a photographed object; when the spatial position relationship meets a predetermined condition, determining whether the electronic device is in a relatively stationary state; if the electronic device is in the relatively stationary state, determining a rendering angle; and generating an image based on the rendering angle and displaying the image together with the photographed object on a display unit of the electronic device. The technical solution provided by the invention solves the prior-art problem that the accuracy of determining the rendering angle of a virtual object is low.

Description

Information processing method and electronic equipment
Technical Field
The present invention relates to the field of electronic technologies, and in particular, to an information processing method and an electronic device.
Background
With the continuous development of science and technology, electronic devices such as smart phones and tablet computers keep evolving and their functions become increasingly rich. One example is Augmented Reality (AR): a designed virtual object, i.e. a three-dimensional model, is rendered on the screen in real time at a certain coordinate position of a real scene, and the virtual object and the real scene together reconstruct an environment that is vivid to see and touch, so that natural interaction between the user and the environment is realized.
However, when an AR device with the augmented reality function aims at the coordinate position from directly above, the angle of the rendered model may become disordered, which degrades the rendering result and the user experience.
The prior art therefore has the technical problem that the accuracy of determining the rendering angle of a virtual object is low.
Disclosure of Invention
The embodiments of the invention provide an information processing method and an electronic device, which are used to solve the prior-art problem of low accuracy in determining the rendering angle of a virtual object, and thereby achieve the technical effect of improving that accuracy.
In one aspect, an embodiment of the present invention provides an information processing method, including:
detecting a spatial position relationship between the electronic equipment and a shot object;
when the spatial position relation meets a preset condition, determining whether the electronic equipment is in a relatively static state;
if the electronic equipment is in the relative static state, determining a rendering angle;
and generating an image based on the rendering angle, and displaying the image and the photographed object on a display unit of the electronic equipment at the same time.
Optionally, detecting a spatial position relationship between the electronic device and the object to be photographed includes:
detecting an angle between the electronic equipment and a plane where the shot object is located;
wherein when the angle satisfies a predetermined angle, it is determined whether the electronic device is in a relatively stationary state.
Optionally, determining whether the electronic device is in a relatively stationary state includes:
acquiring a motion parameter of the electronic equipment through a sensor of the electronic equipment, wherein the motion parameter at least comprises the angle;
determining whether the electronic device is in the relatively stationary state based on the motion parameter.
Optionally, generating an image based on the rendering angle, and displaying the image and the photographed object on a display unit of the electronic device at the same time, includes:
generating an image corresponding to the subject based on the rendering angle and the subject;
synthesizing the image and the shot object to obtain a synthesized image;
displaying the composite image on the display unit.
Optionally, if the electronic device is in the relatively static state, determining a rendering angle includes:
calculating at least one corresponding rendering angle when the electronic equipment is in the relative static state;
obtaining an average rendering angle based on the at least one rendering angle; wherein the average rendering angle is the rendering angle; or
If the electronic device is in the relatively static state, determining a rendering angle, including:
calculating at least one corresponding rendering angle when the electronic equipment is in the relative static state;
determining a rendering angle with the highest occurrence frequency from the at least one rendering angle; and the rendering angle with the highest occurrence frequency is the rendering angle.
On the other hand, an embodiment of the present invention further provides an electronic device, including:
a sensor for detecting a spatial positional relationship between the electronic apparatus and a subject to be photographed;
the processor is connected with the sensor and used for determining whether the electronic equipment is in a relative static state or not when the spatial position relation meets a preset condition; if the electronic equipment is in the relative static state, determining a rendering angle; and generating an image based on the rendering angle;
and the display is connected with the processor and is used for displaying the image and the shot object on a display unit of the electronic equipment at the same time.
Optionally, the sensor is configured to:
detecting an angle between the electronic equipment and a plane where the shot object is located;
wherein when the angle satisfies a predetermined angle, it is determined whether the electronic device is in a relatively stationary state.
Optionally, the processor is configured to:
acquiring a motion parameter of the electronic equipment through a sensor of the electronic equipment, wherein the motion parameter at least comprises the angle;
determining whether the electronic device is in the relatively stationary state based on the motion parameter.
Optionally, the processor is configured to:
generating an image corresponding to the subject based on the rendering angle and the subject;
synthesizing the image and the shot object to obtain a synthesized image;
accordingly, the display is configured to:
displaying the composite image on the display unit.
Optionally, the processor is configured to:
calculating at least one corresponding rendering angle when the electronic equipment is in the relative static state;
obtaining an average rendering angle based on the at least one rendering angle; wherein the average rendering angle is the rendering angle; or
The processor is configured to:
calculating at least one corresponding rendering angle when the electronic equipment is in the relative static state;
determining a rendering angle with the highest occurrence frequency from the at least one rendering angle; and the rendering angle with the highest occurrence frequency is the rendering angle.
One or more technical solutions in the embodiments of the present invention at least have one or more of the following technical effects:
Firstly, according to the technical solution in the embodiments of the invention, the spatial position relationship between the electronic device and the photographed object is detected; when the spatial position relationship meets the predetermined condition, it is determined whether the electronic device is in a relatively stationary state; if so, a rendering angle is determined; and an image is generated based on the rendering angle and displayed together with the photographed object on a display unit of the electronic device. In other words, unlike the prior art, in which the rendering angle of the rendered model becomes disordered once the spatial position relationship satisfies the predetermined condition so that it cannot be determined which rendering angle to use, the present solution first determines that the electronic device is in a relatively stationary state when the spatial relationship meets the predetermined condition, and only then fixes a rendering angle so that the photographed object is rendered at that fixed angle, thereby avoiding disorder of the rendering angle. This effectively solves the prior-art problem of low accuracy in determining the rendering angle of a virtual object and achieves the technical effect of improving that accuracy.
Drawings
FIG. 1 is a diagram illustrating how a rendering angle becomes disordered in the prior art;
fig. 2 is a flowchart illustrating an implementation of an information processing method according to an embodiment of the present invention;
fig. 3 is a schematic structural diagram of an electronic device according to an embodiment of the present invention.
Detailed Description
The technical solution provided by the embodiments of the present invention is used to solve the prior-art problem of low accuracy in determining the rendering angle of a virtual object, so as to achieve the technical effect of improving that accuracy.
In order to solve the technical problems, the technical scheme in the embodiment of the invention has the following general idea:
detecting a spatial position relationship between the electronic equipment and a shot object;
when the spatial position relation meets a preset condition, determining whether the electronic equipment is in a relatively static state;
if the electronic equipment is in the relative static state, determining a rendering angle;
and generating an image based on the rendering angle, and displaying the image and the photographed object on a display unit of the electronic equipment at the same time.
In this technical solution, the spatial position relationship between the electronic device and the photographed object is detected; when the spatial position relationship meets the predetermined condition, it is determined whether the electronic device is in a relatively stationary state; if so, a rendering angle is determined; and an image is generated based on the rendering angle and displayed together with the photographed object on a display unit of the electronic device. In other words, unlike the prior art, in which the rendering angle of the rendered model becomes disordered once the spatial position relationship satisfies the predetermined condition so that it cannot be determined which rendering angle to use, this solution first determines that the electronic device is in a relatively stationary state when the spatial relationship meets the predetermined condition, and only then fixes a rendering angle so that the photographed object is rendered at that fixed angle, thereby avoiding disorder of the rendering angle. This effectively solves the prior-art problem of low accuracy in determining the rendering angle of a virtual object and achieves the technical effect of improving that accuracy.
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, but not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
First, a brief description is given of an application scenario of the embodiment of the present invention.
An AR device can render a designed virtual object, i.e. a three-dimensional model, in real time at a certain coordinate position of a real scene, thereby presenting a vivid visual and tactile environment to the user and realizing natural interaction between the user and the environment.
Rendering a three-dimensional model onto a two-dimensional picture requires calculating a 4 x 4 matrix: each vertex of the three-dimensional model is multiplied by the 4 x 4 matrix and thereby mapped to the point to be drawn on the two-dimensional picture. However, when the electronic device and the two-dimensional picture are parallel, a critical state is reached and the rendering angle becomes disordered, as shown in fig. 1. In fig. 1, the angle formed between the three-dimensional model and the two-dimensional picture is the rendering angle referred to in the following description.
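As a rough illustration of this projection step (a sketch of my own, not code from the patent; the matrix and vertex values are made up), a homogeneous vertex is multiplied by a 4 x 4 model-view-projection matrix and then divided by its w component to obtain the point to draw on the two-dimensional picture:

```python
import numpy as np

def project_vertex(mvp: np.ndarray, vertex_xyz: np.ndarray) -> np.ndarray:
    """Map one 3D model vertex to 2D picture coordinates via a 4x4 matrix."""
    v = np.append(vertex_xyz, 1.0)   # homogeneous coordinate (x, y, z, 1)
    clip = mvp @ v                   # multiply by the 4x4 matrix
    return clip[:2] / clip[3]        # perspective divide -> (x, y) on the picture

# Placeholder matrix; a real AR pipeline would use the tracked camera pose here.
mvp = np.eye(4)
print(project_vertex(mvp, np.array([0.5, 0.2, -1.0])))   # -> [0.5 0.2]
```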
In view of this, an embodiment of the present invention provides an information processing method in which, once the spatial position relationship is determined to satisfy the predetermined condition and the electronic device is determined to be in a relatively stationary state, a fixed rendering angle is determined. This avoids disorder of the rendering angle, improves the accuracy of determining the rendering angle, and preserves the rendering effect as far as possible.
Referring to fig. 2, an information processing method according to an embodiment of the present invention includes:
S201: detecting a spatial position relationship between the electronic equipment and a shot object;
S202: when the spatial position relation meets a preset condition, determining whether the electronic equipment is in a relatively static state;
S203: if the electronic equipment is in the relative static state, determining a rendering angle;
S204: and generating an image based on the rendering angle, and displaying the image and the photographed object on a display unit of the electronic equipment at the same time.
The information processing method provided by the embodiment of the present invention can be applied to an electronic device. The electronic device may be an AR device, including but not limited to an AR smart phone, an AR smart watch, or an AR tablet computer; an AR digital device such as an AR digital camera; a head-mounted AR device such as an AR helmet or AR glasses; or another type of electronic device, which is not limited by the embodiments of the present invention.
The photographed object in the embodiment of the present invention may be any object in a real scene, such as an office, a green plant, a person, or the contents of a book, as long as it can be captured by the image capture unit of the electronic device. The image capture unit may be a camera, including but not limited to a front camera, a rear camera, or another camera; the embodiments of the present invention are not limited in this respect.
The spatial position relationship between the electronic device and the photographed object may be, for example: parallel, coincident, perpendicular, at an angle (such as 60 degrees or 120 degrees), or another spatial relationship.
In the embodiment of the present invention, step S201 is first executed: a spatial position relationship between the electronic device and the photographed object is detected.
In the embodiment of the present invention, the detecting the spatial position relationship between the electronic device and the object includes, but is not limited to, the following two implementation manners, which are described below separately.
First one
Detecting an angle between the electronic equipment and a plane where the shot object is located;
wherein when the angle satisfies a predetermined angle, it is determined whether the electronic device is in a relatively stationary state.
In a specific implementation, a sensor, such as an angle sensor, may be arranged in the electronic device to determine the angle between the electronic device and the plane where the photographed object is located, and thereby determine the spatial position relationship between the electronic device and that plane.
In the embodiment of the invention, when the predetermined angle is 0 degrees, i.e. the electronic device is parallel to the plane of the photographed object, it is determined whether the electronic device is in a relatively stationary state.
Second kind
Detecting the distance between two ends of the electronic equipment and the plane where the shot object is located;
wherein when the distance satisfies a predetermined distance, it is determined whether the electronic device is in a relatively stationary state.
In a specific implementation, distance sensors are arranged at any two of the ends (for example, two of the four edges) of the electronic device, so that the distances from those two ends to the plane where the photographed object is located are measured by the distance sensors, and the spatial position relationship between the electronic device and that plane is determined from them.
In the embodiment of the invention, when the distances from the two ends of the electronic device to the plane where the photographed object is located are equal, it is determined whether the electronic device is in a relatively stationary state.
For the above two implementation manners, a person skilled in the art may select them according to actual needs, and the embodiments of the present invention are not limited thereto.
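A minimal sketch of the two checks, assuming an angle reading in degrees for the first option and two distance readings in metres for the second; the tolerance values are illustrative assumptions, not values given in the patent:

```python
def parallel_by_angle(angle_deg: float, tol_deg: float = 1.0) -> bool:
    """Option 1: the device counts as parallel when the measured angle to the
    plane of the photographed object is close enough to 0 degrees."""
    return abs(angle_deg) <= tol_deg

def parallel_by_distance(dist_a_m: float, dist_b_m: float, tol_m: float = 0.005) -> bool:
    """Option 2: the distances measured at two ends of the device to that plane
    are approximately equal."""
    return abs(dist_a_m - dist_b_m) <= tol_m

print(parallel_by_angle(0.4))             # True: roughly parallel
print(parallel_by_distance(0.30, 0.31))   # False: the ends differ by 1 cm
```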
After step S201 is executed, step S202 is executed: when the spatial position relationship meets the predetermined condition, it is determined whether the electronic device is in a relatively stationary state.
In the embodiment of the present invention, step S202 may be implemented as follows:
acquiring a motion parameter of the electronic equipment through a sensor of the electronic equipment, wherein the motion parameter at least comprises the angle;
determining whether the electronic device is in the relatively stationary state based on the motion parameter.
In embodiments of the present invention, sensors of the electronic device include, but are not limited to, a gyroscope, an angle sensor, a velocity sensor, a distance sensor, and so on; other sensors are not enumerated here.
(1) When the sensor is a gyroscope, the position and/or movement track of the electronic device is measured by the sensor. When the position and/or movement track of the electronic device does not change within a predetermined time period, such as 1 minute, 2 minutes, or another predetermined period, the electronic device is in a relatively stationary state.
(2) When the sensor is an angle sensor, the angle between the electronic device and the plane where the photographed object is located in the current state is measured by the sensor. When that angle does not change within the predetermined time period, the electronic device is in a relatively stationary state.
(3) When the sensor is a velocity sensor, the current movement speed of the electronic device is measured by the sensor. When the movement speed of the electronic device is 0 throughout the predetermined time period, the electronic device is in a relatively stationary state.
(4) When the sensor is a distance sensor, the distance between the electronic device and the plane where the photographed object is located is measured over the predetermined time period. When that distance does not change within the predetermined time period, the electronic device is in a relatively stationary state.
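For illustration only, the "unchanged over a predetermined period" test used in any of the cases above can be sketched as requiring a small spread across the samples collected in that period; this is my own sketch, and the tolerance value is an assumption, not a value from the patent:

```python
def relatively_stationary(samples, tol: float = 0.5) -> bool:
    """samples: readings of one motion parameter (e.g. the angle in degrees)
    collected over the predetermined period; 'no change' is approximated by
    a spread no larger than tol."""
    return bool(samples) and (max(samples) - min(samples)) <= tol

print(relatively_stationary([89.8, 89.9, 90.0, 89.9]))   # True: effectively still
print(relatively_stationary([60.0, 75.0, 90.0]))         # False: the device is moving
```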
In the embodiment of the present invention, besides determining whether the electronic device is in the relatively stationary state from the motion parameter detected by a sensor, a specific implementation may detect, via a pressure sensor of the electronic device, a pressing operation by which the user indicates that the electronic device is currently in the relatively stationary state; or detect, via a voice acquisition sensor of the electronic device, a voice input by which the user indicates that the electronic device is currently in the relatively stationary state; or determine whether the electronic device is in a relatively stationary state by other means. A person skilled in the art may choose according to actual needs, and the embodiments of the present invention are not limited in this respect.
After it is determined that the electronic device is in the relatively stationary state, step S203 is executed: if the electronic device is in the relatively stationary state, a rendering angle is determined.
In the embodiment of the present invention, the manner of determining the rendering angle includes, but is not limited to, the following two implementations, which are described below.
The first method comprises the following steps:
calculating at least one corresponding rendering angle when the electronic equipment is in the relative static state;
obtaining an average rendering angle based on the at least one rendering angle; wherein the average rendering angle is the rendering angle.
In a specific implementation, when the spatial position relationship between the electronic device and the photographed object satisfies the predetermined condition and the electronic device is in the relatively stationary state, rendering angles are obtained through continuous calculation, for example: 89 degrees, 90 degrees, 91 degrees, 88 degrees. The average of these rendering angles is then taken as the fixed rendering angle, i.e., the determined rendering angle is 89.5 degrees.
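A one-line sketch of this averaging strategy with the sampled angles above (my own illustration, not code from the patent):

```python
angles = [89.0, 90.0, 91.0, 88.0]        # rendering angles sampled while stationary
fixed_angle = sum(angles) / len(angles)  # 89.5 degrees, used as the fixed rendering angle
print(fixed_angle)
```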
The second method comprises the following steps:
calculating at least one corresponding rendering angle when the electronic equipment is in the relative static state;
determining a rendering angle with the highest occurrence frequency from the at least one rendering angle; and the rendering angle with the highest occurrence frequency is the rendering angle.
In a specific implementation, when the spatial position relationship between the electronic device and the photographed object satisfies the predetermined condition and the electronic device is in the relatively stationary state, rendering angles are obtained through continuous calculation, for example: 89 degrees, 90 degrees, 91 degrees, 91 degrees, 88 degrees. The angle that occurs most frequently among the calculated rendering angles is then taken as the fixed rendering angle, i.e., in this example the determined rendering angle is 91 degrees.
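And a sketch of the most-frequent-angle strategy; the sample list is illustrative (it repeats 91 degrees so that a single most frequent value exists) and is my own example, not data from the patent:

```python
from collections import Counter

angles = [89, 90, 91, 91, 88]                      # sampled rendering angles (degrees)
fixed_angle, _ = Counter(angles).most_common(1)[0]  # value with the highest count
print(fixed_angle)                                  # 91: the most frequently occurring sample
```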
In the embodiment of the present invention, besides the two methods above, the rendering angle may also be determined as follows:
acquiring a historical rendering angle recorded by the electronic equipment;
determining the rendering angle based on the historical rendering angle.
In a specific implementation, when the spatial position relationship between the electronic device and the photographed object meets the predetermined condition and the electronic device is in the relatively stationary state, a historical rendering angle recorded by the electronic device is obtained, such as 89 degrees, 90 degrees, or 91 degrees. In the embodiment of the present invention, the historical rendering angle may be stored in the electronic device, or in another device such as a USB flash drive or a removable hard disk, and obtained from that device when needed.
After the historical rendering angles are obtained, any one of the historical rendering angles can be used as the rendering angle.
In the embodiment of the present invention, after determining the rendering angle, step S204 is executed: and generating an image based on the rendering angle, and displaying the image and the photographed object on a display unit of the electronic equipment at the same time.
As for the specific implementation of step S204, it includes the following steps:
generating an image corresponding to the subject based on the rendering angle and the subject;
synthesizing the image and the shot object to obtain a synthesized image;
displaying the composite image on the display unit.
In embodiments of the present invention, the image may be a virtual three-dimensional model. The image may be generated based on the rendering angle after the rendering angle is determined; alternatively, it may be obtained by adjusting an already generated image based on the rendering angle after the rendering angle is determined. In the embodiment of the invention, the generated image can be stored in the electronic device, or in another electronic device that can be connected to it and acquired from that device when needed, thereby saving storage space on the electronic device.
In the embodiment of the present invention, the image corresponds to the photographed object: specifically, the two may correspond one to one, may be of the same type, the image may be derived from the object, or there may be another corresponding relationship; the embodiment of the present invention is not limited in this respect.
In the embodiment of the present invention, when the image is obtained by adjusting a generated image based on the rendering angle, the generated image and the photographed object have a one-to-one correspondence. For example, after the photographed object is acquired by the image capture unit, feature extraction and analysis are performed on it, and an image corresponding to the photographed object is obtained according to the correspondence between the features of the object and the features of the image. In a specific implementation, if the photographed object is a blooming flower, the generated image may be a small animal such as a butterfly or a bee; if the photographed object is the text "Xiaoming takes a walk along the river with his mom", the generated image may be a flowing river; other cases are not enumerated here.
In the embodiment of the present invention, after an image corresponding to a subject is generated, the image and the subject are synthesized and displayed on the display unit of the electronic device.
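The synthesis of the generated image with the captured picture can be illustrated as a plain alpha blend; this is only a sketch under the assumption that the rendered virtual object is available as an RGBA image aligned with the camera frame, and it does not claim to be how the patent implements the step:

```python
import numpy as np

def composite(frame_rgb: np.ndarray, rendered_rgba: np.ndarray) -> np.ndarray:
    """Blend a rendered RGBA virtual object over the captured RGB frame."""
    alpha = rendered_rgba[..., 3:4].astype(np.float32) / 255.0
    fg = rendered_rgba[..., :3].astype(np.float32)
    bg = frame_rgb.astype(np.float32)
    return (alpha * fg + (1.0 - alpha) * bg).astype(np.uint8)

# Tiny dummy frame and overlay, just to show the call and the result shape.
frame = np.zeros((4, 4, 3), dtype=np.uint8)
overlay = np.zeros((4, 4, 4), dtype=np.uint8)
overlay[..., 3] = 128                      # a half-transparent virtual object
print(composite(frame, overlay).shape)     # (4, 4, 3): the composite image to display
```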
On the other hand, referring to fig. 3, the present invention further provides an electronic device, including:
a sensor 301 for detecting a spatial positional relationship between the electronic apparatus and a subject;
a processor 302, connected to the sensor, for determining whether the electronic device is in a relatively stationary state when the spatial position relationship satisfies a predetermined condition; if the electronic equipment is in the relative static state, determining a rendering angle; and generating an image based on the rendering angle;
and a display 303, connected to the processor, for displaying the image and the subject on a display unit of the electronic device at the same time.
In the embodiment of the present invention, the sensor 301 may be an angle sensor, a direction sensor, a gyroscope, a speed sensor, or an acceleration sensor, or another type of sensor, which is not limited in the embodiment of the present invention.
In embodiments of the invention, the processor 302 may be a central processing unit (CPU), an application-specific integrated circuit (ASIC), one or more integrated circuits for controlling program execution, a baseband chip, or the like.
In an embodiment of the present invention, the electronic device may further include one or more memories connected to the processor 302. A memory may be a read-only memory (ROM), a random access memory (RAM), a magnetic disk memory, or the like.
By programming the processor, the code corresponding to the information processing method is burned into the chip, so that the chip can execute the information processing method provided by the embodiment shown in fig. 2 at run time; how to program the processor 302 is a technique known to those skilled in the art and is not described again here.
Optionally, the sensor 301 is configured to:
detecting an angle between the electronic equipment and a plane where the shot object is located;
wherein when the angle satisfies a predetermined angle, it is determined whether the electronic device is in a relatively stationary state.
Optionally, the processor 302 is configured to:
acquiring a motion parameter of the electronic equipment through a sensor of the electronic equipment, wherein the motion parameter at least comprises the angle;
determining whether the electronic device is in the relatively stationary state based on the motion parameter.
Optionally, the processor 302 is configured to:
generating an image corresponding to the subject based on the rendering angle and the subject;
synthesizing the image and the shot object to obtain a synthesized image;
accordingly, the display 303 is configured to:
displaying the composite image on the display unit.
Optionally, the processor 302 is configured to:
calculating at least one corresponding rendering angle when the electronic equipment is in the relative static state;
obtaining an average rendering angle based on the at least one rendering angle; wherein the average rendering angle is the rendering angle; or
The processor 302 is configured to:
calculating at least one corresponding rendering angle when the electronic equipment is in the relative static state;
determining a rendering angle with the highest occurrence frequency from the at least one rendering angle; and the rendering angle with the highest occurrence frequency is the rendering angle.
One or more technical solutions in the embodiments of the present invention at least have one or more of the following technical effects:
Firstly, according to the technical solution in the embodiments of the invention, the spatial position relationship between the electronic device and the photographed object is detected; when the spatial position relationship meets the predetermined condition, it is determined whether the electronic device is in a relatively stationary state; if so, a rendering angle is determined; and an image is generated based on the rendering angle and displayed together with the photographed object on a display unit of the electronic device. In other words, unlike the prior art, in which the rendering angle of the rendered model becomes disordered once the spatial position relationship satisfies the predetermined condition so that it cannot be determined which rendering angle to use, the present solution first determines that the electronic device is in a relatively stationary state when the spatial relationship meets the predetermined condition, and only then fixes a rendering angle so that the photographed object is rendered at that fixed angle, thereby avoiding disorder of the rendering angle. This effectively solves the prior-art problem of low accuracy in determining the rendering angle of a virtual object and achieves the technical effect of improving that accuracy.
The present invention is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
Specifically, the computer program instructions corresponding to the information processing method in the embodiments of the present application may be stored on a storage medium such as an optical disc, a hard disk, or a USB disk. When the computer program instructions corresponding to the information processing method stored in the storage medium are read or executed by an electronic device, the following steps are included:
detecting a spatial position relationship between the electronic equipment and a shot object;
when the spatial position relation meets a preset condition, determining whether the electronic equipment is in a relatively static state;
if the electronic equipment is in the relative static state, determining a rendering angle;
and generating an image based on the rendering angle, and displaying the image and the photographed object on a display unit of the electronic equipment at the same time.
Optionally, for the step stored in the storage medium of detecting the spatial position relationship between the electronic device and the photographed object, the corresponding computer instructions, when specifically executed, include:
detecting an angle between the electronic equipment and a plane where the shot object is located;
wherein when the angle satisfies a predetermined angle, it is determined whether the electronic device is in a relatively stationary state.
Optionally, for the step stored in the storage medium of determining whether the electronic device is in a relatively stationary state, the corresponding computer instructions, when specifically executed, include:
acquiring a motion parameter of the electronic equipment through a sensor of the electronic equipment, wherein the motion parameter at least comprises the angle;
determining whether the electronic device is in the relatively stationary state based on the motion parameter.
Optionally, for the step stored in the storage medium of generating an image based on the rendering angle and displaying the image together with the photographed object on a display unit of the electronic device, the corresponding computer instructions, when specifically executed, include:
generating an image corresponding to the subject based on the rendering angle and the subject;
synthesizing the image and the shot object to obtain a synthesized image;
displaying the composite image on the display unit.
Optionally, for the step stored in the storage medium of determining a rendering angle if the electronic device is in the relatively stationary state, the corresponding computer instructions, when specifically executed, include:
calculating at least one corresponding rendering angle when the electronic equipment is in the relative static state;
obtaining an average rendering angle based on the at least one rendering angle; wherein the average rendering angle is the rendering angle; or
If the electronic device is in the relatively static state, determining a rendering angle, including:
calculating at least one corresponding rendering angle when the electronic equipment is in the relative static state;
determining a rendering angle with the highest occurrence frequency from the at least one rendering angle; and the rendering angle with the highest occurrence frequency is the rendering angle.
The above embodiments are only intended to describe the technical solutions of the present application in detail and to help understand the method and core idea of the present invention; they should not be construed as limiting the present invention. Those skilled in the art can easily conceive of various changes and substitutions within the technical scope of the present disclosure.

Claims (10)

1. An information processing method comprising:
detecting a spatial position relationship between the electronic equipment and a shot object;
when the spatial position relation meets a preset condition, determining whether the electronic equipment is in a relatively static state, wherein the rendering angle for rendering the shot object under the preset condition is disordered;
if the electronic equipment is in the relative static state, determining a fixed rendering angle, wherein the rendering angle is an angle formed between a three-dimensional model and a two-dimensional picture during augmented reality processing;
and generating an image based on the rendering angle, and displaying the image and the photographed object on a display unit of the electronic equipment at the same time.
2. The method of claim 1, wherein detecting a spatial positional relationship between the electronic device and the subject comprises:
detecting an angle between the electronic equipment and a plane where the shot object is located;
wherein when the angle satisfies a predetermined angle, it is determined whether the electronic device is in a relatively stationary state.
3. The method of claim 2, wherein determining whether the electronic device is in a relatively stationary state comprises:
acquiring a motion parameter of the electronic equipment through a sensor of the electronic equipment, wherein the motion parameter at least comprises the angle;
determining whether the electronic device is in the relatively stationary state based on the motion parameter.
4. The method of claim 3, wherein generating an image based on the rendering angle, the image being displayed on a display unit of the electronic device simultaneously with the subject, comprises:
generating an image corresponding to the subject based on the rendering angle and the subject;
synthesizing the image and the shot object to obtain a synthesized image;
displaying the composite image on the display unit.
5. The method of claim 1, wherein determining a rendering angle if the electronic device is in the relatively stationary state comprises:
calculating at least one corresponding rendering angle when the electronic equipment is in the relative static state;
obtaining an average rendering angle based on the at least one rendering angle; wherein the average rendering angle is the rendering angle; or
If the electronic device is in the relatively static state, determining a rendering angle, including:
calculating at least one corresponding rendering angle when the electronic equipment is in the relative static state;
determining a rendering angle with the highest occurrence frequency from the at least one rendering angle; and the rendering angle with the highest occurrence frequency is the rendering angle.
6. An electronic device, comprising:
a sensor for detecting a spatial positional relationship between the electronic apparatus and a subject to be photographed;
the processor is connected with the sensor and used for determining whether the electronic equipment is in a relative static state or not when the spatial position relation meets a preset condition, wherein the rendering angle for rendering the shot object under the preset condition is disordered; if the electronic equipment is in the relative static state, determining a fixed rendering angle, wherein the rendering angle is an angle formed between a three-dimensional model and a two-dimensional picture during augmented reality processing; and generating an image based on the rendering angle;
and the display is connected with the processor and is used for displaying the image and the shot object on a display unit of the electronic equipment at the same time.
7. The electronic device of claim 6, wherein the sensor is to:
detecting an angle between the electronic equipment and a plane where the shot object is located;
wherein when the angle satisfies a predetermined angle, it is determined whether the electronic device is in a relatively stationary state.
8. The electronic device of claim 7, wherein the processor is to:
acquiring a motion parameter of the electronic equipment through a sensor of the electronic equipment, wherein the motion parameter at least comprises the angle;
determining whether the electronic device is in the relatively stationary state based on the motion parameter.
9. The electronic device of claim 8, wherein the processor is to:
generating an image corresponding to the subject based on the rendering angle and the subject;
synthesizing the image and the shot object to obtain a synthesized image;
accordingly, the display is configured to:
displaying the composite image on the display unit.
10. The electronic device of claim 6, wherein the processor is to:
calculating at least one corresponding rendering angle when the electronic equipment is in the relative static state;
obtaining an average rendering angle based on the at least one rendering angle; wherein the average rendering angle is the rendering angle; or
The processor is configured to:
calculating at least one corresponding rendering angle when the electronic equipment is in the relative static state;
determining a rendering angle with the highest occurrence frequency from the at least one rendering angle; and the rendering angle with the highest occurrence frequency is the rendering angle.
CN201710189056.XA 2017-03-27 2017-03-27 Information processing method and electronic equipment Active CN107038746B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710189056.XA CN107038746B (en) 2017-03-27 2017-03-27 Information processing method and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710189056.XA CN107038746B (en) 2017-03-27 2017-03-27 Information processing method and electronic equipment

Publications (2)

Publication Number Publication Date
CN107038746A CN107038746A (en) 2017-08-11
CN107038746B true CN107038746B (en) 2019-12-24

Family

ID=59533641

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710189056.XA Active CN107038746B (en) 2017-03-27 2017-03-27 Information processing method and electronic equipment

Country Status (1)

Country Link
CN (1) CN107038746B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109615703B (en) * 2018-09-28 2020-04-14 阿里巴巴集团控股有限公司 Augmented reality image display method, device and equipment


Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102193625A (en) * 2010-02-24 2011-09-21 Sony Corp Image processing apparatus, image processing method, program, and image processing system
CN102200881A (en) * 2010-03-24 2011-09-28 Sony Corp Image processing apparatus, image processing method and program
CN105339865A (en) * 2013-04-04 2016-02-17 Sony Corp Display control device, display control method and program
CN106030484A (en) * 2014-02-27 2016-10-12 Samsung Electronics Co Ltd Method and device for displaying three-dimensional graphical user interface screen

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Implementation of a multi-marker augmented reality system based on FLARToolkit; 付文秀, 张馨则; Journal of Beijing Jiaotong University (《北京交通大学学报》); October 2016; pp. 16-22 *

Also Published As

Publication number Publication date
CN107038746A (en) 2017-08-11

Similar Documents

Publication Publication Date Title
CN106355153B (en) A kind of virtual objects display methods, device and system based on augmented reality
US11024014B2 (en) Sharp text rendering with reprojection
US9412205B2 (en) Extracting sensor data for augmented reality content
CN112797897B (en) Method and device for measuring geometric parameters of object and terminal
US8878846B1 (en) Superimposing virtual views of 3D objects with live images
CN109743892B (en) Virtual reality content display method and device
CN109064390B (en) Image processing method, image processing device and mobile terminal
US9514574B2 (en) System and method for determining the extent of a plane in an augmented reality environment
CN107291222B (en) Interactive processing method, device and system of virtual reality equipment and virtual reality equipment
CN111242908A (en) Plane detection method and device and plane tracking method and device
US11373329B2 (en) Method of generating 3-dimensional model data
WO2016193537A1 (en) Mediated reality
CN108028904B (en) Method and system for light field augmented reality/virtual reality on mobile devices
TW201810217A (en) Article image display method, apparatus and system
CN114531553B (en) Method, device, electronic equipment and storage medium for generating special effect video
CN110458954B (en) Contour line generation method, device and equipment
CN109978945B (en) Augmented reality information processing method and device
CN107038746B (en) Information processing method and electronic equipment
CN113178017A (en) AR data display method and device, electronic equipment and storage medium
CN110868581A (en) Image display method, device and system
CN112073632A (en) Image processing method, apparatus and storage medium
CN111710044A (en) Image processing method, apparatus and computer-readable storage medium
CN114862997A (en) Image rendering method and apparatus, medium, and computer device
CN115690363A (en) Virtual object display method and device and head-mounted display device
CN112825198B (en) Mobile tag display method, device, terminal equipment and readable storage medium

Legal Events

Date Code Title Description
PB01 — Publication
SE01 — Entry into force of request for substantive examination
GR01 — Patent grant