CN107657663B - Method and device for displaying information - Google Patents

Method and device for displaying information

Info

Publication number
CN107657663B
CN107657663B (application number CN201710867785.6A)
Authority
CN
China
Prior art keywords
virtual
space
dimensional object
focal length
length data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201710867785.6A
Other languages
Chinese (zh)
Other versions
CN107657663A (en)
Inventor
赵晨 (Zhao Chen)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Baidu Netcom Science and Technology Co Ltd
Original Assignee
Beijing Baidu Netcom Science and Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Baidu Netcom Science and Technology Co Ltd filed Critical Beijing Baidu Netcom Science and Technology Co Ltd
Priority to CN201710867785.6A priority Critical patent/CN107657663B/en
Publication of CN107657663A publication Critical patent/CN107657663A/en
Application granted granted Critical
Publication of CN107657663B publication Critical patent/CN107657663B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00 Manipulating 3D models or images for computer graphics
    • G06T19/006 Mixed reality
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01M TESTING STATIC OR DYNAMIC BALANCE OF MACHINES OR STRUCTURES; TESTING OF STRUCTURES OR APPARATUS, NOT OTHERWISE PROVIDED FOR
    • G01M11/00 Testing of optical apparatus; Testing structures by optical methods not otherwise provided for
    • G01M11/02 Testing optical properties
    • G01M11/0221 Testing optical properties by determining the optical axis or position of lenses

Abstract

The embodiments of the present application disclose a method and an apparatus for displaying information. In one embodiment, the method comprises: monitoring focal length data of a lens; determining, from the focal length data, a placement position for a virtual three-dimensional object in a virtual space, wherein the virtual space is stored on an intelligent terminal and contains a virtual three-dimensional object of a preset size; and, after the virtual three-dimensional object is set at the placement position, displaying it on a display screen of the intelligent terminal. This implementation improves the accuracy of the placement position of the virtual object in the virtual space and, in turn, the precision with which the virtual object is displayed on the intelligent terminal.

Description

Method and device for displaying information
Technical Field
The present application relates to the field of data processing technologies, and in particular, to the field of visual display technologies, and in particular, to a method and an apparatus for displaying information.
Background
In augmented reality, simultaneous localization and mapping (SLAM) uses arbitrary objects present in the physical space as reference objects to compute the position and orientation of the current intelligent terminal (such as a mobile phone or a tablet computer) in that space and to drive a virtual camera in the rendering engine, thereby achieving the fusion of physical space and virtual space that characterizes augmented reality. Because it removes the need for a specific trigger (e.g., a specific printed pattern), SLAM greatly expands the range of applications of augmented reality technology.
Existing simultaneous localization and mapping techniques have shortcomings, however. The size of a virtual three-dimensional object can be estimated only by using a specific object of preset size in the physical space as a reference, or through scale estimation with dedicated augmented reality hardware, such as multi-camera systems that most intelligent terminals do not provide. Consequently, the virtual three-dimensional object in the virtual space cannot be displayed realistically, and its display accuracy on the intelligent terminal is low.
Disclosure of Invention
An object of the embodiments of the present application is to provide a method and an apparatus for displaying information, so as to solve the technical problems mentioned in the above background.
In a first aspect, an embodiment of the present application provides a method for displaying information, where the method includes: monitoring focal length data of the lens; determining the placement position of a virtual three-dimensional object in a virtual space through focal length data, wherein the virtual space is stored on an intelligent terminal and contains the virtual three-dimensional object with a preset size; and after the virtual three-dimensional object is arranged at the placing position, the virtual three-dimensional object is displayed on a display screen of the intelligent terminal.
In some embodiments, the determining the placement position of the virtual three-dimensional object in the virtual space through the focal distance data includes: determining the focal position of a focal point corresponding to the focal length data in a virtual space; acquiring a plurality of spatial focal length data within a set range of the focal position, and constructing a virtual object placing space through the spatial focal length data, wherein the virtual object placing space is used for placing a virtual three-dimensional object; and setting a placing position of the virtual three-dimensional object in the virtual object placing space based on the preset size of the virtual three-dimensional object.
In some embodiments, the constructing the virtual object placement space by the spatial focal distance data includes: and calculating distance information and angle information between the space focus position corresponding to each space focal length data and the lens to obtain a space position relation between the space focus positions, and constructing a virtual object placing space based on the space position relation.
In some embodiments, after the setting of the virtual three-dimensional object at the placement position, displaying the virtual three-dimensional object on a display screen of the intelligent terminal includes: monitoring the display range of the display screen for displaying the virtual space; and determining the display size of the virtual three-dimensional object in the display screen according to the display range and the preset size of the virtual three-dimensional object.
In a second aspect, an embodiment of the present application provides an apparatus for displaying information, where the apparatus includes: the focal length data monitoring unit is used for monitoring the focal length data of the lens; the device comprises a placement position determining unit, a storage unit and a control unit, wherein the placement position determining unit is used for determining the placement position of a virtual three-dimensional object in a virtual space through focal length data, and the virtual space is stored on an intelligent terminal and contains the virtual three-dimensional object with a preset size; and the display unit is used for displaying the virtual three-dimensional object on a display screen of the intelligent terminal after the virtual three-dimensional object is arranged at the placing position.
In some embodiments, the placement position determining unit includes: the focal position determining subunit is used for determining the focal position of a focal point corresponding to the focal length data in the virtual space; the virtual object placing space constructing subunit is used for acquiring a plurality of space focal length data within the set range of the focal position, and constructing a virtual object placing space through the space focal length data, wherein the virtual object placing space is used for placing a virtual three-dimensional object; and the placing position determining subunit is used for setting the placing position of the virtual three-dimensional object in the virtual object placing space based on the preset size of the virtual three-dimensional object.
In some embodiments, the virtual object placement space constructing subunit includes: and calculating distance information and angle information between the space focus position corresponding to each space focal length data and the lens to obtain a space position relation between the space focus positions, and constructing a virtual object placing space based on the space position relation.
In some embodiments, the display unit includes: the display range monitoring subunit is used for monitoring the display range of the display screen for displaying the virtual space; and the display size determining subunit is used for determining the display size of the virtual three-dimensional object in the display screen according to the display range and the preset size of the virtual three-dimensional object.
In a third aspect, an embodiment of the present application provides an electronic device, including: one or more processors; the lens is used for measuring focal length data; a memory for storing one or more programs that, when executed by the one or more processors, cause the one or more processors to perform the method for displaying information of the first aspect.
In a fourth aspect, the present application provides a computer-readable storage medium, on which a computer program is stored, where the computer program is executed by a processor to implement the method for displaying information of the first aspect.
According to the method and the device for displaying information, the placing position of the virtual object in the virtual space is determined through the focal length data, so that the accuracy of the placing position of the virtual object in the virtual space is improved; after the virtual object is arranged at the placing position, the virtual object is displayed on a display screen of the intelligent terminal, and the display precision of the virtual object on the intelligent terminal is improved.
Drawings
Other features, objects and advantages of the present application will become more apparent upon reading of the following detailed description of non-limiting embodiments thereof, made with reference to the accompanying drawings in which:
FIG. 1 is an exemplary system architecture diagram in which the present application may be applied;
FIG. 2 is a flow diagram of one embodiment of a method for displaying information according to the present application;
FIG. 3 is a schematic illustration of an application scenario of a method for displaying information according to the present application;
FIG. 4 is a schematic block diagram of one embodiment of an apparatus for displaying information according to the present application;
fig. 5 is a schematic structural diagram of a computer system suitable for implementing a terminal device according to an embodiment of the present application.
Detailed Description
The present application will be described in further detail with reference to the following drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the relevant invention and not restrictive of the invention. It should be noted that, for convenience of description, only the portions related to the related invention are shown in the drawings.
It should be noted that the embodiments and features of the embodiments in the present application may be combined with each other without conflict. The present application will be described in detail below with reference to the embodiments with reference to the attached drawings.
Fig. 1 shows an exemplary system architecture 100 to which embodiments of the method for displaying information or the apparatus for displaying information of the present application may be applied.
As shown in fig. 1, the system architecture 100 may include terminal devices 101, 102, 103 and a virtual three-dimensional object 104. The virtual three-dimensional object 104 may be set within a virtual space constructed inside the terminal devices 101, 102, 103, and may take various forms and sizes, such as a virtual cube with equal side lengths, a virtual chair, and so on.
The user can select a designated placement position within the virtual space formed by the terminal devices 101, 102, 103, and after setting the virtual three-dimensional object 104 at the placement position, the virtual three-dimensional object 104 is displayed on the display screens of the terminal devices 101, 102, 103. Specifically, lenses of the terminal devices 101, 102, and 103 may acquire focal length data; the terminal devices 101, 102, 103 determine the placement position of the virtual three-dimensional object in the virtual space by the focal distance data, and then display the virtual three-dimensional object on the display screen according to the focal distance data. The terminal devices 101, 102, 103 may be various electronic devices with lenses and supporting augmented reality technology, including but not limited to smart phones, tablets, laptop and desktop computers, and the like.
It should be noted that the method for displaying information provided in the embodiments of the present application is generally executed by the terminal devices 101, 102, and 103, and accordingly, the apparatus for displaying information is generally disposed in the terminal devices 101, 102, and 103.
It should be understood that the number of terminal devices, networks, and servers in fig. 1 is merely illustrative. There may be any number of terminal devices, networks, and servers, as desired for implementation.
With continued reference to FIG. 2, a flow 200 of one embodiment of a method for displaying information in accordance with the present application is shown. The method for displaying information comprises the following steps:
Step 201: monitoring the focal length data of the lens.
In this embodiment, an electronic device (e.g., the smart terminal shown in fig. 1) on which the method for displaying information operates may monitor the focal length data of the lens in real time.
To set a virtual three-dimensional object in the virtual space, this embodiment monitors the focal length data of the lens of the intelligent terminal. In general, such a lens can adjust its focal length automatically according to the position of the imaged object. When a point of the lens-captured image is tapped on the display screen of the intelligent terminal, the lens normally refocuses automatically, taking that point as the focal point, and captures a sharp image. Once the lens captures a sharp image of the imaged object, the focal length data at that moment characterizes the distance between the imaged object and the lens. The distance between the imaged object and the lens (i.e., the depth of field) can therefore be obtained by monitoring the focal length data of the lens.
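How focus state encodes object distance can be illustrated with the thin-lens equation, 1/f = 1/d_o + 1/d_i. The sketch below is a hypothetical illustration under that model; the embodiment does not prescribe a specific optics formula, and `focal_length_mm` and `image_distance_mm` are assumed inputs that a camera API would report.

```python
def object_distance_mm(focal_length_mm: float, image_distance_mm: float) -> float:
    """Estimate the lens-to-object distance from the thin-lens equation
    1/f = 1/d_o + 1/d_i, solving for the object distance d_o (all in mm)."""
    if image_distance_mm <= focal_length_mm:
        raise ValueError("a focused real image requires d_i > f")
    return 1.0 / (1.0 / focal_length_mm - 1.0 / image_distance_mm)

# A 50 mm lens focused with the image plane 51 mm behind it implies an
# object roughly 2.55 m away:
depth = object_distance_mm(50.0, 51.0)  # 2550.0 mm
```

Real autofocus stacks usually report focus distance more directly (e.g., in diopters), in which case this conversion would not be needed.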
Step 202: determining the placement position of the virtual three-dimensional object in the virtual space from the focal length data.
After the focal length data is acquired, the distance between the imaged object and the lens can be determined, and on this basis the placement position of the virtual three-dimensional object in the virtual space is determined. The virtual space is stored on the intelligent terminal and contains a virtual three-dimensional object of a preset size.
In some optional implementations of the present embodiment, the determining a placement position of the virtual three-dimensional object in the virtual space by the focal distance data may include the following steps:
the first step is to determine the focal position of the focal point corresponding to the focal length data in the virtual space.
That focal length data has been acquired indicates that the lens can already measure the distance, in the physical space, between itself and the focal point to which the data corresponds. In augmented reality, the virtual space and the physical space are in correspondence, so the focal position of the focal point in the virtual space can be determined from the focal point that the focal length data identifies in the physical space.
And secondly, acquiring a plurality of spatial focal length data within the set range of the focal position, and constructing a virtual object placing space according to the spatial focal length data.
The focal position of the focal point corresponds to only a single point in the virtual space. To place a virtual three-dimensional object of a preset size, a region of virtual space must first be constructed around the focal position, into which the object is then placed. To this end, a plurality of spatial focal length data are acquired within a set range of the focal position: the lens acquires a plurality of additional focal points within that range, for example by letting the user select them in the lens-captured image shown on the intelligent terminal, or by configuring the lens to obtain them automatically around the focal position. Because the physical space and the virtual space correspond, the focal length data between the lens and these additional focal points yields the spatial focal length data between the lens and the corresponding points in the virtual space. The additional focal points generally lie on one or more planes in the physical space, so at least one plane containing the points corresponding to the spatial focal length data can be determined in the virtual space, for example by plane fitting. A virtual object placement space, used for placing the virtual three-dimensional object, is then constructed from these planes.
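The plane-fitting step mentioned above can be sketched as an ordinary least-squares fit; this is one standard way to realize it, not necessarily the method the embodiment intends, and the sample points are assumed to already be Cartesian coordinates in the virtual space.

```python
def fit_plane(points):
    """Least-squares fit of the plane z = a*x + b*y + c to N >= 3 points
    [(x, y, z), ...] by solving the 3x3 normal equations directly.
    Returns the coefficients (a, b, c)."""
    sxx = sxy = syy = sx = sy = 0.0
    sxz = syz = sz = 0.0
    n = float(len(points))
    for x, y, z in points:
        sxx += x * x; sxy += x * y; syy += y * y
        sx += x; sy += y
        sxz += x * z; syz += y * z; sz += z
    # Normal equations A @ (a, b, c) = r
    A = [[sxx, sxy, sx], [sxy, syy, sy], [sx, sy, n]]
    r = [sxz, syz, sz]
    # Gaussian elimination with partial pivoting.
    for col in range(3):
        piv = max(range(col, 3), key=lambda i: abs(A[i][col]))
        A[col], A[piv] = A[piv], A[col]
        r[col], r[piv] = r[piv], r[col]
        for i in range(col + 1, 3):
            f = A[i][col] / A[col][col]
            for j in range(col, 3):
                A[i][j] -= f * A[col][j]
            r[i] -= f * r[col]
    coeffs = [0.0, 0.0, 0.0]
    for i in (2, 1, 0):
        coeffs[i] = (r[i] - sum(A[i][j] * coeffs[j] for j in range(i + 1, 3))) / A[i][i]
    return tuple(coeffs)

# Four focus samples lying on the tilted plane z = 0.5*x + 1.0:
a, b, c = fit_plane([(0.0, 0.0, 1.0), (1.0, 0.0, 1.5),
                     (0.0, 1.0, 1.0), (2.0, 1.0, 2.0)])  # a ~ 0.5, b ~ 0.0, c ~ 1.0
```

This z = f(x, y) parameterization assumes the fitted surface is not near-vertical; a production implementation would typically use an SVD-based total least-squares fit instead.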
And thirdly, setting a placing position of the virtual three-dimensional object in the virtual object placing space based on the preset size of the virtual three-dimensional object.
After the virtual object placement space is constructed, a space position can be selected in the virtual object placement space to place a virtual three-dimensional object.
In some optional implementations of this embodiment, the constructing the virtual object placement space by using the spatial focal length data may include: and calculating distance information and angle information between the space focus position corresponding to each space focal length data and the lens to obtain a space position relation between the space focus positions, and constructing a virtual object placing space based on the space position relation.
When constructing the virtual object placement space, it is necessary to first determine faces in a plurality of virtual spaces. The plurality of spatial focal length data are acquired within the set range of the focal position, and under the condition that the position of the lens in the physical space is not changed, distance information and angle information between the spatial focal position corresponding to each spatial focal length data and the lens can be calculated, so that the spatial position relation among the plurality of spatial focal positions is obtained; then, on the basis of the spatial position relationship, at least one virtual space surface corresponding to the spatial focus position can be determined by methods such as plane fitting and the like; finally, a virtual object placement space is constructed from the at least one face.
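Turning each focus sample's distance and angle information into a point in space can be sketched as a spherical-to-Cartesian conversion in a camera-centred frame. The axis convention below (z along the optical axis, angles in radians) is an assumption, since the embodiment does not fix one.

```python
import math

def focus_sample_to_point(distance: float, azimuth: float, elevation: float):
    """Convert a focus sample, given as the range from the lens plus a
    horizontal (azimuth) and vertical (elevation) angle in radians, to
    Cartesian coordinates with z pointing along the optical axis."""
    x = distance * math.cos(elevation) * math.sin(azimuth)
    y = distance * math.sin(elevation)
    z = distance * math.cos(elevation) * math.cos(azimuth)
    return (x, y, z)

# A point 2 m straight ahead of the lens:
p = focus_sample_to_point(2.0, 0.0, 0.0)  # (0.0, 0.0, 2.0)
```

With all samples expressed in this common frame, their pairwise spatial relationships follow directly, and the points can be fed to a plane-fitting routine.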
Step 203: after the virtual three-dimensional object is set at the placement position, displaying it on a display screen of the intelligent terminal.
After the virtual three-dimensional object is arranged at the placement position, the virtual three-dimensional object in the virtual space can be displayed through a display screen of the intelligent terminal.
In some optional implementation manners of this embodiment, after the virtual three-dimensional object is set at the placement position, displaying the virtual three-dimensional object on the display screen of the intelligent terminal may include the following steps:
firstly, monitoring the display range of the display screen for displaying the virtual space.
The virtual three-dimensional object is placed in the virtual space constructed in the intelligent terminal. In theory, the virtual space is as large as the physical space it mirrors, while the display screen of the intelligent terminal can show only a bounded range. To display the virtual three-dimensional object reasonably, the display range over which the screen shows the virtual space must therefore be determined. Since the physical space and the virtual space correspond, the displayed range of the virtual space corresponds to the displayed range of the physical space; the latter can be quantified from the focal length information of the lens, and the display range of the virtual space follows accordingly. The display range is the extent of physical space shown on the screen of the intelligent terminal or, equivalently, the extent of virtual space shown. It may be expressed in a unit of length (for example, meters), and this indication may or may not appear on the screen. A typical display range might be 5 meters in the horizontal direction and 3 meters in the vertical direction.
And secondly, determining the display size of the virtual three-dimensional object in the display screen according to the display range and the preset size of the virtual three-dimensional object.
The virtual three-dimensional object has a preset size, so once the display range of the virtual space is known, the object's size on the display screen of the intelligent terminal can be determined by comparison with that range. When the view is zoomed in, the display range of the virtual space on the screen shrinks and, correspondingly, the on-screen size of the virtual three-dimensional object grows. For example, with a display range of 5 meters horizontally and 3 meters vertically and a preset object size of 1 meter by 1 meter, the object occupies 1/5 of the display in the horizontal direction and 1/3 in the vertical direction. Conversely, when the view is zoomed out, the display range of the virtual space on the screen grows and the on-screen size of the object shrinks. After zooming out, the display range might be 10 meters horizontally and 6 meters vertically; with the preset size unchanged, the object occupies 1/10 of the display horizontally and 1/6 vertically, visibly smaller than in the zoomed-in case.
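The worked figures above reduce to dividing the object's preset size by the display range in each direction. A minimal sketch using the same 5 m x 3 m and 10 m x 6 m ranges:

```python
def display_fraction(object_size_m: float, range_h_m: float, range_v_m: float):
    """Fraction of the screen the object spans horizontally and vertically,
    given the physical extent currently shown by the display."""
    return object_size_m / range_h_m, object_size_m / range_v_m

zoomed_in = display_fraction(1.0, 5.0, 3.0)    # (0.2, 1/3)
zoomed_out = display_fraction(1.0, 10.0, 6.0)  # (0.1, 1/6)
```

Multiplying these fractions by the screen resolution would give the object's on-screen size in pixels.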
Therefore, the automatic size adjustment of the virtual three-dimensional object on the display screen of the intelligent terminal is realized, and the authenticity and the display precision of the virtual three-dimensional object on the display screen of the intelligent terminal are improved.
With continued reference to fig. 3, fig. 3 is a schematic diagram of an application scenario of the method for displaying information according to the present embodiment. In the scenario of fig. 3, two virtual three-dimensional objects 403 and 404 of identical preset size are placed on the desktop 401 and the ground 402 in the virtual space of the terminal device 102, based on the focal length data of the lens of the terminal device 102. With the terminal device 102 in the position shown in fig. 3, the display sizes of the virtual three-dimensional objects 403 and 404 on the screen vary with the focal length data (i.e., with the distance between each virtual three-dimensional object and the lens): the object farther from the lens is drawn smaller on the screen than the object nearer to it. This improves the realism and accuracy with which the virtual three-dimensional objects 403 and 404 are displayed on the screen of the intelligent terminal.
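The "farther is drawn smaller" behaviour in fig. 3 follows the pinhole-camera relation: on-screen extent scales inversely with distance. A sketch under an assumed focal length expressed in pixels (the patent does not specify camera intrinsics):

```python
def projected_size_px(real_size_m: float, depth_m: float, focal_px: float) -> float:
    """Pinhole projection: on-screen extent of an object of known real size
    at a given depth, for a camera with the given focal length in pixels."""
    return focal_px * real_size_m / depth_m

# Two identically sized 1 m objects, one at 2 m and one at 4 m from the lens:
near = projected_size_px(1.0, 2.0, 1000.0)  # 500.0 px
far = projected_size_px(1.0, 4.0, 1000.0)   # 250.0 px
```

Doubling the depth halves the projected size, which is exactly the size difference between objects 403 and 404 that the scenario describes.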
According to the method provided by the embodiment of the application, the placing position of the virtual object in the virtual space is determined through the focal length data, so that the accuracy of the placing position of the virtual object in the virtual space is improved; after the virtual object is arranged at the placing position, the virtual object is displayed on a display screen of the intelligent terminal, and the display precision of the virtual object on the intelligent terminal is improved.
With further reference to fig. 4, as an implementation of the methods shown in the above-mentioned figures, the present application provides an embodiment of an apparatus for displaying information, which corresponds to the method embodiment shown in fig. 2, and which is particularly applicable in various electronic devices.
As shown in fig. 4, the apparatus 400 for displaying information of the present embodiment may include: a focal length data monitoring unit 401, a placement position determining unit 402, and a display unit 403. The focal length data monitoring unit 401 is configured to monitor focal length data of the lens; the placement position determining unit 402 is configured to determine, according to the focal length data, a placement position of a virtual three-dimensional object in a virtual space, where the virtual space is stored in the intelligent terminal and includes a virtual three-dimensional object with a preset size; the display unit 403 is configured to display the virtual three-dimensional object on the display screen of the intelligent terminal after the virtual three-dimensional object is set at the above-mentioned placement position.
In some optional implementations of the present embodiment, the placement position determining unit 402 may include: a focus position determining subunit (not shown in the figure), a virtual object placement space constructing subunit (not shown in the figure), and a placement position determining subunit (not shown in the figure). The focal position determining subunit is configured to determine a focal position of a focal point corresponding to the focal length data in a virtual space; the virtual object placing space constructing subunit is used for acquiring a plurality of spatial focal length data within a set range of the focal position, and constructing a virtual object placing space through the spatial focal length data, wherein the virtual object placing space is used for placing a virtual three-dimensional object; the placement position determination subunit is configured to set a placement position of the virtual three-dimensional object in the virtual object placement space based on a preset size of the virtual three-dimensional object.
In some optional implementations of this embodiment, the virtual object placement space constructing subunit may include: and calculating distance information and angle information between the space focus position corresponding to each space focal length data and the lens to obtain a space position relation between the space focus positions, and constructing a virtual object placing space based on the space position relation.
In some optional implementations of this embodiment, the display unit 403 may include: a display range monitoring subunit and a display size determining subunit. The display range monitoring subunit is used for monitoring the display range of the display virtual space of the display screen; the display size determining subunit is used for determining the display size of the virtual three-dimensional object in the display screen according to the display range and the preset size of the virtual three-dimensional object.
The present embodiment also provides an electronic device, including: one or more processors; the lens is used for measuring focal length data; a memory for storing one or more programs that, when executed by the one or more processors, cause the one or more processors to perform the above-described method for displaying information.
The present embodiment also provides a computer-readable storage medium on which a computer program is stored which, when being executed by a processor, carries out the above-mentioned method for displaying information.
Referring now to FIG. 5, shown is a block diagram of a computer system 500 suitable for use in implementing the electronic device of an embodiment of the present application. The electronic device shown in fig. 5 is only an example, and should not bring any limitation to the functions and the scope of use of the embodiments of the present application.
As shown in fig. 5, the computer system 500 includes a Central Processing Unit (CPU) 501 that can perform various appropriate actions and processes according to a program stored in a Read Only Memory (ROM) 502 or a program loaded from a storage section 508 into a Random Access Memory (RAM) 503. In the RAM 503, various programs and data necessary for the operation of the system 500 are also stored. The CPU 501, ROM 502, and RAM 503 are connected to each other via a bus 504. An input/output (I/O) interface 505 is also connected to bus 504.
The following components are connected to the I/O interface 505: an input portion 506 including a keyboard, a mouse, and the like; an output portion 507 including a Liquid Crystal Display (LCD), a speaker, and the like; a storage portion 508 including a hard disk and the like; and a communication section 509 including a network interface card such as a LAN card, a modem, or the like. The communication section 509 performs communication processing via a network such as the Internet. A drive 510 is also connected to the I/O interface 505 as necessary. A removable medium 511, such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory, is mounted on the drive 510 as necessary, so that a computer program read therefrom is installed into the storage section 508 as needed.
In particular, according to an embodiment of the present disclosure, the processes described above with reference to the flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising program code for performing the method illustrated in the flowchart. In such an embodiment, the computer program may be downloaded and installed from a network through the communication section 509, and/or installed from the removable medium 511. When executed by the Central Processing Unit (CPU) 501, the computer program performs the above-described functions defined in the method of the present application. It should be noted that the computer readable medium mentioned in the present application may be a computer readable signal medium, a computer readable storage medium, or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present application, a computer readable storage medium may be any tangible medium that contains or stores a program for use by or in connection with an instruction execution system, apparatus, or device.
In this application, by contrast, a computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electromagnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium, other than a computer readable storage medium, that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wireline, optical fiber cable, RF, and the like, or any suitable combination of the foregoing.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present application. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units described in the embodiments of the present application may be implemented by software or hardware. The described units may also be provided in a processor, which may, for example, be described as: a processor comprising a focal length data monitoring unit, a placement position determining unit, and a display unit. The names of these units do not, in some cases, constitute a limitation of the units themselves; for example, the display unit may also be described as a "unit for displaying a virtual three-dimensional object on a smart terminal".
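The unit decomposition described above can be sketched as plain classes. This is an illustrative sketch only: the lens, virtual-space, and screen interfaces (`read_focal_length`, `position_for`, `render`) are hypothetical placeholders, not an API defined by the application.

```python
class FocalLengthDataMonitoringUnit:
    """Polls the lens for its current focal length data."""
    def __init__(self, lens):
        self.lens = lens

    def monitor(self):
        return self.lens.read_focal_length()


class PlacementPositionDeterminingUnit:
    """Maps focal length data to a placement position in the stored virtual space."""
    def __init__(self, virtual_space):
        self.virtual_space = virtual_space

    def determine(self, focal_length_data):
        return self.virtual_space.position_for(focal_length_data)


class DisplayUnit:
    """Renders the virtual three-dimensional object once it is placed."""
    def __init__(self, screen):
        self.screen = screen

    def display(self, obj, position):
        self.screen.render(obj, position)


class Processor:
    """A processor described as comprising the three units above."""
    def __init__(self, lens, virtual_space, screen):
        self.focal_length_data_monitoring_unit = FocalLengthDataMonitoringUnit(lens)
        self.placement_position_determining_unit = PlacementPositionDeterminingUnit(virtual_space)
        self.display_unit = DisplayUnit(screen)

    def run_once(self, obj):
        # Monitor focal length data, determine a placement position, then display.
        data = self.focal_length_data_monitoring_unit.monitor()
        position = self.placement_position_determining_unit.determine(data)
        self.display_unit.display(obj, position)
```

As the description notes, the unit names are descriptive rather than limiting; only the monitor-determine-display flow matters.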
As another aspect, the present application also provides a computer-readable medium, which may be included in the apparatus described in the above embodiments, or may exist separately without being assembled into the apparatus. The computer readable medium carries one or more programs which, when executed by the apparatus, cause the apparatus to: monitor focal length data of the lens; determine a placement position of a virtual three-dimensional object in a virtual space based on the focal length data, wherein the virtual space is stored on an intelligent terminal and contains the virtual three-dimensional object with a preset size; and, after the virtual three-dimensional object is set at the placement position, display the virtual three-dimensional object on a display screen of the intelligent terminal.
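Read literally, these program steps amount to a small geometric pipeline: map each focal length sample (a distance plus lens angles) to a point in virtual space, bound the sampled points into a placement space, and fit the preset-size object inside it. The sketch below is an assumption of one way this could work; the names `FocusSample`, `build_placement_space`, and `place_object`, and the axis-aligned bounding-box geometry, are illustrative inventions, not the patented implementation.

```python
import math
from dataclasses import dataclass

@dataclass
class FocusSample:
    distance: float  # distance from the lens to the spatial focus position
    yaw: float       # horizontal angle of the sample, in radians
    pitch: float     # vertical angle of the sample, in radians

def to_virtual_position(sample):
    """Convert a (distance, angle) focus sample into virtual-space coordinates."""
    x = sample.distance * math.cos(sample.pitch) * math.sin(sample.yaw)
    y = sample.distance * math.sin(sample.pitch)
    z = sample.distance * math.cos(sample.pitch) * math.cos(sample.yaw)
    return (x, y, z)

def build_placement_space(samples):
    """Build an axis-aligned placement space spanning all sampled focus positions."""
    points = [to_virtual_position(s) for s in samples]
    mins = tuple(min(p[i] for p in points) for i in range(3))
    maxs = tuple(max(p[i] for p in points) for i in range(3))
    return mins, maxs

def place_object(samples, preset_size):
    """Return a placement position if the preset-size object fits, else None."""
    mins, maxs = build_placement_space(samples)
    extents = tuple(maxs[i] - mins[i] for i in range(3))
    if any(extents[i] < preset_size[i] for i in range(3)):
        return None  # the placement space cannot hold the preset-size object
    # Centre the object inside the placement space.
    return tuple(mins[i] + (extents[i] - preset_size[i]) / 2 for i in range(3))
```

Under this sketch, sampling several spatial focal lengths around the focal position (claim 1, step two) is what gives the placement space its extent, and the preset size check mirrors the "based on the preset size" language of the final step.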
The above description presents only preferred embodiments of the application and illustrates the principles of the technology employed. Those skilled in the art will appreciate that the scope of the invention disclosed herein is not limited to the particular combination of features described above, but also encompasses other arrangements formed by any combination of the above features or their equivalents without departing from the spirit of the invention. For example, the above features may be replaced with (but not limited to) features with similar functions disclosed in the present application.

Claims (8)

1. A method for displaying information, the method comprising:
monitoring focal length data of a lens;
determining a placement position of a virtual three-dimensional object in a virtual space based on the focal length data, wherein the virtual space is stored on an intelligent terminal and contains the virtual three-dimensional object with a preset size;
after the virtual three-dimensional object is arranged at the placement position, displaying the virtual three-dimensional object on a display screen of the intelligent terminal;
wherein the determining the placement position of the virtual three-dimensional object in the virtual space based on the focal length data comprises:
determining, in the virtual space, a focal position of a focal point corresponding to the focal length data;
acquiring a plurality of pieces of spatial focal length data within a set range of the focal position, and constructing a virtual object placement space from the spatial focal length data, wherein the virtual object placement space is used for placing the virtual three-dimensional object;
and setting the placement position of the virtual three-dimensional object in the virtual object placement space based on the preset size of the virtual three-dimensional object.
2. The method of claim 1, wherein the constructing the virtual object placement space from the spatial focal length data comprises:
calculating distance information and angle information between the spatial focus position corresponding to each piece of spatial focal length data and the lens to obtain a spatial position relationship among the spatial focus positions, and constructing the virtual object placement space based on the spatial position relationship.
3. The method according to claim 1, wherein displaying the virtual three-dimensional object on the display screen of the intelligent terminal after the virtual three-dimensional object is set at the placement position comprises:
monitoring a display range in which the display screen displays the virtual space;
and determining a display size of the virtual three-dimensional object on the display screen according to the display range and the preset size of the virtual three-dimensional object.
4. An apparatus for displaying information, the apparatus comprising:
the focal length data monitoring unit is used for monitoring the focal length data of the lens;
the placement position determining unit is used for determining a placement position of a virtual three-dimensional object in a virtual space based on the focal length data, wherein the virtual space is stored on the intelligent terminal and contains the virtual three-dimensional object with a preset size;
the display unit is used for displaying the virtual three-dimensional object on a display screen of the intelligent terminal after the virtual three-dimensional object is arranged at the placement position;
wherein the placement position determination unit includes:
the focal position determining subunit is used for determining, in the virtual space, a focal position of a focal point corresponding to the focal length data;
the virtual object placement space constructing subunit is used for acquiring a plurality of pieces of spatial focal length data within a set range of the focal position, and constructing a virtual object placement space from the spatial focal length data, wherein the virtual object placement space is used for placing the virtual three-dimensional object;
and the placement position determining subunit is used for setting the placement position of the virtual three-dimensional object in the virtual object placement space based on the preset size of the virtual three-dimensional object.
5. The apparatus of claim 4, wherein the virtual object placement space constructing subunit is configured to:
calculate distance information and angle information between the spatial focus position corresponding to each piece of spatial focal length data and the lens to obtain a spatial position relationship among the spatial focus positions, and construct the virtual object placement space based on the spatial position relationship.
6. The apparatus of claim 4, wherein the display unit comprises:
the display range monitoring subunit is used for monitoring a display range in which the display screen displays the virtual space;
and the display size determining subunit is used for determining a display size of the virtual three-dimensional object on the display screen according to the display range and the preset size of the virtual three-dimensional object.
7. An electronic device, comprising:
one or more processors;
a lens for measuring focal length data;
a memory for storing one or more programs,
the one or more programs, when executed by the one or more processors, cause the one or more processors to perform the method of any of claims 1-3.
8. A computer-readable storage medium, on which a computer program is stored which, when executed by a processor, implements the method according to any one of claims 1 to 3.
CN201710867785.6A 2017-09-22 2017-09-22 Method and device for displaying information Active CN107657663B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710867785.6A CN107657663B (en) 2017-09-22 2017-09-22 Method and device for displaying information

Publications (2)

Publication Number Publication Date
CN107657663A (en) 2018-02-02
CN107657663B (en) 2021-03-12

Family

ID=61131250

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710867785.6A Active CN107657663B (en) 2017-09-22 2017-09-22 Method and device for displaying information

Country Status (1)

Country Link
CN (1) CN107657663B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110662015A (en) * 2018-06-29 2020-01-07 北京京东尚科信息技术有限公司 Method and apparatus for displaying image
CN114419298A (en) * 2022-01-21 2022-04-29 北京字跳网络技术有限公司 Virtual object generation method, device, equipment and storage medium

Citations (1)

Publication number Priority date Publication date Assignee Title
CN105678802A (en) * 2014-04-21 2016-06-15 杨祖立 Method for generating three-dimensional information by identifying two-dimensional image

Family Cites Families (1)

Publication number Priority date Publication date Assignee Title
US10533850B2 (en) * 2013-07-12 2020-01-14 Magic Leap, Inc. Method and system for inserting recognized object data into a virtual world

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant