CN110825333B - Display method, display device, terminal equipment and storage medium - Google Patents


Info

Publication number
CN110825333B
CN110825333B (Application CN201810924523.3A)
Authority
CN
China
Prior art keywords
image
equipment
control instruction
image acquisition
scene
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201810924523.3A
Other languages
Chinese (zh)
Other versions
CN110825333A (en)
Inventor
乔绎夫
戴景文
贺杰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Suiguang Technology (Beijing) Co.,Ltd.
Original Assignee
Guangdong Virtual Reality Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Virtual Reality Technology Co Ltd filed Critical Guangdong Virtual Reality Technology Co Ltd
Priority to CN201810924523.3A (patent CN110825333B)
Priority to PCT/CN2019/097128 (published as WO2020020102A1)
Priority to US16/666,429 (granted as US11049324B2)
Publication of CN110825333A
Application granted
Publication of CN110825333B
Legal status: Active

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14Digital output to display device ; Cooperation and interconnection of the display device with other functional units
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/222Studio circuitry; Studio devices; Studio equipment
    • H04N5/262Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects
    • H04N5/265Mixing
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/18Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Controls And Circuits For Display Device (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The embodiment of the application discloses a display method, a display device, terminal equipment and a storage medium, and relates to the technical field of display. The display method is applied to the terminal equipment and comprises the following steps: acquiring attitude information of the terminal equipment; generating an acquisition control instruction based on the attitude information, and sending the acquisition control instruction to image acquisition equipment on the carrier, wherein the acquisition control instruction is used for instructing the image acquisition equipment to acquire a scene image of a surrounding scene of the carrier by adopting an orientation angle matched with the attitude information; and receiving a scene image acquired by the image acquisition equipment, and displaying display content according to the scene image. The method can realize the control of the image acquisition equipment of the carrier by utilizing the attitude of the terminal equipment and display the image acquired by the image acquisition equipment.

Description

Display method, display device, terminal equipment and storage medium
Technical Field
The present application relates to the field of display technologies, and in particular, to a display method, an apparatus, a terminal device, and a storage medium.
Background
In existing shooting-and-display technology for carriers, a camera is usually fixed at a certain position outside the carrier; after the camera captures images of the scene outside the carrier, the image capture device transmits the captured images to a remote device for display. However, the image capture device in such a scheme can only capture images from one direction and cannot meet the user's need to view the scene from other directions.
Disclosure of Invention
The embodiment of the application provides a display method, a display device, a terminal device and a storage medium, which can realize that an image acquisition device on a carrier acquires and displays an image of a scene with a direction and a visual angle required by a user.
In a first aspect, an embodiment of the present application provides a display method, which is applied to a terminal device, and the method includes: acquiring attitude information of the terminal equipment; generating an acquisition control instruction based on the attitude information, and sending the acquisition control instruction to image acquisition equipment on the carrier, wherein the acquisition control instruction is used for instructing the image acquisition equipment to acquire a scene image of a surrounding scene of the carrier by adopting an orientation angle matched with the attitude information; and receiving a scene image acquired by the image acquisition equipment, and displaying display content according to the scene image.
In a second aspect, an embodiment of the present application provides a display apparatus, which is applied to a terminal device, and the apparatus includes: the terminal equipment comprises a posture acquisition module, an acquisition control module and a content display module, wherein the posture acquisition module is used for acquiring posture information of the terminal equipment; the acquisition control module is used for generating an acquisition control instruction based on the attitude information and sending the acquisition control instruction to image acquisition equipment on the carrier, wherein the acquisition control instruction is used for instructing the image acquisition equipment to acquire a scene image of a surrounding scene of the carrier by adopting an orientation angle matched with the attitude information; the content display module is used for receiving the scene image acquired by the image acquisition equipment and displaying the display content according to the scene image.
In a third aspect, an embodiment of the present application provides a terminal device, including: one or more processors; a memory; one or more application programs, wherein the one or more application programs are stored in the memory and configured to be executed by the one or more processors, the one or more programs configured to perform the display method provided by the first aspect above.
In a fourth aspect, an embodiment of the present application provides a storage medium, where a program code is stored in the computer-readable storage medium, and the program code may be called by a processor to execute the display method provided in the first aspect.
According to the scheme provided by the present application, the attitude information of the terminal device is acquired, an acquisition control instruction is generated based on the attitude information, and the acquisition control instruction is sent to the image acquisition device on the carrier, wherein the acquisition control instruction is used for instructing the image acquisition device to acquire a scene image of the surrounding scene of the carrier at an orientation angle matched with the attitude information; finally, the scene image acquired by the image acquisition device is received, and display content is displayed according to the scene image. Thus, the image acquisition device on the carrier can acquire images of the scene around the carrier at an orientation angle matched with the attitude information of the terminal device and display them on the terminal device, making it convenient for the user to observe the scene from the viewing direction the user needs.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed in the description of the embodiments are briefly introduced below. Obviously, the drawings in the following description are only some embodiments of the present application, and those skilled in the art can obtain other drawings based on these drawings without creative effort.
Fig. 1 shows a schematic diagram of an application scenario suitable for use in an embodiment of the present application.
FIG. 2 shows a flow diagram of a display method according to one embodiment of the present application.
FIG. 3 shows a flow diagram of a display method according to another embodiment of the present application.
FIG. 4 shows a flow chart of a display method according to yet another embodiment of the present application.
FIG. 5 shows a block diagram of a display device according to an embodiment of the present application.
FIG. 6 shows a block diagram of an acquisition control module in a display device according to one embodiment of the present application.
Fig. 7 illustrates a block diagram of a content display module in a display device according to an embodiment of the present application.
FIG. 8 shows a block diagram of a display device according to an embodiment of the present application.
Fig. 9 is a block diagram of a terminal device for executing a display method according to an embodiment of the present application.
Fig. 10 is a storage unit for storing or carrying program codes for implementing a display method according to an embodiment of the present application.
Detailed Description
In order to make the technical solutions better understood by those skilled in the art, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application.
An application scenario of the display method provided in the embodiment of the present application is described below.
Referring to fig. 1, a schematic diagram of an application scenario of the display method provided in the embodiment of the present application is shown, where the application scenario includes a display system 10. The display system 10 includes: terminal device 100, image capture device 200, and vehicle 300.
In the embodiment of the present application, the image capturing apparatus 200 is disposed on the carrier 300. Optionally, the image capturing apparatus 200 may be rotatably disposed on the top of the carrier 300 by a rotating mechanism. The rotating mechanism can rotate under the control of the image capturing apparatus 200, thereby driving the image capturing apparatus 200 to rotate relative to the carrier 300 to change the orientation, rotation angle, and the like of the image capturing apparatus 200. Of course, the specific location of the image capturing apparatus 200 on the carrier 300 is not limited in the embodiment of the present application; it may also be set at other locations outside the carrier 300 according to user requirements, for example at the tail of the carrier 300.
In the embodiment of the present application, the terminal device 100 and the image capturing device 200 may establish a communication connection. Specifically, the terminal device 100 may establish a communication connection with the image capturing device 200 through Wi-Fi (Wireless Fidelity), GPRS (General Packet Radio Service), 4G (fourth-generation mobile communication technology), NB-IoT (Narrowband Internet of Things), or other communication methods. Of course, the communication mode between the terminal device 100 and the image capturing device 200 is not limited in the embodiment of the present application.
When the terminal device 100 is connected to the image capturing device 200 through Wi-Fi, data transmission between the terminal device 100 and the image capturing device 200 may be implemented over the 2.4 GHz band or another channel.
In the embodiment of the present application, the terminal device 100 may be a head-mounted display device, or may be a mobile device such as a mobile phone and a tablet. When the terminal device 100 is a head-mounted display device, the head-mounted display device may be an integrated head-mounted display device. The terminal device 100 may also be an intelligent terminal such as a mobile phone connected to an external head-mounted display device, that is, the terminal device 100 may be used as a processing and storage device of the head-mounted display device, inserted into or connected to the external head-mounted display device, and display the display content in the head-mounted display device.
The image capturing device 200 is used for capturing a scene image around the carrier and sending the scene image to the terminal device 100. The image capturing device 200 may be a color camera, an infrared camera, or the like; the specific type of the image capturing device 200 is not limited in the embodiment of the present application.
Referring to fig. 2, an embodiment of the present application provides a display method, which is applicable to a terminal device, and the method may include:
step S110: and acquiring attitude information of the terminal equipment.
In this embodiment, the terminal device may be provided with an attitude sensor for detecting attitude information of the terminal device. The attitude sensor is a three-dimensional motion attitude measurement system based on Micro-Electro-Mechanical System (MEMS) technology. The attitude sensor may include motion sensors such as a three-axis gyroscope, a three-axis accelerometer, and a three-axis electronic compass, and the attitude information of the terminal device is obtained through an embedded low-power ARM (Advanced RISC Machines) processor. The attitude information of the terminal device includes information such as the orientation and the rotation angle of the terminal device.
Therefore, the attitude information of the terminal device can be acquired by the attitude sensor. Of course, the specific manner of obtaining the attitude information of the terminal device is not limited in the embodiment of the present application; for example, an Inertial Measurement Unit (IMU) may be provided in the terminal device to obtain the attitude information. The IMU is a device for measuring the three-axis attitude angle (or angular velocity) and acceleration of the terminal device. An IMU generally includes three single-axis accelerometers and three single-axis gyroscopes: the accelerometers detect acceleration signals along three independent axes of the carrier coordinate system, and the gyroscopes detect angular velocity signals of the carrier relative to the navigation coordinate system. From the measured angular velocity and acceleration of the device in three-dimensional space, the attitude of the terminal device is calculated.
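The sensor fusion described above can be illustrated with a minimal sketch. This is not the patent's implementation: the complementary-filter blend weight `alpha`, the function names, and the use of degrees are assumptions chosen for illustration only.

```python
import math

def accel_pitch(ax, ay, az):
    """Estimate pitch (degrees) from the gravity direction measured by the accelerometer."""
    return math.degrees(math.atan2(-ax, math.sqrt(ay * ay + az * az)))

def complementary_filter(pitch_prev, gyro_rate, ax, ay, az, dt, alpha=0.98):
    """Fuse gyroscope and accelerometer readings into a pitch estimate (degrees).

    The gyroscope term tracks fast motion (integrated angular rate); the
    accelerometer term corrects long-term drift toward the gravity reference.
    """
    gyro_pitch = pitch_prev + gyro_rate * dt  # short-term: integrate angular rate
    return alpha * gyro_pitch + (1 - alpha) * accel_pitch(ax, ay, az)
```

In practice the same blend would be applied per axis and fed with real sensor timestamps; the single-axis form above only shows the shape of the computation.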
Step S120: and generating an acquisition control instruction based on the attitude information, and sending the acquisition control instruction to image acquisition equipment on the carrier, wherein the acquisition control instruction is used for instructing the image acquisition equipment to acquire a scene image of a surrounding scene of the carrier by adopting an orientation angle matched with the attitude information.
After the attitude information of the terminal device is obtained, the image acquisition device can be controlled according to the attitude information of the terminal device, so that the image acquisition device acquires an image in a required view angle direction.
In this embodiment of the application, generating an acquisition control instruction based on the attitude information, and sending the acquisition control instruction to the image acquisition device on the vehicle may include: generating an acquisition control instruction based on the rotation angle and the orientation of the terminal equipment; and sending an acquisition control instruction to image acquisition equipment on the carrier, wherein the acquisition control instruction is used for instructing the image acquisition equipment to acquire a scene image of a surrounding scene of the carrier by adopting the orientation and the rotation angle.
As one mode, the terminal device may obtain an angle and a direction that the image capturing device needs to rotate according to the acquired orientation and the acquired rotation angle. Specifically, a preset posture information, that is, a preset orientation, a preset rotation angle, and the like of the terminal device may be preset, and the image capturing device has an initial lens orientation state, where the preset posture information of the terminal device may correspond to the initial lens orientation state of the image capturing device. It can be understood that, when the posture information acquired by the terminal device is the preset posture information, the image acquisition device does not need to rotate to adjust the lens direction, so that the generated acquisition control instruction indicates that the image acquisition device does not rotate; when the posture information acquired by the terminal device is different from the preset posture information, the image acquisition device needs to determine the direction and the angle of the image acquisition device, which need to be rotated, according to the difference of the posture information of the terminal device relative to the preset posture information, so that the generated acquisition control instruction instructs the image acquisition device to rotate according to the direction and the angle of the image acquisition device, and the orientation of the lens of the image acquisition device corresponds to the posture information of the terminal device.
Further, the preset posture information may be that the rotation angle of the terminal device about each axis is 0, and the initial lens orientation state may be that the lens direction is parallel to the forward or backward direction of the carrier. When the acquired posture information of the terminal device differs from the preset posture information, an X, Y, Z three-dimensional coordinate system may be established with the terminal device as the origin, where the X axis, the Y axis, and the Z axis are mutually perpendicular and intersect at the origin, the X axis and the Y axis are mutually perpendicular coordinate axes in the horizontal plane, and the Z axis is perpendicular to the horizontal plane. According to the rotation angle and orientation of the terminal device about each axis, a first direction and a first angle of rotation of the terminal device in the plane formed by the X axis and the Y axis, and a second direction and a second angle of rotation in the plane formed by the X axis and the Z axis, are determined. Then, according to the first direction and the first angle, the lens of the image acquisition device is determined to rotate in the horizontal plane by the first angle in the first direction; according to the second direction and the second angle, the lens is determined to rotate by the second angle in the second direction in the vertical plane along which the carrier moves forward or backward.
And finally, generating an acquisition control instruction according to the first direction, the first angle, the second direction and the second angle, and sending the acquisition control instruction to image acquisition equipment. After receiving the acquisition control instruction, the image acquisition device controls the lens to rotate by a first angle in a first direction in a horizontal plane according to the acquisition control instruction, and controls the lens to rotate by a second angle in a second direction in a vertical plane in which the carrier moves forward or backward, and after the image acquisition device rotates, the image acquisition device can acquire a scene image of a surrounding scene of the carrier by using the current lens facing the angle.
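The mapping from terminal pose to a rotation command can be sketched as follows. The all-zero preset pose follows the text above, but the function name, the `'left'/'right'/'up'/'down'` direction labels, and the degree units are hypothetical choices, not the patent's actual encoding of the acquisition control instruction.

```python
def make_capture_command(yaw, pitch, preset_yaw=0.0, preset_pitch=0.0):
    """Build a rotation command from the terminal pose relative to a preset pose.

    yaw:   rotation in the horizontal (X-Y) plane, degrees
    pitch: rotation in the vertical (X-Z) plane, degrees
    Returns (first_direction, first_angle, second_direction, second_angle).
    """
    d_yaw = yaw - preset_yaw
    d_pitch = pitch - preset_pitch
    first_direction = 'right' if d_yaw >= 0 else 'left'
    second_direction = 'up' if d_pitch >= 0 else 'down'
    return (first_direction, abs(d_yaw), second_direction, abs(d_pitch))
```

The camera-side handler would then rotate the lens by `first_angle` in `first_direction` within the horizontal plane and by `second_angle` in `second_direction` within the vertical plane, as the text describes.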
In the embodiment of the application, after a first direction and a first angle of rotation of the terminal device in a plane formed by an X axis and a Y axis and a second direction and a second angle of rotation of the terminal device in the plane formed by the X axis and the Z axis are obtained, whether the first angle of rotation of the terminal device in the plane formed by the X axis and the Y axis is greater than a preset angle and whether the second angle of rotation of the terminal device in the plane formed by the X axis and the Z axis is greater than the preset angle can also be judged; when the first angle is larger than the preset angle, the lens of the image acquisition equipment is controlled to rotate by the first angle in the first direction in the horizontal plane, and when the second angle is larger than the preset angle, the lens of the image acquisition equipment is controlled to rotate by the second angle in the second direction in the vertical plane where the carrier moves forward or backward. It can be understood that the lens direction of the image capturing device may be controlled to be adjusted on the basis of the initial lens orientation state only when the posture information of the terminal device is greatly different from the preset posture information, so as to avoid the orientation angle of the image capturing device from being changed due to the user's misoperation such as shaking of the terminal device.
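The dead-zone check above can be sketched as a simple filter. The 3-degree threshold value, the function names, and returning zeroed components (rather than omitting the rotation) are assumptions; the text only states that rotations below a preset angle are ignored.

```python
PRESET_ANGLE = 3.0  # degrees; an assumed jitter threshold, not specified in the text

def exceeds_dead_zone(angle, preset_angle=PRESET_ANGLE):
    """Return True only when the rotation angle is large enough to act on."""
    return abs(angle) > preset_angle

def filtered_rotation(first_angle, second_angle, preset_angle=PRESET_ANGLE):
    """Zero out rotation components that fall within the dead zone."""
    return (first_angle if exceeds_dead_zone(first_angle, preset_angle) else 0.0,
            second_angle if exceeds_dead_zone(second_angle, preset_angle) else 0.0)
```

With this filter, small hand tremors of the terminal device produce no camera rotation, matching the anti-misoperation behavior described above.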
Of course, the preset posture information and the initial lens orientation state of the image capturing device are not limited in this embodiment of the application; the preset posture information may be any other posture information, and the initial lens orientation state of the image capturing device may be any other state.
It can be understood that after the acquisition control instruction is generated according to the attitude information of the terminal device and sent to the image acquisition device, the image acquisition device adjusts the orientation of its lens according to the acquisition control instruction so that the lens orientation matches the attitude information of the terminal device. In this way, the viewing direction in which the image acquisition device captures images of the scene around the carrier matches the attitude information of the terminal device, achieving the purpose of controlling the lens viewing direction of the image acquisition device through the attitude information of the terminal device.
Step S130: and receiving a scene image acquired by the image acquisition equipment, and displaying the display content according to the scene image.
After capturing a scene image around the carrier at an orientation angle matched with the attitude information of the terminal device, the image capture device may transmit the captured scene image to the terminal device.
Correspondingly, the terminal equipment receives the scene image collected by the image collecting equipment. After receiving the scene image captured by the image capturing device, the display content may be displayed according to the scene image.
In this embodiment of the application, in some application scenarios, the terminal device may directly display the scene image as the display content. In some application scenarios, the terminal device may also generate a virtual object (virtual image) from the scene image, and display the generated virtual object as display content.
Of course, in this embodiment of the application, when the posture information of the terminal device changes, the orientation angle of the image capturing device may be adjusted based on the above method, so that the orientation angle of the image capturing device corresponds to the posture information of the terminal device after the change, and a requirement of changing the viewing angle direction of the image capturing device by changing the posture information of the terminal device is met.
Further, the terminal device may compare the currently acquired posture information with the posture information acquired last time, and when the change between the two is greater than a preset change, control the image acquisition device to change its orientation angle according to the change.
Specifically, after a first direction and a first angle of rotation of the current terminal device in a plane formed by an X axis and a Y axis and a second direction and a second angle of rotation of the current terminal device in a plane formed by an X axis and a Z axis are obtained, it may be determined whether a first difference between the current first angle and a first angle obtained last time is greater than a preset difference, and whether a second difference between the current second angle and a second angle obtained last time is greater than the preset difference; when the first difference is larger than the preset difference, the lens of the image acquisition equipment is controlled to rotate by the angle of the first difference in the horizontal plane, and when the second difference is larger than the preset difference, the lens of the image acquisition equipment is controlled to rotate by the angle of the second difference in the vertical plane where the carrier moves forward or backward. It can be understood that the lens direction of the image acquisition device can be controlled to be adjusted only when the posture information of the current terminal device is greatly different from the posture information acquired at the previous time, so that the situation that the orientation angle of the image acquisition device is changed due to misoperation of a user such as shaking of the terminal device is avoided.
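The consecutive-pose comparison above can be sketched as a small stateful tracker. The class name, the 3-degree preset difference, and the `('horizontal', delta)` / `('vertical', delta)` command tuples are all hypothetical; the text only requires that a rotation is issued when the difference from the previously acquired angle exceeds a preset difference.

```python
class PoseTracker:
    """Track consecutive terminal poses; emit a rotation only when the change
    since the last acted-on pose exceeds a preset difference (degrees)."""

    def __init__(self, preset_diff=3.0):
        self.preset_diff = preset_diff
        self.last_first_angle = 0.0   # horizontal-plane angle last acted on
        self.last_second_angle = 0.0  # vertical-plane angle last acted on

    def update(self, first_angle, second_angle):
        commands = []
        d1 = first_angle - self.last_first_angle
        if abs(d1) > self.preset_diff:
            commands.append(('horizontal', d1))
            self.last_first_angle = first_angle
        d2 = second_angle - self.last_second_angle
        if abs(d2) > self.preset_diff:
            commands.append(('vertical', d2))
            self.last_second_angle = second_angle
        return commands
```

Note that the reference pose only advances when a rotation is actually issued, so a slow drift eventually triggers an adjustment instead of being filtered out forever.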
In the embodiment of the application, the terminal device may be a head-mounted display device, or the terminal device may be a smart terminal plugged into an external head-mounted display device. When the head-mounted display device is worn on the user's head, the posture information detected by the head-mounted display device (the terminal device, or the terminal device inserted into the head-mounted display device) changes with the movement of the user's head. When such a change is detected, the orientation of the image acquisition device is adjusted accordingly: a third control instruction is sent to the image acquisition device, instructing it to adjust its orientation and rotation angle according to the change; for a specific adjustment method, reference may be made to the method described above. Thus, the user can change the posture of the terminal device by moving the head, so that the orientation angle of the image acquisition device is adjusted as the posture changes. After receiving the scene image acquired by the image acquisition device, the terminal device can display the display content on the lens of the head-mounted display device according to the scene image, so that the user can observe the display content.
According to the display method provided by the embodiment of the application, the attitude information of the terminal device is obtained, an acquisition control instruction is generated based on the attitude information and sent to the image acquisition device on the carrier, and the image acquisition device acquires a scene image of the surrounding scene of the carrier at an orientation angle matched with the attitude information according to the acquisition control instruction. Finally, the terminal device receives the scene image acquired by the image acquisition device and displays the display content according to the scene image. In this way, the image acquisition device transmits the scene image of the carrier's surroundings, captured at the orientation angle matched with the attitude information of the terminal device, to the terminal device for display, meeting the user's need to view scene images in the desired viewing direction.
Referring to fig. 3, another embodiment of the present application provides a display method, which can be applied to a terminal device, and the method can include:
step S210: and acquiring attitude information of the terminal equipment.
Step S220: and generating an acquisition control instruction based on the rotation angle and the orientation of the terminal equipment, and sending the acquisition control instruction to image acquisition equipment on the carrier, wherein the acquisition control instruction is used for instructing the image acquisition equipment to acquire a scene image of a surrounding scene of the carrier by adopting an orientation angle matched with the attitude information.
In the embodiment of the application, one or more markers may be disposed in the scene in which the vehicle operates. When a marker is present in the scene around the vehicle corresponding to the orientation angle of the image capturing device, the marker is within the field of view of the image capturing device, and the image capturing device can capture a marker image containing the marker.
The marker may include at least one sub-marker, and a sub-marker may be a pattern having a certain shape. In one embodiment, each sub-marker may have one or more feature points, where the shape of a feature point is not limited and may be a dot, a ring, a triangle, or another shape. In addition, the distribution rules of the sub-markers differ between markers, so each marker can have different identity information. The terminal device can acquire the identity information corresponding to a marker by identifying the sub-markers included in the marker, where the identity information may be information that can uniquely identify the marker, such as a code, but is not limited thereto.
In one embodiment, the outline of the marker may be rectangular, but the shape of the marker may be other shapes, and the rectangular region and the plurality of sub-markers in the region constitute one marker. Of course, the marker may also be an object which is composed of a light spot and can emit light, the light spot marker may emit light of different wavelength bands or different colors, and the terminal device acquires the identity information corresponding to the marker by identifying information such as the wavelength band or the color of the light emitted by the light spot marker. Of course, the specific marker is not limited in the embodiment of the present application, and the marker only needs to be recognized by the terminal device.
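One way the sub-marker distribution could encode identity information is as a bit pattern read off a grid, decoded to an integer code. This is purely a hypothetical encoding scheme for illustration; the patent does not specify how the sub-marker layout maps to an identity code.

```python
def decode_marker_id(sub_marker_bits):
    """Decode a marker's identity from the presence pattern of its sub-markers.

    sub_marker_bits: list of 0/1 flags, one per grid cell, read row by row
    (1 = a sub-marker occupies the cell). Returns the pattern as an integer code.
    """
    identity = 0
    for bit in sub_marker_bits:
        identity = (identity << 1) | bit
    return identity
```

Under this scheme a 2x2 grid yields 16 distinct codes; a real system would also reserve patterns for orientation disambiguation and error detection.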
Step S230: receiving the scene image acquired by the image acquisition device, and identifying the marker image to obtain an identification result of the marker.
The image acquisition device transmits the marker image to the terminal device after acquiring the marker image of the marker in the surrounding scene of the vehicle. Correspondingly, the terminal equipment receives the marker image.
In this embodiment of the application, when the image acquired by the image acquisition device is a marker image of a marker in the surrounding scene of the vehicle, the display content of the terminal device may be a virtual object (virtual image) generated from the marker image, which the terminal device then displays.
After the terminal device receives the marker image of the marker in the surrounding scene of the vehicle acquired by the image acquisition device, the marker in the marker image can be identified to obtain an identification result of the marker.
In the embodiment of the present application, the identification result of the marker may include a spatial position of the marker relative to the image capturing device, identity information of the marker, and the like. The spatial position may include a position of the marker relative to the image capturing device, posture information, and the like, where the posture information is an orientation and a rotation angle of the marker relative to the image capturing device.
Step S240: and acquiring virtual object data corresponding to the identification result.
In this embodiment, the terminal device may store data corresponding to the identity information of markers. Therefore, after the terminal device identifies the marker and obtains the identification result, the data corresponding to the identity information of the marker in the identification result can be retrieved.
Further, the data corresponding to the identity information of the marker may include virtual object data. The virtual object data may be model data of a virtual object, where the model data of the virtual object is used for rendering the virtual object, and may include colors used for establishing a model corresponding to the virtual object, coordinates of each vertex in the 3D model, and the like.
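A minimal sketch of retrieving virtual object data by the marker's identity information, assuming a simple in-memory store; the storage layout, identifiers, and field names are illustrative assumptions, not disclosed in the patent.

```python
# Hypothetical mapping from marker identity information to model data
# for the corresponding virtual object (colors, 3D vertex coordinates).
VIRTUAL_OBJECT_STORE = {
    "marker_road_01": {"model": "virtual_road", "color": (80, 80, 80),
                       "vertices": [(0, 0, 0), (1, 0, 0), (1, 0, 5), (0, 0, 5)]},
    "marker_sign_02": {"model": "virtual_signpost", "color": (200, 30, 30),
                       "vertices": [(0, 0, 0), (0, 2, 0)]},
}

def get_virtual_object_data(identity):
    """Return the virtual object model data corresponding to the marker's
    identity information, or None when no data is stored for it."""
    return VIRTUAL_OBJECT_STORE.get(identity)

data = get_virtual_object_data("marker_road_01")
```

After the lookup, the model data would be handed to the rendering step that constructs and displays the virtual object.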
Step S250: and constructing a virtual object according to the virtual object data, and displaying the virtual object as display content.
After the virtual object data corresponding to the identity information of the marker is obtained, the virtual object can be constructed according to the virtual object data so as to display the virtual object.
In this embodiment of the present application, constructing a virtual object according to the virtual object data, and displaying the virtual object as display content may include:
acquiring a display position of the virtual object based on the position and the posture of the marker relative to the image acquisition equipment; and displaying the virtual object according to the display position.
It is understood that the recognition result of the marker includes the position and posture of the marker with respect to the image capturing device. And taking the position and the posture of the marker relative to the image acquisition equipment as the position and the posture of the marker relative to the terminal equipment. The virtual object can be located at the position of the marker, so that the spatial position of the virtual object relative to the terminal device is determined according to the position and the posture of the marker relative to the terminal device, and then coordinate conversion is performed according to the spatial position of the virtual object relative to the terminal device to obtain the display position of the virtual object in the display space of the terminal device.
After the display position of the virtual object is obtained, the constructed virtual object can be displayed at the display position in the display space of the terminal device, so that the effect that the virtual object is overlapped with the marker for displaying is achieved.
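The coordinate conversion described above can be sketched in a simplified two-dimensional form: the marker's position relative to the image capture device is taken as its position relative to the terminal device, and the virtual object anchored at the marker is transformed into the terminal's display space. The planar rotation and offset here are illustrative simplifications; a real system would use a full 3D pose (e.g., a quaternion or a 4x4 transform matrix).

```python
import math

def marker_to_display_position(marker_pos, terminal_yaw_deg, screen_offset):
    """marker_pos: (x, z) of the marker in camera/terminal coordinates.
    terminal_yaw_deg: rotation of the display space about the vertical axis.
    screen_offset: (x, z) translation from terminal origin to display origin.
    Returns the (x, z) display position of the virtual object."""
    yaw = math.radians(terminal_yaw_deg)
    x, z = marker_pos
    # Rotate the marker-anchored position into display space, then translate.
    dx = x * math.cos(yaw) - z * math.sin(yaw) + screen_offset[0]
    dz = x * math.sin(yaw) + z * math.cos(yaw) + screen_offset[1]
    return (dx, dz)

# A marker 2 m straight ahead, with no rotation or offset, stays straight ahead.
pos = marker_to_display_position((0.0, 2.0), 0.0, (0.0, 0.0))
```

Rendering the virtual object at the converted position yields the effect of the virtual object overlapping the marker in the display space.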
In some application scenarios that simulate driving, the virtual object can be a virtual road. The data of the virtual road is stored in the terminal device, and the virtual road can be displayed according to the identification result of the captured marker image, achieving a realistic driving-simulation effect.
In one embodiment, after the terminal device obtains the display position of the virtual object, the virtual object and the scene image collected by the image collecting device can be overlaid for display according to the display position, so that an Augmented Reality (AR) display effect is realized. For example, in some application scenarios, the virtual object may be a virtual signpost. And acquiring a virtual road sign corresponding to the identification result through the identification result of the marker image acquired by the image acquisition equipment, and displaying the virtual road sign and the scene image of the real scene acquired by the image acquisition equipment in an overlapping manner.
In some driving simulation application scenarios, the virtual object may also be a virtual obstacle, such as a virtual automobile, a virtual character, and the like. The virtual obstacle is displayed at the position of the marker according to the identification result of the marker image acquired by the image acquisition equipment, so that a user can control the vehicle to avoid and the like according to the seen virtual obstacle, and the sense of reality of driving simulation is improved.
Of course, the above application scenarios are only examples, and the specific application scenarios and the specific virtual objects are not limited in the embodiments of the present application.
According to the display method provided by this embodiment of the application, the attitude information of the terminal device is acquired to generate a control instruction, which is sent to the image acquisition device on the vehicle so that the device captures marker images of markers around the vehicle at an orientation angle matched with the attitude information. The terminal device then receives the captured marker image, identifies it, and displays a virtual object according to the virtual object data corresponding to the identification result. By controlling the on-vehicle image acquisition device to capture marker images at an orientation angle matched with the terminal device's attitude information, and displaying on the terminal device according to those images, the method satisfies the need to capture marker images in the desired viewing direction and to display content according to the markers.
Referring to fig. 4, another embodiment of the present application provides a display method, which can be applied to a terminal device, and the method can include:
step S310: and acquiring attitude information of the terminal equipment.
Step S320: and generating an acquisition control instruction based on the rotation angle and the orientation of the terminal equipment, and sending the acquisition control instruction to image acquisition equipment on the carrier, wherein the acquisition control instruction is used for instructing the image acquisition equipment to acquire a scene image of a surrounding scene of the carrier by adopting the orientation and the rotation angle.
Step S330: the scene image is displayed as display content.
In this embodiment of the application, after receiving the scene image collected by the image acquisition device, the terminal device may directly display the scene image as the display content.
It is understood that in some scenes, the user needs to see the scene image around the vehicle, so that the user can know the scene content around the vehicle and perform related operations, such as remotely controlling the vehicle.
Step S340: when the attitude information of the terminal equipment is not changed within the first preset time, synthesizing the multi-frame images sent by the image acquisition equipment into the same image for displaying.
In this embodiment of the application, the acquisition control instruction is sent to the image acquisition device according to the attitude information of the terminal device, so that the device captures scene images at an orientation angle matched with the attitude information and the images are displayed on the terminal device. The attitude information of the terminal device can be monitored; when it changes, the acquisition control instruction is regenerated according to the current attitude information and sent to the image acquisition device for image capture.
Further, it can be determined whether the posture information of the terminal device has changed within the first preset time. If the posture information has not changed within the first preset time, the orientation angle of the image acquisition device does not need to change, and the multiple frames of images acquired by the image acquisition device can be synthesized into the same image for display.
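The monitoring logic above can be sketched as follows: regenerate the acquisition control instruction whenever the pose changes, and trigger a multi-frame sweep once the pose has been stable for the first preset time. The instruction format, field names, and threshold value are assumptions for illustration, not part of the disclosure.

```python
FIRST_PRESET_TIME = 2.0  # assumed: seconds of unchanged pose before a sweep

class PoseMonitor:
    """Tracks the terminal device's attitude (orientation + rotation angle)
    and decides which control instruction, if any, to send."""

    def __init__(self):
        self.last_pose = None
        self.last_change = None

    def update(self, orientation, rotation_angle, now):
        pose = (orientation, rotation_angle)
        if pose != self.last_pose:
            # Pose changed: regenerate the acquisition control instruction
            # from the current attitude information.
            self.last_pose = pose
            self.last_change = now
            return {"type": "acquire", "orientation": orientation,
                    "rotation_angle": rotation_angle}
        if now - self.last_change >= FIRST_PRESET_TIME:
            # Pose stable: instruct the capture device to sweep a preset
            # angle range so multiple frames can be synthesized into one image.
            return {"type": "sweep", "range_deg": (15, 30)}
        return None  # no change, not yet stable long enough

monitor = PoseMonitor()
cmd = monitor.update("north", 10.0, now=0.0)  # first pose -> acquire
```

Repeated updates with the same pose return no instruction until the preset time elapses, after which a sweep instruction is produced.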
In this embodiment of the application, when the posture information of the terminal device does not change within a first preset time, synthesizing a plurality of frames of images sent by the image acquisition device into the same image for display, including:
when the attitude information of the terminal equipment is not changed within a first preset time, sending a first control instruction to the image acquisition equipment, wherein the first control instruction is used for instructing the image acquisition equipment to rotate within a preset angle range according to a preset rotating speed and acquiring an image according to a preset frequency; the method comprises the steps of receiving images collected by image collecting equipment, and synthesizing multi-frame images collected by the image collecting equipment within a preset angle range into the same image for displaying.
It will be appreciated that the field of view of the image capture device is typically smaller than that of the human eye, so a single scene image captured by the device may not look sufficiently realistic to the user when displayed. Therefore, when the posture information of the terminal device has not changed within the first preset time, the image acquisition device is controlled to rotate within a certain range to acquire multiple frames, and the terminal device then synthesizes these frames into one image for display. In addition, to ensure that the capture times of the frames do not differ greatly, the image acquisition device is controlled to rotate at a preset speed and capture images at a preset frequency. For example, the image capture device may rotate within a range of 15° to 30° to capture scene images in multiple viewing directions, and the terminal device combines these images into one image for display, so that the user sees a scene image with a large viewing range.
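A toy sketch of synthesizing the multiple frames captured across the preset angle range into a single wide image. Real systems would use feature-based stitching (for example, OpenCV's `Stitcher`); here each frame is modeled as a 2D list of pixels with a known horizontal overlap between consecutive frames, which is an assumption made for illustration.

```python
def synthesize_frames(frames, overlap):
    """Concatenate frames row by row into one wide image, dropping
    `overlap` columns from the left edge of every frame after the first
    (the region already covered by the previous frame)."""
    if not frames:
        return []
    rows = len(frames[0])
    combined = [list(r) for r in frames[0]]
    for frame in frames[1:]:
        for i in range(rows):
            combined[i].extend(frame[i][overlap:])
    return combined

# Three 2x4 frames with a 1-column overlap synthesize into a 2x10 image.
f = [[[0, 1, 2, 3], [4, 5, 6, 7]]] * 3
wide = synthesize_frames(f, overlap=1)
```

Because the capture device rotates at a preset speed and captures at a preset frequency, consecutive frames share a predictable overlap, which is what makes this kind of synthesis feasible.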
Step S350: when the posture information of the terminal device has not changed within a second preset time and a change in the user's eyeball position is detected, a second control instruction is sent to the image acquisition device, the second control instruction instructing the image acquisition device to adjust its orientation and rotation angle according to the change.
In the embodiment of the application, besides controlling the image acquisition device by using the attitude information of the terminal device, enabling the image acquisition device to acquire the scene image around the carrier by using the orientation angle matched with the attitude information, the image acquisition device can be controlled by tracking the eyeballs of the user.
Specifically, the orientation angle of the image capturing device can be controlled by tracking the change in the focal point of the eyeball. It can be understood that, by acquiring the image of the user's eye, the image data of the retina and cornea of the user can be captured, and the terminal device constructs a 3D model of the eye according to the data, and tracks the focus of the eyeball through a three-dimensional space, so as to obtain the change information of the focus of the eyeball of the user. After the change information of the eyeballs of the user is obtained, a second control instruction is generated according to the change information of the eyeballs of the user and is sent to the image acquisition equipment so as to control the orientation and the rotation angle of the image acquisition equipment. For example, when the focus of the eyeball of the user is detected to move to the left, a second control instruction is generated and sent to the image acquisition equipment, the image acquisition equipment is instructed to rotate counterclockwise by a certain angle in the horizontal plane, and when the focus of the eyeball of the user is detected to move to the right, the generated second control instruction instructs the image acquisition equipment to rotate clockwise by a certain angle in the horizontal plane.
Therefore, the orientation and the rotation angle of the image acquisition equipment can be adjusted according to the eyeball position change of the user, and the purpose that the orientation and the rotation angle of the image acquisition equipment can be adjusted through the eyeball position change by the user is achieved.
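The mapping from eye-focus movement to the second control instruction described above can be sketched as follows. The rotation step size and the instruction fields are illustrative assumptions; the patent does not specify them.

```python
ROTATION_STEP_DEG = 5.0  # assumed per-update rotation increment

def second_control_instruction(prev_focus_x, curr_focus_x):
    """Map horizontal movement of the user's eye focus to a horizontal-plane
    rotation of the image capture device: a leftward shift yields a
    counterclockwise rotation, a rightward shift a clockwise one."""
    delta = curr_focus_x - prev_focus_x
    if delta < 0:
        return {"rotate": "counterclockwise", "degrees": ROTATION_STEP_DEG}
    if delta > 0:
        return {"rotate": "clockwise", "degrees": ROTATION_STEP_DEG}
    return None  # focus unchanged: no instruction needed

cmd = second_control_instruction(0.5, 0.3)  # focus moved left
```

In practice the focus positions would come from the eyeball 3D model built from retina and cornea image data, as described in the paragraph above.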
In this embodiment of the application, the shooting parameters of the image acquisition device can also be adjusted through the terminal device. For example, a control operation performed by the user on the terminal device can generate a control instruction that is sent to the image capturing device, instructing it to adjust shooting parameters such as focal length, exposure time, and aperture.

In one application scenario, the vehicle may be in communication with a remote controller and may adjust its driving direction according to a vehicle control command from the remote controller. When the vehicle control instruction changes the driving direction, the vehicle adjusts its driving direction upon receiving the instruction. Because the driving direction changes, the content of the scene image of the vehicle's surroundings captured by the image acquisition device also changes; the image acquisition device sends the captured scene image to the terminal device, which displays it upon receipt. For example, when the remote controller, under the user's operation, sends a control instruction for the vehicle to turn right, the vehicle turns right in response; if the orientation of the image acquisition device is the same as that of the vehicle's head, the device then captures the scene content facing the vehicle's head after the turn. In this way, the user can remotely change the vehicle's driving direction to change the content of the scene image captured by the image acquisition device.
According to the display method provided by this embodiment of the application, the acquisition control instruction is sent to the image acquisition device according to the attitude information of the terminal device, controlling the device to capture scene images at an orientation angle matched with the attitude information for display on the terminal device. When the attitude information of the terminal device does not change within a certain time, the terminal device controls the image acquisition device to capture images from multiple viewing angles and synthesizes them into one image for display, so that the user sees an image with a large field of view, improving viewing realism.
Referring to fig. 5, a block diagram of a display device 400 according to an embodiment of the present disclosure is shown, where the display device 400 is applied to a terminal device. The display device 400 may include: a pose acquisition module 410, an acquisition control module 420, and a content display module 430. The gesture obtaining module 410 is configured to obtain gesture information of the terminal device; the acquisition control module 420 is configured to generate an acquisition control instruction based on the attitude information, and send the acquisition control instruction to an image acquisition device on the vehicle, where the acquisition control instruction is used to instruct the image acquisition device to acquire a scene image of a surrounding scene of the vehicle at an orientation angle matched with the attitude information; the content display module 430 is configured to receive a scene image captured by the image capturing device, and display content according to the scene image.
In the embodiment of the present application, the posture information may include a rotation angle and an orientation of the terminal device. Referring to fig. 6, the acquisition control module 420 may include: a control instruction generating unit 421 and a control instruction transmitting unit 422. The control instruction generating unit 421 is configured to generate an acquisition control instruction based on the rotation angle and the orientation of the terminal device; the control instruction sending unit 422 is configured to send an acquisition control instruction to the image acquisition device on the vehicle, where the acquisition control instruction is used to instruct the image acquisition device to acquire a scene image of a surrounding scene of the vehicle by adopting an orientation and a rotation angle.
In an embodiment of the present application, when the scene image is a marker image including markers in surrounding scenes of the vehicle, please refer to fig. 7, the content display module 430 may include: an image recognition unit 431, a data acquisition unit 432, and an object display unit 433. The image recognition unit 431 is configured to recognize the marker image and obtain a recognition result of the marker; the data acquisition unit 432 is configured to acquire virtual object data corresponding to the recognition result; the object display unit 433 is configured to construct a virtual object from the virtual object data, and display the virtual object as display content.
Further, the recognition result includes the position and the posture of the marker relative to the image acquisition device. The object display unit 433 may be specifically configured to: acquiring a display position of the virtual object based on the position and the posture of the marker relative to the image acquisition equipment; and displaying the virtual object according to the display position.
In this embodiment, the content display module 430 may also be specifically configured to: the scene image is displayed as display content.
In the embodiment of the present application, please refer to fig. 8, the display apparatus 400 may further include: a composite image display module 440. The composite image display module 440 is configured to: when the attitude information of the terminal equipment is not changed within the first preset time, synthesizing the multi-frame images sent by the image acquisition equipment into the same image for displaying.
Further, the composite image display module 440 may be specifically configured to: when the attitude information of the terminal equipment is not changed within a first preset time, sending a first control instruction to the image acquisition equipment, wherein the first control instruction is used for instructing the image acquisition equipment to rotate within a preset angle range according to a preset rotating speed and acquiring an image according to a preset frequency; the method comprises the steps of receiving images collected by image collecting equipment, and synthesizing multi-frame images collected by the image collecting equipment within a preset angle range into the same image for displaying.
In the embodiment of the present application, please refer to fig. 8, the display apparatus 400 may further include: a rotation control module 450. The rotation control module 450 may be configured to: when the posture information of the terminal device has not changed within a second preset time and a change in the user's eyeball position is detected, send a second control instruction to the image acquisition device, the second control instruction instructing the image acquisition device to adjust its orientation and rotation angle according to the change.
In an embodiment of the present application, the display device may further include: and an orientation adjusting module. The orientation adjustment module may be configured to, when it is detected that the posture information of the terminal device changes with the movement of the head of the user, send a third control instruction to the image capturing device according to the change, where the third control instruction is used to instruct the image capturing device to adjust the orientation and the rotation angle according to the change; and receiving a scene image acquired by the image acquisition equipment, and displaying corresponding content in the head-mounted display device according to the scene image.
It can be clearly understood by those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described apparatuses and modules may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the several embodiments provided in the present application, the coupling between the modules may be electrical, mechanical or other type of coupling.
In addition, functional modules in the embodiments of the present application may be integrated into one processing module, or each of the modules may exist alone physically, or two or more modules are integrated into one module. The integrated module can be realized in a hardware mode, and can also be realized in a software functional module mode.
To sum up, the display method and apparatus provided by the embodiments of the present application acquire the attitude information of the terminal device, generate an acquisition control instruction based on the attitude information, and send it to the image acquisition device on the carrier, the instruction directing the device to capture a scene image of the carrier's surroundings at an orientation angle matched with the attitude information. The terminal device then receives the captured scene image and displays content according to it. In this way, the image acquisition device on the carrier captures scene images at an orientation angle matched with the attitude information of the terminal device, and the images are displayed on the terminal device, allowing the user to view the scene from the desired viewing direction.
Referring to fig. 9, a block diagram of a terminal device according to an embodiment of the present application is shown. The terminal device 100 may be a terminal device capable of running an application, such as a smart phone, a tablet computer, an electronic book, or the like. The terminal device 100 in the present application may include one or more of the following components: a processor 110, a memory 120, and one or more applications, wherein the one or more applications may be stored in the memory 120 and configured to be executed by the one or more processors 110, the one or more programs configured to perform a method as described in the aforementioned method embodiments.
Processor 110 may include one or more processing cores. The processor 110 connects various parts within the entire terminal device 100 using various interfaces and lines, and performs various functions of the terminal device 100 and processes data by running or executing instructions, programs, code sets, or instruction sets stored in the memory 120 and calling data stored in the memory 120. Alternatively, the processor 110 may be implemented in hardware using at least one of Digital Signal Processing (DSP), Field-Programmable Gate Array (FPGA), and Programmable Logic Array (PLA). The processor 110 may integrate one or more of a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), a modem, and the like. The CPU mainly handles the operating system, user interface, applications, and the like; the GPU is responsible for rendering and drawing display content; the modem handles wireless communications. It is understood that the modem may not be integrated into the processor 110 and may instead be implemented by a separate communication chip.
The memory 120 may include a Random Access Memory (RAM) or a Read-Only Memory (ROM). The memory 120 may be used to store instructions, programs, code sets, or instruction sets. The memory 120 may include a program storage area and a data storage area, wherein the program storage area may store instructions for implementing an operating system, instructions for implementing at least one function (such as a touch function, a sound playing function, an image playing function, etc.), instructions for implementing the various method embodiments described above, and the like. The data storage area may store data created by the terminal device 100 in use, and the like.
Referring to fig. 10, a block diagram of a computer-readable storage medium according to an embodiment of the present application is shown. The computer readable medium 800 has stored therein a program code that can be called by a processor to execute the method described in the above method embodiments.
The computer-readable storage medium 800 may be an electronic memory such as a flash memory, an EEPROM (electrically erasable programmable read only memory), an EPROM, a hard disk, or a ROM. Alternatively, the computer-readable storage medium 800 includes a non-volatile computer-readable storage medium. The computer readable storage medium 800 has storage space for program code 810 to perform any of the method steps of the method described above. The program code can be read from or written to one or more computer program products. The program code 810 may be compressed, for example, in a suitable form.
Finally, it should be noted that: the above embodiments are only used to illustrate the technical solutions of the present application, and not to limit the same; although the present application has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not necessarily depart from the spirit and scope of the corresponding technical solutions in the embodiments of the present application.

Claims (7)

1. A display method is applied to a terminal device, the terminal device is an AR head-mounted display device, and the display method is used for simulating an application scene of driving, and the method comprises the following steps:
acquiring attitude information of the terminal equipment;
generating an acquisition control instruction based on the attitude information, and sending the acquisition control instruction to image acquisition equipment on the carrier, wherein the acquisition control instruction is used for instructing the image acquisition equipment to acquire a scene image of a surrounding scene of the carrier by adopting an orientation angle matched with the attitude information;
receiving a scene image acquired by the image acquisition equipment;
when the scene image is a marker image containing markers in the surrounding scene of the carrier, identifying the marker image to obtain an identification result of the markers, wherein the identification result comprises the positions and postures of the markers relative to the image acquisition equipment;
acquiring virtual object data corresponding to the identification result;
constructing a virtual object according to the virtual object data, wherein the virtual object comprises a virtual road and a virtual obstacle;
acquiring a display position of the virtual object based on the position and the posture of the marker relative to the image acquisition equipment;
overlapping the virtual object and the scene image according to the display position for displaying;
when the attitude information of the terminal equipment is not changed within a first preset time, sending a first control instruction to the image acquisition equipment, wherein the first control instruction is used for instructing the image acquisition equipment to rotate within a preset angle range according to a preset rotating speed and acquiring an image according to a preset frequency;
and receiving the image collected by the image collecting equipment, and synthesizing a plurality of frames of images collected by the image collecting equipment within a preset angle range into the same image for displaying.
2. The method of claim 1, wherein the attitude information includes a rotation angle and an orientation of the terminal device, and wherein generating acquisition control instructions based on the attitude information and sending the acquisition control instructions to an image acquisition device on a vehicle comprises:
generating an acquisition control instruction based on the rotation angle and the orientation of the terminal equipment;
and sending the acquisition control instruction to image acquisition equipment on the carrier, wherein the acquisition control instruction is used for instructing the image acquisition equipment to acquire the scene image of the surrounding scene of the carrier by adopting the orientation and the rotation angle.
3. The method according to claim 1 or 2, characterized in that the method further comprises:
when the posture information of the terminal device has not changed within a second preset time and a change in the eyeball position of the user is detected, sending a second control instruction to the image acquisition device, wherein the second control instruction is used for instructing the image acquisition device to adjust the orientation and the rotation angle according to the change.
4. The method according to claim 1 or 2, wherein the terminal device is a head-mounted display device or is provided with a head-mounted display device, the method further comprising:
when detecting that the attitude information of the terminal device changes with the movement of the user's head, sending a third control instruction to the image acquisition device according to the change, wherein the third control instruction instructs the image acquisition device to adjust its orientation and rotation angle according to the change;
and receiving a scene image captured by the image acquisition device, and displaying corresponding content in the head-mounted display device according to the scene image.
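Claim 4's head-tracking behavior, mirroring head-pose changes onto the remote camera, can be sketched as below. The deadband threshold and the message fields are illustrative assumptions added to make the example concrete; the claim itself only requires adjusting the orientation and rotation angle according to the change.

```python
def head_delta_instruction(prev_pose, curr_pose, deadband_deg=0.5):
    """Turn a head-pose change (yaw, pitch in degrees) into a 'third
    control instruction' that mirrors the change onto the image
    acquisition device.  Deadband and field names are hypothetical."""
    d_yaw = curr_pose[0] - prev_pose[0]
    d_pitch = curr_pose[1] - prev_pose[1]
    if max(abs(d_yaw), abs(d_pitch)) < deadband_deg:
        return None  # movement too small: hold the current camera view
    return {"cmd": "ADJUST", "d_yaw": d_yaw, "d_pitch": d_pitch}
```

A small deadband avoids streaming adjustment commands for sensor jitter while the head is effectively still, which also matches claim 1's "attitude unchanged" condition.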
5. A display apparatus, applied to a terminal device in an application scene of simulated driving, the apparatus comprising: an attitude acquisition module, an acquisition control module, a content display module and a composite image display module, wherein
the attitude acquisition module is configured to acquire attitude information of the terminal device;
the acquisition control module is configured to generate an acquisition control instruction based on the attitude information and send the acquisition control instruction to an image acquisition device on a carrier, wherein the acquisition control instruction instructs the image acquisition device to capture a scene image of the scene surrounding the carrier using an orientation and a rotation angle matching the attitude information;
the content display module is configured to receive the scene image captured by the image acquisition device; when the scene image is a marker image containing a marker in the scene surrounding the carrier, recognize the marker image to obtain a recognition result of the marker, the recognition result including the position and posture of the marker relative to the image acquisition device; acquire virtual object data corresponding to the recognition result; construct a virtual object from the virtual object data, the virtual object including a virtual road and a virtual obstacle; acquire a display position of the virtual object based on the position and posture of the marker relative to the image acquisition device; and superimpose the virtual object on the scene image at the display position for display;
the composite image display module is configured to send a first control instruction to the image acquisition device when the attitude information of the terminal device has not changed within a first preset time, wherein the first control instruction instructs the image acquisition device to rotate within a preset angle range at a preset rotation speed and to capture images at a preset frequency; and to receive the images captured by the image acquisition device and composite the plurality of image frames captured within the preset angle range into a single image for display.
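The content display module's step of deriving a display position from the marker's position relative to the camera can be sketched with a standard pinhole projection. The intrinsic values (focal length in pixels, principal point) are placeholder assumptions, as is the camera-frame axis convention; the patent does not prescribe a projection model.

```python
def project_marker(marker_cam_xyz, focal_px=500.0, cx=320.0, cy=240.0):
    """Pinhole projection of the marker's position in the camera frame
    (x right, y down, z forward, in metres) to a pixel coordinate where
    the virtual object is overlaid.  Intrinsics are placeholder values."""
    x, y, z = marker_cam_xyz
    if z <= 0:
        return None  # marker behind the camera: nothing to overlay
    return (cx + focal_px * x / z, cy + focal_px * y / z)
```

A marker 10 cm to the right at 1 m depth lands 50 px right of the image center, which is where the virtual road or obstacle would be anchored before rendering.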
6. A terminal device, comprising:
one or more processors;
a memory;
one or more applications, wherein the one or more applications are stored in the memory and configured to be executed by the one or more processors, the one or more applications being configured to perform the method of any one of claims 1 to 4.
7. A computer-readable storage medium, having stored thereon program code that can be invoked by a processor to perform the method according to any one of claims 1 to 4.
CN201810924523.3A 2018-07-23 2018-08-14 Display method, display device, terminal equipment and storage medium Active CN110825333B (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
CN201810924523.3A CN110825333B (en) 2018-08-14 2018-08-14 Display method, display device, terminal equipment and storage medium
PCT/CN2019/097128 WO2020020102A1 (en) 2018-07-23 2019-07-22 Method for generating virtual content, terminal device, and storage medium
US16/666,429 US11049324B2 (en) 2018-07-23 2019-10-29 Method of displaying virtual content based on markers


Publications (2)

Publication Number Publication Date
CN110825333A CN110825333A (en) 2020-02-21
CN110825333B true CN110825333B (en) 2021-12-21

Family

ID=69547278

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810924523.3A Active CN110825333B (en) 2018-07-23 2018-08-14 Display method, display device, terminal equipment and storage medium

Country Status (1)

Country Link
CN (1) CN110825333B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111595346B (en) * 2020-06-02 2022-04-01 浙江商汤科技开发有限公司 Navigation reminding method and device, electronic equipment and storage medium
CN111710047A (en) * 2020-06-05 2020-09-25 北京有竹居网络技术有限公司 Information display method and device and electronic equipment
CN112261295B (en) * 2020-10-22 2022-05-20 Oppo广东移动通信有限公司 Image processing method, device and storage medium
WO2022109774A1 (en) * 2020-11-24 2022-06-02 深圳市大疆创新科技有限公司 Camera control method, device, and system
CN113286160A (en) * 2021-05-19 2021-08-20 Oppo广东移动通信有限公司 Video processing method, video processing device, electronic equipment and storage medium

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105898346A (en) * 2016-04-21 2016-08-24 联想(北京)有限公司 Control method, electronic equipment and control system
CN105931272A (en) * 2016-05-06 2016-09-07 上海乐相科技有限公司 Method and system for tracking object in motion
CN105955456A (en) * 2016-04-15 2016-09-21 深圳超多维光电子有限公司 Virtual reality and augmented reality fusion method, device and intelligent wearable equipment
CN106131483A (en) * 2016-06-24 2016-11-16 宇龙计算机通信科技(深圳)有限公司 A kind of method for inspecting based on virtual reality and relevant device, system
CN107340870A (en) * 2017-07-13 2017-11-10 深圳市未来感知科技有限公司 A kind of fusion VR and AR virtual reality display system and its implementation

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120212406A1 (en) * 2010-02-28 2012-08-23 Osterhout Group, Inc. Ar glasses with event and sensor triggered ar eyepiece command and control facility of the ar eyepiece
US20140340424A1 (en) * 2013-05-17 2014-11-20 Jeri J. Ellsworth System and method for reconfigurable projected augmented/virtual reality appliance
CN105357433B (en) * 2015-10-13 2018-12-07 哈尔滨工程大学 A kind of adaptive method for panoramic imaging of high speed rotation focal length
CN105654808A (en) * 2016-02-03 2016-06-08 北京易驾佳信息科技有限公司 Intelligent training system for vehicle driver based on actual vehicle
CN107092314A (en) * 2017-06-29 2017-08-25 南京多伦科技股份有限公司 A kind of head-mounted display apparatus and detection method that driving behavior monitor detection is provided
CN108022302B (en) * 2017-12-01 2021-06-29 深圳市天界幻境科技有限公司 Stereo display device of Inside-Out space orientation's AR

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105955456A (en) * 2016-04-15 2016-09-21 深圳超多维光电子有限公司 Virtual reality and augmented reality fusion method, device and intelligent wearable equipment
CN105898346A (en) * 2016-04-21 2016-08-24 联想(北京)有限公司 Control method, electronic equipment and control system
CN105931272A (en) * 2016-05-06 2016-09-07 上海乐相科技有限公司 Method and system for tracking object in motion
CN106131483A (en) * 2016-06-24 2016-11-16 宇龙计算机通信科技(深圳)有限公司 A kind of method for inspecting based on virtual reality and relevant device, system
CN107340870A (en) * 2017-07-13 2017-11-10 深圳市未来感知科技有限公司 A kind of fusion VR and AR virtual reality display system and its implementation

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Research on Marker Design and Recognition Methods in Augmented Reality; Wu Hongfei; China Master's Theses Full-text Database, Information Science and Technology; 2011-04-15; pp. I138-812 *

Also Published As

Publication number Publication date
CN110825333A (en) 2020-02-21

Similar Documents

Publication Publication Date Title
CN110825333B (en) Display method, display device, terminal equipment and storage medium
US20210131790A1 (en) Information processing apparatus, information processing method, and recording medium
US10999412B2 (en) Sharing mediated reality content
US11049324B2 (en) Method of displaying virtual content based on markers
JP2013258614A (en) Image generation device and image generation method
JP6899875B2 (en) Information processing device, video display system, information processing device control method, and program
US11244511B2 (en) Augmented reality method, system and terminal device of displaying and controlling virtual content via interaction device
US11443540B2 (en) Information processing apparatus and information processing method
CN113448343B (en) Method, system and readable medium for setting a target flight path of an aircraft
JP2023126474A (en) Systems and methods for augmented reality
CN108885487A (en) A kind of gestural control method of wearable system and wearable system
KR102190743B1 (en) AUGMENTED REALITY SERVICE PROVIDING APPARATUS INTERACTING WITH ROBOT and METHOD OF THEREOF
US20190172271A1 (en) Information processing device, information processing method, and program
JP2015118442A (en) Information processor, information processing method, and program
WO2017061890A1 (en) Wireless full body motion control sensor
CN113498531A (en) Head-mounted information processing device and head-mounted display system
JP7405083B2 (en) Information processing device, information processing method, and program
JP6200604B1 (en) Spherical camera robot altitude adjustment system, Spherical camera robot altitude adjustment method and program
JP6554139B2 (en) Information processing method, apparatus, and program for causing computer to execute information processing method
JP2017086542A (en) Image change system, method, and program
CN111292424A (en) Multi-view 360-degree VR content providing system
WO2021079636A1 (en) Display control device, display control method and recording medium
CN111651031B (en) Virtual content display method and device, terminal equipment and storage medium
JP6856572B2 (en) An information processing method, a device, and a program for causing a computer to execute the information processing method.
KR20210150881A (en) Electronic apparatus and operaintg method thereof

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20221109

Address after: 100000 No. 24, Floor 3, Block A, Building 1, No. 26, Lugu East Street, Shijingshan District, Beijing

Patentee after: Suiguang Technology (Beijing) Co.,Ltd.

Address before: 510335 1401, international purchasing center, 8 Pazhou Avenue East, Haizhu District, Guangzhou City, Guangdong Province

Patentee before: GUANGDONG VIRTUAL REALITY TECHNOLOGY Co.,Ltd.
