CN212847452U - Display device - Google Patents


Info

Publication number
CN212847452U
CN212847452U (Application No. CN202021566648.2U)
Authority
CN
China
Prior art keywords
display device
real scene
camera
image
display
Prior art date
Legal status
Active
Application number
CN202021566648.2U
Other languages
Chinese (zh)
Inventor
栾青
刘畅
Current Assignee
Nanjing Manhou Network Technology Co ltd
Original Assignee
Beijing Sensetime Technology Development Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Sensetime Technology Development Co., Ltd.
Priority claimed from CN202021566648.2U
Application granted
Publication of CN212847452U
Legal status: Active

Landscapes

  • Processing Or Creating Images (AREA)

Abstract

The application provides a display device, including: a camera for acquiring a real scene image of a real scene; a display device connected to the camera and configured to display an AR image in which the real scene image and a virtual scene image are superimposed; and a base fixedly connected to the display device to support it. The display equipment fuses the AR image displayed by the display device with the environment around the display device, effectively improving the user's sense of immersion.

Description

Display device
Technical Field
The application relates to the field of display equipment, in particular to display equipment.
Background
The augmented reality technology can fuse the virtual scene image and the real scene image, so that the visual effect of the virtual scene image in the real scene image is realized.
SUMMARY OF THE UTILITY MODEL
In view of this, embodiments of the present application provide a display device to address the difficulty that conventional display devices have in producing an immersive sensation.
In order to achieve the above purpose, the technical solution of the embodiment of the present application is implemented as follows:
an embodiment of the present application provides a display device, including: the camera is used for acquiring a real scene image of a real scene; the display device is connected to the camera and used for displaying the AR image formed by overlapping the real scene image and the virtual scene image; and the base is fixedly connected with the display device so as to support the display device.
Further, the display device is provided with a first side surface and a second side surface, wherein the first side surface is a display surface, and the camera is arranged on the second side surface.
Further, still include: the processor is arranged in the base or the display device, connected to the camera and used for receiving a real scene image acquired by the camera, constructing a three-dimensional virtual scene model based on the real scene image and rendering the virtual scene image based on the three-dimensional virtual scene model.
Further, the camera comprises an RGB camera, and the RGB camera is used for acquiring an RGB image of a real scene; the processor is used for receiving the RGB images acquired by the RGB camera, identifying position information of a real scene based on the RGB images, and determining the three-dimensional virtual scene model matched with the position information.
Furthermore, the camera also comprises at least two grayscale cameras, and the at least two grayscale cameras are used for respectively acquiring grayscale images of a real scene; the processor is used for receiving the gray images collected by the at least two gray cameras and determining the depth information of a real scene based on the at least two gray images.
Further, the display device includes: a first display screen, arranged on the first side face of the display device and used for displaying the AR image formed by overlapping the real scene image and the virtual scene image; and a wrapping piece forming an accommodating space for accommodating the first display screen, the wrapping piece wrapping the side wall of the first display screen.
Further, the wrapping piece partially extends out of the first side face, and the extending portion of the wrapping piece inclines from one end close to the first display screen to one end far away from the first display screen towards one side far away from the accommodating space.
Further, the first display screen is a non-transparent screen or a transparent screen.
Further, the base is fixedly connected to the wrapping piece, and one surface of the base is attached to one side surface of the wrapping piece, which faces away from the accommodating space.
Further, the base has a triangular prism structure.
Further, the display device includes: the first display screen is arranged on the first side face of the display device and used for displaying the AR image formed by overlapping the real scene image and the virtual scene image; a second display screen disposed on the second side of the display device.
Further, still include: and the loudspeaker is arranged on the display device or the base and is used for playing the audio data corresponding to the AR image.
Further, still include: the radiating hole is arranged at the bottom of the display device.
The display equipment provided by the embodiment of the application comprises a camera, a display device and a base. The camera is used for collecting a real scene image of a real scene, the display device is connected to the camera and used for displaying an AR image in which the real scene image and a virtual scene image are superimposed, and the base is fixedly connected to the display device to support it. Owing to the support of the base, the display device can stand on the ground and can be set slightly lower or higher than the height of a user, and the real scene image can be displayed by the display device in equal proportion, that is, the size of the real scene image displayed by the display device is the same as the size of the real scene that the user would observe directly. The AR image displayed by the display device is thus fused with the environment around the display device, effectively improving the user's immersive sense of presence.
Drawings
Fig. 1 is a first perspective view of a display device provided in an embodiment of the present application;
fig. 2 is a second perspective view of a display device provided in an embodiment of the present application;
fig. 3 is a third perspective view of a display device according to an embodiment of the present application.
Description of reference numerals:
100-a display device; 110-a display device; 111-a first display screen; 112-a wrapper; 113-heat dissipation holes; 120-a base; 130-a camera; 140-loudspeaker.
Detailed Description
The specific features of the embodiments described in the detailed description may be combined in various ways without contradiction; for example, different embodiments may be formed by different combinations of specific features. To avoid unnecessary repetition, the various possible combinations of specific features in the present application are not described separately.
In the description of the embodiments of the present application, it should be noted that, unless otherwise specified and limited, the term "connected" should be interpreted broadly, for example, as an electrical connection, a communication between two elements, a direct connection, or an indirect connection via an intermediate, and the specific meaning of the terms may be understood by those skilled in the art according to specific situations.
It should be noted that the terms "first" and "second" in the embodiments of the present application are only used to distinguish similar objects and do not denote a particular order. It should be understood that "first" and "second" objects may be interchanged where permitted, so that the embodiments of the present application described herein can be implemented in an order other than the one illustrated or described.
As shown in fig. 1 and 2, an embodiment of the present application provides a display apparatus 100 including a camera 130, a display device 110, and a base 120.
The camera 130 is configured to acquire a real scene image of a real scene. Specifically, the real scene image may be an image that exists in a real scene and can be observed by a user; it may also be an image that is not present in the real scene but is displayed by means of, for example, a display screen, a curtain, etc., and that can be observed by the user. For example, the camera 130 captures a display screen having an image being displayed in a real scene, and the display screen and the image displayed in the display screen are captured as a real scene image by the camera 130.
The display device 110 is connected to the camera 130. Specifically, the display device 110 may be electrically connected to the camera 130 through a wire, so that the real scene image collected by the camera 130 is transmitted to the display device 110 as an electrical signal; the display device 110 may instead be connected to the camera 130 through an optical fiber, so that the real scene image is transmitted as an optical signal; or the display device 110 may be connected to the camera 130 wirelessly, for example over a wireless network or Bluetooth, so that the real scene image is transmitted as a radio signal. The display device 110 is used for displaying an AR image in which the real scene image and a virtual scene image are superimposed. Specifically, a three-dimensional virtual scene model is constructed based on the real scene image (i.e., the three-dimensional virtual scene model represents the real scene), and the virtual scene image is rendered based on the three-dimensional virtual scene model.
The base 120 is fixedly coupled to the display device 110 to support the display device 110. Specifically, the base 120 may be fixedly connected to the display device 110 by bonding, welding, or the like, may be detachably connected by a structure such as a buckle or a screw, or may be integrated with the display device 110; it is only necessary that the base 120 supports the display device 110. The base 120 provides firm support, allowing the display device 110 to reach a height slightly below or above the user. For example, the display device 110 may be set to a height of 1.5 m to 2.5 m. Standing in front of the display device 110, the user can see the real scene image of the scene behind it, with the virtual scene image superimposed as an AR image. By appropriate configuration, the real scene image can be displayed in equal proportion, that is, the size of the real scene image shown on the display device 110 matches the size of the real scene the user would observe directly, which significantly improves the user's immersive sense of presence.
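The "equal proportion" condition described above can be sketched numerically: given the camera's horizontal field of view and its distance to the scene, one can compute how wide a slice of the real world the captured frame covers, and hence how large the scene appears on a display of a given width. The following is a minimal illustrative sketch, not part of the patent; the function name and parameters are assumptions.

```python
import math

def life_size_scale(scene_width_m, display_width_m, cam_hfov_deg, cam_to_scene_m):
    """Return the on-screen size of the scene divided by its real size.

    A result of 1.0 means the display reproduces the scene in equal
    proportion (life-size), as described above.
    """
    # Width of the real-world slice the camera captures at that distance
    frame_width_m = 2 * cam_to_scene_m * math.tan(math.radians(cam_hfov_deg) / 2)
    # Fraction of the captured frame the scene occupies, mapped onto the screen
    displayed_width_m = (scene_width_m / frame_width_m) * display_width_m
    return displayed_width_m / scene_width_m
```

For instance, a camera with a 90° horizontal field of view placed 1 m behind the display captures a 2 m wide slice of the scene, so a 2 m wide display screen reproduces that scene life-size.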
The display equipment provided by the embodiment of the application comprises a camera, a display device and a base. The camera collects a real scene image of a real scene, the display device is connected to the camera and displays an AR image in which the real scene image and a virtual scene image are superimposed, and the base is fixedly connected to the display device to support it. Owing to the support of the base, the display device can stand on the ground, can be set slightly lower or higher than the height of a user, and can display the real scene image in equal proportion, that is, the size of the real scene image displayed by the display device is the same as the size of the real scene the user would observe directly. The AR image displayed by the display device is thus fused with the environment around the display device, realizing a see-through effect and effectively improving the user's immersive experience.
As shown in fig. 1 and 2, in some embodiments of the present application, the display device 110 has a first side (e.g., the front side in fig. 1) and a second side (e.g., the front side in fig. 2). The first side is a display surface, that is, the display device 110 displays the AR image on the first side. Specifically, the entire first side may be the display surface, or the display surface may cover only part of the first side, the remainder being another structure such as a frame. The camera 130 is disposed on the second side. Specifically, the camera 130 may be fixedly mounted on the surface of the second side; may be embedded in the display device 110 with its capture direction substantially the same as the orientation of the second side (for example, the included angle between the capture direction of the camera 130 and the orientation of the second side is less than or equal to 20°); or may protrude from the surface of the second side. It is only necessary that the camera 130 can acquire a real scene image of the real scene the second side faces. The second side can be located on the back of the first side, so that the camera 130 collects the real scene image facing the second side while the user, standing directly opposite the first side, sees the real scene behind it, achieving a see-through effect.
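The "less than or equal to 20°" criterion above is a simple angle check between two direction vectors. A hedged sketch follows; the function and vector names are illustrative, not from the patent.

```python
import math

def alignment_angle_deg(cam_dir, side_normal):
    """Angle in degrees between the camera's capture direction and the
    outward normal of the second side; <= 20 degrees satisfies the
    alignment criterion described above."""
    dot = sum(a * b for a, b in zip(cam_dir, side_normal))
    norm = math.sqrt(sum(a * a for a in cam_dir)) * math.sqrt(sum(b * b for b in side_normal))
    # Clamp to [-1, 1] to guard against floating-point drift before acos
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))
```

A camera tilted 10° off the side's normal would pass the check; one tilted 30° would not.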
In some embodiments of the present application, the display apparatus further includes a processor (not shown), which may be disposed in the base or in the display device; it only needs to be fixed in place. Preferably, the processor is disposed within the display device so that it can be connected to the display device by a wire or the like. The processor is connected to the camera to receive the real scene image collected by the camera. Specifically, the processor may be electrically connected to the camera through a wire, so that the real scene image is transmitted to the processor as an electrical signal; it may be connected to the camera through an optical fiber, so that the real scene image is transmitted as an optical signal; or it may be connected to the camera wirelessly, for example over a wireless network or Bluetooth, so that the real scene image is transmitted as a radio signal.
The processor constructs a three-dimensional virtual scene model based on the received real scene image, and renders the virtual scene image based on the three-dimensional virtual scene model. Specifically, the real scene image acquired by the camera may include a plurality of grayscale images of the real scene, and depth information of the real scene may be obtained by analyzing the grayscale images; the real scene image may further include an RGB image, and the pixel characteristics of the real scene may be obtained by analyzing the RGB image. Therefore, by analyzing the grayscale images and the RGB image or images, the pixel characteristics and depth characteristics of different objects in the real scene image can be obtained, and the three-dimensional virtual scene model can be constructed from them. When rendering the virtual scene image, the processor analyzes the virtual space coordinates of positions in the three-dimensional virtual scene model; that is, the three-dimensional virtual scene model and the virtual scene image share one virtual space coordinate system, and by determining their relative positions in that coordinate system the processor can determine whether the virtual scene image is occluded by a solid object in the real scene corresponding to the three-dimensional virtual scene model.
For example, suppose area A in the three-dimensional virtual scene model corresponds to a solid object a in the real scene, and the virtual scene image is a penguin (or another figure such as a dinosaur). If a part of the penguin is occluded by area A, that part is not rendered; if a part of the penguin is not occluded by area A, it is rendered. The three-dimensional virtual scene model itself can be processed into a transparent form, that is, the user cannot see the transparent model on the display device but sees the AR image in which the virtual scene image is superimposed on the real scene image, with the layer of the virtual scene image above the layer of the real scene image, so that the virtual scene image covers at least part of the real scene image.
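The occlusion rule in the penguin example — draw a virtual pixel only where no real surface is nearer to the camera — can be sketched as a per-pixel depth test. The array names and the use of NumPy are assumptions for illustration, not part of the patent.

```python
import numpy as np

def composite_ar(real_rgb, real_depth, virt_rgb, virt_depth, virt_mask):
    """Superimpose a rendered virtual object on the real scene image.

    A virtual pixel is drawn only where the object exists (virt_mask)
    and its depth is smaller than the real scene's depth, so real
    geometry (e.g. "area A") occludes the object (the "penguin").
    """
    out = real_rgb.copy()
    # Object pixel wins only where it exists and is closer than the real scene
    visible = virt_mask & (virt_depth < real_depth)
    out[visible] = virt_rgb[visible]
    return out
```

Pixels of the virtual object behind real geometry are simply left showing the real scene image, which produces the partial-occlusion effect described above.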
In some embodiments of the present application, the camera may include an RGB camera (not shown) for capturing RGB images of the real scene. Specifically, the RGB camera may be provided separately, or combined with the grayscale cameras into an integrated structure; for example, an RGB depth camera may be used on its own. The RGB camera transmits the collected RGB image to the processor, which receives the RGB image, recognizes position information of the real scene based on it, and determines the three-dimensional virtual scene model matching that position information. That is, a three-dimensional virtual scene model matching the RGB image is constructed by analyzing the position information of the different objects in the RGB image.
The position information of the different objects in the RGB image may be analyzed by comparing the image against a large number of images stored in the processor's memory, identifying the different objects and the positional relationships between them, so as to construct a matching three-dimensional virtual scene model.
In some embodiments of the present application, the camera may further include at least two grayscale cameras (not shown), where the grayscale cameras are used to acquire grayscale images of a real scene. Specifically, the grayscale camera may be set separately, or may be combined with other cameras. The at least two grayscale cameras transmit the acquired grayscale image information to the processor, and the processor is used for receiving the at least two grayscale images and determining the depth information of the real scene based on the at least two grayscale images. After determining the depth information of the real scene, the processor may determine position information of different objects within the image of the real scene, so as to construct a three-dimensional virtual scene model.
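Depth from two grayscale cameras rests on stereo triangulation: a feature's horizontal shift (disparity) between the two images maps to depth via Z = f·B/d, where f is the focal length in pixels and B is the baseline between the cameras. The patent does not specify an algorithm; the following conversion is a minimal sketch with illustrative parameter names.

```python
import numpy as np

def disparity_to_depth(disparity_px, focal_px, baseline_m):
    """Convert a stereo disparity map to metric depth: Z = f * B / d.

    Pixels with non-positive disparity (no match found between the two
    grayscale images) are marked as infinitely far.
    """
    d = np.asarray(disparity_px, dtype=float)
    depth = np.full(d.shape, np.inf)
    valid = d > 0
    depth[valid] = focal_px * baseline_m / d[valid]
    return depth
```

With f = 700 px and a 10 cm baseline, a disparity of 70 px corresponds to a depth of 1 m; a depth map of this kind supplies the position information used to construct the three-dimensional virtual scene model.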
As shown in fig. 1 and 2, in some embodiments of the present application, the display device 110 includes a first display screen 111 and a wrapping member 112. The first display screen 111 is disposed on the first side (i.e., the front surface in fig. 1) of the display device 110 and displays the AR image in which the real scene image and the virtual scene image are superimposed. Specifically, the first display screen 111 may be an electronic display screen such as an LED display screen, or a projection screen; it is only necessary that it can display the AR image. The wrapping member 112 forms an accommodating space for accommodating the first display screen 111; the first display screen 111 is located in the accommodating space and the wrapping member 112 wraps its side wall. Specifically, the side wall of the first display screen 111 refers to the surface between the display surface and the back surface opposite to it, that is, the surface connected to the periphery of the display surface. By wrapping the side wall, the wrapping member 112 fixes the position of the first display screen 111, protecting it without affecting its display effect.
As shown in fig. 1 and 3, in some embodiments of the present application, the wrapping member 112 partially protrudes from the first side (i.e., the front side in fig. 1 and the right side in fig. 3). The wrapping member 112 can shield part of the field of view to separate the display screen from the surrounding environment, reducing the interference of the surroundings on the user and thereby improving immersion. Specifically, the protruding portion of the wrapping member 112 may be inclined, from the end close to the first display screen 111 to the end far from it, toward the side away from the accommodating space. That is, the protruding portion of the wrapping member 112 opens outward, which reduces its occlusion of the display screen for a user watching it, enlarges the viewing range of the display screen, increases the sense of depth, and achieves a window effect.
In some embodiments of the present application, the first display screen may be a non-transparent screen. In this state, the AR image in which the real scene image and the virtual scene image displayed on the first display screen are superimposed is in an opaque state, that is, the portions of the real scene image other than the portions blocked by the virtual scene image are both displayed in an opaque state, and the portions of the virtual scene image other than the portions blocked by the real scene image are both displayed in an opaque state.
In other embodiments of the present application, the first display screen may also be a transparent screen. In this state, the parts of the real scene image other than those covered by the virtual scene image may be displayed in a transparent or semi-transparent state, while the parts of the virtual scene image not occluded by the real scene are displayed opaquely. That is, the user can see the real scene behind the first display screen through the screen, with the virtual scene image displayed within it; because the virtual scene image is opaque, it blocks the portion of the real scene behind it. This presents the effect of a virtual object standing in the real scene behind the display screen and improves the user's sense of immersion.
In some embodiments of the present application, the display device may further include a second display screen disposed on the second side of the display device. The second display screen can display the same content as the first display screen, namely the AR image in which the real scene image and the virtual scene image are superimposed. When the user faces the second display screen, the camera captures the user, so the user becomes part of the real scene image. In this state, the user watching the second display screen sees himself inside the AR image formed by superimposing the real scene image and the virtual scene image, as if looking into a mirror within a virtual world, which effectively improves the sense of immersion. To fix the second display screen, it can be fixed to the wrapping piece; that is, the first display screen and the second display screen are arranged back to back and the wrapping piece wraps both at the same time, thereby fixing the two screens simultaneously.
As shown in fig. 3, in some embodiments of the present application, the base 120 is fixedly connected to the wrapping member 112, so that the wrapping member 112 is supported by the base 120. Because the wrapping member 112 wraps the first display screen and the second display screen, both screens are supported by the base 120 without being directly connected to it, and so are not damaged. One surface of the base 120 is attached to the side of the wrapping member 112 facing away from the accommodating space. In this state, the contact area between the base 120 and the wrapping member 112 is large enough that the force between them is distributed uniformly, avoiding damage from concentrated stress on the base 120 or the wrapping member 112.
As shown in fig. 3, in some embodiments of the present application, the base 120 has a triangular prism structure. In particular, one face of the triangular prism structure is fitted to the side of the wrapping member 112 facing away from the accommodating space, and another face is attached to a placement platform on which the base 120 rests. The area of the face fitted to the placement platform can be adjusted according to actual conditions. Specifically, to ensure that the base 120 provides sufficiently stable support for the display device 110, this area may be larger than the area of the projection of the display device 110 onto the placement platform.
As shown in fig. 2, in some embodiments of the present application, the display apparatus 100 may further include a speaker 140, which may be disposed on the display device 110 or on the base 120. Specifically, the speaker 140 may be disposed on the display device 110, with one speaker on each side to achieve a stereo effect. The speaker 140 plays the audio data corresponding to the AR image, which may be, for example, an animal call or a song. For example, when the AR image is a penguin, the speaker 140 may play the call of a penguin, a voice imitating a penguin speaking, or a voice imitating a penguin singing.
As shown in fig. 2, in some embodiments of the present application, the display apparatus 100 may further include heat dissipation holes 113, which may be disposed at the bottom of the display device 110. Structures such as the processor and the circuit board inside the display device 110 generate heat in the working state; the resulting temperature rise can affect their operation. Arranging these structures at the bottom of the display device 110 helps lower the center of gravity and ensures structural stability, so placing the heat dissipation holes 113 at the bottom puts them close to the processor and circuit board, allowing heat to be dissipated quickly.
The above description is only a preferred embodiment of the present application and is not intended to limit the present application, and various modifications and changes may be made by those skilled in the art. Any modification, equivalent replacement, improvement and the like made within the spirit and principle of the present application shall be included in the protection scope of the present application.

Claims (13)

1. A display device, comprising:
the camera is used for acquiring a real scene image of a real scene;
the display device is connected to the camera and used for displaying the AR image formed by overlapping the real scene image and the virtual scene image;
and the base is fixedly connected with the display device so as to support the display device.
2. The display apparatus of claim 1, wherein the display device has a first side and a second side, wherein the first side is a display surface and the second side has the camera disposed thereon.
3. The display device of claim 2, further comprising:
the processor is arranged in the base or the display device, connected to the camera and used for receiving a real scene image acquired by the camera, constructing a three-dimensional virtual scene model based on the real scene image and rendering the virtual scene image based on the three-dimensional virtual scene model.
4. The display device of claim 3, wherein the camera comprises an RGB camera to capture RGB images of a real scene;
the processor is used for receiving the RGB images acquired by the RGB camera, identifying position information of a real scene based on the RGB images, and determining the three-dimensional virtual scene model matched with the position information.
5. The display device of claim 4, wherein the camera further comprises at least two grayscale cameras for respectively capturing grayscale images of a real scene;
the processor is used for receiving the gray images collected by the at least two gray cameras and determining the depth information of a real scene based on the at least two gray images.
6. The display device according to claim 2, wherein the display apparatus comprises:
a first display screen disposed on the first side face of the display apparatus and configured to display the AR image formed by overlaying the real scene image with the virtual scene image;
a wrapping member defining an accommodating space that accommodates the first display screen, the wrapping member wrapping the side walls of the first display screen.
7. The display device according to claim 6, wherein the wrapping member partially protrudes from the first side face, and the protruding portion of the wrapping member is inclined away from the accommodating space from its end close to the first display screen toward its end far from the first display screen.
8. The display device of claim 6, wherein the first display screen is a non-transparent screen or a transparent screen.
9. The display device of claim 6, wherein the base is fixedly connected to the wrapping member, and a surface of the base abuts the side of the wrapping member facing away from the accommodating space.
10. The display device according to claim 6, wherein the base has a triangular prism structure.
11. The display device according to claim 2, wherein the display apparatus comprises:
a first display screen disposed on the first side face of the display apparatus and configured to display the AR image formed by overlaying the real scene image with the virtual scene image;
a second display screen disposed on the second side face of the display apparatus.
12. The display device according to any one of claims 1 to 11, further comprising:
a speaker disposed on the display apparatus or the base and configured to play audio data corresponding to the AR image.
13. The display device according to any one of claims 1 to 11, further comprising:
a heat dissipation hole disposed at a bottom of the display device.
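Claims 3 to 5 describe a processing pipeline: the processor identifies the real scene from an RGB image, recovers depth from at least two grayscale images, and renders a virtual scene image that is overlaid on the real scene image to form the AR image. The sketch below illustrates two of those steps in NumPy. The naive winner-take-all disparity matcher and the alpha-blend compositor are illustrative assumptions only; the claims do not specify any particular matching or compositing algorithm, and all function names here are hypothetical.

```python
import numpy as np

def stereo_disparity(left, right, max_disp=16):
    """Naive winner-take-all disparity between two rectified grayscale
    images: for each left-image pixel, pick the horizontal shift d whose
    per-pixel absolute difference against the right image is smallest."""
    h, w = left.shape
    best_cost = np.full((h, w), np.inf)
    disparity = np.zeros((h, w), dtype=np.float32)
    for d in range(max_disp):
        cost = np.full((h, w), np.inf)
        # Left pixel (y, x) is compared with right pixel (y, x - d).
        cost[:, d:] = np.abs(left[:, d:].astype(np.float64)
                             - right[:, :w - d].astype(np.float64))
        better = cost < best_cost
        best_cost[better] = cost[better]
        disparity[better] = d
    return disparity

def depth_from_disparity(disparity, focal_px, baseline_m):
    """Triangulate depth as f * B / d; zero disparity maps to infinity."""
    return np.where(disparity > 0,
                    focal_px * baseline_m / np.maximum(disparity, 1e-9),
                    np.inf)

def compose_ar(real_rgb, virtual_rgb, alpha):
    """Alpha-blend the rendered virtual scene over the camera image to
    produce the AR frame shown on the first display screen."""
    a = alpha[..., None]  # broadcast per-pixel alpha over RGB channels
    return a * virtual_rgb + (1.0 - a) * real_rgb
```

A real implementation would use block matching or a learned stereo network rather than per-pixel differences, but the geometry is the same: two horizontally offset grayscale cameras yield a disparity map, which the focal length and camera baseline convert to the depth information recited in claim 5.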
CN202021566648.2U 2020-07-31 2020-07-31 Display device Active CN212847452U (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202021566648.2U CN212847452U (en) 2020-07-31 2020-07-31 Display device


Publications (1)

Publication Number Publication Date
CN212847452U true CN212847452U (en) 2021-03-30

Family

ID=75125734

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202021566648.2U Active CN212847452U (en) 2020-07-31 2020-07-31 Display device

Country Status (1)

Country Link
CN (1) CN212847452U (en)

Similar Documents

Publication Publication Date Title
WO2020192458A1 (en) Image processing method and head-mounted display device
US9001192B2 (en) Storage medium having stored therein stereoscopic image display program, stereoscopic image display device, stereoscopic image display system, and stereoscopic image display method
JP5739674B2 (en) Information processing program, information processing apparatus, information processing system, and information processing method
CN109327795B (en) Sound effect processing method and related product
JP2013054548A (en) Peripheral device, information processing system, and connection method of peripheral device
CN108616776B (en) Live broadcast analysis data acquisition method and device
US9049424B2 (en) Recording medium storing display control program for controlling display capable of providing stereoscopic display, display system, display control method, and display
CN112492097A (en) Audio playing method, device, terminal and computer readable storage medium
US11259008B2 (en) Sensor misalignment compensation
CN110119260A (en) A kind of screen display method and terminal
CN111565309B (en) Display device and distortion parameter determination method, device and system thereof, and storage medium
CN115804025A (en) Shutter camera pipeline exposure timestamp error determination
US20170310857A1 (en) Camera system and apparatus thereof
CN212847452U (en) Display device
JP6518645B2 (en) INFORMATION PROCESSING APPARATUS AND IMAGE GENERATION METHOD
KR20180076342A (en) Estimation system, estimation method, and estimation program
CN109714585B (en) Image transmission method and device, display method and device, and storage medium
WO2017122004A1 (en) Detection system
CN108712604B (en) Panoramic shooting method and mobile terminal
CN213600992U (en) Display device
KR20180119281A (en) Mobile terminal and method of controlling the same
CN114093020A (en) Motion capture method, motion capture device, electronic device and storage medium
CN110517188B (en) Method and device for determining aerial view image
CN111918089A (en) Video stream processing method, video stream display method, device and equipment
CN112770149A (en) Video processing method, device, terminal and storage medium

Legal Events

Date Code Title Description
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20220802

Address after: 210032 12th Floor, Zhongjian Global Building, 17 Xinghuo Road, Jiangbei New District, Nanjing City, Jiangsu Province

Patentee after: NANJING MANHOU NETWORK TECHNOLOGY Co.,Ltd.

Address before: Room 1101-1117, 11 / F, No. 58, Beisihuan West Road, Haidian District, Beijing 100080

Patentee before: BEIJING SENSETIME TECHNOLOGY DEVELOPMENT Co.,Ltd.