CN111899347A - Augmented reality space display system and method based on projection - Google Patents


Info

Publication number
CN111899347A
Authority
CN
China
Prior art keywords
augmented reality
pattern
space
depth information
projection
Prior art date
Legal status
Pending
Application number
CN202010674597.3A
Other languages
Chinese (zh)
Inventor
艾佳
苏显渝
韩宇
曾吉勇
张召世
Current Assignee
Sichuan Shenrui Vision Technology Co ltd
Nanchang Virtual Reality Institute Co Ltd
Original Assignee
Sichuan Shenrui Vision Technology Co ltd
Nanchang Virtual Reality Institute Co Ltd
Priority date
Filing date
Publication date
Application filed by Sichuan Shenrui Vision Technology Co ltd, Nanchang Virtual Reality Institute Co Ltd filed Critical Sichuan Shenrui Vision Technology Co ltd
Priority to CN202010674597.3A priority Critical patent/CN111899347A/en
Publication of CN111899347A publication Critical patent/CN111899347A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics
    • G06T19/006Mixed reality

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Processing Or Creating Images (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

The embodiment of the present application discloses a projection-based augmented reality space display system and method. The system comprises: a projection unit comprising at least two projection devices, configured to project a structured light pattern and an augmented reality pattern into a target display space in a time-interleaved manner, the target display space comprising a partition defining the target display space and an exhibit arranged in the target display space; an acquisition unit configured to acquire, in real time, the structured light pattern projected into the target display space by the projection unit and reflected by the partition and/or the exhibit of the target display space; and a processing unit configured to calculate depth information of the partition and/or the exhibit in the target display space from the structured light pattern, obtain the augmented reality pattern from the depth information, and send the augmented reality pattern to the projection unit. The space display system projects augmented reality patterns that enhance the display effect of the target display space.

Description

Augmented reality space display system and method based on projection
Technical Field
The application belongs to the technical field of computer vision, and particularly relates to a projection-based augmented reality space display system and method.
Background
Augmented reality overlays information or images generated by a computer system onto the real-world information presented to a user, thereby enhancing the user's perception of the real world.
Augmented reality is already used in industry, transportation, and medicine, and applying it to the field of space display is one of the current research hot spots. Planar projection suffers keystone (trapezoidal) distortion that varies with projector pose; for spatial projection display onto surfaces with varying depth, the projected content must conform to the surface depth without distortion, which makes such display more challenging.
Disclosure of Invention
In view of the above problems, the present application provides a projection-based augmented reality space display system and method to address them.
In a first aspect, an embodiment of the present application provides a projection-based augmented reality space display system, the system comprising: a projection unit comprising at least two projection devices, configured to project a structured light pattern and an augmented reality pattern into a target display space in a time-interleaved manner, the target display space comprising a partition defining the target display space and an exhibit arranged in the target display space; an acquisition unit configured to acquire, in real time, the structured light pattern projected into the target display space by the projection unit and reflected by the partition and/or the exhibit of the target display space; and a processing unit configured to calculate depth information of the partition and/or the exhibit in the target display space from the structured light pattern, obtain the augmented reality pattern from the depth information, and send the augmented reality pattern to the projection unit.
In a second aspect, an embodiment of the present application provides a projection-based augmented reality space display method, applied to a projection-based augmented reality space display system comprising a projection unit, an acquisition unit, and a processing unit. The method comprises: the projection unit emits a structured light pattern into a target display space, the target display space comprising a partition defining the target display space and an exhibit arranged in the target display space; the acquisition unit acquires, in real time, the structured light pattern reflected by the partition and/or the exhibit in the target display space; the processing unit calculates depth information of the partition and/or the exhibit of the target display space based on the structured light pattern and obtains an augmented reality pattern from the depth information; and the projection unit projects the structured light pattern and the augmented reality pattern into the target display space in a time-interleaved manner.
The embodiments of the present application provide a projection-based augmented reality space display system and method. The projection unit comprises at least two projection devices and projects structured light patterns and augmented reality patterns into a target display space in a time-interleaved manner; the target display space comprises a partition defining the target display space and an exhibit arranged within it. The acquisition unit acquires, in real time, the structured light pattern projected into the target display space by the projection unit and reflected by the partition and/or the exhibit. The processing unit calculates depth information of the partition and/or the exhibit from the structured light pattern, obtains the augmented reality pattern from the depth information, and sends it to the projection unit. Using multiple projection devices to project the structured light pattern and the augmented reality pattern into the target display space enlarges the projection field of view and enhances the display effect of the target display space, so that a user can observe the target display space and the exhibits in it more intuitively.
Drawings
To illustrate the technical solutions in the embodiments of the present application more clearly, the drawings required for describing the embodiments are briefly introduced below. The drawings described below show only some embodiments of the present application; those skilled in the art can obtain other drawings from them without creative effort.
Fig. 1 is a schematic structural diagram illustrating a projection-based augmented reality space display system according to an embodiment of the present application;
fig. 2 is a block diagram illustrating a projection-based augmented reality space presentation system according to an embodiment of the present application;
FIG. 3 is a timing diagram of the units of the projection-based augmented reality space presentation system according to an embodiment of the present application;
FIG. 4 is a block diagram of a projection-based augmented reality space presentation system according to another embodiment of the present application;
FIG. 5 is a scene diagram illustrating a projection-based augmented reality space presentation according to an embodiment of the present application;
fig. 6 is a flowchart of a projection-based augmented reality space display method according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
Augmented reality overlays information or images generated by a computer system onto the real-world information presented to a user, thereby enhancing the user's perception of the real world. Augmented reality has found research applications in many fields, including industry, transportation, medicine, and education.
A related augmented reality projection display device consists of a single projector and a single depth camera: using three-dimensional measurement, a structured light pattern is projected onto a free-form surface to obtain a depth image, thereby realizing augmented reality projection display. The limitation of such a device is that the projection field of view of a single projector is small, and when the projection onto the target display space suffers occlusion, shadows, and similar problems, the augmented reality display effect is degraded.
The inventors therefore propose a projection-based augmented reality space display system and method. A projection unit emits a structured light pattern onto the partition and/or the exhibit in a target display space; an acquisition unit acquires, in real time, the structured light pattern reflected by the partition and/or the exhibit; a processing unit calculates depth information of the partition and/or the exhibit based on the structured light pattern and then obtains an augmented reality pattern from the depth information; finally, the structured light pattern and the augmented reality pattern are projected into the target display space by at least two projection devices. This enlarges the projection field of view and enhances the display effect of the target display space, so that a user can observe the target display space and its exhibits more intuitively.
Fig. 1 shows a projection-based augmented reality space display system that acquires the structured light pattern reflected by the partition and/or the exhibit of a target display space and generates an augmented reality pattern for spatial display. The projection unit may project structured light patterns that differ in intensity and/or shape, and may also project the augmented reality pattern. In addition, it should be noted that in the embodiments of the present application, "the partition and/or the exhibit" may mean: the partition, the exhibit, the partition or the exhibit, or both the partition and the exhibit.
Optionally, the projection unit may be a visible light projection device. Alternatively, the projection unit may be an infrared laser module whose light source is a VCSEL array laser that projects an infrared pattern.
The embodiments of the present application do not limit the specific type of structured light pattern projected by the projection unit. The structured light pattern may include point structured light, line structured light, and area structured light, such as grating fringes, speckle, and the like. When the projection unit projects a structured light pattern, the pattern is modulated by the surface height of the partition and/or the exhibit in the target display space; the modulated structured light is collected by the acquisition unit and transmitted to the processing unit for analysis and calculation, from which the three-dimensional surface shape of the partition and/or the exhibit in the target display space can be obtained.
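As an illustrative sketch (not part of the patent), the grating-fringe variant of such a structured light pattern can be generated with NumPy; the resolution, fringe frequency, and number of phase steps below are arbitrary assumptions:

```python
import numpy as np

def fringe_patterns(width=1024, height=768, freq=1 / 32, steps=4):
    """Generate `steps` phase-shifted sinusoidal fringe images (8-bit grey).

    freq is the fringe frequency in cycles per pixel along x; each image
    shifts the fringes by an equal fraction of one full cycle.
    """
    x = np.arange(width)
    patterns = []
    for n in range(steps):
        shift = 2 * np.pi * n / steps  # equal phase steps over one cycle
        row = 0.5 + 0.5 * np.cos(2 * np.pi * freq * x + shift)
        patterns.append(np.tile((255 * row).astype(np.uint8), (height, 1)))
    return patterns

pats = fringe_patterns()
```

A real system would hand these frames to the projector and recover the surface-induced phase modulation from the captured images.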
The embodiments of the present application do not limit the specific light source of the projection unit; the structured light pattern projected by the projection unit is collected by a corresponding acquisition unit. For example, a structured light pattern projected by an infrared projection unit is collected by an infrared image acquisition device, and a pattern projected by a visible light projection unit is collected by a visible light image acquisition device.
The acquisition unit, which keeps a certain baseline distance from the projection unit, may be an image sensor that records the wavelength of the pattern emitted by the projection unit and is used to capture images of the projected structured light pattern; it may comprise a photosensitive element, an optical filter, a lens, and the like. The acquisition unit is an image sensor matching the type of light source: if the light source of the projection unit is infrared, the acquisition unit is an infrared image acquisition device; if the light source is visible light, the acquisition unit is a visible light image acquisition device. The embodiments of the present application do not limit the positional relationship between the acquisition unit and the projection unit; for example, the projection unit may be placed horizontally and project horizontally, with the acquisition unit arranged at the same height.
The processing unit is connected to the acquisition unit and processes the structured light pattern reflected by the target display space and acquired by the acquisition unit: it calculates depth information of the partition and/or the exhibit of the target display space from the acquired structured light pattern, generates an augmented reality pattern from the depth information, and transmits the augmented reality pattern to the projection unit. The processing unit may be implemented on an ASIC, FPGA, or DSP platform; besides processing the acquired structured light pattern, it may also control the projection of the projection unit and the pattern acquisition of the acquisition unit. Optionally, the processing unit may include a controller for control, for example via synchronous and asynchronous timing circuits, and a depth processor for carrying out depth information acquisition.
The units in the system can be independent from each other or integrated together. For example, the system may be an electronic device such as a mobile phone, a tablet computer, a notebook computer, etc. which integrates a projection unit, an image acquisition unit, a storage unit, and a processing unit.
Embodiments of the present application will be described in detail below with reference to the accompanying drawings.
Referring to fig. 2, an embodiment of the present application provides a projection-based augmented reality space display system 100, where the system 100 includes:
The projection unit 110 includes at least two projection devices and projects the structured light pattern and the augmented reality pattern into a target display space in a time-interleaved manner, where the target display space comprises a partition defining the target display space and an exhibit disposed in the target display space.
It should be noted that the projection unit 110 may emit invisible light toward the partition and/or the exhibit of the target display space, for example a laser beam from a laser light source; for the purpose of augmented reality projection, however, the augmented reality pattern emitted toward the partition and/or the exhibit is visible light.
In one implementation, when the multiple projection devices in the projection unit 110 project simultaneously, the partition and/or the exhibit in the target display space should be covered without shadows or occlusion, the projection areas overlap one another, and each overlapping area is covered by more than two and fewer than four projectors; the projectors may be Digital Light Processing (DLP) projectors.
Optionally, within one period, the total time for which the projection unit 110 projects the structured light pattern is less than the time for which it projects the augmented reality pattern.
Optionally, for a static target display space and its partition and/or exhibit, the projection unit 110 only needs to scan-project the structured light pattern once. If a partition or exhibit in the target display space is moved, the acquisition unit 120 may be used to detect whether an object has moved; the moved target display space and its partition and/or exhibit are then rescanned, after which the projection unit 110 can project the correspondingly regenerated augmented reality pattern.
As shown in fig. 3, which is a timing diagram of the units in the projection-based augmented reality space display system 100 according to an embodiment of the present application, the multiple projectors in the projection unit 110 project structured light patterns onto the partition and/or the exhibit of the target display space in a time-interleaved manner, and the timing of the acquisition unit 120 is fully matched to that of the projection unit 110 so as to capture the structured light pattern reflected by the partition and/or the exhibit. The content projected by the projectors can be encoded arbitrarily, as follows: the projector emits the structured light pattern and the augmented reality pattern onto the partition and/or the exhibit in a time-interleaved manner. At one moment, a structured light pattern is encoded and projected onto the partition and/or the exhibit; after the acquisition unit 120 captures this pattern, depth information can be obtained, and the partition and/or the exhibit can be identified either from the depth information alone or from the structured light pattern combined with the depth information. At the next moment, an augmented reality pattern is encoded and projected onto the partition and/or the exhibit. The cycle then repeats in this manner.
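The interleaving described above can be sketched as a frame schedule. This is a minimal illustration, not the patent's encoding scheme; the slot counts are assumptions chosen only to satisfy the stated rule that structured-light time per period is shorter than augmented-reality time:

```python
def interleaved_schedule(cycles, structured_slots=1, ar_slots=3):
    """Build the frame order for `cycles` projection periods: each period
    encodes the structured light pattern first, then the augmented reality
    pattern (structured_slots < ar_slots keeps the measurement frames
    brief relative to the visible display frames)."""
    frames = []
    for _ in range(cycles):
        frames += ["structured"] * structured_slots + ["ar"] * ar_slots
    return frames

schedule = interleaved_schedule(cycles=2)
```

The acquisition unit would trigger its exposure only on the "structured" slots, matching the projector timing.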
The acquisition unit 120 acquires, in real time, the structured light pattern projected into the target display space by the projection unit 110 and reflected by the partition and/or the exhibit of the target display space.
By way of example, the acquisition unit 120 may be a charge-coupled device (CCD) camera, which can acquire in real time the structured light pattern projected by a Digital Light Processing (DLP) projector onto the partition and/or the exhibit of the target display space and reflected back.
The processing unit 130 is configured to calculate depth information of the partition and/or the exhibit in the target exhibition space according to the structured light pattern, obtain the augmented reality pattern according to the depth information, and send the augmented reality pattern to the projection unit 110.
By way of example, the processing unit 130 establishes a mapping relationship between the pixels of the projection unit 110 and the pixels of the acquisition unit 120 using a reverse fringe projection technique. Optionally, the processing unit 130 may also use a planar calibration technique based on fringe projection to establish a phase-depth mapping table over the measurement range and then obtain the depth information of the partition and/or the exhibit in the target display space from this table. The pixel mapping between the projection unit 110 and the acquisition unit 120 established by reverse fringe projection is, in essence, the modulation by the partition and/or the exhibit of the light transmitted between the pixel arrays of the projection unit 110 (a DLP projector) and the acquisition unit 120 (a CCD camera). The process can be described as:
I_CCD(i, j) = f( I_pro(l, m) )
where l and m are pixel coordinates of the projection unit 110 (DLP projector) and I_pro is the light intensity it projects; i and j are pixel coordinates of the acquisition unit 120 (CCD camera) and I_CCD is its response intensity; f denotes the transfer function between the two arrays, which establishes the mapping between the pixels of the acquisition unit 120 and those of the projection unit 110.
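For a purely geometric transfer function f, the relation above reduces to looking up, for each camera pixel, the projector pixel that illuminates it. The sketch below is an illustrative simplification (the correspondence map, array shapes, and function name are assumptions, not the patent's implementation):

```python
import numpy as np

def apply_transfer(i_pro, cam_to_proj):
    """Evaluate I_CCD(i, j) = f(I_pro(l, m)) when f is a pixel
    correspondence: cam_to_proj[i, j] = (l, m) is the projector pixel
    observed at camera pixel (i, j), as recovered by reverse fringe
    projection."""
    h, w = cam_to_proj.shape[:2]
    i_ccd = np.zeros((h, w), dtype=i_pro.dtype)
    for i in range(h):
        for j in range(w):
            l, m = cam_to_proj[i, j]
            i_ccd[i, j] = i_pro[l, m]
    return i_ccd

# Identity correspondence: camera pixel (i, j) sees projector pixel (i, j).
ident = np.dstack(np.indices((2, 2)))
out = apply_transfer(np.array([[10, 20], [30, 40]]), ident)
```

In practice f also folds in surface reflectance and defocus, so the mapping is estimated per scene rather than assumed.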
In one implementation, a depth acquisition technique such as a high-precision phase-shift technique based on triangulation, binocular stereo vision, or a TOF depth camera may be used. According to the structural parameters of the system, a mapping between height and phase is stored in nonvolatile memory. The projectors in the projection unit 110 each emit structured light patterns onto the partition and/or the exhibit of the target display space, and the acquisition unit 120 collects the structured light patterns reflected by them. Using a three-frequency phase unwrapping algorithm together with the established phase-height mapping (which may be understood as a phase-depth mapping), the processing unit 130 calculates the depth information of the partition and/or the exhibit in the target display space according to the following formula:
h = (l * Δφ) / (Δφ + 2π * f * d)

where h is the depth of the partition and/or the exhibit of the target display space, l is the distance from the optical axis center of the acquisition unit 120 to the partition and/or the exhibit, Δφ is the phase difference, d is the distance between the projection unit 110 and the acquisition unit 120, and f is the fundamental frequency of the structured light.
The processing unit 130 stores, in a database, the depth information of the partition and/or the exhibit of the target display space calculated from the structured light pattern, and may store in the database a plurality of augmented reality patterns of different styles associated with the stored depth information.
In related applications, the processing unit 130 may query the database for the obtained depth information of the display space, retrieve the corresponding augmented reality pattern or the model type of the furniture corresponding to that depth information, and send the corresponding augmented reality pattern to the projection unit 110 for projection. For example, when the system 100 is deployed in a home-furnishing store, the depth information of all furniture in the store may be stored in the database in advance, with several augmented reality patterns matched to each item of depth information. After the system 100 obtains the depth information of a piece of furniture, it may project a matching augmented reality pattern found by searching the database, or it may first determine the type and model of the furniture from its depth information and then project a pattern matched to that type and model. Because the augmented reality pattern is encoded from depth information, and the depth information is defined in a fixed reference frame established during calibration, the pattern does not change as long as the position of the furniture in that reference frame is fixed, and the system 100 can project the pattern corresponding to the furniture's depth information by searching the database; if the furniture is moved, the augmented reality pattern changes.
Therefore, provided the positions of the furniture, the projection unit, and the acquisition unit remain fixed relative to one another, once the system 100 has acquired the depth information of the furniture currently in the store, it can project the matching augmented reality pattern by searching the database.
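The database lookup described above can be sketched as a nearest-match query. Reducing a depth map to a single scalar signature (for example, its mean depth) and the pattern names are hypothetical simplifications for illustration only:

```python
def match_pattern(depth_signature, database):
    """Pick the augmented reality pattern whose stored depth signature is
    nearest to the measured one; `database` maps a scalar signature to a
    pattern name."""
    nearest = min(database, key=lambda key: abs(key - depth_signature))
    return database[nearest]

# Hypothetical pre-registered furniture signatures and their patterns.
db = {0.52: "oak-grain texture", 1.10: "marble pseudo-color"}
```

A production system would index full depth maps (or model identifiers derived from them) rather than one scalar, but the control flow is the same.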
The embodiment of the present application provides a projection-based augmented reality space display system. The acquisition unit acquires, in real time, the structured light pattern projected by the projection unit onto the partition and/or the exhibit of the target display space and reflected by them; the processing unit calculates the depth information of the partition and/or the exhibit from the structured light pattern, generates the augmented reality pattern from the depth information, and sends it to the projection unit; and the projection unit, which comprises at least two projection devices, projects the structured light pattern and the augmented reality pattern onto the partition and/or the exhibit in a time-interleaved manner. Using multiple projection devices enlarges the projection field of view and enhances the display effect of the target display space, so that the user can observe the target display space and the exhibits in it more intuitively.
Further, as shown in fig. 4, the projection-based augmented reality space display system 100 further includes an instruction unit 140 configured to project or direct an optical marker to any position on the partition and/or the exhibit of the target display space.
In one implementation, to provide human-computer interaction and make the system more engaging, an instruction unit 140 may be added to the projection-based augmented reality space display system 100. The instruction unit 140 may include an internal instruction unit 141 and an external instruction unit 142: the internal instruction unit 141 accepts instructions entered through a mouse, keyboard, or other device, while the external instruction unit 142 issues instructions carrying an optical marker, such as a pattern, a voice command, or a gesture. The two parts may issue instructions cooperatively or independently.
The projection-based augmented reality space display system 100 of the embodiment of the present application may project, onto the partition and/or the exhibit of the target display space, an augmented reality pattern corresponding to its depth information, that is, a pattern corresponding to the depth information of the partition or of the exhibit, where the augmented reality pattern is a texture pattern and/or a pseudo-color pattern. It should be noted that in the embodiments of the present application, "texture pattern and/or pseudo-color pattern" may mean: the texture pattern, the pseudo-color pattern, the texture pattern or the pseudo-color pattern, or both.
In one implementation, the processing unit 130 is specifically configured to calculate the depth information of the partition and/or the exhibit in the target display space from the structured light pattern and to obtain an augmented reality pattern corresponding to the depth information according to a user profile.
It will be appreciated that the user profile may consist of information entered manually by the user, such as age, preferences, gender, and preferred style, or of user information actively obtained by the system from a server.

The processing unit 130 may obtain the augmented reality pattern corresponding to the calculated depth information based on the received user profile.
Alternatively, the processing unit 130 is specifically configured to calculate the depth information of the partition and/or the exhibit in the target display space from the structured light pattern, select from a database several styles of augmented reality patterns associated with the depth information, and send the pattern selected by the user to the projection unit.
Specifically, the database stores several decoration styles in advance, which may include Japanese, European, Chinese, Mediterranean, and other styles. The processing unit 130 may obtain the augmented reality pattern corresponding to the calculated depth information from the database and project augmented reality patterns of different decoration styles according to the user's selection.
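The per-style selection can be sketched as a two-level catalog keyed first by the depth-information record and then by decoration style. The catalog layout, keys, and pattern names are illustrative assumptions:

```python
def patterns_for_item(item_key, catalog):
    """Return the per-style augmented reality patterns stored for one item
    of depth information, so the user can choose a decoration style;
    returns an empty dict when the item is not registered."""
    return catalog.get(item_key, {})

# Hypothetical catalog: depth-information key -> {style: pattern}.
catalog = {
    "sofa-depth-01": {"Japanese": "tatami weave", "European": "baroque floral"},
}
choices = patterns_for_item("sofa-depth-01", catalog)
```

The user's chosen entry would then be sent to the projection unit 110, as described above.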
Further, the processing unit 130 is specifically configured to calculate depth information of the partition and/or the display object in the target display space according to the structured light pattern, acquire time information, acquire a corresponding augmented reality pattern according to the current time information and the depth information, and send the augmented reality pattern to the projection unit.
The time information may include current time information directly acquired by the system, or pre-stored time information; for example, it may distinguish day from night, or the four seasons of spring, summer, autumn, and winter.
Optionally, the processing unit 130 is further configured to obtain audio information corresponding to the depth information, and play the audio information.
Specifically, the audio information may be commentary or music corresponding to the projected augmented reality pattern. The processing unit 130 may project augmented reality patterns of different styles according to the user's selection, and play the commentary or music matched to the currently projected pattern. For example, in a building-materials store, the processing unit 130 may play the audio commentary or matching music corresponding to the item the user has selected.
As one mode, the target display space includes a home furnishing store and the furniture in it. The processing unit 130 is configured to obtain the external texture or pattern style of the furniture according to the depth information of the store and/or the furniture, and to obtain augmented reality patterns with different external textures or pattern styles according to the user's selection.
Optionally, a storage area may be partitioned for the processing unit 130 to store the external textures, color tones, or pattern styles of furniture in various home furnishing stores. When the user cannot find furniture with a satisfactory pattern or style in the store, the acquisition unit 120 of the projection-based augmented reality space display system 100 may acquire the structured light pattern reflected by the piece of furniture the user has selected. The processing unit 130 calculates the depth information of that furniture from the structured light pattern, retrieves different pattern styles or external textures from the storage area according to the depth information to generate an augmented reality pattern, and sends the generated pattern to the projection unit 110, which projects it onto the selected furniture.
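A pseudo-color pattern of the kind described above can be derived directly from the calculated depth information. The sketch below is illustrative only: the near-warm/far-cool color ramp and the function name are assumptions, and a real system would additionally warp the pattern through the projector's calibration before display.

```python
import numpy as np

def pseudo_color_from_depth(depth, near, far):
    """Map a depth image (same resolution as the projector) to an RGB
    pseudo-color pattern: near surfaces warm (red), far surfaces cool
    (blue), linearly interpolated in between."""
    t = np.clip((depth - near) / (far - near), 0.0, 1.0)
    # Channel-wise ramp: R fades out with depth, B fades in, G stays 0.
    return np.stack([1.0 - t, np.zeros_like(t), t], axis=-1)
```

The resulting float RGB image (values in [0, 1]) would be sent to the projection unit in place of, or blended with, a stored texture.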
Illustratively, as shown in fig. 5, the furniture store of fig. 5 contains a sofa that is blank: it has no color, no motif, and no pattern. The structured light pattern reflected by the sofa may be acquired by the acquisition unit 120 of the projection-based augmented reality space display system 100. The processing unit 130 calculates the depth information of the sofa from the reflected pattern, selects a suitable motif, pattern, or color from the pre-divided storage area to generate an augmented reality pattern based on the calculated depth information, and sends it to the projection unit 110, whose projector then projects the generated pattern onto the sofa. Further, the projector in fig. 5 may project different augmented reality patterns according to the user profile, and during projection the processing unit 130 may play audio information matched to the projected pattern.
As another mode, the target display space may include a model room and the objects inside it, where both the model room and the objects inside it are white. The processing unit 130 is configured to obtain the shapes, sizes, and positions of the model room and its objects according to their depth information, obtain augmented reality patterns with different textures, hues, or pattern styles according to the user's selection, and project the augmented reality patterns onto the target display space through the projection unit 110.
The projection unit 110 may be connected to a multi-channel display output interface of an external terminal server, which projects preset pictures or video images onto the walls, floor, and furniture of the white model room through the projection unit 110, so that they can take on a wide range of matching effects. For example, when selecting materials for living room decoration, white furniture such as a sofa, a table, and a carpet is placed in the white model room; the projection-based augmented reality space display system 100 can display a selected decoration material on these white items in real time as augmented reality patterns, and reproduce its matching effect with other materials in real time, greatly helping the user choose decoration materials sensibly. The system can also be used at expositions, exhibitions, and similar venues: it is installed once and reused many times, which greatly reduces the decoration cost of the venue, and different design styles, such as a European-style bedroom or a Japanese-style living room, can be reproduced according to different user requirements.
Further, the processing unit 130 may seamlessly stitch the augmented reality patterns projected by the multiple projection devices of the projection unit 110 according to the positions of the walls, floor, and furniture in the white model room, so that the projected pictures join up and match the contours and textures of the walls, floor, and furniture.
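The stitching and fusion of overlapping projections can be illustrated with a one-dimensional cross-fade: each projector's intensity is ramped down across the shared overlap so the combined brightness stays constant. This is a minimal sketch under simplifying assumptions (linear ramps, side-by-side projectors); real systems use geometry-aware, gamma-corrected blending.

```python
import numpy as np

def blend_weights(width, overlap):
    """Per-column intensity weights for two side-by-side projector images
    that share `overlap` columns: the left image fades out and the right
    fades in, so their sum over the overlap region is always 1."""
    ramp = np.linspace(1.0, 0.0, overlap)
    left = np.ones(width)
    right = np.ones(width)
    left[width - overlap:] = ramp   # left projector fades out
    right[:overlap] = ramp[::-1]    # right projector fades in
    return left, right
```

Each projector's frame is multiplied column-wise by its weight vector before display, keeping the seam invisible.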
Further, the projection-based augmented reality space display system 100 may also select, from the divided storage area, the augmented reality pattern corresponding to the current season according to time and seasonal changes combined with the acquired spatial depth information, and project it in the white model room.
Referring to fig. 6, an augmented reality space display method based on projection provided in an embodiment of the present application includes:
Step S210: the projection unit emits a structured light pattern into a target display space, where the target display space includes a partition that bounds the target display space and an exhibit arranged within it.
As an alternative, the projection unit includes at least two projection devices: one projection device may be used to project the structured light pattern and another to project the augmented reality pattern, or both may be used to project both the structured light pattern and the augmented reality pattern.
Further, the structured light pattern emitted by the projection unit into the target display space may be a visible light pattern or an invisible light pattern. The visible light pattern may include binary images, grayscale images, color images, and the like; the invisible light pattern may include infrared laser images and the like.
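As a concrete illustration of a visible structured light pattern, an N-step phase-shifted sinusoidal fringe set, a common choice in fringe projection, can be generated as follows; the resolution, fringe period, and step count here are arbitrary assumptions, not values from the embodiment.

```python
import numpy as np

def fringe_patterns(width, height, period_px, n_steps=4):
    """Generate n_steps grayscale fringe images: horizontal sinusoids of
    the given period (in pixels), phase-shifted by 2*pi/n_steps per step."""
    x = np.arange(width)
    patterns = []
    for k in range(n_steps):
        shift = 2.0 * np.pi * k / n_steps
        row = 0.5 + 0.5 * np.cos(2.0 * np.pi * x / period_px + shift)
        patterns.append(np.tile(row, (height, 1)))  # repeat row down the image
    return patterns  # list of float images with values in [0, 1]
```

Each image would be displayed by the projection unit in turn while the acquisition unit captures the deformed reflection.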
Step S220: the acquisition unit acquires, in real time, the structured light pattern reflected by the partition and/or the exhibit in the target display space.
As one mode, when the acquisition unit detects an instruction for the projection unit to project the structured light pattern onto the partition and/or the exhibit of the target display space, the acquisition unit starts acquiring the reflected structured light pattern in real time.
As another mode, a timer may be set in the acquisition unit, specifying when the acquisition unit should start acquiring the structured light pattern reflected by the partition and/or the exhibit of the target display space; once that time is set, the acquisition unit performs the acquisition according to the timer.
Step S230: the processing unit calculates the depth information of the partition and/or the exhibit of the target display space based on the structured light pattern, and obtains the augmented reality pattern according to the depth information.
As one mode, the processing unit establishes a pixel-to-pixel mapping between the projection unit and the acquisition unit through an inverse fringe projection technique. Optionally, the processing unit may further establish a phase-depth mapping table over the measurement range by a plane calibration technique based on fringe projection, and then obtain the depth information of the partition and/or the exhibit in the target display space from that table.
In the embodiment of the application, the mapping and stitching relationships between the projection unit and the acquisition unit are established based on the inverse fringe projection technique, and the partition and/or the exhibit in the target display space modulate the light as it travels between the projection unit and the pixel array of the acquisition unit. The pixel mapping between the projection unit and the acquisition unit may be established through a phase unwrapping algorithm, or according to intensity, color, gray scale, and the like; the phase unwrapping algorithm may include temporal phase unwrapping, spatio-temporal phase unwrapping, and the like.
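A minimal sketch of the phase side of this pipeline, assuming N-step phase-shifted sinusoidal fringes and a pre-calibrated phase-depth table: the function names and the linear-interpolation lookup are illustrative, and a real system would unwrap the phase (temporally or spatio-temporally) before the lookup.

```python
import numpy as np

def wrapped_phase(images):
    """Recover the wrapped phase per pixel from N phase-shifted fringe
    images I_k = A + B*cos(phi + 2*pi*k/N), N >= 3 (standard N-step
    phase-shifting formula)."""
    n = len(images)
    num = sum(img * np.sin(2.0 * np.pi * k / n) for k, img in enumerate(images))
    den = sum(img * np.cos(2.0 * np.pi * k / n) for k, img in enumerate(images))
    return np.arctan2(-num, den)  # wrapped to (-pi, pi]

def phase_to_depth(phase, lut_phase, lut_depth):
    """Convert (unwrapped) phase to depth with a calibrated phase-depth
    table, linearly interpolating between calibration planes."""
    return np.interp(phase, lut_phase, lut_depth)
```

The phase-depth table would come from the plane calibration step: the system records the phase measured at a series of reference planes of known depth, giving the `lut_phase`/`lut_depth` pairs.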
Alternatively, the acquisition unit may be a different type of camera. The depth information of the target display space can be acquired by adding a second camera and using binocular stereo vision, or acquired directly with a TOF depth camera; the pixel mapping between the depth-acquiring camera and the projection unit can then be obtained.
Optionally, the measurement field of view for the depth information of the partition and/or the exhibit is directly tied to the field of view of the acquisition unit, and the field of view of the projection unit should approximately match the measurement field of view. Two cases arise. If the partition and/or the exhibit are small, a single depth camera can cover the measurement range, but sharp changes in their depth may cause occlusion during measurement, so several projectors are needed to cover the scene from multiple directions and eliminate occlusion and shadow. The depth camera and the projectors then form several three-dimensional measurement devices, a pixel mapping must be established between the depth camera and each projector before the projections can be stitched and fused, and, to eliminate occlusion and shadow, the overlap between projections is necessarily large. If instead the measurement field of view of a large scene is insufficient, several depth cameras and projectors are needed to cover it; the projections again require stitching and fusion, but here the overlap can be kept as small as possible to reduce the stitching and fusion workload.
Step S240: the projection unit projects the structured light pattern and the augmented reality pattern onto the partition and/or the exhibit of the target display space, interleaved in time sequence.
As one mode, the structured light pattern projected by the projection unit matches the augmented reality pattern: if the structured light pattern is visible light, the augmented reality pattern is also visible light; if the structured light pattern is invisible light, corresponding additional equipment is added to display the augmented reality pattern.
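The time-sequenced interleaving of step S240 can be sketched as a frame scheduler that alternates measurement frames and display frames. This is a Python sketch only; the even/odd slot assignment and the frame-list representation are assumptions, not part of the embodiment.

```python
from itertools import cycle

def interleaved_frames(structured, augmented, n_frames):
    """Yield (kind, pattern) pairs alternating in time: even slots carry
    structured light frames (for depth measurement), odd slots carry
    augmented reality frames (for display), cycling through each list."""
    s, a = cycle(structured), cycle(augmented)
    for i in range(n_frames):
        if i % 2 == 0:
            yield ("structured", next(s))
        else:
            yield ("ar", next(a))
```

At a high enough frame rate, the viewer perceives only the augmented reality content while the system keeps measuring depth in the alternate slots.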
According to the projection-based augmented reality space display method, the projection unit emits the structured light pattern onto the partition and/or the exhibit in the target display space; the acquisition unit acquires the reflected structured light pattern in real time; the processing unit calculates the depth information of the partition and/or the exhibit based on the structured light pattern and obtains the augmented reality pattern from the depth information; and the projection unit projects the structured light pattern and the augmented reality pattern onto the partition and/or the exhibit, interleaved in time sequence. In this way, multiple projection devices project the structured light pattern and the augmented reality pattern onto the partition and/or the exhibit, which enlarges the projection field of view, enhances the display effect of the target display space, and lets the user observe the space and its exhibits more intuitively.
Finally, it should be noted that the above embodiments are only intended to illustrate the technical solutions of the present application, not to limit them. Although the present application has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art will understand that the technical solutions described in the foregoing embodiments may still be modified, or some of their technical features equivalently replaced, and that such modifications and substitutions do not depart from the spirit and scope of the corresponding technical solutions of the embodiments of the present application.

Claims (10)

1. A projection-based augmented reality space presentation system, the system comprising:
the projection unit comprises at least two projection devices and is used for projecting a structured light pattern and an augmented reality pattern to a target display space in a staggered manner in time sequence, wherein the target display space comprises a partition for limiting the target display space and a display object arranged in the target display space;
the acquisition unit is used for acquiring the structured light pattern which is projected to the target display space by the projection unit and reflected by the partition and/or the display object of the target display space in real time;
and the processing unit is used for calculating the depth information of the partition and/or the exhibit in the target display space according to the structured light pattern, obtaining the augmented reality pattern according to the depth information, and sending the augmented reality pattern to the projection unit.
2. The system according to claim 1, wherein the processing unit is specifically configured to calculate depth information of the partition and/or the exhibit of the target display space according to the structured light pattern, and obtain an augmented reality pattern corresponding to the depth information according to a user profile.
3. The system according to claim 1, wherein the processing unit is specifically configured to calculate depth information of the partition and/or the exhibit of the target display space according to the structured light pattern, select, from a database, augmented reality patterns in multiple styles corresponding to the depth information, and send the augmented reality pattern selected by the user to the projection unit.
4. The system according to claim 1, wherein the processing unit is specifically configured to calculate depth information of the partition and/or the exhibit in the target exhibition space according to the structured light pattern, obtain time information, obtain a corresponding augmented reality pattern according to the current time information and the depth information, and send the augmented reality pattern to the projection unit.
5. The system of any of claims 1 to 4, wherein the processing unit is further configured to: and acquiring audio information corresponding to the depth information, and playing the audio information.
6. The system of any of claims 2 to 4, wherein the augmented reality pattern corresponding to the depth information comprises: an augmented reality pattern corresponding to the depth information of the partition, or an augmented reality pattern corresponding to the depth information of the exhibit.
7. The system according to any one of claims 1 to 4, wherein the augmented reality pattern is a texture pattern and/or a pseudo-color pattern.
8. The system according to claim 1, wherein the partition of the target display space comprises a wall of a home store, the display of the target display space comprises furniture in the home store, the processing unit is specifically configured to calculate depth information of the home store and/or the furniture according to the structured light pattern, obtain an augmented reality pattern matched with the depth information according to the depth information of the home store and/or the furniture, and send the augmented reality pattern to the projection unit.
9. The system according to claim 1, wherein the partition of the target display space comprises a model room, the exhibit comprises an object in the model room, and the model room and the object in it are white; the processing unit is specifically configured to calculate depth information of the model room and the object in it according to the structured light pattern, obtain an augmented reality pattern matched with that depth information, and send the augmented reality pattern to the projection unit.
10. A projection-based augmented reality space display method is applied to a projection-based augmented reality space display system, the system comprises a projection unit, an acquisition unit and a processing unit, and the method comprises the following steps:
the projection unit emits a structured light pattern to a target display space, and the target display space comprises a partition for limiting the target display space and a display object arranged in the target display space;
the acquisition unit acquires the structured light pattern reflected by the partition and/or the exhibit in the target display space in real time;
the processing unit calculates the depth information of the partition and/or the exhibit of the target display space based on the structured light pattern, and obtains the augmented reality pattern according to the depth information;
the projection unit projects the structured light pattern and the augmented reality pattern to the target display space in an interleaved manner in a time sequence.
CN202010674597.3A 2020-07-14 2020-07-14 Augmented reality space display system and method based on projection Pending CN111899347A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010674597.3A CN111899347A (en) 2020-07-14 2020-07-14 Augmented reality space display system and method based on projection


Publications (1)

Publication Number Publication Date
CN111899347A true CN111899347A (en) 2020-11-06

Family

ID=73192990

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010674597.3A Pending CN111899347A (en) 2020-07-14 2020-07-14 Augmented reality space display system and method based on projection

Country Status (1)

Country Link
CN (1) CN111899347A (en)

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102306088A (en) * 2011-06-23 2012-01-04 北京北方卓立科技有限公司 Solid projection false or true registration device and method
CN105184800A (en) * 2015-09-28 2015-12-23 神画科技(深圳)有限公司 Automatic three-dimensional mapping projection system and method
CN105182662A (en) * 2015-09-28 2015-12-23 神画科技(深圳)有限公司 Projection method and system with augmented reality effect
CN106355479A (en) * 2016-09-22 2017-01-25 京东方科技集团股份有限公司 Virtual fitting method, virtual fitting glasses and virtual fitting system
CN106530343A (en) * 2016-10-18 2017-03-22 深圳奥比中光科技有限公司 Projection device and projection method based on target depth image
CN107067428A (en) * 2017-03-10 2017-08-18 深圳奥比中光科技有限公司 Augmented reality projection arrangement and method
CN109963138A (en) * 2019-02-15 2019-07-02 深圳奥比中光科技有限公司 A kind of depth camera and image acquiring method
CN110286773A (en) * 2019-07-01 2019-09-27 腾讯科技(深圳)有限公司 Information providing method, device, equipment and storage medium based on augmented reality
CN111047947A (en) * 2019-12-10 2020-04-21 塔普翊海(上海)智能科技有限公司 Writing guider based on AR technology and writing guiding method


Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
JUN CHEN ET AL.: "Simultaneous projection mapping using high-frame-rate depth vision", 《2014 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION (ICRA)》, pages 4507 - 4510 *
侯颖: "基于投影的三维打印模型纹理着色", 《中国优秀硕士学位论文全文数据库 信息科技辑》, no. 6, pages 12 - 27 *
纪显俐: "基于增强现实技术的智能沙盘教学演示系统", 《中国优秀硕士学位论文全文数据库 社会科学Ⅱ辑》, no. 2, pages 28 - 53 *


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination