CN109885172B - Object interaction display method and system based on Augmented Reality (AR) - Google Patents

Object interaction display method and system based on Augmented Reality (AR)

Info

Publication number
CN109885172B
CN109885172B (application CN201910143803.5A)
Authority
CN
China
Prior art keywords
display
component
auxiliary display
feature
current image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910143803.5A
Other languages
Chinese (zh)
Other versions
CN109885172A (en)
Inventor
张量
Current Assignee
Suzhou Vocational University
Original Assignee
Suzhou Vocational University
Priority date
Filing date
Publication date
Application filed by Suzhou Vocational University
Priority to CN201910143803.5A
Publication of CN109885172A
Application granted
Publication of CN109885172B
Legal status: Active
Anticipated expiration

Abstract

The application relates to an object interaction display method and system based on Augmented Reality (AR), belonging to the technical field of AR. The method comprises the following steps: a projection component acquires an auxiliary display feature corresponding to the display requirement; the projection component projects the auxiliary display feature onto the displayed object to obtain the projected displayed object; an image acquisition component captures the current image frame of the projected displayed object in real time; an image recognition component identifies whether a target object exists in the current image frame; when a target object exists in the current image frame, an AR display component acquires the related information of the target object and displays it in a target display manner based on the position of the target object in the current image frame. This solves the problem that, in conventional AR interactive display systems, the image recognition component fails to identify the target object at a high rate; it raises the recognition success rate of the image recognition component and thereby improves the display effect of the AR interactive display system.

Description

Object interaction display method and system based on Augmented Reality (AR)
Technical Field
The invention relates to an Augmented Reality (AR)-based object interaction display method and system, and belongs to the technical field of AR.
Background
AR technology augments a user's perception of the real world with information supplied by a computer system: virtual objects, scenes, or system prompts generated by the computer are superimposed onto the real scene, thereby enhancing reality. At present, AR technology can be applied in museums for cultural-relic exhibition.
When displaying through AR technology, an image acquisition component captures the current image of the displayed object; an image recognition component identifies a target object in the current image; and, when the target object is recognized, an AR display component presents the related information of the target object.
However, if the ambient light at the scene of the image acquisition component is too weak or too strong, or the capture angle is poor, the image recognition component may fail to recognize the image captured by the image acquisition component. The target object then goes unrecognized, so the AR display component cannot obtain the related information corresponding to the target object and cannot display that information based on the target object's position.
In addition, when recognizing target objects in the current image, the image recognition component can typically handle only simple scenes, for example scenes containing only simple geometry with either pronounced sharp corners or strongly contrasting colors. If the displayed object in the scene has curved contours or weak color contrast, the recognition success rate of the image recognition component drops; again, the AR display component cannot obtain the related information corresponding to the target object and cannot display it based on the target object's position.
Disclosure of Invention
The invention aims to provide an AR-based object interaction display method and system. To achieve this aim, the invention provides the following technical solutions:
In a first aspect, an AR-based object interaction display method is provided, the method comprising:
acquiring, through a projection component, the auxiliary display feature corresponding to the display requirement, where the auxiliary display feature includes: an object texture and/or an object feature anchor point corresponding to the display requirement; the object texture refers to an image to be projected on the surface of the displayed object; the object feature anchor point refers to a specific inflection point located in the object texture;
projecting the auxiliary display feature onto the displayed object through the projection component at a frequency of 24 frames/second or 12 frames/second to obtain the projected displayed object, where at least 1 frame of auxiliary display features comprising the object texture and/or the object feature anchor point is inserted into every 24 frames of dynamic auxiliary display features; the displayed object comprises an original display feature corresponding to the display requirement;
acquiring the current image frame of the projected displayed object in real time through an image acquisition assembly;
identifying whether a target object exists in the current image frame through an image identification component, wherein the auxiliary display feature is used for the image identification component to identify the target object in combination with the original display feature, and the target object is an object comprising the auxiliary display feature and the original display feature;
when the target object exists in the current image frame, acquiring related information of the target object through an AR display component;
and displaying, through the AR display component, the related information in a target display manner based on the position of the target object in the spatial scene as identified by the image recognition component in the current image frame.
Optionally, the obtaining of the auxiliary display feature corresponding to the display requirement through the projection component includes:
acquiring a current image frame of the displayed object through the image acquisition assembly;
identifying, by the image identification component, a current image frame of the displayed object;
and when the image identification component fails to identify the current image frame of the displayed object, acquiring the auxiliary display characteristics corresponding to the display requirement through the projection component.
Optionally, the auxiliary display features corresponding to the display requirement include at least two groups of auxiliary display features;
when the image recognition component fails to recognize the current image frame of the exhibited object, the auxiliary display feature corresponding to the display requirement is acquired through the projection component, and the method comprises the following steps:
acquiring a failure reason for identifying the failure of the current image frame of the exhibited object;
and determining the auxiliary display features corresponding to the failure reasons from the at least two groups of auxiliary display features corresponding to the display requirement.
Optionally, the method further comprises:
when the display requirement is changed, acquiring updated auxiliary display characteristics corresponding to the updated display requirement through the projection assembly;
and projecting the updated auxiliary display features to the displayed object through the projection component to obtain the projected displayed object, and executing the step of acquiring the current image frame of the projected displayed object in real time through the image acquisition component again.
Optionally, the display requirement includes: an era characteristic of the related information and/or an information identifier.
Optionally, the projection component is disposed above the displayed object.
Optionally, the projecting, by the projection component, the auxiliary display feature to the displayed object includes:
and projecting the auxiliary display features to the displayed object through the projection assembly in a water curtain projection mode.
In a second aspect, an object interactive display system of AR is provided, the system comprising:
the projection component is used for acquiring auxiliary display characteristics corresponding to the display requirement; projecting the auxiliary display features to the exhibited object at the frequency of 24 frames/second or 12 frames/second to obtain the projected exhibited object; the displayed object comprises original display characteristics corresponding to the display requirement; the auxiliary display features include: the object texture and/or the object feature anchor point corresponding to the display requirement; wherein the object texture refers to an image for projecting on the surface of the object to be exhibited; the object feature anchor point refers to a specific inflection point located in the texture of the object; wherein, at least 1 frame of auxiliary display features comprising the object texture and/or the object feature anchor points is inserted into each 24 frames of dynamic auxiliary display features;
the image acquisition component is used for acquiring the current image frame of the projected displayed object in real time;
the image identification component is used for identifying whether a target object exists in the current image frame, the auxiliary display feature is used for the image identification component to identify the target object in combination with the original display feature, and the target object is an object comprising the auxiliary display feature and the original display feature;
the AR display component is used for acquiring related information of the target object when the target object exists in the current image frame; and displaying the related information in a target display mode based on the position of the target object in the spatial scene, which is identified by the image identification component.
The invention has the following beneficial effects: the auxiliary display feature corresponding to the display requirement is acquired through a projection component; the auxiliary display feature is projected onto the displayed object through the projection component to obtain the projected displayed object; the current image frame of the projected displayed object is captured in real time through an image acquisition component; whether a target object exists in the current image frame is identified through an image recognition component; when a target object exists in the current image frame, its related information is acquired through an AR display component and displayed in a target display manner based on the target object's position in the current image frame. This solves the problem of poor display effect caused by the high failure rate of target-object recognition in existing AR interactive display systems: because the projection component reinforces the display features of the displayed object, the current image frame captured by the image acquisition component can satisfy the recognition requirements of the image recognition component, raising the success rate of target-object recognition and thereby improving the display effect of the AR interactive display system.
The foregoing description is only an overview of the technical solutions of the present invention, and in order to make the technical solutions of the present invention more clearly understood and to implement them in accordance with the contents of the description, the following detailed description is given with reference to the preferred embodiments of the present invention and the accompanying drawings.
Drawings
FIG. 1 is a schematic structural diagram of an AR-based object interaction presentation system according to an embodiment of the present application;
FIG. 2 is a flowchart of an AR-based object interaction presentation method according to an embodiment of the present application;
fig. 3 is a schematic diagram of an object texture and an object feature anchor point provided in an embodiment of the present application.
Detailed Description
The following detailed description of embodiments of the present invention is provided in connection with the accompanying drawings and examples. The following examples are intended to illustrate the invention but are not intended to limit the scope of the invention.
Fig. 1 is a schematic structural diagram of an AR-based object interaction display system according to an embodiment of the present application, and as shown in fig. 1, the system at least includes: projection component 110, image acquisition component 120, image recognition component 130, AR presentation component 140, and control component 150.
Alternatively, the projection module 110 may be a home theater type, a portable business type, an educational conference type, a mainstream engineering type, a professional type, a Cathode Ray Tube (CRT) projector, a Liquid Crystal Display (LCD) projector, a Digital Light Processor (DLP) projector, a hologram projector, etc., and the present embodiment does not limit the type of the projection module 110.
Schematically, in this embodiment, the projection component 110 is configured to acquire the auxiliary display feature corresponding to the display requirement and to project it onto the displayed object at a frequency of 24 frames/second or 12 frames/second, yielding the projected displayed object. In one example, the projection component 110 projects the auxiliary display feature by water-curtain projection, so that the feature can be presented on the displayed object in three dimensions. At least 1 frame of auxiliary display features comprising object textures and/or object feature anchor points is inserted into every 24 frames of the dynamic auxiliary display features. In this way, the inserted frame(s) projected by the projection component 110 are imperceptible to the human eye, yet the image acquisition component can still capture them, which improves the interactive display effect.
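The per-24-frame interleaving described above can be sketched as follows. This is an illustrative reconstruction, not code from the patent; the frame representation (plain strings) and the function name are assumptions for the sake of the example.

```python
def build_projection_sequence(content_frames, aux_frame, period=24, aux_per_period=1):
    """Interleave at least `aux_per_period` auxiliary-feature frames
    (object texture / feature anchor points) into every `period`
    output frames, mirroring the 24 fps projection scheme described."""
    sequence = []
    for i, frame in enumerate(content_frames):
        sequence.append(frame)
        # After every (period - aux_per_period) content frames, inject
        # the auxiliary frame so each 24-frame window contains >= 1 copy.
        if (i + 1) % (period - aux_per_period) == 0:
            sequence.extend([aux_frame] * aux_per_period)
    return sequence
```

At 24 frames/second the single injected frame occupies about 42 ms, which is why it is stated to be invisible to a viewer while remaining capturable by the camera.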
And the displayed object comprises the original display characteristics corresponding to the display requirement. The displayed object refers to a displayed object and/or an image and the like placed in the interactive display scene.
Optionally, the projection component 110 obtains an auxiliary display feature corresponding to the display requirement sent by the terminal device. The projection component 110 can be in communication connection with the terminal device in a wired or wireless manner; or, the projection assembly 110 and the terminal device are an integrated machine. The terminal device may be the control component 150; alternatively, it may be a device different from the control assembly 150, such as: may be a mobile phone, a computer, a tablet computer, etc. having a function of controlling the projection assembly 110. The present embodiment does not limit the type of terminal device to which the projection assembly 110 is connected.
Optionally, the projection component 110 is disposed above the displayed object, so that the image acquisition component 120 can capture the projected auxiliary display features from various angles around the displayed object, thereby assisting the image recognition component 130 in recognizing the target object. Of course, in other embodiments the projection component 110 may be placed elsewhere, for example obliquely above the displayed object; the position of the projection component 110 is not limited in this embodiment.
The image acquisition component 120, the image recognition component 130, the AR presentation component 140 and the control component 150 are communicatively coupled by a wired or wireless means. Optionally, the image capturing component 120, the image recognition component 130, the AR presentation component 140, and the control component 150 may be disposed in the same device (at this time, the communication connection may be realized through inter-process communication); of course, the present invention may also be provided in different devices (in this case, communication connection may be realized based on a wireless network or a wired network), and this embodiment is not limited to this.
Image capture component 120 may be a camera assembly, and image capture component 120 is configured to capture a current image frame in real-time as the user moves. In this embodiment, the image capturing component 120 may capture the current image frame of the displayed object in real time, or capture the projected current image frame of the displayed object in real time.
After the image acquisition component 120 acquires the current image frame, the current image frame is sent to the control component 150; the current image frame is then sent by the control component 150 to the image recognition component 130.
The control unit 150 may be implemented in at least one hardware form of a DSP (Digital Signal Processing), an FPGA (Field-Programmable Gate Array), and a PLA (Programmable Logic Array). The control component 150 may also include a main processor and a coprocessor, the main processor is a processor for Processing data in the wake state, and is also called a Central Processing Unit (CPU); a coprocessor is a low power processor for processing data in a standby state. In some embodiments, the control component 150 may be integrated with a GPU (Graphics Processing Unit), which is responsible for rendering and drawing the content required to be displayed by the display screen. In some embodiments, the control component 150 may further include an AI (Artificial Intelligence) processor for processing computational operations related to machine learning.
The image recognition component 130 may be run in a hardware device such as a computer, a mobile phone, a tablet computer, a personal computer, etc., and the image recognition component 130 is configured to recognize the current image frame captured by the image capturing component 120 to recognize whether a target object exists in the current image frame. The auxiliary display features projected by the projection component 110 are used for the image recognition component to recognize the target object in combination with the original display features.
The target object is an object comprising the auxiliary display feature and the original display feature. The target object may be a cultural relic in a museum; or, the target object may also be a site in a garden, and certainly, may also be another object for interactive display, and the present embodiment does not limit the type of the target object.
The image recognition component 130 is also configured to send the recognition result to the control component 150. Alternatively, when the target object exists in the current image frame as a result of the recognition, the control component 150 obtains the relevant information of the target object.
Alternatively, the related information of the target object may be text, image, or video information; this embodiment does not limit the content of the related information. The related information may be two-dimensional, or may be one-dimensional or of higher dimension; this embodiment does not limit the type of the related information.
After the related information of the target image is acquired, the control component 150 is further configured to display the related information in a target display manner through the AR display component 140 based on the position of the target object in the spatial scene.
Optionally, the target display manner differs according to the displayed object. Illustratively, when the displayed object is a damaged (incomplete) object, the related information of the target object includes a restored image of the projected displayed object; the target display manner is then to overlay the restored image on the projected displayed object and reveal it line by line (a restoration effect). When the displayed object is a complete object, the target display manner is to present the related information at the upper-right corner of the projected displayed object. Of course, each kind of displayed object may correspond to other display manners, which this embodiment does not limit. For example, the displayed object may be one with a horizontal plane, such as a desktop, a counter, a stage, or a palm; the target display manner is then to present the related information on that horizontal plane of the projected displayed object.
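The case analysis above amounts to a small dispatch on the state of the displayed object. The following sketch is illustrative only; the dictionary keys and mode names are hypothetical, not identifiers from the patent.

```python
def choose_display_mode(displayed_object):
    """Pick a target display mode from the displayed object's state,
    mirroring the three cases in the description."""
    if displayed_object.get("damaged"):
        # Damaged relic: overlay the restored image, revealed line by line.
        return "overlay_restored_line_by_line"
    if displayed_object.get("has_horizontal_plane"):
        # Desktop / counter / stage / palm: render on the plane itself.
        return "render_on_horizontal_plane"
    # Complete object: show related info at the upper-right corner.
    return "info_at_upper_right"
```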
The AR presentation component 140 may be AR glasses, and the content presented by the AR presentation component 140 may be two-dimensional; alternatively, the display device may be three-dimensional, or may be more-dimensional, and this embodiment is not limited to this.
Of course, the AR device may also include other components, such as: AR handles, sensors, positioning components, etc.
In this embodiment, the projection component 110 projects the auxiliary display feature to the displayed object, so that the current image frame acquired by the image acquisition component 120 may include the auxiliary display feature, and the image recognition component 130 may recognize the target object by combining the auxiliary display feature and the original display feature of the projected displayed object, which may improve the recognition success rate of the image recognition component 130, thereby improving the interactive display effect of the AR-based interactive display system.
Fig. 2 is a flowchart of an AR-based object interaction display method according to an embodiment of the present application, where the method is applied to the system shown in fig. 1, and an execution subject of each step is illustrated as an example of the control component 150. The method at least comprises the following steps:
step 201, obtaining an auxiliary display feature corresponding to the display requirement through a projection component.
Optionally, the display requirement includes: an era characteristic of the related information and/or an information identifier. The information identifier may be a name, a profile, a brief introduction, or the like of the target object's related information; this embodiment does not limit the type of the information identifier. The era characteristic indicates the era of the related information to be presented, such as past eras like the Tang dynasty and the Yuan dynasty, and may of course also be a future era.
Alternatively, the auxiliary presentation feature may be a static feature, such as: a picture; alternatively, it may be a dynamically presented feature, such as: and (6) animation. The auxiliary display features include: the object texture and/or the object feature anchor point corresponding to the current display requirement; the object texture refers to an image projected on the surface of the object to be displayed; an object feature anchor point refers to a specific inflection point located in the texture of an object. Such as: corners with specific geometry, inflection points with significant color differences, etc. in the texture of the object. Reference is made to the schematic illustration of an object texture 301 and an object feature anchor point 302 shown in fig. 3. Of course, the auxiliary display feature may also be another image feature for assisting the image recognition component in recognizing the target object, and the embodiment does not limit the type of the auxiliary display feature.
The auxiliary display features acquired by the projection component can be sent by the control component, at the moment, the control component can receive the display requirement selected or input by a user, and the auxiliary display features corresponding to the display requirement are determined according to the corresponding relation between the display requirement and the auxiliary display features. Optionally, the device to which the control component belongs may store auxiliary display features corresponding to different display requirements, and each display requirement may correspond to at least one set of auxiliary display features.
For example: if the display requirement is Tang dynasty ceramics, the auxiliary display feature may be auxiliary display feature 1; if the display requirement is Song dynasty ceramics, the auxiliary display features may be auxiliary display features 2 and 3.
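The stored correspondence between display requirements and auxiliary-feature groups can be pictured as a simple lookup table. This is a sketch under assumed names; the table contents and function are illustrative, not from the patent.

```python
# Hypothetical lookup table held by the control component's device:
# each display requirement maps to at least one group of auxiliary
# display features (texture / anchor-point images).
AUX_FEATURES = {
    "Tang dynasty ceramics": ["aux_feature_1"],
    "Song dynasty ceramics": ["aux_feature_2", "aux_feature_3"],
}

def features_for_requirement(requirement):
    """Return the stored auxiliary-feature group(s) for a requirement,
    as the control component would after the user selects or inputs it."""
    groups = AUX_FEATURES.get(requirement)
    if groups is None:
        raise KeyError(f"no auxiliary features stored for {requirement!r}")
    return groups
```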
In one example, the control component may also perform image acquisition on the displayed object and start the projection component only when recognition of the captured current image frame fails, so as to project the auxiliary display feature onto the displayed object through the projection component. In this case, acquiring the auxiliary display feature corresponding to the display requirement through the projection component includes: acquiring the current image frame of the displayed object through the image acquisition component; recognizing the current image frame of the displayed object through the image recognition component; and, when the image recognition component fails to recognize it, acquiring the auxiliary display feature corresponding to the display requirement through the projection component. In this way the projection component is started only when the current interactive display scene does not satisfy the recognition conditions of the image recognition component, which reduces the resources consumed by the interactive display system.
Optionally, the auxiliary display features corresponding to the display requirement include at least two groups. In that case, acquiring the auxiliary display feature through the projection component when the image recognition component fails to recognize the current image frame of the displayed object includes: acquiring the failure reason for which recognition of the current image frame failed; and determining, from the at least two groups corresponding to the display requirement, the auxiliary display feature corresponding to that failure reason. Here the device to which the control component belongs also stores the correspondence among display requirement, failure reason, and auxiliary display feature. Failure reasons include, but are not limited to: a poor capture angle, too strong ambient light, too weak ambient light, and/or too few original display features.
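The failure-reason-driven selection can be sketched as a second lookup keyed on (requirement, reason). All names and table entries below are hypothetical placeholders; the patent only specifies that such a correspondence is stored.

```python
# Hypothetical (display requirement, failure reason) -> feature group
# table, mirroring the correspondence the control component stores.
FAILURE_TABLE = {
    ("Tang dynasty ceramics", "ambient light too weak"): "high_contrast_texture",
    ("Tang dynasty ceramics", "few original features"): "dense_anchor_points",
}

def select_aux_feature(requirement, failure_reason, default="generic_texture"):
    """Choose the auxiliary-feature group that targets the concrete
    recognition-failure cause; fall back to a generic group otherwise."""
    return FAILURE_TABLE.get((requirement, failure_reason), default)
```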
And step 202, projecting the auxiliary display features to the exhibited object through the projection component at the frequency of 24 frames/second or 12 frames/second to obtain the projected exhibited object.
And the displayed object comprises the original display characteristics corresponding to the display requirement. Every 24 frames of dynamic auxiliary display features are inserted with at least 1 frame of auxiliary display features including object textures and/or object feature anchor points.
In one example, the projection component projects the auxiliary display feature to the displayed object in a water curtain projection manner, so that the auxiliary display feature can be displayed on the displayed object in three dimensions.
When the auxiliary display features are projected onto the displayed object at a frequency of 24 frames/second, the at least 1 projected frame of auxiliary display features comprising object textures and/or object feature anchor points is imperceptible to the human eye, while the image acquisition component can still capture it, which improves the interactive display effect.
And step 203, acquiring the current image frame of the projected displayed object in real time through the image acquisition component.
Optionally, the image capturing component moves along with the movement of the user, and at this time, the current image frame captured by the image capturing component in real time is an object currently viewed by the user, that is, a projected displayed object.
And 204, identifying whether a target object exists in the current image frame through the image identification component, wherein the auxiliary display feature is used for the image identification component to identify the target object by combining the original display feature.
The target object in this embodiment is an object having an auxiliary presentation feature and an original presentation feature.
Optionally, the image recognition component compares the current image frame with template information in a template library through an image recognition algorithm, and determines that a target object exists in the current image frame when template information matching with image content in the current image frame exists in the template library, at this time, step 205 is executed; when the template information matched with the image content in the current image frame does not exist in the template library, it is determined that the target object does not exist in the current image frame, and the process is ended, or step 201 is executed again to obtain the auxiliary display feature again.
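The template-library comparison in step 204 can be illustrated with a toy matcher. The patent leaves the recognition algorithm unspecified, so the pixel-agreement score below is a stand-in assumption (a real system would use feature matching or normalized cross-correlation); frames and templates are modeled as flat pixel lists.

```python
def match_score(frame, template):
    """Fraction of coinciding pixel values between a frame and a
    template; a stand-in for the unspecified recognition algorithm."""
    if len(frame) != len(template):
        return 0.0
    hits = sum(1 for a, b in zip(frame, template) if a == b)
    return hits / len(template)

def find_target(frame, template_library, threshold=0.8):
    """Return the name of the best-matching template whose score meets
    the threshold, or None when no target object is in the frame."""
    best_name, best_score = None, threshold
    for name, template in template_library.items():
        score = match_score(frame, template)
        if score >= best_score:
            best_name, best_score = name, score
    return best_name
```

When `find_target` returns a name, the flow proceeds to step 205; when it returns None, the flow ends or step 201 is re-executed, as the description states.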
Optionally, each type of template information corresponds to a set of related information, so that after the image recognition component recognizes the target object, a set of related information corresponding to the template information can be obtained.
In one example, the projected displayed object is a defective object; accordingly, the related information of the target object includes a restored image of the displayed object. The control component compares the target object with at least one template restored image in the template library, and determines the template restored image whose matching degree exceeds a preset threshold as the restored image of the displayed object. For example, the displayed object may be a garden ruin.
In another example, when the projected displayed object is a complete object, the related information of the target object accordingly includes introduction information of the displayed object.
Step 205, when a target object exists in the current image frame, the relevant information of the target object is obtained through the AR display component.
After the control component acquires the relevant information of the target object, it sends the relevant information to the AR display component; accordingly, the AR display component obtains the relevant information of the target object.
Step 206, the related information is displayed in a target display mode through the AR display component based on the position of the target object in the spatial scene.
Illustratively, the projected displayed object is a defective object, and the related information includes a restored image of the displayed object. In this case, the control component determines the size of the restored image according to the size of the target object in the current image frame, and determines the position of the target object in the current image frame as the display position of the restored image. At that display position, the AR display component presents the restored image so that it appears line by line over the target object.
Optionally, the related information further includes text introduction information of the target object. In this case, displaying the related information in the target display mode through the AR display component based on the position of the target object in the current image frame may further include: determining a target position relative to the target object, and displaying the text introduction information at that position in the form of a bubble through the AR display component. For example, the introduction information of the target object, such as the excavation date, function, age, and user of the cultural relic, is shown in a bubble at the upper right corner of the target object.
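The overlay layout described above can be sketched as follows, assuming the target object is reported as an axis-aligned bounding box `(x, y, w, h)` in image pixel coordinates (y increasing downward). The function and coordinate convention are illustrative assumptions.

```python
# Sketch of the step 206 layout: the restored image is scaled to the target
# object's size and drawn at its position, while the text bubble is anchored
# at the target's upper-right corner. Bounding box is (x, y, w, h) in pixels,
# with y increasing downward, so the upper-right corner is (x + w, y).

def layout_overlays(target_bbox):
    """Return (restored_image_rect, bubble_anchor) for a target bounding box."""
    x, y, w, h = target_bbox
    restored_image_rect = (x, y, w, h)  # same size and position as target
    bubble_anchor = (x + w, y)          # upper-right corner of the target
    return restored_image_rect, bubble_anchor
```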
The spatial scene refers to a scene where the projected exhibited object is located.
In summary, in the AR-based object interaction display method provided in this embodiment, the auxiliary display features corresponding to the display requirement are acquired through the projection component; the auxiliary display features are projected to the displayed object through the projection component to obtain the projected displayed object; the current image frame of the projected displayed object is acquired in real time through the image acquisition component; whether a target object exists in the current image frame is identified through the image identification component; when a target object exists in the current image frame, the related information of the target object is acquired through the AR display component; and the related information is displayed in a target display mode through the AR display component based on the position of the target object in the current image frame. This solves the problem in existing AR interactive display systems that the display effect is poor because the image recognition component frequently fails to recognize the target object. Because the projection component reinforces the display features of the displayed object, the current image frame acquired by the image acquisition component meets the recognition requirements of the image identification component, which improves the success rate of recognizing the target object and thereby improves the display effect of the AR interactive display system.
Optionally, based on the foregoing embodiments, after step 206 the method further includes: when the display requirement changes, acquiring, through the projection component, the updated auxiliary display features corresponding to the updated display requirement; projecting the updated auxiliary display features to the displayed object through the projection component to obtain the projected displayed object; and performing again the step of acquiring the current image frame of the projected displayed object in real time through the image acquisition component.
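The update flow above can be sketched as a lookup-and-reproject step: when the display requirement changes, the projection component fetches the auxiliary display features bound to the new requirement and re-projects them. The mapping table, names, and projector interface are hypothetical illustrations, not the patent's implementation.

```python
# Sketch of switching auxiliary display features when the display requirement
# changes, so different AR virtual scenes can be shown on the same physical
# object. The requirement-to-features mapping is hypothetical.

AUXILIARY_FEATURES = {
    "ruin_restoration": "texture_garden_ruin",
    "artifact_intro":   "anchor_points_artifact",
}

def on_requirement_changed(new_requirement, projector):
    """Look up and re-project the features for the updated requirement."""
    features = AUXILIARY_FEATURES.get(new_requirement)
    if features is None:
        raise KeyError(f"no auxiliary display features for {new_requirement!r}")
    projector.project(features)  # capture/recognition loop then restarts
    return features
```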
In this embodiment, by changing the auxiliary display features projected by the projection component, different AR virtual display scenes can be presented based on the same displayed object, which reduces the space occupied by displayed objects and improves the utilization of the display space. The technical features of the embodiments described above may be combined arbitrarily; for the sake of brevity, not all possible combinations of these technical features are described, but any such combination should be considered within the scope of the present specification as long as it involves no contradiction.
The above-mentioned embodiments express only several embodiments of the present invention, and their description is relatively specific and detailed, but should not be construed as limiting the scope of the invention. It should be noted that a person skilled in the art can make several variations and modifications without departing from the inventive concept, all of which fall within the scope of the present invention. Therefore, the protection scope of this patent shall be subject to the appended claims.

Claims (8)

1. An AR-based object interactive display method is characterized by comprising the following steps:
acquiring auxiliary display characteristics corresponding to the display requirements through a projection assembly; the auxiliary display features include: the object texture and/or the object feature anchor point corresponding to the display requirement; wherein the object texture refers to an image projected on the surface of the object to be displayed; the object feature anchor point refers to a specific inflection point located in the texture of the object;
projecting the auxiliary display features to the exhibited object through the projection component at the frequency of 24 frames/second or 12 frames/second to obtain the projected exhibited object; at least 1 frame of auxiliary display features comprising the object texture and/or the object feature anchor point is inserted into each 24 frames of dynamic auxiliary display features, so that at least 1 frame of auxiliary display features comprising the object texture and/or the object feature anchor point projected by the projection component cannot be identified by human eyes and can be acquired by the image acquisition component; the displayed object comprises original display characteristics corresponding to the display requirement;
acquiring the current image frame of the projected displayed object in real time through an image acquisition assembly;
identifying whether a target object exists in the current image frame through an image identification component, wherein the auxiliary display feature is combined with the original display feature, so that the image identification component can identify the target object, and the target object is an object comprising the auxiliary display feature and the original display feature;
when the target object exists in the current image frame, acquiring related information of the target object through an AR display component;
and displaying the related information in a target display mode through the AR display component based on the position of the target object identified by the image identification component in the space scene.
2. The method according to claim 1, wherein the obtaining of the auxiliary display feature corresponding to the display requirement through the projection component comprises:
acquiring a current image frame of the displayed object through the image acquisition assembly;
identifying, by the image identification component, a current image frame of the displayed object;
and when the image identification component fails to identify the current image frame of the displayed object, re-projecting the auxiliary display characteristics corresponding to the display requirement through the projection component.
3. The method according to claim 1, wherein the auxiliary display features corresponding to the current display requirement comprise at least two groups of auxiliary display features;
when the image recognition component fails to recognize the current image frame of the displayed object, the auxiliary display feature corresponding to the display requirement is acquired through the projection component, and the auxiliary display feature comprises:
acquiring a failure reason for identifying the failure of the current image frame of the exhibited object;
and determining the auxiliary display characteristics corresponding to the failure reasons from at least two groups of auxiliary display characteristics corresponding to the display requirement.
4. The method of claim 1, further comprising:
when the display requirement is changed, acquiring updated auxiliary display characteristics corresponding to the updated display requirement through the projection assembly;
and projecting the updated auxiliary display features to the displayed object through the projection component to obtain the projected displayed object, and executing the step of acquiring the current image frame of the projected displayed object in real time through the image acquisition component again.
5. The method according to any one of claims 1 to 4, wherein the display requirement comprises: chronological characteristics of the related information and/or an information identifier.
6. The method of any one of claims 1 to 4, wherein the projection assembly is disposed above the object to be exhibited.
7. The method of any one of claims 1 to 4, wherein the projecting the auxiliary presentation feature to the displayed object by the projection component comprises:
and projecting the auxiliary display features to the displayed object through the projection assembly in a water curtain projection mode.
8. An object interaction presentation system for AR, the system comprising:
the projection component is used for acquiring auxiliary display features corresponding to the display requirement; projecting the auxiliary display features to the exhibited object at the frequency of 24 frames/second or 12 frames/second to obtain the projected exhibited object; the displayed object comprises original display characteristics corresponding to the display requirement; the auxiliary display features include: the object texture and/or the object feature anchor point corresponding to the display requirement; wherein the object texture refers to an image projected on the surface of the displayed object; the object feature anchor point refers to a specific inflection point located in the texture of the object; at least 1 frame of auxiliary display features comprising the object texture and/or the object feature anchor point is inserted into each 24 frames of dynamic auxiliary display features, so that at least 1 frame of auxiliary display features comprising the object texture and/or the object feature anchor point projected by the projection component cannot be identified by human eyes and can be acquired by the image acquisition component;
the image acquisition component is used for acquiring the current image frame of the projected displayed object in real time;
the image identification component is used for identifying whether a target object exists in the current image frame, the auxiliary display feature is used for the image identification component to identify the target object in combination with the original display feature, and the target object is an object comprising the auxiliary display feature and the original display feature;
the AR display component is used for acquiring related information of the target object when the target object exists in the current image frame; and displaying the related information in a target display mode based on the current image frame position of the target object in the spatial scene, wherein the current image frame position is identified by the image identification component.
CN201910143803.5A 2019-02-27 2019-02-27 Object interaction display method and system based on Augmented Reality (AR) Active CN109885172B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910143803.5A CN109885172B (en) 2019-02-27 2019-02-27 Object interaction display method and system based on Augmented Reality (AR)

Publications (2)

Publication Number Publication Date
CN109885172A CN109885172A (en) 2019-06-14
CN109885172B true CN109885172B (en) 2022-04-29

Family

ID=66929508

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110418185B (en) * 2019-07-22 2021-08-13 广州市天正科技有限公司 Positioning method and system for anchor point in augmented reality video picture
CN112070903A (en) * 2020-09-04 2020-12-11 脸萌有限公司 Virtual object display method and device, electronic equipment and computer storage medium

Citations (3)

Publication number Priority date Publication date Assignee Title
CN103700127A (en) * 2013-09-02 2014-04-02 西安工程大学 Rapid generating method for ancient site virtual scene based on virtual reality technology
CN108829250A (en) * 2018-06-04 2018-11-16 苏州市职业大学 A kind of object interaction display method based on augmented reality AR
CN109326000A (en) * 2018-10-25 2019-02-12 武汉汉博伟业科技有限公司 A kind of historic site ruins 3 d modeling system and method

Family Cites Families (1)

Publication number Priority date Publication date Assignee Title
KR101700817B1 (en) * 2014-01-10 2017-02-13 한국전자통신연구원 Apparatus and method for multiple armas and hands detection and traking using 3d image


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant