CN108021226B - AR content interaction realization method and device based on product packaging

AR content interaction realization method and device based on product packaging

Info

Publication number
CN108021226B
Authority
CN
China
Prior art keywords
content
product
server
label
information
Prior art date
Legal status
Active
Application number
CN201610974049.6A
Other languages
Chinese (zh)
Other versions
CN108021226A (en)
Inventor
田学礼
黎芝维
徐竞峰
Current Assignee
Shenzhen Jinjia Box Technology Co., Ltd.
Original Assignee
Shenzhen Jinjia Hehe Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Shenzhen Jinjia Hehe Technology Co ltd filed Critical Shenzhen Jinjia Hehe Technology Co ltd
Priority to CN201610974049.6A
Publication of CN108021226A
Application granted
Publication of CN108021226B

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 - Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06K - GRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K 7/00 - Methods or arrangements for sensing record carriers, e.g. for reading patterns
    • G06K 7/10 - Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation
    • G06K 7/10544 - Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation by scanning of the records by radiation in the optical part of the electromagnetic spectrum
    • G06K 7/10821 - Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation by scanning of the records by radiation in the optical part of the electromagnetic spectrum, further details of bar or optical code scanning devices
    • G06K 7/10861 - Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation by scanning of the records by radiation in the optical part of the electromagnetic spectrum, sensing of data fields affixed to objects or articles, e.g. coded labels
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 19/00 - Manipulating 3D models or images for computer graphics
    • G06T 19/006 - Mixed reality
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 19/00 - Manipulating 3D models or images for computer graphics
    • G06T 19/20 - Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • Computer Graphics (AREA)
  • Software Systems (AREA)
  • Electromagnetism (AREA)
  • Architecture (AREA)
  • Human Computer Interaction (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Toxicology (AREA)
  • Artificial Intelligence (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)
  • Information Transfer Between Computers (AREA)

Abstract

The invention relates to a method and a device for implementing AR content interaction based on product packaging. The method comprises the following steps: identifying a product label on a product package to obtain label information; uploading the label information to a server, so that the server acquires AR content corresponding to the product according to the label information and returns the AR content; capturing a user image and superimposing the user image with the AR content; and receiving a shooting instruction, and shooting the AR content superimposed with the user image according to the shooting instruction to generate an interactive effect file. By uploading the obtained label information to the server, the AR content matched with the label information and returned by the server can be obtained, and the consumption cost is effectively reduced. By associating the product label with the AR content, the configuration of the AR content becomes flexible and convenient.

Description

AR content interaction realization method and device based on product packaging
Technical Field
The invention relates to the technical field of computers, in particular to a method and a device for realizing AR content interaction based on product packaging.
Background
Augmented reality (AR) applies virtual information to the real world so that it can be perceived by the human senses, producing a sensory experience that goes beyond reality. Conventional approaches to realizing augmented reality mainly include the following two.
The first conventional approach uses the Global Positioning System (GPS) in combination with a geomagnetic sensor to realize augmented reality. The orientation and tilt angle are obtained from the geomagnetic sensor, and related information is retrieved according to the GPS position information and displayed as an overlay. This approach requires a large amount of data to be processed, the geomagnetic sensor is easily disturbed by peripheral equipment such as ferromagnetic devices, and GPS positioning accuracy also needs to be ensured. Moreover, the same position information can only yield one fixed piece of AR content, so the configuration of the AR content is not flexible enough.
The second conventional approach implements augmented reality with predefined identification images. An identification image must be defined in advance; image recognition techniques then search the current image for the predefined identification image, and the AR content associated with that identification image is displayed. The algorithm is highly complex, and no AR content can be obtained for images other than the predefined identification images, so the AR content that can be acquired is limited.
Therefore, when interaction is carried out with the conventional augmented reality approaches, the configuration of AR content is limited and the interaction is not flexible enough. How to interact conveniently and flexibly through augmented reality technology has become a technical problem to be solved.
Disclosure of Invention
In order to solve the above technical problems, it is necessary to provide a method and an apparatus for implementing AR content interaction based on product packaging that allow convenient and flexible interaction through augmented reality technology.
A product packaging-based AR content interaction implementation method comprises the following steps:
identifying a product label on a product package to obtain label information;
uploading the tag information to a server, so that the server acquires AR content corresponding to the product according to the tag information and returns the AR content;
capturing a user image, and overlapping the user image with the AR content;
and receiving a shooting instruction, and shooting the AR content overlapped with the user image according to the shooting instruction to generate an interactive effect file.
In one embodiment, before the step of identifying the product label on the product package, the method further comprises:
detecting whether a function of identifying a product label is started;
if not, displaying a quick operation panel, wherein the quick operation panel comprises an identification function starting control;
acquiring a trigger event acting on the identification function starting control;
responding to the trigger event and starting the function of detecting the product label.
In one embodiment, the method for implementing AR content interaction based on product packaging further includes:
and acquiring the position information of the product, and uploading the position information to the server, so that the server acquires the AR content corresponding to the product according to the position information and the label information.
In one embodiment, the AR content includes an AR advertisement, and before the step of identifying the product label on the product packaging, the method further comprises:
obtaining an AR advertisement for a second product;
associating the AR advertisement of the second product with the tag information of the first product;
and uploading the associated tag information of the first product and the AR advertisement of the second product to a server, so that the server publishes the associated AR advertisement of the second product.
In one embodiment, the method for implementing AR content interaction based on product packaging further includes:
sending an AR content query request to a server so that the server returns an AR list according to the AR content query request;
selecting an AR content identifier to be processed in an AR list;
and generating an AR content processing request by using the AR content identifier to be processed, and sending the AR content processing request to the server, so that the server acquires the corresponding AR content to be processed according to the AR content identifier to be processed, and processing the AR content to be processed.
An AR content interaction implementation device based on product packaging comprises:
the acquisition module is used for identifying a product label on a product package and acquiring label information;
the uploading module is used for uploading the label information to the server so that the server acquires the AR content corresponding to the product according to the label information and returns the AR content;
the superposition module is used for capturing the user image and superposing the user image and the AR content;
and the shooting module is used for receiving the shooting instruction, shooting the AR content overlapped with the user image according to the shooting instruction, and generating an interactive effect file.
In one embodiment, the product package-based AR content interaction implementing apparatus further includes:
the detection module is used for detecting whether the function of identifying the product label is started or not;
the display module is used for displaying a quick operation panel if the function of identifying the product label is not started, wherein the quick operation panel comprises an identification function starting control;
the acquisition module is also used for acquiring a trigger event acting on the identification function starting control;
and the response module is used for responding to the trigger event and starting the function of detecting the product label.
In one embodiment, the obtaining module is further configured to obtain location information of the product, and the uploading module is further configured to upload the location information to the server, so that the server obtains the AR content corresponding to the product according to the location information and the tag information.
In one embodiment, the AR content includes an AR advertisement, and the acquisition module is further configured to acquire an AR advertisement for the second product;
the AR content interaction realizing device based on product packaging further comprises:
the association module is used for associating the AR advertisement of the second product with the label information of the first product;
the uploading module is further used for uploading the associated tag information of the first product and the AR advertisement of the second product to the server, so that the server publishes the associated AR advertisement of the second product.
In one embodiment, the product package-based AR content interaction implementing apparatus further includes:
the sending and inquiring module is used for sending an AR content inquiring request to the server so that the server returns an AR list according to the AR content inquiring request;
the selection module is used for selecting the AR content identification to be processed in the AR list;
the uploading module is further used for generating an AR content processing request by using the AR content identifier to be processed, and sending the AR content processing request to the server, so that the server obtains the corresponding AR content to be processed according to the AR content identifier to be processed, and processing the AR content to be processed.
According to the method and the device for implementing AR content interaction based on product packaging, the product label attached to the product package is identified to obtain the label information, and the label information is uploaded to the server, so that the AR content matched with the label information and returned by the server can be obtained. By associating the product label with the AR content, the configuration of the AR content is flexible and convenient. Moreover, the cost of the product label is low, which effectively reduces the consumption cost. During presentation of the AR content, a user image can also be captured, superimposed with the AR content, and shot to generate an interactive effect file. Because the AR content can be configured flexibly, the interaction effect is effectively improved when the user participates in the interaction.
Drawings
FIG. 1 is a diagram of an embodiment of an application environment of a method for implementing AR content interaction based on product packaging;
FIG. 2 is a flow chart illustrating a method for implementing AR content interaction based on product packaging in one embodiment;
FIG. 3 is an internal block diagram of a terminal in one embodiment;
FIG. 4 is a schematic structural diagram of an AR content interaction implementation device based on product packaging in one embodiment;
FIG. 5 is a schematic structural diagram of an AR content interaction implementation device based on product packaging in another embodiment;
FIG. 6 is a schematic structural diagram of an AR content interaction implementation device based on product packaging in yet another embodiment;
fig. 7 is a schematic structural diagram of an AR content interaction implementation apparatus based on product packaging in yet another embodiment.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the following detailed description of the embodiments of the method and apparatus for augmented reality according to the present invention is provided by way of example with reference to the accompanying drawings. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
FIG. 1 is a diagram of an application environment of a method for implementing AR content interaction based on product packaging in one embodiment. As shown in FIG. 1, the application environment includes a terminal 102 and a server 104. The terminal 102 identifies a product label on a product package, acquires the label information of the product label after identification succeeds, and sends the acquired label information to the server 104. The server 104 receives the label information, searches a plurality of pre-stored AR contents for the AR content matching the label information uploaded by the terminal 102, and returns the acquired AR content to the terminal 102, which displays it. While displaying the AR content, the terminal 102 captures a user image, superimposes the captured user image with the AR content, and displays the superimposed image; when a shooting instruction is received, the terminal 102 shoots the superimposed image to generate an interactive effect file. The terminal includes a smart phone, a tablet computer, a personal digital assistant, and the like.
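For illustration only, the request/response flow in FIG. 1 can be sketched in Kotlin roughly as follows. The ArClient class, the /ar-content endpoint, the field names and the byte-array composition are all hypothetical assumptions introduced for this sketch; the patent does not prescribe any concrete API.

```kotlin
import java.net.HttpURLConnection
import java.net.URL

// Hypothetical value types; the patent does not prescribe concrete formats.
data class TagInfo(val tagId: String, val tagType: String)        // e.g. NFC or RFID payload
data class ArContent(val contentId: String, val mediaUrl: String) // 3D video, 2D animation, ...

class ArClient(private val serverBase: String) {

    // Step 202: identify the product label and obtain the tag information.
    fun readProductTag(rawPayload: String, type: String): TagInfo =
        TagInfo(tagId = rawPayload.trim(), tagType = type)

    // Step 204: upload the tag information so the server can return the matching AR content.
    fun fetchArContent(tag: TagInfo): ArContent {
        val url = URL("$serverBase/ar-content?tagId=${tag.tagId}&tagType=${tag.tagType}")
        val conn = url.openConnection() as HttpURLConnection
        conn.requestMethod = "GET"
        val body = conn.inputStream.bufferedReader().use { it.readText() }
        // Response parsing is application specific; a plain-text media URL is assumed here.
        return ArContent(contentId = tag.tagId, mediaUrl = body.trim())
    }

    // Steps 206/208: superimpose the captured user frame with the AR content and
    // return the combined bytes as the "interactive effect file" (placeholder logic only).
    fun composeAndShoot(userFrame: ByteArray, content: ArContent): ByteArray =
        userFrame + content.mediaUrl.toByteArray()
}
```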
Fig. 2 is a schematic flow chart of a method for implementing AR content interaction based on product packaging in an embodiment. The embodiment is exemplified by applying the method to the terminal 102 in fig. 1, and specifically includes the following steps:
Step 202, identifying the product label on the product package and obtaining the label information.
A product label is affixed to the product package; product labels include NFC tags and RFID tags. The product label carries label information. The terminal has a product tag identification function and obtains the label information by identifying the product tag through this function.
The product tag identification function comprises an NFC tag identification function and an RFID tag identification function, and when the product tag is an NFC tag, the terminal identifies the NFC tag through the NFC tag identification function to acquire tag information of the NFC tag.
When the product label is an RFID label, the terminal identifies the RFID label through the RFID identification function, and the label information of the RFID label is obtained.
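As a minimal sketch of the dispatch described above, assuming the terminal exposes one reader per tag type, the selection between the NFC and RFID identification functions could look like this; NfcReader, RfidReader and their read() methods are illustrative placeholders, not real platform or vendor APIs.

```kotlin
// Hypothetical reader abstractions; real NFC/RFID access is platform specific.
interface TagReader { fun read(): String }           // returns the raw tag payload
class NfcReader : TagReader { override fun read() = "nfc-payload" }
class RfidReader : TagReader { override fun read() = "rfid-payload" }

enum class TagType { NFC, RFID }

// The terminal selects the identification function matching the tag on the package.
fun readTagInfo(type: TagType): String = when (type) {
    TagType.NFC  -> NfcReader().read()   // NFC tag identification function
    TagType.RFID -> RfidReader().read()  // RFID tag identification function
}
```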
Step 204, uploading the label information to a server, so that the server acquires the AR content corresponding to the product according to the label information and returns the AR content.
The terminal is provided with an application program in advance, and after the terminal acquires the label information, the label information is uploaded to the server through the application program. And after receiving the tag information, the server searches the AR content matched with the tag information in the database, acquires the AR content matched with the tag information, and returns the acquired AR content to the terminal. The AR content includes three-dimensional dynamic video, two-dimensional planar animation, and the like. The terminal may present the AR content using the application. The NFC label and the RFID label are low in cost and easy to manufacture, and consumption cost can be effectively reduced. The product tag is associated with the AR content, thereby making the configuration of the AR content more flexible.
Compared with the first conventional approach, this embodiment does not need to obtain the orientation and tilt angle through a geomagnetic sensor and does not need GPS position information. Multiple different AR contents can be configured for multiple product labels corresponding to the same position information, breaking through the limitation that one piece of position information can only yield one fixed piece of AR content. The AR content configuration in this embodiment is therefore more flexible and convenient.
Compared with the second conventional approach, this embodiment does not need to define an identification image in advance; the product label corresponds to the AR content, so the configuration of the AR content is more flexible, and the problem of limited AR content in the second conventional approach is effectively solved.
Step 206, capturing the user image, and overlaying the user image with the AR content.
The terminal also includes a camera through which the user image is captured. The camera comprises a front camera and a rear camera. The user image includes a person image, an animal image, a landscape image, and the like. The user image may be a moving image or a still image.
And after receiving the AR content returned by the server, the terminal displays the returned AR content through the screen and interacts with the user. In the interaction process, the terminal can capture the user image through the front camera and can also capture the user image through the rear camera.
The terminal acquires the coordinate information of the central area of the screen, calculates by using the coordinate information of the central area to obtain the target position of the AR content in the screen, and displays the obtained AR content at the target position in the screen. The terminal captures a user image by using a camera and superimposes the captured user image with the AR content.
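One plausible reading of the centre-based placement described above is to offset the screen centre by half of the content's extent. The following sketch only illustrates that calculation; the Size and Point types and the halving rule are assumptions, not part of the disclosure.

```kotlin
data class Size(val width: Int, val height: Int)
data class Point(val x: Int, val y: Int)

// Place the AR content so that it is centred on the screen:
// target = screen centre minus half of the content's own extent.
fun targetPosition(screen: Size, content: Size): Point {
    val centreX = screen.width / 2
    val centreY = screen.height / 2
    return Point(centreX - content.width / 2, centreY - content.height / 2)
}

fun main() {
    // e.g. a 1080x1920 screen and a 600x600 AR animation
    println(targetPosition(Size(1080, 1920), Size(600, 600)))  // Point(x=240, y=660)
}
```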
Step 208, receiving a shooting instruction, shooting the AR content superimposed with the user image according to the shooting instruction, and generating an interactive effect file.
The terminal receives a shooting instruction input by the user, shoots according to the shooting instruction, and generates a file with an interactive effect; the interactive effect file may be a picture or a video in which the AR content is superimposed.
The terminal can interact with the user through either the front camera or the rear camera. Through the front camera, the terminal can capture an image of the user, superimpose the captured image with the AR content, and display the result on the screen. When the terminal receives a shooting instruction, it shoots the image displayed on the screen to generate a file with an interactive effect. For example, the user may take a selfie group photo with the AR content displayed on the screen through the front camera, or record a video of the interaction with the AR content.
The terminal can also capture images preferred by the user, such as animal images, landscape images, or person images, through the rear camera, superimpose the captured image with the AR content, and display the result on the screen. When the terminal receives a shooting instruction, it shoots the image displayed on the screen to generate a file with an interactive effect. For example, the user may take a photo of a favorite subject interacting with the AR content through the rear camera, or record a video of the favorite subject interacting with the AR content.
In this embodiment, tag information is obtained by identifying the product tag attached to the product package, and the tag information is uploaded to the server, so that the AR content matched with the tag information and returned by the server can be obtained. By associating the product tag with the AR content, the configuration of the AR content is flexible and convenient. Moreover, the cost of the product tag is low, which effectively reduces the consumption cost. During presentation of the AR content, a user image can also be captured, superimposed with the AR content, and shot to generate an interactive effect file. Because the AR content can be configured flexibly, the interaction effect is effectively improved when the user participates in the interaction.
In one embodiment, the method further comprises: and acquiring the position information of the product, and uploading the acquired position information to the server, so that the server acquires the AR content corresponding to the product according to the uploaded position information and the tag information.
Specifically, the terminal acquires the tag information, locates its own position, and obtains the position information. Since the information transmission between the terminal and the product tag is short-range communication, the position information of the terminal can be regarded as the position information of the product. After the terminal acquires the position information, it uploads the acquired position information and the tag information to the server. The server retrieves the AR content matching the uploaded tag information and position information from the database and returns the matching AR content to the terminal.
For example, the terminal identifies the product label of a certain beverage, acquires the label information, locates itself to obtain the position information, and uploads the acquired label information and position information to the server; the server retrieves the AR content matching the uploaded label information and position information from the database. When the position information uploaded by the terminal indicates that the product is located in Shenzhen, the AR content returned by the server is a fairy image. When the position information uploaded by the terminal indicates that the product is located in Guangzhou, the AR content returned by the server is a food image, and so on. In this way, different AR contents can be obtained and displayed through one product label, the acquisition of AR content is more flexible and convenient, and the interaction effect is further enhanced.
The server can also obtain the AR content according to the tag information and the uploading time of the tag information, the uploading time of the tag information is the time for the terminal to upload the tag information, and the server obtains the matched AR content according to the tag information and the uploading time and returns the AR content to the terminal.
For example, after the terminal acquires the tag information of a certain beverage, it uploads the acquired tag information to the server, and the server obtains the matching AR content according to the tag information and the upload time of the tag information. When the upload time is Wednesday and the position information indicates that the product is located in Guangzhou, the AR content returned by the server is a herb image. When the upload time is Friday and the position information indicates that the product is located in Guangzhou, the AR content returned by the server is a gourmet image. The upload time of the tag information may also be reckoned by month, by day, by morning or afternoon, by hour, and so on.
Further, the server can also obtain the matching AR content according to the tag information, the position information, and the upload time, and return the AR content to the terminal. For example, after the terminal acquires the tag information of a certain beverage, it uploads the acquired tag information and position information to the server, and the server obtains the matching AR content according to the tag information, the position information, and the upload time. When the upload time is Wednesday and the position information indicates that the product is located in Guangzhou, the AR content returned by the server is a fairy image. When the upload time is Friday and the position information indicates that the product is located in Shenzhen, the AR content returned by the server is a food image. The upload time may also be reckoned by month, by day, by morning or afternoon, by hour, and so on.
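The matching behaviour in these examples can be pictured as a lookup keyed by tag information plus optional location and upload-time attributes, falling back to less specific keys. The Kotlin sketch below is only an illustration of that idea; the key structure, the city and weekday granularity, and the in-memory map standing in for the server's database are assumptions.

```kotlin
import java.time.DayOfWeek
import java.time.LocalDateTime

data class ArKey(val tagId: String, val city: String?, val day: DayOfWeek?)

// Pre-stored AR content; the most specific key wins, falling back to tag-only content.
val arCatalogue = mapOf(
    ArKey("drink-001", "Guangzhou", DayOfWeek.WEDNESDAY) to "fairy-animation",
    ArKey("drink-001", "Guangzhou", DayOfWeek.FRIDAY)    to "gourmet-animation",
    ArKey("drink-001", "Shenzhen",  null)                to "fairy-animation",
    ArKey("drink-001", null,        null)                to "default-animation"
)

fun matchArContent(tagId: String, city: String?, uploadTime: LocalDateTime): String? {
    val day = uploadTime.dayOfWeek
    return arCatalogue[ArKey(tagId, city, day)]          // tag + location + upload time
        ?: arCatalogue[ArKey(tagId, city, null)]         // tag + location
        ?: arCatalogue[ArKey(tagId, null, null)]         // tag only
}
```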
In the first conventional approach, only fixed AR content can be acquired after GPS positioning, the configuration of the AR content is not flexible enough, and the AR content available to the user is very limited. In this embodiment, the server can return the corresponding AR content to the terminal according to the tag information, the location information, and/or the upload time; for different tag information, different location information, and different upload times, the server may return different AR content. The AR content can therefore be configured more flexibly, the user can obtain a variety of AR content to participate in the interaction, and the interaction effect is effectively improved.
In one embodiment, before the step of identifying the product label on the product package, the method further comprises: detecting whether a function of identifying a product label is started; if not, displaying a quick operation panel, wherein the quick operation panel comprises an identification function starting control; acquiring a trigger event for the identification function starting control; and responding to the trigger event and starting a function for detecting the product label.
The label identification function can be enabled from a software interface, or enabled in advance through the phone's settings ("switch") function. When the APP interface on the terminal is started, it automatically detects whether the label identification function is enabled; if not, a shortcut operation panel is displayed to remind the user to turn the function on, and the panel contains an identification function starting control.
The trigger event is initiated by the user and acts on the identification function starting control in the shortcut operation panel; when the trigger event occurs, the terminal responds to it and starts the function of detecting the product label.
In this embodiment, whether the product label identification function is enabled is detected automatically, and a quick operation panel is provided when it is not, which improves the convenience of user operation: the user can start the identification function directly and scan the product label without leaving the operation interface, enabling the function manually, and then re-entering the interface, thereby simplifying the operation flow.
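A minimal sketch of this check-and-prompt flow follows; isTagFunctionEnabled and showQuickPanel are hypothetical hooks standing in for the terminal's settings query and the quick operation panel, and are not real platform APIs.

```kotlin
// Hypothetical hooks into the terminal; real implementations are platform specific.
fun isTagFunctionEnabled(): Boolean = false                 // e.g. state of the NFC/RFID switch
fun showQuickPanel(onEnable: () -> Unit) { onEnable() }     // panel with an "enable" control

var scanningStarted = false

fun ensureTagScanning() {
    if (isTagFunctionEnabled()) {
        scanningStarted = true                              // function already on: scan directly
    } else {
        // Show the quick operation panel; the trigger event on the enable control
        // turns the identification function on without leaving the interface.
        showQuickPanel(onEnable = { scanningStarted = true })
    }
}
```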
In one embodiment, the AR content includes an AR advertisement, the tag information of the product tag corresponds to a first product, and before the step of identifying the product tag on the product packaging, the method includes: obtaining an AR advertisement for a second product; associating the AR advertisement of the second product with the tag information of the first product; and uploading the associated tag information of the first product and the AR advertisement of the second product to a server, so that the server publishes the associated AR advertisement of the second product.
In this embodiment, a product label is affixed to the first product's package. The terminal obtains the tag information by identifying the product tag on the first product's package, and may also obtain an AR advertisement for the second product. The terminal uploads the AR advertisement of the second product to the server, and can upload the tag information of the first product's product tag to the server; the server establishes the correspondence between the advertisement of the second product and the tag information of the product tag of the first product, associates the AR advertisement of the second product with the tag information corresponding to the first product, and publishes the associated AR advertisement of the second product. Where the first product itself may not be advertised, or where a related second product is to be promoted through the first product, the advertisement of the second product can be published by reusing the tag information of the first product, enabling self-service publication of advertisements for other products with existing tag information resources.
After the advertisement of the second product is published, the AR advertisement of the first product may be covered or may continue to be retained. After the terminal identifies the product tag and acquires the tag information, the server may return the AR advertisement of the second product together with the AR advertisement of the first product to the terminal, or may return only the AR advertisement of the second product. When the AR advertisement of the first product continues to be retained, the server can return the AR advertisement of the first product and the AR advertisement content of the second product to the terminal according to the terminal's advertisement download request. The user can select which AR advertisement to present through the terminal.
In this embodiment, the server receives the AR advertisement of the second product uploaded by the terminal, obtains the tag information corresponding to that AR advertisement, associates the AR advertisement of the second product with the corresponding tag information, and publishes the associated advertisement. After the product advertisement is published, a terminal can obtain the corresponding AR advertisement content by acquiring the tag information of the product label on the product package, which deepens the interaction between the user and the product and makes it convenient for the user to learn about the product advertisement in time.
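The association step can be pictured as a small publish request that ties the second product's AR advertisement to the first product's tag information. Everything in the sketch below (the PublishRequest shape and the overlayFirstProductAd flag standing in for whether the first product's advertisement is covered) is an illustrative assumption.

```kotlin
data class ArAdvert(val productId: String, val mediaUrl: String)

// Ties the second product's AR advertisement to the first product's tag information,
// optionally covering (replacing) the first product's own AR advertisement.
data class PublishRequest(
    val firstProductTagId: String,
    val secondProductAd: ArAdvert,
    val overlayFirstProductAd: Boolean
)

fun buildPublishRequest(tagId: String, ad: ArAdvert, overlay: Boolean) =
    PublishRequest(firstProductTagId = tagId, secondProductAd = ad, overlayFirstProductAd = overlay)

fun main() {
    val ad = ArAdvert(productId = "second-product", mediaUrl = "https://example.com/ad.mp4")
    // The terminal would serialise this and upload it; the server then publishes the
    // second product's advertisement under the first product's tag information.
    println(buildPublishRequest("first-product-tag", ad, overlay = false))
}
```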
In one embodiment, the terminal sends an AR content query request to the server, so that the server returns an AR list according to the AR content query request, wherein the AR list comprises an AR content identifier; selecting an AR content identifier to be processed in an AR list; and generating an AR content processing request by using the AR content identifier to be processed, and sending the AR content processing request to the server, so that the server acquires the corresponding AR content to be processed according to the AR content identifier to be processed, and processing the AR content to be processed.
In this embodiment, the AR content includes AR content information. After the AR content of a product is uploaded to the server, the server may associate the AR content with the tag information, store the associated AR content, and temporarily withhold publication. The server generates an AR list according to the AR content information of the AR content; the AR list includes AR content identifiers. The user can process the uploaded AR content through the terminal, including publishing, deleting, modifying, and editing it. The user sends an AR content query request to the server through the terminal, and the server obtains the AR list according to the AR content query request.
The server returns the AR list to the terminal. The user can select the to-be-processed AR content identification in the AR list in the terminal, namely the to-be-processed AR content identification, and the to-be-processed AR content identification is uploaded to the server. And the server acquires the corresponding AR content to be processed according to the AR content identification to be processed and performs corresponding processing operation on the AR content to be processed.
For example, the user sends an AR content query request to the server through the terminal, and the server acquires an AR list according to the AR content query request and returns the AR list to the terminal. And the user selects the AR content to be deleted from the AR list in the terminal and uploads the AR content identification to be deleted to the server. And the server acquires the advertisement to be deleted according to the identifier of the AR content to be deleted and deletes the AR content to be deleted.
By selecting the identifier of the AR content to be processed in the AR list, the AR content to be processed can be published, deleted, modified, or edited, which makes it convenient for the user to process the AR content in a self-service manner according to their own needs.
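The query-then-process exchange maps naturally onto two requests: one returning the AR list and one naming an identifier and an operation. The sketch below mirrors the four operations listed (publish, delete, modify, edit); the request and response shapes, and the in-memory map standing in for the server's stored AR content, are assumptions.

```kotlin
enum class ArOperation { PUBLISH, DELETE, MODIFY, EDIT }

data class ArListEntry(val contentId: String, val description: String)
data class ArProcessRequest(val contentId: String, val operation: ArOperation)

// Server-side stand-in for the stored AR content (a database in practice).
val storedContent = mutableMapOf(
    "ad-001" to "unpublished fairy animation",
    "ad-002" to "published gourmet animation"
)

// Returns the AR list sent back in response to an AR content query request.
fun queryArList(): List<ArListEntry> =
    storedContent.map { (id, desc) -> ArListEntry(id, desc) }

// Applies the requested processing operation to the identified AR content.
fun processArContent(request: ArProcessRequest) {
    when (request.operation) {
        ArOperation.DELETE -> storedContent.remove(request.contentId)
        ArOperation.PUBLISH, ArOperation.MODIFY, ArOperation.EDIT ->
            storedContent[request.contentId] = "${request.operation}: ${storedContent[request.contentId]}"
    }
}

fun main() {
    println(queryArList())                                        // terminal receives the AR list
    processArContent(ArProcessRequest("ad-001", ArOperation.DELETE))
    println(queryArList())                                        // ad-001 has been deleted
}
```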
Fig. 3 is an internal structure diagram of a terminal in one embodiment. As shown in Fig. 3, the terminal 102 includes a processor, a graphics processing unit, a storage medium, a memory, a network interface, a touch display, and an input device connected by a system bus. The storage medium stores an operating system and an AR content interaction implementation apparatus based on product packaging, which is used to implement the method for implementing AR content interaction based on product packaging. The processor provides computing and control capability and supports the operation of the terminal. The graphics processing unit provides at least the drawing capability for displaying the application's operation interface; the memory provides a runtime environment for the AR content interaction implementation apparatus stored in the storage medium; and the network interface performs network communication with the server, receiving and sending data, for example sending the terminal's location information and the identified product label information to the server. The touch display shows the icons and interfaces of the applications, for example displaying the acquired AR content superimposed on the user image captured by the camera.
As shown in fig. 4, in one embodiment, there is provided an AR content interaction implementation apparatus based on product packaging, the apparatus including: an acquisition module 402, an upload module 404, a superimposition module 406, and a capture module 408, wherein:
an obtaining module 402, configured to identify a product label on a product package, and obtain label information;
an uploading module 404, configured to upload the tag information to a server, so that the server obtains the AR content corresponding to the product according to the tag information, and returns the AR content;
an overlay module 406 for capturing a user image, overlaying the user image with the AR content;
and the shooting module 408 is configured to receive a shooting instruction, and shoot the AR content superimposed with the user image according to the shooting instruction to generate an interactive effect file.
As shown in fig. 5, in one embodiment, the product packaging-based AR content interaction implementation apparatus includes: an acquisition module 502, an upload module 504, a superposition module 506, a capture module 508, a detection module 510, and a response module 512.
A detection module 510 for detecting whether a function of identifying a product tag is turned on;
a display module for displaying a quick operation panel if the function of identifying the product label is not turned on, wherein the quick operation panel comprises an identification function starting control;
the obtaining module 502 is further configured to obtain a trigger event acting on the identification function starting control;
and a response module 512, configured to respond to the trigger event and start a function of detecting the product tag.
In one embodiment, the obtaining module is further configured to obtain location information of the product, and upload the location information to the server, so that the server obtains the AR content corresponding to the product according to the location information and the tag information.
As shown in fig. 6, in one embodiment, the AR content includes AR advertisement, and the product packaging-based AR content interaction implementing device includes: an acquisition module 602, an upload module 604, a superimposition module 606, a capture module 608, and an association module 610.
The obtaining module 602 is further configured to obtain an AR advertisement for a second product;
an association module 610 for associating the AR advertisement of the second product with the tag information of the first product;
the uploading module 604 is further configured to upload the associated tag information of the first product and the AR advertisement of the second product to the server, so that the server publishes the associated AR advertisement of the second product.
As shown in fig. 7, in one embodiment, the apparatus further comprises: an acquisition module 702, an upload module 704, a superimposition module 706, a capture module 708, a query module 710, and a selection module 712.
The query module 710 is configured to send an AR content query request to the server, so that the server returns an AR list according to the AR content query request;
a selecting module 712, configured to select an AR content identifier to be processed in the AR list;
the uploading module 704 is further configured to generate an AR content processing request by using the to-be-processed AR content identifier, and send the AR content processing request to the server, so that the server obtains the corresponding to-be-processed AR content according to the to-be-processed AR content identifier, and performs a processing operation on the to-be-processed AR content.
The technical features of the above embodiments can be arbitrarily combined, and for the sake of brevity, all possible combinations of the technical features in the above embodiments are not described, but should be considered as the scope of the present specification as long as there is no contradiction between the combinations of the technical features.
The above examples only show some embodiments of the present invention, and the description thereof is more specific and detailed, but not construed as limiting the scope of the invention. It should be noted that, for a person skilled in the art, several variations and modifications can be made without departing from the inventive concept, which falls within the scope of the present invention. Therefore, the protection scope of the present patent shall be subject to the appended claims.

Claims (9)

1. A product packaging-based AR content interaction implementation method, the method comprising:
identifying a product label on a product package of a first product, acquiring label information, and determining uploading time of the label information; the uploading time of the label information is the time for the terminal to upload the label information; the product tag comprises an NFC tag and an RFID tag;
positioning the position of the terminal, acquiring the position information of the terminal, and regarding the position information of the terminal as the position information of the first product;
uploading the position information of the first product, the label information and the uploading time of the label information to a server, so that the server acquires the matched AR content according to the position information of the first product, the label information and the uploading time of the label information;
receiving the AR content returned by the server; when the AR content of the first product is released and not covered, the returned AR content comprises the AR content of the first product; when the AR content of the first product is covered by the published AR content of the second product, the returned AR content comprises the AR content of the second product;
capturing a user image, superimposing the user image with the AR content;
receiving a shooting instruction, and shooting the AR content overlapped with the user image according to the shooting instruction to generate an interactive effect file;
when processing the AR content in the server, sending an AR content query request to the server, so that the server returns an AR list according to the AR content query request; selecting an AR content identifier to be processed in the AR list, generating an AR content processing request by using the AR content identifier to be processed, and sending the AR content processing request to the server, so that the server acquires corresponding AR content to be processed according to the AR content identifier to be processed, and processing the AR content to be processed, wherein the processing operation comprises at least one of releasing, deleting, modifying and editing.
2. The method of claim 1, further comprising, prior to the step of identifying the product label on the product package:
detecting whether a function of identifying a product label is started;
if not, displaying a quick operation panel, wherein the quick operation panel comprises an identification function starting control;
acquiring a trigger event for the identification function starting control;
and responding to the trigger event and starting a function for detecting the product label.
3. The method of claim 1, further comprising:
and acquiring the position information of the product, and uploading the position information to a server, so that the server acquires the AR content corresponding to the product according to the position information and the label information.
4. The method of claim 1, wherein the AR content comprises an AR advertisement, further comprising, prior to the step of identifying a product label on a product package of the first product:
obtaining an AR advertisement for a second product;
associating the AR advertisement of the second product with the tag information of the first product;
uploading the associated tag information of the first product and the AR advertisement of the second product to a server, so that the server publishes the associated AR advertisement of the second product.
5. The method of claim 1, wherein the AR content comprises three-dimensional motion video and two-dimensional planar animation.
6. An AR content interaction implementation device based on product packaging, the device comprising:
the system comprises an acquisition module, a processing module and a display module, wherein the acquisition module is used for identifying a product label on a product package of a first product, acquiring label information and determining uploading time of the label information; the uploading time of the label information is the time for the terminal to upload the label information; the product tag comprises an NFC tag and an RFID tag; positioning the position of the terminal, acquiring the position information of the terminal, and regarding the position information of the terminal as the position information of the first product;
the uploading module is used for uploading the position information of the first product, the label information and the uploading time of the label information to a server, so that the server can obtain the matched AR content according to the position information of the first product, the label information and the uploading time of the label information;
receiving the AR content returned by the server; when the AR content of the first product is released and not covered, the returned AR content comprises the AR content of the first product; when the AR content of the first product is covered by the published AR content of the second product, the returned AR content comprises the AR content of the second product;
a superposition module for capturing a user image, superposing the user image with the AR content;
the shooting module is used for receiving a shooting instruction, shooting the AR content overlapped with the user image according to the shooting instruction, and generating an interactive effect file;
the sending module is used for sending an AR content query request to the server so that the server returns an AR list according to the AR content query request;
the selection module is used for selecting the AR content identification to be processed in the AR list;
the uploading module is further configured to generate an AR content processing request by using the to-be-processed AR content identifier, and send the AR content processing request to the server, so that the server obtains the corresponding to-be-processed AR content according to the to-be-processed AR content identifier and performs a processing operation on the to-be-processed AR content.
7. The apparatus of claim 6, further comprising:
the detection module is used for detecting whether the function of identifying the product label is started or not;
the display module is used for displaying a quick operation panel if the function of identifying the product label is not started, and the quick operation panel comprises an identification function starting control;
the acquisition module is also used for acquiring a trigger event for the identification function starting control;
and the response module is used for responding to the trigger event and starting the function of detecting the product label.
8. The apparatus according to claim 6, wherein the obtaining module is further configured to obtain location information of a product, and the uploading module is further configured to upload the location information to a server, so that the server obtains the AR content corresponding to the product according to the location information and the tag information.
9. The apparatus of claim 6, wherein the AR content comprises an AR advertisement, and wherein the obtaining module is further configured to obtain the AR advertisement for the second product;
the device further comprises:
the association module is used for associating the AR advertisement of the second product with the label information of the first product;
the uploading module is further used for uploading the associated tag information of the first product and the AR advertisement of the second product to a server, so that the server publishes the associated AR advertisement of the second product.
CN201610974049.6A 2016-11-03 2016-11-03 AR content interaction realization method and device based on product packaging Active CN108021226B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610974049.6A CN108021226B (en) 2016-11-03 2016-11-03 AR content interaction realization method and device based on product packaging

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201610974049.6A CN108021226B (en) 2016-11-03 2016-11-03 AR content interaction realization method and device based on product packaging

Publications (2)

Publication Number Publication Date
CN108021226A CN108021226A (en) 2018-05-11
CN108021226B true CN108021226B (en) 2021-07-20

Family

ID=62083732

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610974049.6A Active CN108021226B (en) 2016-11-03 2016-11-03 AR content interaction realization method and device based on product packaging

Country Status (1)

Country Link
CN (1) CN108021226B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109872800B (en) * 2019-03-13 2021-05-14 京东方科技集团股份有限公司 Diet accompanying system and diet accompanying method
CN111127669A (en) * 2019-12-30 2020-05-08 北京恒华伟业科技股份有限公司 Information processing method and device

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110242134A1 (en) * 2010-03-30 2011-10-06 Sony Computer Entertainment Inc. Method for an augmented reality character to maintain and exhibit awareness of an observer
CN102739872A (en) * 2012-07-13 2012-10-17 苏州梦想人软件科技有限公司 Mobile terminal, and augmented reality method used for mobile terminal
CN103218731A (en) * 2013-03-25 2013-07-24 深圳市精彩明天科技有限公司 Method and system utilizing two-dimension code to advertise
CN104951498A (en) * 2014-03-26 2015-09-30 施耐德电器工业公司 Method for generating a content in augmented reality mode
CN105632263A (en) * 2016-03-29 2016-06-01 罗昆 Augmented reality-based music enlightenment learning device and method
US20160188943A1 (en) * 2014-12-30 2016-06-30 Hand Held Products, Inc. Augmented reality vision barcode scanning system and method

Also Published As

Publication number Publication date
CN108021226A (en) 2018-05-11

Similar Documents

Publication Publication Date Title
EP3167446B1 (en) Apparatus and method for supplying content aware photo filters
US9639988B2 (en) Information processing apparatus and computer program product for processing a virtual object
US20120081393A1 (en) Apparatus and method for providing augmented reality using virtual objects
US10831334B2 (en) Teleportation links for mixed reality environments
US10127724B2 (en) System and method for providing augmented reality on mobile devices
US10264207B2 (en) Method and system for creating virtual message onto a moving object and searching the same
US20170337747A1 (en) Systems and methods for using an avatar to market a product
US20160253843A1 (en) Method and system of management for switching virtual-reality mode and augmented-reality mode
WO2019105274A1 (en) Method, device, computing device and storage medium for displaying media content
KR20050078136A (en) Method for providing local information by augmented reality and local information service system therefor
US11880999B2 (en) Personalized scene image processing method, apparatus and storage medium
US9135735B2 (en) Transitioning 3D space information to screen aligned information for video see through augmented reality
US9491368B2 (en) Information obtaining and viewing device, data processing method thereof, and non-transitory computer readable medium storing a program
CN112330819B (en) Interaction method and device based on virtual article and storage medium
CN108021226B (en) AR content interaction realization method and device based on product packaging
CN108171801A (en) A kind of method, apparatus and terminal device for realizing augmented reality
CN108961424B (en) Virtual information processing method, device and storage medium
CN113891166A (en) Data processing method, data processing device, computer equipment and medium
US20210142573A1 (en) Viewing system, model creation apparatus, and control method
CN113609358A (en) Content sharing method and device, electronic equipment and storage medium
JP6617547B2 (en) Image management system, image management method, and program
JP6001057B2 (en) Method, apparatus, and terminal device for information generation and processing
JP2021163287A (en) Augmenting reality system
US10069984B2 (en) Mobile device and method for controlling the same
US20140063240A1 (en) Systems, apparatuses, and methods for branding and/or advertising through immediate user interaction, social networking, and image sharing

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right
TA01 Transfer of patent application right

Effective date of registration: 20181219

Address after: 518000 Room 201, building A, No. 1, Qian Wan Road, Qianhai Shenzhen Hong Kong cooperation zone, Shenzhen, Guangdong (Shenzhen Qianhai business secretary Co., Ltd.)

Applicant after: Shenzhen Jinjia Box Technology Co., Ltd.

Address before: 518000 Room 201, building A, No. 1, Qian Wan Road, Qianhai Shenzhen Hong Kong cooperation zone, Shenzhen, Guangdong (Shenzhen Qianhai business secretary Co., Ltd.)

Applicant before: Shenzhen Jinjia Medium Technology Co., Ltd.

GR01 Patent grant
GR01 Patent grant