CN111836025A - Augmented reality projection guide system - Google Patents

Augmented reality projection guide system

Info

Publication number
CN111836025A
Authority
CN
China
Prior art keywords
mobile terminal
projection
picture
matching
projection picture
Prior art date
Legal status
Granted
Application number
CN201910304960.XA
Other languages
Chinese (zh)
Other versions
CN111836025B (en)
Inventor
叶江华
尹福灵
Current Assignee
Dongguan Yijia Creative Digital Technology Co ltd
Original Assignee
Dongguan Yijia Digital Creative Co ltd
Priority date
Filing date
Publication date
Application filed by Dongguan Yijia Digital Creative Co ltd
Priority to CN201910304960.XA
Publication of CN111836025A
Application granted
Publication of CN111836025B
Legal status: Active

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/12Picture reproducers
    • H04N9/31Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • H04N9/3179Video signal processing therefor
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00Network arrangements or protocols for supporting network services or applications
    • H04L67/14Session management
    • H04L67/141Setup of application sessions
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L69/00Network arrangements, protocols or services independent of the application payload and not provided for in the other groups of this subclass
    • H04L69/16Implementation or adaptation of Internet protocol [IP], of transmission control protocol [TCP] or of user datagram protocol [UDP]
    • H04L69/161Implementation details of TCP/IP or UDP/IP stack architecture; Specification of modified or new header fields
    • H04L69/162Implementation details of TCP/IP or UDP/IP stack architecture; Specification of modified or new header fields involving adaptations of sockets based mechanisms
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/12Picture reproducers
    • H04N9/31Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • H04N9/3179Video signal processing therefor
    • H04N9/3185Geometric adjustment, e.g. keystone or convergence

Landscapes

  • Engineering & Computer Science (AREA)
  • Signal Processing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Multimedia (AREA)
  • Computer Security & Cryptography (AREA)
  • Physics & Mathematics (AREA)
  • Geometry (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The invention relates to an augmented reality projection guide system comprising a computer terminal, a mobile terminal, and a projector connected to the computer terminal. The mobile terminal is in communication connection with the computer terminal and synchronously and interactively controls the projection picture. The synchronous interactive control is realized through the following steps: S1, the computer terminal uses the Unreal Engine to produce a projection picture carrying identification feature patterns, and the projection picture is projected through the projector; S2, the camera of the mobile terminal scans the projection picture, the mobile terminal acquires and identifies the projection picture in real time, and several pattern buttons are generated in the display picture of the mobile terminal, where the pattern buttons can be used to rotate the projection picture or to add projection patterns; S3, when a corresponding pattern button is operated, the mobile terminal sends the corresponding control signal to the computer terminal in real time, and the computer terminal outputs a return signal and changes the projection picture in real time.

Description

Augmented reality projection guide system
Technical Field
The invention relates to the technical field of augmented reality projection, and in particular to an augmented reality projection guide system.
Background
Augmented reality (AR) is a technology developed on the basis of virtual reality (VR). It enhances a user's perception of the real world with information supplied by a computer system: virtual objects, scenes, or system prompts generated by a computer are superimposed in real time onto the same picture or space as the real scene, so that the virtual and the real coexist. The technology is widely applied in exhibition and teaching. An existing augmented reality projection guide system uses a projector to project computer-generated three-dimensional graphics into a projection display area, so that the viewer sees a scene in which the virtual and the real are fused. However, such a system only achieves the projection effect: it merely projects the three-dimensional graphics into the projection area and cannot interact with the viewer, for example by letting the viewer add an object to the projection picture or rotate the three-dimensional projection picture. The user interaction effect is therefore poor.
Disclosure of Invention
To solve the above problems, the present invention provides an augmented reality projection guide system that realizes synchronous interactive control of the projection picture by connecting a mobile terminal in communication with a computer terminal, giving a better user interaction effect.
To achieve the above purpose, the invention adopts the following technical scheme.
An augmented reality projection guide system comprises a computer terminal, a mobile terminal, and a projector connected to the computer terminal. The mobile terminal is in communication connection with the computer terminal and synchronously and interactively controls the projection picture. The synchronous interactive control is realized through the following steps: S1, the computer terminal uses the Unreal Engine to produce a projection picture carrying identification feature patterns, and the projection picture is projected through the projector; S2, the camera of the mobile terminal scans the projection picture, the mobile terminal acquires and identifies the projection picture in real time, and several pattern buttons are generated in the display picture of the mobile terminal, where the pattern buttons can be used to rotate the projection picture or to add projection patterns; and S3, when a corresponding pattern button is operated, the mobile terminal sends the corresponding control signal to the computer terminal in real time, and the computer terminal outputs a return signal and changes the projection picture in real time.
Further, the mobile terminal establishes a communication connection with the computer terminal through the socket communication protocol: the computer terminal accesses a wireless router and creates a socket; the mobile terminal accesses the same local area network and broadcasts an OnValidQueryPacket message to it; after receiving the message, the computer terminal returns an OnValidResponsePack message to the mobile terminal, and the socket connection is established.
Further, feature map identification areas with low texture repetition are added around the projection picture, and non-repeating patterns are added at the four corners of the projection picture to identify direction. In step S2, the mobile terminal acquires and identifies the projection picture in real time through the following steps:
S21, the mobile terminal predefines the graphics at the four corners of the projection picture, the feature map identification areas of the projection picture, and the pattern buttons corresponding to each graphic/identification area;
S22, the mobile terminal reads each frame of the projection picture shot by the camera in real time, realizing real-time acquisition and identification;
S23, the mobile terminal performs grayscale conversion on the acquired projection picture;
S24, the mobile terminal performs scaling distortion correction on the projection picture processed in step S23;
S25, Gaussian filtering is applied to the projection picture after step S24;
S26, image feature points are extracted from the projection picture after step S25 and matched against the predefined graphics at the four corners of the projection picture;
S27, if the matching succeeds, the mobile terminal determines the four corners and the frame in the display picture and generates the corresponding pattern buttons in the display picture of the mobile terminal; if the matching fails, the feature map identification areas around the projection picture are identified and matched against the predefined feature map identification areas; if this matching succeeds, the mobile terminal determines the four corners and the frame in the display picture and generates the corresponding pattern buttons; if it also fails, the process returns to step S21 to acquire and identify again.
Further, the matching operations in steps S26 and S27 use the normalized squared-difference matching method; the matching algorithm is:
R(x, y) = \frac{\sum_{x',y'} \left[ T(x', y') - I(x + x', y + y') \right]^2}{\sqrt{\sum_{x',y'} T(x', y')^2 \cdot \sum_{x',y'} I(x + x', y + y')^2}}
where R(x, y) is the resulting match value, T(x', y') is the preset template image to be matched, and I(x + x', y + y') is the input image; the larger the match value, the worse the match: a value below 0.5 indicates a successful match, and a value between 0.5 and 1 indicates a failed match.
Further, in step S23, the grayscale conversion algorithm is:
Gray = 0.299R + 0.587G + 0.114B
where Gray is the gray value and R, G, and B are the red, green, and blue channel values.
Further, in the scaling distortion correction processing in step S24, the relationship between the scaled image and the original image is as follows:
newWidth = width \times S_x, \quad newHeight = height \times S_y
where newWidth and newHeight are the width and height of the scaled image, width and height are the width and height of the original image, S_x is the horizontal scaling factor, and S_y is the vertical scaling factor; the coordinate mapping of the image scaling is:
x = S_x \cdot x_0, \quad y = S_y \cdot y_0
where (x_0, y_0) are the coordinates before scaling and (x, y) are the coordinates after scaling.
Further, the algorithm of the gaussian filtering process in step S25 is as follows:
G(u, v) = \frac{1}{2\pi\sigma^2} e^{-(u^2 + v^2)/(2\sigma^2)}
where u and v are the horizontal and vertical offsets of a pixel in the neighborhood from the central pixel of the neighborhood, and σ is the standard deviation of the Gaussian.
Further, in step S3, when the corresponding pattern button is operated, the mobile terminal sends an OnValueChange message to the computer terminal in real time, and the computer terminal returns an OnValueChangeResponse message to the mobile terminal according to the received OnValueChange message, and simultaneously changes the projection screen in real time.
Further, the mobile terminal may be a mobile phone or a tablet.
Further, there may be a plurality of mobile terminals. When several mobile terminals are connected to the computer terminal at the same time, the first mobile terminal to connect has, by default, the function of controlling the rotation of the projection picture, while the other mobile terminals can only add projection patterns.
The invention has the following beneficial effects:
On the basis of augmented reality and projection technology, a mobile terminal is connected to communicate with the computer terminal: the mobile terminal scans and identifies the projection picture, pattern buttons are generated in the display picture of the mobile terminal, and the projection picture can be synchronously and interactively controlled by operating the pattern buttons. The user interaction effect is better and the interactivity of the augmented reality projection guide system is improved. After the mobile terminal scans and identifies the projection picture, the mobile terminal displays the projection picture taken by the camera and also adds pattern buttons for controlling direction, for example moving the viewing angle left, right, up, or down; or it adds pattern buttons for adding projection patterns, and after a preset projection pattern is dragged into the display picture through a pattern button, the projection picture superimposes that pattern in real time.
Drawings
FIG. 1 is a schematic system architecture of one embodiment of the present invention;
fig. 2 is a flowchart of the steps by which the augmented reality projection guide system realizes synchronous interactive control according to an embodiment of the present invention.
Detailed Description
The invention will be further described with reference to the accompanying drawings.
Referring to fig. 1 to 2, an augmented reality projection guide system includes a computer terminal 1, a mobile terminal 2, and a projector 3 connected to the computer terminal 1. The mobile terminal 2 is in communication connection with the computer terminal 1 and synchronously and interactively controls the projection picture 31 through the following steps:
S1, the computer terminal 1 uses the Unreal Engine to produce a projection picture 31 carrying identification feature patterns, and the projection picture 31 is projected through the projector 3;
S2, the camera of the mobile terminal 2 scans the projection picture 31; the mobile terminal 2 acquires and identifies the projection picture 31 in real time and generates several pattern buttons 211 in the display picture 21 of the mobile terminal 2, where the pattern buttons 211 can be used to rotate the projection picture 31 or to add projection patterns;
S3, when a corresponding pattern button 211 is operated, the mobile terminal 2 sends the corresponding control signal to the computer terminal 1 in real time, and the computer terminal 1 outputs a return signal and changes the projection picture 31 in real time.
Thus, on the basis of augmented reality and projection technology, the mobile terminal 2 is connected to communicate with the computer terminal 1: the mobile terminal 2 scans and identifies the projection picture 31, pattern buttons 211 are generated in the display picture 21 of the mobile terminal 2, and the projection picture 31 can be synchronously and interactively controlled by operating the pattern buttons 211, which greatly improves the user interaction effect and the interactivity of the augmented reality projection guide system. After the projection picture 31 has been scanned and identified, the mobile terminal 2 displays the projection picture 31 taken by the camera and also adds pattern buttons 211 for controlling direction, for example moving the viewing angle left, right, up, or down; or it adds pattern buttons 211 for adding projection patterns, and after a preset projection pattern is dragged into the display picture 21 through a pattern button 211, the projection picture 31 superimposes that pattern in real time.
In this embodiment, the mobile terminal 2 and the computer terminal 1 establish a communication connection through the socket communication protocol: the computer terminal 1 accesses the wireless router 4 and creates a socket; the mobile terminal 2 accesses the same local area network and broadcasts an OnValidQueryPacket message to it; after receiving the message, the computer terminal 1 returns an OnValidResponsePack message to the mobile terminal 2, and the socket connection is established.
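By way of illustration, a minimal Python sketch of this handshake is given below. Only the message names OnValidQueryPacket and OnValidResponsePack come from the text above; the port numbers, the use of UDP broadcast for discovery, and the switch to a TCP socket for the control channel are assumptions.

```python
import socket

DISCOVERY_PORT = 9999        # assumed
CONTROL_PORT = 10000         # assumed

def discover_computer_terminal(timeout: float = 3.0) -> str:
    """Broadcast OnValidQueryPacket on the LAN and wait for the computer terminal's reply."""
    udp = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    udp.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
    udp.settimeout(timeout)
    udp.sendto(b"OnValidQueryPacket", ("255.255.255.255", DISCOVERY_PORT))
    data, (host, _) = udp.recvfrom(1024)
    if not data.startswith(b"OnValidResponsePack"):
        raise RuntimeError("no computer terminal answered")
    return host

def connect_control_channel(host: str) -> socket.socket:
    """Open the TCP socket used for the subsequent control messages."""
    tcp = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    tcp.connect((host, CONTROL_PORT))
    return tcp

control = connect_control_channel(discover_computer_terminal())
```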
Referring to fig. 1, in this embodiment, feature map identification areas 311 with low texture repetition are added around the projection picture 31, and non-repeating patterns 312 are added at the four corners of the projection picture 31 to identify direction. In step S2, the mobile terminal 2 acquires and identifies the projection picture 31 in real time through the following steps:
S21, the mobile terminal 2 predefines the graphics 312 at the four corners of the projection picture 31, the feature map identification areas 311 of the projection picture 31, and the pattern buttons 211 corresponding to each graphic 312/identification area 311;
S22, the mobile terminal 2 reads each frame of the projection picture 31 shot by the camera in real time, realizing real-time acquisition and identification;
S23, the mobile terminal 2 performs grayscale conversion on the acquired projection picture 31; converting the projection picture 31 to grayscale makes identification easier;
S24, the mobile terminal 2 performs scaling distortion correction on the projection picture 31 after step S23; this avoids recognition errors caused by distortion of the camera of the mobile terminal 2 or by scaling of the projection picture 31;
S25, Gaussian filtering is applied to the projection picture 31 after step S24, giving a smoother image of the projection picture 31 at different scales and viewing distances;
S26, image feature points are extracted from the projection picture 31 after step S25 and matched against the predefined graphics 312 at the four corners of the projection picture 31; at most 100 feature points are extracted to reduce resource usage;
S27, if the matching succeeds, the mobile terminal 2 determines the four corners and the frame in the display picture 21 and generates the corresponding pattern buttons 211 in the display picture 21 of the mobile terminal 2; if the matching fails, the feature map identification areas 311 around the projection picture 31 are identified and matched against the predefined feature map identification areas 311; if this matching succeeds, the mobile terminal 2 determines the four corners and the frame in the display picture 21 and generates the corresponding pattern buttons 211; if it also fails, the process returns to step S21 to acquire and identify again. During scanning, the camera of the mobile terminal 2 captures the whole projection picture 31 and the mobile terminal 2 performs overall positioning and identification on it, so the display picture 21 stays synchronized with the projection picture 31, realizing accurate synchronous interactive control and a better experience.
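For illustration, the following Python/OpenCV sketch walks through steps S22 to S27 for one frame. The text above fixes only the order of the steps and the cap of 100 feature points; the choice of ORB features, the working resolution, the template file name, and the acceptance threshold are assumptions.

```python
import cv2

# Feature extractor capped at 100 points, matching the limit stated in S26.
orb = cv2.ORB_create(nfeatures=100)
bf = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)

# Descriptors of a predefined corner graphic (S21); the file name is hypothetical.
corner_img = cv2.imread("corner_template.png", cv2.IMREAD_GRAYSCALE)
_, corner_des = orb.detectAndCompute(corner_img, None)

cap = cv2.VideoCapture(0)                                # camera of the mobile terminal (S22)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)       # S23: grayscale conversion
    gray = cv2.resize(gray, (640, 480))                  # S24: normalize scale (size assumed)
    gray = cv2.GaussianBlur(gray, (5, 5), 1.0)           # S25: Gaussian filtering
    kps, des = orb.detectAndCompute(gray, None)          # S26: extract feature points
    if des is None:
        continue
    matches = bf.match(des, corner_des)                  # S26: match against corner graphics
    if len(matches) >= 10:                               # S27: acceptance threshold (assumed)
        pass                                             # corners found: overlay pattern buttons
    # else: fall back to the identification areas, or restart from S21
```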
In steps S26 and S27 above, the matching operation uses the normalized squared-difference matching method; the matching algorithm is as follows:
R(x, y) = \frac{\sum_{x',y'} \left[ T(x', y') - I(x + x', y + y') \right]^2}{\sqrt{\sum_{x',y'} T(x', y')^2 \cdot \sum_{x',y'} I(x + x', y + y')^2}}
where R(x, y) is the resulting match value, T(x', y') is the preset template image to be matched, and I(x + x', y + y') is the input image; the larger the match value, the worse the match: a value below 0.5 indicates a successful match, and a value between 0.5 and 1 indicates a failed match.
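This is the normalized squared-difference score, which OpenCV exposes as TM_SQDIFF_NORMED. A sketch of the 0.5 decision rule follows, where frame_gray and template stand in for the processed camera frame and a predefined corner graphic (both hypothetical inputs).

```python
import cv2

# Slide the template over the frame and score every position with R(x, y).
result = cv2.matchTemplate(frame_gray, template, cv2.TM_SQDIFF_NORMED)
min_val, _, min_loc, _ = cv2.minMaxLoc(result)           # small R(x, y) = good match

if min_val < 0.5:
    top_left = min_loc                                   # matching succeeded at this position
else:
    pass                                                 # matching failed (0.5 <= R < 1)
```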
In step S23, the gray-scale conversion algorithm is:
Gray = 0.299R + 0.587G + 0.114B
where Gray is the gray value; this embodiment adopts the gray value converted with the coefficients 0.114, 0.587, and 0.299 to facilitate identification.
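Written out with NumPy, the conversion might look as follows, assuming img is an H x W x 3 array in RGB channel order.

```python
import numpy as np

def to_gray(img: np.ndarray) -> np.ndarray:
    """Weighted grayscale conversion: Gray = 0.299R + 0.587G + 0.114B."""
    r, g, b = img[..., 0], img[..., 1], img[..., 2]
    return 0.299 * r + 0.587 * g + 0.114 * b
```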
In the above-described scaling distortion correction processing in step S24, the relationship between the scaled image and the original image is as follows:
newWidth = width \times S_x, \quad newHeight = height \times S_y
where newWidth and newHeight are the width and height of the scaled image, width and height are the width and height of the original image, S_x is the horizontal scaling factor, and S_y is the vertical scaling factor; the coordinate mapping of the image scaling is:
x = S_x \cdot x_0, \quad y = S_y \cdot y_0
where (x_0, y_0) are the coordinates before scaling and (x, y) are the coordinates after scaling.
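A nearest-neighbour sketch of this mapping is shown below: each output pixel (x, y) is pulled from (x / S_x, y / S_y) in the original image. The nearest-neighbour choice is an assumption; a real implementation would typically interpolate.

```python
import numpy as np

def scale(img: np.ndarray, sx: float, sy: float) -> np.ndarray:
    """Scale img by (sx, sy) using the coordinate mapping x = sx*x0, y = sy*y0."""
    h, w = img.shape[:2]
    new_w, new_h = int(w * sx), int(h * sy)              # newWidth, newHeight
    # For each output coordinate, look up the source coordinate (x / sx, y / sy).
    xs = (np.arange(new_w) / sx).astype(int).clip(0, w - 1)
    ys = (np.arange(new_h) / sy).astype(int).clip(0, h - 1)
    return img[ys[:, None], xs[None, :]]
```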
The algorithm of the gaussian filtering process in the above step S25 is as follows:
G(u, v) = \frac{1}{2\pi\sigma^2} e^{-(u^2 + v^2)/(2\sigma^2)}
where u and v are the horizontal and vertical offsets of a pixel in the neighborhood from the central pixel of the neighborhood, and σ is the standard deviation of the Gaussian.
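For illustration, the kernel G(u, v) can be built and applied as below; the kernel size of 5 and σ = 1.0 are assumed values, and gray_image stands in for the grayscale frame from step S23.

```python
import numpy as np
from scipy.ndimage import convolve

def gaussian_kernel(size: int = 5, sigma: float = 1.0) -> np.ndarray:
    """Sample G(u, v) on a size x size grid centred on the neighborhood's middle pixel."""
    half = size // 2
    u, v = np.mgrid[-half:half + 1, -half:half + 1]
    g = np.exp(-(u ** 2 + v ** 2) / (2 * sigma ** 2)) / (2 * np.pi * sigma ** 2)
    return g / g.sum()                                   # normalize to preserve brightness

# gray_image: the grayscale frame from step S23 (hypothetical input).
blurred = convolve(gray_image.astype(float), gaussian_kernel())
```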
In step S3 of this embodiment, when the corresponding pattern button 211 is operated, the mobile terminal 2 sends an OnValueChange message to the computer terminal 1 in real time; the computer terminal 1 returns an OnValueChangeResponse message to the mobile terminal 2 according to the received OnValueChange message and simultaneously changes the projection picture 31 in real time. The mobile terminal 2 may be a mobile phone or a tablet. When a plurality of mobile terminals 2 are connected to the computer terminal 1 at the same time, the mobile terminal 2 that connected first has, by default, the function of controlling the rotation of the projection picture 31, while the other mobile terminals 2 can only add projection patterns.
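A hedged sketch of this exchange over the socket opened earlier: only the message names OnValueChange and OnValueChangeResponse come from the text; the JSON payload shape and the newline framing are assumptions.

```python
import json

def send_button_event(sock, button_id: str, value) -> dict:
    """Send an OnValueChange message for one pattern button and wait for the reply."""
    msg = {"type": "OnValueChange", "button": button_id, "value": value}
    sock.sendall(json.dumps(msg).encode() + b"\n")       # newline framing is assumed
    reply = json.loads(sock.makefile().readline())
    assert reply["type"] == "OnValueChangeResponse"      # computer terminal's return signal
    return reply

# e.g., rotating the projection picture from the first-connected terminal:
# send_button_event(control, "rotate_left", 15)
```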
The above embodiments express only several implementations of the invention, and although their description is specific and detailed, it should not be construed as limiting the scope of the invention. It should be noted that a person skilled in the art can make several variations and improvements without departing from the inventive concept, and these all fall within the protection scope of the invention. Therefore, the protection scope of this patent shall be subject to the appended claims.

Claims (10)

1. An augmented reality projection guide system, characterized by comprising a computer terminal, a mobile terminal, and a projector connected to the computer terminal, wherein the mobile terminal is in communication connection with the computer terminal and synchronously and interactively controls a projection picture, the synchronous interactive control being realized through the following steps:
S1, producing, through the computer terminal using the Unreal Engine, a projection picture carrying identification feature patterns, and projecting the projection picture through the projector;
S2, scanning the projection picture with the camera of the mobile terminal, acquiring and identifying the projection picture in real time by the mobile terminal, and generating a plurality of pattern buttons in the display picture of the mobile terminal, wherein the pattern buttons can be used for controlling the rotation of the projection picture or adding projection patterns;
and S3, when the corresponding pattern button is operated, the mobile terminal sends a corresponding control signal to the computer terminal in real time, and the computer terminal outputs a return signal and changes the projection picture in real time.
2. The augmented reality projection guide system of claim 1, wherein the mobile terminal establishes a communication connection with the computer terminal through the socket communication protocol: the computer terminal accesses a wireless router and creates a socket; the mobile terminal accesses the same local area network and broadcasts an OnValidQueryPacket message to the local area network; and after receiving the message, the computer terminal returns an OnValidResponsePack message to the mobile terminal and the socket connection is established.
3. The augmented reality projection guide system of claim 1, wherein feature map identification areas with low texture repetition are added around the projection picture and non-repeating patterns are added at the four corners of the projection picture to identify direction, and in step S2 the mobile terminal acquires and identifies the projection picture in real time through the following steps:
S21, predefining, by the mobile terminal, the graphics at the four corners of the projection picture, the feature map identification areas of the projection picture, and the pattern buttons corresponding to each graphic/identification area;
S22, reading, by the mobile terminal, each frame of the projection picture shot by the camera in real time, realizing real-time acquisition and identification;
S23, performing, by the mobile terminal, grayscale conversion on the acquired projection picture;
S24, performing, by the mobile terminal, scaling distortion correction on the projection picture processed in step S23;
S25, applying Gaussian filtering to the projection picture after step S24;
S26, extracting image feature points from the projection picture after step S25 and matching the extracted image feature points against the predefined graphics at the four corners of the projection picture;
S27, if the matching succeeds, determining, by the mobile terminal, the four corners and the frame in the display picture and generating the corresponding pattern buttons in the display picture of the mobile terminal; if the matching fails, identifying the feature map identification areas around the projection picture and matching them against the predefined feature map identification areas; if this matching succeeds, determining, by the mobile terminal, the four corners and the frame in the display picture and generating the corresponding pattern buttons in the display picture of the mobile terminal; and if it also fails, returning to step S21 to acquire and identify again.
4. The augmented reality projection guide system of claim 3, wherein the matching operations in steps S26 and S27 use the normalized squared-difference matching method, the matching algorithm being as follows:
R(x, y) = \frac{\sum_{x',y'} \left[ T(x', y') - I(x + x', y + y') \right]^2}{\sqrt{\sum_{x',y'} T(x', y')^2 \cdot \sum_{x',y'} I(x + x', y + y')^2}}
where R(x, y) is the resulting match value, T(x', y') is the preset template image to be matched, and I(x + x', y + y') is the input image; the larger the match value, the worse the match: a value below 0.5 indicates a successful match, and a value between 0.5 and 1 indicates a failed match.
5. The augmented reality projection guide system of claim 3, wherein in step S23 the grayscale conversion algorithm is:
Gray = 0.299R + 0.587G + 0.114B
where Gray is the gray value and R, G, and B are the red, green, and blue channel values.
6. The augmented reality projection guide system of claim 3, wherein in the scaling distortion correction of step S24 the relationship between the scaled image and the original image is as follows:
newWidth = width \times S_x, \quad newHeight = height \times S_y
where newWidth and newHeight are the width and height of the scaled image, width and height are the width and height of the original image, S_x is the horizontal scaling factor, and S_y is the vertical scaling factor; the coordinate mapping of the image scaling is:
x = S_x \cdot x_0, \quad y = S_y \cdot y_0
where (x_0, y_0) are the coordinates before scaling and (x, y) are the coordinates after scaling.
7. The augmented reality projection guide system of claim 3, wherein the Gaussian filtering in step S25 uses the following algorithm:
G(u, v) = \frac{1}{2\pi\sigma^2} e^{-(u^2 + v^2)/(2\sigma^2)}
where u and v are the horizontal and vertical offsets of a pixel in the neighborhood from the central pixel of the neighborhood, and σ is the standard deviation of the Gaussian.
8. The augmented reality projection guide system of claim 1, wherein in step S3, when the corresponding pattern button is operated, the mobile terminal sends an OnValueChange message to the computer terminal in real time, and the computer terminal returns an OnValueChangeResponse message to the mobile terminal according to the received OnValueChange message while changing the projection picture in real time.
9. The augmented reality projection guide system of claim 1, wherein the mobile terminal is a mobile phone or a tablet.
10. The augmented reality projection guide system of claim 1, wherein there are a plurality of mobile terminals, and when the plurality of mobile terminals are connected to the computer terminal simultaneously, the mobile terminal that connects first has, by default, the function of controlling the rotation of the projection picture, while the other mobile terminals can only add projection patterns.
CN201910304960.XA 2019-04-16 2019-04-16 Augmented reality projection guide system Active CN111836025B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910304960.XA CN111836025B (en) 2019-04-16 2019-04-16 Augmented reality projection guide system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910304960.XA CN111836025B (en) 2019-04-16 2019-04-16 Augmented reality projection guide system

Publications (2)

Publication Number Publication Date
CN111836025A (en) 2020-10-27
CN111836025B CN111836025B (en) 2022-05-20

Family

ID=72915656

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910304960.XA Active CN111836025B (en) 2019-04-16 2019-04-16 Augmented reality projection guide system

Country Status (1)

Country Link
CN (1) CN111836025B (en)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2014175671A (en) * 2013-03-05 2014-09-22 Nippon Telegr & Teleph Corp <Ntt> Photographing system, method and program
CN104932700A (en) * 2015-07-17 2015-09-23 焦点科技股份有限公司 Methods for achieving object projection by means of intelligent terminal
CN106775525A (en) * 2016-12-02 2017-05-31 北京小米移动软件有限公司 Control the method and device of projecting apparatus

Also Published As

Publication number Publication date
CN111836025B (en) 2022-05-20

Similar Documents

Publication Publication Date Title
CN110009561B (en) Method and system for mapping surveillance video target to three-dimensional geographic scene model
CN107852487B (en) Electronic device for generating 360-degree three-dimensional image and method for the same
CN107484428B (en) Method for displaying objects
EP2343685B1 (en) Information processing device, information processing method, program, and information storage medium
CN102724503B (en) Video compression method and system
JP7387434B2 (en) Image generation method and image generation device
CN112132836A (en) Video image clipping method and device, electronic equipment and storage medium
CN112017222A (en) Video panorama stitching and three-dimensional fusion method and device
JP5677229B2 (en) Video subtitle detection apparatus and program thereof
CN110598139A (en) Web browser augmented reality real-time positioning method based on 5G cloud computing
WO2018010549A1 (en) Method and device for video local zooming
US7139439B2 (en) Method and apparatus for generating texture for 3D facial model
CN108492381A (en) A kind of method and system that color in kind is converted into 3D model pinup pictures
US11043019B2 (en) Method of displaying a wide-format augmented reality object
CN111836025B (en) Augmented reality projection guide system
CN102819846B (en) Method and system for playing high-definition video
CN112351325A (en) Gesture-based display terminal control method, terminal and readable storage medium
Leung et al. Realistic video avatar
CN115908755A (en) AR projection method, system and AR projector
JP2003512802A (en) System and method for three-dimensional modeling
JP2013143725A (en) Image display device, image display method, and program
KR100289703B1 (en) Face Image Matching Method in Model-Based Coding System
CN116055708B (en) Perception visual interactive spherical screen three-dimensional imaging method and system
CN116860112B (en) Combined scene experience generation method, system and medium based on XR technology
JP5694060B2 (en) Image processing apparatus, image processing method, program, imaging apparatus, and television receiver

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CP03 Change of name, title or address

Address after: 523000 room 706, building 1, No. 2, headquarters Second Road, Songshanhu Park, Dongguan City, Guangdong Province

Patentee after: Dongguan Yijia Creative Digital Technology Co.,Ltd.

Address before: 523000 room 706, building 2, No. 2, headquarters 2nd Road, Songshanhu Park, Dongguan City, Guangdong Province

Patentee before: Dongguan Yijia digital creative Co.,Ltd.