CN111290566A - AR-based intelligent home experience method and experience system - Google Patents


Info

Publication number
CN111290566A
CN111290566A
Authority
CN
China
Prior art keywords
spot array
smart home
light spot
array
acquiring
Prior art date
Legal status
Granted
Application number
CN201811484389.6A
Other languages
Chinese (zh)
Other versions
CN111290566B (en)
Inventor
于思博
张宁
咸竞天
郑家宁
吴瑾
Current Assignee
Changchun Institute of Optics Fine Mechanics and Physics of CAS
Original Assignee
Changchun Institute of Optics Fine Mechanics and Physics of CAS
Priority date
Filing date
Publication date
Application filed by Changchun Institute of Optics, Fine Mechanics and Physics of CAS
Priority to CN201811484389.6A
Publication of CN111290566A
Application granted
Publication of CN111290566B
Legal status: Active

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011: Arrangements for interaction with the human body, e.g. for user immersion in virtual reality

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The application relates to the technical field of smart home, and specifically discloses an AR-based smart home experience method and system comprising the following steps: generating a light spot array with a laser and an optical element arranged in the current scene; capturing the light spot array and identifying its features; obtaining the virtual smart-home model corresponding to the light spot array according to the identification result; locating the central position of the light spot array and establishing a three-dimensional coordinate system at that position; and creating and displaying the virtual smart-home model in the three-dimensional coordinate system. Using a light spot array generated by a laser combined with a diffractive optical element as the marker avoids the difficulty and danger of pasting and fixing markers in hard-to-reach locations such as high walls and ceilings, and has important significance and application value for augmented-reality smart homes and other augmented-reality scenarios.

Description

AR-based intelligent home experience method and experience system
Technical Field
The application relates to the technical field of smart home, and in particular to an AR-based smart home experience method and experience system.
Background
Augmented Reality (AR) is a technology that has become popular in recent years. AR fuses virtual information with the real environment and displays the result in real time, enhancing the information of a real scene. AR is widely applied in industries such as aircraft manufacturing, teaching, and medical treatment. In an AR smart home, a user views virtual furniture, built above a marker, on the screen of a display device; in this way, the user can preview the post-decoration effect before purchasing furniture or decoration and thus enjoy a "real" experience. To realize an AR home experience, a fixed marker, usually a two-dimensional code, must be pasted in the real environment. Because some furniture is large or the decoration position is high, placing the fixed marker is often difficult and dangerous. Moreover, once placed, the fixed marker is hard to move, so the virtual object is rigidly tied to its construction position in the real world and cannot be repositioned freely by the user, which leads to a poor virtual home experience.
Disclosure of Invention
In view of this, embodiments of the present application provide an AR-based smart home experience method and an AR-based smart home experience system, so as to solve the prior-art problems that placing a marker for a smart home is inconvenient and that the marker cannot be moved after placement.
The embodiment of the application provides an AR-based smart home experience method in a first aspect, and the experience method comprises the following steps:
generating a light spot array through a laser and an optical element which are arranged in a current scene;
collecting the light spot array and identifying the characteristics of the light spot array;
acquiring a virtual model of the smart home corresponding to the light spot array according to the identification result;
acquiring the central position of the light spot array to establish a three-dimensional coordinate system at the central position;
and creating and displaying the virtual model of the smart home in the three-dimensional coordinate system.
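The five steps above can be sketched as a minimal, purely illustrative pipeline. Every function name, the toy image, and the model registry below are assumptions for illustration, not part of the disclosed implementation:

```python
def recognize_spot_array(image):
    """Step 2: detect spot features (placeholder: bright pixels)."""
    return [(r, c) for r, row in enumerate(image)
            for c, v in enumerate(row) if v > 128]

def lookup_model(spot_count, registry):
    """Step 3: map the identified array to a stored virtual model."""
    return registry.get(spot_count)

def array_center(spots):
    """Step 4: take the centroid of the spots as the coordinate origin."""
    n = len(spots)
    return (sum(r for r, _ in spots) / n, sum(c for _, c in spots) / n)

# Toy 3x3 grayscale image with one bright spot in the middle.
image = [[0, 0, 0], [0, 255, 0], [0, 0, 0]]
registry = {1: "virtual sofa model"}

spots = recognize_spot_array(image)
model = lookup_model(len(spots), registry)
origin = array_center(spots)
print(model, origin)  # step 5 would render the model at `origin`
```

In the actual method, feature identification would use the Hough-transform and edge-detection techniques described later, not simple thresholding.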
Optionally, before the obtaining the virtual model of the smart home corresponding to the light spot array according to the identification result, the method includes:
and creating a corresponding relation between the light spot array and a virtual model of the smart home, and storing the corresponding relation to a specified position.
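The correspondence between a light spot array and a virtual model can be held in any key-value store. The sketch below uses an in-memory mapping keyed by the array's layout; all names and model entries are invented for illustration, and the dict stands in for whatever "specified position" the implementation persists to:

```python
SPOT_MODEL_REGISTRY = {}

def register_model(rows, cols, model_name):
    """Create the correspondence and store it at a specified position
    (here an in-memory dict, standing in for persistent storage)."""
    SPOT_MODEL_REGISTRY[(rows, cols)] = model_name

def model_for_array(rows, cols):
    """Retrieve the virtual model matching an identified spot array."""
    return SPOT_MODEL_REGISTRY.get((rows, cols))

register_model(3, 3, "virtual wardrobe")
register_model(4, 5, "virtual ceiling lamp")
print(model_for_array(3, 3))
```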
Optionally, the generating the array of spots by the laser and the optical element disposed in the current scene includes:
controlling the laser to emit light spots with a specified intensity and in a specified arrangement;
and controlling the light spots to pass through the optical element to generate a light spot array corresponding to the intensity of the light spots.
Optionally, a character of a specified typeface is superimposed at the central position of the light spot array;
accordingly, the acquiring the central position of the array of light spots comprises:
and acquiring the central position of the light spot array by identifying the characters of the specified typeface.
Optionally, the characters of the specified typeface comprise cross-shaped characters.
Optionally, the array of spots is an N × X array, wherein N and X are both positive integers not less than 3.
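The ideal positions of such an N × X array can be generated as in the hedged sketch below; the `pitch` parameter and the function name are assumptions, not part of the claims:

```python
def spot_grid(n, x, pitch=1.0):
    """Generate ideal (row, col) positions for an N x X spot array.
    The method requires N >= 3 and X >= 3."""
    if n < 3 or x < 3:
        raise ValueError("N and X must be positive integers not less than 3")
    return [(i * pitch, j * pitch) for i in range(n) for j in range(x)]

grid = spot_grid(3, 4)
print(len(grid))  # 12 spots
```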
A second aspect of the embodiments of the present application provides an experience system for an AR-based smart home, where the experience system includes:
the light spot array generating module is used for generating a light spot array through a laser and an optical element which are arranged in the current scene;
the acquisition module is used for acquiring the light spot array and identifying the characteristics of the light spot array;
the acquisition module is used for acquiring a virtual model of the smart home corresponding to the light spot array according to the identification result; the system is also used for acquiring the central position of the light spot array so as to establish a three-dimensional coordinate system at the central position;
and the virtual model display module is used for creating and displaying the virtual model of the smart home in the three-dimensional coordinate system.
Optionally, before the virtual model of the smart home corresponding to the light spot array is obtained according to the identification result, the experience system is further configured to:
and creating a corresponding relation between the light spot array and a virtual model of the smart home, and storing the corresponding relation to a specified position.
Optionally, the light spot array generating module is specifically configured to:
controlling the laser to emit light spots with a specified intensity and in a specified arrangement;
and controlling the light spots to pass through the optical element to generate a light spot array corresponding to the intensity of the light spots.
Optionally, a character of a specified typeface is superimposed at the central position of the light spot array;
correspondingly, when the acquiring module is configured to acquire the central position of the spot array, the acquiring module is specifically configured to:
and acquiring the central position of the light spot array by identifying the characters of the specified typeface.
In the embodiments provided by the application, a light spot array is generated by a laser and an optical element arranged in the current scene; the light spot array is captured and its features are identified; the virtual smart-home model corresponding to the light spot array is obtained according to the identification result; the central position of the light spot array is located and a three-dimensional coordinate system is established at that position; and the virtual smart-home model is created and displayed in the three-dimensional coordinate system. By means of the laser-diffraction spot array, the invention avoids the labor, difficulty, and danger of arranging and pasting fixed markers in special environments. Using a spot array generated by a laser combined with a diffractive optical element as the marker solves the problems of high difficulty and danger when pasting fixed markers in locations such as high walls and ceilings, and has important significance and application value for augmented-reality smart homes and other augmented-reality scenarios.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings needed to be used in the embodiments will be briefly described below.
Fig. 1 is a schematic flow chart of an implementation process of an AR-based smart home experience method according to an embodiment of the present application;
FIG. 2 is a schematic diagram of a laser and diffractive optical element according to the present application;
fig. 3 is a schematic diagram of an array of laser spots provided by an embodiment of the present application;
FIG. 4 is a schematic structural diagram of a laser and an image capturing unit according to an embodiment of the present disclosure;
fig. 5 is a schematic structural diagram of an AR-based smart home experience system according to an embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application is described in further detail below with reference to the accompanying drawings and specific embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the application and do not constitute a limitation on the application.
In order to explain the technical solution described in the present application, the following description will be given by way of specific examples.
Embodiment one:
fig. 1 shows a schematic implementation flow diagram of an AR-based smart home experience method provided in an embodiment of the present application, including steps S11-S15, where:
in step S11, an array of spots is generated by the laser and optical elements provided in the current scene.
Optionally, the generating the array of spots by the laser and the optical element disposed in the current scene includes:
controlling the laser to emit light spots with a specified intensity and in a specified arrangement;
and controlling the light spots to pass through the optical element to generate a light spot array corresponding to the intensity of the light spots.
Optionally, the array of spots is an N × X array, wherein N and X are both positive integers not less than 3.
The invention uses a laser and a diffractive optical element to project a light spot array as the feature marker, thereby realizing an augmented-reality virtual furniture experience.
Specifically, the laser (U1) splits its beam through the diffractive optical element (U2), dividing one laser spot into an array of laser spots; the splitting principle is shown in fig. 2. To meet the requirement of feature identification in the augmented-reality process, the laser spots diffracted by the laser must have a specific number and size. To facilitate feature extraction, a coordinate system is established after the laser spot array is identified: a cross hair is generated together with the laser spot array, and the three-dimensional coordinate origin is determined by identifying the center of the cross hair. The designed laser spot array is shown in fig. 3.
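The beam splitting can be illustrated with the scalar grating equation sin(θ_m) = m·λ/d, which predicts where each diffraction order lands on a surface. The numeric values below (50 µm grating period, 2 m screen distance) are assumptions for illustration only, not parameters from the disclosure:

```python
import math

def diffraction_spot_offsets(wavelength_nm, period_um, screen_m, orders):
    """Lateral offsets (metres) of diffraction orders on a screen,
    from the grating equation sin(theta_m) = m * lambda / d.
    A hedged illustration of how a DOE splits one beam into spots."""
    lam = wavelength_nm * 1e-9
    d = period_um * 1e-6
    offsets = []
    for m in orders:
        s = m * lam / d
        if abs(s) >= 1:
            continue  # evanescent order: no propagating spot
        offsets.append(screen_m * math.tan(math.asin(s)))
    return offsets

# 532 nm beam, 50 um period, screen 2 m away, orders -1, 0, +1
offsets_m = diffraction_spot_offsets(532, 50, 2.0, range(-1, 2))
print(offsets_m)
```

The zeroth order stays on axis and the ±1 orders land symmetrically about it, which is why a DOE yields a regular spot pattern.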
The number of light spots in the laser spot array is not limited to that shown in fig. 3; the arrangement, the number of spots, and the wavelength of the light are determined by the scene, the size of the virtual object, and other factors. During image processing, in order to accurately extract the coordinate information of a laser spot, the spot must cover at least 3 × 3 pixels in the image.
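The 3 × 3 minimum can be illustrated with a simple intensity-weighted centroid that refuses to localize smaller spots. This is a sketch under stated assumptions, not the patented extraction algorithm:

```python
def spot_centroid(image, threshold=128):
    """Intensity-weighted centroid of a single spot in a grayscale
    image (list of rows). Returns None when the lit region spans
    fewer than 3 rows or 3 columns, matching the minimum spot size
    required for reliable coordinate extraction."""
    lit = [(r, c, v) for r, row in enumerate(image)
           for c, v in enumerate(row) if v >= threshold]
    if not lit:
        return None
    rows = {r for r, _, _ in lit}
    cols = {c for _, c, _ in lit}
    if len(rows) < 3 or len(cols) < 3:
        return None  # spot too small to localize accurately
    total = sum(v for _, _, v in lit)
    return (sum(r * v for r, _, v in lit) / total,
            sum(c * v for _, c, v in lit) / total)

# A 3x3 block of bright pixels centred at (2, 2) in a 5x5 image.
spot = [[0] * 5 for _ in range(5)]
for r in range(1, 4):
    for c in range(1, 4):
        spot[r][c] = 200
centroid = spot_centroid(spot)
print(centroid)  # (2.0, 2.0)
```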
After the beam emitted by the laser in this application passes through the diffractive beam splitter, it is divided into several beams; except for power and propagation angle, each beam has the same properties as the initial laser beam. In the embodiment of the invention, to ensure that the indicating laser can be recognized by both the camera and the human eye without harming the eye, an indicating laser from Changchun New Industries is selected, with a wavelength of 532 nm (green light). The diffractive optical element is either customized for the designed spot array or selected from general-purpose products, for example those of Wuhan New Tet photoelectric company.
Step S12, collecting the light spot array and identifying the characteristics of the light spot array;
step S13, acquiring a virtual model of the smart home corresponding to the light spot array according to the identification result;
step S14, acquiring the central position of the light spot array to establish a three-dimensional coordinate system at the central position;
and step S15, creating and displaying the virtual model of the smart home in the three-dimensional coordinate system.
When a user experiences the smart home through AR technology, the light spot array present in the current scene is captured by an image recognition device and its features are identified. For example, images can be acquired by a binocular stereo-vision camera and the captured light spot array subjected to feature recognition. By applying the Hough transform to the spots in the array and using an edge-detection method, the positions of the light spots and the center of the cross hair can be determined, so that a three-dimensional coordinate system is established; a virtual article is then created in this coordinate system to realize virtual-real fusion.
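As a simplified stand-in for the Hough-transform and edge-detection step, the crosshair center can be estimated from detected feature points as the intersection of the dominant row and the dominant column; all names and the toy point set are illustrative assumptions:

```python
from collections import Counter

def cross_center(points):
    """Estimate the crosshair centre from detected feature points:
    the point lying on both the most-populated row and the
    most-populated column. A simplified sketch, not the patented
    Hough-transform/edge-detection procedure."""
    row_counts = Counter(r for r, _ in points)
    col_counts = Counter(c for _, c in points)
    best_row = row_counts.most_common(1)[0][0]
    best_col = col_counts.most_common(1)[0][0]
    return (best_row, best_col)

# Points along a cross: a horizontal arm at row 5, a vertical arm at col 5.
cross = [(5, c) for c in range(1, 10)] + [(r, 5) for r in range(1, 10)]
center = cross_center(cross)
print(center)  # (5, 5)
```

The returned point would serve as the origin of the three-dimensional coordinate system in which the virtual article is built.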
When the augmented-reality smart home is realized, the laser projects the feature marker into the real scene, and the spot array in the real scene is captured by a device with an image-shooting unit, such as a PC, mobile phone, notebook computer, or wearable device. In this embodiment, a ZED binocular stereo-vision camera is adopted; it can process a real-time depth map at a resolution of up to 4416 × 1242 at 15 fps and can map object depth within a range of 20 meters. After the camera acquires an image, features of the spot array are extracted by Hough-transform and edge-detection methods. For the spot array shown in fig. 3, the central position of the cross hair is used as the coordinate origin, and a three-dimensional coordinate system is established.
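The stereo camera's depth recovery rests on the standard binocular disparity relation Z = f·B/d. The focal length, baseline, and disparity values below are illustrative assumptions, not ZED specifications:

```python
def stereo_depth(focal_px, baseline_m, disparity_px):
    """Depth from binocular disparity: Z = f * B / d.
    Illustrates how a stereo camera recovers the distance to each
    laser spot; all parameter values are assumptions."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# e.g. 700 px focal length, 12 cm baseline, 42 px disparity
depth_m = stereo_depth(700, 0.12, 42)
print(depth_m)  # about 2.0 m
```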
For the above process, so that the image-shooting unit can acquire the feature marker more conveniently, the laser may be mechanically connected to the image-shooting unit; fig. 4 shows the structure of the laser connected to the binocular stereo camera.
Furthermore, when the image-shooting unit equipped with the laser collects the feature marker, the field of view of the image-shooting unit coincides with the projection area of the feature marker, so the capture direction of the device does not need to be adjusted manually. When the user wants to change the position of the light spot array, only the projection direction of the laser needs to be adjusted; the image-shooting unit can then acquire the current spot-array image without re-aiming.
In the embodiments provided by the application, a light spot array is generated by a laser and an optical element arranged in the current scene; the light spot array is captured and its features are identified; the virtual smart-home model corresponding to the light spot array is obtained according to the identification result; the central position of the light spot array is located and a three-dimensional coordinate system is established at that position; and the virtual smart-home model is created and displayed in the three-dimensional coordinate system. By means of the laser-diffraction spot array, the invention avoids the labor, difficulty, and danger of arranging and pasting fixed markers in special environments. Using a spot array generated by a laser combined with a diffractive optical element as the marker solves the problems of high difficulty and danger when pasting fixed markers in locations such as high walls and ceilings, and has important significance and application value for augmented-reality smart homes and other augmented-reality scenarios.
Embodiment two:
fig. 5 shows a schematic structural diagram of an AR-based smart home experience system according to another embodiment of the present application, where the system includes:
the light spot array generating module is used for generating a light spot array through a laser and an optical element which are arranged in the current scene;
the acquisition module is used for acquiring the light spot array and identifying the characteristics of the light spot array;
the acquisition module is used for acquiring a virtual model of the smart home corresponding to the light spot array according to the identification result; the system is also used for acquiring the central position of the light spot array so as to establish a three-dimensional coordinate system at the central position;
and the virtual model display module is used for creating and displaying the virtual model of the smart home in the three-dimensional coordinate system.
Optionally, before the virtual model of the smart home corresponding to the light spot array is obtained according to the identification result, the experience system is further configured to:
and creating a corresponding relation between the light spot array and a virtual model of the smart home, and storing the corresponding relation to a specified position.
Optionally, the light spot array generating module is specifically configured to:
controlling the laser to emit light spots with a specified intensity and in a specified arrangement;
and controlling the light spots to pass through the optical element to generate a light spot array corresponding to the intensity of the light spots.
Optionally, a character of a specified typeface is superimposed at the central position of the light spot array;
correspondingly, when the acquiring module is configured to acquire the central position of the spot array, the acquiring module is specifically configured to:
and acquiring the central position of the light spot array by identifying the characters of the specified typeface.
Optionally, the characters of the specified typeface comprise cross-shaped characters.
Optionally, the array of spots is an N × X array, wherein N and X are both positive integers not less than 3.
The above-described embodiments of the present invention should not be construed as limiting the scope of the present invention. Any other corresponding changes and modifications made according to the technical idea of the present invention should be included in the protection scope of the claims of the present invention.

Claims (10)

1. An AR-based smart home experience method, the experience method comprising:
generating a light spot array through a laser and an optical element which are arranged in a current scene;
collecting the light spot array and identifying the characteristics of the light spot array;
acquiring a virtual model of the smart home corresponding to the light spot array according to the identification result;
acquiring the central position of the light spot array to establish a three-dimensional coordinate system at the central position;
and creating and displaying the virtual model of the smart home in the three-dimensional coordinate system.
2. The AR-based smart home experience method according to claim 1, wherein before obtaining the virtual model of the smart home corresponding to the spot array according to the recognition result, the method comprises:
and creating a corresponding relation between the light spot array and a virtual model of the smart home, and storing the corresponding relation to a specified position.
3. The AR-based smart home experience method of claim 1, wherein the generating an array of spots by lasers and optical elements disposed in a current scene comprises:
controlling the laser to emit light spots with a specified intensity and in a specified arrangement;
and controlling the light spots to pass through the optical element to generate a light spot array corresponding to the intensity of the light spots.
4. The AR-based smart home experience method of claim 1, wherein a character of a specified typeface is superimposed at the central position of the light spot array,
accordingly, the acquiring the central position of the array of light spots comprises:
and acquiring the central position of the light spot array by identifying the characters of the specified typeface.
5. The AR-based smart home experience method of claim 4, wherein the characters of the specified typeface comprise cross-shaped characters.
6. The AR-based smart home experience method according to any of claims 1-5, wherein the array of light spots is an N × X array, wherein N and X are both positive integers not less than 3.
7. An AR-based smart home experience system, the experience system comprising:
the light spot array generating module is used for generating a light spot array through a laser and an optical element which are arranged in the current scene;
the acquisition module is used for acquiring the light spot array and identifying the characteristics of the light spot array;
the acquisition module is used for acquiring a virtual model of the smart home corresponding to the light spot array according to the identification result; the system is also used for acquiring the central position of the light spot array so as to establish a three-dimensional coordinate system at the central position;
and the virtual model display module is used for creating and displaying the virtual model of the smart home in the three-dimensional coordinate system.
8. The AR-based smart home experience system of claim 7, wherein before the virtual model of the smart home corresponding to the light spot array is obtained according to the recognition result, the AR-based smart home experience system is configured to:
and creating a corresponding relation between the light spot array and a virtual model of the smart home, and storing the corresponding relation to a specified position.
9. The AR-based smart home experience system of claim 7 or 8, wherein the spot array generation module is specifically configured to:
controlling the laser to emit light spots with a specified intensity and in a specified arrangement;
and controlling the light spots to pass through the optical element to generate a light spot array corresponding to the intensity of the light spots.
10. The AR-based smart home experience system of claim 7, wherein a character of a specified typeface is superimposed at the central position of the light spot array,
correspondingly, when the acquiring module is configured to acquire the central position of the spot array, the acquiring module is specifically configured to:
and acquiring the central position of the light spot array by identifying the characters of the specified typeface.
CN201811484389.6A 2018-12-06 2018-12-06 AR-based intelligent home experience method and experience system Active CN111290566B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811484389.6A CN111290566B (en) 2018-12-06 2018-12-06 AR-based intelligent home experience method and experience system


Publications (2)

Publication Number Publication Date
CN111290566A 2020-06-16
CN111290566B 2021-09-17

Family

ID=71017526

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811484389.6A Active CN111290566B (en) 2018-12-06 2018-12-06 AR-based intelligent home experience method and experience system

Country Status (1)

Country Link
CN (1) CN111290566B (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101339654A (en) * 2007-07-04 2009-01-07 北京威亚视讯科技有限公司 Reinforced real environment three-dimensional registering method and system based on mark point
CN103460256A (en) * 2011-03-29 2013-12-18 高通股份有限公司 Anchoring virtual images to real world surfaces in augmented reality systems
CN106339090A (en) * 2016-08-31 2017-01-18 广东虹勤通讯技术有限公司 Keycap, gloves and input system
CN108431736A (en) * 2015-10-30 2018-08-21 奥斯坦多科技公司 The system and method for gesture interface and Projection Display on body


Also Published As

Publication number Publication date
CN111290566B (en) 2021-09-17

Similar Documents

Publication Publication Date Title
CN106157359B (en) Design method of virtual scene experience system
CN104933718B (en) A kind of physical coordinates localization method based on binocular vision
US10089794B2 (en) System and method for defining an augmented reality view in a specific location
CN108492356A (en) Augmented reality system and its control method
CN111046725B (en) Spatial positioning method based on face recognition and point cloud fusion of surveillance video
CN105222717B (en) A kind of subject matter length measurement method and device
JP2006325165A (en) Device, program and method for generating telop
CN104599317A (en) Mobile terminal and method for achieving 3D (three-dimensional) scanning modeling function
KR20090117531A (en) System for constructing mixed reality and method thereof
CN110568934B (en) Low-error high-efficiency multi-marker-diagram augmented reality system
US10271038B2 (en) Camera with plenoptic lens
CN105898287B (en) Machine vision analytical equipment based on bore hole stereoscopic display and method
CN105739106B (en) A kind of true three-dimensional display apparatus of body-sensing multiple views large scale light field and method
CN109389634A (en) Virtual shopping system based on three-dimensional reconstruction and augmented reality
CN105894571B (en) Method and device for processing multimedia information
CN109584361B (en) Equipment cable virtual preassembling and track measuring method and system
CN108430032A (en) A kind of method and apparatus for realizing that VR/AR device locations are shared
CN111290566B (en) AR-based intelligent home experience method and experience system
CN112862973A (en) Real-time remote training method and system based on fault site
CN109445598A (en) A kind of augmented reality system and device of view-based access control model
CN105893452B (en) Method and device for presenting multimedia information
CN104821135A (en) Method and device for realizing combination display of paper map and electronic map
Mori et al. An overview of augmented visualization: observing the real world as desired
CN105894581B (en) Method and device for presenting multimedia information
CN210488499U (en) Low-error high-efficiency multi-label-graph augmented reality system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant