CN107016733A - Augmented reality (AR)-based interactive system and interaction method - Google Patents
Augmented reality (AR)-based interactive system and interaction method Download PDF Info
- Publication number
- CN107016733A CN201710133993.3A CN201710133993A
- Authority
- CN
- China
- Prior art keywords
- augmented reality
- user
- user interaction
- information
- interaction device
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/20—Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10L—SPEECH ANALYSIS TECHNIQUES OR SPEECH SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING TECHNIQUES; SPEECH OR AUDIO CODING OR DECODING
- G10L15/00—Speech recognition
- G10L15/22—Procedures used during a speech recognition process, e.g. man-machine dialogue
Abstract
The invention provides an interactive system based on augmented reality (AR), comprising: a user interaction device, which receives a user's interactive instruction and parses it to obtain information for creating a virtual object; and an AR projection device, which communicates with the user interaction device, creates a three-dimensional virtual object according to that information, and projects it onto a projection area for display. The user interaction device includes a speech recognition unit, which recognizes the user's speech and, according to its content, determines the three-dimensional information of the object to be virtualized and/or modifies the three-dimensional information and projected spatial position of a virtual object that has already been created. By combining human-computer interaction with AR, the system enriches the user's play experience: virtual objects in the augmented-reality scene can be changed by the user's interactive instructions during play, going beyond the single-purpose display that conventional augmented reality provides.
Description
Technical field
The present invention relates to the field of intelligent robotics, and in particular to an interactive system and interaction method based on augmented reality (AR).
Background technology
AR (Augmented Reality) applies virtual information to the real world through computer technology, so that a real environment and virtual objects are superimposed in the same picture or space in real time. AR technology has already found practical applications. For example, during mechanical installation and maintenance, a head-mounted display can use AR to present the machine's internal structure — which could not otherwise be shown — together with its related information and data, and the operator can work according to the computer's prompts. In medicine, doctors can use AR to locate a surgical site easily and accurately. In the military field, AR can be used for orientation and to obtain important military data such as the geography of the current site.
In intelligent robotics, however, and especially in human-computer interaction scenarios, practical applications combining AR technology have not yet appeared.
Summary of the invention
To solve the above technical problems, the present invention provides a human-computer interactive system based on augmented reality (AR). The interactive system includes:
a user interaction device, which receives a user's interactive instruction and parses it to obtain information for creating a virtual object;
an augmented reality (AR) projection device, which communicates with the user interaction device, creates a three-dimensional virtual object according to the information for creating the virtual object, and projects it onto a projection area for display,
wherein the user interaction device includes at least a speech recognition unit, which recognizes the user's speech and, according to its content, determines the three-dimensional information of the object to be virtualized and/or modifies the three-dimensional information and projected spatial position of a virtual object that has already been created.
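The two components described above can be sketched as a minimal data flow; all class names, fields, and the toy keyword table below are illustrative assumptions, not details taken from the patent:

```python
from dataclasses import dataclass

@dataclass
class VirtualObjectInfo:
    """Information parsed from a user instruction (hypothetical schema)."""
    name: str
    dimensions: tuple                    # three-dimensional size (w, h, d)
    position: tuple = (0.0, 0.0, 0.0)    # spatial position in the projection area

class UserInteractionDevice:
    """Parses a spoken instruction into virtual-object information."""
    # Toy keyword table standing in for real speech recognition.
    KNOWN = {"castle": (4.0, 3.0, 4.0), "airplane": (6.0, 1.5, 5.0)}

    def parse_instruction(self, utterance: str) -> VirtualObjectInfo:
        for word, dims in self.KNOWN.items():
            if word in utterance.lower():
                return VirtualObjectInfo(name=word, dimensions=dims)
        raise ValueError("instruction not understood")

class ARProjectionDevice:
    """Receives object info from the interaction device and 'projects' it
    (here: records it and reports what it would display)."""
    def __init__(self):
        self.displayed = []

    def project(self, info: VirtualObjectInfo) -> str:
        self.displayed.append(info)
        return f"projecting {info.name} at {info.position}"

# Wire the two devices together as the claim describes.
device, projector = UserInteractionDevice(), ARProjectionDevice()
info = device.parse_instruction("build a Roman castle")
message = projector.project(info)
```

The point of the sketch is only the division of labour the claim fixes: parsing lives entirely in the interaction device, display entirely in the projection device, with a plain data record passing between them.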
With the interactive system of the present invention, the user can not only carry out a conventional chat interaction with the intelligent robot, in which the robot outputs its content to the user in the usual way, but can also gain a more realistic experience through the augmented-reality display. For example, while chatting with the intelligent robot about travel, the robot can show the user scenery of a recommended tourist destination by means of augmented reality.
According to one embodiment of the invention, in the AR-based interactive system, the user interaction device further includes a visual recognition unit, which recognizes gestures made by the user in the recognition area or objects within the visual range, so as to determine or modify the three-dimensional information and/or spatial position information of the object to be virtualized.
According to one embodiment of the present invention, the AR-based interactive system further includes a structured-light emitter and a structured-light tracker for tracking the light emitted by the structured-light emitter.
According to one embodiment of the present invention, in the AR-based interactive system, the structured-light emitter is a handheld device and the structured-light tracker is mounted on the AR projection device. The tracker communicates with the user interaction device and sends it the tracked structured-light information. When the light emitted by the structured-light emitter is moving, the tracker sends the trajectory of the light to the user interaction device, which generates information describing how the three-dimensional virtual object should move along that trajectory and sends it to the AR projection device for projection.
According to one embodiment of the present invention, in the AR-based interactive system, the user interaction device, alone or together with the AR projection device, is integrated into a robot entity.
According to one embodiment of the present invention, in the AR-based interactive system, a MIC array is provided in the speech recognition unit to capture the user's voice.
By incorporating augmented-reality display technology into the intelligent robot, the robot's multimodal output becomes more varied: in addition to the existing voice feedback, action feedback and display-screen output, the intelligent robot can combine virtual objects with the current real scene, giving the user a deeper sensory experience through the augmented-reality display.
According to another aspect of the present invention, a method of interacting with the above AR-based interactive system is also provided. The method comprises the following steps:
receiving, by the user interaction device, an interactive instruction sent by the user, and parsing from it the information for creating a virtual object;
communicating between the AR projection device and the user interaction device, creating a three-dimensional virtual object according to the information for creating the virtual object, and projecting it onto the projection area for display,
wherein the interactive instruction is a voice instruction, and the speech recognition unit of the user interaction device determines, according to the content of the speech, the three-dimensional information of the object to be virtualized and/or modifies the three-dimensional information and projected spatial position of a virtual object that has already been created.
According to one embodiment of the present invention, the method further includes recognizing, by the visual recognition unit of the interaction device, gestures made by the user in the recognition area or objects within the visual range, to determine or modify the three-dimensional information and/or spatial position information of the object to be virtualized.
According to one embodiment of the present invention, the interactive system used in the method includes a structured-light emitter and a structured-light tracker for tracking the light emitted by the structured-light emitter.
According to one embodiment of the present invention, in the method, the structured-light emitter is a handheld device and the structured-light tracker is mounted on the AR projection device; the tracker communicates with the user interaction device and sends it the tracked structured-light information. When the emitted light is moving, the method further uses the tracker to send the light's trajectory to the user interaction device, which generates information describing how the three-dimensional virtual object moves along the trajectory and sends it to the AR projection device for projection.
An advantage of the present invention is that, by combining human-computer interaction with AR, the user's play experience is enriched. For example, virtual objects in the augmented-reality scene can be changed by the user's interactive instructions during play, going beyond the single-purpose display that conventional augmented reality provides. Because the virtual objects can interact with the user, the user's experience of them is more realistic. In addition, the invention provides multiple interaction modalities — voice, gestures and a handheld device — for interacting with virtual objects.
Other features and advantages of the invention will be set forth in the following description, and will in part be apparent from the description or learned by practising the invention. The objects and other advantages of the invention may be realised and obtained by the structure particularly pointed out in the description, the claims and the accompanying drawings.
Brief description of the drawings
The accompanying drawings provide a further understanding of the invention and constitute a part of the specification; together with the embodiments they serve to explain the invention and are not to be construed as limiting it. In the drawings:
Fig. 1 shows an overall block diagram of the intelligent interactive system with augmented-reality projection according to one embodiment of the invention;
Fig. 2 shows an overall block diagram of the intelligent interactive system with augmented-reality projection according to another embodiment of the invention;
Fig. 3 shows a block diagram of the user interaction device according to the invention;
Fig. 4 shows an overall flowchart of human-computer interaction using the intelligent interactive system with augmented-reality projection according to the invention; and
Fig. 5 shows the signal flow among the user, the interaction device and the augmented-reality projection device as a virtual object is created and modified.
Detailed description of the embodiments
To make the objects, technical solutions and advantages of the present invention clearer, the embodiments of the invention are described in further detail below with reference to the accompanying drawings.
Fig. 1 shows a structural block diagram of the intelligent interactive system 100 with augmented-reality projection according to the invention. The intelligent interactive system 100 includes a user interaction device 101 and an AR projection device 102. In one embodiment, when the AR projection device 102 is fixedly mounted at some position, the location of the projection area can be set by adjusting the projection device 102 — for example, a plain indoor wall, or some region of a desktop or floor, can be designated as the projection area.
The user interaction device 101 obtains the interactive instruction sent by the user. Typically, the user speaks a passage of voice — for example "build a Roman castle" — as an instruction to the intelligent interactive system 100. After receiving the voice instruction, the user interaction device 101 performs speech recognition and parsing to obtain the user's intention. Next, the intelligent interactive system searches its repository for three-dimensional data of the Roman castle and sends that three-dimensional information to the AR projection device 102. Alternatively, the intelligent interactive system derives three-dimensional information from two-dimensional picture information stored in a picture library; the projection device then reproduces the virtual scene of the Roman castle in the preset projection area and shows it to the user.
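The two retrieval paths in this paragraph — looking up stored 3D data, else deriving it from a 2D picture — can be sketched as follows. The repositories and the naive 2D-to-3D extrusion rule are invented for illustration only:

```python
# Hypothetical repositories: a 3D model store and a 2D picture library.
MODEL_STORE = {"roman castle": {"vertices": 1200, "size": (4.0, 3.0, 4.0)}}
PICTURE_LIBRARY = {"eiffel tower": {"width_px": 300, "height_px": 800}}

def lookup_3d(name: str) -> dict:
    """Return 3D info for the named object, deriving it from a 2D
    picture when no stored model exists (toy extrusion rule)."""
    key = name.lower()
    if key in MODEL_STORE:
        return MODEL_STORE[key]
    if key in PICTURE_LIBRARY:
        pic = PICTURE_LIBRARY[key]
        # Assume 100 px per metre, and extrude depth equal to width.
        w = pic["width_px"] / 100
        h = pic["height_px"] / 100
        return {"vertices": None, "size": (w, h, w)}
    raise KeyError(f"no data for {name!r}")

castle = lookup_3d("Roman Castle")   # found directly in the 3D store
tower = lookup_3d("Eiffel Tower")    # derived from the 2D picture library
```

A real system would replace the extrusion rule with a proper single-view reconstruction step, but the lookup order — stored model first, 2D fallback second — is what the paragraph describes.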
In a game played on a virtual chessboard, when the user says "I want to play chess", the user interaction device 101 receives and parses the instruction and sends the three-dimensional information of the virtual chessboard object to the AR projection device 102. The projection device 102 projects the scene of the virtual chessboard onto the preset projection area, for example a desktop. Thus, as soon as the sentence is spoken, the user is immediately placed in the environment of a virtual chessboard.
The user interaction device 101 can also capture a gesture made by the user and recognize its meaning. For example, if the gesture is to move some piece on the chessboard, the device 101 captures this information through its visual perception unit, computes which piece the user wants to move and the target position of the move, and immediately transmits this information to the projection device 102. According to the received instruction to change the position of that piece, the projection device 102 renders the piece again at its new place, so that the displayed virtual scene looks just as if the user had moved the piece to the target position.
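The gesture-driven move can be sketched as a function from a recognized gesture to an update of the projected board state. The board representation (square coordinates to piece names) and the gesture format are assumptions for illustration:

```python
def apply_gesture_move(board: dict, gesture: dict) -> dict:
    """Move the piece nearest the gesture's start point to the
    square nearest its end point, returning the updated board."""
    start, end = gesture["start"], gesture["end"]

    def dist2(a, b):
        # Squared distance is enough for choosing the nearest square.
        return (a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2

    # Find the occupied square closest to where the gesture began.
    piece_sq = min(board, key=lambda sq: dist2(sq, start))
    piece = board.pop(piece_sq)
    # Snap the gesture's end point to the nearest integer square.
    target = (round(end[0]), round(end[1]))
    board[target] = piece
    return board

# A two-piece toy board: square -> piece name.
board = {(0, 0): "rook", (4, 4): "knight"}
board = apply_gesture_move(board, {"start": (0.2, 0.1), "end": (0.9, 3.2)})
```

Snapping to the nearest square is the step that makes the projected result look like a legal board move rather than a free-floating drag.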
As shown in Fig. 2 which show the structural frames for another application scheme that modification is carried out according to the principle of the present invention
Figure.In the present example, augmented reality AR projector equipments 102 are provided separately as independent equipment and user interaction device.It will use
Family interactive device is located in intelligent robot, or using existing some functional units in intelligent robot in combination as
The user interaction device of the present invention is used.
Specifically, as shown in Fig. 2, the system includes a robot 200, a structured-light emitter 300 held by the user, and an AR projection device 102. A structured-light tracker 301 is fixedly mounted on the AR projection device and also communicates with the robot 200. The tracker 301 tracks the movement trajectory of the light emitted by the handheld structured-light emitter 300. It should be noted that although the figure shows the structured-light tracker mounted on or near the projection device, this is not necessary: depending on the application scenario, the structured-light tracker may also be arranged inside the robot 200 as part of its visual recognition system.
In the present invention, an operating system 203 is installed in the robot 200; under this operating system, the hardware of each part of the robot is called through its corresponding driver to perform its function. Speech uttered or actions made by the user within a certain range or region are captured by the robot's speech recognition system and visual recognition system. For example, when the user utters a passage of speech, the speech recognition system acquires it and parses its semantics through various speech analysis algorithms. In the simplest mode, the robot uses engine 205 to find, in language repository 206, a sentence matching what the user said, and responds accordingly.
If, for example, the robot's speech recognition system parses the user's sentence as an order to create a certain virtual object, it then analyses and computes the essential information — such as three-dimensional information — required to create that virtual object. The hardware of the robot's speech recognition system may be a MIC array; a robot equipped with a MIC array can pick up the user's voice over a wide range. The three-dimensional information of the virtual object to be created is then transmitted through the communication interface 202 of the robot to the AR projection device 102 for projected display.
When the user wants to change the spatial position of a virtual object that has been created, the user can point at the virtual object by gesture and then move the finger to the desired position. The movement trajectory of the virtual object can also be controlled through the handheld structured-light emitter 300.
In another example, the user can point the handheld structured-light emitter 300 at some physical object to have it virtualized. In this case, the robot's visual recognition system starts working, obtains in real time the three-dimensional information of the object such as its physical size and appearance, and sends this information through the communication interface to the AR projection device for virtual projected display. Following the ray trajectory of the structured-light emitter, the generated virtual object can also move along that trajectory, so that the effect appears as if the physical object were carried by the handheld device to another physical location.
In shape, the structured-light emitter 300 may be designed as a magic wand, a baton or a pen according to the practical application. The structured-light tracker 301 tracks the user's interactive instructions issued through the emitter 300 to complete the creation of a virtual object or the modification of its spatial position.
In the present invention, the structured-light emitter is a handheld device and the structured-light tracker is mounted on the AR projection device; the tracker communicates with the user interaction device and sends it the tracked structured-light information. When the light emitted by the structured-light emitter is moving, the tracker sends the light's trajectory to the user interaction device, which generates information describing how the three-dimensional virtual object moves along the trajectory and sends it to the AR projection device for projection. In this implementation, the structured-light tracker may be a structured-light tracking camera.
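The tracker-to-projector path in this embodiment can be sketched as: sample the light spot, turn the samples into a trajectory, and emit the per-frame positions the virtual object should follow. The sample format `(t, x, y)` and the midpoint smoothing are assumptions, not details from the patent:

```python
def trajectory_to_motion(samples, hold_last=True):
    """Convert tracked light-spot samples [(t, x, y), ...] into the
    sequence of positions the virtual object should take, inserting
    a linear midpoint between consecutive samples to smooth motion."""
    samples = sorted(samples)            # order by timestamp
    frames = []
    for (t0, x0, y0), (t1, x1, y1) in zip(samples, samples[1:]):
        frames.append((x0, y0))
        # Interpolated midpoint smooths the projected motion.
        frames.append(((x0 + x1) / 2, (y0 + y1) / 2))
    if hold_last and samples:
        frames.append(samples[-1][1:])   # hold the final position
    return frames

# Light spot moves right, then up, over 0.2 seconds of tracking.
tracked = [(0.0, 0.0, 0.0), (0.1, 2.0, 0.0), (0.2, 2.0, 2.0)]
motion = trajectory_to_motion(tracked)
```

The user interaction device would package `motion` as the "information that the virtual object moves along the trajectory" and forward it to the projection device; a real pipeline would also resample to the projector's frame rate.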
In one embodiment, several functional units can be integrated in the user interaction device 101 according to the invention, including a speech recognition unit 101a, a visual recognition unit 101b and a structured-light tracker 101c, which together receive the multimodal interactive instructions sent by the user, as shown in Fig. 3. In this implementation, the visual recognition unit 101b may use a camera; it recognizes gestures made by the user in the recognition area or objects within the visual range, so as to determine or modify the three-dimensional information and/or spatial position information of the object to be virtualized. The structured-light tracker 101c sends the three-dimensional information of the physical object reflected by the light to the user interaction device, so that virtual-object information for the physical object can be created and displayed in the projection area by the AR projection device.
In another embodiment, the structured-light tracker tracks the structured-light information that the emitter projects directly onto the projection area, follows its movement trajectory, and returns it to the user interaction device.
According to one embodiment of the present invention, since the user interaction device 101 performs both speech recognition and action recognition, it includes at least the speech recognition unit 101a and the visual recognition unit 101b. When the user holds the structured-light device, the user interaction device 101 should also include an optical-tracking camera for tracking the structured light. These functional units can assist one another in operation. For example, when a virtual object is created according to a voice instruction, its shape and position can be modified by gesture or by controlling the structured light; conversely, when a virtual object is created from the structured light, its shape and position can be changed by voice.
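The cooperation described here — any modality may create an object, and any other modality may then modify it — can be sketched as a small dispatcher over one shared object state. The event schema below is an assumption for illustration:

```python
class MultimodalInteraction:
    """Routes voice, gesture and structured-light events onto one
    shared virtual-object state, so the modalities complement each other."""
    def __init__(self):
        self.objects = {}    # name -> {"size": float, "pos": tuple}

    def handle(self, event: dict):
        kind = event["modality"]
        if kind == "voice" and event["action"] == "create":
            self.objects[event["name"]] = {"size": 1.0, "pos": (0, 0)}
        elif kind == "gesture" and event["action"] == "scale":
            self.objects[event["name"]]["size"] *= event["factor"]
        elif kind == "light" and event["action"] == "move":
            self.objects[event["name"]]["pos"] = event["to"]
        else:
            raise ValueError(f"unsupported event: {event}")

# Create by voice, resize by gesture, reposition by structured light.
ui = MultimodalInteraction()
ui.handle({"modality": "voice", "action": "create", "name": "chessboard"})
ui.handle({"modality": "gesture", "action": "scale",
           "name": "chessboard", "factor": 2.0})
ui.handle({"modality": "light", "action": "move",
           "name": "chessboard", "to": (3, 1)})
```

Because all modalities write to the same `objects` table, the order of modalities is interchangeable — which is exactly the cross-assistance the paragraph claims.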
Fig. 4 shows the overall flowchart of human-computer interaction using the intelligent interactive system with augmented-reality projection according to the invention. The method of Fig. 4 starts at step S401. In this step, the intelligent interactive system receives through the user interaction device an interactive instruction sent by the user, and parses from it the information about creating a virtual object — for example, by matching the trajectory of a gesture the user makes against preset virtual objects, or by recognizing the shape and physical size of the object the user points at with the handheld device so as to virtualize the real object for display. Furthermore, the user's interactive instruction may also be an instruction to modify an existing virtual object, for example to change its scale or position.
Next, in step S402, the AR projection device communicates with the user interaction device to obtain the information about creating the virtual object. In a specific example, the user interaction device has a communication interface, such as an 802.11b wireless LAN interface, a ZigBee interface or a mobile-network interface. Through the local-area-network interface, the user interaction device can communicate with a nearby AR projection device to transmit the virtual-object information. When the computing power of the user interaction device cannot meet the requirement and the user's intention cannot be correctly recognized, the device can communicate with the cloud through the mobile-network interface to offload the recognition computation.
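The fallback described in this step — recognize locally, and consult the cloud only when the local device cannot resolve the intention — might look like the following sketch. Both recognizers are stubs; a real system would call a speech service over the mobile-network interface:

```python
def recognize_local(utterance: str):
    """Cheap on-device recognizer: only knows a few fixed commands."""
    commands = {"play chess": "create:chessboard"}
    return commands.get(utterance.lower())

def recognize_cloud(utterance: str):
    """Stub for the heavier recognizer reached over the mobile network;
    assumed here to understand any 'I need an X' request."""
    return "create:" + utterance.lower().replace("i need an ", "")

def recognize(utterance: str):
    # Try locally first; fall back to the cloud when local computing
    # power cannot resolve the user's intention.
    intent = recognize_local(utterance)
    if intent is None:
        intent = recognize_cloud(utterance)
    return intent

local_hit = recognize("play chess")        # resolved on-device
cloud_hit = recognize("I need an airplane")  # resolved via cloud stub
```

The design choice worth noting is that the caller never sees which path answered — the fallback is transparent, so the projection step S402 onward is unchanged either way.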
Finally, in step S403, after receiving the information for creating or modifying the virtual object, the AR projection device performs the usual virtual-object creation operation and projects the result for display.
Because the method for the present invention describes what is realized in computer systems.The computer system can for example be set
In the control core processor of robot.For example, method described herein can be implemented as what can be performed with control logic
Software, it is performed by the CPU in robot control system.Function as described herein, which can be implemented as being stored in non-transitory, to be had
Programmed instruction set in shape computer-readable medium.When implemented in this fashion, the computer program includes one group of instruction,
When group instruction is run by computer, it, which promotes computer to perform, can implement the method for above-mentioned functions.FPGA can be temporary
When or be permanently mounted in non-transitory tangible computer computer-readable recording medium, for example ROM chip, computer storage,
Disk or other storage mediums.In addition to being realized with software, logic as described herein can utilize discrete parts, integrated electricity
Road, programmable the patrolling with programmable logic device (such as, field programmable gate array (FPGA) or microprocessor) combined use
Volume, or embodied including any other equipment that they are combined.All such embodiments are intended to fall under the model of the present invention
Within enclosing.
In the robot example, the computation of the information needed to create a virtual object can be done locally in the robot, or the robot can send it to the cloud for centralised processing. When the robot's local computing power is insufficient, it can obtain the three-dimensional information for creating the virtual object by sending a request to the cloud.
Fig. 5 shows in more detail the signal flowchart of the interaction among the user, the user interaction device and the augmented-reality projection device.
In this example, a cinema showing a science-fiction film serves as the virtually combined scene. A user walks into the cinema while the film is playing; when his position falls within the speech recognition area and the visual recognition area of the user interaction device, he may issue a voice instruction, for example "I need an airplane" — precisely an instruction to create a virtual object.
Within the speech recognition area, the user interaction device captures this audio data and parses it through the speech recognition system, for example by searching a corpus for a matching statement, and obtains the instruction that the user wants to create an "airplane" object. After the voice instruction has been parsed correctly, the user interaction device may return confirmation information to the user, and then goes on to compute the shape data of the airplane to be created — for example by a picture search that finds a matching airplane picture and converts its two-dimensional data into spatial three-dimensional data. It then sends the resulting three-dimensional information of the virtual airplane to the augmented-reality projection device through the communication interface.
According to the received three-dimensional information of the virtual airplane, the augmented-reality projection device creates a virtual airplane and displays it in the projection area.
At this point, if the user finds that the size of the displayed virtual airplane needs to change, he can issue a corresponding modification instruction by gesture or by language. He can further move the virtual airplane to a specified location by controlling the handheld light-emitting device.
The user interaction device then parses the modification instruction sent by the user, computes the modification amount from it, and transmits the result.
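Parsing a modification instruction and computing the modification amount, as this step describes, might look like the following sketch; the instruction grammar ("bigger"/"smaller", "move to x y z") is invented for illustration:

```python
def parse_modification(instruction: str, current: dict) -> dict:
    """Turn a spoken modification instruction into an updated object
    state.  `current` holds {"scale": float, "pos": (x, y, z)}."""
    words = instruction.lower().split()
    updated = dict(current)
    if "bigger" in words:
        updated["scale"] = current["scale"] * 2.0
    elif "smaller" in words:
        updated["scale"] = current["scale"] / 2.0
    if "move" in words and "to" in words:
        # Expect trailing coordinates, e.g. "move to 1 2 0".
        coords = tuple(float(w) for w in words[words.index("to") + 1:])
        updated["pos"] = coords
    return updated

# The cinema example: enlarge the virtual airplane, then reposition it.
plane = {"scale": 1.0, "pos": (0.0, 0.0, 0.0)}
plane = parse_modification("make the airplane bigger", plane)
plane = parse_modification("move to 1 2 0", plane)
```

Returning a fresh state dict rather than mutating in place keeps the "modification amount" explicit — the device can diff `updated` against `current` and transmit only what changed.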
When the user wants to bring an audience member in the cinema into the virtual scene, the user can point the hand-held light-emitting device at that audience member; the visual recognition unit in the user interaction device then captures the three-dimensional data of the audience member and sends it to the augmented reality projection device. As the light tracker in the user interaction device follows the trajectory of the light emitted by the hand-held light-emitting device, the virtualized character can board the virtual aircraft wherever the light of the light-emitting device points. This combination of the virtual and the real brings the user a wonderful experience.
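The light-trajectory tracking step can be illustrated with a small sketch. The moving-average smoothing used here is an assumption standing in for whatever filtering the tracker actually applies, and `follow_light_trajectory` is a name invented for this example.

```python
from typing import List, Tuple

Point3D = Tuple[float, float, float]

def follow_light_trajectory(samples: List[Point3D]) -> List[Point3D]:
    """Turn the tracked light-spot samples into a motion path for the
    virtual object, smoothing jitter with a 3-sample moving average."""
    path: List[Point3D] = []
    for i in range(len(samples)):
        window = samples[max(0, i - 1): i + 2]  # previous, current, next
        n = len(window)
        path.append(tuple(sum(p[k] for p in window) / n for k in range(3)))
    return path

samples = [(0.0, 0.0, 0.0), (2.0, 0.0, 0.0), (4.0, 0.0, 0.0)]
print(follow_light_trajectory(samples)[1])  # prints (2.0, 0.0, 0.0)
```

Each smoothed point would then be sent to the projection device as the next position of the virtual object along the light's path.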
In the present invention, the recognition unit of the user interaction device is critical: its recognition capability determines whether the creation of a virtual object succeeds. Therefore, for speech recognition, a MIC array is provided to extend the speech recognition range, and for visual recognition, a structured light emitter and a light tracking unit are added to assist visual recognition.
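One reason a MIC array extends the speech recognition range is that inter-microphone time delays reveal the speaker's direction, so the device can steer toward the voice. A minimal two-microphone direction-of-arrival sketch, assuming a far-field source and the standard delay = spacing · sin(θ) / c geometry (the function name is invented for this illustration):

```python
import math

SPEED_OF_SOUND = 343.0  # m/s, dry air at ~20 °C

def doa_from_delay(delay_s: float, mic_spacing_m: float) -> float:
    """Estimate the direction of arrival, in degrees from broadside, of a
    far-field sound source from the time delay between two microphones."""
    # delay * c = spacing * sin(theta); clamp to guard against noise.
    s = max(-1.0, min(1.0, delay_s * SPEED_OF_SOUND / mic_spacing_m))
    return math.degrees(math.asin(s))

print(doa_from_delay(0.0, 0.1))  # zero delay -> source straight ahead: 0.0
```

A real array would combine many such pairs and beamform toward the estimated angle; this sketch only shows the geometric principle.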
The present invention applies not only to the cinema scene, but also to a concert scene in which the user imitates a conductor directing a symphony orchestra. In addition, in a magic show, it can help ordinary people perform skills such as a magician's moving of objects in mid-air.
It should be understood that the disclosed embodiments of the present invention are not limited to the specific structures, process steps, or materials disclosed herein, but extend to equivalents of these features as would be understood by those of ordinary skill in the relevant art. It should also be understood that the terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting.
" one embodiment " or " embodiment " mentioned in specification means special characteristic, the structure described in conjunction with the embodiments
Or during characteristic is included at least one embodiment of the present invention.Therefore, the phrase " reality that specification various places throughout occurs
Apply example " or " embodiment " same embodiment might not be referred both to.
Although embodiments of the present invention are disclosed above, the described content is only an implementation adopted to facilitate understanding of the present invention and is not intended to limit it. Any person skilled in the art to which this invention pertains may make modifications and changes in the form and details of implementation without departing from the spirit and scope disclosed by the present invention, but the scope of patent protection of the present invention shall still be subject to the scope defined by the appended claims.
Claims (10)
1. An interactive system based on augmented reality AR, characterized in that the interactive system comprises:
a user interaction device, configured to receive an interactive instruction from a user and parse it to obtain information about creating a virtual object;
an augmented reality AR projection device, which communicates with the user interaction device, creates a three-dimensional virtual object according to the information about creating a virtual object, and projects it onto a projection area for display,
wherein the user interaction device comprises at least a voice recognition unit, which recognizes the user's voice so as to determine, according to the content of the voice, the three-dimensional information of the object to be virtualized and/or a modification to the three-dimensional information of an already-created virtual object, and the spatial position information of the projected display.
2. The interactive system based on augmented reality AR according to claim 1, characterized in that the user interaction device further comprises a visual recognition unit, which recognizes gestures made by the user in a recognition area or objects within the visual range, so as to determine or change the three-dimensional information and/or spatial position information of the object to be virtualized.
3. The interactive system based on augmented reality AR according to claim 1, characterized by further comprising a structured light emitter and a structured light tracker for tracking the light emitted by the structured light emitter.
4. The interactive system based on augmented reality AR according to claim 3, characterized in that the structured light emitter is a hand-held device, the structured light tracker is arranged on the augmented reality AR projection device, and the structured light tracker communicates with the user interaction device and sends the tracked structured light information to the user interaction device; when the light emitted by the structured light emitter is in motion, the structured light tracker sends the trajectory of the tracked light to the user interaction device, and the user interaction device generates information that the three-dimensional virtual object moves along the trajectory and sends it to the augmented reality AR projection device for projection.
5. The interactive system based on augmented reality AR according to claim 1, characterized in that the user interaction device, separately or together with the augmented reality AR projection device, is integrated into a robot entity.
6. The interactive system based on augmented reality AR according to claim 1, characterized in that a MIC array is arranged in the voice recognition unit to capture the user's voice.
7. A method of interacting using the interactive system based on augmented reality AR, characterized in that the method comprises the following steps:
receiving, by a user interaction device, an interactive instruction sent by a user, and parsing the interactive instruction to obtain information about creating a virtual object;
communicating between an augmented reality AR projection device and the user interaction device, creating a three-dimensional virtual object according to the information about creating a virtual object, and projecting it onto a projection area for display,
wherein the interactive instruction is a voice instruction, and a voice recognition unit of the user interaction device determines, according to the content of the voice, the three-dimensional information of the object to be virtualized and/or a modification to the three-dimensional information of an already-created virtual object, and the spatial position information of the projected display.
8. The method of interacting using the interactive system based on augmented reality AR according to claim 7, characterized in that the method further comprises recognizing, by a visual recognition unit in the interaction device, gestures made by the user in a recognition area or objects within the visual range, so as to determine or change the three-dimensional information and/or spatial position information of the object to be virtualized.
9. The method of interacting using the interactive system based on augmented reality AR according to claim 7, characterized in that the interactive system comprises a structured light emitter and a structured light tracker for tracking the light emitted by the structured light emitter.
10. The method of interacting using the interactive system based on augmented reality AR according to claim 9, characterized in that the structured light emitter is a hand-held device, the structured light tracker is arranged on the augmented reality AR projection device, and the structured light tracker communicates with the user interaction device and sends the tracked structured light information to the user interaction device; when the light emitted by the structured light emitter is in motion, the method further uses the structured light tracker to send the trajectory of the tracked light to the user interaction device, and the user interaction device generates information that the three-dimensional virtual object moves along the trajectory and sends it to the augmented reality AR projection device for projection.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710133993.3A CN107016733A (en) | 2017-03-08 | 2017-03-08 | Interactive system and exchange method based on augmented reality AR |
Publications (1)
Publication Number | Publication Date |
---|---|
CN107016733A true CN107016733A (en) | 2017-08-04 |
Family
ID=59440194
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201710133993.3A Pending CN107016733A (en) | 2017-03-08 | 2017-03-08 | Interactive system and exchange method based on augmented reality AR |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN107016733A (en) |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102789313A (en) * | 2012-03-19 | 2012-11-21 | 乾行讯科(北京)科技有限公司 | User interaction system and method |
CN103649676A (en) * | 2011-04-15 | 2014-03-19 | 法罗技术股份有限公司 | Six degree-of-freedom laser tracker that cooperates with a remote structured-light scanner |
CN103914129A (en) * | 2013-01-04 | 2014-07-09 | 云联(北京)信息技术有限公司 | Man-machine interactive system and method |
CN104898276A (en) * | 2014-12-26 | 2015-09-09 | 成都理想境界科技有限公司 | Head-mounted display device |
US20160205378A1 (en) * | 2015-01-08 | 2016-07-14 | Amir Nevet | Multimode depth imaging |
CN205450970U (en) * | 2016-02-17 | 2016-08-10 | 福建师范大学 | Guide holographically projected robot system |
CN106408480A (en) * | 2016-11-25 | 2017-02-15 | 山东孔子文化产业发展有限公司 | Sinology three-dimensional interactive learning system and method based on augmented reality and speech recognition |
2017
- 2017-03-08: CN201710133993.3A filed; published as CN107016733A, status active, Pending
Cited By (22)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108519816A (en) * | 2018-03-26 | 2018-09-11 | 广东欧珀移动通信有限公司 | Information processing method, device, storage medium and electronic equipment |
CN109669541A (en) * | 2018-09-04 | 2019-04-23 | 亮风台(上海)信息科技有限公司 | It is a kind of for configuring the method and apparatus of augmented reality content |
CN109669541B (en) * | 2018-09-04 | 2022-02-25 | 亮风台(上海)信息科技有限公司 | Method and equipment for configuring augmented reality content |
CN109407899A (en) * | 2018-09-12 | 2019-03-01 | 北京星云城科技有限公司 | A kind of desktop alternative projection system |
CN112868023A (en) * | 2018-10-15 | 2021-05-28 | 艾玛迪斯简易股份公司 | Augmented reality system and method |
CN112868023B (en) * | 2018-10-15 | 2024-05-14 | 艾玛迪斯简易股份公司 | Augmented reality system and method |
CN111290729A (en) * | 2018-12-07 | 2020-06-16 | 阿里巴巴集团控股有限公司 | Man-machine interaction method, device and system |
CN109697918A (en) * | 2018-12-29 | 2019-04-30 | 深圳市掌网科技股份有限公司 | A kind of percussion instrument experiencing system based on augmented reality |
CN109976519A (en) * | 2019-03-14 | 2019-07-05 | 浙江工业大学 | A kind of interactive display unit and its interactive display method based on augmented reality |
CN109976519B (en) * | 2019-03-14 | 2022-05-03 | 浙江工业大学 | Interactive display device based on augmented reality and interactive display method thereof |
CN110286754A (en) * | 2019-06-11 | 2019-09-27 | Oppo广东移动通信有限公司 | Projective techniques and relevant device based on eyeball tracking |
CN110286754B (en) * | 2019-06-11 | 2022-06-24 | Oppo广东移动通信有限公司 | Projection method based on eyeball tracking and related equipment |
CN110794959A (en) * | 2019-09-25 | 2020-02-14 | 苏州联游信息技术有限公司 | Gesture interaction AR projection method and device based on image recognition |
CN110941416A (en) * | 2019-11-15 | 2020-03-31 | 北京奇境天成网络技术有限公司 | Interaction method and device for human and virtual object in augmented reality |
CN111275731A (en) * | 2020-01-10 | 2020-06-12 | 杭州师范大学 | Projection type real object interactive desktop system and method for middle school experiment |
CN111275731B (en) * | 2020-01-10 | 2023-08-18 | 杭州师范大学 | Projection type physical interaction desktop system and method for middle school experiments |
CN112017488A (en) * | 2020-08-28 | 2020-12-01 | 济南浪潮高新科技投资发展有限公司 | AR-based education robot system and learning method |
CN113012300A (en) * | 2021-04-02 | 2021-06-22 | 北京隐虚等贤科技有限公司 | Immersive interactive content creation method and device and storage medium |
CN113313836A (en) * | 2021-04-26 | 2021-08-27 | 广景视睿科技(深圳)有限公司 | Method for controlling virtual pet and intelligent projection equipment |
WO2022227290A1 (en) * | 2021-04-26 | 2022-11-03 | 广景视睿科技(深圳)有限公司 | Method for controlling virtual pet and intelligent projection device |
CN115033997A (en) * | 2022-07-06 | 2022-09-09 | 山东诺环建工有限公司 | Building construction design system based on AR |
CN115033997B (en) * | 2022-07-06 | 2023-08-11 | 山东诺环建工有限公司 | AR-based building construction design system |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN107016733A (en) | Interactive system and exchange method based on augmented reality AR | |
US11514653B1 (en) | Streaming mixed-reality environments between multiple devices | |
CN103561829B (en) | Action triggers does gesture | |
JP7315318B2 (en) | Combining Physical and Virtual Objects in Augmented Reality | |
CN102163077B (en) | Capturing screen objects using a collision volume | |
US20220383634A1 (en) | 3d object annotation | |
US20240087261A1 (en) | Session manager | |
Thomas et al. | First person indoor/outdoor augmented reality application: ARQuake | |
US20130219357A1 (en) | Coherent presentation of multiple reality and interaction models | |
CN102688603A (en) | System of and method for real-time magic-type stage performance based on technologies of augmented reality and action recognition | |
US20100194762A1 (en) | Standard Gestures | |
US20130286004A1 (en) | Displaying a collision between real and virtual objects | |
CN103608750A (en) | Action selection gesturing | |
CN108337573A (en) | A kind of implementation method that race explains in real time and medium | |
EP3938065A1 (en) | Systems and methods for training an artificial intelligence model for competition matches | |
JP6242473B1 (en) | Method for providing virtual space, program for causing computer to execute the method, and information processing apparatus for executing the program | |
KR102054148B1 (en) | system for playing sports-related interactive contents software inducing player's kinetic behavior | |
US20240267480A1 (en) | Distributed command execution in multi-location studio environments | |
CN103608073A (en) | Shape trace gesturing | |
CN113711162A (en) | System and method for robotic interaction in mixed reality applications | |
Ghandeharizadeh | Holodeck: Immersive 3D Displays Using Swarms of Flying Light Specks | |
CN110180167B (en) | Method for tracking mobile terminal by intelligent toy in augmented reality | |
CN208723929U (en) | A kind of AR remote controler | |
CN108646578A (en) | A kind of no medium floating projected virtual picture and real interaction technique | |
KR101881227B1 (en) | Flight experience method using unmanned aerial vehicle |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
RJ01 | Rejection of invention patent application after publication | ||
Application publication date: 20170804 |