CN107977082A - Method and system for presenting AR information - Google Patents
- Publication number: CN107977082A (application CN201711377194.7A)
- Authority: CN (China)
- Prior art keywords: information, user, presented, current, current pose
- Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
Abstract
The purpose of the application is to provide a method and system for presenting AR information. The method includes: while AR information is being presented to a user, obtaining the user's current pose information; adjusting the projection angle information of the AR information based on the current pose information, so that the adjusted projection angle information matches the current pose information; and presenting the AR information to the user based on the projection angle information. Compared with the prior art, the application obtains the user's current pose information and adjusts the projection angle of the AR information accordingly, thereby providing the user with AR information that remains consistent with his or her pose.
Description
Technical field
This application relates to the field of computers, and in particular to a method and system for presenting AR information.
Background art
Augmented reality (AR) is a technology that overlays a virtual world on the real world on a screen and supports interaction between the two. It superimposes digital information (virtual three-dimensional model animations, video, text, and pictures) onto a real scene in real time, enabling natural human-computer interaction with real objects or with the user, and emphasizing natural visual interaction that fuses the virtual and the real.
At present, AR technology can provide, within a certain field of view (for example, the field of view facing an actual scene), a visual effect in which virtual information is superimposed on the actual scene. However, when the user moves through a certain angle or changes distance, the superimposed visual effect does not change with the user's movement, so the effect the user sees cannot accommodate changes in the user's pose.
Summary of the invention
The purpose of the application is to provide a method and system for presenting AR information, so as to solve the problem that the visual effect of an AR presentation cannot be adjusted to follow the user's movement.
According to one aspect of the application, there is provided a method for presenting AR information, the method including: while AR information is being presented to a user, obtaining the user's current pose information; adjusting the projection angle information of the AR information based on the current pose information, so that the adjusted projection angle information matches the current pose information; and presenting the AR information to the user based on the projection angle information.
In some embodiments, the current pose information further includes the user's current location information, and the method further includes: determining scaling information for the AR information based on the current location information. Presenting the AR information to the user based on the projection angle information then includes: presenting the AR information to the user based on the projection angle information and the scaling information.
In some embodiments, presenting the AR information to the user based on the projection angle information and the scaling information includes: scaling the AR information according to the scaling information, and presenting the scaled AR information to the user based on the projection angle information.
In some embodiments, determining the scaling information for the AR information based on the current location information includes: determining the user's current distance from a projection plane based on the acquired current location information, where the projection plane is used to present the AR information; and determining the scaling information for the AR information based on that current distance and the user's initial distance from the projection plane.
In some embodiments, determining the user's current distance from the projection plane based on the acquired current location information includes: according to the first coordinate system to which the acquired current location information belongs, determining the corresponding position of the acquired current location information in a second coordinate system to which the projection plane belongs, where the projection plane is used to present the AR information; and determining the user's current distance from the projection plane according to that corresponding position in the second coordinate system.
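The two-coordinate-system step can be illustrated with a minimal 2-D sketch, assuming the second (projection plane) coordinate system is related to the first (measuring device) system by a planar rotation and translation, and that the plane lies along the second frame's x-axis. These framing choices and names are hypothetical, not the patent's specification.

```python
import math

def to_plane_frame(point_xy, yaw_deg, origin_xy):
    """Transform a user position from the measuring device's coordinate
    system into the projection plane's coordinate system: rotate by
    yaw_deg, then translate by origin_xy."""
    c = math.cos(math.radians(yaw_deg))
    s = math.sin(math.radians(yaw_deg))
    x, y = point_xy
    xr, yr = c * x - s * y, s * x + c * y
    return (xr + origin_xy[0], yr + origin_xy[1])

def distance_to_plane(point_in_plane_frame):
    # With the projection plane lying along the frame's x-axis, the
    # user's distance from the plane is the absolute y-coordinate.
    return abs(point_in_plane_frame[1])
```

A point one metre in front of a sensor rotated 90 degrees to the plane ends up one metre away from the plane in the second frame.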
In some embodiments, obtaining the user's current pose information while AR information is being presented includes: obtaining current depth image information of the user while the AR information is being presented, and determining the user's current pose information from the current depth image information.
In some embodiments, obtaining the user's current pose information while AR information is being presented includes: obtaining the user's current pose information relative to a measuring device. Adjusting the projection angle information of the AR information based on the current pose information then includes: adjusting the current pose information based on the relative positions of the measuring device and a projection device, and adjusting the projection angle information of the AR information based on the adjusted current pose information, so that the adjusted projection angle information matches the current pose information.
In some embodiments, the method further includes: detecting whether the pose change between the current pose information and the user's previous pose information is equal to or greater than a pose-change threshold. Adjusting the projection angle information based on the current pose information then includes: if the pose change is equal to or greater than the pose-change threshold, adjusting the projection angle information of the AR information based on the current pose information, so that the adjusted projection angle information matches the current pose information.
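A minimal sketch of the pose-change threshold test, assuming pose is summarized by a single deflection angle in degrees; the 5-degree default is an invented example value, not taken from the patent.

```python
def should_update(current_angle_deg: float, previous_angle_deg: float,
                  threshold_deg: float = 5.0) -> bool:
    """Re-adjust the projection only when the pose change since the last
    presented frame is equal to or greater than the threshold; smaller
    jitters leave the projection untouched."""
    return abs(current_angle_deg - previous_angle_deg) >= threshold_deg
```

A change exactly equal to the threshold triggers an update, matching the "equal to or greater than" wording above.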
In some embodiments, the method further includes: determining, from a marker object acquired by a user device, the AR information corresponding to that marker object, and presenting the AR information to the user corresponding to the user device. Obtaining the user's current pose information while AR information is being presented then includes: obtaining the user's current pose information while that AR information is being presented to that user.
Another aspect of the application provides a system for presenting AR information, the system including: a first module for obtaining the user's current pose information while AR information is being presented to the user; a second module for adjusting the projection angle information of the AR information based on the current pose information, so that the adjusted projection angle information matches the current pose information; and a third module for presenting the AR information to the user based on the projection angle information.
In some embodiments, the current pose information further includes the user's current location information, and the third module is further configured to determine scaling information for the AR information based on the current location information, and to present the AR information to the user based on the projection angle information and the scaling information.
In some embodiments, the third module is configured to scale the AR information according to the scaling information, and to present the scaled AR information to the user based on the projection angle information.
In some embodiments, the third module is configured to determine the user's current distance from the projection plane based on the acquired current location information, where the projection plane is used to present the AR information, and to determine the scaling information for the AR information based on that current distance and the user's initial distance from the projection plane.
In some embodiments, the third module is configured, according to the first coordinate system to which the acquired current location information belongs, to determine the corresponding position of the acquired current location information in a second coordinate system to which the projection plane belongs, and to determine the user's current distance from the projection plane according to that corresponding position in the second coordinate system; the projection plane is used to present the AR information.
In some embodiments, the first module is configured to obtain current depth image information of the user while AR information is being presented, and to determine the user's current pose information from the current depth image information.
In some embodiments, the first module obtains the user's current pose information relative to the measuring device while AR information is being presented; the second module adjusts the current pose information based on the relative positions of the measuring device and the projection device, and adjusts the projection angle information of the AR information based on the adjusted current pose information, so that the adjusted projection angle information matches the current pose information.
In some embodiments, the second module is further configured to detect whether the pose change between the current pose information and the user's previous pose information is equal to or greater than a pose-change threshold, and, if it is, to adjust the projection angle information of the AR information based on the current pose information, so that the adjusted projection angle information matches the current pose information.
In some embodiments, the system further includes a fourth module for determining, from a marker object acquired by a user device, the AR information corresponding to that marker object; the first module obtains the user's current pose information while the AR information is being presented to the user, and the third module presents the AR information to the user corresponding to the user device.
Yet another aspect of the application provides a system for presenting AR information, including: a measuring device for obtaining the user's current pose information; a projection device for projecting the received AR information; and a computer device, connected to the measuring device and the projection device, for performing any of the above methods based on the acquired current pose information.
In some embodiments, the system further includes a user device for acquiring a marker object corresponding to AR information; the computer device supplies the AR information corresponding to the marker to the projection device.
Yet another aspect of the application provides a computer device, including: one or more processors; and a memory for storing one or more computer programs. When the one or more computer programs are executed by the one or more processors, the one or more processors implement any of the above methods.
Yet another aspect of the application provides a computer-readable storage medium storing computer code; when the computer code is executed, any of the above methods is performed.
Yet another aspect of the application provides a computer program product; when the computer program product is executed by a computer device, any of the above methods is performed.
Compared with the prior art, the application adjusts the projection angle of AR information by obtaining the user's current pose information, thereby providing the user with AR information that remains consistent with his or her pose.
Brief description of the drawings
Other features, objects, and advantages of the application will become more apparent upon reading the following detailed description of non-limiting embodiments, made with reference to the following drawings:
Fig. 1 shows an architecture diagram of a system for presenting AR information according to one aspect of the application;
Fig. 2 shows a flowchart of a method for presenting AR information according to another aspect of the application;
Fig. 3 shows a diagram of the system described herein adjusting the projection angle information of AR information;
Fig. 4 shows a schematic block diagram of a system for presenting AR information according to another aspect of the application.
The same or similar reference numerals in the drawings denote the same or similar components.
Detailed description of the embodiments
The application is described in further detail below with reference to the accompanying drawings.
In a typical configuration of the application, a terminal, a service-network device, and a trusted party each include one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include computer-readable media in the form of volatile memory, random access memory (RAM), and/or non-volatile memory, such as read-only memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
Computer-readable media include permanent and non-permanent, removable and non-removable media, and can store information by any method or technology. The information can be computer-readable instructions, data structures, program modules, or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technologies, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassette tape, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can store information accessible by a computing device. As defined herein, computer-readable media do not include transitory media, such as modulated data signals and carrier waves.
Fig. 1 shows a system for presenting AR information according to one aspect of the application. The system includes a measuring device 11, a computer device 13, and a projection device 12.
The measuring device 11 is used to obtain the user's current pose information. To capture this information, the measuring device 11 must be placed where it can obtain the user's real-time pose information. To this end, the measuring device 11 includes, but is not limited to, a portable pose-information acquisition device, or a pose-information acquisition device installed at a fixed position in the space.
Examples of portable pose-information acquisition devices include mobile phones, smart watches, and tablet computers. Such a device is placed at a position facing the user watching the AR image, or at any other position from which the user's current pose information can be obtained. For example, the camera module in the portable device captures image data of the user, and that image data can be used to describe the user's current pose. The current pose information includes, but is not limited to, a standing posture, a sitting posture, or the user's head pose, described by the deflection angle of the human body's symmetry axis, as identified in the image, relative to a default axis of the image. The portable pose-information acquisition device can also be a wearable smart device, such as a smart glove, that obtains the user's current pose information using internal sensing units such as pressure sensors, angle sensors, and displacement sensors.
A pose-information acquisition device installed at a fixed position in the space can be fixed in front of the projection plane facing the user, or even distributed across multiple corners of the room, in order to track image data containing the user, capture all indoor image data panoramically, or even capture depth data. Such devices include, but are not limited to, cameras, depth-sensing arrays, and ultrasonic ranging sensors. A camera obtains image data of the user; a depth-sensing array obtains the user's depth data (i.e., distance) from the array. The user's current pose information can then be described from the acquired image data and depth data. Here, the current pose information can also include the user's current location, which can be described by combining the image data and the depth data. For example, the measuring device 11 can be a TOF sensor, which obtains the depth of each pixel in the corresponding image data and describes the user's current pose information with an RGB-D data structure.
The measuring device 11 can also combine any of the above portable pose-information acquisition devices with additional pose-information acquisition devices used in cooperation with it, the latter supplementing the acquisition of the user's current pose information. For example, the measuring device 11 may include a smart wearable device worn by the user and a camera installed in the room (such as a camera with depth detection, or a binocular camera): the wearable device collects the sensed data, such as angle and height, produced when the user changes pose, and the camera obtains the user's image and the user's position in the room. The user's current pose information measured by the measuring device 11 then includes the sensed angle and height data produced by the pose change, together with the angle and depth information representing the user's position in the room. The measuring devices above are only examples and do not limit the application. For instance, the measuring device can also consist of a Bluetooth device A1 built into a portable device, together with an inertial sensor for measuring the user's movement deflection angle, and a Bluetooth device A2 in the computer device paired with A1. The paired Bluetooth devices A1 and A2 can measure the user's distance from the computer device, and the inertial sensor can measure the user's movement deflection angle relative to the previous moment's position or an initial position; the measuring device thereby measures the user's position relative to the computer device. Based on these examples of measuring devices, and other interchangeable equipment or instruments capable of measuring the user's pose, a skilled person can obtain the corresponding current pose information of the user.
It should be noted that the above ways of obtaining the current pose information are only examples and do not limit the application. Obtaining the user's current pose information with other known pose-acquisition methods also falls within the technical idea of the application.
Here, the measuring device 11 must be in data communication with the computer device 13. For example, the measuring device 11 connects to the computer device 13 through a data cable, or through a wireless network. The computer device 13 can control the measuring device 11 so that it obtains the user's current pose information while AR information is being presented to the user. Alternatively, the measuring device 11 obtains the user's current pose information continuously and sends the acquired information to the computer device 13 on the computer device's request.
The computer device 13 mainly has digital computation and logic processing capabilities; it includes, but is not limited to, a PC, a laptop, or another computer device based on an embedded operating system. The computer device 13 includes at least: one or more processors; memory; a first interface unit for data communication with the measuring device 11; and a second interface unit for data communication with the projection device. The memory includes fast memory and slow memory: the fast memory includes, but is not limited to, RAM, while the slow memory includes, but is not limited to, non-volatile memory such as a hard disk. The memory stores programs, the acquired current pose information, the AR information to be presented to the user, and so on. The processor controls the hardware connected to it by executing the timed instructions indicated by the programs, and the timed operation of the hardware carries out the working process of the computer device 13. The first and second interface units can be shared or configured separately; they include, but are not limited to, network interfaces and data-cable interfaces. The AR information includes, but is not limited to, video, 3D dynamic models, text, and pictures.
The projection device is used to project the received AR information. The projection device can be fixedly installed in front of the room's projection plane, or placed adjustably at any position in the room, and it can connect to the computer device 13 through a data cable or wirelessly. The projection plane can be a dedicated projection screen, a wall, or even a desktop: any plane in the actual physical space.
The AR information projected by the projection device 12 can be preset. In some embodiments, the AR information projected by the projection device 12 can instead be provided by the user operating a user device. For example, the system also includes a user device (not shown) that acquires a marker object and supplies it to the computer device 13. The marker object includes, but is not limited to: a two-dimensional code, a photo, a scene image, a material picture, or a three-dimensional object with a given shape. The user device includes, but is not limited to: a scanner, a mobile phone, a tablet computer, a camera, or another user device. For example, the user scans a two-dimensional code with a mobile phone, the phone's built-in two-dimensional code resolver obtains a link to the AR information to be projected, and supplying the link to the computer device 13 causes the corresponding AR information to be displayed on the projection plane. As another example, the user takes a photo with a mobile phone and supplies the photo to the computer device, which extracts the identification code in the photo and obtains the corresponding AR information.
It should be noted that, subject to factors such as the user device's processing capability and design requirements, the user device can also acquire the marker object, determine the AR information corresponding to the marker object, and pass the determined AR information to the computer device 13; the computer device 13 then processes the received AR information as described below and displays it through the projection device. For example, the user takes a photo with a mobile phone and transmits the photo to the computer device 13, which can display the photo on the projection plane as AR information. As another example, the user transmits three-dimensional image data, video, documents, or other visualization files stored on a computer to the computer device 13, which can display the corresponding visualization file on the projection plane as AR information.
The working process of the computer device 13 includes the steps of the method shown in Fig. 2:
In step S110, while AR information is being presented to the user, the user's current pose information is obtained.
Here, according to a program start instruction, the computer device 13 supplies stored AR information, or AR information downloaded from the network, to the projection device. At startup, the computer device 13 supplies the AR information to the projection device according to a default initial attitude, and begins to obtain the user's current pose information.
Here, to obtain the user's current pose information in the actual physical space from image data and sensed data (such as depth data), a correspondence between image pixels and physical-space coordinates is stored in advance in the computer device 13. From the acquired image data, the computer device 13 can obtain current pose information that more directly describes the user's current pose. For example, the computer device 13 uses visual tracking to track the person region in the image data, and thereby obtains pose information such as standing, sitting, or head tilt; the user's current pose information thus mainly comprises head pose information, body pose information, and so on. The current pose information can be pose-change information relative to the previous moment, or absolute pose information relative to a default fixed pose.
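The distinction in the last sentence, relative pose changes versus absolute pose, can be illustrated by accumulating per-moment deltas into absolute angles against a default initial attitude. This is a sketch under the assumption that pose reduces to a single angle; the function name is invented.

```python
def to_absolute(pose_deltas, initial_deg: float = 0.0):
    """Convert a stream of relative pose changes (per-moment angle deltas,
    in degrees) into absolute pose angles measured against a default
    initial attitude."""
    out, acc = [], initial_deg
    for delta in pose_deltas:
        acc += delta
        out.append(acc)
    return out
```

Turning 5 degrees, back 2, then forward 3 yields absolute poses of 5, 3, and 6 degrees from the initial attitude.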
Further, the computer device 13 can use the depth data to obtain the user's distance from the projection plane; and by combining the image data and the depth data it can also obtain the angle and position through which the user has moved from the previous moment to the current moment. An initial angle and an initial position are preset in the computer device 13, so the current pose information provided by the computer device 13 further includes current location information, which comprises the user's current location relative to the initial angle and the initial distance.
When the current pose information is provided through current depth image information supplied by the measuring device 11, step S110 may include: obtaining the user's current depth image information while AR information is being presented, and determining the user's current pose information from the current depth image information.
For example, when the user starts the AR application, the computer device 13 obtains in real time the depth image information provided by the measuring device 11, determines features such as the captured user's head and body through feature extraction on the RGB information in the depth image, and uses the depth data of each pixel in the depth image to determine the current positions of the captured user's face, limbs, and so on. The computer device 13 then determines the user's current deflection angle from the angle between the obtained head's symmetry axis and the image's symmetry axis, and takes the shortest distance (or average distance, or longest distance) from the head to the measuring device 11 as the head's distance from the measuring device 11. In some examples, the angle of the head's symmetry axis relative to the image's symmetry axis can be described by the positions of the user's two eyes: the computer device 13 can determine the head's current deflection angle by analyzing the angle of the user's eyes in the image relative to the image's symmetry axis, and obtain the user's current position in the room from the measured distance and direction of the eyes.
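The eye-based deflection estimate described above might look like the following sketch, which takes the angle of the line through the two eyes against the image's horizontal axis as the head's in-plane tilt (a level head gives 0 degrees). The pixel coordinates and function name are illustrative assumptions.

```python
import math

def head_tilt_from_eyes(left_eye, right_eye):
    """Estimate the head's in-plane tilt from the pixel positions of the
    two eyes: the angle of the inter-eye line relative to the image's
    horizontal axis, in degrees."""
    dx = right_eye[0] - left_eye[0]
    dy = right_eye[1] - left_eye[1]
    return math.degrees(math.atan2(dy, dx))
```

With the eyes level the tilt is zero; if the right eye sits 60 pixels lower-right of the left eye at equal offsets, the tilt is 45 degrees.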
As another example, the measuring device 11 is a TOF (Time of Flight) depth sensor. A TOF depth sensor continuously emits light pulses toward the target in the scene, receives the light returned from the target object with a sensor, and obtains the distance of the target object by detecting the flight time of the light pulses. Besides capturing a two-dimensional image of the scene target, a TOF depth sensor can also obtain the depth (distance) of each pixel in the scene in real time. The information obtained by the sensor is sent to the computer device 13, which can complete target identification, pose judgment, and tracking through recognition and tracking algorithms. When the user's position and pose change, the depth sensor automatically obtains the user's two-dimensional image and depth information and transmits them to the computer device 13, where a recognition algorithm (such as a pose-recognition algorithm based on deep learning) obtains the user's pose-change information.
In step S120, the projection angle information of the AR information is adjusted based on the current pose information, so that the adjusted projection angle information matches the current pose information.
In some embodiments, the computer equipment 13 can determine the deflection angle information of the AR information in the projection plane according to the acquired current pose information, and then adjust the projection angle information of the AR information. For example, according to the acquired head pose information, the computer equipment 13 determines that, while the user watches the AR information, the deflection angle has changed from angle A1 in the initial state information to angle A2 in the current pose information. According to the deflection angle difference of the head pose information relative to the initial state information, the computer equipment 13 changes the projection angle information of the AR information from angle B1 to angle B2. The resulting change of the presented AR information (for example the letter H) is shown in Fig. 3, where the solid line H is the AR information projected for the user in the initial posture and the dotted line H is the AR information projected for the user in the current posture.
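By way of non-limiting illustration only, the A1-to-A2 / B1-to-B2 adjustment described above could be sketched as below. The assumption that the projection deflection shifts by exactly the same amount as the head deflection is an illustrative simplification; the original disclosure does not specify the mapping.

```python
def adjust_projection_angle(b1, a1, a2):
    """Shift the projection deflection angle B1 by the change in the
    user's head deflection (A1 -> A2), yielding B2.  A one-to-one
    mapping is assumed purely for illustration."""
    return b1 + (a2 - a1)

# Head deflection changed from 10 to 25 degrees; projection follows.
b2 = adjust_projection_angle(0.0, 10.0, 25.0)
```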
In other embodiments, the computer equipment 13 can determine a virtual projection plane of the AR information according to the acquired current pose information. For example, according to the user's current location, angle, distance information and the like in the current pose information, the computer equipment 13 determines the virtual projection plane that the user faces in the current posture. The computer equipment 13 then determines the projection angle information for projecting the AR information onto the virtual projection plane according to the angle between the projection plane onto which the preset projection device projects and the virtual projection plane. For example, the angle between the projection plane and the virtual projection plane is taken as the projection angle information. As another example, the supplementary angle of the angle between the projection plane and the virtual projection plane is taken as the projection angle information. Then, according to the obtained projection angle information, the computer equipment 13 adjusts the font, image deformation and the like of the presented AR information, so that the same AR information seen by the user remains basically consistent from the posture at the previous moment to the posture at the current moment.
Specifically, the computer equipment 13 adjusts the current pose information based on the relative position relationship between the measuring device 11 and the projection device, and adjusts the projection angle information of the AR information based on the adjusted current pose information, so that the adjusted projection angle information matches the current pose information.
Here, the relative position relationship between the measuring device 11 and the projection device in the actual physical space, and the user's initial distance information relative to the projection plane, are preset in, or can be entered in advance into, the computer equipment 13. The relative position relationship includes but is not limited to: the spacing between the measuring device 11 and the projection device, the angle between the optical axis of the projection device and that of the measuring device 11, and the like. The initial distance can be obtained by having the computer equipment 13 prompt the user's standing position when the AR application is started. For example, when the AR application is started, the computer equipment displays through the projection device the initial position at which the user should stand, and thereby determines the initial distance. When the computer equipment 13 obtains the user's current pose information relative to the measuring device 11 through visual recognition, tracking and positioning technologies, it converts the obtained current pose information into the user's current pose information relative to the projection device using the preset relative position relationship and the initial distance. The computer equipment 13 then adjusts the projection angle information, scaling and the like of the AR information according to the converted current pose information.
To prevent frequent adjustment of the AR information, for example to avoid adjusting the AR information synchronously with short-lived posture changes of the user, step S120 further includes a step of detecting whether the posture change of the current pose information relative to the user's previous pose information is equal to or greater than a posture change threshold.
Here, the computer equipment 13 does not necessarily adjust the projection angle information of the AR information frame by frame, but obtains the user's pose information, and even location information, at preset time intervals or image intervals, and temporarily stores the pose information and location information of the previous moment. When the computer equipment 13 obtains the current pose information and current location information, it detects whether the posture change of the current pose information relative to the user's previous pose information is equal to or greater than the posture change threshold. The posture change threshold includes an angle change threshold in the actual physical space or a pixel interval threshold in the image. If the posture change is equal to or greater than the posture change threshold, the projection angle information of the AR information is adjusted based on the posture change, so that the adjusted projection angle information matches the current pose information.
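The threshold-gated update described above amounts to a simple comparison; the following minimal sketch (with an assumed angle threshold, not a value from the original disclosure) shows the gating logic:

```python
def should_update(prev_angle, curr_angle, threshold_deg=5.0):
    """Return True only when the posture change meets or exceeds the
    preset threshold, so brief jitters do not trigger re-projection."""
    return abs(curr_angle - prev_angle) >= threshold_deg

update_big = should_update(10.0, 16.0)    # 6-degree change: adjust
update_small = should_update(10.0, 12.0)  # 2-degree change: ignore
```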
On this basis, the computer equipment 13 can also obtain the user's location change information relative to the previous location from the acquired current location information, and detect whether the location change of the current location information relative to the user's previous location information is equal to or greater than a location change threshold. The location change threshold includes a distance change threshold in the actual physical space or a pixel interval threshold in the image. If the location change is equal to or greater than the location change threshold, the scaling ratio of the AR information is adjusted based on the location change, so that the adjusted projection angle information and scaling ratio match the current pose information.
It should be noted that the above ways of adjusting the projection angle information of the AR information based on the user's pose information are only examples; schemes that those skilled in the art design based on the above examples, whether used independently, in combination or for reference, are regarded as specific examples of the present application and are not described in detail one by one here.
In step S130, the AR information is presented to the user based on the projection angle information.
Here, the computer equipment 13 adjusts the AR information according to the projection angle information, where the adjustment based on the projection angle information is performed according to the type of AR information to be projected. For example, if the AR information is a three-dimensional image, the AR information is rotated according to the projection angle. As another example, if the AR information is a two-dimensional image drawn with 3D techniques so that it gives a strong stereoscopic impression when viewed from a certain angle, the image to be presented is deformed according to the projection angle information using a transformation matrix (such as a homography matrix). As yet another example, if the AR information includes text information, the text can first be converted into an image according to a preset font (such as an artistic font), and the AR information containing the text information is then deformed according to the projection angle information using a transformation matrix (such as a homography matrix).
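By way of non-limiting illustration, the homography-based deformation mentioned above can be sketched by applying an assumed 3x3 homography to the corner points of a text image's bounding box. The particular shear matrix below is purely illustrative and is not a matrix given in the original disclosure.

```python
def apply_homography(H, pt):
    """Apply a 3x3 homography (row-major nested lists) to a 2-D point,
    including the perspective divide by the homogeneous coordinate w."""
    x, y = pt
    xh = H[0][0] * x + H[0][1] * y + H[0][2]
    yh = H[1][0] * x + H[1][1] * y + H[1][2]
    w  = H[2][0] * x + H[2][1] * y + H[2][2]
    return (xh / w, yh / w)

# A pure horizontal shear as a trivial homography: x' = x + 0.5*y.
H = [[1.0, 0.5, 0.0],
     [0.0, 1.0, 0.0],
     [0.0, 0.0, 1.0]]
corners = [(0, 0), (100, 0), (100, 40), (0, 40)]  # a text image's box
warped = [apply_homography(H, c) for c in corners]
```

In practice a vision library's perspective-warp routine would deform the full image; the point form above only shows how a transformation matrix reshapes the presented AR content.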
The computer equipment 13 sends the deformed AR image to the projection device 12, which projects it onto the projection plane. The AR information seen by the user while the posture changes is thus always kept consistent with the AR information seen at the user's frontal viewing angle.
In still other embodiments, the computer equipment 13 also performs a step of determining scaling information of the AR information based on the acquired current location information. Here, the computer equipment 13 determines the user's current distance from the projection plane according to the depth data or depth image information, and enlarges, reduces or keeps the AR information unchanged based on a preset correspondence between a unit pixel of the image and a unit physical size. For example, the computer equipment 13 can obtain the distance between the user and the measuring device 11 from the current location information in the current pose information, and can determine the user's distance from the projection plane in combination with the preset position relationship between the measuring device and the projection plane. Correspondences between distance segments and scaling ratios of the AR information can also be preset in the computer equipment 13, and the scaling information of the AR information is determined based on these correspondences.
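The preset distance-segment-to-scaling correspondence described above can be sketched as a simple lookup; the segment bounds and scale factors below are illustrative assumptions, not values from the original disclosure.

```python
def scale_for_distance(distance_m,
                       segments=((1.0, 0.8), (2.0, 1.0), (3.0, 1.25))):
    """Return the preset scaling ratio for the distance segment the user
    currently falls in; each entry is (upper_bound_m, scale).  Beyond
    the last bound, the largest preset scale is kept."""
    for upper_bound, scale in segments:
        if distance_m <= upper_bound:
            return scale
    return segments[-1][1]

near = scale_for_distance(0.5)   # closest segment: shrink
mid = scale_for_distance(1.5)    # middle segment: unchanged
far = scale_for_distance(4.0)    # beyond all bounds: enlarge
```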
It should be noted that the above way of determining the scaling information of the AR information based on the current location information is only an example; in fact, the scaling information of the AR information can also be determined based on a correspondence between the change of the user's distance from the projection plane at two successive moments and the scaling ratio of the AR information. Ways of obtaining the scaling information of the AR information that those skilled in the art derive by improving on any of the above determination methods should be regarded as specific examples of the present application.
In some embodiments, in an actual installation scenario, the measuring device 11 is not necessarily installed at the same position as the projection device, so the current pose information acquired by the computer equipment 13 is obtained based on the measuring angle of the measuring device 11, whereas the AR information watched by the user needs to be presented based on the projection angle of the projection device. The computer equipment 13 therefore also performs the following steps: determining current distance information between the user and the projection plane based on the acquired current location information, where the projection plane is used to present the AR information; and determining the scaling information of the AR information based on the current distance information between the user and the projection plane and the user's initial distance information relative to the projection plane.
In some specific examples, the computer equipment 13 performs the following steps to determine the current distance information between the user and the projection plane: according to the first coordinate system to which the acquired current location information belongs, determining the corresponding location information of the acquired current location information in the second coordinate system to which the projection plane belongs; and determining the current distance information between the user and the projection plane according to the corresponding location information of the acquired current location information in the second coordinate system. The projection plane is used to present the AR information.
For example, the measuring device 11 is a TOF camera device, and the computer equipment 13 establishes in advance a first coordinate system with the TOF camera device as its origin, a second coordinate system in the plane of the projection plane, and the correspondence between the two coordinate systems. The correspondence is established based on the position relationship, determined in advance through calibration, between the TOF camera device and the projection plane in the actual physical space. The computer equipment 13 can obtain the current location information, in the first coordinate system, of each pixel corresponding to the user's face image region in the depth image information acquired by the TOF camera device; it then determines the corresponding location information of these pixels in the second coordinate system according to the correspondence, thereby determining the current distance information between the user's face contour photographed by the TOF camera device and the projection plane. The computer equipment 13 can thus determine the scaling information of the AR information according to the obtained current distance information between the user's face contour and the projection plane.
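Under the common assumption that the calibrated correspondence between the two coordinate systems is a rigid transform (rotation R plus translation t), mapping a point from the TOF camera's first coordinate system into the projection plane's second coordinate system can be sketched as follows. The identity rotation and 2 m offset are illustrative values, not calibration results from the original disclosure.

```python
def to_projection_frame(p_cam, R, t):
    """Map a 3-D point from the TOF camera's coordinate system (first
    coordinate system) into the projection plane's coordinate system
    (second coordinate system): p_proj = R * p_cam + t."""
    return tuple(
        sum(R[i][j] * p_cam[j] for j in range(3)) + t[i]
        for i in range(3)
    )

# Identity rotation; camera origin assumed 2 m from the projection
# plane along the plane frame's z axis.
R = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
t = [0.0, 0.0, 2.0]
p_proj = to_projection_frame((0.1, -0.2, 1.5), R, t)
distance_to_plane = p_proj[2]  # z in the plane frame = user-plane distance
```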
The computer equipment 13 supplies the scaled AR information to the projection device 12 at the corresponding projection angle for the projection device 12 to present.
Fig. 4 shows a system for presenting AR information according to another aspect of the present application. The system for presenting AR information is mainly arranged in the aforementioned computer equipment, or in other computer equipment that includes the following devices. The system 2 for presenting AR information includes: a first module 21, a second module 22 and a third module 23. The system can obtain the user's current pose information through a measuring device and present the corresponding AR information through a projection device.
The first module 21 is used to obtain the user's current pose information during the presentation of AR information to the user.
Here, according to a program start instruction, the first module 21 supplies stored AR information, or AR information downloaded from a network, to the projection device. The AR information can also be obtained through a fourth module (not illustrated), which is used to determine the AR information corresponding to a tagged object obtained by user equipment.
In some embodiments, the user can operate user equipment to obtain a tagged object and provide the tag information to the fourth module, which determines the AR information corresponding to the tagged object. The tagged object includes but is not limited to: a two-dimensional code, a photo, a scene image, a material picture, a three-dimensional object with a given shape, and the like. The user equipment includes but is not limited to: a mobile phone, a tablet computer, a camera or other user equipment.
For example, the user holds a mobile phone to scan a two-dimensional code, obtains the link to the AR information carried by the two-dimensional code using a resolver built into the phone, and provides the link to the fourth module, which can then obtain the AR information from the link. As another example, the user holds a mobile phone to take a photo and provides the photo to the fourth module through the phone; the fourth module extracts the identification code in the photo and obtains the corresponding AR information.
In addition, as influenced by factors such as the processing capability and design of the user equipment, the user equipment operated by the user can also itself obtain the tagged object, determine the AR information corresponding to the tagged object, and pass the determined AR information to the fourth module, so that the first module, second module and third module adjust the acquired AR information based on the obtained current pose information of the user and display it through the projection device. For example, the user holds a mobile phone to take a photo and transmits the photo to the fourth module, which can then use the photo as the AR information to be presented. As another example, the user transmits three-dimensional image data, videos, documents or other visualization files stored in a computer to the fourth module, which can then use the corresponding visualization files as the AR information to be presented.
At startup, the first module 21 supplies the AR information to the projection device according to a preset initial posture, and starts to acquire the user's current pose information.
Here, in order to obtain the user's current pose information in the actual physical space according to image data and sensing data (such as depth data), the correspondence between image pixels and coordinates in the physical space is pre-stored in the first module 21. Based on the acquired image data, the first module 21 can obtain current pose information that more directly describes the user's current posture. For example, the first module 21 tracks the person region in the image data using visual tracking technology, and thereby obtains pose information such as whether the user is standing, sitting or tilting the head. The user's current pose information therefore mainly includes head pose information, body pose information and the like. The current pose information can be posture change information relative to the previous moment, or absolute pose information relative to a preset fixed pose.
Further, the first module 21 can also use the depth data to obtain the user's distance relative to the projection plane, and by combining the image data and depth data can also obtain the angle information and location information of the user's movement from the previous moment to the current moment. When initial angle information and an initial position are preset in the first module 21, the current pose information provided by the first module 21 further includes current location information, namely the current location relative to the initial angle and the initial distance.
When the current pose information is provided through current depth image information supplied by the measuring device, the first module 21 obtains the user's current depth image information during the presentation of AR information to the user, and performs the step of determining the user's current pose information according to the current depth image information.
For example, when running the AR application in a startup operation initiated by the user, the first module 21 obtains in real time the depth image information provided by the measuring device, determines features such as the head and body of the captured user based on feature extraction from the RGB information in the depth image information, and determines the current location information of the captured user's face, limbs and the like using the depth data of each pixel in the depth image information. Then, the first module 21 determines the user's current deflection angle according to the angle between the obtained head symmetry axis and the image symmetry axis, and takes the shortest distance (or average distance, longest distance, etc.) from the head to the measuring device as the distance of the user's head relative to the measuring device. In some examples, the angle of the head symmetry axis relative to the image symmetry axis can be described using the positional relationship of the user's eyes: the first module 21 can determine the current deflection angle of the user's head by analyzing the angle of the user's eyes in the image relative to the image symmetry axis, and can obtain the user's current position in the room based on the measured distance and direction of the eyes.
As another example, the measuring device is a TOF (Time of Flight) depth sensor. A TOF depth sensor continuously emits light pulses toward the targets in a scene, receives the light returned from a target object with a sensor, and obtains the distance of the target object by detecting the flight time of the light pulses. In addition to acquiring a two-dimensional image of the scene targets, a TOF depth sensor can also obtain the depth information (distance) of each pixel in the scene in real time. The information obtained by the sensor is sent to the computer equipment. The first module 21 in the computer equipment can complete target identification, posture judgment and tracking through recognition and tracking algorithms. When the user's position and posture change, the depth sensor automatically obtains the user's two-dimensional image and depth information and transmits them to the first module 21, which uses a recognition algorithm (for example, a posture recognition algorithm based on deep learning) to obtain the user's posture change information.
The second module 22 is used to adjust the projection angle information of the AR information based on the current pose information, so that the adjusted projection angle information matches the current pose information.
In some embodiments, the second module 22 can determine the deflection angle information of the AR information in the projection plane according to the acquired current pose information, and then adjust the projection angle information of the AR information. For example, according to the acquired head pose information, the second module 22 determines that, while the user watches the AR information, the deflection angle has changed from angle A1 in the initial state information to angle A2 in the current pose information. According to the deflection angle difference of the head pose information relative to the initial state information, the second module 22 changes the projection angle information of the AR information from angle B1 to angle B2. The resulting change of the presented AR information (for example the letter H) is shown in Fig. 3, where the solid line H is the AR information projected for the user in the initial posture and the dotted line H is the AR information projected for the user in the current posture.
In other embodiments, the second module 22 can determine a virtual projection plane of the AR information according to the acquired current pose information. For example, according to the user's current location, angle, distance information and the like in the current pose information, the second module 22 determines the virtual projection plane that the user faces in the current posture. The second module 22 then determines the projection angle information for projecting the AR information onto the virtual projection plane according to the angle between the projection plane onto which the preset projection device projects and the virtual projection plane. For example, the angle between the projection plane and the virtual projection plane is taken as the projection angle information. As another example, the supplementary angle of the angle between the projection plane and the virtual projection plane is taken as the projection angle information. Then, according to the obtained projection angle information, the second module 22 adjusts the font, image deformation and the like of the presented AR information, so that the same AR information seen by the user remains basically consistent from the posture at the previous moment to the posture at the current moment.
Specifically, the second module 22 adjusts the current pose information based on the relative position relationship between the measuring device and the projection device, and adjusts the projection angle information of the AR information based on the adjusted current pose information, so that the adjusted projection angle information matches the current pose information.
Here, the relative position relationship between the measuring device and the projection device in the actual physical space, and the user's initial distance information relative to the projection plane, are preset in, or can be entered in advance into, the second module 22. The relative position relationship includes but is not limited to: the spacing between the measuring device and the projection device, the angle between the optical axis of the projection device and that of the measuring device, and the like. The initial distance can be obtained by having the second module 22 prompt the user's standing position when the AR application is started. For example, when the AR application is started, the computer equipment displays through the projection device the initial position at which the user should stand, and thereby determines the initial distance. When the second module 22 obtains the user's current pose information relative to the measuring device through visual recognition, tracking and positioning technologies, it converts the obtained current pose information into the user's current pose information relative to the projection device using the preset relative position relationship and the initial distance. The second module 22 then adjusts the projection angle information, scaling and the like of the AR information according to the converted current pose information.
To prevent frequent adjustment of the AR information, for example to avoid adjusting the AR information synchronously with short-lived posture changes of the user, the second module 22 also performs a step of detecting whether the posture change of the current pose information relative to the user's previous pose information is equal to or greater than a posture change threshold.
Here, the second module 22 does not necessarily adjust the projection angle information of the AR information frame by frame, but obtains the user's pose information, and even location information, at preset time intervals or image intervals, and temporarily stores the pose information and location information of the previous moment. When the second module 22 obtains the current pose information and current location information, it detects whether the posture change of the current pose information relative to the user's previous pose information is equal to or greater than the posture change threshold. The posture change threshold includes an angle change threshold in the actual physical space or a pixel interval threshold in the image. If the posture change is equal to or greater than the posture change threshold, the projection angle information of the AR information is adjusted based on the posture change, so that the adjusted projection angle information matches the current pose information.
On this basis, the second module 22 can also obtain the user's location change information relative to the previous location from the acquired current location information, and detect whether the location change of the current location information relative to the user's previous location information is equal to or greater than a location change threshold. The location change threshold includes a distance change threshold in the actual physical space or a pixel interval threshold in the image. If the location change is equal to or greater than the location change threshold, the scaling ratio of the AR information is adjusted based on the location change, so that the adjusted projection angle information and scaling ratio match the current pose information.
It should be noted that the above ways of adjusting the projection angle information of the AR information based on the user's pose information are only examples; schemes that those skilled in the art design based on the above examples, whether used independently, in combination or for reference, are regarded as specific examples of the present application and are not described in detail one by one here.
The third module 23 is used to present the AR information to the user based on the projection angle information.
Here, the third module 23 adjusts the AR information according to the projection angle information, where the adjustment based on the projection angle information is performed according to the type of AR information to be projected. For example, if the AR information is a three-dimensional image, the third module 23 rotates the AR information according to the projection angle. As another example, if the AR information is a two-dimensional image drawn with 3D techniques so that it gives a strong stereoscopic impression when viewed from a certain angle, the third module 23 deforms the image to be presented according to the projection angle information using a transformation matrix (such as a homography matrix). As yet another example, if the AR information includes text information, the third module 23 can first convert the text into an image according to a preset font (such as an artistic font) and then deform the AR information containing the text information according to the projection angle information using a transformation matrix (such as a homography matrix).
The third module 23 sends the deformed AR image to the projection device, which projects it onto the projection plane. The AR information seen by the user while the posture changes is thus always kept consistent with the AR information seen at the user's frontal viewing angle.
In still other embodiments, the third module 23 also performs a step of determining scaling information of the AR information based on the acquired current location information. Here, the third module 23 determines the user's current distance from the projection plane according to the depth data or depth image information, and enlarges, reduces or keeps the AR information unchanged based on a preset correspondence between a unit pixel of the image and a unit physical size. For example, the third module 23 can obtain the distance between the user and the measuring device from the current location information in the current pose information, and can determine the user's distance from the projection plane in combination with the preset position relationship between the measuring device and the projection plane. Correspondences between distance segments and scaling ratios of the AR information can also be preset in the third module 23, and the scaling information of the AR information is determined based on these correspondences.
It should be noted that the above way of determining the scaling information of the AR information based on the current location information is only an example; in fact, the scaling information of the AR information can also be determined based on a correspondence between the change of the user's distance from the projection plane at two successive moments and the scaling ratio of the AR information. Ways of obtaining the scaling information of the AR information that those skilled in the art derive by improving on any of the above determination methods should be regarded as specific examples of the present application.
In some embodiments, in an actual installation scenario, the measuring device is not necessarily installed at the same position as the projection device, so the current pose information acquired by the third module 23 is obtained based on the measuring angle of the measuring device, whereas the AR information watched by the user needs to be presented based on the projection angle of the projection device. The third module 23 therefore also performs the following steps: determining current distance information between the user and the projection plane based on the acquired current location information, where the projection plane is used to present the AR information; and determining the scaling information of the AR information based on the current distance information between the user and the projection plane and the user's initial distance information relative to the projection plane.
In some specific examples, the third module 23 performs the following steps to determine the current distance information between the user and the projection plane: according to the first coordinate system to which the acquired current location information belongs, determining the corresponding location information of the acquired current location information in the second coordinate system to which the projection plane belongs; and determining the current distance information between the user and the projection plane according to the corresponding location information of the acquired current location information in the second coordinate system. The projection plane is used to present the AR information.
For example, where the measuring device is a TOF camera, the third module 23 constructs in advance a first coordinate system with the TOF camera as its origin, a second coordinate system in the plane of the projection plane, and the correspondence between the two coordinate systems. This correspondence is established through a prior calibration of the positional relation between the TOF camera and the projection plane in physical space. The third module 23 can obtain, from the depth image acquired by the TOF camera, the current position information in the first coordinate system of each pixel in the image region corresponding to the user's face; it then determines, from the correspondence, the position of those pixels in the second coordinate system, thereby determining the current distance information between the user's face contour captured by the TOF camera and the projection plane. The third module 23 can then determine the scaling information of the AR information from that current distance information.
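The two-coordinate-system step in this TOF example can be sketched as follows. This is an illustrative reading only: it assumes a rigid calibration transform between the camera frame and a plane frame in which the projection plane is z = 0, and all names and calibration values are made up for the example:

```python
# Calibration result: rigid transform from the TOF camera frame (first
# coordinate system) to the projection-plane frame (second coordinate
# system). A real system would obtain R and t from a prior calibration.
R = [[1.0, 0.0, 0.0],
     [0.0, 1.0, 0.0],
     [0.0, 0.0, 1.0]]          # assume the axes are already aligned
t = [0.0, 0.0, 1.5]            # camera sits 1.5 m in front of the plane

def camera_to_plane(p):
    """Map one 3D point from the camera frame to the plane frame."""
    return [sum(R[i][j] * p[j] for j in range(3)) + t[i] for i in range(3)]

def user_plane_distance(face_points_cam):
    """Distance of the user's face from the projection plane.

    In the plane frame the projection plane is z == 0, so the distance
    is the mean z coordinate of the mapped face pixels.
    """
    zs = [camera_to_plane(p)[2] for p in face_points_cam]
    return sum(zs) / len(zs)
```

With the identity rotation above, face pixels measured 0.5 m in front of the camera come out 2.0 m from the plane; that distance then feeds the scaling step described earlier.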
The third module 23 then supplies the scaled AR information, at the corresponding projection angle, to the projection device for presentation.
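Putting the steps above together, one iteration of the presentation pipeline — measure the pose, scale by distance, adjust the projection angle, project — might look like this. All function names are illustrative stand-ins for the modules described above, not an API from the application:

```python
def present_ar(ar_info, measure_pose, scale, adjust_angle, project):
    """One iteration of the AR presentation loop.

    measure_pose() -> (angle, distance) stands in for the measuring
    device; scale and adjust_angle transform the content; project
    hands the result to the projection device.
    """
    angle, distance = measure_pose()
    content = scale(ar_info, distance)        # distance-based scaling
    content = adjust_angle(content, angle)    # match the user's pose
    project(content)                          # hand off to the projector
    return content
```

In a running system this loop would be driven by the measuring device's frame rate, with the threshold check of claim 8 deciding whether a given frame triggers a re-adjustment at all.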
In conclusion the method and system for being used to present AR information of the application, by obtaining user's current pose information
To adjust the projection angle of AR information, the purpose for providing the AR information being consistent with its posture to the user is realized.So the application
Effectively overcome various shortcoming of the prior art and have high industrial utilization.
It is obvious to those skilled in the art that the present application is not limited to the details of the above exemplary embodiments, and that it may be realized in other specific forms without departing from its spirit or essential characteristics. The embodiments are therefore to be considered in all respects as illustrative and not restrictive, the scope of the application being indicated by the appended claims rather than by the foregoing description; all changes that fall within the meaning and range of equivalency of the claims are intended to be embraced therein. No reference sign in a claim should be construed as limiting the claim concerned. Furthermore, the word "comprising" does not exclude other units or steps, and the singular does not exclude the plural. A plurality of units or devices recited in a device claim may also be implemented by a single unit or device through software or hardware. Words such as "first" and "second" denote names and do not indicate any particular order.
Claims (23)
1. A method for presenting AR information, wherein the method comprises:
while presenting AR information to a user, obtaining current pose information of the user;
adjusting projection angle information of the AR information based on the current pose information, so that the adjusted projection angle information matches the current pose information;
presenting the AR information to the user based on the projection angle information.
2. The method according to claim 1, wherein the current pose information further comprises current position information of the user;
wherein the method further comprises:
determining scaling information of the AR information based on the current position information;
and wherein presenting the AR information to the user based on the projection angle information comprises:
presenting the AR information to the user based on the projection angle information and the scaling information.
3. The method according to claim 2, wherein presenting the AR information to the user based on the projection angle information and the scaling information comprises:
scaling the AR information according to the scaling information;
presenting the scaled AR information to the user based on the projection angle information.
4. The method according to claim 2 or 3, wherein determining the scaling information of the AR information based on the current position information comprises:
determining current distance information between the user and a projection plane based on the acquired current position information, wherein the projection plane is used to present the AR information;
determining the scaling information of the AR information based on the current distance information between the user and the projection plane and initial distance information of the user relative to the projection plane.
5. The method according to claim 4, wherein determining the current distance information between the user and the projection plane based on the acquired current position information, the projection plane being used to present the AR information, comprises:
determining, according to a first coordinate system to which the acquired current position information belongs, corresponding position information of the acquired current position information in a second coordinate system to which the projection plane belongs, wherein the projection plane is used to present the AR information;
determining the current distance information between the user and the projection plane according to the corresponding position information of the acquired current position information in the second coordinate system.
6. The method according to any one of claims 1 to 5, wherein obtaining the current pose information of the user while presenting AR information to the user comprises:
while presenting AR information to a user, obtaining current depth image information of the user;
determining the current pose information of the user according to the current depth image information.
7. The method according to claim 1, wherein obtaining the current pose information of the user while presenting AR information to the user comprises:
while presenting AR information to a user, obtaining current pose information of the user relative to a measuring device;
wherein adjusting the projection angle information of the AR information based on the current pose information so that the adjusted projection angle information matches the current pose information comprises:
adjusting the current pose information based on the relative positional relation between the measuring device and a projection device;
adjusting the projection angle information of the AR information based on the adjusted current pose information, so that the adjusted projection angle information matches the current pose information.
8. The method according to claim 1, wherein the method further comprises:
detecting whether the pose change of the current pose information relative to previous pose information of the user is equal to or greater than a pose-change threshold;
wherein adjusting the projection angle information of the AR information based on the current pose information so that the adjusted projection angle information matches the current pose information comprises:
if the pose change is equal to or greater than the pose-change threshold, adjusting the projection angle information of the AR information based on the current pose information, so that the adjusted projection angle information matches the current pose information.
9. The method according to any one of claims 1 to 8, wherein the method further comprises:
determining, according to a tagged object obtained by a user equipment, the AR information corresponding to the tagged object;
presenting the AR information to the user corresponding to the user equipment;
wherein obtaining the current pose information of the user while presenting AR information to the user comprises:
while presenting the AR information to the user, obtaining the current pose information of the user.
10. A system for presenting AR information, wherein the system comprises:
a first module, for obtaining current pose information of a user while AR information is presented to the user;
a second module, for adjusting projection angle information of the AR information based on the current pose information, so that the adjusted projection angle information matches the current pose information;
a third module, for presenting the AR information to the user based on the projection angle information.
11. The system according to claim 10, wherein the current pose information further comprises current position information of the user;
wherein the third module is further configured to determine scaling information of the AR information based on the current position information, and to present the AR information to the user based on the projection angle information and the scaling information.
12. The system according to claim 11, wherein the third module is configured to scale the AR information according to the scaling information, and to present the scaled AR information to the user based on the projection angle information.
13. The system according to claim 11 or 12, wherein the third module is configured to determine current distance information between the user and a projection plane based on the acquired current position information, the projection plane being used to present the AR information; and to determine the scaling information of the AR information based on the current distance information between the user and the projection plane and initial distance information of the user relative to the projection plane.
14. The system according to claim 13, wherein the third module is configured to determine, according to a first coordinate system to which the acquired current position information belongs, corresponding position information of the acquired current position information in a second coordinate system to which the projection plane belongs; and to determine the current distance information between the user and the projection plane according to the corresponding position information of the acquired current position information in the second coordinate system; wherein the projection plane is used to present the AR information.
15. The system according to any one of claims 10 to 14, wherein the first module is configured to obtain current depth image information of the user while AR information is presented to the user, and to determine the current pose information of the user according to the current depth image information.
16. The system according to claim 10, wherein the first module obtains current pose information of the user relative to a measuring device while AR information is presented to the user;
wherein the second module adjusts the current pose information based on the relative positional relation between the measuring device and a projection device, and adjusts the projection angle information of the AR information based on the adjusted current pose information, so that the adjusted projection angle information matches the current pose information.
17. The system according to claim 10, wherein the second module is further configured to detect whether the pose change of the current pose information relative to previous pose information of the user is equal to or greater than a pose-change threshold; and, if the pose change is equal to or greater than the pose-change threshold, to adjust the projection angle information of the AR information based on the current pose information, so that the adjusted projection angle information matches the current pose information.
18. The system according to any one of claims 10 to 17, further comprising: a fourth module, for determining, according to a tagged object obtained by a user equipment, the AR information corresponding to the tagged object;
wherein the first module is configured to obtain the current pose information of the user while the AR information is presented to the user; and the third module is configured to present the AR information to the user corresponding to the user equipment.
19. A system for presenting AR information, comprising:
a measuring device, for obtaining current pose information of a user;
a projection device, for projecting the received AR information;
a computer device, connected with the measuring device and the projection device, for performing the method according to any one of claims 1 to 9 based on the acquired current pose information.
20. The system according to claim 19, further comprising: a user equipment, for obtaining a tagged object corresponding to AR information; wherein the computer device supplies the AR information corresponding to the tagged object to the projection device.
21. A computer device, comprising:
one or more processors;
a memory, for storing one or more computer programs;
wherein, when the one or more computer programs are executed by the one or more processors, the one or more processors implement the method according to any one of claims 1 to 9.
22. A computer-readable storage medium storing computer code which, when executed, performs the method according to any one of claims 1 to 9.
23. A computer program product which, when executed by a computer device, performs the method according to any one of claims 1 to 9.
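As a concrete illustration of the pose-change threshold recited in claims 8 and 17, the gate might be sketched as follows, assuming a one-dimensional angular pose and an arbitrary threshold value (both the scalar pose model and the numeric threshold are our assumptions, not values from the claims):

```python
POSE_CHANGE_THRESHOLD = 5.0  # degrees; an assumed value

def maybe_update_angle(current_pose: float, previous_pose: float,
                       projection_angle: float) -> float:
    """Return the (possibly updated) projection angle.

    The projection angle is re-matched to the user's pose only when
    the pose change meets or exceeds the threshold, which suppresses
    jitter from small movements of the user.
    """
    if abs(current_pose - previous_pose) >= POSE_CHANGE_THRESHOLD:
        return current_pose          # large change: re-adjust the angle
    return projection_angle          # small change: keep the old angle
```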
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201711377194.7A CN107977082A (en) | 2017-12-19 | 2017-12-19 | A kind of method and system for being used to AR information be presented |
Publications (1)
Publication Number | Publication Date |
---|---|
CN107977082A true CN107977082A (en) | 2018-05-01 |
Family
ID=62007042
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201711377194.7A Pending CN107977082A (en) | 2017-12-19 | 2017-12-19 | A kind of method and system for being used to AR information be presented |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN107977082A (en) |
Citations (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104423578A (en) * | 2013-08-25 | 2015-03-18 | 何安莉 | Interactive Input System And Method |
US9195127B1 (en) * | 2012-06-18 | 2015-11-24 | Amazon Technologies, Inc. | Rear projection screen with infrared transparency |
CN105704468A (en) * | 2015-08-31 | 2016-06-22 | 深圳超多维光电子有限公司 | Stereoscopic display method, device and electronic equipment used for virtual and reality scene |
CN105898241A (en) * | 2016-01-15 | 2016-08-24 | 上海英诗帕信息科技有限公司 | Motion Vision Displaying System And Method |
CN106095102A (en) * | 2016-06-16 | 2016-11-09 | 深圳市金立通信设备有限公司 | The method of a kind of virtual reality display interface process and terminal |
CN106373085A (en) * | 2016-09-20 | 2017-02-01 | 福州大学 | Intelligent terminal 3D watch try-on method and system based on augmented reality |
CN106817568A (en) * | 2016-12-05 | 2017-06-09 | 网易(杭州)网络有限公司 | A kind of augmented reality display methods and device |
CN107027015A (en) * | 2017-04-28 | 2017-08-08 | 广景视睿科技(深圳)有限公司 | 3D trends optical projection system based on augmented reality and the projecting method for the system |
CN107027014A (en) * | 2017-03-23 | 2017-08-08 | 广景视睿科技(深圳)有限公司 | A kind of intelligent optical projection system of trend and its method |
CN107102734A (en) * | 2017-04-17 | 2017-08-29 | 福建维锐尔信息科技有限公司 | A kind of method and device for breaking through realistic space limitation |
CN206649468U (en) * | 2016-11-30 | 2017-11-17 | 南京航空航天大学 | Adaptive dynamic solid augmented reality operation navigation system based on real-time tracking and Multi-source Information Fusion |
Cited By (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109788194A (en) * | 2018-12-27 | 2019-05-21 | 北京航空航天大学 | A kind of adaptivity wearable device subjectivity multi-view image acquisition method |
CN109788194B (en) * | 2018-12-27 | 2020-08-25 | 北京航空航天大学 | Adaptive wearable device subjective visual angle image acquisition method |
CN110751707A (en) * | 2019-10-24 | 2020-02-04 | 北京达佳互联信息技术有限公司 | Animation display method, animation display device, electronic equipment and storage medium |
CN110751707B (en) * | 2019-10-24 | 2021-02-05 | 北京达佳互联信息技术有限公司 | Animation display method, animation display device, electronic equipment and storage medium |
CN111459269A (en) * | 2020-03-24 | 2020-07-28 | 视辰信息科技(上海)有限公司 | Augmented reality display method, system and computer readable storage medium |
CN111459269B (en) * | 2020-03-24 | 2020-12-01 | 视辰信息科技(上海)有限公司 | Augmented reality display method, system and computer readable storage medium |
CN111885366A (en) * | 2020-04-20 | 2020-11-03 | 上海曼恒数字技术股份有限公司 | Three-dimensional display method and device for virtual reality screen, storage medium and equipment |
WO2022033389A1 (en) * | 2020-08-11 | 2022-02-17 | 中兴通讯股份有限公司 | Image processing method and apparatus, and electronic device and storage medium |
EP4198874A4 (en) * | 2020-08-11 | 2024-02-14 | Zte Corp | Image processing method and apparatus, and electronic device and storage medium |
US20230044474A1 (en) * | 2021-08-06 | 2023-02-09 | Beijing Xiaomi Mobile Software Co., Ltd. | Audio signal processing method, electronic apparatus, and storage medium |
US11950087B2 (en) * | 2021-08-06 | 2024-04-02 | Beijing Xiaomi Mobile Software Co., Ltd. | Audio signal processing method, electronic apparatus, and storage medium |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11393173B2 (en) | Mobile augmented reality system | |
CN107977082A (en) | A kind of method and system for being used to AR information be presented | |
US11386611B2 (en) | Assisted augmented reality | |
EP2915140B1 (en) | Fast initialization for monocular visual slam | |
US20180189974A1 (en) | Machine learning based model localization system | |
CN104885098B (en) | Mobile device based text detection and tracking | |
WO2020076396A1 (en) | Real-world anchor in a virtual-reality environment | |
US20230245391A1 (en) | 3d model reconstruction and scale estimation | |
US20140368539A1 (en) | Head wearable electronic device for augmented reality and method for generating augmented reality using the same | |
CN102446048B (en) | Information processing device and information processing method | |
US20130207962A1 (en) | User interactive kiosk with three-dimensional display | |
CN112823328A (en) | Method for HMD camera calibration using synchronized images rendered on an external display | |
EP3695381B1 (en) | Floor detection in virtual and augmented reality devices using stereo images | |
CN108700946A (en) | System and method for parallel ranging and fault detect and the recovery of building figure | |
US10699438B2 (en) | Mobile device localization in complex, three-dimensional scenes | |
CN114766042A (en) | Target detection method, device, terminal equipment and medium | |
US10388069B2 (en) | Methods and systems for light field augmented reality/virtual reality on mobile devices | |
WO2016012044A1 (en) | Method and system for augmenting television watching experience | |
Ng et al. | An integrated surveillance system—human tracking and view synthesis using multiple omni-directional vision sensors | |
US20200211275A1 (en) | Information processing device, information processing method, and recording medium | |
CN114185073A (en) | Pose display method, device and system | |
Piérard et al. | I-see-3d! an interactive and immersive system that dynamically adapts 2d projections to the location of a user's eyes | |
US11830213B2 (en) | Remote measurements from a live video stream | |
US20230070721A1 (en) | Method, processing device, and display system for information display | |
Inaguma et al. | Hand motion tracking based on a constraint of three-dimensional continuity |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
CB02 | Change of applicant information | ||
Address after: 201210 7th Floor, No. 1, Lane 5005, Shenjiang Road, China (Shanghai) Pilot Free Trade Zone, Pudong New Area, Shanghai Applicant after: HISCENE INFORMATION TECHNOLOGY Co.,Ltd. Address before: Room 1109, No. 570, Shengxia Road, China (Shanghai) Pilot Free Trade Zone, Pudong New Area, Shanghai 201203 Applicant before: HISCENE INFORMATION TECHNOLOGY Co.,Ltd. |