CN103814382A - Augmented reality processing method and device of mobile terminal - Google Patents


Info

Publication number
CN103814382A
CN103814382A (application CN201280001436.1A; granted publication CN103814382B)
Authority
CN
China
Prior art keywords
target
real-time image
cache
information
augmented reality
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201280001436.1A
Other languages
Chinese (zh)
Other versions
CN103814382B (en)
Inventor
许国军
李艳丽
刘峥
Current Assignee
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd filed Critical Huawei Technologies Co Ltd
Publication of CN103814382A publication Critical patent/CN103814382A/en
Application granted granted Critical
Publication of CN103814382B publication Critical patent/CN103814382B/en
Legal status: Active (granted)

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00: Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/80: Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/83: Generation or processing of protective or descriptive data associated with content; Content structuring
    • H04N21/84: Generation or processing of descriptive data, e.g. content descriptors
    • H04N21/40: Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41: Structure of client; Structure of client peripherals
    • H04N21/422: Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
    • H04N21/4223: Cameras
    • H04N21/81: Monomedia components thereof
    • H04N21/8126: Monomedia components thereof involving additional data, e.g. news, sports, stocks, weather forecasts
    • H04N21/8133: Monomedia components involving additional data specifically related to the content, e.g. biography of the actors in a movie, detailed information about an article seen in a video program
    • H04N21/8166: Monomedia components thereof involving executable data, e.g. software

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Processing Or Creating Images (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

An embodiment of the invention provides an augmented reality (AR) processing method and apparatus for a mobile terminal. The method comprises: obtaining real-time images collected by a camera and caching them; performing AR processing on the real-time images to generate first AR images and displaying them; and judging whether to perform freeze-frame processing: if so, selecting one cached real-time image that falls within a preset time range of the current time as the freeze frame, and performing AR processing on it to generate and display an AR freeze-frame image. The method enables freeze-frame processing of AR images, reduces the constraints placed on user behaviour, and improves the effect of AR processing.

Description

Augmented reality processing method and apparatus for a mobile terminal
Technical field
Embodiments of the present invention relate to the field of communication technology, and in particular to an augmented reality processing method and apparatus for a mobile terminal.
Background technology
Augmented reality (AR) takes entity information that is ordinarily difficult to experience within a given span of time and space in the real world, such as visual information, sound, taste or touch, simulates it by technical means, and superimposes it back onto the real world so that it is perceived by the human senses. Because the resulting sensory experience goes beyond unaided reality, the technique is called augmented reality, abbreviated AR.
Three-dimensional registration uses computer-graphics analysis to obtain the three-dimensional spatial coordinates of a concrete object, and then, according to those coordinates, binds and splices a computer-generated virtual object into the real three-dimensional space, so as to achieve an accurate, seamless fusion of the real environment and the virtual object.
In a mobile-terminal-based AR application, the camera of the mobile terminal captures real-world information, an AR target in the real world is recognised, and virtual information (also called AR content) is superimposed on the real AR target, so that in addition to the real AR target the user also sees the AR content associated with it.
In this AR application model, particular emphasis is placed on the accuracy of spatial tracking and registration between the AR content and the AR target. When the user observes the AR target through the camera, the AR content (for example a virtual 3D object) follows the AR target as the lens rotates or the target moves, and the user can interact with the AR content, for example by clicking, zooming in, zooming out, or rotating it.
Studying the prior art, the inventors found that when a user runs an AR application on a mobile terminal, the camera must stay aimed at the AR target; otherwise the superimposed AR content shifts position as the objects in the field of view move. Under normal circumstances, however, a user viewing AR content does not want it to move along with the scene, and requiring the user to hold the terminal steadily on the target constrains the user's behaviour and increases the user's burden. Yet if the terminal is moved away, the superimposed AR content disappears under the existing processing flow, degrading the user experience.
Summary of the invention
Embodiments of the present invention provide an augmented reality processing method and apparatus for a mobile terminal, so as to implement freeze-frame processing of images, reduce the constraints placed on the user, and improve the effect of AR processing.
In a first aspect, an embodiment of the present invention provides an augmented reality processing method for a mobile terminal, comprising:
obtaining a real-time image collected by a camera and caching the real-time image;
performing augmented reality (AR) processing on the real-time image to generate a first AR image, and displaying the first AR image; and
judging whether to perform freeze-frame processing; if so, selecting, from the cached real-time images within a first preset time range of the current time, one cached real-time image as the freeze frame, and performing AR processing on the freeze frame to generate and display an AR freeze-frame image.
In a first possible implementation, judging whether to perform freeze-frame processing specifically comprises:
detecting whether the mobile terminal remains stationary within a second preset time range, and if so, performing freeze-frame processing.
With reference to the first possible implementation of the first aspect, in a second possible implementation, detecting whether the mobile terminal remains stationary within the second preset time range specifically comprises:
judging, according to gravitational-acceleration information collected by an accelerometer and azimuth information collected by a digital compass within the second preset time range, whether the mobile terminal remains stationary within that range, and if so, performing freeze-frame processing.
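The sensor-based stationarity check described above can be sketched as follows. This is an illustrative Python sketch, not the patent's implementation: the sample format, the tolerance values and the all-samples-within-tolerance rule are assumptions.

```python
import math

def is_stationary(accel_samples, azimuth_samples,
                  accel_tol=0.05, azimuth_tol=2.0):
    """Treat the terminal as stationary when the gravity-acceleration
    vectors and compass azimuths collected over the second preset time
    window all stay within small tolerances of the first sample.
    accel_samples: list of (x, y, z) in g; azimuth_samples: degrees."""
    if not accel_samples or not azimuth_samples:
        return False
    ax0, ay0, az0 = accel_samples[0]
    for ax, ay, az in accel_samples[1:]:
        if math.dist((ax, ay, az), (ax0, ay0, az0)) > accel_tol:
            return False
    a0 = azimuth_samples[0]
    for a in azimuth_samples[1:]:
        # compass headings wrap around at 360 degrees
        if min(abs(a - a0), 360 - abs(a - a0)) > azimuth_tol:
            return False
    return True
```

In a real implementation the samples would come from the platform's sensor APIs and the tolerances would be tuned against hand-tremor noise.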
In a third possible implementation, selecting one cached real-time image within the first preset time range of the current time as the freeze frame specifically comprises:
for each cached real-time image, generating a position weight according to the position of the AR target in that image, and selecting the image with the largest position weight as the freeze frame.
In a fourth possible implementation, selecting one cached real-time image within the first preset time range of the current time as the freeze frame specifically comprises:
for each cached real-time image, generating a position weight according to the position of the AR target in the image, an area weight according to the proportion of the image area occupied by the AR target, and a sharpness weight according to the sharpness of the AR target, and determining the freeze frame from the position, area and sharpness weights of each cached image.
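The three-weight frame selection can be illustrated with a short Python sketch. The normalised feature values, the weight coefficients and the linear combination are assumptions; the patent only states that the position, area and sharpness weights jointly determine the freeze frame.

```python
def select_freeze_frame(frames, w_pos=0.4, w_area=0.3, w_sharp=0.3):
    """Score each cached frame by how well its AR target is framed.
    Each frame dict carries: center_offset (0 = target centred, 1 = at
    the edge), area_ratio (fraction of the image the target occupies),
    and sharpness (0..1, e.g. a normalised local-contrast score)."""
    def score(f):
        position_weight = 1.0 - f["center_offset"]   # centred is better
        area_weight = f["area_ratio"]                # larger is better
        sharpness_weight = f["sharpness"]            # crisper is better
        return (w_pos * position_weight
                + w_area * area_weight
                + w_sharp * sharpness_weight)
    return max(frames, key=score)

frames = [
    {"id": 1, "center_offset": 0.50, "area_ratio": 0.20, "sharpness": 0.90},
    {"id": 2, "center_offset": 0.10, "area_ratio": 0.25, "sharpness": 0.80},
]
best = select_freeze_frame(frames)  # frame 2: well centred, slightly larger
```

The third possible implementation is the degenerate case of this sketch with `w_area = w_sharp = 0`.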
In a fifth possible implementation, performing AR processing on the real-time image to generate and display the first AR image comprises:
obtaining cached reference-position information of a first AR target, tracking the real-time image according to that reference-position information, performing a three-dimensional registration computation from the tracked AR target and standard-size information of the first AR target to generate a first rotation parameter and a first translation parameter, and caching both; and
obtaining cached first AR content, performing virtual-real fusion and rendering on the real-time image and the first AR content according to the first rotation parameter and the first translation parameter, and generating and displaying the first AR image.
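The role of the rotation and translation parameters in virtual-real fusion can be illustrated by the transform they imply. The following pure-Python sketch (a real implementation would render on the GPU) applies an assumed 3x3 rotation matrix and translation vector to the virtual content's model-space points to place them in camera space before compositing over the live frame.

```python
def transform_points(points, rotation, translation):
    """Apply the rotation matrix and translation vector produced by
    three-dimensional registration to the virtual content's model-space
    points, yielding camera-space points ready for projection and
    compositing. rotation: 3x3 row-major nested list; translation:
    length-3 list; points: list of [x, y, z]."""
    out = []
    for p in points:
        cam = [
            sum(rotation[r][c] * p[c] for c in range(3)) + translation[r]
            for r in range(3)
        ]
        out.append(cam)
    return out
```

Caching these two parameters per frame, as the method does, is what lets the seventh implementation re-render the freeze frame later without re-running registration.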
With reference to the fifth possible implementation of the first aspect, in a sixth possible implementation, before obtaining the cached reference-position information of the first AR target, the method further comprises:
performing feature detection and description on the real-time image to generate first feature-description data, and sending the first feature-description data to an AR server so that the AR server performs AR target detection on it;
receiving a first detection result sent by the AR server when the first AR target is detected, the first detection result carrying first AR target information that comprises reference-position information indicating the position of the first AR target in the real-time image and standard-size information indicating the size of the first AR target in a standard image, and caching the first AR target information;
stopping, according to the first detection result, the sending of the first feature-description data to the AR server; and
obtaining the first AR content of the first AR target from the AR server and caching the first AR content.
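The detect-then-stop-sending flow can be sketched as client-side logic. Everything here is illustrative: `FakeServer`, the dictionary shape of the detection result and the method names are stand-ins for whatever protocol an AR server actually exposes.

```python
class ARClient:
    """Send feature-description data to the AR server each frame until a
    target is reported; then cache the target info and its AR content and
    stop uploading. `server` is any object with detect(descriptors)
    returning None or a target dict, and fetch_content(target_id)."""

    def __init__(self, server):
        self.server = server
        self.cached_target = None   # reference position + standard size
        self.cached_content = None  # AR content for the detected target

    def process_frame(self, descriptors):
        if self.cached_target is not None:
            return self.cached_target  # already detected; stop uploading
        result = self.server.detect(descriptors)
        if result is not None:
            self.cached_target = result
            self.cached_content = self.server.fetch_content(result["target_id"])
        return self.cached_target

class FakeServer:
    """Stand-in server: detects the target on the second upload."""
    def __init__(self):
        self.calls = 0
    def detect(self, descriptors):
        self.calls += 1
        if self.calls >= 2:
            return {"target_id": 7, "ref_pos": (10, 20), "std_size": (64, 64)}
        return None
    def fetch_content(self, target_id):
        return f"content-{target_id}"

client = ARClient(FakeServer())
```

Note that after detection succeeds, further frames never reach `detect`, matching the "stop sending" step of the sixth implementation.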
With reference to the sixth possible implementation of the first aspect, in a seventh possible implementation, performing AR processing on the freeze frame to generate and display the AR freeze-frame image specifically comprises:
obtaining the cached first rotation parameter and first translation parameter corresponding to the freeze frame together with the first AR content, performing virtual-real fusion and rendering on the freeze frame and the first AR content according to those parameters, and generating and displaying the AR freeze-frame image.
With reference to the sixth possible implementation of the first aspect, in an eighth possible implementation, judging whether to perform freeze-frame processing specifically comprises:
judging, according to the first rotation parameters and first translation parameters generated within the second preset time range, whether the mobile terminal remains stationary within that range, and if so, performing freeze-frame processing.
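A pose-based stationarity check along these lines might compare successive registration outputs instead of raw sensor data. The flat matrix layout and the tolerance values below are assumptions made for illustration.

```python
def poses_stationary(poses, rot_tol=0.01, trans_tol=5.0):
    """Judge the terminal stationary when the rotation and translation
    parameters produced by 3D registration for successive frames in the
    second preset time window stay within tolerances of the first pose.
    poses: list of (rotation, translation), rotation as a flat 9-element
    list, translation as a 3-element list (e.g. millimetres)."""
    if len(poses) < 2:
        return False
    r0, t0 = poses[0]
    for r, t in poses[1:]:
        if any(abs(a - b) > rot_tol for a, b in zip(r, r0)):
            return False
        if any(abs(a - b) > trans_tol for a, b in zip(t, t0)):
            return False
    return True
```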
With reference to the fifth possible implementation of the first aspect, in a ninth possible implementation, the first AR target information further comprises AR target type information indicating the type of the first AR target; and
judging whether to perform freeze-frame processing specifically comprises:
performing freeze-frame processing if the first AR target type information indicates a browsing type.
With reference to the fifth possible implementation of the first aspect, in a tenth possible implementation, after judging whether to perform freeze-frame processing and, if so, selecting the freeze frame from the cached real-time images within the first preset time range of the current time, the method further comprises:
performing feature detection and description on the real-time images still being obtained to generate second feature-description data, and sending the second feature-description data to the AR server so that the AR server performs AR target detection on it;
receiving a second detection result sent by the AR server when a second AR target is detected, the second detection result carrying second AR target information that comprises reference-position information indicating the position of the second AR target in the real-time image and standard-size information indicating the size of the second AR target in the standard image, and caching the second AR target information;
stopping, according to the second detection result, the sending of the second feature-description data to the AR server;
caching the second AR target information and tracking the second AR target in the real-time image according to its reference-position information; if the second AR target is tracked within a third preset time range, obtaining the second AR content of the second AR target from the AR server, caching it, and generating and displaying a freeze-frame release indication;
if a freeze-frame release instruction is received, tracking the second AR target in the real-time image according to the cached reference-position information, performing a three-dimensional registration computation from the tracked second AR target and its standard-size information to generate a second rotation parameter and a second translation parameter, and caching both; and
performing virtual-real fusion and rendering on the real-time image and the second AR content according to the second rotation parameter and the second translation parameter, and generating and displaying a second AR image.
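The freeze-and-release sequence above amounts to a small state machine, sketched below in Python. Frame counts stand in for the patent's third preset time range, and the state names are invented for illustration.

```python
class FreezeController:
    """While a freeze frame is shown, live frames are still scanned for a
    second AR target; once that target has been tracked continuously for
    `hold_frames` frames (standing in for the third preset time range), a
    freeze-frame release prompt is offered, and an explicit release
    instruction switches back to live AR on the new target."""

    def __init__(self, hold_frames=3):
        self.hold_frames = hold_frames
        self.frozen = True
        self.tracked_run = 0       # consecutive frames tracking target 2
        self.prompt_shown = False

    def on_frame(self, second_target_tracked):
        if not self.frozen:
            return "live"
        self.tracked_run = self.tracked_run + 1 if second_target_tracked else 0
        if self.tracked_run >= self.hold_frames:
            self.prompt_shown = True
        return "frozen+prompt" if self.prompt_shown else "frozen"

    def release(self):
        """Handle the user's release instruction; only honoured after the
        prompt has been shown. Returns True when live AR resumes."""
        if self.prompt_shown:
            self.frozen = False
        return not self.frozen
```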
In a second aspect, an embodiment of the present invention provides an augmented reality processing apparatus for a mobile terminal, comprising:
an image acquisition unit, configured to obtain a real-time image collected by a camera and cache the real-time image;
a first augmented reality processing unit, connected to the image acquisition unit and configured to perform augmented reality (AR) processing on the real-time image to generate a first AR image and display the first AR image; and
a freeze-frame processing unit, configured to judge whether to perform freeze-frame processing and, if so, to select one cached real-time image within a first preset time range of the current time as the freeze frame, and to perform AR processing on the freeze frame to generate and display an AR freeze-frame image.
In a first possible implementation, the freeze-frame processing unit is specifically configured to detect whether the mobile terminal remains stationary within a second preset time range and, if so, to perform freeze-frame processing.
With reference to the first possible implementation of the second aspect, in a second possible implementation, the freeze-frame processing unit is specifically configured to judge, according to gravitational-acceleration information collected by an accelerometer and azimuth information collected by a digital compass within the second preset time range, whether the mobile terminal remains stationary within that range and, if so, to perform freeze-frame processing.
In a third possible implementation, the freeze-frame processing unit is specifically configured, for each cached real-time image, to generate a position weight according to the position of the AR target in that image and to select the image with the largest position weight as the freeze frame.
In a fourth possible implementation, the freeze-frame processing unit is specifically configured, for each cached real-time image, to generate a position weight according to the position of the AR target in the image, an area weight according to the proportion of the image area occupied by the AR target, and a sharpness weight according to the sharpness of the AR target, and to determine the freeze frame from the position, area and sharpness weights of each cached image.
In a fifth possible implementation, the first augmented reality processing unit comprises:
a first tracking and registration subunit, connected to the image acquisition unit and configured to obtain cached reference-position information of a first AR target, track the real-time image according to it, perform a three-dimensional registration computation from the tracked AR target and standard-size information of the first AR target to generate a first rotation parameter and a first translation parameter, and cache both; and
a first rendering subunit, connected to the first tracking and registration subunit and configured to obtain cached first AR content, perform virtual-real fusion and rendering on the real-time image and the first AR content according to the first rotation parameter and the first translation parameter, and generate and display the first AR image.
With reference to the fifth possible implementation of the second aspect, in a sixth possible implementation, the first augmented reality processing unit further comprises:
a first detection subunit, connected to the image acquisition unit and configured to perform feature detection and description on the real-time image to generate first feature-description data and send it to an AR server so that the AR server performs AR target detection on it;
a first receiving subunit, configured to receive a first detection result sent by the AR server when the first AR target is detected, the first detection result carrying first AR target information that comprises reference-position information indicating the position of the first AR target in the real-time image and standard-size information indicating its size in a standard image, and to cache the first AR target information;
a first control subunit, connected to the first detection subunit and the first receiving subunit and configured to stop, according to the first detection result, the sending of the first feature-description data to the AR server; and
a first obtaining subunit, configured to obtain the first AR content of the first AR target from the AR server and cache it.
With reference to the sixth possible implementation of the second aspect, in a seventh possible implementation, the freeze-frame processing unit is specifically configured to obtain the cached first rotation parameter and first translation parameter corresponding to the freeze frame together with the first AR content, perform virtual-real fusion and rendering on the freeze frame and the first AR content according to those parameters, and generate and display the AR freeze-frame image.
With reference to the sixth possible implementation of the second aspect, in an eighth possible implementation, the freeze-frame processing unit is specifically configured to judge, according to the first rotation parameters and first translation parameters generated within the second preset time range, whether the mobile terminal remains stationary within that range and, if so, to perform freeze-frame processing.
With reference to the fifth possible implementation of the second aspect, in a ninth possible implementation, the first AR target information further comprises AR target type information indicating the type of the first AR target; and
the freeze-frame processing unit is specifically configured to perform freeze-frame processing if the first AR target type information indicates a browsing type.
With reference to the fifth possible implementation of the second aspect, in a tenth possible implementation, the augmented reality processing apparatus further comprises a second augmented reality processing unit, which comprises:
a second detection subunit, connected to the image acquisition unit and configured to perform feature detection and description on the real-time image to generate second feature-description data and send it to the AR server so that the AR server performs AR target detection on it;
a second receiving subunit, configured to receive a second detection result sent by the AR server when a second AR target is detected, the second detection result carrying second AR target information that comprises reference-position information indicating the position of the second AR target in the real-time image and standard-size information indicating its size in the standard image;
a second control subunit, connected to the second detection subunit and the second receiving subunit and configured to stop, according to the second detection result, the sending of the second feature-description data to the AR server;
a caching subunit, connected to the image acquisition unit and the second receiving subunit and configured to cache the second AR target information, track the second AR target in the real-time image according to its reference-position information, and, if the second AR target is tracked within a third preset time range, obtain the second AR content of the second AR target from the AR server, cache it, and generate and display a freeze-frame release indication;
a second tracking and registration subunit, connected to the image acquisition unit and configured, if a freeze-frame release instruction is received, to track the second AR target in the real-time image according to the cached reference-position information, perform a three-dimensional registration computation from the tracked second AR target and its standard-size information to generate a second rotation parameter and a second translation parameter, and cache both; and
a second rendering subunit, connected to the second tracking and registration subunit and configured to perform virtual-real fusion and rendering on the real-time image and the second AR content according to the second rotation parameter and the second translation parameter, and generate and display a second AR image.
With the augmented reality processing method and apparatus for a mobile terminal provided by this embodiment, the processing apparatus obtains real-time images collected by the camera, caches them, performs AR processing on them to generate and display first AR images, and judges whether to perform freeze-frame processing; if so, it selects one cached real-time image within the first preset time range of the current time as the freeze frame and performs AR processing on it to generate and display an AR freeze-frame image. By judging when freeze-frame processing is needed and, at that moment, selecting a frame from the cache for AR processing and display, the user can view the frozen AR image conveniently; the constraints placed on the user are reduced and the effect of AR processing is greatly improved.
Brief description of the drawings
To describe the technical solutions of the embodiments of the present invention or of the prior art more clearly, the accompanying drawings used in those descriptions are briefly introduced below. Obviously, the drawings described here show only some embodiments of the present invention, and a person of ordinary skill in the art may derive other drawings from them without creative effort.
Fig. 1 is a flowchart of a first augmented reality processing method for a mobile terminal according to an embodiment of the present invention;
Fig. 2 is a flowchart of a second augmented reality processing method for a mobile terminal according to an embodiment of the present invention;
Fig. 3 is a flowchart of a third augmented reality processing method for a mobile terminal according to an embodiment of the present invention;
Fig. 4 is a schematic diagram of a freeze-frame judgment processing flow according to an embodiment of the present invention;
Fig. 5 is a schematic diagram of another freeze-frame judgment processing flow according to an embodiment of the present invention;
Fig. 6 is a schematic diagram of an augmented reality processing flow after freeze-frame according to an embodiment of the present invention;
Fig. 7 is a schematic diagram of another augmented reality processing flow after freeze-frame according to an embodiment of the present invention;
Fig. 8 is a structural diagram of a first augmented reality processing apparatus for a mobile terminal according to an embodiment of the present invention;
Fig. 9 is a structural diagram of a second augmented reality processing apparatus for a mobile terminal according to an embodiment of the present invention;
Fig. 10 is a structural diagram of a third augmented reality processing apparatus for a mobile terminal according to an embodiment of the present invention;
Fig. 11 is a structural diagram of a fourth augmented reality processing apparatus for a mobile terminal according to an embodiment of the present invention.
Detailed description of embodiments
To make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments are described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by a person of ordinary skill in the art based on these embodiments without creative effort fall within the protection scope of the present invention.
Fig. 1 is a flowchart of a first augmented reality processing method for a mobile terminal according to an embodiment of the present invention. As shown in Fig. 1, the method of this embodiment can be applied to the AR processing procedure of a mobile terminal integrated with an augmented reality (AR) application. The mobile terminal may specifically be a terminal device such as a mobile phone, a digital camera, a notebook computer or a tablet computer. The method may be performed by an augmented reality processing apparatus, which can be integrated in the mobile terminal.
The augmented reality processing method for a mobile terminal provided by this embodiment specifically comprises:
Step 101: obtain a realtime image collected by the camera, and cache the realtime image;
Step 102: perform augmented reality (AR) processing on the realtime image to generate a first AR image, and display the first AR image;
Step 103: judge whether freeze processing is needed; if so, determine one frame of realtime image, from the cached realtime images within a first preset time range of the current time, as a freeze picture, perform AR processing on the freeze picture to generate an AR freeze picture, and display the AR freeze picture.
Specifically, the camera of the mobile terminal collects realtime images, the augmented reality processing apparatus displays the realtime images obtained from the camera on the display screen of the mobile terminal, and the user watches the realtime images through the display screen. In practice, a realtime image buffer area can be set up in the storage unit of the mobile terminal. The current frame of realtime image is cached in this realtime image buffer area, and all subsequent processing of the realtime image obtains it from this buffer area. For example, the realtime image is fetched from the realtime image buffer area and shown on the display screen.
When the user has started the AR application, the augmented reality processing apparatus performs AR processing on the realtime image collected by the camera; it may obtain the realtime image from the realtime image buffer area and perform AR processing on the obtained realtime image. The AR processing specifically comprises: first identifying a first AR target in the realtime image, tracking and registering the first AR target, then fusing and rendering first AR content with the first AR target, generating a first AR image, and displaying the first AR image to the user through the display screen. The AR target is the object on which AR processing needs to be performed, and the AR content is virtual information, such as a virtual 3D object. For example, in a virtual fitting AR application, the person in the realtime image is the AR target and the virtual clothes are the AR content; the generated AR image shows the person trying on the virtual clothes, so what the user sees through the display screen is the effect of the person wearing the clothes. When the person moves in front of the camera, because the realtime image is processed in real time, the clothes in the AR image shown to the user move as well. The AR content can be stored locally in the storage unit of the mobile terminal, or stored at an AR server; when the AR content is stored at the AR server, the mobile terminal obtains the AR content from the AR server.
During the AR processing of the realtime image, image information can be generated. The image information includes the information produced while performing AR processing on the realtime image, such as the position information of the AR target in the realtime image, the sharpness of the realtime image, the three-dimensional registration information of the AR target in the realtime image, and time information. Every frame of realtime image has corresponding image information.
Specifically, the image information can also be cached. Both the realtime images and the image information can be buffered in an image queue buffer area of the storage unit, which caches the image information generated by the AR processing of the realtime images within a first preset time range of the current time; multiple pieces of image information can be cached in the image queue buffer area. Caching image information into the image queue buffer area can be implemented as follows:
record the current time t, and judge, according to the difference between the current time t and the cache time t1 of the first piece of image information in the image queue buffer area, whether the first preset time range T1 has been exceeded, that is, whether the image queue buffer area needs to be updated by removing some image information;
remove from the image queue buffer area the first i pieces of image information that satisfy (t - ti > T1), and set t1 to the cache time of the first piece of image information in the updated image queue buffer area.
The first preset time range T1 can be dynamically adjusted according to the size of a single frame of realtime image collected by the camera and the capacity of the storage unit of the user's mobile terminal. For example, the initial value of T1 can be set to 5 s. The size of a single frame can be determined from the realtime image cached in the realtime image buffer area. Suppose the size of a single camera frame is q, the number of frames the AR application processes per second is r, and the memory of the mobile terminal is a. To prevent the image queue buffer area from occupying too much of the storage unit, the storage space it occupies can be limited to no more than 5% of the storage capacity. In practice, q and r may change in real time, so the storage space occupied by the image queue buffer area can be checked in real time or at preset intervals. If q × r × T1 > 5% × a, the image queue buffer area exceeds 5% of the storage capacity, and T1 can be set to T1 = (5% × a) / (q × r), so that by adjusting the first preset time range T1 the storage space occupied by the image queue buffer area is kept at no more than 5% of the total storage capacity.
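The cache-window bookkeeping described above can be sketched as follows; the function and variable names are illustrative and not taken from the embodiment itself:

```python
def adjust_window(t1_window, frame_bytes, fps, mem_bytes, cap_ratio=0.05):
    """Shrink the caching window T1 whenever q * r * T1 would exceed
    cap_ratio (here 5%) of the terminal's storage capacity a."""
    if frame_bytes * fps * t1_window > cap_ratio * mem_bytes:
        t1_window = (cap_ratio * mem_bytes) / (frame_bytes * fps)
    return t1_window

def evict_stale(queue, now, t1_window):
    """Remove the leading entries whose age exceeds T1.
    Each entry is a (cache_time, image_info) pair, oldest first."""
    while queue and now - queue[0][0] > t1_window:
        queue.pop(0)
    return queue
```

With a 1 MB frame at 30 frames per second and 1 GB of storage, the 5 s initial window would be cut back to about 1.67 s by the rule above.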
When the user needs to freeze the image shown on the display, the frozen AR image can be shown to the user through the display screen. There are multiple ways to trigger the freeze processing:
In one implementation, the user triggers the freeze processing flow manually. A freeze start button can be preset on the AR application interface, and the user triggers the freeze processing flow by operating this button. When the display screen is a touch screen and the user wishes to use the freeze function to examine the AR image, the user can directly tap the freeze start button, which manually triggers the freeze processing; the AR application obtains the freeze indication input by the user, sets the freeze function flag to started, and performs the freeze processing.
In another implementation, the augmented reality processing apparatus can automatically judge whether freeze processing is needed. For example, when the mobile terminal remains stationary within a certain time range, it can be judged that the user wishes to examine a frozen AR image; the freeze function flag is set to started and the freeze processing is performed.
In yet another implementation, the augmented reality processing apparatus can perform the freeze processing according to the type information of the AR target. If the type information of the AR target indicates a browsing type, for example a still life such as a book or another stationary display article, this indicates that the user wishes to see a static AR image effect, and the freeze processing is performed. Alternatively, the processor learns from the type information of the AR target that the target is interactive, and the user needs to touch or click interactive information to learn more AR content; when the quantity of this interactive information reaches a certain threshold, it is judged that the user wants to examine the AR content in detail, and the freeze processing is performed.
If it is judged that freeze processing is needed, one frame of realtime image with a good effect is determined from the image queue buffer area as the freeze picture. In practice, multiple frames of realtime images can also be determined from the image queue buffer area as freeze pictures; the determined freeze pictures are shown to the user through the display screen, the user selects one of them, and AR processing is then performed on the selected freeze picture to generate an AR freeze picture, which is shown on the display screen. Alternatively, AR processing can be performed on all of the determined freeze pictures to generate multiple AR freeze pictures, which are shown to the user in split-screen form; the multiple AR freeze pictures present the AR effect from multiple angles. Alternatively, the single frame of realtime image with the best effect can be determined from the image queue buffer area as the freeze picture, and AR processing performed on this freeze picture. If it is judged that freeze processing is not needed, step 101 continues to be performed: AR processing is performed on the realtime image, and the generated AR image is shown to the user through the display screen.
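Since each cached piece of image information carries the sharpness of its frame, the "good effect" candidates can be chosen by ranking on that field. A minimal sketch, with illustrative field names not taken from the embodiment:

```python
def pick_freeze_frames(image_infos, k=1):
    """Rank cached image information by sharpness and return the k best
    candidates: k=1 for a single freeze picture, k>1 for the
    split-screen or user-selection variants described above."""
    ranked = sorted(image_infos, key=lambda info: info["sharpness"], reverse=True)
    return ranked[:k]
```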
During the freeze processing, although what is shown to the user is the AR freeze picture, the augmented reality processing apparatus also assigns other threads to continue obtaining realtime images and performing tracking processing on them.
When the user no longer needs to view the AR freeze picture, the freeze can be released. The freeze can likewise be released in multiple ways: a function button for releasing the freeze can be shown to the user, who operates it manually to release the freeze function; or, after the AR target has been lost and a new AR target is detected, the user is automatically prompted to release the freeze. When the user releases the freeze, the freeze function flag is set to not started, the realtime image tracking flow that has been running in the background continues, three-dimensional registration calculation is performed on the tracked AR target, virtual-real fusion and rendering are performed according to the AR content, and an AR image is generated and displayed.
In the augmented reality processing method for a mobile terminal provided by this embodiment, the augmented reality processing apparatus obtains the realtime image collected by the camera, caches the realtime image, performs augmented reality (AR) processing on the realtime image to generate a first AR image, displays the first AR image, and judges whether freeze processing is needed; if so, it determines one frame of realtime image, from the cached realtime images within the first preset time range of the current time, as the freeze picture, performs AR processing on the freeze picture to generate an AR freeze picture, and displays the AR freeze picture. By judging whether freeze processing is needed and, when it is, determining a frame of realtime image from the cache and performing AR processing on it to generate and display an AR freeze picture, the user can conveniently view the frozen AR image; the constraints on the user are reduced, and the effect of the AR processing is greatly improved.
Fig. 2 is a flowchart of a second augmented reality processing method for a mobile terminal according to an embodiment of the present invention. As shown in Fig. 2, in this embodiment, step 102, in which AR processing is performed on the realtime image to generate a first AR image and the first AR image is displayed, can specifically comprise the following steps:
Step 205: obtain the cached first AR target reference position information; track the realtime image according to the cached first AR target reference position information; perform three-dimensional registration calculation according to the tracked AR target and the first AR target standard dimension information, generating a first rotation parameter and a first translation parameter; and cache the first rotation parameter and the first translation parameter;
Step 206: obtain the cached first AR content; perform virtual-real fusion and rendering on the realtime image and the first AR content according to the first rotation parameter and the first translation parameter; and generate and display the first AR image.
Specifically, an AR server can be set up to provide AR application services for the mobile terminal. The augmented reality processing apparatus can take the realtime image as a detection image, perform feature detection and description on it, generate first feature detection description data, and send the first feature detection description data to the AR server. The database of the AR server stores the feature detection data of standard images; the AR server matches the received first feature detection description data against the feature detection data of the standard images in the database. If the matching succeeds, the first AR target is detected in the detection image, and a first detection result indicating that the first AR target has been detected is generated. In practice, the augmented reality processing apparatus can also send the realtime image directly to the AR server as the detection image; the AR server then performs feature detection and description on the detection image, generates the first feature detection description data, and matches it against the feature detection data of the standard images in the database. The matching of the detection image against the standard images can also be implemented with other image matching methods and is not limited to using feature detection data.
The augmented reality processing apparatus receives the first detection result sent by the AR server when the first AR target is detected. The first detection result carries first AR target information, which comprises: first AR target reference position information indicating the position of the first AR target in the realtime image, and first AR target standard dimension information indicating the size of the first AR target in the standard image; the first AR target information is cached. The first AR target information can also comprise the type information of the first AR target, the feature information of the first AR target, and so on. An AR target buffer area can be set up in the storage unit to cache the first AR target information. After receiving the first detection result sent by the AR server, the augmented reality processing apparatus can stop sending realtime images to the AR server, to avoid repeated detection by the AR server.
The augmented reality processing apparatus downloads the first AR content corresponding to the first AR target from the AR server; the AR server can also carry the first AR content in the first detection result sent to the mobile terminal. An AR content buffer area can be set up in the storage unit to cache the first AR content.
For the processing of subsequently collected realtime images, the augmented reality processing apparatus can obtain the cached first AR target reference position information from the AR target buffer area of the storage unit and track the AR target in the realtime image according to the first AR target reference position information; the realtime image used for target tracking can be taken as the tracking image. During the tracking of the first AR target, first tracking information can be generated, which specifically comprises the position information of the first AR target in the tracking image, the sharpness of the tracking image, and similar information. An AR target position buffer area can also be set up in the storage unit, and the first tracking information generated during tracking, together with the three-dimensional registration information of the AR target in the realtime image generated during the three-dimensional registration calculation, such as the first rotation parameter and the first translation parameter, is cached into this AR target position buffer area. If, during the tracking of the first AR target, the first AR target is not traced in the tracking image, the first AR target has been lost or has left the acquisition range of the camera; the AR target buffer area and the AR target position buffer area are then emptied, and realtime images are sent to the AR server again so that the AR server performs AR target detection on them. The subsequent processing flow is as described above and is not repeated here. If the AR target detected again is still the first AR target, the first AR content does not need to be downloaded again; it can be obtained directly from the AR content buffer area. If the AR target detected again is not the first AR target, the AR content buffer area is emptied, and the AR content corresponding to the new AR target is downloaded from the AR server.
The augmented reality processing apparatus performs virtual-real fusion and rendering on the realtime image and the first AR content according to the first rotation parameter and the first translation parameter, generates the first AR image, and displays it on the display screen.
In this embodiment, before step 205 of obtaining the cached first AR target reference position information, the method can further comprise:
Step 201: perform feature detection and description on the realtime image, generate first feature detection description data, and send the first feature detection description data to the AR server, so that the AR server performs AR target detection according to the first feature detection description data;
Step 202: receive the first detection result sent by the AR server when the first AR target is detected, where the first detection result carries first AR target information comprising first AR target reference position information indicating the position of the first AR target in the realtime image and first AR target standard dimension information indicating the size of the first AR target in the standard image, and cache the first AR target information;
Step 203: stop sending the first feature detection description data to the AR server according to the first detection result;
Step 204: obtain the first AR content of the first AR target from the AR server, and cache the first AR content.
Fig. 3 is a flowchart of a third augmented reality processing method for a mobile terminal according to an embodiment of the present invention. As shown in Fig. 3, the implementation of the augmented reality processing method for a mobile terminal provided by this embodiment is specifically as follows:
Step 31: obtain the realtime image collected by the camera, and cache the realtime image into the realtime image buffer area;
Step 32: judge whether the AR target position buffer area has cached AR target reference position information. If no AR target reference position information is cached in the AR target position buffer area, either the AR application has just started and the AR target position buffer area is empty, or the AR target was not traced during AR target tracking and the AR target position buffer area has been emptied; perform step 33. If AR target reference position information is cached in the AR target position buffer area, perform step 39;
Step 33: when the AR application has just started or the AR target has been lost, take the realtime image as the detection image, perform feature detection and description on the detection image, generate feature detection description data, and send the feature detection description data to the AR server;
Step 34: the AR server matches the feature detection description data against the feature detection description data of the standard images in its database. If the matching succeeds, the AR target is detected in the detection image, and a detection result indicating that the AR target has been detected is generated and sent to the mobile terminal; if the matching fails, the AR server generates a detection result indicating that no AR target has been detected and sends it to the mobile terminal;
Step 35: if the mobile terminal learns from the detection result that the AR target has been detected, it stops sending detection images to the AR server and performs step 36; if it learns from the detection result that no AR target has been detected, it performs step 31;
Step 36: the mobile terminal downloads the AR target information from the AR server, this AR target information comprising the AR target reference position information indicating the position of the AR target in the detection image, the AR target standard dimension information indicating the size of the AR target in the standard image, the type information of the AR target, the feature information of the AR target and so on, and caches the AR target information into the AR target position buffer area;
Step 37: judge whether the AR content corresponding to the AR target is stored in the AR content buffer area; if so, perform step 310; if not, perform step 38;
Step 38: the mobile terminal downloads the AR content corresponding to the AR target from the AR server, and caches the AR content into the AR content buffer area;
When the AR application has just started or the AR target was not traced during AR target tracking, the AR target position buffer area is empty, and this realtime image is the first one in which the AR server has detected the AR target; no AR target tracking processing is performed on this realtime image, and three-dimensional registration calculation is performed on it directly according to the AR target reference position information in the AR target information sent by the AR server, that is, step 312 is performed;
Step 39: obtain the AR target reference position information from the AR target position buffer area;
Step 310: take the realtime image as the tracking image, and track the AR target in the tracking image according to the AR target reference position information;
Step 311: if the AR target is traced in the tracking image, perform step 312. During the tracking of the tracking image, tracking information can be generated, which can specifically comprise the position information of the AR target in the tracking image, the sharpness of the tracking image, time information and so on. If the AR target is not traced in the tracking image, perform step 31;
Step 312: calculate the three-dimensional registration information, such as the rotation parameter and the translation parameter, of the AR target according to the tracking information, the AR target reference position information and AR target standard dimension information in the AR target position buffer area, and camera parameters such as the focal length and optical centre of the camera; cache the information produced in step 311 and step 312 into the image queue buffer area as image information;
Specifically, camera parameters such as the focal length and optical centre of the camera can be calculated according to the AR target reference position information and AR target standard dimension information in the AR target position buffer area; the three-dimensional registration information, such as the rotation parameter and the translation parameter, of the AR target is then obtained by calculation from the AR target reference position information and AR target standard dimension information in the AR target position buffer area and the camera parameters such as the focal length and optical centre.
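The embodiment does not spell out the registration mathematics. As an illustration only, the translation part can be sketched with a pinhole camera model, assuming a known focal length f and optical centre (cx, cy), the tracked target centre (u, v), its apparent width in pixels, and its standard (real-world) width; all names are hypothetical, and a full three-dimensional registration would also recover the rotation parameters:

```python
def estimate_translation(f, cx, cy, u, v, width_px, width_std):
    """Pinhole-model sketch: the ratio of the target's standard width to
    its apparent width in pixels gives the depth tz; the offset of the
    target centre from the optical centre scaled by tz/f gives tx, ty."""
    tz = f * width_std / width_px
    tx = (u - cx) * tz / f
    ty = (v - cy) * tz / f
    return tx, ty, tz
```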
Step 313: according to the three-dimensional registration information, perform virtual-real fusion and rendering on the AR content in the AR content buffer area and the AR target in the current tracking image to generate the AR image, and show it to the user through the display screen;
Step 314: judge whether freeze processing is needed; if so, perform step 315; if not, perform step 31;
Step 315: set the freeze function flag to started, and determine the frame of realtime image with the best effect from the image queue buffer area as the freeze picture;
Step 316: according to the image information of the freeze picture, obtain the three-dimensional registration information of the freeze picture from the AR target position buffer area, and obtain the AR content from the AR content buffer area;
Step 317: perform virtual-real fusion and rendering on the freeze picture according to the three-dimensional registration information and the AR content, and generate and display the AR freeze picture;
Step 318: judge whether the freeze is released; if so, perform step 319 and then step 31; if not, perform step 317;
Step 319: set the freeze function flag to not started, and empty the image queue buffer area.
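Steps 314 to 319 amount to a small state machine around the freeze function flag: entering the freeze fixes the displayed picture, and releasing it clears the flag and empties the image queue buffer area. A sketch with illustrative names:

```python
class FreezeController:
    """Illustrative sketch of steps 314-319; not the embodiment's
    actual implementation."""

    def __init__(self):
        self.frozen = False          # the freeze function flag
        self.image_queue = []        # the image queue buffer area
        self.freeze_picture = None

    def freeze(self, best_frame):
        # Steps 315-317: set the flag and fix the displayed picture.
        self.frozen = True
        self.freeze_picture = best_frame

    def release(self):
        # Steps 318-319: clear the flag and empty the image queue.
        self.frozen = False
        self.freeze_picture = None
        self.image_queue.clear()
```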
In this embodiment, in step 103, performing AR processing on the freeze picture to generate and display the AR freeze picture can specifically be:
obtaining the cached first rotation parameter and first translation parameter corresponding to the freeze picture and the first AR content, performing virtual-real fusion and rendering on the freeze picture and the first AR content according to the first rotation parameter and first translation parameter corresponding to the freeze picture, and generating and displaying the AR freeze picture.
For details of the AR processing of the freeze picture, reference can be made to the detailed description of the AR processing of realtime images, which is not repeated here.
In this embodiment, in step 103, judging whether freeze processing is needed can specifically be:
detecting whether the mobile terminal remains stationary within a second preset time range, and if so, performing the freeze processing.
There are multiple ways to detect whether the mobile terminal remains stationary within the second preset time range; the detection can be done with hardware or calculated in software.
In this embodiment, detecting whether the mobile terminal remains stationary within the second preset time range and, if so, performing the freeze processing can specifically be:
judging, according to the gravitational acceleration information collected by a gravity accelerometer and the azimuth information collected by a digital compass within the second preset time range, whether the mobile terminal remains stationary within the second preset time range, and if so, performing the freeze processing.
In one implementation, the mobile terminal can be provided with a gravity accelerometer and a digital compass; the gravity accelerometer collects gravitational acceleration information, the digital compass collects azimuth information, and whether the mobile terminal is stationary can be judged from the gravitational acceleration information and the azimuth information.
Specifically, a hardware parameter queue buffer area can be set up in the storage unit, and the gravitational acceleration information collected by the gravity accelerometer, the azimuth information collected by the digital compass, and the time of caching are cached as one element at the tail of the hardware parameter queue buffer area. The time information of the first element stored in the hardware parameter queue buffer area is t1, and the current time is t.
Fig. 4 is a schematic diagram of a freeze-judgment processing flow according to an embodiment of the present invention. As shown in Fig. 4, the steps of the freeze-judgment process are specifically as follows:
Step 41: judge whether the time difference between the current time t and the time information t1 of the first element stored in the hardware parameter queue buffer area exceeds the second preset time range Ts; if it exceeds Ts, perform step 42; if it does not exceed Ts, perform step 46;
Step 42: calculate the changes of the gravity accelerometer and the digital compass within the second preset time range Ts according to the gravitational acceleration information and azimuth information in each element; the specific calculation is:
the change of the gravity accelerometer within (t - t1) seconds is

g_diff = Σ_{i=1}^{n-1} (g_{i+1} - g_i),

where n is the number of elements cached in the hardware parameter buffer area, g_i is the i-th gravity accelerometer reading cached in the hardware parameter buffer area, and t_i is the time at which the i-th gravity accelerometer reading was cached;

the change of the digital compass within (t - t1) seconds is

c_diff = Σ_{i=1}^{n-1} (c_{i+1} - c_i),

where n is the number of elements cached in the hardware parameter buffer area, c_i is the i-th digital compass reading cached in the hardware parameter buffer area, and t_i is the time at which the i-th digital compass reading was cached;
Step 43: remove from the hardware parameter queue buffer area the first i elements that satisfy (t - ti > Ts), and set t1 to the time information of the first element in the updated hardware parameter queue buffer area;
Step 44: if the change g_diff of the gravity accelerometer within the second preset time range Ts calculated in step 42 is less than 0.5 m/s², perform step 45; otherwise, perform step 46;
Step 45: if the change c_diff of the digital compass within the second preset time range Ts calculated in step 42 is less than 5 degrees, perform step 47; otherwise, perform step 46;
Step 46: set the freeze function flag to not started;
Step 47: set the freeze function flag to started.
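Steps 41 to 47 can be sketched as follows. Absolute successive differences are used in this sketch so that opposite movements do not cancel out; all names and the sampled values are illustrative:

```python
def is_stationary(samples, now, ts, g_thresh=0.5, c_thresh=5.0):
    """samples: list of (time, g, c) tuples cached from the gravity
    accelerometer and digital compass, oldest first. Returns None while
    the window is shorter than Ts (step 41 branching to step 46),
    otherwise whether both the 0.5 m/s^2 and 5-degree thresholds hold."""
    if not samples or now - samples[0][0] <= ts:
        return None
    g_diff = sum(abs(b[1] - a[1]) for a, b in zip(samples, samples[1:]))
    c_diff = sum(abs(b[2] - a[2]) for a, b in zip(samples, samples[1:]))
    return g_diff < g_thresh and c_diff < c_thresh
```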
In the present embodiment, in step 103, described in the processing that judges whether to fix, be specifically as follows:
According to described the first rotation parameter generating and described the first translation parameters, judge whether described mobile terminal keeps stationary state within the scope of described the second Preset Time, and processing if so, fixes within the scope of the second Preset Time.
In another implementation, whether the mobile terminal remains stationary can be determined from the first rotation parameters and the first translation parameters generated during the 3D registration calculation of step 205.
Specifically, the 3D registration calculation yields the first rotation parameters rx, ry and rz and the first translation parameters tx, ty and tz, where rx, ry and rz respectively represent the rotation angles of the mobile terminal about the x, y and z axes, and tx, ty and tz respectively represent the translation amounts of the mobile terminal along the x, y and z directions. A 3D registration parameter queue buffer may be set up in the storage unit, and the first rotation parameters, the first translation parameters and the time of caching are cached together as one element at the tail of the queue.
What the 3D registration parameter queue buffer caches is the 3D registration parameter information obtained while continuously tracking the AR target within the second preset time range. If the AR target is lost, the contents of the 3D registration parameter queue buffer are cleared.
Let t1 be the time recorded in the first element of the 3D registration parameter queue buffer, and let t be the current time.
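The queue bookkeeping described above — caching each (rotation, translation, timestamp) element at the tail and keeping only elements within the second preset time range Ts — can be sketched as follows (a minimal Python illustration; the class name `RegistrationQueue` and its methods are assumptions, not part of the patent):

```python
from collections import deque

class RegistrationQueue:
    """Time-bounded queue of 3D registration parameters.

    Each element caches the first rotation parameters (rx, ry, rz),
    the first translation parameters (tx, ty, tz) and the time of caching.
    """

    def __init__(self, window_ts):
        self.window_ts = window_ts  # second preset time range Ts
        self.elements = deque()

    def append(self, rotation, translation, t):
        # Cache a new element at the tail of the queue.
        self.elements.append((rotation, translation, t))

    def prune(self, now):
        # Drop head elements older than Ts, so t1 (the time of the
        # first element) stays within the second preset time range.
        while self.elements and now - self.elements[0][2] > self.window_ts:
            self.elements.popleft()

    def clear(self):
        # Called when the AR target is lost.
        self.elements.clear()

q = RegistrationQueue(window_ts=2.0)
q.append((0.0, 0.0, 0.0), (0.0, 0.0, 0.0), t=0.0)
q.append((1.0, 0.5, 0.0), (2.0, 0.0, 0.0), t=1.5)
q.append((1.2, 0.6, 0.0), (2.5, 0.0, 0.0), t=3.0)
q.prune(now=3.0)  # the element cached at t=0.0 is older than Ts and is removed
```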
Fig. 5 is a schematic diagram of another freeze-determination flow provided by an embodiment of the present invention. As shown in Fig. 5, the steps of the freeze-determination process are specifically as follows:
Step 51: Determine whether the difference between the current time t and the time t1 of the first element stored in the 3D registration parameter queue buffer exceeds the second preset time range Ts; if it exceeds Ts, execute step 52; if it does not exceed Ts, execute step 56;
Step 52: Calculate the rotation variation and the translation variation of the first AR target within the second preset time range Ts from the first rotation parameters and the first translation parameters in each element; the specific computation is:
The rotation variation of the first AR target within the second preset time range Ts is:

r_diff = Σ_{i=1}^{n−1} ((rx_{i+1} − rx_i) + (ry_{i+1} − ry_i) + (rz_{i+1} − rz_i));

where r_diff represents the sum of the angle differences by which the first AR target rotates between adjacent frames of the tracking images, n is the number of elements cached in the 3D registration parameter buffer, and rx_i, ry_i and rz_i are the first rotation parameters in the i-th cached element;
The translation variation of the first AR target within the second preset time range Ts is:

t_diff = Σ_{i=1}^{n−1} ((tx_{i+1} − tx_i) + (ty_{i+1} − ty_i) + (tz_{i+1} − tz_i));

where t_diff represents the sum of the translation amounts by which the first AR target moves between adjacent frames of the tracking images, n is the number of elements cached in the 3D registration parameter buffer, and tx_i, ty_i and tz_i are the first translation parameters in the i-th cached element;
Step 53: Remove from the 3D registration parameter queue buffer the first i elements that satisfy (t − ti > Ts), and set t1 to the time of the new first element in the queue;
Step 54: If the rotation variation r_diff of the first AR target within the second preset time range Ts, calculated in step 52, is less than 5 degrees, execute step 55; otherwise, execute step 56;
Step 55: If the translation variation t_diff of the first AR target within the second preset time range Ts, calculated in step 52, is less than 5 units, execute step 57; otherwise, execute step 56;
Step 56: Set the freeze function flag to inactive;
Step 57: Set the freeze function flag to active.
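Steps 52 through 57 can be sketched as follows (Python; the formulas sum signed frame-to-frame differences exactly as written above, and the function names and the particular thresholds are assumptions for illustration):

```python
def variation(params):
    """Sum of frame-to-frame differences over the cached elements.

    params is a list of (x, y, z) triples; the returned value follows
    the r_diff / t_diff formulas: the sum over adjacent elements of
    the per-axis differences (signed, as written in the patent).
    """
    total = 0.0
    for (x1, y1, z1), (x2, y2, z2) in zip(params, params[1:]):
        total += (x2 - x1) + (y2 - y1) + (z2 - z1)
    return total

def freeze_flag(rotations, translations, r_threshold=5.0, t_threshold=5.0):
    # Steps 54-57: activate the freeze function only if both the rotation
    # variation and the translation variation stay below their thresholds.
    r_diff = variation(rotations)
    t_diff = variation(translations)
    return r_diff < r_threshold and t_diff < t_threshold
```

A practical implementation would likely sum absolute differences so that back-and-forth motion does not cancel out, but the sketch follows the formulas as stated.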
In this embodiment, the first AR target information further comprises AR target type information indicating the type of the first AR target;
The determination in step 103 of whether to perform the freeze processing is specifically:
If the first AR target type information indicates a browsing type, performing the freeze processing.
In this embodiment, determining in step 103 one cached frame of the real-time images cached within the first preset time range before the current time as the freeze frame may specifically be:
For each cached frame of the real-time image, generating a position weight according to the position of the first AR target in that cached image, and determining the image with the largest position weight as the freeze frame.
Specifically, the distance between the position of the first AR target and the screen centre is obtained by calculating the pixel distance between the centre coordinates of the region in which the whole first AR target appears and the coordinates of the centre of the cached real-time image. Position weights are then assigned according to this distance: the cached image in which the first AR target is closest to the image centre receives the largest position weight, and the cached image in which it is farthest from the centre receives the smallest. The cached image with the largest position weight is determined as the freeze frame.
In this embodiment, in the freeze frame determined according to the position weight, the first AR target lies close to the screen centre, giving a good display effect that is more comfortable and convenient for the user to view.
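The position-weight selection just described — the cached frame whose target centre has the smallest pixel distance to the image centre receives the largest weight — might be sketched as follows (Python; the data layout is an assumption):

```python
import math

def position_weight_frame(cached_frames, frame_w, frame_h):
    """Pick the freeze frame whose AR target centre is closest to the
    image centre (i.e. the frame with the largest position weight).

    cached_frames: list of (frame_id, (target_cx, target_cy)) pairs,
    where the centre coordinates are the pixel position at which the
    whole first AR target appears in that cached real-time image.
    """
    centre = (frame_w / 2.0, frame_h / 2.0)

    def distance(item):
        cx, cy = item[1]
        return math.hypot(cx - centre[0], cy - centre[1])

    # Smallest pixel distance to the image centre <=> largest position weight.
    return min(cached_frames, key=distance)[0]

frames = [("f1", (100.0, 100.0)), ("f2", (320.0, 240.0)), ("f3", (600.0, 400.0))]
chosen = position_weight_frame(frames, 640, 480)  # "f2" sits at the centre
```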
In this embodiment, determining in step 103 one cached frame of the real-time images cached within the first preset time range before the current time as the freeze frame may alternatively be:
For each cached frame of the real-time image, generating a position weight according to the position of the first AR target in that cached image, generating an area weight according to the proportion of the image area occupied by the first AR target, and generating a sharpness weight according to the sharpness of the first AR target in that cached image; and then determining the freeze frame according to the position weight, area weight and sharpness weight of each cached frame.
Specifically, in determining the freeze frame, parameters of the real-time image such as area and sharpness may also be taken into account.
The sharpness of a cached real-time image can be obtained by applying to the cached image information in the image queue buffer methods such as spatial-domain evaluation methods, entropy, or the frequency-domain modulation transfer function (MTF).
The cached real-time images in the image queue buffer are ranked by sharpness from smallest to largest, and the rank number can be used as the sharpness weight of each cached image, so that the sharper a cached real-time image is, the larger its sharpness weight.
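As one illustration of the entropy method among those listed, the sharpness of a cached image can be scored by the Shannon entropy of its grey-level histogram, with ascending rank numbers then serving as the sharpness weights (a sketch under those assumptions; the patent equally permits spatial-domain or MTF measures):

```python
import math

def entropy_sharpness(gray_pixels, levels=256):
    """Shannon entropy of the grey-level histogram, used as a simple
    sharpness score: a blurrier image tends to have a more peaked
    histogram and hence lower entropy."""
    hist = [0] * levels
    for p in gray_pixels:
        hist[p] += 1
    n = float(len(gray_pixels))
    return -sum((c / n) * math.log2(c / n) for c in hist if c)

def sharpness_weights(frames_gray):
    # Rank cached frames by sharpness from smallest to largest and use
    # the rank as the weight, so the sharpest frame gets the largest weight.
    scores = [entropy_sharpness(g) for g in frames_gray]
    order = sorted(range(len(scores)), key=lambda i: scores[i])
    weights = [0] * len(scores)
    for rank, idx in enumerate(order, start=1):
        weights[idx] = rank
    return weights
```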
The area in which the first AR target appears in a cached real-time image and the area of the whole first AR target are calculated from the coordinate information of the first AR target. The ratio of the former to the latter is used as the area proportion. The area proportions are sorted in descending order, and an area weight is assigned to each cached frame accordingly: if the whole first AR target appears in the cached image, the area proportion is the largest.
If the coordinate information of the first AR target does not exceed the coordinate range of the cached real-time image, that is, the whole first AR target appears in the cached image, the area proportion is 1; if the coordinate information of the first AR target exceeds the coordinate range of the tracking image, the first AR target does not appear completely in the tracking image, and the ratio of the area of the first AR target visible in the current cached image to the true area of the first AR target can be calculated.
The cached real-time image whose sum of position weight, area weight and sharpness weight is the largest can be determined as the freeze frame.
In this embodiment, the freeze frame determined according to the position weight, area weight and sharpness weight lets the user see a large, clear freeze frame with the first AR target at the centre, which is more comfortable and convenient for the user to view and improves the freeze effect.
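The combined selection — summing the position, area and sharpness weights per cached frame and taking the maximum — might look like this (Python sketch; the dictionary layout and helper names are assumptions):

```python
def area_ratio(target_area_in_frame, target_true_area):
    # 1.0 when the whole AR target appears in the cached image,
    # otherwise the visible fraction of the target's true area.
    return min(1.0, target_area_in_frame / target_true_area)

def pick_freeze_frame(frames):
    """frames: list of dicts with keys 'id', 'pos_w', 'area_w', 'sharp_w'
    (the position, area and sharpness weights computed per cached frame).
    The cached image with the largest weight sum is the freeze frame."""
    best = max(frames, key=lambda f: f["pos_w"] + f["area_w"] + f["sharp_w"])
    return best["id"]

frames = [
    {"id": 1, "pos_w": 3, "area_w": 1, "sharp_w": 2},  # weight sum 6
    {"id": 2, "pos_w": 2, "area_w": 3, "sharp_w": 3},  # weight sum 8
]
```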
Fig. 6 is a schematic diagram of an augmented reality processing flow after freezing, provided by an embodiment of the present invention. As shown in Fig. 6, in this embodiment, after step 103 (determining whether to perform the freeze processing and, if so, determining one cached frame of the real-time images cached within the first preset time range before the current time as the freeze frame), the method may further comprise:
Step 601: Performing feature detection and description on the real-time image acquired in real time to generate second feature detection description data, and sending the second feature detection description data to the AR server, so that the AR server performs AR target detection according to the second feature detection description data;
Step 602: Receiving a second detection result sent by the AR server when a second AR target is detected, wherein the second detection result carries second AR target information, the second AR target information comprising second AR target reference position information indicating the position of the second AR target in the real-time image and second AR target standard dimension information indicating the size of the second AR target in the standard image, and caching the second AR target information;
Step 603: Stopping, according to the second detection result, sending the second feature detection description data to the AR server;
Step 604: Caching the second AR target information, tracking the second AR target in the real-time image according to the second AR target reference position information, and, if the second AR target is tracked continuously within a third preset time range, obtaining the second AR content of the second AR target from the AR server, caching the second AR content, and generating and displaying unfreeze indication information;
Step 605: If an unfreeze instruction is received, tracking the second AR target in the real-time image according to the cached second AR target reference position information, performing 3D registration calculation according to the tracked second AR target and the second AR target standard dimension information to generate a second rotation parameter and a second translation parameter, and caching the second rotation parameter and the second translation parameter;
Step 606: Performing, according to the second rotation parameter and the second translation parameter, virtual-real fusion rendering on the real-time image and the second AR content to generate and display the second AR image.
Specifically, after freezing, although what is shown to the user is the AR freeze frame, the augmented reality processing apparatus still assigns other threads to continue acquiring real-time images and processing them accordingly. The augmented reality processing apparatus may acquire a real-time image from the camera at a preset time interval, use it as a test image, perform feature detection and description on the test image to generate feature detection description data, and send the feature detection description data together with the test image to the AR server. The AR server matches the feature detection description data against the feature detection description data of the standard images in its database. If a second AR target is detected in the test image, a second detection result indicating that the second AR target has been detected is generated and sent to the mobile terminal, carrying the second AR target information. After receiving the second detection result, the mobile terminal determines whether the second AR target information is identical to the AR target information in the AR target cache; if different, the mobile terminal stops sending test images to the AR server, to avoid repeated detection by the AR server. The mobile terminal caches the second AR target information in the pre-load AR target cache of the storage unit, and tracks the real-time image, used as a tracking image, according to the second AR target reference position information in the second AR target information. If the second AR target is tracked continuously within the third preset time range, the second AR content corresponding to the second AR target is downloaded from the AR server and cached in the pre-load AR content cache of the storage unit, and unfreeze indication information is generated and shown to the user, prompting the user to unfreeze. The unfreeze indication information may take the form of a pop-up dialog box prompting the user to choose whether to view the new target, or of highlighting the manual unfreeze button, indicating that a new AR target has been found and the corresponding second AR content has been downloaded.
If the second AR target information is identical to the AR target information in the AR target cache, steps 601 and 602 are repeated until a new AR target different from the first AR target is detected.
The user may choose to stay frozen or to unfreeze according to the unfreeze indication information. If the user chooses to unfreeze and inputs an unfreeze instruction, the augmented reality processing apparatus continues to perform second AR target tracking, 3D registration calculation and virtual-real fusion rendering on the tracking image. The specific implementation can refer to the description of the above embodiments and is not repeated here.
If the user chooses to stay frozen and inputs a keep-frozen instruction, the augmented reality processing apparatus executes steps 601 and 602. If the detected AR target is still the second AR target, and the second AR target is tracked continuously within the preset time range, the unfreeze indication information is shown to the user again. If the detected AR target differs from the second AR target, the pre-load AR target cache and the pre-load AR content cache are cleared, and the new AR target information is cached in the pre-load AR target cache. The mobile terminal tracks the real-time image, used as a tracking image, according to the new AR target reference position information in the new AR target information; if the new AR target is tracked continuously within the preset time range, the AR content corresponding to the new AR target is downloaded from the AR server and cached in the AR content cache, and unfreeze indication information is generated and shown to the user.
In practical applications, there are two cases in which the newly obtained AR target information differs from the AR target information in the pre-load AR target cache:
One case is that the AR target information in the pre-load target cache is empty:
Each time the freeze function is started, the AR target information in the pre-load target cache is cleared; if no AR target is found after freezing, or the newly obtained AR target information is identical to the AR target information cached in the AR target cache, the target information in the pre-load target cache remains empty;
The other case is that the target information in the pre-load target cache is not empty, but the newly obtained AR target information differs from the AR target information in the pre-load target cache, that is, the AR targets are not found consecutively.
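The bookkeeping for the pre-load target cache in the two cases above might be sketched as follows (Python; the class and method names are assumptions, not part of the disclosed apparatus):

```python
class PreloadCache:
    """Pre-load AR target cache used while the display is frozen.

    A newly detected target is compared against both the active target
    cache and this pre-load cache; only a genuinely new target replaces
    the pre-loaded one."""

    def __init__(self):
        self.target_info = None   # empty each time the freeze function starts
        self.content = None

    def offer(self, active_target, new_target):
        if new_target == active_target:
            return False          # same target as before freezing: ignore
        if new_target == self.target_info:
            return False          # already pre-loaded
        # A different target: replace the pre-loaded info and drop any
        # content downloaded for the previous candidate.
        self.target_info = new_target
        self.content = None
        return True
```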
Fig. 7 is a schematic diagram of another augmented reality processing flow after freezing, provided by an embodiment of the present invention. As shown in Fig. 7, the steps of augmented reality processing after freezing are specifically as follows:
Step 71: The freeze function starts, and the AR freeze frame is shown to the user;
Step 72: Determine whether T2 seconds have elapsed; if so, execute step 73; if not, continue waiting;
Step 73: Acquire a real-time image from the camera;
Step 74: Perform image feature detection and description on the real-time image, used as a test image, to generate feature detection description data, and send the feature detection description data to the AR server;
Step 75: The AR server matches the feature detection description data against the feature detection description data of the standard images in its database; if the match succeeds, an AR target is detected in the test image, and a detection result indicating that an AR target has been detected is generated and sent to the mobile terminal; if the match fails, the AR server generates a detection result indicating that no AR target has been detected and sends it to the mobile terminal;
Step 76: If the mobile terminal learns from the detection result that an AR target has been detected, it stops sending test images to the AR server and executes step 77; if it learns from the detection result that no AR target has been detected, it executes step 72;
Step 77: The detected AR target is AR target a; download the AR target information of AR target a;
Step 78: Determine whether the AR target information of AR target a is identical to the AR target information in the AR target cache; if identical, execute step 72; if different, execute step 79;
Step 79: Determine whether the AR target information of AR target a is identical to the AR target information in the pre-load AR target cache; if identical, execute step 711; if different, execute step 710;
Step 710: Cache the AR target information of AR target a in the pre-load AR target cache;
Step 711: Determine whether AR target a is tracked continuously within T3 seconds; if so, execute step 712; if not, execute step 72;
Step 712: Determine whether the AR content of AR target a has been downloaded; if so, execute step 714; if not, execute step 713;
Step 713: Download the AR content of AR target a from the AR server, and cache the AR content in the pre-load AR content cache;
Step 714: Determine whether the time since the AR content was downloaded exceeds T4 seconds; if so, execute step 715; if not, continue waiting;
Step 715: Prompt the user that a new AR target has been found;
Step 716: If the user chooses to display the new AR target, execute step 717; if the user chooses not to display the new AR target, execute step 718;
Step 717: Set the freeze function flag to inactive;
Step 718: Clear the pre-load AR target cache and the pre-load AR content cache, and execute step 72.
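The polling portion of the flow above can be condensed into the following sketch (Python; the callables standing in for the camera and the AR server are assumptions, and the timed waits are replaced by a bounded loop):

```python
def post_freeze_loop(get_frame, detect, active_target, cache, max_polls=10):
    """Simplified sketch of the post-freeze detection loop: while frozen,
    periodically grab a frame, ask the server for AR targets, and report
    once a target different from the active one has been found.

    get_frame() -> frame; detect(frame) -> target id or None. Both are
    assumed callables standing in for the camera and the AR server.
    """
    for _ in range(max_polls):          # stands in for the T2-second wait loop
        frame = get_frame()
        target = detect(frame)
        if target is None or target == active_target:
            continue                    # nothing new: keep waiting
        cache.target_info = target      # pre-load the newly found target
        return "new AR target found"    # prompt the user (step 715)
    return None
```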
In this embodiment, by providing the pre-load AR target cache and the pre-load AR content cache, the relevant information and AR content of a newly detected AR target can be cached during freezing; when the user unfreezes, subsequent processing can immediately proceed from the data in the pre-load AR target cache and the pre-load AR content cache, avoiding processing latency and achieving a seamless switch.
It should be noted that, for clarity of description, the above embodiments distinguish the different cached information by means of the real-time image buffer, the image queue buffer, the pre-load AR target cache, the pre-load AR content cache, the AR target cache, the AR content cache, the AR target position buffer, the hardware parameter queue buffer and the 3D registration parameter queue buffer; in an actual implementation, however, these buffers may be merely logical buffers, or may be realized without distinction by a single cache region.
Fig. 8 is a schematic structural diagram of a first augmented reality processing apparatus of a mobile terminal provided by an embodiment of the present invention. As shown in Fig. 8, the augmented reality processing apparatus 81 of the mobile terminal provided in this embodiment can implement each step of the augmented reality processing method of a mobile terminal provided by any embodiment of the present invention; its specific implementation is not repeated here. The augmented reality processing apparatus 81 comprises an image acquisition unit 801, a first augmented reality processing unit 802 and a freeze processing unit 803. The image acquisition unit 801 is configured to acquire the real-time image collected by the camera and cache the real-time image. The first augmented reality processing unit 802 is connected to the image acquisition unit 801, and is configured to perform augmented reality (AR) processing on the real-time image to generate a first AR image and display the first AR image. The freeze processing unit 803 is configured to determine whether to perform freeze processing and, if so, determine one cached frame of the real-time images cached within the first preset time range before the current time as the freeze frame, perform AR processing on the freeze frame to generate an AR freeze frame, and display it.
In the augmented reality processing apparatus 81 of the mobile terminal provided in this embodiment, the image acquisition unit 801 acquires the real-time image collected by the camera and caches it; the first augmented reality processing unit 802 performs AR processing on the real-time image to generate and display the first AR image; and the freeze processing unit 803 determines whether to perform freeze processing and, if so, determines one cached frame of the real-time images cached within the first preset time range before the current time as the freeze frame, performs AR processing on it, and generates and displays the AR freeze frame. By deciding on freeze processing and, when freeze processing is needed, selecting one real-time image from the cache for AR processing into an AR freeze frame to display, the user can conveniently view the frozen AR image, the constraints imposed on the user are reduced, and the effect of the AR processing is greatly improved.
Fig. 9 is a schematic structural diagram of another augmented reality processing apparatus of a mobile terminal provided by an embodiment of the present invention. As shown in Fig. 9, in this embodiment, the first augmented reality processing unit 802 comprises a first tracking and registration subunit 905 and a first rendering subunit 906. The first tracking and registration subunit 905 is connected to the image acquisition unit 801, and is configured to obtain the cached first AR target reference position information, track the real-time image according to the cached first AR target reference position information, perform 3D registration calculation according to the tracked first AR target and the first AR target standard dimension information to generate the first rotation parameter and the first translation parameter, and cache the first rotation parameter and the first translation parameter. The first rendering subunit 906 is connected to the first tracking and registration subunit 905, and is configured to obtain the cached first AR content, perform virtual-real fusion rendering on the real-time image and the first AR content according to the first rotation parameter and the first translation parameter, and generate and display the first AR image.
In this embodiment, the first augmented reality processing unit 802 further comprises a first detection subunit 901, a first receiving subunit 902, a first control subunit 903 and a first obtaining subunit 904. The first detection subunit 901 is connected to the image acquisition unit 801, and is configured to perform feature detection and description on the real-time image to generate first feature detection description data, and send the first feature detection description data to the AR server, so that the AR server performs AR target detection according to the first feature detection description data. The first receiving subunit 902 is configured to receive the first detection result sent by the AR server when the first AR target is detected, wherein the first detection result carries the first AR target information, the first AR target information comprising first AR target reference position information indicating the position of the first AR target in the real-time image and first AR target standard dimension information indicating the size of the first AR target in the standard image, and to cache the first AR target information. The first control subunit 903 is connected to the first detection subunit 901 and the first receiving subunit 902 respectively, and is configured to stop, according to the first detection result, sending the first feature detection description data to the AR server. The first obtaining subunit 904 is configured to obtain the first AR content of the first AR target from the AR server and cache the first AR content.
In this embodiment, the freeze processing unit 803 may specifically be configured to obtain the cached first rotation parameter, first translation parameter and first AR content corresponding to the freeze frame, perform virtual-real fusion rendering on the freeze frame and the first AR content according to the first rotation parameter and the first translation parameter corresponding to the freeze frame, and generate and display the AR freeze frame.
In this embodiment, the freeze processing unit 803 may specifically be configured to detect whether the mobile terminal remains stationary within the second preset time range and, if so, perform the freeze processing.
In this embodiment, the freeze processing unit 803 may specifically be configured to determine, according to the gravitational acceleration information collected by the accelerometer and the azimuth information collected by the digital compass within the second preset time range, whether the mobile terminal remains stationary within the second preset time range and, if so, perform the freeze processing.
In this embodiment, the freeze processing unit 803 may specifically be configured to determine, according to the first rotation parameter and the first translation parameter generated within the second preset time range, whether the mobile terminal remains stationary within the second preset time range and, if so, perform the freeze processing.
In this embodiment, the first AR target information further comprises AR target type information indicating the type of the first AR target. The freeze processing unit 803 may specifically be configured to perform the freeze processing if the first AR target type information indicates a browsing type.
When the user needs the image shown on the display to be frozen, the frozen AR image can be shown to the user via the display screen. The freeze processing can be triggered in many ways, which are not limited to this embodiment.
In this embodiment, the freeze processing unit 803 may specifically be configured to, for each cached frame of the real-time image, generate a position weight according to the position of the first AR target in that cached image, and determine the image with the largest position weight as the freeze frame.
Specifically, the distance between the position of the first AR target and the screen centre is obtained by calculating the pixel distance between the centre coordinates of the region in which the whole first AR target appears and the coordinates of the centre of the cached real-time image. Position weights are then assigned according to this distance: the cached image in which the first AR target is closest to the image centre receives the largest position weight, and the cached image in which it is farthest from the centre receives the smallest. The cached image with the largest position weight is determined as the freeze frame.
In this embodiment, the freeze processing unit 803 may alternatively be configured to, for each cached frame of the real-time image, generate a position weight according to the position of the first AR target in that cached image, generate an area weight according to the proportion of the image area occupied by the first AR target, generate a sharpness weight according to the sharpness of the first AR target, and determine the freeze frame according to the position weight, area weight and sharpness weight of each cached frame.
Specifically, the sharpness of a cached real-time image can be obtained by applying to the cached image information in the image queue buffer methods such as spatial-domain evaluation methods, entropy, or the frequency-domain modulation transfer function (MTF).
The cached real-time images in the image queue buffer are ranked by sharpness from smallest to largest, and the rank number can be used as the sharpness weight of each cached image, so that the sharper a cached real-time image is, the larger its sharpness weight.
The area in which the first AR target appears in a cached real-time image and the area of the whole first AR target are calculated from the coordinate information of the first AR target. The ratio of the former to the latter is used as the area proportion. The area proportions are sorted in descending order, and an area weight is assigned to each cached frame accordingly: if the whole first AR target appears in the cached image, the area proportion is the largest.
If the coordinate information of the first AR target does not exceed the coordinate range of the cached real-time image, that is, the whole first AR target appears in the cached image, the area proportion is 1; if the coordinate information of the first AR target exceeds the coordinate range of the tracking image, the first AR target does not appear completely in the tracking image, and the ratio of the area of the first AR target visible in the current cached image to the true area of the first AR target can be calculated.
The cached real-time image whose sum of position weight, area weight and sharpness weight is the largest can be determined as the freeze frame.
In this embodiment, the freeze frame determined according to the position weight, area weight and sharpness weight lets the user see a large, clear freeze frame with the first AR target at the centre, which is more comfortable and convenient for the user to view and improves the freeze effect.
Figure 10 is a schematic structural diagram of a third augmented reality processing apparatus of a mobile terminal provided by an embodiment of the present invention. As shown in Figure 10, in the present embodiment the augmented reality processing apparatus 81 of the mobile terminal may further comprise a second augmented reality processing unit 106, where the second augmented reality processing unit 106 comprises a second detection subunit 1001, a second receiving subunit 1002, a second control subunit 1003, a cache processing subunit 1004, an unfreeze judgment subunit 1005, a second tracking and registration subunit 1006, and a second rendering subunit 1007. The second detection subunit 1001 is connected to the image acquisition unit 801 and is configured to perform feature detection and description on the real-time image, generate second feature detection and description data, and send the second feature detection and description data to the AR server, so that the AR server performs AR target detection according to the second feature detection and description data. The second receiving subunit 1002 is configured to receive a second detection result sent by the AR server when a second AR target is detected, where the second detection result carries second AR target information, and the second AR target information comprises: second AR target reference position information indicating the position of the second AR target in the real-time image, and second AR target standard size information indicating the size of the second AR target in the standard image. The second control subunit 1003 is connected to the second detection subunit 1001 and the second receiving subunit 1002 respectively, and is configured to stop sending the second feature detection and description data to the AR server according to the second detection result. The cache processing subunit 1004 is connected to the image acquisition unit 801 and the second receiving subunit 1002 respectively, and is configured to cache the second AR target information, track the second AR target in the real-time image according to the second AR target reference position information, and, if the second AR target is tracked within a third preset time range, obtain second AR content of the second AR target from the AR server, cache the second AR content, and generate and display an unfreeze indication. The second tracking and registration subunit 1006 is connected to the image acquisition unit 801 and is configured to, if an unfreeze instruction is received, track the second AR target in the real-time image according to the cached second AR target reference position information, perform three-dimensional registration calculation according to the tracked second AR target and the second AR target standard size information, generate a second rotation parameter and a second translation parameter, and cache the second rotation parameter and the second translation parameter. The second rendering subunit 1007 is connected to the second tracking and registration subunit 1006 and is configured to perform virtual-real fusion rendering on the real-time image and the second AR content according to the second rotation parameter and the second translation parameter, to generate and display the second AR image.
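The three-dimensional registration step derives the target's pose from the tracked target and its standard size. As a heavily simplified stand-in (a real implementation would solve for a full rotation matrix and translation vector from point correspondences, e.g. via a PnP solver), the sketch below recovers only a scale factor and an image-plane offset from the tracked bounding box; all names and the geometry are illustrative assumptions:

```python
def estimate_pose_2d(tracked_box, standard_size, image_size):
    """Toy stand-in for the three-dimensional registration step:
    derive a scale factor and an image-plane translation from the
    tracked AR target's bounding box and its standard size.

    tracked_box:   (x, y, w, h) of the tracked target in the frame.
    standard_size: (w, h) of the target in the standard image.
    image_size:    (w, h) of the real-time image.
    """
    x, y, w, h = tracked_box
    sw, sh = standard_size
    scale = ((w / sw) + (h / sh)) / 2.0   # apparent size vs. standard size
    cx, cy = x + w / 2.0, y + h / 2.0     # target centre in the frame
    tx = cx - image_size[0] / 2.0         # offset from the image centre
    ty = cy - image_size[1] / 2.0
    return scale, (tx, ty)

scale, (tx, ty) = estimate_pose_2d((300, 200, 200, 100), (100, 50), (800, 600))
print(scale, tx, ty)  # 2.0 0.0 -50.0
```

The resulting parameters play the role of the cached rotation/translation parameters that the rendering subunit consumes when fusing the AR content with the real-time image.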
In the present embodiment, by providing a preloaded AR target position buffer area and a preloaded AR content buffer area, the relevant information and AR content of a newly detected AR target can be cached during the freeze process. When the user unfreezes, subsequent processing can be performed immediately according to the data in the AR target position buffer area and the preloaded AR content buffer area, which avoids processing latency and achieves seamless switching.
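The seamless-switching idea above can be sketched as a small cache object: while the display is frozen, target information and AR content for a newly detected target are stored locally, so that unfreezing reads them immediately instead of waiting on a server round-trip. The class and field names are illustrative assumptions, not the patent's structures:

```python
class ARCache:
    """Minimal sketch of the preloaded buffers: an AR target position
    buffer area and an AR content buffer area filled during the freeze,
    so that unfreezing needs no server fetch."""

    def __init__(self):
        self.target_info = None   # AR target position buffer area
        self.content = None       # preloaded AR content buffer area

    def on_detection(self, target_info):
        self.target_info = target_info   # cache target info during freeze

    def on_content_fetched(self, content):
        self.content = content           # preload AR content during freeze

    def unfreeze(self):
        # data is already local: no server round-trip, hence no latency
        if self.target_info is None or self.content is None:
            raise RuntimeError("nothing cached; a server fetch would be needed")
        return self.target_info, self.content

cache = ARCache()
cache.on_detection({"ref_pos": (120, 80), "std_size": (100, 50)})
cache.on_content_fetched(b"rendered-overlay-bytes")
info, content = cache.unfreeze()
print(info["ref_pos"])  # (120, 80)
```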
Figure 11 is a schematic structural diagram of a fourth augmented reality processing apparatus of a mobile terminal provided by an embodiment of the present invention. As shown in Figure 11, the augmented reality processing apparatus of the mobile terminal provided by this embodiment comprises at least one processor 1101 (for example, a CPU), a memory 1102, a camera 1103, a display screen 1104, and at least one communication bus 1105 for implementing connection and communication among these devices. The processor 1101 is configured to execute executable modules, for example computer programs, stored in the memory 1102. The memory 1102 may comprise a high-speed random access memory (RAM: Random Access Memory) and may further comprise a non-volatile memory, for example at least one disk memory. The camera 1103 is configured to collect real-time images, and the display screen 1104 is configured to display the real-time images, or the AR images or AR freeze pictures obtained through real-time processing.
In some embodiments, the memory 1102 stores program instructions executable by the processor 1101, where the program instructions comprise the image acquisition unit 801, the first augmented reality processing unit 802, and the freeze processing unit 803. For the specific implementation of each unit, refer to the corresponding units disclosed with reference to Figure 8; the specific implementation process and the technical effects produced are not repeated here.
Through the above description of the embodiments, those skilled in the art can clearly understand that the present invention can be implemented in hardware, in firmware, or in a combination thereof. When implemented in software, the above functions can be stored in a computer-readable medium or transmitted as one or more instructions or code on a computer-readable medium. Computer-readable media include computer storage media and communication media, where communication media include any medium that facilitates transfer of a computer program from one place to another. A storage medium may be any available medium that a computer can access. By way of example and not limitation: computer-readable media may comprise RAM, ROM, EEPROM, CD-ROM or other optical disc storage, magnetic disk storage media or other magnetic storage devices, or any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer. In addition, any connection may suitably become a computer-readable medium. For example, if software is transmitted from a website, server, or other remote source using a coaxial cable, an optical fiber cable, a twisted pair, a digital subscriber line (DSL), or wireless technologies such as infrared, radio, and microwave, then the coaxial cable, optical fiber cable, twisted pair, DSL, or wireless technologies such as infrared, radio, and microwave are included in the definition of the medium. As used in the present invention, disk and disc include compact disc (CD), laser disc, optical disc, digital versatile disc (DVD), floppy disk, and Blu-ray disc, where disks usually reproduce data magnetically, while discs reproduce data optically with lasers. Combinations of the above should also be included within the protection scope of computer-readable media.
In summary, the foregoing is only a preferred embodiment of the technical solution of the present invention and is not intended to limit the protection scope of the present invention. Any modification, equivalent replacement, improvement, and the like made within the spirit and principles of the present invention shall be included within the protection scope of the present invention.

Claims (22)

1. An augmented reality processing method for a mobile terminal, characterized by comprising:
obtaining a real-time image collected from a camera, and caching the real-time image;
performing augmented reality (AR) processing on the real-time image to generate a first AR image, and displaying the first AR image;
judging whether to perform freeze processing, and if so, determining, among the cached real-time images within a first preset time range from the current time, one cached frame of real-time image as a freeze picture, performing AR processing on the freeze picture to generate an AR freeze picture, and displaying the AR freeze picture.
2. The augmented reality processing method for a mobile terminal according to claim 1, characterized in that the judging whether to perform freeze processing is specifically:
detecting whether the mobile terminal remains stationary within a second preset time range, and if so, performing freeze processing.
3. The augmented reality processing method for a mobile terminal according to claim 2, characterized in that the detecting whether the mobile terminal remains stationary within the second preset time range and, if so, performing freeze processing is specifically:
judging, according to gravitational acceleration information collected by a gravitational accelerometer and orientation information collected by a digital compass within the second preset time range, whether the mobile terminal remains stationary within the second preset time range, and if so, performing freeze processing.
4. The augmented reality processing method for a mobile terminal according to claim 1, characterized in that the determining, among the cached real-time images within the first preset time range from the current time, one cached frame of real-time image as the freeze picture is specifically:
for each cached frame of real-time image, generating a position weight according to the position of the AR target in the cached real-time image, and determining the real-time image with the largest position weight as the freeze picture.
5. The augmented reality processing method for a mobile terminal according to claim 1, characterized in that the determining, among the cached real-time images within the first preset time range from the current time, one cached frame of real-time image as the freeze picture is specifically:
for each cached frame of real-time image, generating a position weight according to the position of the AR target in the cached real-time image, generating an area weight according to the area proportion occupied by the AR target in the cached real-time image, generating a sharpness weight according to the sharpness of the AR target in the cached real-time image, and determining the freeze picture according to the position weight, the area weight, and the sharpness weight of each cached frame of real-time image.
6. The augmented reality processing method for a mobile terminal according to claim 1, characterized in that the performing AR processing on the real-time image to generate the first AR image and displaying the first AR image comprises:
obtaining cached first AR target reference position information, tracking the real-time image according to the cached first AR target reference position information, performing three-dimensional registration calculation according to the tracked first AR target and first AR target standard size information, generating a first rotation parameter and a first translation parameter, and caching the first rotation parameter and the first translation parameter;
obtaining cached first AR content, and performing virtual-real fusion rendering on the real-time image and the first AR content according to the first rotation parameter and the first translation parameter, to generate and display the first AR image.
7. The augmented reality processing method for a mobile terminal according to claim 6, characterized in that, before the obtaining of the cached first AR target reference position information, the method further comprises:
performing feature detection and description on the real-time image, generating first feature detection and description data, and sending the first feature detection and description data to an AR server, so that the AR server performs AR target detection according to the first feature detection and description data;
receiving a first detection result sent by the AR server when a first AR target is detected, where the first detection result carries first AR target information, and the first AR target information comprises: first AR target reference position information indicating the position of the first AR target in the real-time image, and first AR target standard size information indicating the size of the first AR target in a standard image; and caching the first AR target information;
stopping sending the first feature detection and description data to the AR server according to the first detection result;
obtaining first AR content of the first AR target from the AR server, and caching the first AR content.
8. The augmented reality processing method for a mobile terminal according to claim 7, characterized in that the performing AR processing on the freeze picture to generate and display the AR freeze picture is specifically:
obtaining the cached first rotation parameter and first translation parameter corresponding to the freeze picture and the cached first AR content, and performing virtual-real fusion rendering on the freeze picture and the first AR content according to the first rotation parameter and the first translation parameter corresponding to the freeze picture, to generate and display the AR freeze picture.
9. The augmented reality processing method for a mobile terminal according to claim 7, characterized in that the judging whether to perform freeze processing is specifically:
judging, according to the first rotation parameter and the first translation parameter generated within a second preset time range, whether the mobile terminal remains stationary within the second preset time range, and if so, performing freeze processing.
10. The augmented reality processing method for a mobile terminal according to claim 7, characterized in that the first AR target information further comprises: first AR target type information indicating the type of the first AR target;
the judging whether to perform freeze processing is specifically:
if the first AR target type information indicates a browsing type, performing freeze processing.
11. The augmented reality processing method for a mobile terminal according to claim 6, characterized in that, after the judging whether to perform freeze processing and, if so, determining, among the cached real-time images within the first preset time range from the current time, one cached frame of real-time image as the freeze picture, the method further comprises:
performing feature detection and description on the real-time image obtained in real time, generating second feature detection and description data, and sending the second feature detection and description data to the AR server, so that the AR server performs AR target detection according to the second feature detection and description data;
receiving a second detection result sent by the AR server when a second AR target is detected, where the second detection result carries second AR target information, and the second AR target information comprises: second AR target reference position information indicating the position of the second AR target in the real-time image, and second AR target standard size information indicating the size of the second AR target in the standard image; and caching the second AR target information;
stopping sending the second feature detection and description data to the AR server according to the second detection result;
tracking the second AR target in the real-time image according to the second AR target reference position information, and, if the second AR target is tracked within a third preset time range, obtaining second AR content of the second AR target from the AR server, caching the second AR content, and generating and displaying an unfreeze indication;
if an unfreeze instruction is received, tracking the second AR target in the real-time image according to the cached second AR target reference position information, performing three-dimensional registration calculation according to the tracked second AR target and the second AR target standard size information, generating a second rotation parameter and a second translation parameter, and caching the second rotation parameter and the second translation parameter;
performing virtual-real fusion rendering on the real-time image and the second AR content according to the second rotation parameter and the second translation parameter, to generate and display the second AR image.
12. An augmented reality processing apparatus for a mobile terminal, characterized by comprising:
an image acquisition unit, configured to obtain a real-time image collected from a camera and cache the real-time image;
a first augmented reality processing unit, connected to the image acquisition unit and configured to perform augmented reality (AR) processing on the real-time image to generate a first AR image and display the first AR image;
a freeze processing unit, configured to judge whether to perform freeze processing and, if so, determine, among the cached real-time images within a first preset time range from the current time, one cached frame of real-time image as a freeze picture, perform AR processing on the freeze picture to generate an AR freeze picture, and display the AR freeze picture.
13. The augmented reality processing apparatus for a mobile terminal according to claim 12, characterized in that the freeze processing unit is specifically configured to detect whether the mobile terminal remains stationary within a second preset time range and, if so, perform freeze processing.
14. The augmented reality processing apparatus for a mobile terminal according to claim 13, characterized in that the freeze processing unit is specifically configured to judge, according to gravitational acceleration information collected by a gravitational accelerometer and orientation information collected by a digital compass within the second preset time range, whether the mobile terminal remains stationary within the second preset time range, and if so, perform freeze processing.
15. The augmented reality processing apparatus for a mobile terminal according to claim 12, characterized in that the freeze processing unit is specifically configured to, for each cached frame of real-time image, generate a position weight according to the position of the AR target in the cached real-time image, and determine the real-time image with the largest position weight as the freeze picture.
16. The augmented reality processing apparatus for a mobile terminal according to claim 12, characterized in that the freeze processing unit is specifically configured to, for each cached frame of real-time image, generate a position weight according to the position of the AR target in the cached real-time image, generate an area weight according to the area proportion occupied by the AR target in the cached real-time image, generate a sharpness weight according to the sharpness of the AR target in the cached real-time image, and determine the freeze picture according to the position weight, the area weight, and the sharpness weight of each cached frame of real-time image.
17. The augmented reality processing apparatus for a mobile terminal according to claim 12, characterized in that the first augmented reality processing unit comprises:
a first tracking and registration subunit, connected to the image acquisition unit and configured to obtain cached first AR target reference position information, track the real-time image according to the cached first AR target reference position information, perform three-dimensional registration calculation according to the tracked first AR target and first AR target standard size information, generate a first rotation parameter and a first translation parameter, and cache the first rotation parameter and the first translation parameter;
a first rendering subunit, connected to the first tracking and registration subunit and configured to obtain cached first AR content, and perform virtual-real fusion rendering on the real-time image and the first AR content according to the first rotation parameter and the first translation parameter, to generate and display the first AR image.
18. The augmented reality processing apparatus for a mobile terminal according to claim 17, characterized in that the first augmented reality processing unit further comprises:
a first detection subunit, connected to the image acquisition unit and configured to perform feature detection and description on the real-time image, generate first feature detection and description data, and send the first feature detection and description data to an AR server, so that the AR server performs AR target detection according to the first feature detection and description data;
a first receiving subunit, configured to receive a first detection result sent by the AR server when a first AR target is detected, where the first detection result carries first AR target information, and the first AR target information comprises: first AR target reference position information indicating the position of the first AR target in the real-time image, and first AR target standard size information indicating the size of the first AR target in a standard image; and to cache the first AR target information;
a first control subunit, connected to the first detection subunit and the first receiving subunit respectively, and configured to stop sending the first feature detection and description data to the AR server according to the first detection result;
a first obtaining subunit, configured to obtain first AR content of the first AR target from the AR server and cache the first AR content.
19. The augmented reality processing apparatus for a mobile terminal according to claim 18, characterized in that the freeze processing unit is specifically configured to obtain the cached first rotation parameter and first translation parameter corresponding to the freeze picture and the cached first AR content, and perform virtual-real fusion rendering on the freeze picture and the first AR content according to the first rotation parameter and the first translation parameter corresponding to the freeze picture, to generate and display the AR freeze picture.
20. The augmented reality processing apparatus for a mobile terminal according to claim 18, characterized in that the freeze processing unit is specifically configured to judge, according to the first rotation parameter and the first translation parameter generated within a second preset time range, whether the mobile terminal remains stationary within the second preset time range, and if so, perform freeze processing.
21. The augmented reality processing apparatus for a mobile terminal according to claim 17, characterized in that the first AR target information further comprises: first AR target type information indicating the type of the first AR target;
the freeze processing unit is specifically configured to perform freeze processing if the first AR target type information indicates a browsing type.
22. The augmented reality processing apparatus for a mobile terminal according to claim 17, characterized by further comprising a second augmented reality processing unit, where the second augmented reality processing unit comprises:
a second detection subunit, connected to the image acquisition unit and configured to perform feature detection and description on the real-time image, generate second feature detection and description data, and send the second feature detection and description data to the AR server, so that the AR server performs AR target detection according to the second feature detection and description data;
a second receiving subunit, configured to receive a second detection result sent by the AR server when a second AR target is detected, where the second detection result carries second AR target information, and the second AR target information comprises: second AR target reference position information indicating the position of the second AR target in the real-time image, and second AR target standard size information indicating the size of the second AR target in the standard image;
a second control subunit, connected to the second detection subunit and the second receiving subunit respectively, and configured to stop sending the second feature detection and description data to the AR server according to the second detection result;
a cache processing subunit, connected to the image acquisition unit and the second receiving subunit respectively, and configured to cache the second AR target information, track the second AR target in the real-time image according to the second AR target reference position information, and, if the second AR target is tracked within a third preset time range, obtain second AR content of the second AR target from the AR server, cache the second AR content, and generate and display an unfreeze indication;
a second tracking and registration subunit, connected to the image acquisition unit and configured to, if an unfreeze instruction is received, track the second AR target in the real-time image according to the cached second AR target reference position information, perform three-dimensional registration calculation according to the tracked second AR target and the second AR target standard size information, generate a second rotation parameter and a second translation parameter, and cache the second rotation parameter and the second translation parameter;
a second rendering subunit, connected to the second tracking and registration subunit and configured to perform virtual-real fusion rendering on the real-time image and the second AR content according to the second rotation parameter and the second translation parameter, to generate and display the second AR image.
CN201280001436.1A 2012-09-14 2012-09-14 The augmented reality processing method and processing device of mobile terminal Active CN103814382B (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2012/081430 WO2014040281A1 (en) 2012-09-14 2012-09-14 Augmented reality processing method and device for mobile terminal

Publications (2)

Publication Number Publication Date
CN103814382A true CN103814382A (en) 2014-05-21
CN103814382B CN103814382B (en) 2016-10-05

Family

ID=50277517

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201280001436.1A Active CN103814382B (en) 2012-09-14 2012-09-14 The augmented reality processing method and processing device of mobile terminal

Country Status (2)

Country Link
CN (1) CN103814382B (en)
WO (1) WO2014040281A1 (en)


Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1692631A (en) * 2002-12-06 2005-11-02 卡西欧计算机株式会社 Image pickup device and image pickup method
CN101246600A (en) * 2008-03-03 2008-08-20 北京航空航天大学 Method for real-time generating reinforced reality surroundings by spherical surface panoramic camera
CN101877063A (en) * 2009-11-25 2010-11-03 中国科学院自动化研究所 Sub-pixel characteristic point detection-based image matching method
WO2011152902A1 (en) * 2010-03-08 2011-12-08 Empire Technology Development Llc Broadband passive tracking for augmented reality


Cited By (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104156082A (en) * 2014-08-06 2014-11-19 北京行云时空科技有限公司 Control system and intelligent terminal of user interfaces and applications aimed at space-time scenes
CN105184825A (en) * 2015-10-29 2015-12-23 丽水学院 Indoor-scene-oriented mobile augmented reality method
CN106843493A (en) * 2017-02-10 2017-06-13 深圳前海大造科技有限公司 A kind of augmented reality implementation method of picture charge pattern method and application the method
CN106875431A (en) * 2017-02-10 2017-06-20 深圳前海大造科技有限公司 Picture charge pattern method and augmented reality implementation method with moving projection
CN107168619A (en) * 2017-03-29 2017-09-15 腾讯科技(深圳)有限公司 User-generated content treating method and apparatus
CN107168619B (en) * 2017-03-29 2023-09-19 腾讯科技(深圳)有限公司 User generated content processing method and device
CN109215132A (en) * 2017-06-30 2019-01-15 华为技术有限公司 A kind of implementation method and equipment of augmented reality business
CN109842790B (en) * 2017-11-29 2021-02-26 财团法人工业技术研究院 Image information display method and display
CN109842790A (en) * 2017-11-29 2019-06-04 财团法人工业技术研究院 Image information display methods and display
US10896500B2 (en) 2017-11-29 2021-01-19 Industrial Technology Research Institute Display and method for displaying dynamic information of object
CN108804330A (en) * 2018-06-12 2018-11-13 OPPO (Chongqing) Intelligent Technology Co., Ltd. Test method and device, storage medium and electronic device
CN108958929B (en) * 2018-06-15 2021-02-02 OPPO (Chongqing) Intelligent Technology Co., Ltd. Method and device for applying algorithm library, storage medium and electronic device
CN108958929A (en) * 2018-06-15 2018-12-07 OPPO (Chongqing) Intelligent Technology Co., Ltd. Method and device for applying algorithm library, storage medium and electronic device
CN109300184A (en) * 2018-09-29 2019-02-01 Wuba Co., Ltd. AR dynamic display method and apparatus, computer device and readable storage medium
CN109741289B (en) * 2019-01-25 2021-12-21 BOE Technology Group Co., Ltd. Image fusion method and VR device
CN109741289A (en) * 2019-01-25 2019-05-10 BOE Technology Group Co., Ltd. Image fusion method and VR device
CN111429335A (en) * 2020-06-12 2020-07-17 Hengxin Dongfang Culture Co., Ltd. Picture caching method and system in a virtual dressing system
CN111429335B (en) * 2020-06-12 2020-09-08 Hengxin Dongfang Culture Co., Ltd. Picture caching method and system in a virtual dressing system
CN113115110A (en) * 2021-05-20 2021-07-13 Guangzhou Boguan Information Technology Co., Ltd. Video synthesis method and device, storage medium and electronic device
CN113660528A (en) * 2021-05-24 2021-11-16 Hangzhou Qunhe Information Technology Co., Ltd. Video synthesis method and device, electronic device and storage medium
CN113660528B (en) * 2021-05-24 2023-08-25 Hangzhou Qunhe Information Technology Co., Ltd. Video synthesis method and device, electronic device and storage medium
CN113269832A (en) * 2021-05-31 2021-08-17 Changchun Institute of Technology Augmented reality navigation system and method for electric power operations in extreme weather environments

Also Published As

Publication number Publication date
WO2014040281A1 (en) 2014-03-20
CN103814382B (en) 2016-10-05

Similar Documents

Publication Publication Date Title
CN103814382A (en) Augmented reality processing method and device of mobile terminal
CN109618212B (en) Information display method, device, terminal and storage medium
JP2013196157A (en) Control apparatus, electronic device, control method, and program
CN103871092A (en) Display control device, display control method and program
CN103248810A (en) Image processing device, image processing method, and program
US20190335034A1 (en) Input method, device and system
CN111552470A (en) Data analysis task creation method and device in Internet of things and storage medium
CN111432245B (en) Multimedia information playing control method, device, equipment and storage medium
CN103455136A (en) Inputting method, inputting device and inputting system based on gesture control
CN103201710A (en) Image processing system, image processing method, and storage medium storing image processing program
JP2013196158A (en) Control apparatus, electronic apparatus, control method, and program
CN104025615A (en) Interactive streaming video
CN111836069A (en) Virtual gift presenting method, device, terminal, server and storage medium
CN110933468A (en) Playing method, playing device, electronic equipment and medium
CN103309437A (en) Buffering mechanism for camera-based gesturing
CN114154068A (en) Media content recommendation method and device, electronic equipment and storage medium
CN113422980B (en) Video data processing method and device, electronic equipment and storage medium
US20170180671A1 (en) Method for displaying operation trajectory , an electronic device and a non-transitory computer-readable storage medium
CN111277904B (en) Video playing control method and device and computing equipment
CN104541304A (en) Target object angle determination using multiple cameras
CN105468206B (en) Interactive demonstration method and equipment
CN112001442A (en) Feature detection method and device, computer equipment and storage medium
CN114339294B (en) Method, device and equipment for confirming network jitter and storage medium
CN115531875A (en) Virtual scene zooming method and device, storage medium and electronic equipment
JP6856084B2 (en) Information processing device, content control device, information processing method, and program

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant