CN109302561A - Image capture method, terminal and storage medium - Google Patents
Image capture method, terminal and storage medium
- Publication number
- CN109302561A CN109302561A CN201710614083.7A CN201710614083A CN109302561A CN 109302561 A CN109302561 A CN 109302561A CN 201710614083 A CN201710614083 A CN 201710614083A CN 109302561 A CN109302561 A CN 109302561A
- Authority
- CN
- China
- Prior art keywords
- camera
- terminal
- motion tracking
- rgb
- rgb camera
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/80—Camera processing pipelines; Components thereof
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/667—Camera operation mode switching, e.g. between still and video, sport and normal or high- and low-resolution modes
Abstract
The invention discloses an image capture method, a terminal, and a storage medium. The terminal includes a motion tracking camera and an RGB (red-green-blue) camera adapted to an AR function. The method includes: while the terminal's AR application is closed, upon receiving a photographing instruction, starting the motion tracking camera and the RGB camera, and compositing the images captured by the motion tracking camera and the RGB camera; wherein the field-of-view overlap of the motion tracking camera and the RGB camera is greater than a set threshold. The invention reuses the motion tracking camera adapted to the AR function, so that it can be used efficiently in different application scenarios; this both improves the overall performance of the terminal device and, at equal performance, reduces its cost.
Description
Technical field
The present invention relates to the field of augmented reality display technology, and in particular to an image capture method, a terminal, and a storage medium.
Background art
Google Tango is an AR (augmented reality) development platform that adds a fisheye camera and a depth detection module on top of an ordinary RGB (red-green-blue) camera. The depth detection module consists of an IR (infrared) emitter and an IR sensor: the former emits infrared light, the latter detects it, and from the flight time of the infrared light over the distance, each pixel stores a range value, forming a point cloud that is then combined with the RGB camera's texture for 3D imaging. The depth detection module can range surrounding objects to centimeter accuracy and can detect objects at distances of 0.4 m to 4 m. Because depth detection uses IR, it places requirements on the lighting: it cannot be used outdoors, and even indoor use requires suitable lighting. The fisheye camera is used for motion tracking: it perceives the mobile device's own movement as the device moves and, together with an IMU (inertial measurement unit), reconstructs the device's motion trajectory.
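The per-pixel ranging described above (each pixel storing a range value that becomes a point in a cloud, later textured by the RGB camera) can be sketched as a pinhole back-projection. This is an illustrative reconstruction, not from the patent: the function name, the intrinsics `fx, fy, cx, cy`, and the toy depth map are assumptions; only the 0.4 m to 4 m working range comes from the text.

```python
def depth_to_point_cloud(depth, fx, fy, cx, cy):
    """Back-project a depth map (metres) into 3D points in the camera frame."""
    points = []
    for v, row in enumerate(depth):
        for u, z in enumerate(row):
            # Keep only ranges inside the module's stated 0.4 m - 4 m window.
            if not 0.4 <= z <= 4.0:
                continue
            x = (u - cx) * z / fx
            y = (v - cy) * z / fy
            points.append((x, y, z))
    return points

# Toy 2x2 depth map; 5.0 m is outside the working range and is discarded.
cloud = depth_to_point_cloud([[1.0, 5.0], [0.5, 2.0]],
                             fx=500.0, fy=500.0, cx=0.5, cy=0.5)
```

A real pipeline would then attach RGB texture to each point for 3D imaging, as the platform description indicates.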
However, a drawback of the existing Google Tango technology is that some of its dedicated hardware components are used only by AR applications, so their reuse is low and the cost is relatively high. Meanwhile, the depth detection module senses the surrounding environment with infrared, is limited by lighting, can only be used indoors, and even indoors places requirements on the lighting, so its applicability is limited.
Summary of the invention
Embodiments of the present invention provide an image capture method, a terminal, and a storage medium, to solve the prior-art problem that the hardware components adapted to the AR function have low reusability, which raises the overall cost of the terminal.

According to one aspect of the present invention, an image capture method is provided, applied in a terminal having an AR function. The terminal includes a motion tracking camera and an RGB camera adapted to the AR function. The method includes:

while the terminal's AR application is closed, upon receiving a photographing instruction, starting the motion tracking camera and the RGB camera, and compositing the images captured by the motion tracking camera and the RGB camera;

wherein the field-of-view overlap of the motion tracking camera and the RGB camera is greater than a set threshold.
Optionally, the method further includes: while the terminal's AR application is open, compositing the images captured by the motion tracking camera and the RGB camera, and transmitting the composited image to the AR application to assist it in 3D texture imaging.

Optionally, the RGB camera is an optical zoom camera or a digital zoom camera.

Optionally, when the RGB camera is an optical zoom camera, the RGB camera includes a periscope camera; when the RGB camera is a digital zoom camera, the RGB camera includes a telephoto camera.

Optionally, when the RGB camera is an optical zoom camera, it works in optical zoom mode while the terminal's AR application is closed, and in fixed-focus mode or zoom mode while the terminal's AR application is open.

Optionally, the field-of-view overlap of the motion tracking camera and the RGB camera is greater than 70%.

Optionally, the motion tracking camera is a fisheye camera; and/or the resolution of the motion tracking camera is 8 megapixels or more.

Optionally, the terminal further includes a depth detection module adapted to the AR function; the depth detection module is a time-of-flight (TOF) radar module.
According to another aspect of the invention, a terminal is provided, comprising: a memory, a processor, a motion tracking camera and an RGB camera adapted to the AR function, and a computer program stored in the memory and runnable on the processor; the processor, when executing the computer program, implements the steps of the method described above.

According to a third aspect of the invention, a computer-readable storage medium is provided, on which a computer program is stored; when executed by a processor, the program implements the steps of the method described above.
The present invention has the following beneficial effects:

First, embodiments of the present invention reuse the motion tracking camera adapted to the AR function, so that it can be used efficiently in different application scenarios; this both improves the overall performance of the terminal device and, at equal performance, reduces its cost.

Second, by using an RGB camera with optical zoom, embodiments of the present invention can markedly improve the terminal's ability to image the real environment.

Third, by replacing the existing infrared depth detection module with a TOF radar depth detection module, embodiments of the present invention extend the usable range of the AR function from indoors to outdoors, increasing the value of the AR platform.

The above is only an overview of the technical solution of the embodiments of the present invention. So that the technical means of the embodiments can be better understood and implemented in accordance with the contents of the specification, and so that the above and other objects, features, and advantages of the embodiments can be more clearly understood, specific embodiments of the invention are set forth below.
Brief description of the drawings
By reading the following detailed description of the preferred embodiments, various other advantages and benefits will become clear to those of ordinary skill in the art. The drawings are only for the purpose of illustrating the preferred embodiments and are not to be considered a limitation of the invention. Throughout the drawings, the same reference numbers refer to the same parts. In the drawings:

Fig. 1 is a flowchart of an image capture method provided by the first embodiment of the invention;

Fig. 2 is a mode transition diagram in an embodiment of the invention;

Fig. 3 is an example flowchart of an image capture method of the invention;

Fig. 4 is a flowchart of an image capture method provided by the second embodiment of the invention;

Fig. 5 is another example flowchart of an image capture method of the invention;

Fig. 6 is a structural block diagram of a terminal provided by the third embodiment of the invention.
Detailed description of the embodiments
Exemplary embodiments of the present disclosure are described in more detail below with reference to the accompanying drawings. Although the drawings show exemplary embodiments of the disclosure, it should be understood that the disclosure may be implemented in various forms and should not be limited by the embodiments set forth here. Rather, these embodiments are provided so that the present invention can be more thoroughly understood and the scope of the disclosure fully conveyed to those skilled in the art.
In the first embodiment of the present invention, an image capture method is provided, applied in a terminal having an AR function. The terminal includes a motion tracking camera, an RGB camera, a depth detection module, and an inertial measurement unit adapted to the AR function. As shown in Fig. 1, the method of this embodiment includes the following steps:

Step S101: while the terminal's AR application is closed, upon receiving a photographing instruction, start the motion tracking camera and the RGB camera;

Step S102: composite the images captured by the motion tracking camera and the RGB camera.

In the embodiment of the present invention, the field-of-view (FOV) overlap of the motion tracking camera and the RGB camera is greater than a set threshold. Optionally, in this embodiment the field-of-view overlap of the motion tracking camera and the RGB camera is greater than 70%; to improve the synthesis result, an overlap of 90% is recommended in this embodiment.
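The overlap requirement above can be sketched as a simple check. This is a minimal illustration that models each field of view as a 1-D angular interval and defines overlap as a fraction of the narrower interval; a real implementation would intersect the projected image footprints, and both function names are assumptions:

```python
def fov_overlap_ratio(fov_a, fov_b):
    """Overlap of two angular intervals (degrees), as a fraction of the narrower one."""
    lo = max(fov_a[0], fov_b[0])
    hi = min(fov_a[1], fov_b[1])
    overlap = max(0.0, hi - lo)
    narrower = min(fov_a[1] - fov_a[0], fov_b[1] - fov_b[0])
    return overlap / narrower

def overlap_sufficient(fov_a, fov_b, threshold=0.70):
    # The embodiment requires overlap above a set threshold: 70%, with 90% recommended.
    return fov_overlap_ratio(fov_a, fov_b) >= threshold

# A 220-degree fisheye fully covers a centred 80-degree RGB field of view.
fisheye = (-110.0, 110.0)
rgb = (-40.0, 40.0)
```

Under this model, a wide fisheye centred on the RGB camera's axis easily meets the threshold; misaligned or offset mountings would lower the ratio.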
With the method of the embodiment of the present invention, while the terminal's AR application is closed, the motion tracking camera adapted to the AR application is reused, so that the motion tracking camera and the RGB camera form a dual camera for photographing. The terminal's photographing and video capability is thus improved without increasing the terminal's cost.

In the embodiment of the present invention, the motion tracking camera may be, but is not limited to, a fisheye camera. The greatest advantage of a fisheye lens is its large angle of view, generally up to 220° or even 230°, which creates the conditions for capturing a wide landscape at close range. When shooting close to an object, a fisheye camera produces a strong perspective effect, emphasizing the near-large, far-small contrast and giving the picture a striking appeal; a fisheye lens also has a very long depth of field, which helps convey a photo's sense of depth. In the embodiment of the present invention, to guarantee the quality of the composite of the fisheye camera and the RGB camera, the fisheye camera is designed with a resolution of 8 megapixels or more.
In a specific embodiment of the present invention, the RGB camera is an optical zoom camera, which may be, but is not limited to, a periscope camera. A periscope camera refracts light through a prism and focuses by floating the lens module inside the device, achieving optical zoom without a protruding camera and thus clearly capturing distant subjects.

In this embodiment, when the RGB camera is an optical zoom camera, it works in optical zoom mode while the terminal's AR application is closed. By operating the RGB camera in optical zoom mode, this embodiment improves the photographing result and yields high-quality images.

In yet another embodiment of the present invention, for terminal devices whose AR applications do not demand very high picture quality, a telephoto camera may be used instead of a periscope camera.
Further, in the embodiment of the present invention, while the terminal's AR application is open, the RGB camera and the motion tracking camera are used for depth detection 3D texture imaging and motion tracking respectively. Specifically:

The RGB camera sends the captured environment view to the AR application, which matches it against the data detected by the depth detection module, realizing depth detection 3D texture imaging.

The motion tracking camera sends the captured environment view to the AR application, which matches it against the data detected by the inertial measurement unit, realizing motion tracking of the terminal device.
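The patent only states that the AR application matches fisheye frames against IMU data; one common way to realize such matching is a complementary-filter-style blend. The sketch below is 1-D and purely illustrative — the class name, the 0.2 weight, and the update interface are all assumptions, not the patent's method:

```python
class MotionTracker:
    """Toy 1-D fusion: dead-reckon from IMU deltas, then pull the estimate
    toward a camera-derived position to cancel drift."""

    def __init__(self, visual_weight=0.2):
        self.position = 0.0
        self.visual_weight = visual_weight

    def update(self, imu_delta, visual_position=None):
        self.position += imu_delta  # integrate the inertial measurement
        if visual_position is not None:
            w = self.visual_weight
            # Blend in the fisheye-camera estimate to correct accumulated drift.
            self.position = (1 - w) * self.position + w * visual_position
        return self.position
```

The IMU supplies high-rate motion increments; the fisheye frames supply a slower but drift-free correction, which is why the two sensors are paired for trajectory tracking.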
In this embodiment, when the RGB camera is an optical zoom camera, it works in fixed-focus mode while the terminal's AR application is open; the fixed focal length can be pre-configured. In this embodiment, setting the RGB camera to fixed-focus mode reduces the terminal's power consumption.

Of course, in this embodiment, the RGB camera may also work in zoom mode while the terminal's AR application is open. With the RGB camera in zoom mode in an AR scene, optical zoom strengthens long-range shooting, which in turn improves the 3D texture imaging and the AR user experience.

Those skilled in the art can flexibly set the RGB camera's operating mode while the terminal's AR application is open, as required.
Further, in the embodiment of the present invention, the depth detection module is designed as a TOF (time-of-flight) radar module. Unlike infrared, radar depth detection is not affected by lighting or weather, so applications such as real-time 3D navigation and mapping are unaffected by lighting and weather, making the AR device suitable for various outdoor occasions.
The implementation of the method of this embodiment is described in detail below through an example.

The method of the embodiment of the present invention is applied to an intelligent mobile terminal based on the Google Tango AR development platform: the fisheye camera is reused, and a periscope camera serves as the RGB camera, so that the terminal can both run AR applications and capture high-quality photos and videos. In the embodiment of the present invention, the fisheye camera and the RGB camera are designed with a FOV overlap of 90% or more.
As shown in Fig. 2, which shows the mode transition diagram: when the terminal boots, the default mode is optical zoom mode or, alternatively, Tango AR mode. In Tango AR mode, closing the Tango application switches the terminal to optical zoom mode, and the periscope camera changes from a fixed focal length to its normal zoom function; in optical zoom mode, opening the Tango AR application switches the terminal to Tango AR mode, and the periscope camera is fixed at a suitable focal length.
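The mode transitions just described amount to a two-state machine keyed on whether the Tango AR application is running. A hedged sketch of that state machine — class and attribute names are invented for illustration:

```python
OPTICAL_ZOOM = "optical_zoom"
TANGO_AR = "tango_ar"

class TerminalModes:
    """Two-state mode machine: Tango AR running -> fixed-focus periscope camera;
    Tango AR closed -> normal optical zoom."""

    def __init__(self):
        self.mode = OPTICAL_ZOOM   # default mode at boot
        self.periscope_focus = "zoom"

    def on_tango_ar_changed(self, running):
        if running:
            self.mode = TANGO_AR
            self.periscope_focus = "fixed"   # fixed at a suitable focal length
        else:
            self.mode = OPTICAL_ZOOM
            self.periscope_focus = "zoom"    # restore normal zoom function
        return self.mode
```

Keeping the transition logic in one place mirrors the diagram: the camera's focus behaviour is always a function of the single AR-application state.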
As shown in Fig. 3, the method of this embodiment includes the following steps:

Step S301: at boot, the default mode is optical zoom mode.

Step S302: the terminal determines whether the Tango AR application has started. If it has not, the terminal is set to optical zoom mode and step S303 is executed; if it has, the terminal is set to Tango AR mode and step S304 is executed.

When the terminal is in optical zoom mode, the periscope camera works in zoom mode; when the terminal is in Tango AR mode, the periscope camera works in fixed-focus mode. Of course, in some specific AR application scenarios, such as long-range scenes, the periscope camera may also work in zoom mode; the user can then adjust the periscope camera's focal length to increase image recognition accuracy and improve AR application performance.

Step S303: upon receiving the user's photographing instruction, the terminal starts the periscope camera and the fisheye camera, and composites the images captured by the two cameras into a good photo.

In use, the user adjusts the periscope camera's focal length as needed when taking a photo; the fisheye camera and the periscope camera capture the near view and the far view respectively, which are composited into a high-quality photo.
Step S304: the terminal's RGB camera captures the environment view and sends it to the AR application to assist with 3D navigation and mapping; the terminal's fisheye camera captures the environment view and sends it to the AR application to assist with motion tracking.
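Steps S301–S304 reduce to a dispatch on the AR application's state: compose a photo when AR is closed, or route each camera's frame to its AR consumer when AR is open. A sketch under those assumptions, with the synthesis algorithm left to a caller-supplied function since the patent does not specify it (function and key names are illustrative):

```python
def handle_capture(ar_running, fisheye_frame, rgb_frame, compose):
    """Dispatch per steps S301-S304. `compose` is caller-supplied because the
    patent does not specify the synthesis algorithm."""
    if not ar_running:
        # S303: AR closed -> composite the two frames into one photo.
        return {"photo": compose(fisheye_frame, rgb_frame)}
    # S304: AR open -> RGB frame assists 3D navigation/mapping,
    # fisheye frame assists motion tracking.
    return {"3d_texture_input": rgb_frame, "motion_tracking_input": fisheye_frame}
```

The same two cameras thus serve either as a photo dual camera or as two independent AR sensors, which is the reuse the embodiment aims at.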
In the second embodiment of the present invention, an image capture method is provided, applied in a terminal having an AR function. The terminal includes a motion tracking camera, an RGB camera, a depth detection module, and an inertial measurement unit adapted to the AR function. This embodiment focuses on the differences from the first embodiment; for the common parts, refer to the first embodiment. As shown in Fig. 4, the method of this embodiment includes the following steps:

Step S401: while the terminal's AR application is closed, upon receiving a photographing instruction, start the motion tracking camera and the RGB camera, and composite the images captured by the motion tracking camera and the RGB camera.

Step S402: while the terminal's AR application is open, composite the images captured by the motion tracking camera and the RGB camera, and transmit the composited image to the AR application to assist it in 3D texture imaging.

In the embodiment of the present invention, when the RGB camera is an optical zoom camera, it works in fixed-focus mode or zoom mode while the terminal's AR application is open. Those skilled in the art can flexibly set the RGB camera's operating mode while the terminal's AR application is open, as required.

In the embodiment of the present invention, using the composite image of the RGB camera and the motion tracking camera to assist the AR application in 3D texture imaging improves the 3D texture imaging and the AR user experience.

It should be pointed out that steps S401 and S402 have no strict ordering in the embodiment of the present invention.
The implementation of the method of this embodiment is described in detail below through an example.

The method of the embodiment of the present invention is applied to an intelligent mobile terminal based on the Google Tango AR development platform: the fisheye camera is reused, and a periscope camera serves as the RGB camera, so that the terminal can both run AR applications and capture high-quality photos and videos. In the embodiment of the present invention, the fisheye camera and the RGB camera are designed with a FOV overlap of 90% or more.
As shown in Fig. 5, the method of the embodiment of the present invention includes the following steps:

Step S501: the terminal device determines whether the Tango AR application has started. If it has not, the terminal is set to optical zoom mode and step S502 is executed; if it has, the terminal is set to Tango AR mode and step S503 is executed.

When the terminal is in optical zoom mode, the periscope camera works in zoom mode; when the terminal is in Tango AR mode, the periscope camera works in fixed-focus mode or zoom mode.
Step S502: upon receiving the user's photographing instruction, the terminal starts the periscope camera and the fisheye camera, and composites the images captured by the two cameras into a good photo.

Step S503: the terminal composites the environment views captured by the RGB camera and the fisheye camera and sends the composited environment view to the AR application to assist with 3D navigation and mapping; the terminal's fisheye camera captures the environment view and sends it to the AR application to assist with motion tracking of the terminal itself.
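The second-embodiment flow differs from the Fig. 3 flow in that the composite image is also produced while AR is open, feeding 3D navigation and mapping, while the fisheye frame alone still drives motion tracking. A sketch of that variant (function and key names are illustrative, and the synthesis function is again caller-supplied):

```python
def handle_capture_v2(ar_running, fisheye_frame, rgb_frame, compose):
    """Second-embodiment variant (steps S501-S503): the composite is produced
    in both states; while AR is open it feeds 3D navigation/mapping, and the
    fisheye frame alone still drives motion tracking."""
    composite = compose(fisheye_frame, rgb_frame)
    if not ar_running:
        return {"photo": composite}
    return {"3d_texture_input": composite, "motion_tracking_input": fisheye_frame}
```

Feeding the composite rather than the bare RGB frame to the 3D texture pipeline is what the embodiment credits for the improved texture quality.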
In this embodiment, the composite picture of the periscope camera and the fisheye camera effectively improves the 3D texture quality in the AR application.

In the embodiment of the present invention, the user can open or close the Tango AR application. If the user closes the application while it is open, the Tango AR related resources are released as in a general Tango AR release flow; if the user opens the application while it is closed, the terminal device acquires the Tango AR related resources as in a general Tango AR application-resource flow.
In the third embodiment of the present invention, a terminal is provided, as shown in Fig. 6, comprising: a memory 610, a processor 620, a motion tracking camera 630, an RGB camera 640, a depth detection module 650, and an inertial measurement unit 660 adapted to the AR function, and a computer program stored in the memory 610 and runnable on the processor 620. The processor 620, when executing the computer program, implements the following method steps:

While the terminal's AR application is closed, upon receiving a photographing instruction, start the motion tracking camera 630 and the RGB camera 640, and composite the images captured by the motion tracking camera 630 and the RGB camera 640.

In the embodiment of the present invention, the FOV overlap of the motion tracking camera 630 and the RGB camera 640 is greater than a set threshold. Optionally, in this embodiment the overlap is greater than 70%; to improve the synthesis result, an overlap of 90% is recommended in this embodiment.

In the embodiment of the present invention, the motion tracking camera 630 may be, but is not limited to, a fisheye camera.

In the embodiment of the present invention, to guarantee the quality of the composite of the motion tracking camera 630 and the RGB camera 640, the motion tracking camera 630 is designed with a resolution of 8 megapixels or more.
In a specific embodiment of the present invention, the RGB camera 640 is an optical zoom camera, which may be, but is not limited to, a periscope camera.

In this embodiment, when the RGB camera 640 is an optical zoom camera, it works in optical zoom mode while the terminal's AR application is closed. By operating the RGB camera 640 in optical zoom mode, this embodiment improves the photographing result and yields high-quality images.

In yet another embodiment of the present invention, for terminal devices whose AR applications do not demand very high picture quality, a telephoto camera may be used instead of a periscope camera.

In a specific embodiment of the present invention, while the terminal's AR application is open, the RGB camera 640 and the motion tracking camera 630 are used for depth detection 3D texture imaging and motion tracking respectively. Specifically:

The RGB camera 640 sends the captured environment view to the AR application, which matches it against the data detected by the depth detection module 650, realizing depth detection 3D texture imaging.

The motion tracking camera 630 sends the captured environment view to the AR application, which matches it against the data detected by the inertial measurement unit 660, realizing motion tracking of the terminal device.
In this embodiment, when the RGB camera 640 is an optical zoom camera, it works in fixed-focus mode while the terminal's AR application is open; the fixed focal length can be pre-configured. In this embodiment, setting the RGB camera to fixed-focus mode reduces the terminal's power consumption.

Of course, in this embodiment, the RGB camera 640 may also work in zoom mode while the terminal's AR application is open. With the RGB camera 640 in zoom mode in an AR scene, optical zoom strengthens long-range shooting, which in turn improves the 3D texture imaging and the AR user experience.

Those skilled in the art can flexibly set the RGB camera's operating mode while the terminal's AR application is open, as required.

In yet another embodiment of the present invention, while the terminal's AR application is open, the images captured by the motion tracking camera 630 and the RGB camera 640 are composited, and the composited image is transmitted to the AR application to assist it in 3D texture imaging.

In the embodiment of the present invention, using the composite image of the RGB camera 640 and the motion tracking camera 630 to assist the AR application in 3D texture imaging improves the 3D texture imaging and the AR user experience.
Further, in the embodiment of the present invention, the depth detection module 650 is designed as a TOF (time-of-flight) radar module. Unlike infrared, radar depth detection is not affected by lighting or weather, so applications such as real-time 3D navigation and mapping are unaffected by lighting and weather, making the AR device suitable for various outdoor occasions.

With the terminal of the embodiment of the present invention, while the AR application is closed, the motion tracking camera adapted to the AR application is reused, so that the motion tracking camera and the RGB camera form a dual camera for photographing; the terminal's photographing and video capability is thus improved without increasing the terminal's cost.

In addition, by using an RGB camera with optical zoom, the terminal of the embodiment of the present invention can markedly improve its ability to image the real environment.

Furthermore, by replacing the existing infrared depth detection module with a TOF radar depth detection module, the terminal of the embodiment of the present invention extends the usable range of the AR function from indoors to outdoors, increasing the value of the AR platform.
In the fourth embodiment of the present invention, a computer-readable storage medium is provided, on which a computer program is stored; when executed by a processor, the program implements the steps of the method described in the first and/or second embodiment.

Since the implementation of the method has been elaborated in the first and second embodiments, it is not repeated for this embodiment.

Those skilled in the art should understand that embodiments of the present invention may be provided as a method, an apparatus, a system, or a computer program product. Therefore, the present invention may take the form of a hardware embodiment, a software embodiment, or an embodiment combining software and hardware aspects. Moreover, the present invention may take the form of a computer program product implemented on one or more computer-usable storage media (including but not limited to disk storage and optical storage) containing computer-usable program code.
The present invention be referring to according to the method for the embodiment of the present invention, the process of equipment (system) and computer program product
Figure and/or block diagram describe.It should be understood that every one stream in flowchart and/or the block diagram can be realized by computer program instructions
The combination of process and/or box in journey and/or box and flowchart and/or the block diagram.It can provide these computer programs
Instruct the processor of general purpose computer, special purpose computer, Embedded Processor or other programmable data processing devices to produce
A raw machine, so that being generated by the instruction that computer or the processor of other programmable data processing devices execute for real
The device for the function of being specified in present one or more flows of the flowchart and/or one or more blocks of the block diagram.
These computer program instructions, which may also be stored in, is able to guide computer or other programmable data processing devices with spy
Determine in the computer-readable memory that mode works, so that it includes referring to that instruction stored in the computer readable memory, which generates,
Enable the manufacture of device, the command device realize in one box of one or more flows of the flowchart and/or block diagram or
The function of being specified in multiple boxes.
These computer program instructions may also be loaded onto a computer or another programmable data processing device, so that a series of operational steps is executed on the computer or other programmable device to produce computer-implemented processing; the instructions executed on the computer or other programmable device thereby provide steps for implementing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
The foregoing is only a preferred embodiment of the present invention and is not intended to limit the scope of the present invention.
Claims (10)
1. An image capture method, applied to a terminal having an augmented reality (AR) function, the terminal comprising a motion tracking camera adapted to the AR function and a red-green-blue (RGB) camera, wherein the method comprises:
in a state in which the AR application of the terminal is closed, upon receiving a photographing instruction, starting the motion tracking camera and the RGB camera, and performing synthesis processing on the images captured by the motion tracking camera and the RGB camera;
wherein the field-of-view overlap between the motion tracking camera and the RGB camera is greater than a set threshold.
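The claimed flow can be sketched as a minimal Python example. This is an illustrative assumption, not the patented implementation: the camera callables, the `Frame` type, and the pixel-averaging synthesis step are all placeholders, since the claims specify neither a camera API nor a fusion algorithm.

```python
from dataclasses import dataclass

# Hypothetical frame type: a flat list of pixel values plus a descriptor
# for the fraction of field of view shared with the other camera.
@dataclass
class Frame:
    pixels: list
    fov_overlap: float

FOV_OVERLAP_THRESHOLD = 0.7  # claim 6 gives 70% as the example threshold

def synthesize(a: Frame, b: Frame) -> Frame:
    # Placeholder synthesis: average co-located pixels; the patent does
    # not specify the fusion algorithm.
    return Frame([(x + y) / 2 for x, y in zip(a.pixels, b.pixels)],
                 min(a.fov_overlap, b.fov_overlap))

def capture_photo(ar_app_open: bool, motion_tracking_cam, rgb_cam):
    """Claim 1 flow: with the AR application closed, a photographing
    instruction starts both cameras and synthesizes their images."""
    if ar_app_open:
        # Claim 2 path: the synthesized frames would instead be
        # transmitted to the AR application for 3D texture imaging.
        return None
    mt_frame = motion_tracking_cam()   # start motion tracking camera
    rgb_frame = rgb_cam()              # start RGB camera
    # Precondition from claim 1: the cameras' fields of view must
    # overlap by more than the set threshold.
    assert mt_frame.fov_overlap > FOV_OVERLAP_THRESHOLD
    return synthesize(mt_frame, rgb_frame)
```

On a real terminal the two callables would wrap hardware camera handles; here they can simply be lambdas returning test frames.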
2. The method according to claim 1, wherein the method further comprises:
in a state in which the AR application of the terminal is open, performing synthesis processing on the images captured by the motion tracking camera and the RGB camera, and transmitting the synthesized image to the AR application, so as to assist the AR application in performing 3D texture imaging.
3. The method according to claim 1 or 2, wherein the RGB camera is an optical zoom camera or a digital zoom camera.
4. The method according to claim 3, wherein when the RGB camera is an optical zoom camera, the RGB camera comprises a periscope camera; and when the RGB camera is a digital zoom camera, the RGB camera comprises a telephoto camera.
5. The method according to claim 3, wherein when the RGB camera is an optical zoom camera, the RGB camera works in an optical zoom mode in the state in which the AR application of the terminal is closed, and works in a fixed-focus mode or a zoom mode in the state in which the AR application of the terminal is open.
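The state-dependent camera mode of claim 5 can be expressed as a small decision function. The mode names and the function signature below are illustrative assumptions, not part of the claim:

```python
def rgb_camera_mode(is_optical_zoom: bool, ar_app_open: bool) -> str:
    """Claim 5: an optical-zoom RGB camera works in optical zoom mode
    while the terminal's AR application is closed, and in fixed-focus
    (or zoom) mode while the AR application is open."""
    if not is_optical_zoom:
        # Claim 3's alternative camera type; claim 5 places no
        # state-dependent constraint on it.
        return "digital_zoom"
    return "fixed_focus" if ar_app_open else "optical_zoom"
```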
6. The method according to claim 1, wherein the field-of-view overlap between the motion tracking camera and the RGB camera is greater than 70%.
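The overlap requirement of claim 6 can be checked numerically. The sketch below models, as a simplifying assumption, each field of view as an axis-aligned angular rectangle and computes the fraction of the RGB camera's FOV covered by the motion tracking camera's FOV; the patent does not prescribe how the overlap is measured.

```python
def fov_overlap(fov_a, fov_b):
    """Fraction of camera B's field of view covered by camera A.

    Each FOV is an angular rectangle (left, bottom, right, top) in
    degrees. Claim 6 requires the result to exceed 0.7 when A is the
    motion tracking camera and B is the RGB camera.
    """
    la, ba, ra, ta = fov_a
    lb, bb, rb, tb = fov_b
    w = max(0.0, min(ra, rb) - max(la, lb))   # horizontal angular overlap
    h = max(0.0, min(ta, tb) - max(ba, bb))   # vertical angular overlap
    area_b = (rb - lb) * (tb - bb)
    return (w * h) / area_b
```

A wide motion-tracking FOV (e.g. the fisheye camera of claim 7) that fully contains the narrower RGB FOV yields an overlap of 1.0, comfortably above the 70% threshold.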
7. The method according to claim 1, wherein the motion tracking camera is a fisheye camera; and/or the resolution of the motion tracking camera is 8 megapixels or more.
8. The method according to claim 1, wherein the terminal further comprises a depth detection module adapted to the augmented reality (AR) function, the depth detection module being a time-of-flight (TOF) radar module.
9. A terminal, comprising: a memory, a processor, a motion tracking camera adapted to an AR function, a red-green-blue (RGB) camera, and a computer program stored on the memory and executable on the processor, wherein the processor, when executing the computer program, implements the steps of the method according to any one of claims 1 to 8.
10. A computer-readable storage medium, on which a computer program is stored, wherein the program, when executed by a processor, implements the steps of the method according to any one of claims 1 to 8.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710614083.7A CN109302561A (en) | 2017-07-25 | 2017-07-25 | Image capture method, terminal and storage medium |
PCT/CN2018/095034 WO2019019907A1 (en) | 2017-07-25 | 2018-07-09 | Photographing method, terminal and storage medium |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710614083.7A CN109302561A (en) | 2017-07-25 | 2017-07-25 | Image capture method, terminal and storage medium |
Publications (1)
Publication Number | Publication Date |
---|---|
CN109302561A | 2019-02-01 |
Family
ID=65039976
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201710614083.7A Pending CN109302561A (en) | 2017-07-25 | 2017-07-25 | Image capture method, terminal and storage medium |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN109302561A (en) |
WO (1) | WO2019019907A1 (en) |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN202818502U (en) * | 2012-09-24 | 2013-03-20 | Tianjin Yaan Technology Co., Ltd. | Multidirectional monitoring area early-warning positioning monitoring device |
CN103179339A (en) * | 2011-12-01 | 2013-06-26 | Sony Corporation | Image processing system and method |
CN105409212A (en) * | 2013-02-28 | 2016-03-16 | Google Technology Holdings LLC | Electronic device with multiview image capture and depth sensing |
US20160368417A1 (en) * | 2015-06-17 | 2016-12-22 | Geo Semiconductor Inc. | Vehicle vision system |
CN106464786A (en) * | 2014-06-27 | 2017-02-22 | Fujifilm Corporation | Imaging device |
CN106941588A (en) * | 2017-03-13 | 2017-07-11 | Lenovo (Beijing) Co., Ltd. | Data processing method and electronic device |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10719939B2 (en) * | 2014-10-31 | 2020-07-21 | Fyusion, Inc. | Real-time mobile device capture and generation of AR/VR content |
CN105681766A (en) * | 2016-03-21 | 2016-06-15 | Guizhou University | Three-dimensional panoramic camera augmented reality system |
CN106210547A (en) * | 2016-09-05 | 2016-12-07 | Guangdong OPPO Mobile Telecommunications Corp., Ltd. | Panoramic shooting method, apparatus and system |
CN106791298A (en) * | 2016-12-01 | 2017-05-31 | Guangdong Hongqin Communication Technology Co., Ltd. | Terminal with dual cameras and photographing method |
- 2017-07-25: CN application CN201710614083.7A (published as CN109302561A), status: Pending
- 2018-07-09: WO application PCT/CN2018/095034 (published as WO2019019907A1), status: Application Filing
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109919128A (en) * | 2019-03-20 | 2019-06-21 | Lenovo (Beijing) Co., Ltd. | Control instruction acquisition method and device, and electronic device |
CN109919128B (en) * | 2019-03-20 | 2021-04-13 | Lenovo (Beijing) Co., Ltd. | Control instruction acquisition method and device, and electronic device |
CN111932901A (en) * | 2019-05-13 | 2020-11-13 | Alibaba Group Holding Ltd. | Road vehicle tracking detection apparatus, method and storage medium |
CN111932901B (en) * | 2019-05-13 | 2022-08-09 | Banma Zhixing Network (Hong Kong) Co., Ltd. | Road vehicle tracking detection apparatus, method and storage medium |
CN112230217A (en) * | 2020-09-10 | 2021-01-15 | Chengdu Doppler Technology Co., Ltd. | Miniature integrated photoelectric radar sensor for intelligent vehicles |
CN113012199A (en) * | 2021-03-23 | 2021-06-22 | Beijing Lynxi Technology Co., Ltd. | System and method for tracking a moving target |
CN113012199B (en) * | 2021-03-23 | 2024-01-12 | Beijing Lynxi Technology Co., Ltd. | System and method for tracking a moving target |
Also Published As
Publication number | Publication date |
---|---|
WO2019019907A1 (en) | 2019-01-31 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN106716985B (en) | Camera control device, camera control method and camera system | |
US10116879B2 (en) | Method and apparatus for obtaining an image with motion blur | |
CN107087107A (en) | Image processing apparatus and method based on dual camera | |
US20190108622A1 (en) | Non-local means denoising | |
JP2018513640A (en) | Automatic panning shot generation | |
CN109302561A (en) | Image capture method, terminal and storage medium | |
CN109194877A (en) | Image compensation method and device, computer readable storage medium and electronic equipment | |
CN107800979A (en) | High dynamic range video capture method and capture apparatus | |
CN114095662B (en) | Shooting guide method and electronic equipment | |
CN104660909A (en) | Image acquisition method, image acquisition device and terminal | |
CN106605404A (en) | Camera initial position setting method, camera, and camera system | |
TW202203085A (en) | Automatic camera guidance and settings adjustment | |
WO2012163370A1 (en) | Image processing method and device | |
US10257417B2 (en) | Method and apparatus for generating panoramic images | |
JP2015159510A (en) | Image pickup device and image pickup device control method | |
CN104052913A (en) | Method for providing light painting effect, and device for realizing the method | |
CN106559614A (en) | Photographing method, device and terminal | |
WO2022040868A1 (en) | Panoramic photography method, electronic device, and storage medium | |
US9538097B2 (en) | Image pickup apparatus including a plurality of image pickup units and method of controlling the same | |
WO2021145913A1 (en) | Estimating depth based on iris size | |
CN112219218A (en) | Method and electronic device for recommending image capture mode | |
CN106921831B (en) | Method for generating photo containing rotation track and related photographic device | |
US20210303824A1 (en) | Face detection in spherical images using overcapture | |
US11636708B2 (en) | Face detection in spherical images | |
US9036070B2 (en) | Displaying of images with lighting on the basis of captured auxiliary images |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| RJ01 | Rejection of invention patent application after publication | Application publication date: 2019-02-01 |