CN105391924A - Data processing method and terminal - Google Patents

Data processing method and terminal

Info

Publication number
CN105391924A
Authority
CN
China
Prior art keywords
image data
frame period
image sensor
data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201510746994.6A
Other languages
Chinese (zh)
Inventor
黄晓峰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Jinli Communication Equipment Co Ltd
Original Assignee
Shenzhen Jinli Communication Equipment Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Jinli Communication Equipment Co Ltd filed Critical Shenzhen Jinli Communication Equipment Co Ltd
Priority to CN201510746994.6A
Publication of CN105391924A
Legal status: Pending

Classifications

    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 - Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/45 - Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from two or more image sensors being of different type or operating in different modes, e.g. with a CMOS sensor for moving images in combination with a charge-coupled device [CCD] for still images
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 - Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/60 - Control of cameras or camera modules
    • H04N 23/62 - Control of parameters via user interfaces

Abstract

Embodiments of the invention disclose a data processing method and a terminal. The method comprises: collecting, within one frame period, at least two items of image data with at least two image sensors, wherein the collection time points of the at least two items of image data are different; and outputting the at least two items of image data for display within the frame period in the order of collection. The embodiments of the invention improve the real-time performance of the preview picture.

Description

Data processing method and terminal
Technical field
The present invention relates to the field of electronic technology, and in particular to a data processing method and a terminal.
Background art
With the rapid development of electronic technology, terminals have spread into every field of daily life: people use them to shop, play games, read and take pictures (including video recording). Taking pictures is an important function, and the preview picture a terminal presents to the user while shooting helps improve the shooting result.
Referring to Fig. 1, which is a schematic diagram of how a preview picture is presented in the prior art, image data 111, image data 113, image data 115 and image data 117 are collected by the image sensor at time points 121, 123, 125 and 127 respectively. Under prior-art conditions, however, the amount of image data the image sensor collects is not sufficient to make the preview picture run smoothly. Therefore, to ensure the continuity of the preview picture, the terminal generates image data 112 from the collected image data 111 and image data 113, generates image data 114 from the collected image data 113 and image data 115, generates image data 116 from image data 115 and image data 117, and outputs image data 111 to 117 in sequence so that the preview picture is continuous.
On output, the collected image data 111, image data 113, image data 115 and image data 117 are each output half a frame period after their respective collection time points: the calculated image data 112 is output at the time point at which image data 113 is collected, the calculated image data 114 at the time point at which image data 115 is collected, and the calculated image data 116 at the time point at which image data 117 is collected. Clearly, image data collected at any moment has to be delayed by half a frame period before it can be presented in the preview picture, so the preview picture is inconsistent with the shooting scene and the shooting result suffers.
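For illustration only (the patent specifies no code; the frame period T, the function name and the schedule representation below are assumptions), the prior-art timing can be sketched as follows, showing that every captured frame reaches the display half a frame period after its capture time:

```python
# Illustrative timing of the single-sensor prior-art scheme of Fig. 1:
# the sensor captures once per frame period T, an intermediate frame is
# interpolated between consecutive captures, and every captured frame is
# therefore displayed T/2 after it was captured.

T = 1.0  # one frame period, arbitrary units

def prior_art_display_schedule(num_captures: int):
    """Yield (display_time, description) events for the prior-art scheme."""
    for i in range(num_captures):
        t_cap = i * T
        # Captured frame i is shown half a frame period after its capture ...
        yield t_cap + T / 2, f"frame {i} (captured at {t_cap:.2f}, lag {T / 2:.2f})"
        # ... and the frame interpolated from frames i and i+1 is shown at
        # the capture time of frame i+1, filling the gap between captures.
        if i + 1 < num_captures:
            yield (i + 1) * T, f"frame interpolated from frames {i} and {i + 1}"

for t, what in sorted(prior_art_display_schedule(4)):
    print(f"t = {t:4.2f}: display {what}")
```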
Summary of the invention
Embodiments of the invention disclose a data processing method and a terminal that can improve the real-time performance of the preview picture.
In a first aspect, an embodiment of the invention provides a data processing method, the method comprising:
collecting, within one frame period, at least two items of image data with at least two image sensors, wherein the collection time points of the at least two items of image data are different; and
outputting the at least two items of image data for display within one frame period in the order of collection.
In a second aspect, an embodiment of the invention provides a terminal, the terminal comprising:
a collecting unit, configured to collect, within one frame period, at least two items of image data with at least two image sensors, wherein the collection time points of the at least two items of image data are different; and
a first output unit, configured to output the at least two items of image data for display within one frame period in the order of collection.
By implementing the embodiments of the invention, the terminal collects image data with multiple image sensors in turn and outputs the collected image data in turn, which makes the preview picture closer to real time.
Brief description of the drawings
To describe the technical solutions of the embodiments of the present invention or of the prior art more clearly, the drawings needed in the embodiments or in the description of the prior art are briefly introduced below. Obviously, the drawings described below show only some embodiments of the invention, and a person of ordinary skill in the art may derive other drawings from them without creative effort.
Fig. 1 is a schematic diagram of presenting a preview picture provided in the prior art;
Fig. 2 is a schematic flowchart of a data processing method provided by an embodiment of the invention;
Fig. 2A is a schematic structural diagram of a terminal provided with image sensors according to an embodiment of the invention;
Fig. 2B is a schematic diagram of a scenario in which two image sensors collect image data according to an embodiment of the invention;
Fig. 2C is a schematic diagram of a scenario of collecting and outputting image data according to an embodiment of the invention;
Fig. 2D is a schematic diagram of a scenario of generating intermediate data by an interpolation algorithm according to an embodiment of the invention;
Fig. 3 is a schematic structural diagram of a terminal provided by an embodiment of the invention;
Fig. 3A is a schematic structural diagram of a first output unit provided by an embodiment of the invention;
Fig. 3B is a schematic structural diagram of another first output unit provided by an embodiment of the invention;
Fig. 4 is a schematic structural diagram of another terminal provided by an embodiment of the invention.
Detailed description of the embodiments
The technical solutions in the embodiments of the present invention are described clearly and completely below with reference to the accompanying drawings in the embodiments. Obviously, the described embodiments are only some, rather than all, of the embodiments of the invention. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the invention without creative effort fall within the protection scope of the invention.
It should be noted that the terms used in the embodiments of the present invention are only for describing specific embodiments and are not intended to limit the invention. The singular forms "a", "the" and "said" used in the embodiments of the present invention and the appended claims are also intended to include the plural forms, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" used herein refers to and covers any and all possible combinations of one or more of the associated listed items. In addition, the terms "first", "second", "third" and "fourth" in the specification, the claims and the above drawings are used to distinguish different objects rather than to describe a particular order. Furthermore, the terms "comprise" and "have" and any variants thereof are intended to cover a non-exclusive inclusion: a process, method, system, product or device that contains a series of steps or units is not limited to the listed steps or units, but optionally also includes steps or units that are not listed, or other steps or units inherent to the process, method, product or device.
The terminal described in the embodiments of the present invention may be any terminal device on which image sensors can be deployed, such as a mobile phone, a tablet computer, a notebook computer, a palmtop computer, a mobile internet device (MID) or a wearable device (for example a smart watch such as an iWatch, a smart band or a pedometer).
Referring to Fig. 2, which is a schematic flowchart of a data processing method provided by an embodiment of the invention, the method includes, but is not limited to, the following steps.
Step S201: the terminal collects, within one frame period, at least two items of image data with at least two image sensors, wherein the collection time points of the at least two items of image data are different.
Specifically, the terminal is provided with at least two image sensors, for example two, three or more image sensors. The optical axes of the image sensors provided on the terminal may be parallel, or may form angles within a certain range. When the optical axes are parallel, the image sensors can collect in the same direction, which ensures that the image data they collect is substantially identical; the effect is especially noticeable when collecting image data of a relatively distant subject, for example when the distance between any two image sensors is less than one tenth of the distance from the subject to the terminal, which indicates that the terminal is far from the subject.
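For illustration only (the function name and units are assumptions; the one-tenth threshold comes from the paragraph above), the far-scene rule can be expressed as:

```python
def terminal_is_far_from_subject(sensor_spacing_m: float, subject_distance_m: float) -> bool:
    """Rule from the description above: when the distance between any two image
    sensors is less than one tenth of the distance from the subject to the
    terminal, the terminal is far from the subject and the sensors' views are
    substantially identical."""
    return sensor_spacing_m < subject_distance_m / 10.0

# Example: two sensors 2 cm apart shooting a subject 3 m away.
print(terminal_is_far_from_subject(0.02, 3.0))  # True
```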
Within one frame period, the different image sensors collect image data one after another in sequence, that is, different image sensors collect at different time points within the frame period. The time intervals between the collection time points of two adjacent image sensors may be equal or unequal. Each image sensor can collect image data independently: when collecting, the image sensor senses the light reflected by the scene to form an analog signal of the scene, and converts the analog signal of the scene into the image data of the scene by analog-to-digital conversion.
In a specific embodiment, referring to Fig. 2A, the terminal 20 is provided with two image sensors, a first image sensor 21 and a second image sensor 22, which collect image data alternately. Assume that, within one frame period, the time intervals between the collection time points of the first image sensor 21 and the second image sensor 22 are equal. Referring to Fig. 2B, image data 211, image data 213 and image data 215 are collected by the first image sensor 21, and image data 212 and image data 214 are collected by the second image sensor 22; in other words, the first image sensor 21 and the second image sensor 22 collect image data in turn at intervals of 1/2 frame period until image data collection ends.
Step S202: the at least two items of image data are output for display within one frame period in the order of collection.
Specifically, after the at least two image sensors collect the image data, the collected image data is output for viewing, for example displayed on the display screen of the terminal. In addition, the image data that took one frame period to collect also has to be output within one frame period. Collecting the image data and outputting the image data may or may not take place within the same frame period.
In an optional solution, one image sensor outputs its image data immediately after collecting it; after an interval of 1/n frame period, the next image sensor collects image data and likewise outputs it immediately, and so on, where n is the number of the at least two image sensors and n is a natural number greater than 1.
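For illustration only (the patent prescribes no code; T, the sensor count and the event representation below are assumptions), this staggered capture-and-immediate-output scheme can be sketched as:

```python
# Illustrative scheme: n image sensors capture in turn, 1/n frame period
# apart, and each item of image data is output immediately after capture.

T = 1.0          # one frame period, arbitrary units
NUM_SENSORS = 2  # n, a natural number greater than 1

def staggered_capture_and_output(num_frame_periods: int, n: int = NUM_SENSORS):
    """Yield (time, sensor_index, action) events for the staggered scheme."""
    for k in range(num_frame_periods * n):
        t = k * T / n      # the k-th capture overall, 1/n frame period apart
        sensor = k % n     # sensors take turns in a fixed order
        yield t, sensor, "capture"
        yield t, sensor, "output immediately"

for t, sensor, action in staggered_capture_and_output(2):
    print(f"t = {t:4.2f}: sensor {sensor}: {action}")
```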
In another optional solution, the at least two items of image data consist of target image data and other image data, and outputting the at least two items of image data for display within one frame period in the order of collection comprises: performing an affine transformation on the other image data based on the target image data; and outputting the collected target image data and the affine-transformed other image data for display within the frame period in the order of collection.
Specifically, take the case where the number of the at least two image sensors is 2: one of them is a first image sensor and the other is a second image sensor, the image data collected by the first image sensor is the target image data, and the image data collected by the second image sensor is the other image data. The first image sensor outputs the target image data immediately after collecting it. After an interval of 1/2 frame period, the second image sensor collects the other image data. An affine transformation is then applied to the other image data based on the feature points in the other image data and the feature points in the previously collected target image data: feature points that the other image data lacks compared with the target image data are completed, and feature points that the other image data has in excess of the target image data are deleted, so as to obtain the affine image data corresponding to the other image data, which is then output. Still at intervals of 1/2 frame period, the first image sensor and the second image sensor repeat the above flow.
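For illustration only, a comparable feature-point based affine alignment can be sketched with OpenCV as below; the library, the ORB detector, the brute-force matcher and the RANSAC estimation are assumptions of this sketch and stand in for the completion and deletion of feature points described above, which the patent does not tie to any particular algorithm:

```python
# Illustrative feature-point based affine alignment of the "other image data"
# to the "target image data" (assumes OpenCV, ORB features, brute-force
# matching and RANSAC estimation; the patent does not prescribe these).
import cv2
import numpy as np

def align_other_to_target(target_gray: np.ndarray, other_gray: np.ndarray) -> np.ndarray:
    """Return the other image warped so that it lines up with the target image."""
    orb = cv2.ORB_create()
    kp_t, des_t = orb.detectAndCompute(target_gray, None)
    kp_o, des_o = orb.detectAndCompute(other_gray, None)

    # Match feature points between the two items of image data.
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(des_o, des_t)

    src = np.float32([kp_o[m.queryIdx].pt for m in matches])  # points in other
    dst = np.float32([kp_t[m.trainIdx].pt for m in matches])  # points in target

    # Estimate the 2x3 affine matrix that maps the other image onto the target.
    affine, _inliers = cv2.estimateAffine2D(src, dst, method=cv2.RANSAC)

    h, w = target_gray.shape[:2]
    return cv2.warpAffine(other_gray, affine, (w, h))
```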
Referring to Fig. 2C, which is a schematic diagram of a scenario of collecting and outputting image data according to an embodiment of the invention, the description is given for two image sensors. A clock circuit 223 generates a pulse signal to trigger the first image sensor 225 to start collecting image data; a delay circuit 224 delays the pulse signal by 1/2 frame period and passes it to the second image sensor 226, so that the second image sensor 226 collects image data 1/2 frame period later than the first image sensor 225. After the first image sensor 225 and the second image sensor 226 collect image data, the image data is sent to an image signal processing (ISP) chip 227 and an ISP chip 228 respectively for processing such as color interpolation, automatic white balance control, automatic gain control, noise reduction and image enhancement. After the ISP chip 227 and the ISP chip 228 have performed this basic processing, the image data is sent to a frame rate conversion (FRC) chip 229, which finally processes the image data from the first image sensor 225 and the second image sensor 226, for example by the affine transformation described above, and outputs it.
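For illustration only (the patent describes a hardware pipeline; the Python abstraction, function names and stub ISP below are assumptions), the data flow of Fig. 2C can be sketched as:

```python
# Illustrative data flow of Fig. 2C: a clock pulse triggers the first sensor,
# a half-frame delay triggers the second, each frame passes through its ISP
# stage, and the FRC stage merges both streams in capture order.

T = 1.0  # one frame period, arbitrary units

def isp(frame: str) -> str:
    # Stand-in for colour interpolation, automatic white balance, automatic
    # gain control, noise reduction, image enhancement, and so on.
    return f"isp({frame})"

def capture_stream(sensor_name: str, start_offset: float, num_frames: int):
    """One sensor's frames, one per frame period, starting at start_offset."""
    return [(start_offset + i * T, isp(f"{sensor_name}[{i}]")) for i in range(num_frames)]

def frc_merge(*streams):
    """FRC stage: interleave the per-sensor streams in capture order."""
    return sorted(event for stream in streams for event in stream)

stream1 = capture_stream("sensor1", 0.0, 3)    # triggered by the clock pulse
stream2 = capture_stream("sensor2", T / 2, 3)  # triggered after a T/2 delay

for t, frame in frc_merge(stream1, stream2):
    print(f"t = {t:4.2f}: output {frame}")
```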
The image sensor outputs the image data immediately after collecting it, or immediately after the affine transformation, instead of waiting half a frame period before outputting, so the collected image data can be presented on the display screen in time, which improves the real-time performance of the preview picture.
In another optional solution, outputting the at least two items of image data for display within one frame period in the order of collection comprises: calculating at least one item of intermediate data from the at least two items of image data by an interpolation algorithm; and outputting the at least two items of image data and the at least one item of intermediate data within one frame period.
For example, still with two image sensors, referring to Fig. 2D, which is a schematic diagram of a scenario of generating intermediate data by an interpolation algorithm according to an embodiment of the invention: in Fig. 2D, the first image sensor collects at the start of a frame and obtains image data 211; after an interval of 1/4 frame period, image data 211 is output; after another interval of 1/4 frame period, the second image sensor collects and obtains image data 212, intermediate image data 221 is generated from image data 211 and image data 212, and image data 221 is output; after a further interval of 1/4 frame period, image data 212 is output; after yet another interval of 1/4 frame period, the first image sensor collects image data 213, image data 222 is generated from image data 213 and image data 212 and is output, and the cycle then continues according to this rule. In other words, after an image sensor collects image data, the data only needs to be delayed by 1/4 frame period before it can be output, instead of being delayed by 1/2 frame period as in the prior art, which improves the real-time performance of the preview picture. It should be noted that when the number of the at least two cameras is n, image data only needs to be delayed by 1/n frame period after collection before it can be output.
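For illustration only (T, the function name and the event representation are assumptions; the timing follows the Fig. 2D description above), the interpolation scheme can be sketched as:

```python
# Illustrative timing of the Fig. 2D scheme with two sensors: captured frames
# are output 1/4 frame period after capture, and each intermediate frame is
# output as soon as the later of its two source frames has been captured.

T = 1.0  # one frame period, arbitrary units

def fig_2d_display_schedule(num_captures: int):
    """Yield (display_time, description) events for the Fig. 2D scheme."""
    for i in range(num_captures):
        t_cap = i * T / 2                 # the two sensors alternate every T/2
        sensor = 1 if i % 2 == 0 else 2
        # The captured frame is displayed a quarter of a frame period later.
        yield t_cap + T / 4, f"frame {i} from sensor {sensor} (lag {T / 4:.2f})"
        # The intermediate frame interpolated from frames i-1 and i is
        # displayed right after frame i has been captured.
        if i > 0:
            yield t_cap, f"intermediate frame from frames {i - 1} and {i}"

for t, what in sorted(fig_2d_display_schedule(4)):
    print(f"t = {t:4.2f}: display {what}")
```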
In the method described with reference to Fig. 2, the terminal collects image data with multiple image sensors in turn and outputs the collected image data in turn, which makes the preview picture closer to real time.
Further, intermediate data can also be calculated from multiple items of image data, and the image data and the intermediate data can be output alternately within one frame period, making the preview picture smoother. Still further, the optical axes of the multiple image sensors can be set parallel so that the pictures collected by the multiple image sensors are closer to one another, which also makes the preview picture smoother.
The method of the embodiments of the present invention has been described above. To facilitate better implementation of the above solutions of the embodiments of the present invention, the corresponding apparatus of the embodiments of the present invention is provided below.
Referring to Fig. 3, which is a schematic structural diagram of a terminal 30 provided by an embodiment of the invention, the terminal 30 may comprise a collecting unit 301 and a first output unit 302, which are described in detail as follows.
The collecting unit 301 is configured to collect, within one frame period, at least two items of image data with at least two image sensors, wherein the collection time points of the at least two items of image data are different.
The first output unit 302 is configured to output the at least two items of image data for display within one frame period in the order of collection.
In an optional solution, the collecting unit 301 is specifically configured to collect the at least two items of image data with the at least two image sensors at equal time intervals.
In another optional solution, the at least two items of image data consist of target image data and other image data, and the first output unit 302 may have the structure shown in Fig. 3A, in which the first output unit 302 comprises a converter unit 3021 and a second output unit 3022, described in detail as follows.
The converter unit 3021 is configured to perform an affine transformation on the other image data based on the target image data.
The second output unit 3022 is configured to output the collected target image data and the affine-transformed other image data for display within the frame period in the order of collection.
In another optional solution, the first output unit 302 may instead have the structure shown in Fig. 3B, in which the first output unit 302 comprises a computing unit 3023 and a third output unit 3024, described in detail as follows.
The computing unit 3023 is configured to calculate at least one item of intermediate data from the at least two items of image data by an interpolation algorithm.
The third output unit 3024 is configured to output the at least two items of image data and the at least one item of intermediate data within one frame period.
It should be noted that, in the embodiments of the present invention, for the specific implementation of the units, reference may also be made to the corresponding description of the method embodiment shown in Fig. 2.
In the terminal 30 described with reference to Fig. 3, the terminal 30 collects image data with multiple image sensors in turn and outputs the collected image data in turn, which makes the preview picture closer to real time.
Further, intermediate data can also be calculated from multiple items of image data, and the image data and the intermediate data can be output alternately within one frame period, making the preview picture smoother. Still further, the optical axes of the multiple image sensors can be set parallel so that the pictures collected by the multiple image sensors are closer to one another, which also makes the preview picture smoother.
Referring to Fig. 4, which is a schematic structural diagram of another terminal 40 provided by an embodiment of the invention, the terminal 40 may comprise: at least one memory 401, a baseband chip 402, a radio-frequency module 403, a peripheral system 404, sensors 405 and a communication bus 406. The memory 401 is configured to store an operating system, a network communication program, a user interface program, a control program and the like. The baseband chip 402 comprises at least one processor 4021 such as a CPU, a clock module 4022 and a power management module 4023. The peripheral system 404 comprises a camera controller 4042, an audio controller 4043, a touch display screen controller 4044 and a sensor management module 4045, and correspondingly also comprises a camera 4047, an audio circuit 4048 and a touch display screen 4049. Further, the sensors 405 may include a light sensor, a displacement sensor, an acceleration sensor, a fingerprint sensor and the like; generally, the sensors 405 may be added or removed according to actual needs. The memory 401 may be a high-speed RAM memory, or a non-volatile memory, for example at least one magnetic disk memory. Optionally, the memory 401 may also be at least one storage device located remotely from the processor 4021.
In the terminal 40 shown in Fig. 4, the processor 4021 is configured to call the control program stored in the memory 401 to perform the following operations:
collecting, within one frame period, at least two items of image data with at least two image sensors, wherein the collection time points of the at least two items of image data are different; and
outputting the at least two items of image data for display within one frame period in the order of collection.
In an optional solution, the at least two items of image data consist of target image data and other image data, and the manner in which the processor 4021, by calling the control program stored in the memory 401, outputs the at least two items of image data for display within one frame period in the order of collection may specifically be:
performing an affine transformation on the other image data based on the target image data; and
outputting the collected target image data and the affine-transformed other image data for display within one frame period in the order of collection.
In another optional solution, the manner in which the processor 4021, by calling the control program stored in the memory 401, outputs the at least two items of image data for display within one frame period in the order of collection may specifically be:
calculating at least one item of intermediate data from the at least two items of image data by an interpolation algorithm; and
outputting the at least two items of image data and the at least one item of intermediate data within one frame period.
In another optional solution, the manner in which the processor 4021, by calling the control program stored in the memory 401, collects the at least two items of image data with the at least two image sensors may specifically be:
collecting the at least two items of image data with the at least two image sensors at equal time intervals.
In another optional solution, the optical axes of the at least two image sensors are parallel.
In summary, by implementing the embodiments of the present invention, the terminal collects image data with multiple image sensors in turn and outputs the collected image data in turn, which makes the preview picture closer to real time.
Further, intermediate data can also be calculated from multiple items of image data, and the image data and the intermediate data can be output alternately within one frame period, making the preview picture smoother. Still further, the optical axes of the multiple image sensors can be set parallel so that the pictures collected by the multiple image sensors are closer to one another, which also makes the preview picture smoother.
The steps of the methods in the embodiments of the present invention may be reordered, combined or deleted according to actual needs.
The units of the mobile terminal in the embodiments of the present invention may be combined, divided or deleted according to actual needs.
A person of ordinary skill in the art will understand that all or part of the flows of the methods in the above embodiments may be implemented by a computer program instructing related hardware. The program may be stored in a computer-readable storage medium and, when executed, may include the flows of the embodiments of the above methods. The storage medium may be a magnetic disk, an optical disc, a read-only memory (ROM), a random access memory (RAM) or the like.
What is disclosed above is only a preferred embodiment of the present invention, which of course cannot be used to limit the scope of the rights of the invention. A person of ordinary skill in the art will understand that all or part of the flows for implementing the above embodiments, and equivalent variations made according to the claims of the present invention, still fall within the scope covered by the invention.

Claims (10)

1. A data processing method, characterized by comprising:
collecting, within one frame period, at least two items of image data with at least two image sensors, wherein the collection time points of the at least two items of image data are different; and
outputting the at least two items of image data for display within one frame period in the order of collection.
2. The method according to claim 1, characterized in that the at least two items of image data consist of target image data and other image data, and outputting the at least two items of image data for display within one frame period in the order of collection comprises:
performing an affine transformation on the other image data based on the target image data; and
outputting the collected target image data and the affine-transformed other image data for display within the frame period in the order of collection.
3. The method according to claim 1, characterized in that outputting the at least two items of image data for display within one frame period in the order of collection comprises:
calculating at least one item of intermediate data from the at least two items of image data by an interpolation algorithm; and
outputting the at least two items of image data and the at least one item of intermediate data within one frame period.
4. The method according to any one of claims 1 to 3, characterized in that collecting the at least two items of image data with the at least two image sensors comprises:
collecting the at least two items of image data with the at least two image sensors at equal time intervals.
5. The method according to any one of claims 1 to 3, characterized in that the optical axes of the at least two image sensors are parallel.
6. A terminal, characterized by comprising:
a collecting unit, configured to collect, within one frame period, at least two items of image data with at least two image sensors, wherein the collection time points of the at least two items of image data are different; and
a first output unit, configured to output the at least two items of image data for display within one frame period in the order of collection.
7. The terminal according to claim 6, characterized in that the at least two items of image data consist of target image data and other image data, and the first output unit comprises:
a converter unit, configured to perform an affine transformation on the other image data based on the target image data; and
a second output unit, configured to output the collected target image data and the affine-transformed other image data for display within the frame period in the order of collection.
8. The terminal according to claim 6, characterized in that the first output unit comprises:
a computing unit, configured to calculate at least one item of intermediate data from the at least two items of image data by an interpolation algorithm; and
a third output unit, configured to output the at least two items of image data and the at least one item of intermediate data within one frame period.
9. The terminal according to any one of claims 6 to 8, characterized in that the collecting unit is specifically configured to collect the at least two items of image data with the at least two image sensors at equal time intervals.
10. The terminal according to any one of claims 6 to 8, characterized in that the optical axes of the at least two image sensors are parallel.
CN201510746994.6A 2015-11-05 2015-11-05 Data processing method and terminal Pending CN105391924A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510746994.6A CN105391924A (en) 2015-11-05 2015-11-05 Data processing method and terminal


Publications (1)

Publication Number Publication Date
CN105391924A true CN105391924A (en) 2016-03-09

Family

ID=55423716

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510746994.6A Pending CN105391924A (en) 2015-11-05 2015-11-05 Data processing method and terminal

Country Status (1)

Country Link
CN (1) CN105391924A (en)



Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030048363A1 (en) * 2001-09-07 2003-03-13 Tohru Watanabe Imaging apparatus
CN1941918A (en) * 2005-09-30 2007-04-04 卡西欧计算机株式会社 Imaging device and imaging method

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107277351A (en) * 2017-06-30 2017-10-20 维沃移动通信有限公司 The processing method and mobile terminal of a kind of view data
CN107360345A (en) * 2017-06-30 2017-11-17 维沃移动通信有限公司 The processing method and mobile terminal of a kind of view data
CN109672809A (en) * 2018-05-23 2019-04-23 李芝宏 Light shunt multisensor high speed video recording system and method
CN114070992A (en) * 2020-07-29 2022-02-18 北京小米移动软件有限公司 Image display device, image display method, and electronic apparatus
CN114070992B (en) * 2020-07-29 2023-12-26 北京小米移动软件有限公司 Image display device, image display method and electronic equipment
WO2023130706A1 (en) * 2022-01-04 2023-07-13 北京石头创新科技有限公司 Multi-camera frame synchronization control method and self-propelled device

Similar Documents

Publication Publication Date Title
CN105391924A (en) Data processing method and terminal
CN103428460B (en) The video recording method and recording device of output video signal sequence are recorded for image capture module
CN111815754B (en) Three-dimensional information determining method, three-dimensional information determining device and terminal equipment
CN110012209B (en) Panoramic image generation method and device, storage medium and electronic equipment
CN105898143A (en) Moving object snapshotting method and mobile terminal
US20200372254A1 (en) Method for outputting a signal from an event-based sensor, and event-based sensor using such method
US10848746B2 (en) Apparatus including multiple cameras and image processing method
CN110399908B (en) Event-based camera classification method and apparatus, storage medium, and electronic apparatus
CN107808397B (en) Pupil positioning device, pupil positioning method and sight tracking equipment
CN111064865B (en) Background activity noise filter of dynamic vision sensor and processor
CN109561257A (en) Picture focusing method, device, terminal and corresponding storage medium
CN113554726B (en) Image reconstruction method and device based on pulse array, storage medium and terminal
CN106201624A (en) A kind of recommendation method of application program and terminal
CN107341787A (en) Method, apparatus, server and the automobile that monocular panorama is parked
NL1025642C2 (en) Method and device for reconstructing high-resolution images.
CN114245028B (en) Image display method and device, electronic equipment and storage medium
CN108564546A (en) Model training method, device and photo terminal
CN112511859B (en) Video processing method, device and storage medium
CN109121194B (en) Method and apparatus for state transition of electronic device
CN114885144B (en) High frame rate 3D video generation method and device based on data fusion
CN109889892A (en) Video effect adding method, device, equipment and storage medium
CN112560002B (en) Gait behavior-based identity authentication method, device, equipment and storage medium
CN115290299A (en) Method and device for determining falling depth of screen light leakage and electronic equipment
KR20100028007A (en) An image sensor, the operating method and usage thereof
CN115138063A (en) Image processing method, device and program electronic device and storage medium

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20160309
