CN101305596A - Image processing device, image processing method, program thereof, recording medium containing the program, and imaging device - Google Patents

Image processing device, image processing method, program thereof, recording medium containing the program, and imaging device

Info

Publication number
CN101305596A
CN101305596A, CNA2006800421517A, CN200680042151A
Authority
CN
China
Prior art keywords
image
display mode
selected zone
distortion
display
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CNA2006800421517A
Other languages
Chinese (zh)
Other versions
CN101305596B (en)
Inventor
山冈成光
神谷了
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sony Semiconductor Solutions Corp
Original Assignee
Sony Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Corp filed Critical Sony Corp
Priority claimed from PCT/JP2006/322499 external-priority patent/WO2007055336A1/en
Publication of CN101305596A publication Critical patent/CN101305596A/en
Application granted granted Critical
Publication of CN101305596B publication Critical patent/CN101305596B/en
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical

Abstract

An image processing device processes image data that contains distortion introduced by an imaging optical unit, the data being obtained by capturing an optical image of an object through the distortion-producing imaging optical unit. A region selection mode setting unit (13b) selectively sets either a first region selection mode, in which a selection region indicating a partial region of the field of view represented by the image data is selected using an orthogonal coordinate system applied to that field of view, or a second region selection mode, in which the selection region is selected using a polar coordinate system applied to that field of view. A distortion correction unit (13a) corrects the distortion of the image data corresponding to the selection region selected in the first or second region selection mode set by the region selection mode setting unit (13b), thereby obtaining distortion-corrected data. A data output unit (13d) outputs the distortion-corrected data.

Description

Image processing device, image processing method, program thereof, recording medium containing the program, and imaging device
Technical field
The present invention relates to an image processing device, an image processing method, a program thereof, a recording medium on which the program is recorded, and an image pickup device, which process a captured wide field-of-view image.
Background Art
A technology has conventionally been proposed in which a user specifies a desired region in an image captured with, for example, a fisheye lens, the distortion aberration of the fisheye-lens image data in the specified region is corrected, and the corrected image is displayed on a monitor. The captured image may be a moving image or a still image (see, for example, Japanese Patent Application Publication No. 2000-324386, Fig. 1 and paragraph [0009] of the specification).
With the device described in Japanese Patent Application Publication No. 2000-324386, a higher degree of convenience could be expected if the user had a better method for specifying a region in the image captured by the camera. Systems using such cameras are expected to find a wide range of applications in the future, so providing the user with an easy-to-use interface may be an important task.
Summary of the invention
An image processing device according to the present invention processes image data that contains distortion introduced by an image pickup optical unit, the distortion resulting from picking up an optical image of a subject through the distortion-producing image pickup optical unit. The image processing device includes: a data output section that outputs image data of a display image corresponding to a display mode by using a scene image, in which a selected region indicating a partial region of the field of view represented by the image data is made identifiable, and a distortion-corrected image of the selected region; a display mode setting section that sets the display mode; and a control section that, in response to switching of the selected region, changes a first display mode, which does not use the scene image with the identifiable selected region, to a second display mode, which uses the scene image with the identifiable selected region. In the present invention, "image" mainly refers to a moving image, but of course also includes a still image.
The present invention further provides an input section that issues, for example, a switching instruction for the selected region; when the selected region is switched by this switching instruction, the display mode is changed for a predetermined period of time. Furthermore, a first region selection mode and a second region selection mode are provided: in the first region selection mode the selected region is switched in orthogonal coordinates based on the switching instruction, and in the second region selection mode the selected region is switched in polar coordinates based on the switching instruction, so that the display mode is also changed for a predetermined period of time when the region selection mode is switched. For a field of view such as a person normally sees, that is, when capturing an image of scenery having up/down and left/right directions, the orthogonal coordinate mode is very effective. For capturing an image whose field of view looks upward from a horizontal plane or looks downward, the polar coordinate mode is very effective. In other words, by changing the region selection mode, a person can operate the system very intuitively. Accordingly, if the display mode is changed for a predetermined period of time when the selected region or the region selection mode is switched, a display different from the normal display can be provided during that period, making efficient use of the display section. Furthermore, a direction detection sensor that detects the image pickup direction is provided, so that when the selected region is changed based on a determination made from the sensor signal of this direction detection sensor, the display mode can likewise be changed for a predetermined period of time, again providing an image display that makes efficient use of the display section.
Here, the control section changes the display mode from the mode that does not use the scene image with the identifiable selected region to the mode that uses it, for a predetermined period of time. In this case, when the selected region is changed or the region selection mode is switched, the scene image in which the selected region is identifiable is displayed for the predetermined period, so that the setting state of the selected region can be confirmed. After the predetermined period has elapsed, the display switches back to the mode that does not use the scene image with the identifiable selected region, so that even if displaying the scene image with the identifiable selected region causes a blind spot in, for example, the distortion-corrected image of the selected region, the image of that region can be confirmed once the predetermined period has passed. It should be noted that the predetermined period may be, but is not limited to, for example, three seconds or five seconds.
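As a rough illustration of this timed change of display mode (a minimal sketch under assumed names such as `DisplayModeController`; none of these identifiers come from the patent), switching the selected region could temporarily select the mode that shows the identifiable scene image and then revert after the predetermined period:

```python
import time

class DisplayModeController:
    """Sketch: temporarily show the identifiable scene image after a region switch."""

    def __init__(self, normal_mode="selected_only", temporary_mode="with_scene",
                 hold_seconds=3.0):
        self.normal_mode = normal_mode        # mode without the identifiable scene image
        self.temporary_mode = temporary_mode  # mode with the identifiable scene image
        self.hold_seconds = hold_seconds      # e.g. three or five seconds
        self.current_mode = normal_mode
        self._revert_at = None

    def on_region_switched(self):
        # Switching the selected region (or the region selection mode)
        # changes the display mode for a predetermined period of time.
        self.current_mode = self.temporary_mode
        self._revert_at = time.monotonic() + self.hold_seconds

    def update(self):
        # Called periodically (e.g. once per displayed frame).
        if self._revert_at is not None and time.monotonic() >= self._revert_at:
            self.current_mode = self.normal_mode
            self._revert_at = None
        return self.current_mode
```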
An image processing method according to the present invention processes image data that contains distortion introduced by an image pickup optical unit, the distortion resulting from picking up an optical image of a subject through the distortion-producing image pickup optical unit. The method includes: a data output step of outputting image data of a display image corresponding to a display mode by using a scene image, in which a selected region indicating a partial region of the field of view represented by the image data is made identifiable, and a distortion-corrected image of the selected region; a display mode setting step of setting the display mode; and a display mode changing step of changing, in response to switching of the selected region, a first display mode that does not use the scene image with the identifiable selected region to a second display mode that uses the scene image with the identifiable selected region.
As described above, the present invention makes it possible to realize an image processing device, an image processing method, and the like that the user can use easily, through image processing that reproduces an image obtained by selecting a region within an image captured using the image pickup optical unit.
Description of drawings
Fig. 1 is a block diagram showing the structure of an image processing system according to one embodiment of the present invention;
Fig. 2 is a schematic diagram illustrating the relationship between the scene image formed on the image pickup element and the selected region;
Fig. 3 is a block diagram showing the functional structure of the image processing section;
Fig. 4 illustrates examples of the display modes and region selection modes;
Fig. 5 is a block diagram illustrating a specific structure of the image processing section;
Fig. 6 is a schematic diagram showing a whole image;
Fig. 7 is a schematic diagram showing an example of an image displayed on the display section;
Fig. 8 is a flowchart showing the operation of the processing control section;
Fig. 9 is a schematic diagram showing the field of view as a three-dimensional sphere;
Fig. 10 is a schematic diagram illustrating the respective fields of view (image pickup ranges) captured by the image pickup section;
Fig. 11 is a schematic diagram illustrating the image height characteristic of a lens;
Fig. 12 is a schematic diagram illustrating the principle of the distortion correction processing;
Fig. 13 is a schematic diagram illustrating the operation in the orthogonal coordinate mode of the region selection modes;
Fig. 14 is a schematic diagram illustrating the operation in the polar coordinate mode of the region selection modes;
Fig. 15 shows an example of the GUI displayed when the user changes the selected region by an instruction from the input section;
Fig. 16 is a schematic diagram illustrating a case where switching of the selected region is instructed while the orthogonal coordinate mode is selected;
Fig. 17 shows display images displayed on the display section 14 when the display mode is switched in sequence while the orthogonal coordinate mode is selected;
Fig. 18 is a schematic diagram showing a display image displayed on the display section when the polar coordinate mode is selected;
Fig. 19 is a schematic diagram showing a display image after a switching instruction is issued while the polar coordinate mode is selected;
Fig. 20 is a schematic diagram showing a display image displayed on the display section when the polar coordinate mode is selected, and explaining a case where the display in the divided display mode is flipped upside down;
Fig. 21 is a schematic diagram showing transitions of display mode switching when selection of the divided display mode is enabled;
Fig. 22 is a schematic diagram illustrating scale enlargement/reduction processing of the image displayed for the selected region;
Fig. 23 is a schematic diagram showing the state before a rotation instruction is issued;
Fig. 24 is a schematic diagram illustrating processing for rotating the image of the selected region;
Fig. 25 is a schematic diagram illustrating another processing for rotating the image of the selected region;
Fig. 26 is a block diagram showing the structure of an image processing system according to another embodiment of the present invention;
Fig. 27 is a flowchart showing an example of processing performed by the image processing system shown in Fig. 26;
Fig. 28 is a flowchart showing another example of processing performed by the image processing system shown in Fig. 26;
Fig. 29 is a flowchart showing processing for storing position information or trace information;
Fig. 30 is a schematic diagram illustrating the trajectory of a predetermined region on a displayed image, obtained with the flow shown in Fig. 29;
Fig. 31 is a block diagram showing the structure of an image processing system according to another embodiment of the present invention;
Fig. 32 is a block diagram showing the structure of an image processing system according to yet another embodiment of the present invention;
Fig. 33 is a conceptual diagram of a method of switching the region selection mode MS according to the direction in which the image pickup section 11 is placed in the image processing system of Fig. 32;
Fig. 34 is a schematic diagram illustrating a method of setting thresholds for switching the states S0, S1 and S2 of Fig. 33;
Fig. 35 is a flowchart showing an operation routine of the image processing system 40 of Fig. 32 when the states S0, S1 and S2 are switched among one another;
Fig. 36 is a schematic diagram illustrating a coordinate calculation method when the orthogonal coordinate mode and the polar coordinate mode are respectively set in the image processing system of Fig. 32;
Fig. 37 is a schematic diagram illustrating another coordinate calculation method when the orthogonal coordinate mode and the polar coordinate mode are respectively set in the image processing system of Fig. 32;
Fig. 38 is a schematic diagram illustrating a coordinate recording method when the orthogonal coordinate mode and the polar coordinate mode are respectively set in the image processing system of Fig. 32;
Fig. 39 is a schematic diagram illustrating another coordinate recording method when the orthogonal coordinate mode and the polar coordinate mode are respectively set in the image processing system of Fig. 32;
Fig. 40 is a conceptual diagram of a method of switching the display mode according to contact;
Fig. 41 is a schematic diagram showing a case where the image pickup optical unit obtains a viewing angle of 270 degrees;
Fig. 42 is a schematic diagram showing a state in which the image pickup direction is set about 45 degrees upward with respect to the horizontal direction;
Fig. 43 is a schematic diagram showing an example of placement of the image pickup section;
Fig. 44 is a schematic diagram illustrating automatic switching of the region selection mode;
Fig. 45 is a schematic diagram of the GUI display and the image-region movement directions when the region selection mode is switched automatically;
Fig. 46 is a schematic diagram illustrating switching operations among region selection modes including an integrated mode;
Fig. 47 is a flowchart of switching operations among region selection modes including an integrated mode;
Fig. 48 is a flowchart showing the operation when a direction button is operated;
Fig. 49 is a schematic diagram showing how the display mode MH is changed in response to switching of the selected region in another embodiment of the present invention;
Fig. 50 is a schematic diagram showing how the display mode MH is changed in response to switching of the region selection mode MS in another embodiment of the present invention;
Fig. 51 is a flowchart showing the operation of the image processing system when display mode switching processing is performed;
Fig. 52 is a flowchart showing another flow of the operation of the image processing system when display mode switching processing is performed; and
Fig. 53 is a schematic diagram showing another state of the display mode switching processing.
Embodiment
Embodiments of the present invention will be described below in detail with reference to the accompanying drawings.
Fig. 1 is a block diagram showing the structure of an image processing system according to one embodiment of the present invention.
The image processing system 10 includes an image pickup section 11, an input section 12, an image processing section 13, and a display section 14. The image pickup section 11 is composed of an image pickup optical unit 111 and an image pickup element 112.
The image pickup optical unit 111 focuses a scene image onto the imaging area of the image pickup element 112. In this case, a wide-angle lens, for example, is used as the image pickup optical unit 111 to focus an image of a wide-field subject onto the imaging area of the image pickup element 112. The wide-angle lens has a viewing angle of at least about 45 degrees, but is not limited to this. Alternatively, the image pickup optical unit 111 may be composed of a fisheye lens, a panoramic annular lens (PAL), which is a kind of annular lens, or the like. Further, instead of a wide-angle lens, a tube-shaped, bowl-shaped, or conical mirror may be used so that a wide-field scene image is focused onto the imaging area of the image pickup element 112 by reflection from the mirror. Furthermore, a plurality of prisms and mirrors may be combined to further expand the field of view. For example, by using two fisheye lenses each having a viewing angle of about 180 degrees, a scene image with a panoramic viewing angle (a full sphere, 360 degrees) can be obtained.
As the image pickup element 112, for example, a charge-coupled device (CCD) or a complementary metal oxide semiconductor (CMOS) sensor is used to convert light into an electrical signal. The image pickup element 112 generates image data DVa based on the scene image and supplies it to the image processing section 13.
Fig. 2 shows the relationship between the scene image formed on the image pickup element 112 and a selected region when a fisheye lens is used as the image pickup optical unit 111. If the image pickup optical unit 111 has a viewing angle of, for example, about 180 degrees, so that its field of view can be represented by the hemispherical sphere 51 in Fig. 2, the scene image Gc formed on the image pickup element 112 (hereinafter referred to as the "wide field-of-view image") becomes an image containing distortion caused by the image pickup optical unit 111, for example a circular image. Consequently, when an image is displayed based on the image data DVa obtained by the image pickup element 112, the displayed image contains the distortion caused by the image pickup optical unit 111. In this case, if a selected region indicating a partial region of the field of view represented by the image data DVa is set, the selected region corresponds to, for example, the selected region 71 within the viewing angle of the image pickup optical unit 111, and to the image region ARs in the wide field-of-view image Gc. The image processing section 13 can therefore perform distortion correction processing on the image of the image region ARs to correct the distortion caused by the image pickup optical unit 111 and display a distortion-free image of the selected region. Thus, by setting the selected region so as to include a desired subject within the viewing angle of the image pickup optical unit 111, an undistorted image of the desired subject can be displayed. Moreover, by switching the position of the selected region to a new position, or changing its size or shape, the position, size, or shape of the image region ARs corresponding to the selected region also changes. Accordingly, an image of an arbitrary position or region within the viewing angle of the image pickup optical unit 111 can be displayed with the distortion caused by the image pickup optical unit 111 corrected.
Here, the selected region can be set by specifying, for example, an angular range indicating the position and extent of the selected region 71 within the viewing angle of the image pickup optical unit 111. Further, since the selected region corresponds to the image region ARs on the wide field-of-view image Gc as described above, the selected region can also be set by specifying, for example, the position and extent of the image region ARs on the wide field-of-view image Gc.
It should be noted that, when a fisheye lens is used, the smaller the selected region and the closer it is to the center of the field of view, the more similar in shape the range of the image region ARs is to the selected region. Conversely, the larger the selected region and the closer it is to the edge of the field of view, the more the shape of the image region ARs is distorted with respect to the selected region.
The input section 12 is operated by the user to switch the position of the selected region, change the size and shape of the selected region, and set the operation mode for switching the selected region and the image display mode. The input section 12 may be any device as long as it can be operated by the user: for example, a mouse, a keyboard, a switch device, a touch sensor, a game machine controller, or a stick-shaped control device that can be held by the user. The input section 12 generates input information PS corresponding to the user's operation and supplies it to the image processing section 13.
The image processing section 13 performs distortion correction processing using the image data DVa supplied from the image pickup section 11 to generate an image of the selected region in which the distortion caused by the image pickup optical unit 111 is corrected. Further, the image processing section 13 sets a display mode for the image to be displayed on the display section 14, generates image data DVd of the display image corresponding to the display mode thus set, and supplies it to the display section 14. The image processing section 13 uses the wide field-of-view image, the distortion-corrected image of the selected region, and so on as the display image. Furthermore, the image processing section 13 sets a region selection mode, which is the operation mode activated when the position of the selected region is switched, and switches the selected region in the region selection mode thus set based on the input information PS from the input section 12. Moreover, the image processing section 13 performs processing such as initially setting the display mode or region selection mode specified in advance when operation first begins, or setting the display mode or region selection mode that was in effect when operation ended and starting operation from that state.
The display section 14 is composed of a liquid crystal display element, an organic EL element, or the like, and displays an image based on the image data DVd supplied from the image processing section 13.
In the image processing system 10 according to the present embodiment, the image pickup section 11, the input section 12, the image processing section 13, and the display section 14 may be integrated with one another or separated from one another. Alternatively, only some of them may be integrated. For example, if the input section 12 and the display section 14 are integrated, the input section 12 can easily be operated while confirming the display on the display section 14. Also, in the image pickup section 11, the image pickup optical unit 111 and the image pickup element 112 may be integrated with or separated from each other.
Fig. 3 is a block diagram showing the functional structure of the image processing section 13. The image processing section 13 has a distortion correction section 13a, a region selection mode setting section 13b, a display mode setting section 13c, a data output section 13d, and a control section 13e.
The distortion correction section 13a performs distortion correction using the image data corresponding to the selected region within the image data DVa to correct the distortion caused by the image pickup optical unit 111, thereby generating distortion-corrected data.
The region selection mode setting section 13b sets the region selection mode, which is the operation mode used when the selected region is set or switched. As shown in Fig. 4, for example, the region selection mode setting section 13b provides, as the region selection modes MS, an orthogonal coordinate mode MS1 serving as a first region selection mode and a polar coordinate mode MS2 serving as a second region selection mode, and selectively sets either of these region selection modes. These region selection modes will be described later.
The display mode setting section 13c sets the display mode used when the distortion-corrected image or the like is displayed on the display section 14. As shown in Fig. 4, for example, the display mode setting section 13c provides, as the display modes MH, a whole image display mode MH1, a selected image display mode MH2, a both display mode MH3, and a divided display mode MH4, and sets any one of these display modes. These display modes will be described later.
The data output section 13d outputs the image data of the display image corresponding to the display mode thus set. If the image of the selected region in which the distortion caused by the image pickup optical unit 111 has been corrected is to be displayed as the display image, it outputs the distortion-corrected data. If the wide field-of-view image is to be displayed as the display image, it outputs the image data supplied from the image pickup section 11. If both the wide field-of-view image and the distortion-corrected image of the selected region are to be displayed, it generates new image data by using the distortion-corrected data and the image data supplied from the image pickup section 11, and outputs this data.
The control section 13e sets or switches the selected region according to the region selection mode MS set by the region selection mode setting section 13b.
Fig. 5 is a block diagram illustrating a specific structure of the image processing section 13. The image data DVa is supplied to an image extraction processing section 131 and a distortion correction processing section 133.
The image extraction processing section 131 extracts the image data DVc of the wide field-of-view image (scene image) Gc from the image data DVa and supplies it to a selected-region highlighting processing section 132. Here, as shown in Fig. 2, the wide field-of-view image Gc occupies a partial region of the sensor surface of the image pickup element 112, determined by the image pickup optical unit 111. Therefore, if the region of the wide field-of-view image Gc is fixed on the sensor surface, predetermined image data is extracted from the image data DVa to obtain the pixel data of the region of the wide field-of-view image Gc. Further, if the image pickup optical unit 111 is replaceable so that the region of the wide field-of-view image Gc on the sensor surface changes, or if the region of the wide field-of-view image Gc changes on the sensor surface because the optical characteristics of the image pickup optical unit 111 can be changed, then the region of the wide field-of-view image Gc on the sensor surface is identified in advance and the image data of the identified region is extracted. The region of the wide field-of-view image Gc can be identified easily by, for example, filling the entire field of view of the image pickup optical unit 111 with a white subject, capturing it, and checking which pixel positions in the image data DVa are at or above the white level.
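The white-subject calibration described above could be sketched as follows (a NumPy-based helper with hypothetical names and threshold; the patent does not specify an implementation):

```python
import numpy as np

def locate_wide_view_region(calibration_frame: np.ndarray, white_level: int = 200):
    """Sketch: find the circular wide field-of-view image region on the sensor.

    calibration_frame: 2-D grayscale frame captured with the whole field of view
    filled by a white subject. Pixels at or above `white_level` are assumed to
    belong to the wide field-of-view image Gc.
    Returns (center_y, center_x, radius) of the detected region.
    """
    mask = calibration_frame >= white_level
    if not mask.any():
        raise ValueError("no pixels reach the white level")
    ys, xs = np.nonzero(mask)
    cy, cx = ys.mean(), xs.mean()            # centroid of the lit pixels
    # Radius estimated from the area of the lit region (assumes a circular image).
    radius = np.sqrt(mask.sum() / np.pi)
    return cy, cx, radius
```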
The selected-region highlighting processing section 132 performs processing so that the user can easily distinguish, within the wide field-of-view image Gc, the selected region indicated by selected-region setting information JA supplied from a processing control section 135 (described later) and the corresponding image region ARs. For example, the selected-region highlighting processing section 132 performs display control so that a boundary line is displayed to indicate the border between the image region ARs and the area outside the image region ARs, or so that the image region ARs can be identified by changing the brightness or color of the area outside the image region ARs. The image of the image region ARs highlighted so as to be identifiable is hereinafter referred to as the highlighted image Gs. By performing this image processing, image data DVcp of an image (hereinafter referred to as the "whole image Gcp") in which the image region ARs is identifiable as the highlighted image Gs within the wide field-of-view image Gc is generated and supplied to an image output processing section 134.
The distortion correction processing section 133, which corresponds to the distortion correction section 13a shown in Fig. 3, uses the image data DVa to generate corrected image data DVsc in which the distortion caused by the image pickup optical unit 111 has been corrected for the selected region indicated by the selected-region setting information JA supplied from the processing control section 135, and supplies it to the image output processing section 134.
The image output processing section 134, which corresponds to the data output section 13d shown in Fig. 3, generates the image data DVd of the display image from the image data DVcp and/or the corrected image data DVsc, based on display control information JH from the processing control section 135.
The processing control section 135 corresponds to the region selection mode setting section 13b, the display mode setting section 13c, and the control section 13e. The processing control section 135 sets the region selection mode, sets or switches the selected region in accordance with the region selection mode thus set based on the input information PS from the input section 12, generates selected-region setting information JA indicating the currently set or newly set selected region, and supplies it to the selected-region highlighting processing section 132 and the distortion correction processing section 133. Further, the processing control section 135 sets the display mode, generates the display control information JH according to the display mode thus set, and supplies it to the image output processing section 134. The processing control section 135 also performs processing to include menu displays and the like in the display image based on the display control information JH and so on.
The image processing section 13 is composed of, for example, hardware such as a central processing unit (CPU), random access memory (RAM), and read-only memory (ROM), software such as programs stored in the ROM, and firmware. Alternatively, the image processing section 13 may be composed of a field programmable gate array (FPGA), a digital signal processor (DSP), or the like, and may be provided with a video encoder, a vocoder, an interface for obtaining the above-described input information PS, an interface for outputting the image data DVd to the above-described display section 14, and so on. Further, an FPGA and a DSP may be used together, each performing its own tasks.
The input information PS supplied from the input section 12 to the image processing section 13 includes information indicating the above-described settings of the display mode MH and the region selection mode MS, instructions for switching the selected region, and so on. The information indicating an instruction to switch the selected region may include information for moving the selected region by a predetermined unit in a predetermined direction, information indicating the changed position of the selected region, and the like. The input information PS may further include information for changing the size of the selected region, information for rotating the selected region, information for setting the shape of the selected region, and so on.
The selected region is not limited to an embodiment in which it is switched in accordance with user operation. For example, an embodiment is conceivable in which the selected region is set to a previously specified position as described above. In this case, information about the selected region may be stored in advance in the ROM, an external storage device (not shown), or the like. Further, in the case where a specific region in the wide field-of-view image Gc is recognized automatically, an embodiment is conceivable in which the automatically recognized specific region is handled as the image region ARs corresponding to the selected region. For example, where a moving subject is recognized automatically, the region containing the moving subject may automatically be handled as the image region ARs. Alternatively, an embodiment is possible in which an image region detected in the wide field-of-view image Gc by various sensors (not shown) is handled as the image region ARs corresponding to the selected region. In this case, the sensor may be, for example, a temperature sensor, a sound sensor, a pressure sensor, an optical sensor, a humidity sensor, a vibration sensor, a gas sensor, or any other of various sensors. The sensor signal generated by such a sensor is supplied to the processing control section 135, which then uses the sensor signal to switch the selected region or control the corresponding sections. For example, if an abnormal temperature or sound is detected, the selected region can be automatically switched to the region corresponding to the direction in which the abnormal temperature or sound was detected, so that the captured image of that direction can be displayed on the screen of the display section 14 with its distortion corrected. Further, by switching the display mode or the size or shape of the region in response to the detection of an abnormality, the image can be displayed in a manner that allows the user to confirm the abnormality easily.
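One hedged sketch of such sensor-driven switching (the sensor layout, thresholds, and names are assumptions, not part of the patent text) is to map each sensor to a pre-registered selected region covering its direction:

```python
def select_region_for_alarm(sensor_readings, region_for_sensor, thresholds,
                            current_region):
    """Sketch: switch the selected region toward the direction of an abnormal reading.

    sensor_readings:   dict sensor_id -> measured value (temperature, sound level, ...)
    region_for_sensor: dict sensor_id -> selected-region parameters covering that
                       sensor's direction (registered in advance)
    thresholds:        dict sensor_id -> value above which the reading is abnormal
    """
    for sensor_id, value in sensor_readings.items():
        if value >= thresholds.get(sensor_id, float("inf")):
            # Abnormality detected: switch to the region facing that sensor.
            return region_for_sensor[sensor_id]
    return current_region  # no abnormality: keep the current selected region
```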
It should be noted that the structure shown in Fig. 5 is merely an example, and the image processing section 13 is not limited to the structure shown in Fig. 5 as long as it has the functions shown in Fig. 3. For example, if the image data DVa supplied from the image pickup section 11 represents only the wide field-of-view image Gc, the image extraction processing section 131 need not be provided. Further, the selected-region highlighting processing section 132 may be placed at the output of the image output processing section 134 instead of at its input. In this case, if the wide field-of-view image is included in the image based on the image data DVa, the image region corresponding to the selected region in the wide field-of-view image is processed so that it can easily be identified by the user.
Fig. 6 shows the whole image. If the image processing section 13 does not perform distortion correction on the image data DVa, the wide field-of-view image Gc is an image containing the distortion generated by the image pickup optical unit 111. Here, if a selected region is set by the processing control section 135, the selected-region highlighting processing section 132 performs image processing so that the selected region can easily be identified, as described above. That is, a border display (for example, a frame display) can be provided to indicate the boundary line between the image region ARs corresponding to the selected region in the displayed wide field-of-view image Gc and the area outside the image region ARs, or the brightness or color of the area outside the image region ARs can be changed, to provide the highlighted image Gs in which the image region ARs corresponding to the selected region is highlighted. It should be noted that if the selected region is set to cover the entire field of view, the image region ARs corresponding to the selected region is the entire wide field-of-view image Gc.
Further, if the whole image display mode MH1 is set, the processing control section 135 controls the operation of the image output processing section 134 based on the display control information JH so that it generates image data DVd for displaying only the whole image Gcp, in which the image region ARs is indicated as the highlighted image Gs within the wide field-of-view image Gc as shown in Fig. 6. If the selected image display mode MH2 is set, the processing control section 135 controls the image output processing section 134 to generate image data DVd for displaying only a display image (hereinafter referred to as the "selected-region display image") Gsc, in which the distortion generated by the image pickup optical unit 111 has been corrected in the image of the image region ARs. Further, if the both display mode MH3 is set, the processing control section 135 controls the image output processing section 134 to generate image data DVd of a display image in which the whole image Gcp and the selected-region display image Gsc are displayed simultaneously, as shown in Fig. 7. If a plurality of selected regions are set, it controls the image output processing section 134 to generate image data DVd of a display image in which the whole image Gcp is displayed together with images, serving as selected-region display images Gsc, obtained by correcting the distortion of the highlighted images corresponding to those selected regions. Here, when the whole image Gcp and the selected-region display image Gsc are displayed simultaneously as shown in Fig. 7, the image output processing section 134 can use, for example, an on-screen display (OSD) technique.
Next, the operation of the processing control section 135 will be described with reference to the flowchart of Fig. 8. The processing control section 135 performs initial setting of the display mode and the region selection mode and setting of the selected region (ST1001). For example, when operation starts for the first time, the preset display mode and region selection mode are set, and the selected region is set to a predetermined size and a predetermined direction in the field of view. Further, when operation ends, information indicating the setting states of the display mode, the region selection mode, and the selected region may be stored; this information can then be used the next time operation begins, so that operation starts in the state in which the previous operation ended.
The processing control section 135 determines whether the input information PS supplied from the input section 12 is information that changes a setting or the like (ST1002). If the input information PS is information that changes a setting or the like (YES in ST1002), the processing control section 135 changes the settings of the display mode, the region selection mode, and the selected region according to the obtained input information PS, and controls the operation of the distortion correction processing section 133 so that distortion correction processing is performed on the image of the changed selected region. It also controls the selected-region highlighting processing section 132 so that the image of the changed selected region can be identified. Further, it controls the operation of the image output processing section 134 so that image data DVd corresponding to the changed mode can be generated (ST1003). Furthermore, if the obtained input information PS changes the size or shape of the selected region, it controls the distortion correction processing section 133 and the selected-region highlighting processing section 132 so that distortion correction processing and highlighting processing are performed in accordance with the changed selected region. For example, if the selected region is set to a circle, an ellipse, a triangle, a pentagon, an octagon, or a more complex geometric shape composed of straight lines and curves, it controls the distortion correction processing section 133 so that distortion correction is performed on the image included in the selected region of any of these shapes. It further controls the operation of the selected-region highlighting processing section 132 to provide a highlighted image corresponding to the selected region of any of these shapes. It should be noted that if the input information PS does not change a setting or selection (NO in ST1002), operation returns to ST1002 to determine whether input information PS newly supplied from the input section 12 changes a setting or selection.
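The ST1001–ST1003 flow of Fig. 8 could be sketched roughly as follows (hypothetical callback names; the patent only defines the flow, not an API):

```python
def processing_control_loop(read_input, apply_change, initial_state):
    """Sketch of the ST1001-ST1003 flow of Fig. 8.

    read_input():            returns the next input information PS, or None if it
                             does not change any setting or selection
    apply_change(state, ps): returns the updated state (modes, selected region) and
                             triggers distortion correction, highlighting, and
                             display-image generation for the changed region
    initial_state:           display mode, region selection mode, and selected
                             region set at ST1001
    """
    state = initial_state                 # ST1001: initial settings
    while True:
        ps = read_input()                 # input information PS from the input section
        if ps is None:                    # ST1002: "No" - nothing changed
            continue
        state = apply_change(state, ps)   # ST1003: update settings and regenerate DVd
```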
Hereinafter, the processing in which the processing control section 135 obtains the input information PS and obtains the image of the selected region with its distortion corrected is referred to as development processing.
Next, placement patterns of the image pickup optical unit 111 will be described with reference to Figs. 9 and 10. If the field of view is represented as the three-dimensional space shown in Fig. 9, the selected region can be identified on the sphere 52. Note that the angle θ represents the incident angle with respect to the arrow OA, which is the optical axis.
In this case, if the image pickup optical unit 111 is composed of a fisheye lens with a viewing angle of, for example, about 180 degrees, its field of view corresponds to a hemisphere of the sphere 52. Therefore, if the image pickup optical unit 111 is placed facing upward, as shown in Fig. 10(A), the upper half of the sphere provides the field of view of the image pickup optical unit 111; this field of view is referred to as the upper hemisphere field of view. If the image pickup optical unit 111 is placed facing downward, as shown in Fig. 10(B), the lower half of the sphere provides the field of view; this is referred to as the lower hemisphere field of view. Further, if the image pickup optical unit 111 is placed horizontally to capture an image toward the front, as shown in Fig. 10(C), the front half of the sphere provides the field of view; this is referred to as the front hemisphere field of view. Similarly, if the image is captured toward the right or the left instead of the front, the right or left hemisphere field of view is obtained, respectively.
The case where the image pickup optical unit 111 is placed facing upward, that is, where the optical axis of the image pickup optical unit 111 is roughly aligned with the vertical so that the image pickup direction points upward, assumes, for example, that the user looks up from the ground, a floor, or a desk. The case where the image pickup optical unit 111 is placed facing downward, that is, where the optical axis is roughly aligned with the vertical so that the image pickup direction points downward, assumes, for example, that the user looks down from a ceiling or from the sky. The case where the image pickup optical unit 111 is placed horizontally assumes, for example, that the user looks horizontally from a wall perpendicular to the ground or the like.
In addition, a field of view tilted upward or downward is also conceivable. In this way, a hemispherical field of view is obtained along the direction in which the image pickup optical unit 111 is placed (or, if the image pickup optical unit 111 and the image pickup element 112 are integrated, the direction in which the image pickup section 11 is placed). It should be noted that the direction of the field of view changes according to the placement direction not only when a fisheye lens is used, but also when a wide-angle lens or a mirror is used. Further, if the field of view is large, a part of its range may be selected and used.
Next, the distortion correction processing performed by the distortion correction processing section 133 will be described. As the distortion correction processing method, a common algorithm using geometric correction techniques can be used, for example one that converts a two-dimensional coordinate system containing distortion into a two-dimensional orthogonal coordinate system that does not contain distortion. In this case, conversion formulas or tables may be stored in the ROM or any other memory (not shown). However, any other known distortion correction technique may be used instead.
Fig. 11 illustrates the image height characteristic of a lens. Fig. 11(A) shows the upper hemisphere field of view viewed two-dimensionally in the y-axis direction near the point O. In the figure, for example, the arrow OPk indicates a subject direction. Assuming that the focal point of the subject located in the direction indicated by the arrow OPk is Q, the distance from the point O to the focal point Q gives the image height Lh. Fig. 11(B) is a graph showing this image height characteristic: its horizontal axis represents the angle (incident angle) θ and its vertical axis represents the image height Lh. These data can be stored in memory in advance as a conversion table.
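A minimal sketch of such a θ-to-Lh conversion table with linear interpolation is shown below; the table values are placeholders and do not represent the characteristic of any real lens:

```python
import bisect

# Placeholder conversion table: incident angle theta (radians) -> image height Lh (pixels).
THETA_TABLE = [0.0, 0.35, 0.70, 1.05, 1.40, 1.57]
LH_TABLE    = [0.0, 110.0, 215.0, 310.0, 390.0, 430.0]

def image_height(theta: float) -> float:
    """Look up the image height Lh for incident angle theta by linear interpolation."""
    if theta <= THETA_TABLE[0]:
        return LH_TABLE[0]
    if theta >= THETA_TABLE[-1]:
        return LH_TABLE[-1]
    i = bisect.bisect_right(THETA_TABLE, theta)
    t0, t1 = THETA_TABLE[i - 1], THETA_TABLE[i]
    h0, h1 = LH_TABLE[i - 1], LH_TABLE[i]
    return h0 + (h1 - h0) * (theta - t0) / (t1 - t0)
```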
Fig. 12 illustrates the principle of the distortion correction processing. Fig. 12(A) shows a display surface 81 representing the range of the image to be displayed on the display section 14. Fig. 12(B) shows a state in which the display surface 81 is set with respect to the sphere 51 of the upper hemisphere field of view. Here, when the image of the selected region is displayed on the display section 14, the display surface 81 representing the range of the display image corresponds to the selected region. Fig. 12(C) shows a state in which the sphere 51 shown in Fig. 12(B) is projected onto the x-y plane, so that the region onto which the sphere 51 is projected corresponds to the region of the whole image Gcp.
Consider, for example, a point P on the display surface 81 set with respect to the sphere 51 of the upper hemisphere field of view. Let the position of the point P be P(u, v, w). Since OP = (u² + v² + w²)^{1/2}, the angle θ can be obtained by calculating θ = arccos[w / (u² + v² + w²)^{1/2}]. Note that the center of the display surface is assumed to be HO. Further, by obtaining the image height characteristic of the image pickup optical unit 111 in advance and storing a conversion table of the angle θ and the image height Lh, the image height Lh for the point P can be obtained from the angle θ.

Further, let P'(u, v, 0) be the intersection of the x-y plane and the perpendicular drawn from the point P to the x-y plane; then OP' = OP × sin θ. The focal point Q(xp, yp) is therefore located at the position where xp = u × Lh / ((u² + v² + w²)^{1/2} × sin θ) and yp = v × Lh / ((u² + v² + w²)^{1/2} × sin θ), so the focal point Q(xp, yp) can be obtained.

Further, the angle φ between the x axis and the direction of the point P' on the x-y plane corresponding to the point P can be obtained, and the position of the focal point Q is then obtained from the angle θ and the image height Lh. Here, the angle φ can be calculated as φ = arctan(v / u). Therefore, the focal point Q is given by the position at the image height Lh from the point O in the direction making the angle φ with the x axis.
By obtaining the pixel signal of the thus-obtained focal point Q from the image pickup element 112, the point P on the display surface 81 is drawn based on that pixel data, as shown in Fig. 12(D). Further, by performing the same processing for each point (pixel) on the display surface 81, an image in which the distortion caused by the image pickup optical unit 111 has been corrected can be displayed on the display surface 81. It should be noted that if there is no pixel signal corresponding exactly to the focal point Q, a pixel signal corresponding to the focal point Q can be generated by using the pixel signals of pixels near the focal point Q, for example by interpolating between the pixel signals of pixels near the focal point Q.
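Putting the above steps together, one point of the display surface could be mapped back to a source pixel roughly as follows. This is a sketch only: `image_height` is the table lookup shown earlier, the sensor-center offset (cx, cy) is an assumption, and nearest-neighbour sampling is used instead of the interpolation mentioned in the text.

```python
import math

def display_point_to_source_pixel(u: float, v: float, w: float,
                                  cx: float, cy: float):
    """Map a display-surface point P(u, v, w) to the focal point Q on the sensor.

    (cx, cy) is the center of the wide field-of-view image Gc on the sensor,
    i.e. the projection of the point O. Returns the sensor coordinates of Q.
    """
    op = math.sqrt(u * u + v * v + w * w)
    theta = math.acos(w / op)            # incident angle of the point P
    lh = image_height(theta)             # image height Lh from the conversion table
    phi = math.atan2(v, u)               # angle of OP' about the x axis
    xp = cx + lh * math.cos(phi)         # focal point Q(xp, yp)
    yp = cy + lh * math.sin(phi)
    return xp, yp

def correct_region(image, display_points, cx, cy):
    """Draw each display-surface point from the pixel nearest to its focal point Q."""
    out = []
    for (u, v, w) in display_points:
        xp, yp = display_point_to_source_pixel(u, v, w, cx, cy)
        out.append(image[int(round(yp))][int(round(xp))])  # nearest-neighbour sampling
    return out
```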
By thus performing drawing using the pixel data of the focal point Q corresponding to each point on the display surface 81, a distortion-corrected image can be obtained. Accordingly, the processing control section 135 generates information that identifies the focal points corresponding to the selected region, that is, the image region ARs corresponding to the selected region, as the selected-region setting information JA. For example, the processing control section 135 generates information indicating the selected region using the angle θ and the angle φ shown in Fig. 12(B) as the selected-region setting information JA. In this case, the image region ARs corresponding to the selected region can be identified from the image heights Lh corresponding to the angle θ and the angle φ, so that the selected-region highlighting processing section 132 can generate the image data DVcp of the whole image Gcp in which the image region ARs corresponding to the selected region is rendered as the highlighted image Gs. Further, by obtaining the pixel data corresponding to each pixel based on the angle θ and the angle φ of the pixels of the display surface corresponding to the selected region, the distortion correction processing section 133 can generate the image data DVsc of the selected-region display image Gsc on which the distortion correction processing has been performed. Furthermore, even if coordinate values indicating the range of the selected region are used as the selected-region setting information JA, the image data DVcp of the whole image Gcp and the distortion-corrected image data DVsc of the selected-region display image Gsc can be generated by performing the above-described calculation processing and the like. In addition, by using coordinate values, the selected region can easily be indicated even if it has a complicated shape.
Next, the region selection modes will be described. As the region selection modes MS, as shown in Fig. 4, an orthogonal coordinate mode MS1 serving as the first region selection mode and a polar coordinate mode MS2 serving as the second region selection mode are prepared.
The orthogonal coordinate mode MS1 is a mode that makes it easy to obtain a distortion-corrected image of a desired subject when, for example, the scene is viewed horizontally from a wall perpendicular to the ground, giving the front hemisphere field of view as shown in Fig. 10(C). Specifically, when input information PS indicating an instruction to switch the selected region is supplied from the input section 12, the processing control section 135 performs a calculation to move the selected region 71 in the axis directions of the orthogonal coordinate system based on the switching instruction, thereby generating selected-region setting information JA indicating the newly set selected region.
Fig. 13 illustrates the operation in the orthogonal coordinate mode MS1 of the region selection modes. In the orthogonal coordinate mode MS1, the selected region 71 is switched according to the switching instruction using an orthogonal coordinate system. The switching instruction indicates, for example, the x and y coordinate values of the selected region after switching, or the amounts of displacement of the selected region in the x and y directions, so that the selected region 71 is switched in orthogonal coordinates. Here, if only one of the x coordinate and the y coordinate is changed, the selected region 71 moves to the new position along an axis direction of the orthogonal coordinate system. If both the x coordinate and the y coordinate are changed, the selected region 71 moves to the new position obliquely with respect to the axis directions of the orthogonal coordinate system.
If the selected region 71 is set sequentially by moving it in the x-axis direction based on the switching instruction, the trajectory of an arbitrary point in the selected region 71 (for example, the center PO) follows the line 51x. On the other hand, if the selected region 71 is set sequentially by moving it in the y-axis direction based on the switching instruction, the trajectory of the center PO of the selected region 71 follows the line 51y. It should be noted that when the selected region 71 moves, the image region ARs also moves.
In this way, in the orthogonal coordinate mode MS1, the selected region is switched by changing coordinate values of the orthogonal coordinate system. Therefore, by providing the orthogonal coordinate mode MS1 when the front hemisphere field of view is used, the selected region can easily be moved horizontally or vertically according to the switching instruction, so that the image displayed on the display section 14 can easily be switched to an image located in the desired direction. For example, a desired one of several subjects arranged horizontally can easily be selected and displayed.
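A hedged sketch of how a switching instruction might update the selected region in the orthogonal coordinate mode (field names and range limits are assumptions) is shown below:

```python
from dataclasses import dataclass

@dataclass
class OrthogonalRegion:
    """Selected region 71 expressed in orthogonal coordinates (sketch)."""
    x: float        # horizontal position of the region center PO
    y: float        # vertical position of the region center PO
    width: float
    height: float

def switch_region_orthogonal(region: OrthogonalRegion, dx: float, dy: float,
                             x_range=(-1.0, 1.0), y_range=(-1.0, 1.0)) -> OrthogonalRegion:
    """Move the selected region by (dx, dy) along the coordinate axes.

    Changing only dx or only dy moves the center PO along the line 51x or 51y;
    changing both moves it obliquely. The range limits are placeholders.
    """
    new_x = min(max(region.x + dx, x_range[0]), x_range[1])
    new_y = min(max(region.y + dy, y_range[0]), y_range[1])
    return OrthogonalRegion(new_x, new_y, region.width, region.height)
```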
Next, the polar coordinate mode MS2 will be described. The polar coordinate mode MS2 is a mode that makes it easy to obtain a distortion-corrected image of a desired subject when, for example, the scene is viewed upward from the ground, a floor, or a desk as in the upper hemisphere field of view shown in Fig. 10(A), or viewed downward from a ceiling or the sky as in the lower hemisphere field of view shown in Fig. 10(B). Specifically, when input information PS indicating an instruction to switch the selected region is supplied from the input section 12, the processing control section 135 performs a calculation to move the selected region 71 in the direction in which an argument of the polar coordinate system changes, based on the switching instruction, thereby generating selected-region setting information JA indicating the newly set selected region.
Figure 14 is a schematic diagram illustrating an example of operation in the polar coordinate mode MS2 of the region selection mode. In the polar coordinate mode MS2, the selected region 71 is switched using a polar coordinate system in accordance with the switching command. If the field of view is expressed as a three-dimensional space as shown in Figure 14, for example, the switching command indicates the arguments θag and φag of the post-switch selected region, or the change angles of the arguments θag and φag, so that the selected region 71 can be switched in polar coordinates. Here, if only one of the arguments θag and φag is changed, the selected region 71 moves to its new position in the direction in which the argument θag of the polar coordinate system changes (hereinafter, the "θag-changing direction") or in the direction in which the argument φag changes (hereinafter, the "φag-changing direction"). Further, if both the arguments θag and φag are changed, the selected region 71 moves obliquely to its new position with respect to the θag-changing direction or the φag-changing direction of the polar coordinate system.
If the selected region 71 is set by moving it sequentially in the θag-changing direction based on the switching command, the trajectory of an arbitrary point in the selected region 71 (for example, its center PO) follows line 51r. If, on the other hand, the selected region 71 is set by moving it sequentially in the φag-changing direction based on the switching command, the trajectory of the center PO of the selected region 71 follows line 51s. Note that when the selected region 71 moves, the image region ARs moves accordingly.

Further, if the field of view is represented by an image in two-dimensional space, the switching command indicates, for example, the argument and the radius of the post-switch selected region, or the change angle of the argument and the change of the radius, so that the selected region 71 can also be switched in polar coordinates in the case where the field of view is represented in two-dimensional space.
In this way, in the polar coordinate mode MS2, the selected region is switched by changing the argument and/or the radius of the polar coordinate system. Therefore, by providing the polar coordinate mode MS2 for the case of the upper or lower hemispherical field of view, the selected region 71 can easily be set to the position reached by moving it in the direction in which the argument changes according to the region switching command, so that the image shown on the display part 14 can easily be switched to the image lying in the desired direction. For example, one of the scenes arranged around the image pickup optical part 111 can easily be selected and displayed.
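A minimal sketch of the corresponding polar-coordinate switching, assuming the selected region is parameterized by the two angles θag and φag; the function names, parameterization, and projection comment are illustrative assumptions.

    import math
    from dataclasses import dataclass

    @dataclass
    class PolarRegion:
        theta: float   # argument θag (angle from the optical axis), radians
        phi: float     # argument φag (azimuth around the optical axis), radians
        size: float    # angular size of the selected region

    def switch_region_polar(region: PolarRegion, d_theta: float, d_phi: float) -> PolarRegion:
        """Polar-coordinate mode (MS2): change the arguments θag and φag.

        Changing only d_phi rotates the region around the optical axis
        (circumferential motion on the entire image Gcp); changing only d_theta
        moves it radially toward or away from the image center.
        """
        theta = min(max(region.theta + d_theta, 0.0), math.pi / 2)  # stay within the hemisphere
        phi = (region.phi + d_phi) % (2 * math.pi)
        return PolarRegion(theta, phi, region.size)

    # Under an equidistant fisheye assumption, the image-plane radius of the region
    # center is roughly proportional to theta, and its angular position equals phi.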
Figure 15 shows an example of a graphical user interface (GUI) displayed when the user operates the selected region using the input part 12. The operation input screens Gu shown in Figures 15(A) and 15(B) can be displayed on the display part 14 together with the entire image Gcp and the selected-region display image Gsc shown in Figure 7. Alternatively, the operation input screen Gu, the entire image Gcp, and so on may each be shown on separate display parts. Further, a display part may be provided on the input part 12, separate from the display part 14 that shows the entire image Gcp and the selected-region display image Gsc, and the GUI may be displayed on that separate display part. The operation input screen Gu is provided with an arrow button group Gua or an orientation button group Gub, an "enlarge" button Guc1, and a "reduce" button Guc2. In Figure 15(A), the arrow button group Gua includes, for example, a "select" button Gua1 in the middle and other arrow buttons Gua2 such as "up", "down", "left", and "right" around it. In Figure 15(B), the orientation button group Gub includes, for example, a "select" button Gub1 in the middle and orientation buttons Gub2 such as "N (north)" and "SE (southeast)" around it.
Figure 16 is a schematic diagram illustrating an example of switching the selected region by instruction when the orthogonal coordinate mode MS1 is selected. This example describes the case where the both-display mode MH3 is used as the display mode MH. Further, the following description uses the example in which the user performs operations while watching the operation input screen Gu shown in Figure 15(A).

Suppose that, in the state shown in Figure 16(A), the user presses the "right" button of the arrow buttons Gua2 once or several times, or presses it continuously, by using a mouse, a keyboard, or the like. Note that "pressing continuously" refers to the state in which the button is pressed once and then held down without being released.

If the "right" button is pressed in this way, then in accordance with the corresponding input information PS supplied from the input part 12, the processing control part 135 in the image processing section 13 performs switching processing on the selected region in the direction corresponding to the orthogonal coordinate mode MS1, thereby generating selected region setting information JA indicating the newly set selected region. Further, the processing control part 135 supplies the generated selected region setting information JA to the selected-region highlighting processing part 132 so as to change the highlighted display area, so that the highlighted image Gs indicates the image region ARs corresponding to the newly set selected region. The processing control part 135 also supplies the generated selected region setting information JA to the distortion correction processing part 133, so that the image of the image region ARs corresponding to the newly set selected region is provided as the selected-region display image Gsc, in which the distortion caused by the image pickup optical part 111 has been corrected.

Further, the image output processing part 134 generates image data DVd of a display image including the selected-region display image Gsc in accordance with the display mode, and supplies it to the display part 14.
Accordingly, the display part 14 shows an image in which the selected region has moved to the right, as shown in Figure 16(B), and the position of the highlighted image Gs in the entire image Gcp is updated. Further, if the "right" button is pressed repeatedly or continuously, the processing control part 135 sets the displacement of the selected region according to the number of times the "right" button has been operated or the length of time it has been held down. In this way, the processing control part 135 performs development processing in the orthogonal coordinate system so as to show the selected-region display image on the display part 14.

Further, if the user presses the "up" button, then in accordance with the corresponding input information PS, the processing control part 135 performs switching processing on the selected region, so that the display part 14 shows the highlighted image Gs, indicating the image region ARs that moves as the selected region is switched, together with the selected-region display image Gsc in which the distortion caused by the image pickup optical part 111 has been corrected.

The "select" buttons Gua1 and Gub1 of Figures 15(A) and 15(B) can be used in various ways. For example, they can be used as recording start buttons for recording the selected-region display image Gsc of the current image region ARs. Further, in the case where the image region ARs is not shown as the highlighted image Gs, they can be used as region selection start buttons that begin the region selection operation, so that the user can select a region under the condition that the image region ARs is shown as the highlighted image Gs. Alternatively, they can be used as switching buttons for the display mode MH or as any of various other confirmation buttons.

Further, in the case where a reference direction of the image pickup part 11 is set appropriately to a predetermined direction, for example north, if the processing control part 135 determines that the "E (east)" button of the orientation buttons Gub2 has been operated, it newly sets the selected region in the east direction, thereby updating the selected-region display image Gsc. Likewise, if the processing control part 135 determines that the "W (west)" button has been operated, it newly sets the selected region in the west direction, thereby updating the selected-region display image Gsc. In this way, by operating the button indicating the desired direction, an image in which the selected region is set to the desired new position can be shown on the display part 14 without distortion.

Note that the GUI displays shown in Figures 15(A) and 15(B) are merely examples, and the display is of course not limited to these.
Figure 17 shows the display images shown on the display part 14 when, for example, the orthogonal coordinate mode is selected and the display mode is switched sequentially. Figure 17(A) shows the display image in the case where the entire-image display mode MH1 is set and only the entire image Gcp is shown. Figure 17(B) shows the display image in the case where the selected-image display mode MH2 is set and only the selected-region display image Gsc is shown. Figure 17(C) shows the display image in the case where the both-display mode MH3 is set and both the entire image Gcp and the selected-region display image Gsc are shown. Further, when these display modes are switched sequentially, the entire-image display mode MH1, the selected-image display mode MH2, and the both-display mode MH3 are switched cyclically.

Figure 18 shows the display images shown on the display part 14 when the polar coordinate mode MS2 is selected. Further, Figure 18 shows the case where two selected regions are provided. Note that the number of selected regions is not limited to one or two, and can be set arbitrarily by the user. For example, the number can be increased each time the "menu" button Guc3 shown in Figure 15(A) or 15(B), or another GUI display not shown, is operated. In this case, the processing control part 135 performs processing in accordance with the input information PS from the input part 12 so as to provide a plurality of selected regions. Alternatively, a plurality of predetermined selected regions may be provided in advance, irrespective of the input information PS from the input part 12. Further, if the divided display mode MH4 is set, then according to the input information PS from the input part 12, the processing control part 135 can independently generate a selected-region display image for each selected region and show them simultaneously on the display part 14.
Now, the following description of the polar coordinate mode MS2 assumes the situation below. To simplify the description, suppose that a robot small enough to enter a pipe inaccessible to a person is in the state shown in Figure 18, and that this robot is equipped with an image pickup optical part 111 composed of, for example, a wide-angle lens. Further, suppose that the pipe has a crack 92 formed in the upper part of its inner wall 91. The selected-region display image Gsc1 is the display image of the first selected region; that is, it is an image obtained by performing distortion correction processing on the highlighted image Gs1 of the image region ARs1 corresponding to the first selected region. Likewise, the selected-region display image Gsc2 is an image obtained by performing distortion correction processing on the highlighted image Gs2 of the image region ARs2 corresponding to the second selected region.
In the state of Figure 18(A), if the input information PS supplied from the input part 12 is a switching command for switching the selected regions in the φag-changing direction of the polar coordinate system shown in Figure 14, for example, the processing control part 135 performs processing so as to switch the selected regions in the φag-changing direction according to the switching command, and supplies selected region setting information JA indicating the post-switch selected regions to the selected-region highlighting processing part 132 and the distortion correction processing part 133. The selected-region highlighting processing part 132 displays the highlighted images Gs1 and Gs2 corresponding to the post-switch selected regions based on the selected region setting information JA. Based on the selected region setting information JA, the distortion correction processing part 133 corrects the images of the image regions ARs1 and ARs2 corresponding to the post-switch selected regions, with the distortion caused by the image pickup optical part 111 corrected, into the selected-region display images Gsc1 and Gsc2. Accordingly, as shown in Figure 18(B), the display image after the switching command is issued shows the post-switch selected regions as the selected-region display images Gsc1 and Gsc2, and the highlighted images Gs1 and Gs2 indicate the regions of the selected-region display images Gsc1 and Gsc2 accordingly. In this case, the image regions ARs1 and ARs2 move counterclockwise on the entire image Gcp; if the switching command is in the opposite direction, they move clockwise.
Further, in the state of Figure 18(A), if the input information PS from the input part 12 is a switching command for switching the selected regions in the θag-changing direction of the polar coordinate system shown in Figure 14, for example, the processing control part 135 performs processing according to the switching command so as to switch the selected regions in the θag-changing direction, and supplies selected region setting information JA indicating the post-switch selected regions to the selected-region highlighting processing part 132 and the distortion correction processing part 133. The selected-region highlighting processing part 132 displays the highlighted images Gs1 and Gs2 corresponding to the post-switch selected regions based on the selected region setting information JA. Based on the selected region setting information JA, the distortion correction processing part 133 corrects the images of the image regions ARs1 and ARs2 corresponding to the post-switch selected regions, with the distortion caused by the image pickup optical part 111 corrected, into the selected-region display images Gsc1 and Gsc2. Accordingly, as shown in Figure 19, the display image after the switching command is issued shows the post-switch selected regions as the selected-region display images Gsc1 and Gsc2, and the highlighted images Gs1 and Gs2 indicate the regions of the selected-region display images Gsc1 and Gsc2 accordingly. In this case, the image regions ARs1 and ARs2 move radially toward each other on the entire image Gcp; if the switching command is in the opposite direction, they move radially away from each other.

Note that in the polar coordinate mode MS2 the selected regions can be switched using, for example, a mouse, a keyboard, a touch sensor, or the like, and the GUI in that case may take any form.

In the polar coordinate mode MS2, for example, with respect to the state of the display image shown in Figure 18(A), as shown in Figure 20, the selected-region display image Gsc1 can be shown at the bottom of the display part 14 rotated by 180 degrees, and the selected-region display image Gsc2 can be shown at the top of the display part 14 rotated by 180 degrees, in accordance with the input information PS obtained from the input part 12. In this way, the user can easily view the images at an easy-to-view angle.
Of course, a plurality of selected regions may also be provided in the orthogonal coordinate mode MS1 described above, so that the divided display mode MH4 can be shown accordingly. Alternatively, in both the orthogonal coordinate mode MS1 and the polar coordinate mode MS2, even when a plurality of selected regions are provided, the image processing section 13 may, instead of generating distortion-corrected images combined into one screen, generate the selected-region display image of a single selected region and output it to the display part 14. In this case, the processing control part 135 in the image processing section 13 controls, in accordance with the input information PS from the input part 12 or predetermined setting information, whether to output image data showing the selected-region display image of one selected region on the screen, or to generate the selected-region display images of the plurality of selected regions and show these selected-region display images on the screen together.

In the polar coordinate mode MS2, for example, each time the user operates the "menu" button Guc3, the processing control part 135 switches the display mode MH in the same manner as in the orthogonal coordinate mode MS1. Further, if a plurality of selected regions are provided, the divided display mode MH4 can be selected.

Figure 21 is a schematic diagram illustrating transitions of the display mode in the case where selection of the divided display mode MH4 is enabled. Here it is assumed that four selected regions are provided. For example, the display can be switched among: a mode in which the corrected image of one selected region is shown in the both-display mode MH3 (Figure 21(A)); a mode in which one selected-region display image is shown upside down in the both-display mode MH3 (Figure 21(B)); a mode in which the corrected images of two selected regions are shown in the divided display mode MH4 (Figure 21(C)); a mode in which the corrected images of two selected regions are shown upside down in the divided display mode MH4 (Figure 21(D)); a mode in which the corrected images of four selected regions are shown in the divided display mode MH4 (Figure 21(E)); and a mode in which the corrected images of four selected regions are shown upside down in the divided display mode MH4 (Figure 21(F)). Note that since not all of the corrected images of the selected regions are shown in Figure 21, the corrected images of the selected regions to be shown can be switched. It is, of course, also possible to enable switching to the entire-image display mode MH1 and the selected-image display mode MH2.
In this way, according to the present embodiment, since the orthogonal coordinate mode MS1 and the polar coordinate mode MS2 are provided as region selection modes that switch the selected region according to the direction in the field of view, intuitive and easy-to-understand operations can be performed, and an image processing device that is convenient and easy for the user to use can be realized.

In addition to the processing described above, the image processing section 13 can also perform processing to enlarge, reduce, or rotate the selected-region display image.
Processing for enlarging/reducing the selected-region display image is described below with reference to Figure 22. In the state shown in Figure 22(A), for example, the user operates the "enlarge" button Guc1 shown in Figures 15(A) and 15(B). The processing control part 135 then performs processing according to the current input information PS to reduce the extent of the selected region, and supplies selected region setting information JA indicating the post-processing selected region to the selected-region highlighting processing part 132 and the distortion correction processing part 133. The selected-region highlighting processing part 132 displays the highlighted image corresponding to the post-processing selected region based on the selected region setting information JA, and the distortion correction processing part 133 performs correction processing to convert the image of the corresponding image region into a selected-region display image in which the distortion caused by the image pickup optical part 111 is corrected based on the selected region setting information JA. Here, if the both-display mode MH3 is set as shown in Figure 22(A), then as shown in Figure 22(B) the selected-region display image Gsc corresponding to the reduced selected region is shown on the entire screen except for the display area of the entire image Gcp, so that the person image GM included in the image region ARs appears enlarged in the resulting display image compared with Figure 22(A). Note that since the selected region has been reduced, the image region ARs in the entire image Gcp becomes smaller.

Conversely, if the user operates the "reduce" button Guc2, the processing control part 135 performs processing according to the current input information PS to enlarge the extent of the selected region, and supplies selected region setting information JA indicating the post-processing selected region to the selected-region highlighting processing part 132 and the distortion correction processing part 133. Accordingly, the highlighted image Gs of the image region ARs corresponding to the enlarged selected region has its distortion corrected and is shown as the selected-region display image on the entire screen except for the display area of the entire image Gcp, so that the person image GM included in the image region ARs appears reduced compared with Figure 22(A). In this way, enlargement/reduction processing can be performed.
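The zoom behavior above amounts to scaling the angular extent of the selected region inversely with the requested display magnification. The following short sketch illustrates this; the scale factor, limits, and function names are assumptions for illustration only.

    def zoom_selected_region(region_size: float, zoom_in: bool,
                             step: float = 0.9,
                             min_size: float = 0.05, max_size: float = 1.0) -> float:
        """Return the new angular size of the selected region.

        Pressing "enlarge" (Guc1) shrinks the selected region, so the same display
        area shows a smaller part of the field of view and the subject appears
        larger; pressing "reduce" (Guc2) enlarges the selected region instead.
        """
        factor = step if zoom_in else 1.0 / step
        return min(max(region_size * factor, min_size), max_size)

    size = 0.5
    size = zoom_selected_region(size, zoom_in=True)   # "enlarge" button pressed
    size = zoom_selected_region(size, zoom_in=False)  # "reduce" button pressed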
Next, the case where the image processing section 13 rotates the image of the selected region and displays it is described with reference to Figures 23 and 24. Figures 23(A) and 23(B) show the state before a "rotate" operation is instructed; Figure 23(A) shows the case where the selected region is arranged such that, for example, the person image GM in the entire image Gcp is included in the image region ARs, while Figure 23(B) shows the selected-region display image Gsc, which is the image obtained by performing distortion correction processing on the highlighted image Gs of the image region ARs. If the user issues a "rotate" operation command, as shown in Figure 24(A), the processing control part 135 performs change processing on the selected region so that it rotates around the approximate midpoint of the image region ARs in accordance with the input information PS. In this case, in the image region ARs corresponding to the post-change selected region, the person image GM rotates in the opposite direction. Therefore, by generating the selected-region display image Gsc from the highlighted image Gs of the image region ARs corresponding to the post-change selected region, an image in which the person image GM is rotated in the direction opposite to the rotation direction of the selected region can be obtained, as shown in Figure 24(B). By this rotation processing, the user can easily view the observed object at an easy-to-view angle.
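As a rough illustration, rotating the selected region by an angle alpha about its center and then developing it makes the subject appear rotated by -alpha in the corrected image. The sketch below represents the selected region by its corner points; this representation and the helper names are assumptions.

    import math

    def rotate_point_about(px: float, py: float, cx: float, cy: float, alpha: float):
        """Rotate point (px, py) about center (cx, cy) by alpha radians."""
        dx, dy = px - cx, py - cy
        return (cx + dx * math.cos(alpha) - dy * math.sin(alpha),
                cy + dx * math.sin(alpha) + dy * math.cos(alpha))

    def rotate_selected_region(corners, center, alpha):
        """Rotate the selected region's corners about the region center.

        Developing the rotated region then shows the subject (e.g. the person
        image GM) rotated by -alpha relative to the previous display image.
        """
        cx, cy = center
        return [rotate_point_about(x, y, cx, cy, alpha) for (x, y) in corners]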
Further, instead of rotating the selected region, the entire image Gcp can be rotated. For example, starting from the state shown in Figure 25(A), in which the x axis (pan axis) of the orthogonal coordinate mode MS1 lies in the horizontal direction and the y axis (tilt axis) lies in the vertical direction, the entire image Gcp is rotated counterclockwise together with these x and y axes, and so is the highlighted image Gs, as shown in Figure 25(B). In this case, no change is caused in the selected-region display image Gsc supplied to the display part 14. This rotation can therefore be used to correct an inclination of the camera mounting angle, or the entire image Gcp can be deliberately rotated and displayed in the entire-image display mode MH1 as a special effect.
Figure 26 is a block diagram showing the configuration of an image processing system according to another embodiment of the present invention. In the following, descriptions of devices, functions, and so on similar to those of the image processing system 10 shown in Figure 1 are simplified or omitted in order to concentrate on the differences.

The image processing system 20 has a configuration in which a storage device 21 is added to the image processing system 10 shown in Figure 1. The storage device 21 is a device for storing, for example, the image data DVa generated by the image pickup part 11 and the various image data generated by the image processing section 13. As the storage device used for the storage device 21, any storage device capable of storing image data can be used, such as one based on a storage medium like an optical disc, a magnetic disk, a semiconductor memory, or a dielectric memory, or a tape-based device.

If, for example, the image data DVa is stored in the storage device 21, the image processing section 13 can read the image data DVa desired by the user from the storage device 21 in accordance with the input information PS from the input part 12 and show it on the display part 14. Specifically, an aspect can be considered in which, according to the input information PS based on the user's operation, the image processing section 13 reads the image data DVa of a past wide field-of-view image Gc stored in the storage device 21, sets a selected region for the field of view represented by the read image data DVa, and shows on the display part 14 a selected-region display image, which is an image of that selected region with its distortion corrected. Alternatively, independently of the user, the image processing section 13 can perform distortion correction processing on the image of a predetermined selected region in a past wide field-of-view image stored in the storage device 21 and show it on the display part 14.
In this case, the following specific example can be considered. For example, suppose the user selects a selected region in the wide field-of-view image obtained in real time from the image pickup part 11 and watches the entire image Gcp and the selected-region display image Gsc in real time, or stores them in the storage device 21. The user can then also select, in the stored entire image Gcp being watched, a region different from the one selected in real time, and view its selected-region display image Gsc.

Alternatively, instead of storing the image data of the entire image Gcp in the storage device 21, the image processing section 13 may store only the image data of the selected-region display image Gsc. In this case, the user can view this selected-region display image Gsc later. Of course, it may also store both the image data representing the entire image Gcp and the image data of the selected-region display image Gsc, or either of these images.

Alternatively, the image processing section 13 may also perform the processing of the flowchart shown in Figure 27. The image processing section 13 obtains a real-time wide field-of-view image from the image pickup part 11 (ST2401), and also obtains a past wide field-of-view image or a past selected-region display image stored in the storage device 21 (ST2402). The image processing section 13 can perform processing to combine the wide field-of-view image thus obtained with the past wide field-of-view image or the past selected-region display image into one screen of image data (ST2403), and output the combined image data to the display part 14 (ST2404). Alternatively, the image processing section 13 may show the obtained wide field-of-view image and the past wide field-of-view image or past selected-region display image on different display parts 14. Note that ST2401 and ST2402 may also be performed in the reverse order.
Alternatively, an aspect can be considered in which the image processing section 13 outputs a selected-region display image Gsc obtained by distortion correction processing from the real-time wide field-of-view image or from a past wide field-of-view image stored in the storage device (that is, an image that may or may not be generated in real time), together with a past selected-region display image. Specifically, as shown in Figure 28, the image processing section 13 obtains the real-time wide field-of-view image from the image pickup part 11, or obtains a past wide field-of-view image from the storage device 21 (ST2501). Further, the image processing section 13 obtains a past selected-region display image stored in the storage device 21 (ST2502). The image processing section 13 performs distortion correction processing on the image of the selected region in the wide field-of-view image obtained in ST2501 (ST2503). The image processing section 13 then performs processing to combine the selected-region display image generated by this distortion correction processing with the selected-region display image obtained in ST2502 into one screen of image data (ST2504), and outputs it to the display part 14 as the selected-region display image Gsc (ST2505). Note that ST2501 and ST2502, or ST2502 and ST2503, may also be performed in the reverse order.

In the case of the processing shown in Figure 28, the image processing section 13 may further output the selected-region display image obtained by the distortion correction processing in ST2503 (hereinafter called the real-time selected-region display image) and the past selected-region display image in such a manner that a person can distinguish them on the display part. Specifically, the image processing section 13 may, for example, generate an image in which an identifier is attached to at least one of the real-time selected-region display image and the past selected-region display image, or generate an image in which each of these images is given a frame of a different color.
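A minimal sketch of the Figure 28 flow under stated assumptions: images are NumPy arrays, the distortion correction of ST2503 is represented only by a placeholder, and the two corrected images are made distinguishable by colored borders before being placed side by side on one screen. The function names and border colors are illustrative, not part of this document.

    import numpy as np

    def correct_distortion(wide_image: np.ndarray, region) -> np.ndarray:
        """Placeholder for the distortion correction of the selected region (ST2503)."""
        raise NotImplementedError

    def add_border(img: np.ndarray, color, width: int = 4) -> np.ndarray:
        out = img.copy()
        out[:width, :] = color; out[-width:, :] = color
        out[:, :width] = color; out[:, -width:] = color
        return out

    def compose_screen(live_wide: np.ndarray, region,
                       past_display: np.ndarray) -> np.ndarray:
        live_display = correct_distortion(live_wide, region)      # ST2503
        live_display = add_border(live_display, (0, 255, 0))      # real-time image: green frame
        past_display = add_border(past_display, (0, 0, 255))      # past image: blue frame
        h = max(live_display.shape[0], past_display.shape[0])
        w = live_display.shape[1] + past_display.shape[1]
        screen = np.zeros((h, w, 3), dtype=np.uint8)              # ST2504: one screen
        screen[:live_display.shape[0], :live_display.shape[1]] = live_display
        screen[:past_display.shape[0], live_display.shape[1]:] = past_display
        return screen                                             # ST2505: output to display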
Note that if the image data stored in the storage device 21 is a moving image, moving image data of a specified volume corresponding to the storage capacity may be stored in the storage device 21, with image frames then being erased automatically in sequence.

Further, the following use of the storage device 21 can be considered. For example, the user performs an operation to set a selected region at an arbitrary time in the real-time wide field-of-view image or in a wide field-of-view image based on image data read from the storage device 21, and in response to this operation the image processing section 13 stores in the storage device 21 only position information indicating how the selected region was set. Further, if the selected region is switched in the region selection mode MS described above, the image processing section 13 can store in the storage device 21 trace information that allows the switching of the selected region to be reproduced. Figure 29 is a flowchart showing the processing for storing the position information or the trace information. The image processing section 13 obtains the input information used when setting a selected region for the wide field-of-view image obtained in real time from the image pickup part 11 or for the wide field-of-view image based on the image data read from the storage device 21 (ST2601). The processing control part 135 in the image processing section 13 generates position information of this selected region according to the input information and, if a switching operation is performed on the selected region, generates trace information that allows the switching of the selected region to be reproduced (ST2602). The processing control part 135 in the image processing section 13 then stores the generated position information or trace information in the storage device 21 (ST2603).
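A small sketch of what storing such position and trace information might look like, assuming the selected region is described by pan/tilt angles and a size; the record format, file handling, and class name are assumptions for illustration.

    import json
    import time

    class RegionTraceRecorder:
        """Records selected-region positions (ST2602) and persists them (ST2603)."""

        def __init__(self, path: str):
            self.path = path
            self.trace = []  # list of {time, pan, tilt, size} records

        def record(self, pan: float, tilt: float, size: float) -> None:
            self.trace.append({"time": time.time(), "pan": pan, "tilt": tilt, "size": size})

        def save(self) -> None:
            with open(self.path, "w") as f:
                json.dump(self.trace, f)

    # Replaying the stored trace later reproduces the switching of the selected region.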
This aspect is effective in cases where an image of a predetermined range, or an image following a trajectory within a range such as a specific location, is needed. For example, in a case where the image pickup part 11 is installed as a fixed-point security camera, an image of a predetermined range in the wide field-of-view image, or an image of a trajectory within that range, may be needed. In this case, by setting the selected region corresponding to the predetermined range in the wide field-of-view image, or the trajectory of that predetermined range, the user can always monitor on the display part 14 the selected-region display image of that range, or its trajectory, as the display image. In this aspect, for example, if the storage device 21 stores the "trajectory" of the selected-region display image of the predetermined range, the continuous movement of the selected region from its start point to its end point can be repeated automatically and periodically. Further, the image for each cycle can be stored in the storage device 21. This aspect is, of course, not limited to security purposes.

The image of the predetermined range described above may be a still image or a moving image. The image of the trajectory of a specific range may also be a series of still images at positions along the trajectory, or may be stored as a moving image covering the trajectory from its start point to its end point. In this case, the image processing section 13 can perform processing to output the distortion-corrected image or still image.
Figure 30 is an explanatory diagram of a method for setting such a trajectory of a predetermined range as an aspect of using the storage device 21, as described above.

The user sets the selected region for the wide field-of-view image so as to switch the position of the selected region. Specifically, this is achieved by repeating the operation of deciding, for example, which of the arrow buttons Gua2 or the orientation buttons Gub2 shown in Figures 15(A) and 15(B) to operate, and moving the selected region in the direction indicated by the operated button. In the internal processing of the image processing system 20, as shown in ST2602 and ST2603 of Figure 29, the image processing section 13 stores the position information of the current selected region in the storage device 21 in accordance with the input information indicating operation of the "select" button Gua1 or Gub1. For example, if the "select" button Gua1 or Gub1 is operated when the image region ARs corresponding to the selected region is at position a on the wide field-of-view image Gc, the position information of the selected region at that point in time is stored. Further, if the "select" button Gua1 or Gub1 is operated after the selected region has been switched and the image region ARs corresponding to the post-switch selected region is at position b on the wide field-of-view image Gc, the position information of the selected region at that point in time is stored. Similarly, when the image region ARs corresponding to the selected region is at positions c and d on the wide field-of-view image Gc, the position information of the selected region at those points in time is stored.

Alternatively, even if the user does not operate the arrow buttons Gua2 or the orientation buttons Gub2, the trajectory may be set using a program prepared in advance, or may be a trajectory generated by automatic recognition using the various sensors described above. In this case, the trajectory may be such that the selected region is set so that the image region ARs is placed at the discrete positions a, b, c, and d in Figure 30, or such that the selected region is set continuously from point a to point d. Alternatively, if the user has set the discrete points at positions a, b, c, and d, the image processing section 13 may have a program installed that sets the selected region so that the image region ARs is placed at points interpolating positions a, b, c, and d, as in the sketch below.
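One simple way to realize the interpolation mentioned above is linear interpolation between the stored positions; the following sketch is an illustrative assumption, not the stated method of this document.

    def interpolate_track(points, steps_per_segment: int = 20):
        """Yield selected-region center positions along a track through the
        stored points (e.g. a, b, c, d of Figure 30), linearly interpolated."""
        for (x0, y0), (x1, y1) in zip(points, points[1:]):
            for i in range(steps_per_segment):
                t = i / steps_per_segment
                yield (x0 + t * (x1 - x0), y0 + t * (y1 - y0))
        yield points[-1]

    track = list(interpolate_track([(100, 80), (220, 90), (300, 200), (180, 260)]))
    # Each generated position can then be used to set the selected region in turn,
    # repeating periodically from the start point to the end point.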
Alternatively, a plurality of pieces of trace information may be provided in advance so that the user can select any one of them.
Figure 31 is a block diagram showing the configuration of an image processing system according to a further embodiment of the present invention. This image processing system 30 is provided with the storage device 21 instead of the image pickup part 11 described above. In the storage device 21, wide field-of-view images such as those described above are stored in advance, for example. This configuration allows the image processing section 13 to read the image data DVm of a wide field-of-view image and obtain a selected-region display image by performing development processing on this wide field-of-view image.

In addition, in the image processing system 30, the image processing section 13 can obtain selected-region display images by development processing from wide field-of-view images based on the data stored in advance in the storage device 21, and store them in the storage device 21 with the wide field-of-view images and the selected-region display images associated with one another. Alternatively, the image processing section 13 may store in the storage device 21, associated with one another, the wide field-of-view images based on the previously stored image data and information indicating the selected regions in those wide field-of-view images on which development processing is to be performed.
The present invention is not limited to the embodiments described above, and various modifications can be made.

The entire image Gcp and the selected-region display image Gsc shown on the display part 14 may also be displayed alternately, switching every predetermined time. In this case, the entire image Gcp and the selected-region display image Gsc may also be switched in response to any input operation by the user.

In Figures 1 and 26, the image pickup element 112 in the image pickup part 11, the input part 12, the image processing section 13, the display part 14, the storage device 21, and so on may be connected to one another via the Internet, a local area network (LAN), or another network such as a dedicated line.

The image processing systems according to the embodiments described above can be used in various fields, such as security systems, teleconference systems, systems for inspecting, managing, and testing machines and facilities, road traffic systems, systems using mobile cameras (for example, cameras for shooting from vehicles, aircraft, or any other moving objects), baby-care (nursing-care) systems, medical systems, and so on.

An image processing system combining the embodiments shown in Figures 26 and 31 can also be realized. That is, a system can be realized that includes a storage device on each of its front end (stage) and back end.
Next, another embodiment of the present invention will be described. In the present embodiment, the processing control part 135 in the image processing section 13 can switch the region selection mode MS according to the installation direction (installation angle) in which the image pickup part 11 is installed. The configuration and operation of the image processing system 40 in this case are described below.

Figure 32 is a block diagram showing the configuration of an image processing system according to another embodiment of the present invention. As shown in Figure 32, in this embodiment, in addition to the elements of the image processing system 10 shown in Figure 1, the image processing system 40 includes a direction detection sensor, for example a gyro sensor 41, that detects the direction in which the optical image is incident on the image pickup optical part 111.

The gyro sensor 41, fixed to the image pickup optical part 111, detects the direction in which the optical image is incident on the image pickup optical part 111, and supplies a sensor signal ES indicating the detection result to the processing control part 135 in the image processing section 13. Note that the following description is based on the assumption that the image pickup optical part 111, the image pickup element 112, and the gyro sensor 41 are all integrated in the image pickup part 11.
Figure 33 is a conceptual schematic diagram of the manner of switching the region selection mode MS in the present embodiment according to the direction in which the image pickup part 11 is placed.

As shown in Figure 33, the states in which the image pickup part 11 is placed can be roughly divided into three cases: the case where it is placed on the ground, a floor, a desk, or the like to pick up the desired scene MH, as in Figure 10 above (the upper hemispherical field of view of Figure 33(A)); the case where it is placed on a wall perpendicular to the ground to pick up the scene MH (the front hemispherical field of view of Figure 33(B)); and the case where it is placed on a ceiling or in the sky to pick up the scene MH (the lower hemispherical field of view of Figure 33(C)).

Accordingly, the processing control part 135 in the image processing section 13 automatically sets or switches the region selection mode MS according to the angle of the image pickup part 11 with respect to the vertical direction, determined from the sensor signal of the direction detection sensor.

Specifically, if the image pickup part 11 is in the upper or lower hemispherical field of view, the region selection mode MS is switched to the polar coordinate mode MS2, and if it is in the front hemispherical field of view, the region selection mode MS is switched to the orthogonal coordinate mode MS1. By switching the modes in this way, if the image pickup part 11 is in the upper or lower hemispherical field of view, the surroundings of the scene can be observed more easily and more evenly than the scene at the center of the wide field-of-view image Gc itself. Further, if the image pickup part 11 is in the front hemispherical field of view, setting the orthogonal coordinate mode MS1 makes it possible to observe the scene in detail at the center of the wide field-of-view image Gc while also easily observing the scene in the up-down and left-right directions.

Note that in the present embodiment, the state in which the image pickup part 11 is placed in the upper hemispherical field of view and the region selection mode MS is switched to the polar coordinate mode MS2, as shown in Figure 33(A), is called S0; the state in which the image pickup part 11 is placed in the front hemispherical field of view and the region selection mode MS is switched to the orthogonal coordinate mode MS1, as shown in Figure 33(B), is called S1; and the state in which the image pickup part 11 is placed in the lower hemispherical field of view and the region selection mode MS is switched to the polar coordinate mode MS2, as shown in Figure 33(C), is called S2.
Figure 34 is an explanatory schematic diagram of a method for setting the thresholds used to switch between the states S0, S1, and S2. As shown in Figure 34, in the present embodiment, assuming, for example, that the state in which the image pickup part 11 is in the upper hemispherical field of view is the reference position (ψ = 0 degrees), two thresholds (ψ1, ψ2) are first used to determine which of the three states S0, S1, and S2 applies, according to whether the installation angle of the image pickup part 11 exceeds the threshold ψ1 or ψ2, and the region selection mode MS is set based on the determination result. Further, if the installation angle of the image pickup part 11 is changed after the region selection mode MS has been set, thresholds other than the above-mentioned thresholds ψ1 and ψ2 are used when the states are switched between one another, so as to provide hysteresis.

Specifically, as shown in Figure 34(A), values within a range of, for example, ψ1 ± 10 degrees are set as new thresholds ψ3 and ψ4 in addition to the thresholds ψ1 and ψ2, and values within a range of, for example, ψ2 ± 10 degrees are set as new thresholds ψ5 and ψ6. As shown in Figure 34(B), the threshold ψ3 is the threshold used when the state switches from S0 to S1, and the threshold ψ4 is the threshold used when the state switches from S1 to S0. Further, the threshold ψ5 is the threshold used when the state switches from S1 to S2, and the threshold ψ6 is the threshold used when the state switches from S2 to S1. The magnitude relationship of these thresholds, including ψ1 and ψ2, is ψ4 < ψ1 < ψ3 < ψ6 < ψ2 < ψ5, as shown in Figure 34(B). Note that although the threshold ψ1 is, for example, 45 degrees and the threshold ψ2 is, for example, 135 degrees, they are not limited to those values. Further, the thresholds ψ3, ψ4, ψ5, and ψ6 are not limited to the ±10-degree range mentioned above; they may lie within a ±5-degree range, a ±15-degree range, or the like, and they may also be set such that the differences of ψ3 and ψ4 from ψ1 have different absolute values, or such that the differences of ψ5 and ψ6 from ψ2 have different absolute values.
Next, the operation of switching between the above-described states S0, S1, and S2 in the present embodiment will be described. Figure 35 is a flowchart of the operation in the case where the image processing system 40 switches between the states S0, S1, and S2.

As shown in Figure 35, first, the image pickup part 11 is placed at the reference position described above (ψ = 0 degrees). The processing control part 135 in the image processing section 13 obtains the measurement result of the gyro sensor 41 at this time (ST3201). Subsequently, the image pickup part 11 is placed in the desired position. The processing control part 135 obtains the measurement result of the gyro sensor 41 at this time and determines the current angle ψp from it together with the measurement result obtained in ST3201 (ST3202).

Next, the processing control part 135 determines whether the current angle ψp is equal to or less than the threshold ψ1 (ST3203). If ψp ≤ ψ1 (Yes), the processing control part 135 determines that the state S0 described above applies and sets the region selection mode MS to the polar coordinate mode MS2 for the upper hemispherical field of view (ST3204).

If ψp > ψ1 (No) in step ST3203, the processing control part 135 further determines whether the current angle ψp is equal to or less than the threshold ψ2 (ST3207). If ψp ≤ ψ2 (Yes), the processing control part 135 determines that the state S1 described above applies and sets the region selection mode MS to the orthogonal coordinate mode MS1 for the front hemispherical field of view (ST3208).

If ψp > ψ2 (No) in step ST3207, the processing control part 135 determines that the state S2 described above applies and sets the region selection mode MS to the polar coordinate mode MS2 for the lower hemispherical field of view (ST3212).

After setting the region selection mode MS to the polar coordinate mode MS2 in ST3204, the processing control part 135 reads the current angle ψp from the gyro sensor 41 again (ST3205) and determines whether the current angle ψp is equal to or greater than the threshold ψ3 (ST3206). If ψp ≥ ψ3 (Yes), the processing control part 135 determines that the image pickup part 11 has changed to the state S1 described above and sets the region selection mode MS to the orthogonal coordinate mode MS1 (ST3208). If ψp < ψ3 (No), the processing control part 135 maintains the polar coordinate mode MS2 (ST3204).

After setting the region selection mode MS to the orthogonal coordinate mode MS1 in ST3208, the processing control part 135 reads the current angle ψp from the gyro sensor 41 again (ST3209) and determines whether the current angle ψp is equal to or less than the threshold ψ4 (ST3210). If ψp ≤ ψ4 (Yes), the processing control part 135 determines that the image pickup part 11 has changed to the state S0 described above and sets the region selection mode MS to the polar coordinate mode MS2 (ST3204).

If ψp > ψ4 (No) in ST3210, the processing control part 135 further determines whether the current angle ψp is equal to or greater than the threshold ψ5 (ST3211). If ψp ≥ ψ5 (Yes), the processing control part 135 determines that the image pickup part 11 has changed to the state S2 described above and sets the region selection mode MS to the polar coordinate mode MS2 (ST3212). If ψp < ψ5 (No), the processing control part 135 maintains the orthogonal coordinate mode MS1 (ST3208).

After setting the region selection mode MS to the polar coordinate mode MS2 in ST3212, the processing control part 135 reads the current angle ψp from the gyro sensor 41 again (ST3213) and determines whether the current angle ψp is equal to or less than the threshold ψ6 (ST3214). If ψp ≤ ψ6 (Yes), the processing control part 135 determines that the image pickup part 11 has changed to the state S1 described above and sets the region selection mode MS to the orthogonal coordinate mode MS1 (ST3208).

If ψp > ψ6 (No) in ST3214, the processing control part 135 maintains the polar coordinate mode MS2 (ST3212).

In this way, by repeating the above processing, the processing control part 135 automatically switches the region selection mode MS according to the installation angle of the image pickup part 11.
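The flow of Figure 35 can be summarized as a small state machine with hysteresis. The following sketch reflects the thresholds ψ1 to ψ6 described above; the concrete values and function names are illustrative assumptions.

    PSI1, PSI2 = 45.0, 135.0            # base thresholds (degrees), example values
    PSI3, PSI4 = PSI1 + 10, PSI1 - 10   # hysteresis around PSI1 (S0 <-> S1)
    PSI5, PSI6 = PSI2 + 10, PSI2 - 10   # hysteresis around PSI2 (S1 <-> S2)

    def initial_state(psi: float) -> str:
        """ST3203/ST3207: classify the installation angle on first placement."""
        if psi <= PSI1:
            return "S0"   # upper hemisphere -> polar coordinate mode MS2
        if psi <= PSI2:
            return "S1"   # front hemisphere -> orthogonal coordinate mode MS1
        return "S2"       # lower hemisphere -> polar coordinate mode MS2

    def next_state(state: str, psi: float) -> str:
        """ST3205-ST3214: re-evaluate with hysteresis as the angle changes."""
        if state == "S0":
            return "S1" if psi >= PSI3 else "S0"
        if state == "S1":
            if psi <= PSI4:
                return "S0"
            return "S2" if psi >= PSI5 else "S1"
        return "S1" if psi <= PSI6 else "S2"   # state == "S2"

    MODE = {"S0": "MS2", "S1": "MS1", "S2": "MS2"}  # region selection mode per state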
Here, the method of computing coordinates in the case where the polar coordinate mode MS2 is set in the states S0 and S2, or the orthogonal coordinate mode MS1 is set in the state S1, and of displaying the selected-region display image Gsc in each mode, is described below. Figures 36, 37, 38, and 39 relate to this coordinate computation method.

First, as shown in Figure 36(A), in the polar coordinate mode MS2 (upper hemispherical field of view), suppose that the pan angle (rotation about the z axis) and the tilt angle (rotation about the x axis or y axis) of the display surface 81 are H and V, respectively (each being 0 degrees in the x-axis direction).

Then, in the orthogonal coordinate mode MS1, the field of view is tilted by 90 degrees with respect to that direction, so the coordinate axes of the polar coordinate mode MS2 shown in Figure 36(A) are exchanged and the above pan and tilt values change accordingly.

Specifically, as shown in Figure 36(B), the axes of Figure 36(A) are exchanged, that is, the x, y, and z axes are exchanged with the y, z, and x axes, respectively, in terms of their rotation angles, and the pan angle (H) and the tilt angle (V) of Figure 36(A) become the pan angle (h) and the tilt angle (v). In this case, the direction vector [D], which indicates the set direction of the selected region corresponding to the region whose image is to be shown on the display part 14, is obtained in each coordinate system by rotating the x-axis unit vector by the matrices shown in Figure 37. The sine and cosine values of the post-rotation pan angle (h) and tilt angle (v) are then obtained as follows:

sin(v) = -cos(H)·cos(V)
cos(v) = (1 - sin(v)^2)^(1/2)
sin(h) = cos(H)·cos(V)/cos(v)
cos(h) = -sin(v)/cos(v)

Note that h = H and v = V in the polar coordinate mode MS2.
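Since the closed-form relations above are reproduced here as printed, the short sketch below computes the same kind of pan/tilt conversion numerically as a cross-check: build the direction vector from (H, V), apply a 90-degree pitch of the viewing frame (an assumed axis convention for the change from MS2 to MS1), and recover (h, v) with atan2. Under this assumption the first relation, sin(v) = -cos(H)cos(V), is reproduced; the rest of the convention is an illustrative assumption.

    import math

    def direction_from_pan_tilt(H: float, V: float):
        """Unit direction vector for pan H and tilt V (radians), x axis at 0 degrees."""
        return (math.cos(V) * math.cos(H),
                math.cos(V) * math.sin(H),
                math.sin(V))

    def pan_tilt_after_90deg_pitch(H: float, V: float):
        """Pan/tilt (h, v) of the same direction after pitching the frame 90 degrees
        about the y axis, as when going from the polar mode MS2 to the orthogonal
        mode MS1. In MS2 itself, h = H and v = V."""
        x, y, z = direction_from_pan_tilt(H, V)
        xr, yr, zr = z, y, -x            # assumed 90-degree rotation about the y axis
        v = math.asin(zr)                # note: sin(v) = -cos(H)cos(V)
        h = math.atan2(yr, xr)
        return h, v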
On the other hand, if a fixed-pixel output such as Video Graphics Array (VGA) is used as the display part 14, the coordinates of the display surface [a] matched to this output are described as follows:

[a] = [a00 = (0,0), a01 = (r,0), ..., a10 = (0,q), a11 = (r,q), a12 = (2r,q), ..., amn = (nr, mq), ..., aMN = (Nr, Mq)]

Suppose the display surface [a] is a three-dimensional surface [P0] consisting of a point sequence perpendicular to the x axis and parallel to the y and z axes, whose center lies on the x axis at x coordinate R (for example, R = the fisheye radius), as shown in Figure 38(B). Then [a] above is made three-dimensional as shown in Figure 38(C), as follows:

[a] = [(x0,y0), (x1,y0), ..., (xN,yM)]
→ [A] = [(x0,y0,1), (x1,y0,1), ..., (xN,yM,1)]

This matrix [A] is multiplied by the matrix [K] shown in Figure 38(C) to give the matrix [P0] (= [K][A]).
Next, by setting this parameter on the operation input screen Gu of Figures 15(A) and 15(B) described above, as shown in Figure 39(A), the surface [P0] is expanded and moved onto the sphere shown in Figure 38(B). In this case, the moved surface is denoted [P2], and the point on the surface [P2] corresponding to a point Pj on the surface [P0] is the point Pj2 (x2, y2, z2). Each point on this surface [P2] can be obtained by the computation of Figure 39(D) using the matrices [P0], [P2], [X], [Y], [Z], and [M] shown in Figure 39(C).

That is, [P2] = [Z][Y][X][M][P0] in the orthogonal coordinate mode MS1, and [P2] = [Z][Y][M][P0] in the polar coordinate mode MS2. The coordinate values of the surface [P2] are computed in this way, and the coordinate values (x2, y2, z2) corresponding to the point Pj2 are used.

Then, in Figure 39(B), by performing processing similar to the distortion correction processing principle of Figure 12 described above based on the point Pj2 (x2, y2, z2) on [P2], the focus point Qj2 corresponding to the point Pj2 (x2, y2, z2) can be obtained. Further, by performing similar processing for each point on [P2], the position of each pixel on the image pickup element 112 that forms the selected-region display image Gsc obtained after distortion correction processing in the orthogonal coordinate mode MS1 and the polar coordinate mode MS2 can be obtained. The pixel data at the obtained pixel positions are then used, thereby enabling display of a selected-region display image Gsc in which no distortion is generated.
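To make the overall pipeline concrete, the following sketch maps each display pixel to a point on a plane at distance R, rotates that plane according to the pan/tilt of the selected region, and projects the result into the wide field-of-view image with an assumed equidistant fisheye model. The projection model, rotation conventions, and names are assumptions for illustration; the matrices [K], [X], [Y], [Z], and [M] of the document are summarized here by explicit rotations.

    import math

    def display_pixel_to_source(u: int, v: int, width: int, height: int,
                                R: float, pan: float, tilt: float,
                                fisheye_cx: float, fisheye_cy: float,
                                fisheye_radius: float):
        """Map display pixel (u, v) to a source pixel in the wide field-of-view image.

        The display plane is centered on the x axis at distance R (surface [P0]);
        it is then rotated by tilt about the y axis and pan about the z axis
        (surface [P2]), and projected with an assumed equidistant fisheye model.
        """
        # Point on the plane [P0]: x = R, y/z span the display surface.
        x, y, z = R, u - width / 2.0, height / 2.0 - v
        # Tilt about the y axis, then pan about the z axis.
        x, z = x * math.cos(tilt) - z * math.sin(tilt), x * math.sin(tilt) + z * math.cos(tilt)
        x, y = x * math.cos(pan) - y * math.sin(pan), x * math.sin(pan) + y * math.cos(pan)
        # Equidistant projection: image radius proportional to the angle between
        # the ray and the optical axis (taken here as the x axis).
        norm = math.sqrt(x * x + y * y + z * z)
        theta = math.acos(max(-1.0, min(1.0, x / norm)))
        phi = math.atan2(z, y)
        r = fisheye_radius * theta / (math.pi / 2)
        return fisheye_cx + r * math.cos(phi), fisheye_cy + r * math.sin(phi)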
With the image processing system 40 according to the present embodiment, through the configuration and operation described above, the installation angle of the image pickup part 11 can be detected with the gyro sensor 41 and the region selection mode MS corresponding to this installation angle can be switched appropriately, thereby improving convenience for the user.

Further, since the region selection mode is switched under a condition that maintains hysteresis, even when the installation angle fluctuates near the above-mentioned thresholds ψ1 and ψ2, the region selection mode MS is prevented from being switched frequently due to this fluctuation, which keeps the user from being inconvenienced.

Note that the direction detection sensor may be any other sensor that detects the installation angle of the image pickup part 11, for example a gravity sensor instead of the gyro sensor 41 described above.

Further, although the present embodiment switches the region selection mode MS according to the installation angle of the image pickup part 11, the region selection mode MS may also be switched according to, for example, whether the image pickup part 11 is in contact with the scene.
Figure 40 is a conceptual schematic diagram of a method for switching the mode according to contact. For example, in the case where the image pickup part 11 moves through a pipe 95 in the direction of arrow T, the processing control part 135 sets, for example, the polar coordinate mode MS2 inside the pipe 95 so as to pick up the surroundings of the pipe 95. Further, if the image pickup part 11 comes into contact with a wall surface 96 at the end of the pipe 95, the processing control part 135 can switch to the orthogonal coordinate mode MS1 so as to pick up this wall surface. In this case, the image processing system can be equipped with a detection sensor for detecting the contact, and the detection result can be supplied to the processing control part 135 in the image processing section 13. The detection sensor may be a mechanical sensor or an optical sensor.

Further, the automatic switching between the orthogonal coordinate mode MS1 and the polar coordinate mode MS2 may be performed without using the direction detection sensor or any other detection sensor. Finally, the case where the orthogonal coordinate mode MS1 and the polar coordinate mode MS2 are switched between one another without using a sensor is described below.

Here, for ease of understanding the automatic switching, the following description assumes that the field of view of the image pickup part 11 is, for example, 270 degrees. Figure 41 shows the case where the image pickup optical part 111 obtains a 270-degree field of view by using a super-wide-angle lens. The light incident on the image pickup optical part 111 travels toward the image pickup element 112, so as to form a wide field-of-view image Gc of the 270-degree field of view on the sensor surface of the image pickup element 112.
Further, if it is shown in Figure 42, if placing the image pickup parts 11 with 270 degree visuals field makes as the direction of the arrow OCt of visual field center position and to provide such situation, hemisphere visual field 53h and episphere visual field 53u before wherein obtaining about horizontal direction 45 degree upwards.
Figure 43 shows an example in which the image pickup section 11 having the 270-degree field of view is installed. For example, the image pickup section 11 is attached to the bow of a ship in such a manner that the center direction of the field of view points 45 degrees upward with respect to the horizontal direction, and seats FS are arranged behind the image pickup section 11. With the image pickup section 11 installed in this way, if the forward view is displayed on the display section 14 in the orthogonal coordinate mode MS1, which is suited to the front hemispherical field of view, the selected region can easily be set on the view in a desired direction, so that a distortion-free image of the view is displayed on the display section 14. If, on the other hand, a passenger sitting on one of the rear seats FS is displayed on the display section 14 in the polar coordinate mode MS2, which is suited to the upper hemispherical field of view, the selected region can easily be set on the passenger in a desired direction, so that a distortion-free image of the passenger is displayed on the display section 14. Accordingly, the image processing section 13 automatically switches the region selection mode MS according to the direction in which the selected region is set, so that an image in which the distortion caused by the image pickup optical section 111 has been corrected can be displayed on the display section 14.
Here, when the selected region is set by designating its direction, the direction of the selected region can be determined from the input information that designates the selected region, for example from the angular range indicating the extent of the selected region given by the input information PS. Further, because the selected region and the image region ARs correspond to each other, the direction in which the selected region is set can also be determined from the position, within the image, of the image region ARs on which the distortion correction processing is performed.
Figure 44 shows a case in which the region selection mode MS is switched automatically according to the direction in which the selected region is set. If the image pickup section 11 is placed as shown in Figure 43, the wide field-of-view image Gc having the 270-degree field of view becomes the one shown in Figure 44. It should be noted that if the image pickup section 11 is installed so that the center of the wide field-of-view image Gc lies in the optical axis direction of the image pickup optical section 111 and the center direction of the field of view points 45 degrees upward with respect to the horizontal direction, the position straight ahead corresponds to, for example, the point Pf on the image, located at the 90-degree position within the field of view.
In Figure 44(A), a region AS1 is provided so as to contain the image of the forward area; if the selected region is set in such a direction that, for example, the center of the image region ARs (not shown) falls within the region AS1, the orthogonal coordinate mode MS1 is set. Similarly, a region AS2 is provided so as to contain the image of the rearward area; if the selected region is set in such a direction that, for example, the center of the image region ARs falls within the region AS2, the polar coordinate mode MS2 is set. It should be noted that if the center of the image region ARs is contained in neither the region AS1 nor the region AS2, the currently set region selection mode MS is kept.
Alternatively, as shown in Figure 44(B), the wide field-of-view image Gc may be subdivided in a matrix in advance and the region selection mode MS to be set may be assigned to each subdivided region; the region selection mode MS is then set according to which region contains, for example, the center of the image region ARs. For example, each region ASm1 that contains the image of the forward area is assigned the orthogonal coordinate mode MS1 as the region selection mode MS to be set, and each region ASm2 that contains the image of the rearward area is assigned the polar coordinate mode MS2. If the center of the image region ARs corresponding to the selected region falls within a region ASm1, the orthogonal coordinate mode MS1 is set; if it falls within a region ASm2, the polar coordinate mode MS2 is set.
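A minimal Python sketch of this decision is given below for illustration only; the region boundaries and the normalized coordinate convention are placeholder assumptions, not values from the original text.

    # Sketch (illustrative assumption): choose the region selection mode from the position
    # of the center of the image region ARs. The region boundaries below are placeholder
    # values in normalized image coordinates.
    from typing import Tuple

    Rect = Tuple[float, float, float, float]           # (x_min, y_min, x_max, y_max)

    AS1: Rect = (0.0, 0.6, 1.0, 1.0)                    # contains the image of the forward area
    AS2: Rect = (0.0, 0.0, 1.0, 0.4)                    # contains the image of the rearward area

    def contains(rect: Rect, point: Tuple[float, float]) -> bool:
        x_min, y_min, x_max, y_max = rect
        x, y = point
        return x_min <= x < x_max and y_min <= y < y_max

    def select_mode(ars_center: Tuple[float, float], current_mode: str) -> str:
        """MS1 if the center lies in AS1, MS2 if it lies in AS2, otherwise keep the mode."""
        if contains(AS1, ars_center):
            return "MS1"                                # orthogonal coordinate mode
        if contains(AS2, ars_center):
            return "MS2"                                # polar coordinate mode
        return current_mode                             # neither: keep the current mode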
In this way, the region selection mode MS can be set automatically to the mode best suited to the field of view being displayed, so that the image of a subject located in the desired direction can easily be displayed on the display section 14.
Further, because no sensor is used, this method can be applied to any of the image processing systems 10, 20 and 30. It can also be applied to the image processing system 40, which uses a sensor. In this case, by automatically adjusting the positions or sizes of the regions AS1, AS2, ASm1 and ASm2 according to the tilt of the image pickup section 11 on the basis of the sensor signal ES from the gyro sensor 41, the switching of the region selection mode MS can be performed with the same characteristics as when the image pickup section 11 is installed 45 degrees upward with respect to the horizontal direction, even if the image pickup section 11 is not installed at that angle. Further, if the positions and sizes of the regions AS1, AS2, ASm1 and ASm2 corresponding to the tilt of the image pickup section 11 can be set freely, the switching characteristics of the region selection mode MS can also be set as desired.
It should be noted that when the region selection mode MS is switched automatically, providing a GUI display corresponding to each region selection mode makes it easy to determine which mode the region selection mode MS is set to. Figure 45 shows the GUI display and the directions in which the image region ARs moves when the region selection mode MS is switched automatically. If the orthogonal coordinate mode MS1 is set, as shown in Figure 45(A), "up", "down", "right" and "left" buttons are provided as the direction buttons Gua2, for example. Figure 45(B) shows the directions in which the image region ARs moves within the complete image Gcp when the direction buttons Gua2 are operated. If the polar coordinate mode MS2 is set, as shown in Figure 45(C), "toward center", "outward", "rotate right" and "rotate left" buttons are provided as the direction buttons Gud2, for example. Figure 45(D) shows the directions in which the image region ARs moves within the complete image Gcp when the direction buttons Gud2 are operated.
In this way, by providing a GUI display corresponding to the region selection mode, the user can easily determine which mode the region selection mode is set to. Further, when an image of a subject located in a desired direction is to be displayed on the display section 14, the user can easily select the appropriate direction button.
In addition, the automatic switching operation may be stopped depending on the pitch angle of the image pickup section 11 with respect to the vertical (or horizontal) direction, and the orthogonal coordinate mode MS1 or the polar coordinate mode MS2 may then be set directly. As shown in Figure 46, if the angle ψ is in the range "337.5 degrees ≤ ψ < 22.5 degrees" or "157.5 degrees ≤ ψ < 202.5 degrees", the polar coordinate mode MS2 is set regardless of the position of the image region ARs. If the angle ψ is in the range "67.5 degrees ≤ ψ < 112.5 degrees" or "247.5 degrees ≤ ψ < 292.5 degrees", the orthogonal coordinate mode MS1 is set regardless of the position of the image region ARs. If the angle ψ is in the range "22.5 degrees ≤ ψ < 67.5 degrees", "112.5 degrees ≤ ψ < 157.5 degrees", "202.5 degrees ≤ ψ < 247.5 degrees" or "292.5 degrees ≤ ψ < 337.5 degrees", a combined mode is set in which the orthogonal coordinate mode MS1 or the polar coordinate mode MS2 is selected automatically according to the position of the image region ARs as described above.
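The angle ranges of Figure 46 can be expressed, for illustration only, by the following Python sketch; the function name is an assumption, not taken from the original text.

    # Sketch (illustrative assumption): classification of the pitch angle psi according to
    # the ranges of Figure 46 into the fixed polar mode, the fixed orthogonal mode, or the
    # combined mode in which the mode follows the position of the image region ARs.
    def classify_pitch_angle(psi_deg: float) -> str:
        """Return "MS2", "MS1" or "combined" for a pitch angle given in degrees."""
        psi = psi_deg % 360.0
        if psi >= 337.5 or psi < 22.5 or 157.5 <= psi < 202.5:
            return "MS2"          # polar coordinate mode, independent of ARs
        if 67.5 <= psi < 112.5 or 247.5 <= psi < 292.5:
            return "MS1"          # orthogonal coordinate mode, independent of ARs
        return "combined"         # remaining 45-degree sectors: decide from the ARs position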
Figure 47 is a flowchart showing the switching operation among the region selection modes including the combined mode. The processing control section 135 performs angle detection to detect the pitch angle ψ of the image pickup section 11 (ST3301). Next, the processing control section 135 determines, on the basis of the detected pitch angle ψ, whether the combined mode is to be set (ST3302). If the combined mode is not set ("No"), the processing control section 135 determines, from the detection result for the angle ψ, which of the orthogonal coordinate mode MS1 and the polar coordinate mode MS2 is to be set as the region selection mode (ST3303). If the combined mode is set ("Yes" at ST3302), the processing control section 135 determines, from the position of the selected region, which of the orthogonal coordinate mode MS1 and the polar coordinate mode MS2 is to be set as the region selection mode (ST3304).
When the region selection mode has been determined at ST3303 or ST3304, the processing control section 135 sets the determined coordinate mode as the region selection mode (ST3305). The processing control section 135 then provides the GUI display corresponding to the region selection mode set at ST3305, and the process returns to ST3301 (ST3306). By switching the region selection mode in this way, an interface that is easy for the user to operate can be provided.
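For illustration, one pass of this switching loop might look like the following Python sketch, reusing classify_pitch_angle() and select_mode() from the sketches above; the state dictionary and the read_pitch_angle() and update_gui() callbacks are hypothetical placeholders.

    # Sketch (illustrative assumption): one pass of the switching loop of Figure 47.
    # state, read_pitch_angle() and update_gui() are hypothetical placeholders.
    def switching_step(state: dict, read_pitch_angle, update_gui) -> str:
        psi = read_pitch_angle()                         # ST3301: detect the pitch angle
        decision = classify_pitch_angle(psi)             # ST3302: combined mode or not?
        if decision != "combined":
            new_mode = decision                          # ST3303: mode from the angle alone
        else:
            new_mode = select_mode(state["ars_center"],  # ST3304: mode from the position
                                   state["mode"])        #         of the selected region
        state["mode"] = new_mode                         # ST3305: set the region selection mode
        update_gui(new_mode)                             # ST3306: show the matching GUI display
        return new_mode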
Figure 48 is a flowchart showing the operation performed when a direction button is operated. The processing control section 135 determines whether the region selection mode MS is set to the orthogonal coordinate mode MS1 (ST3401).
Here, if the orthogonal coordinate mode MS1 is set ("Yes"), the processing control section 135 determines, on the basis of the input information PS, whether a direction button has been operated (ST3402). If no direction button has been operated ("No"), the process returns to ST3401; if a direction button has been operated ("Yes"), the processing control section 135 determines whether the "right" button has been operated (ST3403).
If the "right" button has been operated ("Yes"), the processing control section 135 performs processing to move the selected region to the right, and the process returns to ST3401 (ST3404). If the "right" button has not been operated ("No" at ST3403), the processing control section 135 determines whether the "left" button has been operated (ST3405).
If the "left" button has been operated ("Yes"), the processing control section 135 performs processing to move the selected region to the left, and the process returns to ST3401 (ST3406). If the "left" button has not been operated ("No" at ST3405), the processing control section 135 determines whether the "up" button has been operated (ST3407).
If the "up" button has been operated ("Yes"), the processing control section 135 performs processing to move the selected region upward, and the process returns to ST3401 (ST3408). If the "up" button has not been operated ("No" at ST3407), the processing control section 135 determines whether the "down" button has been operated; if it has, processing is performed to move the selected region downward, and the process returns to ST3401 (ST3409).
If the polar coordinate mode MS2 is set ("No" at ST3401), the processing control section 135 determines, on the basis of the input information PS, whether a direction button has been operated (ST3410). If no direction button has been operated ("No"), the process returns to ST3401; if a direction button has been operated ("Yes"), the processing control section 135 determines whether the "rotate right" button has been operated (ST3411).
If the "rotate right" button has been operated ("Yes"), the processing control section 135 performs processing to rotate the selected region to the right, and the process returns to ST3401 (ST3412). If the "rotate right" button has not been operated ("No" at ST3411), the processing control section 135 determines whether the "rotate left" button has been operated (ST3413).
If the "rotate left" button has been operated ("Yes"), the processing control section 135 performs processing to rotate the selected region to the left, and the process returns to ST3401 (ST3414). If the "rotate left" button has not been operated ("No" at ST3413), the processing control section 135 determines whether the "toward center" button has been operated (ST3415).
If the "toward center" button has been operated ("Yes"), the processing control section 135 performs processing to move the selected region toward the center, and the process returns to ST3401 (ST3416). If the "toward center" button has not been operated ("No" at ST3415), the processing control section 135 determines whether the "outward" button has been operated; if it has, processing is performed to move the selected region outward, that is, in the direction away from the center, and the process returns to ST3401 (ST3417).
Through these processes, the selected region can easily be moved in the desired direction. That is, the image of a subject located in the desired direction can be displayed on the screen of the display section 14 with the distortion caused by the image pickup optical section 111 corrected.
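The button handling of Figure 48 amounts to a simple dispatch, sketched below for illustration only; the button identifiers follow Figure 45, and the move_region() callback is a hypothetical placeholder that applies the movement to the selected region.

    # Sketch (illustrative assumption): dispatch of a direction-button press to a movement
    # of the selected region, as in the flowchart of Figure 48.
    from typing import Callable, Tuple, Union

    Move = Union[Tuple[int, int], Tuple[str, int]]

    ORTHOGONAL_BUTTONS = {                     # mode MS1: movement along the coordinate axes
        "up": (0, -1), "down": (0, 1), "left": (-1, 0), "right": (1, 0),
    }
    POLAR_BUTTONS = {                          # mode MS2: movement along radius or argument
        "toward_center": ("radius", -1), "outward": ("radius", +1),
        "rotate_right": ("angle", +1), "rotate_left": ("angle", -1),
    }

    def handle_direction_button(mode: str, button: str,
                                move_region: Callable[[Move], None]) -> None:
        table = ORTHOGONAL_BUTTONS if mode == "MS1" else POLAR_BUTTONS
        if button in table:                    # buttons of the other mode are ignored
            move_region(table[button])         # apply the corresponding movement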
The image processing system of the present embodiment, however, has the plurality of display modes MH described above, so that when, for example, the selected-image display mode MH2 is set, the user cannot tell in which direction of the field of view the selected region is set. To confirm how the selected region is set, an operation would therefore be needed to change the display mode MH to the complete-image display mode MH1 or the dual display mode MH3. Accordingly, when the selected region is switched or the region selection mode MS is switched, the display is changed so that the complete image Gcp is displayed for at least a predetermined time, allowing the user to confirm the selected region easily without changing the display mode. It should be noted that the configuration of the image processing system in this embodiment is the same as that of the image processing system 40 shown in Figure 32 described above, and its description is therefore omitted.
Figures 49 and 50 show the operation of displaying the complete image Gcp: Figure 49 shows how the display mode MH is changed in response to switching of the selected region, and Figure 50 shows how the display mode MH is changed in response to switching of the region selection mode MS.
If the selected region is switched by an instruction, or the region selection mode MS is switched, while a display mode in which the complete image Gcp is not displayed is set, the image processing section 13 of the image processing system automatically changes the display so that, regardless of the set display mode, the complete image Gcp is displayed for a predetermined time.
Figure 49(A) shows a case in which an image is displayed in the selected-image display mode MH2. If the selected region is switched in response to a selected-region switching command, processing is performed to combine the complete image Gcp with the image displayed in the set display mode, and the combined image is displayed as the new display image. For example, the display mode MH is changed to the dual display mode MH3 so that the complete image Gcp is also displayed, as shown in Figure 49(B). Then, after the predetermined time has elapsed, the combining of the complete image Gcp is ended and the complete image Gcp is erased from the display screen. For example, the display mode MH returns to the selected-image display mode MH2 and the complete image Gcp is erased, as shown in Figure 49(C). Further, if the selected region is returned to its original position in response to a selected-region switching command, the display mode MH is again changed to the dual display mode MH3 for the predetermined time, and the display proceeds in the order of Figures 49(C), 49(D) and 49(A).
Figure 50(A) shows a case in which an image is displayed in the selected-image display mode MH2 with the region selection mode set to the polar coordinate mode MS2. Here, if the region selection mode MS is set to the orthogonal coordinate mode MS1, processing is performed to combine the complete image Gcp with the image displayed in the set display mode, and the combined image is displayed as the new display image. For example, the display mode MH is changed to the dual display mode MH3 and the complete image Gcp is also displayed, as shown in Figure 50(B). Then, after the predetermined time has elapsed, the combining of the complete image Gcp is ended and the complete image Gcp is erased from the display screen. For example, the display mode MH returns to the selected-image display mode MH2 and the complete image Gcp is erased, as shown in Figure 50(C).
Further, if the region selection mode MS is switched from the orthogonal coordinate mode MS1 back to the polar coordinate mode MS2, the display mode MH is changed to the dual display mode MH3 for the predetermined time, and the display proceeds in the order of Figures 50(C), 50(D) and 50(A). It should be noted that the predetermined time may be, but is not limited to, three seconds, five seconds or the like.
Further, the image processing section 13 may display the complete image Gcp not only when the region selection mode MS is switched but also when the selected region is switched by an instruction, and may change the display to a mode in which only the selected-region display image Gsc is displayed after the predetermined time has elapsed.
Further, the image processing section 13 may change the display mode for a predetermined period not only when the selected region is switched in response to a selected-region switching command or when the region selection mode MS is switched, but also when the image of the image region ARs corresponding to the selected region changes in response to a change in the image pickup direction. For example, if the image pickup direction of the image pickup section 11 changes, the wide field-of-view image Gc changes, so that the selected-region display image Gsc changes even though the position of the image region ARs on the sensor surface of the image pickup element 112 does not change. In other words, a state equivalent to a change of the selected region results. Therefore, if it is determined on the basis of the sensor signal ES from the gyro sensor 41 that the image pickup area has changed, for example, the display mode is changed for the predetermined period so that the complete image Gcp is displayed. The user can thus easily determine how the selected region should be switched by an instruction so that the image of the subject located in the desired direction is displayed on the display section 14.
Figure 51 is a flowchart showing the operation of the image processing system when the display mode is changed as shown in Figure 49 or 50. The processing control section 135 in the image processing section 13 determines whether the selected region has been switched (moved) (ST3601). That is, it determines whether input information PS indicating an operation of the direction buttons Gua2 or Gud2 or of the direction button Gub2 performed by the user for the purpose of a selected-region switching command has been supplied from the input section 12, or, when the selected region is switched automatically, whether switching of the selected region has occurred on the basis of the tracking information described above. If the selected region has been switched ("Yes"), display control information JH is supplied to the image output processing section 134 so that the image output processing section 134 can display, on the display section 14, a combined image of the complete image Gcp and the selected-region display image Gsc (ST3606). The processing control section 135 then determines whether the predetermined time has elapsed since the output of the combined image started (ST3607); if the predetermined time has elapsed ("Yes"), display control information JH is supplied to the image output processing section 134 to change the displayed image from the combined image to the selected-region display image Gsc from which the complete image Gcp has been erased (ST3608). That is, with the processing control section 135 controlling the image output processing section 134 in this way, the display mode MH is changed from the dual display mode MH3 back to the selected-image display mode MH2 after the predetermined time has elapsed, as shown in Figures 49 and 50.
If the processing control section 135 in the image processing section 13 determines at ST3601 that the selected region has not been switched ("No"), it determines whether the display mode MH or the region selection mode MS described above has been switched, that is, whether input information PS indicating an operation of the "select" button Gua1 or Gub1 or the "menu" button Gub3 performed by the user for the purpose of mode switching has been supplied from the input section 12 (ST3602). If a mode has been switched ("Yes"), ST3606 and the subsequent steps described above are performed. If no mode has been switched ("No"), the image processing section 13 determines whether the attitude (installation angle) of the image pickup section 11 has changed (ST3603). That is, the processing control section 135 determines, on the basis of the sensor signal ES indicating the pitch angle detected by the gyro sensor 41, the difference between, for example, the measurement result of the gyro sensor 41 shown in Figure 32 and a predetermined initial value. If the tilt angle of the image pickup section 11 has changed ("Yes"), the image processing section 13 performs ST3606 and the subsequent steps described above.
Further, if it is determined at ST3603 that the tilt angle of the image pickup section 11 has not changed ("No"), the processing control section 135 determines whether any user operation has been performed on the input section 12 (ST3604); if a user operation has been performed ("Yes"), ST3606 and the subsequent steps described above are performed, and if no user operation has been performed ("No"), the processing from ST3601 onward is repeated. It should be noted that if such a user operation relates to switching of the display mode MH or the region selection mode MS described above or to switching of the selected region, it is handled by the processing at ST3601 or ST3602, so that only user operations other than these are determined at ST3604.
It should be noted that although in Figure 51 the determinations are made in the order of whether the selected region has been switched, whether a mode has been switched, whether the attitude has changed and whether an operation has been input when the selected region is determined not to have been switched, it is also possible, as in ST3605 shown in Figure 52, to determine in a single step whether any of a selected-region switch, a mode switch, an attitude change and an operation input has occurred, and, if any of them is determined to be "Yes", to perform the processing of ST3606 and the subsequent steps so that the complete image Gcp is displayed for the predetermined time.
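The control flow of Figures 51 and 52 can be summarised, for illustration only, by the following Python sketch; the event and display callbacks are hypothetical placeholders, and the elapsed time is an example value in line with the three to five seconds mentioned above.

    # Sketch (illustrative assumption): the display control flow of Figures 51 and 52.
    # events and display are hypothetical dictionaries of callbacks.
    import time

    PREDETERMINED_TIME = 3.0                                   # seconds (example value)

    def display_control_loop(events: dict, display: dict) -> None:
        while True:
            triggered = (events["selected_region_switched"]()  # ST3601
                         or events["mode_switched"]()          # ST3602: MH or MS switched
                         or events["attitude_changed"]()       # ST3603: gyro sensor signal ES
                         or events["user_operation"]())        # ST3604: other user operations
            if triggered:
                display["show_combined"]()                     # ST3606: Gcp and Gsc together
                time.sleep(PREDETERMINED_TIME)                 # ST3607: wait the set time
                display["show_selected_only"]()                # ST3608: erase Gcp again
            else:
                time.sleep(0.1)                                # poll the events again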
Through the above processing, when the selected region is switched, when the display mode MH or the region selection mode MS is switched, when the attitude of the image pickup section 11 is changed, or when an input operation is performed by the user, the display is provided so that the complete image Gcp is displayed for at least the predetermined time. Therefore, when the selected region or a mode is switched, the user can easily confirm, from the highlight image Gs of the image region ARs within the complete image Gcp, how the selected region is set, and, after the confirmation, can observe the selected-region display image Gsc without it being obstructed by the complete image.
It should be noted that the reverse order is also possible: instead of first displaying the complete image Gcp in the dual display mode MH3 and then, after the predetermined time has elapsed, switching to the selected-image display mode MH2 in which the complete image Gcp is not displayed, the image processing section 13 may, when the display mode MH or the region selection mode MS is switched, first output only the selected-region display image Gsc in the selected-image display mode MH2 and, after the predetermined time has elapsed, generate the image data DVd so that the combined image of the dual display mode MH3, in which the complete image Gcp is combined, is displayed. In this case, the newly displayed selected-region display image Gsc can be confirmed on the display section 14 without any blind spot, and after the predetermined time has elapsed the current selected region can be confirmed.
Figure 53 is a schematic diagram showing another form of the display mode switching processing. When displaying the combined image described above, the image processing section 13 may provide, as the combined image, a screen display in which the selected-region display image is reduced in size so that it does not overlap the complete image Gcp. Alternatively, it may output the complete image Gcp by combining it semi-transparently with the selected-region display image Gsc. In either case, the image processing section 13 can perform the processing so that the user can observe the selected-region display image Gsc without any blind spot while the complete image Gcp is displayed.
The image processing section 13 may also change the above-mentioned predetermined elapsed time dynamically rather than fixing it to a constant value. For example, when the region selection mode MS is switched, the position of the selected region may change greatly, so the predetermined elapsed time in that case may be made longer than the elapsed time used when the selected region is switched within the same region selection mode MS or when the attitude of the image pickup section 11 changes, allowing the user to confirm the selected region reliably. Further, for example, when the number of subdivided regions in the divided display mode MH4 is increased, confirming the plural selected regions may take longer; therefore, if the user performs an operation that increases the number of subdivided regions (the number of highlight images Gs within the complete image Gcp), the predetermined elapsed time may be made longer than the predetermined elapsed time used when switching to another mode or when the attitude changes, so that the user can confirm the selected regions reliably.
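For illustration only, the dynamic choice of the elapsed time could be sketched as follows; the specific durations and increments are placeholder values, not taken from the original text.

    # Sketch (illustrative assumption): dynamic choice of how long the complete image Gcp
    # stays on screen, depending on what triggered the display. Durations are placeholders.
    def display_duration(trigger: str, num_subdivided_regions: int = 1) -> float:
        """Return the display time of the complete image Gcp in seconds."""
        if trigger == "region_selection_mode_switch":
            base = 5.0        # the selected region may move a lot, so allow more time
        else:
            base = 3.0        # selected-region switch, attitude change, and so on
        # More highlight images Gs to check means more time is needed to confirm them.
        return base + 0.5 * max(0, num_subdivided_regions - 1)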
Industrial Applicability
In the present invention, in accordance with switching of a selected region indicating a part of the field of view represented by image data, a display mode in which a field-of-view image that makes the selected region identifiable is used replaces a display mode in which such a field-of-view image is not used. The invention is therefore well suited to cases in which a desired area within a picked-up wide field-of-view image is set as the selected region and, for example, the image of that selected region is to be confirmed.

Claims (9)

1. An image processing device that processes image data containing distortion of an image pickup optical section, the image data being obtained by picking up an optical image from a subject through the image pickup optical section, which introduces the distortion, the image processing device comprising:
a data output section that outputs image data of a display image corresponding to a display mode by using a field-of-view image in which a selected region indicating a part of the field of view represented by the image data is made identifiable, together with a distortion-corrected image of the selected region;
a display mode setting section that sets the display mode; and
a control section that, in accordance with switching of the selected region, changes a first display mode, in which the field-of-view image that makes the selected region identifiable is not used, to a second display mode, in which the field-of-view image that makes the selected region identifiable is used.
2. The image processing device according to claim 1, wherein, in accordance with switching of the selected region, the control section changes the first display mode to the second display mode for a predetermined period of time and changes it back to the first display mode after the predetermined period of time has elapsed.
3. The image processing device according to claim 1, further comprising an input section through which a switching command for the selected region is given,
wherein the control section switches the selected region in accordance with the switching command.
4. The image processing device according to claim 1, wherein the control section has a first region selection mode, in which the selected region is moved in an axis direction of an orthogonal coordinate system on the basis of the switching command, and a second region selection mode, in which the selected region is moved in the direction of the argument of a polar coordinate system on the basis of the switching command, and changes the first display mode to the second display mode in accordance with switching of the selected region.
5. The image processing device according to claim 1, further comprising a direction detection sensor that detects an image pickup direction,
wherein, when it is determined on the basis of a sensor signal from the direction detection sensor that the image pickup direction has changed and the selected region has thereby changed, the control section changes the first display mode to the second display mode in response to the change, so that the display mode is changed for a predetermined time.
6. An image processing method for processing image data containing distortion of an image pickup optical section, the image data being obtained by picking up an optical image from a subject through the image pickup optical section, which introduces the distortion, the image processing method comprising:
a data output step of outputting image data of a display image corresponding to a display mode by using a field-of-view image in which a selected region indicating a part of the field of view represented by the image data is made identifiable, together with a distortion-corrected image of the selected region;
a display mode setting step of setting the display mode; and
a display mode changing step of changing, in accordance with switching of the selected region, a first display mode, in which the field-of-view image that makes the selected region identifiable is not used, to a second display mode, in which the field-of-view image that makes the selected region identifiable is used.
7. A program that causes a computer to execute image processing on image data containing distortion of an image pickup optical section, the image data being obtained by picking up an optical image from a subject through the image pickup optical section, which introduces the distortion, the image processing comprising:
a data output step of outputting image data of a display image corresponding to a display mode by using a field-of-view image in which a selected region indicating a part of the field of view represented by the image data is made identifiable, together with a distortion-corrected image of the selected region;
a display mode setting step of setting the display mode; and
a display mode changing step of changing, in accordance with switching of the selected region, a first display mode, in which the field-of-view image that makes the selected region identifiable is not used, to a second display mode, in which the field-of-view image that makes the selected region identifiable is used.
8. A recording medium on which is recorded a program that causes a computer to execute image processing on image data containing distortion of an image pickup optical section, the image data being obtained by picking up an optical image from a subject through the image pickup optical section, which introduces the distortion, the image processing comprising:
a data output step of outputting image data of a display image corresponding to a display mode by using a field-of-view image in which a selected region indicating a part of the field of view represented by the image data is made identifiable, together with a distortion-corrected image of the selected region;
a display mode setting step of setting the display mode; and
a display mode changing step of changing, in accordance with switching of the selected region, a first display mode, in which the field-of-view image that makes the selected region identifiable is not used, to a second display mode, in which the field-of-view image that makes the selected region identifiable is used.
9. An image pickup device that generates image data containing distortion of an image pickup optical section, the image data being obtained by picking up an optical image from a subject through the image pickup optical section, which introduces the distortion, the image pickup device comprising:
a data output section that outputs image data of a display image corresponding to a display mode by using a field-of-view image in which a selected region indicating a part of the field of view represented by the image data is made identifiable, together with a distortion-corrected image of the selected region;
a display mode setting section that sets the display mode; and
a control section that, in accordance with switching of the selected region, changes a first display mode, in which the field-of-view image that makes the selected region identifiable is not used, to a second display mode, in which the field-of-view image that makes the selected region identifiable is used.
CN2006800421517A 2005-11-11 2006-11-10 Image processing device, image processing method and image picking device Expired - Fee Related CN101305596B (en)

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
JP2005327749 2005-11-11
JP327749/2005 2005-11-11
JP176915/2006 2006-06-27
JP2006176915 2006-06-27
PCT/JP2006/322499 WO2007055336A1 (en) 2005-11-11 2006-11-10 Image processing device, image processing method, program thereof, recording medium containing the program, and imaging device

Publications (2)

Publication Number Publication Date
CN101305596A true CN101305596A (en) 2008-11-12
CN101305596B CN101305596B (en) 2010-06-16

Family

ID=40114397

Family Applications (2)

Application Number Title Priority Date Filing Date
CN2006800421517A Expired - Fee Related CN101305596B (en) 2005-11-11 2006-11-10 Image processing device, image processing method and image picking device
CN2006800421038A Expired - Fee Related CN101305595B (en) 2005-11-11 2006-11-10 Image processing device and image processing method

Family Applications After (1)

Application Number Title Priority Date Filing Date
CN2006800421038A Expired - Fee Related CN101305595B (en) 2005-11-11 2006-11-10 Image processing device and image processing method

Country Status (1)

Country Link
CN (2) CN101305596B (en)

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101969531A (en) * 2009-07-27 2011-02-09 索尼公司 Composition control device, imaging system, composition control method, and program
CN102265599A (en) * 2009-01-20 2011-11-30 歌乐株式会社 obstacle detection display device
CN103841879A (en) * 2011-09-26 2014-06-04 奥林巴斯株式会社 Image processing device for use with endoscope, endoscope, and image processing method
CN104994288A (en) * 2015-06-30 2015-10-21 广东欧珀移动通信有限公司 Shooting method and user terminal
CN106133794A (en) * 2014-03-18 2016-11-16 株式会社理光 Information processing method, messaging device and program
CN106716985A (en) * 2014-09-08 2017-05-24 富士胶片株式会社 Imaging control device, imaging control method, camera system, and program
CN108243310A (en) * 2016-12-26 2018-07-03 佳能株式会社 Information processing unit and information processing method
CN110611749A (en) * 2019-09-30 2019-12-24 深圳市大拿科技有限公司 Image processing method and device
CN111050152A (en) * 2019-12-30 2020-04-21 联想(北京)有限公司 Image processing method, display device and electronic device
CN111095923A (en) * 2017-09-08 2020-05-01 索尼互动娱乐股份有限公司 Calibration device, calibration system, and calibration method

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2013214947A (en) * 2012-03-09 2013-10-17 Ricoh Co Ltd Image capturing apparatus, image capturing system, image processing method, information processing apparatus, and program
JP2015177467A (en) * 2014-03-17 2015-10-05 キヤノン株式会社 Imaging apparatus and control method thereof
CN105141827B (en) 2015-06-30 2017-04-26 广东欧珀移动通信有限公司 Distortion correction method and terminal
CN107657595B (en) * 2015-06-30 2020-02-14 Oppo广东移动通信有限公司 Distortion correction method, mobile terminal and related medium product
JP6627352B2 (en) * 2015-09-15 2020-01-08 カシオ計算機株式会社 Image display device, image display method, and program
CN106600546B (en) * 2016-11-14 2020-12-22 深圳市Tcl高新技术开发有限公司 Distortion correction method and system for ultra-wide-angle camera
CN107610044A (en) * 2017-08-29 2018-01-19 歌尔科技有限公司 Image processing method, computer-readable recording medium and virtual reality helmet
CN112406706B (en) * 2020-11-20 2022-07-22 上海华兴数字科技有限公司 Vehicle scene display method and device, readable storage medium and electronic equipment
CN113377077B (en) * 2021-07-08 2022-09-09 四川恒业硅业有限公司 Intelligent manufacturing digital factory system

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4378994B2 (en) * 2003-04-30 2009-12-09 ソニー株式会社 Image processing apparatus, image processing method, and imaging apparatus
US7596286B2 (en) * 2003-08-06 2009-09-29 Sony Corporation Image processing apparatus, image processing system, imaging apparatus and image processing method
JP2005086279A (en) * 2003-09-04 2005-03-31 Equos Research Co Ltd Imaging apparatus and vehicle provided with imaging apparatus
JP2005266178A (en) * 2004-03-17 2005-09-29 Sharp Corp Driver for display device, the display device and method for driving the display device

Cited By (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102265599A (en) * 2009-01-20 2011-11-30 歌乐株式会社 obstacle detection display device
CN101969531B (en) * 2009-07-27 2012-12-26 索尼公司 Composition control device, imaging system, composition control method
CN101969531A (en) * 2009-07-27 2011-02-09 索尼公司 Composition control device, imaging system, composition control method, and program
CN103841879A (en) * 2011-09-26 2014-06-04 奥林巴斯株式会社 Image processing device for use with endoscope, endoscope, and image processing method
CN103841879B (en) * 2011-09-26 2017-03-22 奥林巴斯株式会社 Image processing device for use with endoscope, endoscope, and image processing method
CN106133794B (en) * 2014-03-18 2021-11-23 株式会社理光 Information processing method, information processing apparatus, and program
CN106133794A (en) * 2014-03-18 2016-11-16 株式会社理光 Information processing method, messaging device and program
CN106716985B (en) * 2014-09-08 2019-07-30 富士胶片株式会社 Video camera controller, camera shooting control method and camera system
CN106716985A (en) * 2014-09-08 2017-05-24 富士胶片株式会社 Imaging control device, imaging control method, camera system, and program
CN104994288B (en) * 2015-06-30 2018-03-27 广东欧珀移动通信有限公司 A kind of photographic method and user terminal
CN104994288A (en) * 2015-06-30 2015-10-21 广东欧珀移动通信有限公司 Shooting method and user terminal
CN108243310A (en) * 2016-12-26 2018-07-03 佳能株式会社 Information processing unit and information processing method
CN111095923A (en) * 2017-09-08 2020-05-01 索尼互动娱乐股份有限公司 Calibration device, calibration system, and calibration method
CN111095923B (en) * 2017-09-08 2022-01-04 索尼互动娱乐股份有限公司 Calibration device, calibration system, and calibration method
US11232593B2 (en) 2017-09-08 2022-01-25 Sony Interactive Entertainment Inc. Calibration apparatus, calibration system, and calibration method
CN110611749A (en) * 2019-09-30 2019-12-24 深圳市大拿科技有限公司 Image processing method and device
CN111050152A (en) * 2019-12-30 2020-04-21 联想(北京)有限公司 Image processing method, display device and electronic device

Also Published As

Publication number Publication date
CN101305596B (en) 2010-06-16
CN101305595A (en) 2008-11-12
CN101305595B (en) 2011-06-15

Similar Documents

Publication Publication Date Title
CN101305596B (en) Image processing device, image processing method and image picking device
EP1954029B1 (en) Image processing device, image processing method, program thereof, and recording medium containing the program
US11558431B2 (en) Communication terminal, communication system, communication method, and display method
US10298839B2 (en) Image processing apparatus, image processing method, and image communication system
US8994785B2 (en) Method for generating video data and image photographing device thereof
US9124867B2 (en) Apparatus and method for displaying images
US20180097682A1 (en) Communication terminal, method for controlling display of image, and non-transitory computer-readable storage medium
CN101241590B (en) Image processing apparatus and method, program, and recording medium
KR20010072917A (en) All-around video output method and device
CN102111542A (en) Image processing apparatus, imaging apparatus, image processing method, and program
JP2020149635A (en) Imaging apparatus, image communication system, image processing method, and program
US11743590B2 (en) Communication terminal, image communication system, and method for displaying image
US11818492B2 (en) Communication management apparatus, image communication system, communication management method, and recording medium
US20220103751A1 (en) Communication management apparatus, image communication system, communication management method, and recording medium
CN103262561A (en) Video distribution system, and video distribution method
US10587812B2 (en) Method of generating a digital video image using a wide-angle field of view lens
WO2023095642A1 (en) Image processing device, image processing method, and program
JP2007148488A (en) Image conversion device, image conversion method and image conversion program

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
C41 Transfer of patent application or patent right or utility model
TR01 Transfer of patent right

Effective date of registration: 20170111

Address after: Kanagawa Japan Atsugi Asahi 4-14-1

Patentee after: SONY SEMICONDUCTOR SOLUTIONS Corp.

Address before: Tokyo, Japan

Patentee before: Sony Corp.

CF01 Termination of patent right due to non-payment of annual fee
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20100616

Termination date: 20211110