WO2008050374A1 - Landscape imaging apparatus, landscape imaging method, landscape imaging program, and computer-readable recording medium
- Publication number
- WO2008050374A1 (PCT/JP2006/318980)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- landscape
- image
- imaging
- cell
- determined
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
- G06T7/73—Determining position or orientation of objects or cameras using feature-based methods
- G06T7/74—Determining position or orientation of objects or cameras using feature-based methods involving reference images or patches
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30248—Vehicle exterior or interior
Definitions
- the present invention relates to a landscape imaging apparatus that captures a landscape, a landscape imaging method, a landscape imaging program, and a computer-readable recording medium.
- the use of the present invention is not limited to the above-described landscape imaging apparatus, landscape imaging method, landscape imaging program, and computer-readable recording medium.
- Patent Document 1 Japanese Unexamined Patent Publication No. 2003-198904
- a landscape imaging apparatus includes an imaging unit that captures a landscape image in an arbitrary direction; a dividing unit that divides the landscape image in the arbitrary direction captured by the imaging unit (hereinafter referred to as the "initial landscape image") into a plurality of cells; a calculating unit that calculates, for each cell divided by the dividing unit, the distance to the part of the initial landscape corresponding to that cell; and a determining unit that determines an imaging direction using the distances calculated by the calculating unit.
- a landscape imaging method includes an imaging step of capturing a landscape image in an arbitrary direction; a dividing step of dividing the landscape image in the arbitrary direction captured in the imaging step (hereinafter referred to as the "initial landscape image") into a plurality of cells; a calculating step of calculating, for each cell divided in the dividing step, the distance to the part of the initial landscape corresponding to that cell; and a determining step of determining an imaging direction using the distances calculated in the calculating step.
- a landscape imaging program according to an invention of claim 11 causes a computer to execute the landscape imaging method according to claim 10.
- a computer-readable recording medium according to the invention of claim 12 records the landscape imaging program according to claim 11.
- FIG. 1 is a block diagram showing an example of a functional configuration of a landscape imaging apparatus according to the present embodiment.
- FIG. 2 is a flowchart showing the contents of processing of the landscape imaging apparatus according to the present embodiment.
- FIG. 3 is a block diagram showing an example of a hardware configuration of an in-vehicle device according to the present embodiment.
- FIG. 4 is an explanatory view showing an example of a landscape image used in the present embodiment.
- FIG. 5 is an explanatory view showing an example of the division of a landscape image according to the present embodiment.
- FIG. 6 is an explanatory diagram showing an example of calculating the distance of each cell and determining a distant view in a landscape image according to the present embodiment.
- FIG. 7 is an explanatory view showing the determination of the driving angle of the camera according to the present embodiment.
- FIG. 8 is an explanatory view showing the view index calculation according to the present embodiment.
- FIG. 9 is a flowchart showing the contents of the processing of the in-vehicle device according to the present embodiment.
- FIG. 10 is a flowchart showing the contents of the drive direction determination processing (step S905 in FIG. 9) in the in-vehicle device according to the present embodiment.
- FIG. 11 is a flowchart showing the contents of the view index calculation processing (step S908 in FIG. 9) in the in-vehicle device according to the present embodiment.
- FIG. 12 is a flowchart showing the sky determination processing (step S in FIG.
- FIG. 13 is a flowchart showing the sea determination processing (step S in FIG.
- FIG. 1 is a block diagram showing an example of a functional configuration of the landscape imaging apparatus according to the present embodiment.
- the landscape imaging apparatus 100 includes an imaging unit 101, a dividing unit 102, a calculating unit 103, a determining unit 104, a determination unit 105, and a storage unit 106.
- the imaging unit 101 captures a landscape image in an arbitrary direction.
- the imaging unit 101 may take an image of the front of the moving body using a movable in-vehicle camera mounted on the moving body.
- the imaging unit 101 may capture a landscape image in the imaging direction determined by the determining unit 104 described later, for example.
- for example, a driving unit (not shown) may be driven so that the in-vehicle camera is directed in the imaging direction determined by the determining unit 104.
- the dividing unit 102 divides a landscape image in an arbitrary direction imaged by the imaging unit 101 (hereinafter referred to as “initial landscape image”) into a plurality of cells. Further, the dividing unit 102 may divide the captured landscape image captured by the imaging unit 101 into a plurality of cells.
- the cell is an image piece of a predetermined size in an image such as an initial landscape image or a captured landscape image.
- the predetermined size is, for example, about 30 × 30 pixels, and may be set according to the accuracy and processing speed desired of the calculating unit 103 and the determination unit 105 described later. If the entire image cannot be divided into uniform cells because of the relationship between the image size and the cell size, the edges of the image may be excluded from division.
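As an illustrative sketch of this cell division (Python with NumPy; the function name and the remainder handling at the edges are assumptions consistent with the text):

```python
import numpy as np

def divide_into_cells(image: np.ndarray, cell_size: int = 30):
    """Split an H x W image into cell_size x cell_size pieces.

    Edge rows/columns that do not fill a whole cell are excluded,
    as the text suggests for non-uniform remainders.
    """
    h, w = image.shape[:2]
    rows, cols = h // cell_size, w // cell_size
    cells = [
        image[r * cell_size:(r + 1) * cell_size,
              c * cell_size:(c + 1) * cell_size]
        for r in range(rows) for c in range(cols)
    ]
    return cells, rows, cols

# Example: a 100 x 130 image divides into 3 x 4 = 12 cells of 30 x 30;
# the leftover 10-pixel margins at the bottom and right are ignored.
img = np.zeros((100, 130), dtype=np.uint8)
cells, rows, cols = divide_into_cells(img)
```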
- the calculation unit 103 calculates a distance to a part of the initial scenery corresponding to each cell divided by the dividing unit 102 among the initial scenery corresponding to the initial scenery image.
- the distance to a part of the initial landscape is, for example, the relative distance between that part and a reference position such as the imaging unit 101 or the landscape imaging apparatus 100; the calculating unit 103 thus calculates the distance to the part of the landscape corresponding to each of the plurality of cells.
- the distance may be calculated, for example, by computing an optical flow between two initial landscape images captured in chronological order and, based on the calculated optical flow, deriving the distance to the part of the initial landscape corresponding to the plurality of pixels included in each cell.
- the average of the distances to the parts of the initial landscape corresponding to the plurality of pixels included in each cell is calculated as the distance to the part of the initial landscape corresponding to that cell.
- the calculated distance may be treated as infinite when it is very large compared with other parts of the landscape, for example when the part of the initial landscape corresponds to the sky.
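The flow-to-distance step is not specified in detail here; the following is a hedged sketch, assuming distance is inversely proportional to per-pixel optical-flow magnitude for a translating camera, with near-zero flow treated as infinite (e.g. sky). The scale factor and epsilon are hypothetical:

```python
import numpy as np

def cell_distances(flow_mag: np.ndarray, cell: int = 30,
                   focal_scale: float = 100.0, eps: float = 1e-3):
    """Per-cell distance estimate from per-pixel optical-flow magnitudes.

    Assumes (hypothetically) distance ~ focal_scale / flow for a
    translating camera; near-zero flow (e.g. sky) maps to infinity.
    Each cell's distance is the mean of its pixels' distances.
    """
    dist = np.where(flow_mag > eps,
                    focal_scale / np.maximum(flow_mag, eps),
                    np.inf)
    h, w = dist.shape
    rows, cols = h // cell, w // cell
    out = np.empty((rows, cols))
    for r in range(rows):
        for c in range(cols):
            out[r, c] = dist[r*cell:(r+1)*cell, c*cell:(c+1)*cell].mean()
    return out

flow = np.full((60, 60), 5.0)   # nearby scenery: large flow
flow[:30, :] = 0.0              # top half: no flow, e.g. sky
d = cell_distances(flow)        # 2 x 2 grid of cell distances
```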
- the determining unit 104 determines the imaging direction using the distance calculated by the calculating unit 103, for example. Specifically, for example, the determination unit 104 calculates an angle, a distance, and the like for driving a driving unit (not shown) to direct the imaging unit 101 so that a landscape image in the determined imaging direction can be captured. It is good.
- the determining unit 104 may determine the imaging direction based on the number of cells whose part of the initial landscape is determined to be a distant view by the determination unit 105 described later, within an image region consisting of one or more cells in the initial landscape image. More specifically, for example, the imaging direction may be determined as the direction of the image region containing the most cells determined to be a distant view, i.e., the direction with the best view.
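The region-based direction choice might be sketched as follows, assuming the initial landscape image is split into vertical strips and the strip with the most distant-view cells is chosen; the strip count and the mapping of a strip to a camera pan direction are assumptions:

```python
import numpy as np

def choose_direction(distant: np.ndarray, n_regions: int = 3):
    """Pick the imaging direction as the vertical strip (region) of the
    initial landscape image containing the most cells judged 'distant'.

    `distant` is a rows x cols boolean grid of per-cell distant-view
    flags. Returns the region index (0 = leftmost) and its count.
    """
    cols = distant.shape[1]
    width = cols // n_regions
    counts = [int(distant[:, i*width:(i+1)*width].sum())
              for i in range(n_regions)]
    best = int(np.argmax(counts))
    return best, counts[best]

grid = np.zeros((4, 6), dtype=bool)
grid[:, 4:] = True            # right third is all distant view
region, count = choose_direction(grid)
```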
- the determination unit 105 determines whether a part of the initial landscape corresponding to each cell is a distant view.
- a distant view means, for example, a part of the initial landscape corresponding to an arbitrary subject portion whose distance from the apparatus is greater than a predetermined distance.
- This predetermined distance may be, for example, a predetermined distance set in advance, or a distance that can be variably set according to the type of scenery or the preference of the operator.
- for example, if the calculated distance for a cell is equal to or greater than a predetermined value, the determination unit 105 determines that the part of the initial landscape corresponding to that cell is a distant view; if it is less than the predetermined value, the part is determined to be a close view (for example, a nearby road or a preceding vehicle). One or more intermediate distance classes may also be defined between the distant view and the close view.
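A minimal sketch of this threshold classification, with hypothetical threshold values:

```python
def classify_view(distance: float, far: float = 100.0,
                  near: float = 20.0) -> str:
    """Classify a cell by its estimated distance (thresholds hypothetical).

    >= far    -> 'distant'       (horizon, sky, ...)
    <  near   -> 'close'         (nearby road, preceding vehicle, ...)
    otherwise -> 'intermediate'  (one optional middle class)
    """
    if distance >= far:
        return 'distant'
    if distance < near:
        return 'close'
    return 'intermediate'

# Infinite distance (e.g. sky) is naturally classified as distant.
labels = [classify_view(d) for d in (float('inf'), 150.0, 50.0, 5.0)]
```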
- the determination unit 105 determines whether a part of the captured landscape corresponding to each cell divided by the dividing unit 102, among the captured landscape corresponding to the captured landscape image, falls under at least one of the sky, the sea, a mountain, and a night view.
- by setting at least one of the sky, the sea, mountains, and night views as a determination target in advance, for example, landscape images corresponding to such distant views can be automatically stored by the storage unit 106 described later while the vehicle is running. Alternatively, for example, the music and video played in the car can be switched to match the sky, sea, mountains, or night view.
- for example, the determination unit 105 may determine whether a part of the captured landscape is the sky from the variance of the lightness of the pixels included in cells positioned in the upper part of the image.
- the determination unit 105 may make this determination using color information corresponding to the cell, for example HLS color information, which includes hue, lightness, and saturation as parameters.
- specifically, for example, when the lightness variance values of a cell to be determined and of all eight cells located in its vicinity (that is, its 8-neighborhood) are equal to or less than a predetermined value (for example, 10), the determination unit 105 determines that the part of the captured landscape corresponding to that cell is the sky. Conversely, when the lightness variance value of any one of the cell and its eight neighboring cells is larger than the predetermined value, the determination unit 105 determines that the part of the captured landscape corresponding to that cell is not the sky.
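The 8-neighborhood variance test might be sketched as follows; the handling of border cells (which lack some neighbors) is an assumption, since it is not specified here:

```python
import numpy as np

def sky_cells(var: np.ndarray, thresh: float = 10.0) -> np.ndarray:
    """A cell is judged 'sky' when its lightness variance and those of
    all eight neighboring cells are at or below `thresh` (example 10).

    Border cells are conservatively treated as not sky here; how the
    original handles borders is not specified.
    """
    rows, cols = var.shape
    sky = np.zeros_like(var, dtype=bool)
    for r in range(1, rows - 1):
        for c in range(1, cols - 1):
            sky[r, c] = bool((var[r-1:r+2, c-1:c+2] <= thresh).all())
    return sky

v = np.full((4, 4), 5.0)   # uniformly low variance: smooth sky
v[3, 3] = 50.0             # one textured cell in a corner
s = sky_cells(v)           # only cells far from the textured one pass
```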
- the determination unit 105 may then determine, for a cell that has not been determined to be the sky, whether the corresponding part of the captured landscape is the sea, based on the average hue and average saturation of the plurality of pixels included in that cell.
- specifically, for example, when the average hue of the plurality of pixels included in a cell not determined to be the sky falls within a predetermined range including the average hue of the pixels in the cells determined to be the sky, and the average lightness of the pixels in that cell is lower than the average lightness of the pixels in the cells determined to be the sky, that cell may be determined to be the sea.
- this reflects the characteristic that the color of the sea mirrors the color of the sky and is therefore similar in hue while being darker, making it possible to accurately determine whether a part of the captured landscape is the sea.
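A hedged sketch of this sea test, assuming HLS hue measured in degrees and a hypothetical hue tolerance:

```python
def is_sea_cell(cell_hue: float, cell_light: float,
                sky_hue: float, sky_light: float,
                hue_range: float = 15.0) -> bool:
    """Judge a non-sky cell as 'sea' when its average hue lies within a
    range of the sky's average hue (the sea mirrors the sky's color)
    and its average lightness is below the sky's (the sea is darker).
    hue_range is a hypothetical tolerance in degrees.
    """
    diff = abs(cell_hue - sky_hue)
    hue_close = min(diff, 360.0 - diff) <= hue_range  # wrap-around hue
    darker = cell_light < sky_light
    return hue_close and darker

# Sky averages: hue 210 deg (blue), lightness 180 (illustrative values).
sea = is_sea_cell(cell_hue=205.0, cell_light=120.0,
                  sky_hue=210.0, sky_light=180.0)
road = is_sea_cell(cell_hue=40.0, cell_light=100.0,
                   sky_hue=210.0, sky_light=180.0)
```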
- the determination unit 105 may also determine whether the captured landscape includes a mountain by analyzing the color properties of each cell adjacent to a cell determined to be the sky, among the cells not determined to be the sky, and thereby detecting the boundary between the sky and the non-sky landscape.
- specifically, for example, when, in one cell, the pixels determined to be the color of the sky are at or above a certain ratio and the pixels not determined to be the color of the sky are also at or above a certain ratio, the determination unit 105 judges that there is a high possibility that the cell contains a ridgeline.
- here, "determined to be the color of the sky" means that the hue, lightness, and saturation of the pixel fall within the ranges of hue, lightness, and saturation used in the sky determination described above. The determination unit 105 then determines that the landscape includes a mountain if there are a certain number or more of cells judged to have a high possibility of containing a ridgeline.
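The ridgeline test could be sketched as below, interpreting the two ratios as lower and upper bounds on the fraction of sky-colored pixels in a cell; both thresholds and the cell count are hypothetical:

```python
import numpy as np

def has_ridgeline(cell_is_skycolor: np.ndarray,
                  lo: float = 0.2, hi: float = 0.8) -> bool:
    """A cell likely contains a ridgeline when sky-colored pixels make
    up at least a certain fraction but not (almost) all of the cell,
    i.e. the sky/non-sky boundary crosses it. lo/hi are hypothetical.
    """
    ratio = float(cell_is_skycolor.mean())
    return lo <= ratio <= hi

def contains_mountain(ridge_flags, min_cells: int = 3) -> bool:
    """The landscape is judged to include a mountain when a certain
    number of cells likely contain a ridgeline (threshold hypothetical)."""
    return sum(ridge_flags) >= min_cells

# A cell whose upper 60% is sky-colored straddles the ridgeline.
cell = np.zeros((10, 10), dtype=bool)
cell[:6, :] = True
ridge = has_ridgeline(cell)
mountain = contains_mountain([ridge, True, True])
```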
- the determination unit 105 may also determine whether a part of the captured landscape is a night view by binarizing the pixels of each cell with respect to a predetermined lightness value and counting, among the pixel groups thus distinguished, those on the brighter side. More specifically, for example, the determination unit 105 uses the binarization to extract, from the plurality of pixels included in one cell, groups of mutually adjacent pixels whose lightness is higher than the predetermined value; in a night view, such high-lightness pixels are scattered within the cell like many white points on a black background.
- the determination unit 105 then calculates the number of such groups of adjacent high-lightness pixels, and if there are more groups than a specified value (for example, 10), it determines that there is a high possibility that the cell contains a night view.
- a night view refers to a nighttime landscape that includes many lights, such as those of buildings, residential areas, and automobiles.
- the determination unit 105 may then determine that the landscape includes a night view if the number of cells judged to have a high possibility of containing a night view is greater than a certain number (for example, 10).
- a night view often includes many points of light that are brighter than their surroundings; since this determination reflects that characteristic, it is possible to accurately determine whether a part of the captured landscape is a night view.
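A sketch of the bright-pixel-group count, assuming 4-adjacency (the adjacency rule is not stated here) and the example thresholds from the text:

```python
import numpy as np

def bright_groups(cell: np.ndarray, bright: float = 200.0) -> int:
    """Count groups of mutually adjacent pixels brighter than `bright`
    (4-adjacency assumed). Many isolated points of light on a dark
    background suggest a night view.
    """
    mask = cell > bright
    seen = np.zeros_like(mask, dtype=bool)
    h, w = mask.shape
    groups = 0
    for i in range(h):
        for j in range(w):
            if mask[i, j] and not seen[i, j]:
                groups += 1
                stack = [(i, j)]
                while stack:  # flood-fill one group of light pixels
                    y, x = stack.pop()
                    if 0 <= y < h and 0 <= x < w and mask[y, x] \
                            and not seen[y, x]:
                        seen[y, x] = True
                        stack += [(y+1, x), (y-1, x), (y, x+1), (y, x-1)]
    return groups

cell = np.zeros((30, 30))
for k in range(12):           # twelve separate points of light
    cell[2 + 2 * k, 5] = 255.0
night_cell = bright_groups(cell) >= 10   # example threshold of 10
```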
- the level or degree of the night view may be determined according to the number of cells determined to contain, or be likely to contain, a night view.
- the processing speed can be improved by limiting the cells subject to the night view determination to those determined to be a distant view. Furthermore, the determination may be limited to cases where a certain number (for example, 10) or more cells have been determined to be the sky by the sky determination described above.
- the determination unit 105 is not limited to making the distant view determination on the initial landscape and the sky, sea, mountain, and night view determinations on the captured landscape. Specifically, for example, the determination unit 105 may make the distant view determination on the captured landscape, or may determine the sky, sea, mountains, night view, and the like for the initial landscape.
- the storage unit 106 stores the captured landscape image according to the determination result of the determination unit 105. Specifically, for example, when the determination result indicates that the captured landscape image includes the sea and mountains, the storage unit 106 may be configured to save the captured landscape image in the storage area of a recording device (not shown). More specifically, for example, it may be configured to automatically save captured landscape images that satisfy conditions predetermined by the user or by factory default settings.
- the storage unit 106 may store the captured landscape image and the positional information of the captured landscape image in association with each other.
- FIG. 2 is a flowchart showing the contents of the processing of the landscape imaging apparatus according to the present embodiment.
- the imaging unit 101 determines whether a landscape image has been captured (step S201).
- the landscape image may be captured continuously as the user or the landscape imaging apparatus 100 moves, or capturing may start upon receiving a start input from the user.
- after waiting for a landscape image to be captured (step S201: Yes), the dividing unit 102 divides the landscape image captured in step S201 (hereinafter referred to as the "initial landscape image") into a plurality of cells (step S202).
- the calculation unit 103 calculates a distance to a part of the initial scenery corresponding to each cell divided in step S202 among the initial scenery corresponding to the initial scenery image (step S203).
- the determination unit 105 determines a distant view using the distance calculated in step S203 (step S204).
- the determination of the distant view is, for example, a determination of whether the part of the initial landscape corresponding to each cell is a distant view.
- the determining unit 104 determines the imaging direction based on the number of cells whose part of the initial landscape was determined in step S204 to be a distant view, within an image region consisting of one or more cells in the initial landscape image (step S205).
- the imaging unit 101 captures an imaged landscape image in the imaging direction determined in step S205 (step S206).
- the captured landscape image is captured, for example, by driving an unillustrated drive unit so that the in-vehicle camera is directed in the image capturing direction determined in step S205.
- the dividing unit 102 divides the captured landscape image captured in step S206 into a plurality of cells (step S207).
- the determination unit 105 determines a captured landscape corresponding to the captured landscape image captured in step S206 (step S208).
- the determination of the captured landscape is, for example, a determination of whether the part of the captured landscape corresponding to each cell divided in step S207, among the captured landscape corresponding to the captured landscape image, falls under at least one of the sky, the sea, a mountain, and a night view.
- the storage unit 106 stores the captured landscape image according to the determination result determined in step S208 (step S209), and the series of processing ends.
- the functions of the landscape imaging device, landscape imaging method, landscape imaging program, and computer-readable recording medium of the present invention are not limited to being realized by the landscape imaging apparatus 100 shown in FIG. 1; a plurality of apparatuses may be used as long as they include the functional units shown in FIG. 1.
- the devices may be connected by wire or wirelessly, for example by communicating via Bluetooth (registered trademark) or the like.
- as described above, a distant view is determined from the captured initial landscape image, the imaging direction is determined accordingly, and the captured landscape image is taken in that direction, so a landscape with a good view can be captured without the viewpoint having to be registered in advance.
- since the captured landscape image is stored based on a determination made on the captured landscape image itself, the image can be stored appropriately according to conditions set in advance by the user or by factory default settings. In other words, captured landscape images of scenery matching the user's preferences can be stored efficiently, and the storage area can be used effectively.
- the user can thus automatically capture a landscape with a good view and save an appropriate captured landscape image without operating the landscape imaging apparatus or registering anything.
- since the driver can check the scenery after driving, it is not necessary to pay attention to the scenery while driving, and safe driving can be maintained.
- FIG. 3 is a block diagram showing an example of the hardware configuration of the in-vehicle device according to the present embodiment.
- an in-vehicle device 300 is mounted on a moving body such as a vehicle and includes a CPU 301, a ROM 302, a RAM 303, a magnetic disk drive 304, a magnetic disk 305, an optical disk drive 306, an optical disk 307, an audio I/F (interface) 308, a speaker 309, an input device 310, a video I/F 311, a display 312, a camera 313, a communication I/F 314, a GPS unit 315, and various sensors 316.
- Each component 301 to 316 is connected by a bus 320.
- the CPU 301 governs overall control of the in-vehicle device 300.
- the ROM 302 records various programs such as a boot program, an imaging program, an image determination program, and an image storage program.
- the RAM 303 is used as a work area for the CPU 301.
- the imaging program causes, for example, a landscape image in front of the vehicle to be captured by a camera 313 described later.
- in order to capture an image in the imaging direction toward a point with a good view determined by the image determination program described later, the imaging program may determine a driving angle for the driving unit of the camera 313, drive the driving unit, and capture a landscape image in that imaging direction.
- the image determination program makes determinations regarding the landscape of a landscape image captured by the camera 313 described later. Details will be described with reference to FIGS. 4 to 8; for example, the image determination program determines whether the landscape corresponding to a cell constituting part of the landscape image is a distant view, and whether it corresponds to any one of the sky, the sea, a mountain, and a night view.
- the image determination program may determine a direction with a good view of the landscape corresponding to the landscape image from the number of distant view cells in an image region consisting of a plurality of cells, and may determine whether the landscape has a good view by calculating a view index according to the types of scenery corresponding to the cells constituting the landscape.
- the image storage program controls, for example, a magnetic disk drive 304 and an optical disk drive 306 described later, and stores a landscape image on a recording medium such as the magnetic disk 305 and the optical disk 307.
- the data to be stored are landscape images determined to be landscapes with a good view by the image determination program. By storing landscapes with a good view in this way, the user can acquire landscape images of scenery to the left and right that went unnoticed while the vehicle was running.
- the magnetic disk drive 304 controls reading and writing of data to the magnetic disk 305 according to the control of the CPU 301.
- the magnetic disk 305 records data written under the control of the magnetic disk drive 304.
- as the magnetic disk 305, for example, an HD (hard disk) or an FD (flexible disk) can be used.
- an example of information recorded on the magnetic disk 305 is a landscape image. Landscape images are recorded on the magnetic disk 305 here, but may instead be recorded on the optical disk 307 described later.
- the recording medium is not limited to one provided integrally with the hardware of the in-vehicle device 300, and may be provided outside the in-vehicle device 300. In that case, the in-vehicle device 300 may output the landscape image via a network through the communication I/F 314, for example.
- the optical disk drive 306 controls data reading / writing to the optical disk 307 according to the control of the CPU 301.
- the optical disk 307 is a detachable recording medium from which data is read according to the control of the optical disk drive 306.
- a writable recording medium can be used as the optical disk 307. This removable recording medium is not limited to the optical disk 307, and may be an MO, a memory card, or the like.
- the audio I/F 308 is connected to the speaker 309 for audio output, and audio is output from the speaker 309.
- Examples of the input device 310 include a remote controller, a keyboard, a mouse, and a touch panel that are provided with a plurality of keys for inputting characters, numerical values, various instructions, and the like.
- the video I / F 311 is connected to the display 312 and the camera 313.
- the video I / F 311 includes, for example, a graphic controller that controls the entire display 312, a buffer memory such as VRAM (Video RAM) that temporarily records image information that can be displayed immediately, and a graphic Based on the image data output from the controller, it is configured by a control IC that controls the display 312 display.
- VRAM Video RAM
- the display 312 displays icons, cursors, menus, windows, or various data such as characters and images.
- as the display 312, for example, a CRT, a TFT liquid crystal display, or a plasma display can be adopted.
- a plurality of displays 312 may be provided in the vehicle, for example, for the driver and for a passenger seated in the rear seat.
- the camera 313 is installed on the dashboard or ceiling of the vehicle and captures a landscape image outside the vehicle.
- the landscape image can be either a still image or a moving image.
- a landscape image in the front direction or in the determined imaging direction is captured by the camera 313, and the captured landscape image may be output to a recording medium such as the magnetic disk 305 or the optical disk 307 by the image storage program, based on the determination result of the image determination program.
- the camera 313 may be a movable camera that can be pointed in the determined imaging direction by driving a driving unit.
- the communication I / F 314 is connected to a network via radio and functions as an interface between the in-vehicle device 300 and the CPU 301.
- the communication I / F 314 is further connected to a communication network such as the Internet via wireless, and also functions as an interface between the communication network and the CPU 301.
- Communication networks include LANs, WANs, public line networks, mobile phone networks, and the like.
- the communication I/F 314 is composed of, for example, an FM tuner, a VICS (Vehicle Information and Communication System)/beacon receiver, a wireless navigation device, or another navigation device, and acquires road traffic information, such as congestion and traffic regulations, distributed from the VICS center.
- the GPS unit 315 receives radio waves from GPS satellites and outputs information indicating the current location of the vehicle.
- the output information of the GPS unit 315 is used when the CPU 301 calculates the current location of the vehicle together with output values of various sensors 316 described later.
- the information indicating the current location is information that identifies one point on the map information, such as latitude, longitude, and altitude. For example, by associating it with a saved landscape image, the location of a landscape image with a good view can be confirmed.
- the various sensors 316 include a vehicle speed sensor, an acceleration sensor, an angular velocity sensor, and the like.
- the output values of the various sensors 316 are used by the CPU 301 to calculate the current position of the vehicle and to measure the amount of change in speed and direction.
- the imaging unit 101, the dividing unit 102, the calculating unit 103, the determining unit 104, the determination unit 105, and the storage unit 106 of the landscape imaging apparatus 100 shown in FIG. 1 realize their functions by the CPU 301 executing predetermined programs, using programs and data recorded in the ROM 302, the RAM 303, the magnetic disk 305, the optical disk 307, and the like of the in-vehicle device 300 shown in FIG. 3, and controlling each part of the in-vehicle device 300.
- that is, by executing the various programs recorded in the ROM 302 or another recording medium of the in-vehicle device 300, the in-vehicle device 300 can execute the functions of the landscape imaging apparatus 100 shown in FIG. 1 according to the procedure shown in FIG. 2.
- the in-vehicle device 300 according to the present embodiment installs a movable camera 313 around the dashboard of the vehicle, captures a landscape image ahead of the vehicle, and makes various determinations (analyses).
- the forward landscape image is used because, at a point with a good view, a distant landscape (distant view) lies ahead or diagonally ahead to the left or right.
- the landscape image in front of the vehicle is analyzed to determine whether or not a distant landscape appears in it. If a distant landscape is present, the camera 313 is driven toward the direction in which the distant landscape exists. The landscape image captured after driving is then analyzed and a view index is calculated. Details will be described with reference to FIG. 8; the view index may be expressed, for example, as the number of cells determined to be distant scenery after dividing the landscape image into a plurality of cells. When the view index is high, the passenger is notified, and the landscape image is stored in association with its position information.
- FIG. 4 is an explanatory diagram showing an example of a landscape image that is helpful in the present embodiment.
- a landscape image 400 is a landscape in front of the vehicle, captured by a camera 313 installed in the vehicle.
- the landscape image 400 may be captured continuously, for example, while the vehicle is running, or one frame may be captured at predetermined intervals.
- the captured landscape image 400 is output to, for example, a buffer area of the video I/F 311 or a work area of another recording medium.
- the CPU 301 may also be configured to determine whether or not the landscape image 400 has been captured properly. Specifically, for example, whether or not an obstacle is included in the landscape image may be determined by image recognition, and if the landscape has not been captured properly, the landscape image 400 is discarded.
- FIG. 5 is an explanatory diagram showing an example of landscape image division that is useful in the present embodiment.
- a landscape image 500 is the landscape image 400 shown in FIG. 4 divided into a plurality of cells 501, 501, ….
- the CPU 301 divides the landscape image 400, recorded in the buffer area of the video I/F 311 or the work area of another recording medium, into the plurality of cells 501.
- the size of each cell 501 is preferably about 30 × 30 pixels. Note that the size of the cell 501 is determined in consideration of the accuracy and processing speed of the distance calculation and landscape determination described later.
- edge regions of the landscape image 400 that do not fill a whole cell are excluded from the division.
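As a concrete illustration of the cell division described above, the following sketch divides an image array into 30 × 30-pixel cells and drops edge regions that do not fill a whole cell. NumPy and the helper name `split_into_cells` are assumptions for illustration, not part of the embodiment:

```python
import numpy as np

def split_into_cells(image, cell=30):
    """Divide an H x W image into non-overlapping cell x cell blocks.

    Edge regions that do not fill a whole cell are excluded, as the
    embodiment suggests. `image` is a NumPy array of shape (H, W[, C]).
    """
    h, w = image.shape[:2]
    rows, cols = h // cell, w // cell
    cells = [
        image[r * cell:(r + 1) * cell, c * cell:(c + 1) * cell]
        for r in range(rows) for c in range(cols)
    ]
    return cells, rows, cols

# Example: a 96x128 dummy image yields 3x4 = 12 complete 30x30 cells.
img = np.zeros((96, 128, 3), dtype=np.uint8)
cells, rows, cols = split_into_cells(img)
```

The cell size could be made a tunable parameter, since the text notes it trades accuracy against processing speed.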
- FIG. 6 is an explanatory diagram showing an example of the distance calculation and distant-view determination for each cell in a landscape image according to this embodiment.
- an image 600 shows the result of calculating a distance and determining a distant view for each cell 501 constituting the landscape image 500 shown in FIG. 5.
- feature points are detected for each cell in two landscape images taken in succession.
- that is, from the part of the landscape included in each cell, pixels that satisfy a predetermined condition between two landscape images consecutive in time series are detected as feature points. For example, one or more feature points may be detected for each cell.
- an optical flow is then calculated from the detected feature points.
- the optical flow is calculated by obtaining the change (motion vector) of the feature points detected for each cell between the two successively captured landscape images.
- the distances between the camera 313 and the plurality of feature points are calculated using the plurality of vectors (the optical flow) corresponding to the feature points detected for each cell, and the calculated distances are averaged to obtain the average distance of each cell. It should be noted that the average distance calculated in this way only needs to express a relative difference between the cells, so in this embodiment, for example, the sky is treated as being at infinity.
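The text does not give the exact distance formula, so the sketch below assumes the common motion-parallax approximation: for a translating camera, a feature point's apparent distance is taken as inversely proportional to its optical-flow magnitude, and zero flow (e.g. the sky) is treated as infinity, matching the relative-distance remark above:

```python
import numpy as np

def cell_average_distance(flows, focal_scale=1.0, eps=1e-6):
    """Estimate a relative per-cell distance from optical-flow magnitudes.

    `flows` is a list of (dx, dy) motion vectors for the feature points
    detected in one cell. Under a translating camera, nearby points move
    faster across the image, so distance is taken as inversely
    proportional to flow magnitude (a relative measure only, as the
    embodiment notes). Near-zero flow (e.g. the sky) maps to infinity.
    """
    mags = [np.hypot(dx, dy) for dx, dy in flows]
    dists = [focal_scale / m if m > eps else float("inf") for m in mags]
    return float(np.mean(dists))
```

Because only relative differences between cells matter, the `focal_scale` factor can be left at 1.0.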
- the distant-view determination is a process for determining whether or not a part of the landscape included in each cell is a distant view, based on the average distance of each cell obtained by the distance calculation described above.
- if the calculated average distance of a cell is larger than a preset threshold value, the part of the landscape corresponding to that cell is determined to be a distant view (for example, the sky, the sea, or distant mountains); if the average distance is equal to or less than the preset threshold, the part of the landscape corresponding to that cell is determined to be a close view (for example, the road or a preceding vehicle).
- in the image 600, each cell is represented in multiple shades such as white, gray, and black: the brighter (closer to white) a cell is, the closer the view (the smaller the average distance), and the darker (closer to black) a cell is, the farther the view (the larger the average distance).
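A minimal sketch of mapping each cell's average distance to one of the multiple shading stages; the five-stage split matches FIG. 7, but the threshold values themselves are illustrative assumptions, not values from the embodiment:

```python
def classify_depth(avg_distance, thresholds=(10.0, 20.0, 40.0, 80.0)):
    """Map a cell's average distance to one of five depth stages.

    Stage 5 is the farthest view (e.g. the sky, rendered darkest in
    image 700) and stage 1 the nearest (rendered brightest). The four
    threshold values are illustrative assumptions only.
    """
    stage = 1
    for t in thresholds:
        if avg_distance > t:
            stage += 1
    return stage
```

A sky cell treated as infinitely distant always lands in the farthest stage.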
- FIG. 7 is an explanatory diagram showing the determination of the driving angle of the camera according to this embodiment.
- an image 700 is composed of cells 701, 702, 703, 704, and 705 based on the multi-stage distant-view determination shown in FIG. 6.
- in the multi-stage distant-view determination, five stages of distance are determined, and the cells are shaded from dark to bright in order of distance, starting with the black cell 701, which is the farthest.
- a left region 710 and a right region 720 are set in the image 700, each containing a plurality of cells 701 (702, 703, 704), and the driving angle of the camera 313 is determined based on the number of farthest cells 701 included in the left region 710 and the right region 720.
- for example, the configuration may be such that the camera is driven by a constant angle toward whichever of the left region 710 and the right region 720 contains the larger number of farthest cells 701; in the image 700, the left region 710 contains more, so the camera 313 is driven to the left by a certain angle.
- alternatively, the camera 313 may be directed toward a region that includes more than a predetermined number of farthest cells 701. Specifically, for example, if the number of farthest cells 701 in a region is 30 or more, that direction may be determined as a direction with a good view and used to set the driving angle. In this way, the direction with a good view can be determined accurately simply by comparing the left and right regions 710 and 720.
- the angle may also be calculated according to the difference between the left and right regions 710 and 720: for example, a large angle may be used if the difference is large and a small angle if the difference is small. It is also possible to score the five distant-view stages and count points; specifically, for example, cell 701 may be worth one point, cell 702 two points, cell 703 three points, cell 704 four points, and cell 705 five points.
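The left/right comparison, the 30-cell example threshold, and the difference-scaled angle can be combined into one decision routine. The sketch below is an illustration under stated assumptions (the 15-degree base angle in particular is invented for the example; the text only says "a certain angle"):

```python
def decide_drive_angle(left_count, right_count, base_angle=15.0, min_cells=30):
    """Decide the camera drive direction and angle from farthest-cell counts.

    Returns a signed angle in degrees (negative = drive left), or 0.0
    when neither region contains enough farthest-view cells. The angle
    is scaled by the left/right difference, as the text suggests; the
    constants are illustrative assumptions.
    """
    if max(left_count, right_count) < min_cells:
        return 0.0  # no direction qualifies as a good view
    diff = abs(left_count - right_count)
    angle = base_angle * min(1.0, diff / min_cells)  # larger difference -> larger angle
    return -angle if left_count > right_count else angle
```

The five-stage point scoring mentioned above could replace the raw counts by passing weighted sums as `left_count` / `right_count`.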
- the regions are not limited to the two shown in the image 700; three or more regions may be used, for example by providing an additional region in the middle.
- in the vertical direction, the vanishing point of the road may be detected, and the driving angle may be determined using only the region above the vanishing point. By narrowing the region to be determined in this way, processing can be sped up and the load reduced.
- FIG. 8 is an explanatory diagram showing the view index calculation according to this embodiment.
- a landscape image 800 is a landscape captured by driving the camera 313 by the driving angle determined based on the number of farthest cells 701 shown in FIG. 7.
- the captured landscape image 800 is output to, for example, the buffer area of the video I/F 311 or the work area of another recording medium.
- the view index is calculated by image analysis. Specifically, for example, the view index may be calculated by dividing the landscape image 800 into a plurality of cells and counting the number of cells determined to be sea or sky. The determination of sea and sky will be described with reference to FIGS. 12 and 13. Here, the cell division is almost the same as in FIG. 5.
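A view index computed as the count of sea/sky cells, as the example above suggests, reduces to a one-line tally over the per-cell determination results (the label strings are illustrative assumptions):

```python
def view_index(cell_labels):
    """View index sketch: count cells judged "sea" or "sky" in the image.

    `cell_labels` holds the per-cell result of the sea/sky
    determinations; using the raw count (rather than a ratio) follows
    the example given in the text.
    """
    return sum(1 for label in cell_labels if label in ("sea", "sky"))
```

The "certain value" against which the index is compared could then be a simple cell-count threshold, settable by the user.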
- if the calculated view index is equal to or greater than a certain value, the landscape image 800 is determined to be a landscape with a good view; the passenger is notified, and the landscape image is stored in association with its position information. The display 312 can also be used to show the direction of the good viewpoint, or to show the scenery in that direction.
- FIG. 9 is a flowchart showing the processing contents of the in-vehicle device according to the present embodiment.
- imaging may be started when the vehicle starts traveling, or when a start input from the user is accepted.
- after waiting for the start of landscape imaging (step S901: Yes), the CPU 301 initializes the camera 313 (step S902).
- the initialization may be, for example, directing the movable camera 313 to the front of the vehicle.
- next, a landscape image in front of the vehicle is captured by the camera 313 (step S903).
- specifically, for example, the landscape image 400 shown in FIG. 4 is captured and output to the buffer memory of the video I/F 311 or the work area of another recording medium.
- next, the CPU 301 divides the landscape image captured in step S903 (step S904). Specifically, for example, the landscape image 400 is divided into a plurality of cells 501 as shown in FIG. 5.
- next, the CPU 301 performs a drive direction determination process for driving the camera 313 using the landscape image divided in step S904 (step S905).
- the drive direction determination process may be performed, for example, based on the distant-view determination for the cells divided in step S904; details will be described later with reference to FIG. 10.
- the CPU 301 drives the drive unit of the camera 313 by the drive angle determined in the drive direction determination process of step S905 (step S906).
- the camera 313 captures a landscape image in the direction driven in step S906 (step S907).
- the CPU 301 performs a view index calculation process on the landscape image captured in step S907 (step S908).
- the view index calculation process may be configured, for example, to calculate the proportion of sea and sky in the landscape image 800 shown in FIG. 8. Details of the view index calculation process will be described later with reference to FIG. 11.
- next, the CPU 301 determines whether or not the view index calculated in the view index calculation process of step S908 is greater than or equal to a fixed value (step S909).
- this fixed value for judging the view index may be set by the user, or may be changed automatically according to the weather.
- in step S909, if the view index is greater than or equal to the fixed value (step S909: Yes), the landscape image captured in step S907 is saved on a recording medium, such as the magnetic disk 305 or the optical disk 307, as a landscape with a good view (step S910).
- at this time, the landscape image may be stored in association with its position information, and the passenger may be notified.
- for example, the display 312 can be used to notify the passenger of the direction of the good viewpoint, or to display the scenery in that direction.
- in step S909, when the view index is less than the fixed value (step S909: No), the process moves to step S911.
- the CPU 301 determines whether or not to continue taking landscape images (step S911).
- the decision to continue may be made, for example, based on whether the vehicle is still running.
- when continuing (step S911: Yes), the process returns to step S902 and repeats; when not continuing (step S911: No), the series of processes ends.
- by registering locations with a good view in the car navigation device, they can be presented to passengers as recommended stop locations. In addition, if the vehicle can be stopped, passengers can stop the vehicle and take a walk around the point with a good view. A configuration may also be adopted in which a location where parking is possible, such as a parking lot or a park, is found by a vicinity search using the car navigation device and guidance is provided to it.
- FIG. 10 is a flowchart showing the contents of the drive direction determination process (step S905 in FIG. 9) in the in-vehicle device according to the present embodiment.
- the CPU 301 detects feature points for each cell divided in step S904 (step S1001).
- next, the CPU 301 calculates an optical flow from the feature points detected in step S1001 (step S1002), and uses the optical flow to calculate the average distance of each cell (step S1003).
- in step S1003, instead of calculating the average distance using an optical flow, when a compound-eye (stereo) camera 313 is used, the average distance of each cell may be calculated based on the parallax between two landscape images captured simultaneously from different positions.
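For the compound-eye (stereo) variant, the standard pinhole relation Z = f·B/d can recover distance from parallax; the function below is a generic sketch of that relation, not text from the embodiment:

```python
def stereo_distance(focal_px, baseline_m, disparity_px):
    """Distance from stereo disparity for a compound-eye (stereo) camera.

    Standard pinhole relation Z = f * B / d: `focal_px` is the focal
    length in pixels, `baseline_m` the camera separation in meters, and
    `disparity_px` the horizontal displacement of a matched feature
    between the two views. Zero disparity corresponds to a point at
    infinity, consistent with treating the sky as infinitely far.
    """
    if disparity_px == 0:
        return float("inf")
    return focal_px * baseline_m / disparity_px
```

Averaging this distance over the matched features in a cell would play the same role as the optical-flow average distance.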
- next, the CPU 301 determines whether or not the processing has been completed for all cells (step S1004).
- if all cells have not been processed (step S1004: No), the process returns to step S1001 and repeats; if all cells have been processed (step S1004: Yes), the process proceeds to step S1005.
- next, the CPU 301 counts the distant-view cells in each region (step S1005).
- specifically, for example, the number of farthest cells 701 may be counted for each of the left region 710 and the right region 720 shown in FIG. 7.
- next, the CPU 301 determines the driving angle for driving the camera 313 based on the number of distant-view cells counted in step S1005 (step S1006). The drive direction determination process of step S905 in FIG. 9 then ends, and the process proceeds to step S906.
- FIG. 11 is a flowchart showing the contents of the view index calculation process (step S908 in FIG. 9) in the in-vehicle device according to the present embodiment.
- the CPU 301 performs sky determination processing on the landscape image captured in step S907 (step S1101).
- the sky determination process determines, for example, for a cell positioned in the upper part of the landscape image, whether or not the corresponding part of the landscape is sky, based on the lightness variance values of the plurality of pixels included in that cell. Details of the sky determination process will be described later with reference to FIG. 12.
- the CPU 301 performs sea determination processing on the landscape image captured in step S907 (step S1102).
- the sea determination process determines, for example, for a cell not determined to be sky in step S1101, whether or not the corresponding part of the landscape is sea, based on the average hue and average saturation of the plurality of pixels included in the cell, with reference to the cells determined to be sky. Details of the sea determination process will be described later with reference to FIG. 13.
- the CPU 301 performs mountain determination processing on the landscape image captured in step S907 (step S1103).
- the mountain determination process, for example, analyzes the color characteristics of each cell adjacent to a cell determined to be sky, among the cells not determined to be sky in step S1101, and detects the boundary between mountain and sky, thereby determining whether or not the landscape includes a mountain.
- next, the CPU 301 performs a night-scene determination process on the landscape image captured in step S907 (step S1104).
- the night-scene determination process, for example, binarizes the pixel values based on a predetermined lightness value and determines whether or not a part of the landscape is a night view based on the number of pixel groups whose lightness is higher than the reference.
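One way to realize the night-scene test described above is to binarize a brightness map and count the bright pixels against a mostly dark frame; both numeric constants below are illustrative assumptions, and grouping bright pixels into connected regions (as the text implies) is simplified to a raw pixel count:

```python
import numpy as np

def looks_like_night_view(gray, brightness_threshold=128, min_bright_pixels=50):
    """Rough night-view test: binarize brightness and count bright pixels.

    `gray` is a 2-D array of 0-255 brightness values. A night view is
    assumed when a modest number of pixels (e.g. city lights) exceed
    the threshold within a predominantly dark frame. Both constants
    are illustrative assumptions.
    """
    bright = gray >= brightness_threshold
    mostly_dark = bright.mean() < 0.2  # the frame is predominantly dark
    return bool(mostly_dark and bright.sum() >= min_bright_pixels)
```

A fuller implementation would label connected bright regions and count the groups, as the text's "pixel groups" suggests.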
- the CPU 301 counts predetermined feature cells based on the scenery determined in steps S1101 to S1104 (step S1105). Then, the view index calculation process in step S908 in FIG. 9 is terminated, and the process proceeds to step S909.
- for the count of predetermined feature cells, the content of the feature cells may be set by the user; for example, the view index may be obtained by counting the number of sea and sky cells shown in FIG. 8.
- besides the view index, other characteristics of the landscape may be used, such as a degree of urbanness, a degree of naturalness, or a degree of autumn foliage.
- the degree of autumn foliage is obtained by performing fractal-dimension analysis on the landscape image divided into a plurality of cells and classifying each cell as green leaves (green), autumn leaves (red and yellow), or fallen leaves (brown). The numbers of these feature cells are then accumulated, and their ratios to all the cells of the landscape image are defined as the green-leaf rate, the autumn-leaf rate, and the fallen-leaf rate.
- for example, the degree of autumn foliage may be expressed by the following equation (1):
- Foliage degree = (autumn-leaf rate + fallen-leaf rate) / (green-leaf rate + autumn-leaf rate + fallen-leaf rate) … (1)
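Equation (1) translates directly into code; the cell counts would come from the per-cell leaf classification described above (the function name and zero-guard are illustrative):

```python
def foliage_degree(green_cells, autumn_cells, fallen_cells):
    """Foliage degree per equation (1): colored and fallen leaf cells
    over all leaf cells. Counts can be raw cell tallies, since the
    rates share the same denominator and it cancels out.
    """
    total = green_cells + autumn_cells + fallen_cells
    if total == 0:
        return 0.0  # no leaf cells detected in the landscape image
    return (autumn_cells + fallen_cells) / total
```

A value near 1.0 indicates peak or past-peak foliage; near 0.0, mostly green leaves.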
- FIG. 12 is a flowchart showing the contents of the sky determination process (step S1101 in FIG. 11) in the in-vehicle device according to the present embodiment.
- first, the CPU 301 calculates the average value and the variance value of the HLS components of each cell from the hue H, lightness L, and saturation S of the plurality of pixels included in the cell (step S1201). The calculated average and variance values are recorded, for example, in the buffer area of the video I/F 311 or the work area of another recording medium.
- next, from the values calculated in step S1201, the CPU 301 reads out the average and variance values for all pixels of the determination target cell and its eight neighboring cells (step S1202). That is, the values are read for the one cell to be determined and each of the eight cells located around it.
- the cells to be determined are, for example, the cells located in the upper part of the landscape image, typically the upper half. The series of operations from step S1202 to step S1204 is then repeated for each cell to be determined.
- note that when the color information corresponding to a cell is RGB color information, this color information may first be converted into color information of a color system that uses hue, lightness (that is, luminance), and saturation as parameters, for example the HLS color system.
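The RGB-to-HLS conversion can be done with Python's standard `colorsys` module; since `colorsys` works on 0-1 floats, the sketch below rescales to the 0-255 range used by the determination conditions (the wrapper name is illustrative):

```python
import colorsys

def rgb_to_hls255(r, g, b):
    """Convert 0-255 RGB to 0-255 hue/lightness/saturation.

    colorsys.rgb_to_hls operates on 0-1 floats, so inputs and outputs
    are rescaled to the 0-255 range the determination conditions use.
    """
    h, l, s = colorsys.rgb_to_hls(r / 255.0, g / 255.0, b / 255.0)
    return h * 255.0, l * 255.0, s * 255.0
```

Per-cell averages of H, L, and S would then be computed over these converted pixel values.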
- next, the CPU 301 determines whether or not the lightness L variance values read in step S1202, for the one cell to be determined and its eight neighboring cells, are all 10 or less (step S1203).
- in step S1203, when the lightness L variance values of the cell to be determined and all of its neighboring cells are 10 or less (step S1203: Yes), the CPU 301 determines that the part of the landscape included in the cell to be determined is “sky” (step S1204). That is, the determination target cell is determined to be a “sky” cell, and this result is recorded, for example, in the buffer area of the video I/F 311 or the work area of another recording medium.
- next, the CPU 301 determines whether or not the series of operations from step S1202 to step S1204 has been performed for all the cells to be determined (step S1205).
- in step S1203, when any of the lightness L variance values of the cell to be determined and its neighboring cells is greater than 10 (step S1203: No), the process likewise proceeds to the determination of step S1205.
- in step S1205, when the series of operations from step S1202 to step S1204 has been performed for all the cells to be determined (step S1205: Yes), the sky determination process of step S1101 in FIG. 11 ends and the process proceeds to step S1102; otherwise (step S1205: No), the process returns to step S1202 and repeats.
- in this way, in step S1204 it is determined whether or not each cell included in the landscape image is a “sky” cell.
- although the description is omitted in FIG. 12, it is also possible to limit this determination to the cells determined to be a distant view among the cells included in the landscape image.
- furthermore, the sky pattern may be determined as follows: whether or not the average values of hue H, lightness L, and saturation S of the cell to be determined satisfy criteria determined in advance is judged. For example, “blue sky”, “white cloud”, “black cloud”, “sunset”, and “night” can be set as the sky patterns of a cell.
- the determination conditions can be set as follows, for example, with hue H, lightness L, and saturation S expressed in the range of 0 to 255.
- the determination condition for a cell to be determined as “blue sky” is that the average value of hue H is 100 or more and less than 160, the average value of lightness L is 100 or more and less than 180, and the average value of saturation S is 60 or more.
- the determination condition for a cell to be determined as a “white cloud” is an average value of brightness L of 180 or more.
- the determination conditions for a cell to be determined as a “black cloud” are an average value of lightness L of less than 180 and an average value of saturation S of less than 60.
- the determination condition for a cell to be determined as “sunset” is that the average value of hue H is 0 or more and less than 50, or 220 or more and 250 or less, the average value of lightness L is less than 180, and the average value of saturation S is 60 or more.
- the determination condition for a cell to be determined to be “night” is an average value of lightness L of less than 100.
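The five sky-pattern conditions above overlap (a dark cell can satisfy both the “night” and “black cloud” conditions), so any implementation must pick an evaluation order; the order in this sketch is an assumption, not something the text specifies:

```python
def sky_pattern(h, l, s):
    """Classify a sky cell from average hue/lightness/saturation (0-255),
    following the determination conditions in the text.

    The conditions overlap, so the evaluation order below (night first,
    then white cloud, black cloud, blue sky, sunset) is an assumption.
    """
    if l < 100:
        return "night"
    if l >= 180:
        return "white cloud"
    if s < 60:
        return "black cloud"
    if 100 <= h < 160:
        return "blue sky"
    if h < 50 or 220 <= h <= 250:
        return "sunset"
    return "unknown"
```

Cells falling outside every condition are reported as "unknown" rather than forced into a pattern.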
- next, the number of cells of each sky pattern is counted. That is, the number of cells determined as “blue sky”, the number determined as “white cloud”, the number determined as “black cloud”, the number determined as “sunset”, and the number determined as “night” are counted.
- next, the sky pattern of the landscape image is determined based on these counts.
- the determination may be made by taking the sky pattern with the largest cell count as the sky pattern of the landscape image. More specifically, for example, if the number of cells determined to be “blue sky” is larger than the number of cells determined to be any other sky pattern, such as “white cloud”, “black cloud”, “sunset”, or “night”, the sky in the landscape image is determined to be “blue sky”.
- in the above description, the cells to be determined are fixed to the upper part (typically the upper half) of the landscape image, but the area of the cells to be determined may be made variable.
- for example, the cells belonging to the area above the horizontal line (or horizon) in the landscape image may be set as the determination targets of the sky determination process.
- in this case, the vanishing point of the road in the landscape image is detected, and the position of the horizontal line is specified based on the vanishing point.
- the cloud shape may also be determined based on the per-cell sky-pattern results of the sky determination process shown in FIG. 12. That is, the boundary between cells judged as “blue sky” or “sunset” and cells judged as “white cloud” or “black cloud” is detected, and the shape of the cloud is determined from it. For example, if the boundary where a group of “white cloud” cells adjoins “blue sky” cells extends in the vertical direction of the landscape image, it can be determined to be a towering cloud (thunderhead). Or, for example, if the boundary between “white cloud” cells and “blue sky” cells is mottled or wavy in the landscape image, it can be determined to be a scale (mottled) cloud.
- further, the determination condition for a “white cloud” cell may be set to an average lightness L of 180 or more and less than 240, and a cell whose average lightness L is 240 or more may be determined as “white-out”. In this case, it can be determined that the sky has been blown out by backlighting, and the accuracy of the sky-pattern determination process can be increased.
- FIG. 13 is a flowchart showing the contents of the sea determination process (step S1102 in FIG. 11) in the in-vehicle device according to the present embodiment.
- first, the CPU 301 reads out the sky determination result obtained by the sky determination process shown in FIG. 12 (step S1301). At this time, one cell is selected as the cell to be processed in the subsequent step S1302.
- next, based on the sky determination result read in step S1301, the CPU 301 determines whether or not the target cell is positioned below a “sky” cell (step S1302).
- if the target cell is not located below a “sky” cell (step S1302: No), the process returns to step S1301 and repeats.
- if the target cell is located below a “sky” cell (step S1302: Yes), the process proceeds to step S1303.
- next, the CPU 301 calculates the average value and the variance value of the hue H, lightness L, and saturation S of the plurality of pixels included in each cell (step S1303).
- the calculated average and variance values are recorded, for example, in the buffer area of the video I/F 311 or the work area of another recording medium.
- next, the CPU 301 reads out the average and variance values for all pixels of the determination target cell and the cells to its left and right (step S1304). That is, the values are read for each pixel of the one cell to be determined and its left and right neighboring cells.
- the series of operations from step S1304 to step S1306 is repeated for each cell to be determined.
- next, it is determined whether or not the average values of hue H, lightness L, and saturation S of the one cell to be determined and its left and right cells satisfy a predetermined sea determination condition (step S1305).
- the sea determination condition can be set as follows, for example.
- Hue H, lightness L, and saturation S are expressed in the range of 0 to 255.
- the sea determination condition is that the average value of hue H is within ±10 of the average hue H of the “sky” cells, the average value of lightness L is lower than the average lightness L of the “sky” cells, and the average value of saturation S is 60 or more.
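A sketch of the per-cell sea test under the reconstructed condition above; the lightness comparison against the sky cells is the least certain part of that reconstruction, and the function name and argument layout are illustrative:

```python
def is_sea_cell(cell_h, cell_l, cell_s, sky_h, sky_l):
    """Sea determination sketch for one cell located below a "sky" cell.

    Condition (all values 0-255): hue close to the sky's hue (within
    +/-10, since the sea reflects the sky), lightness below the sky's
    lightness, and saturation of 60 or more. The lightness comparison
    is a reconstruction of a garbled passage in the source text.
    """
    return (abs(cell_h - sky_h) <= 10
            and cell_l < sky_l
            and cell_s >= 60)
```

Cells passing this test would be recorded as "sea" cells and counted toward the view index.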
- if the sea determination condition is satisfied in step S1305 (step S1305: Yes), the CPU 301 determines that the determination target cell is a “sea” cell (step S1306).
- the result determined as “sea” is recorded, for example, in the buffer area of the video I/F 311 or the work area of another recording medium.
- next, the CPU 301 determines whether or not the series of operations from step S1304 to step S1306 has been performed for all the cells to be determined (step S1307).
- if the sea determination condition is not satisfied in step S1305 (step S1305: No), the process likewise proceeds to the determination of step S1307.
- in step S1307, when the series of operations from step S1304 to step S1306 has been performed for all the cells to be determined (step S1307: Yes), the sea determination process of step S1102 in FIG. 11 ends and the process proceeds to step S1103; otherwise (step S1307: No), the process returns to step S1304 and repeats.
- although omitted in FIG. 13, the “sea” cells may be counted over all the cells, and when their number is one-fifth (1/5) or more of all the cells, it may be determined that the landscape image includes a landscape with a view of the sea.
- as described above, according to the present embodiment, a landscape with a good view is automatically captured and saved as a landscape image, without the user having to operate the camera or register shooting locations.
- in addition, since the driver can check the scenery after driving, there is no need to pay attention to the scenery while driving, and safe driving can be maintained.
- further, when a landscape is captured by driving the movable camera, a plurality of landscape images may be acquired along with the driving to obtain a panoramic landscape image. Specifically, for example, when a point with a good view is determined, the movable camera may be driven from the front of the vehicle toward the side, capturing an image at every fixed angular interval; the captured landscape images may then be translated and matched to obtain a panoramic landscape image.
- alternatively, when points with a good view continue along the road, instead of capturing at fixed angular intervals, the movable camera may be kept facing the direction with a good view while a plurality of landscape images are captured at fixed mileage intervals.
- in the present embodiment, the device is mounted on a vehicle and images landscapes, but a configuration that is not vehicle-mounted is also possible. Specifically, for example, landscapes may be captured with a fixed-point camera or the like to collect landscape images with a good view. With a fixed-point camera, the vibration countermeasures required for vehicle mounting are unnecessary, so an inexpensive system can be provided.
- in the present embodiment, the object to be imaged is a landscape, but the object is not limited to landscapes.
- for example, images of buildings or the like may be captured to collect beautiful building images. In this way, images of subjects desired by a wide range of users can be collected.
- the landscape imaging method described in the present embodiment can be realized by executing a program prepared in advance on a computer such as a personal computer or a workstation.
- this program is recorded on a computer-readable recording medium such as a hard disk, a flexible disk, a CD-ROM, an MO, or a DVD, and is executed by being read from the recording medium by the computer.
- the program may be a transmission medium that can be distributed through a network such as the Internet.
Landscapes
- Engineering & Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Studio Devices (AREA)
- Traffic Control Systems (AREA)
- Image Processing (AREA)
Description
Claims
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2006/318980 WO2008050374A1 (fr) | 2006-09-25 | 2006-09-25 | Scenery imaging apparatus, scenery imaging method, scenery imaging program, and computer-readable recording medium |
JP2008540796A JP4870776B2 (ja) | 2006-09-25 | 2006-09-25 | Scenery imaging apparatus, scenery imaging method, scenery imaging program, and computer-readable recording medium |
US12/442,643 US8233054B2 (en) | 2006-09-25 | 2006-09-25 | Scenery imaging apparatus, scenery imaging method, scenery imaging program, and computer-readable recording medium |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2006/318980 WO2008050374A1 (fr) | 2006-09-25 | 2006-09-25 | Scenery imaging apparatus, scenery imaging method, scenery imaging program, and computer-readable recording medium |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2008050374A1 true WO2008050374A1 (fr) | 2008-05-02 |
Family
ID=39324190
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2006/318980 WO2008050374A1 (fr) | 2006-09-25 | 2006-09-25 | Scenery imaging apparatus, scenery imaging method, scenery imaging program, and computer-readable recording medium |
Country Status (3)
Country | Link |
---|---|
US (1) | US8233054B2 (ja) |
JP (1) | JP4870776B2 (ja) |
WO (1) | WO2008050374A1 (ja) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2011003180A (ja) * | 2009-06-19 | 2011-01-06 | Ricoh Co Ltd | Sky detection apparatus and method used in an image collection device |
JP2011215974A (ja) * | 2010-03-31 | 2011-10-27 | Aisin Aw Co Ltd | Image processing system |
Families Citing this family (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4915859B2 (ja) * | 2007-03-26 | 2012-04-11 | Funai Electric Co., Ltd. | Object distance deriving apparatus |
FR2948483B1 (fr) * | 2009-07-23 | 2012-02-03 | Airbus Operations Sas | Method for displaying an image on a screen of an aircraft |
ITVI20120303A1 (it) * | 2012-11-09 | 2014-05-10 | St Microelectronics Srl | Method for detecting a straight line in a digital image |
US10421412B2 (en) * | 2014-04-17 | 2019-09-24 | The Hertz Corporation | Rotatable camera |
US20170195579A1 (en) * | 2016-01-05 | 2017-07-06 | 360fly, Inc. | Dynamic adjustment of exposure in panoramic video content |
CN108399629B (zh) * | 2018-02-05 | 2020-06-02 | Southwest Jiaotong University | Image array optical flow estimation method for an artificial compound-eye camera |
US10775174B2 (en) * | 2018-08-30 | 2020-09-15 | Mapbox, Inc. | Map feature extraction system for computer map visualizations |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2000244800A (ja) * | 1999-02-22 | 2000-09-08 | Victor Co Of Japan Ltd | Imaging apparatus |
JP2001257920A (ja) * | 2000-03-13 | 2001-09-21 | Fuji Photo Film Co Ltd | Camera system |
JP2005045398A (ja) * | 2003-07-24 | 2005-02-17 | Canon Inc | Imaging support method, imaging support apparatus, and imaging apparatus |
Family Cites Families (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7106376B1 (en) * | 1998-10-22 | 2006-09-12 | Flashpoint Technology, Inc. | Method and system for improving image quality of portrait images using a focus zone shift |
US7262798B2 (en) * | 2001-09-17 | 2007-08-28 | Hewlett-Packard Development Company, L.P. | System and method for simulating fill flash in photography |
JP2003198904A (ja) | 2001-12-25 | 2003-07-11 | Mazda Motor Corp | Imaging method, imaging system, imaging apparatus, imaging control server, and imaging program |
US7215828B2 (en) * | 2002-02-13 | 2007-05-08 | Eastman Kodak Company | Method and system for determining image orientation |
US7620246B2 (en) * | 2002-07-30 | 2009-11-17 | Fujifilm Corporation | Method and apparatus for image processing |
US20050212950A1 (en) * | 2004-03-26 | 2005-09-29 | Chinon Kabushiki Kaisha | Focal length detecting method, focusing device, image capturing method and image capturing apparatus |
US7804980B2 (en) * | 2005-08-24 | 2010-09-28 | Denso Corporation | Environment recognition device |
2006
- 2006-09-25 WO PCT/JP2006/318980 patent/WO2008050374A1/ja active Application Filing
- 2006-09-25 US US12/442,643 patent/US8233054B2/en not_active Expired - Fee Related
- 2006-09-25 JP JP2008540796A patent/JP4870776B2/ja not_active Expired - Fee Related
Also Published As
Publication number | Publication date |
---|---|
JP4870776B2 (ja) | 2012-02-08 |
US8233054B2 (en) | 2012-07-31 |
US20100085440A1 (en) | 2010-04-08 |
JPWO2008050374A1 (ja) | 2010-02-25 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP4870776B2 (ja) | Scenery imaging apparatus, scenery imaging method, scenery imaging program, and computer-readable recording medium | |
JP4717073B2 (ja) | Landscape analysis apparatus and method | |
KR101648339B1 (ko) | Method and apparatus for providing a service using image recognition and a sensor in a portable terminal | |
US9536325B2 (en) | Night mode | |
JP6549898B2 (ja) | Object detection system, object detection method, POI information creation system, warning system, and guidance system | |
US7653485B2 (en) | Route guidance system and method | |
JP6426433B2 (ja) | Image processing apparatus, image processing method, POI information creation system, warning system, and guidance system | |
US20140297185A1 (en) | Method, Device and System for Presenting Navigational Information | |
CN111126182A (zh) | Lane line detection method and apparatus, electronic device, and storage medium | |
EP2672230A1 (en) | Providing navigation instructions while device is in locked mode | |
JP6274177B2 (ja) | Vehicle control system | |
KR20130033446A (ko) | Method of operating a device to capture a high dynamic range image | |
WO2007000999A1 (ja) | Image analysis apparatus and image analysis method | |
KR20130107697A (ko) | Apparatus and method for displaying a background screen on a navigation terminal | |
CN113711268A (zh) | Electronic device for applying a bokeh effect to an image and control method therefor | |
CN111192341A (zh) | Method and apparatus for generating a high-precision map, autonomous driving device, and storage medium | |
CA2743941C (en) | Method, device and system for presenting navigational information | |
US10282819B2 (en) | Image display control to grasp information about image | |
WO2017058449A1 (en) | Navigation application with novel declutter mode | |
CN108335507A (zh) | Method and apparatus for providing driving guidance using images captured by a camera | |
CN113225550A (zh) | Offset detection method and apparatus, camera module, terminal device, and storage medium | |
JP2014146989A (ja) | Imaging apparatus, imaging method, and imaging program | |
CN110163862B (zh) | Image semantic segmentation method and apparatus, and computer device | |
CN110971889A (zh) | Method for obtaining a depth image, imaging apparatus, and terminal | |
CN113492756A (zh) | Method, apparatus, device, and storage medium for displaying information outside a vehicle |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
DPE2 | Request for preliminary examination filed before expiration of 19th month from priority date (pct application filed from 20040101) | ||
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 06798306; Country of ref document: EP; Kind code of ref document: A1 |
WWE | Wipo information: entry into national phase |
Ref document number: 2008540796; Country of ref document: JP |
NENP | Non-entry into the national phase |
Ref country code: DE |
WWE | Wipo information: entry into national phase |
Ref document number: 12442643; Country of ref document: US |
122 | Ep: pct application non-entry in european phase |
Ref document number: 06798306; Country of ref document: EP; Kind code of ref document: A1 |