WO2022195312A1 - Image Processing Device and Image Processing Method - Google Patents
Image Processing Device and Image Processing Method
- Publication number
- WO2022195312A1 (PCT/IB2021/000167)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- vehicle
- image processing
- captured
- blur correction
- Prior art date
Links
- 238000012545 processing Methods 0.000 title claims abstract description 237
- 238000003672 processing method Methods 0.000 title claims description 7
- 238000012937 correction Methods 0.000 claims abstract description 112
- 230000033001 locomotion Effects 0.000 claims abstract description 21
- 238000000034 method Methods 0.000 claims description 20
- 238000003384 imaging method Methods 0.000 claims description 12
- 230000005540 biological transmission Effects 0.000 claims description 7
- 230000002194 synthesizing effect Effects 0.000 claims 1
- 230000008719 thickening Effects 0.000 claims 1
- 230000006399 behavior Effects 0.000 description 94
- 238000004891 communication Methods 0.000 description 23
- 238000001514 detection method Methods 0.000 description 17
- 206010025482 malaise Diseases 0.000 description 10
- 230000006870 function Effects 0.000 description 6
- 201000003152 motion sickness Diseases 0.000 description 5
- 238000010586 diagram Methods 0.000 description 4
- 238000006073 displacement reaction Methods 0.000 description 3
- 230000015572 biosynthetic process Effects 0.000 description 2
- 238000003702 image correction Methods 0.000 description 2
- 238000012986 modification Methods 0.000 description 2
- 230000004048 modification Effects 0.000 description 2
- 230000003287 optical effect Effects 0.000 description 2
- 230000000737 periodic effect Effects 0.000 description 2
- 238000003786 synthesis reaction Methods 0.000 description 2
- 230000001133 acceleration Effects 0.000 description 1
- 238000013461 design Methods 0.000 description 1
Images
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/73—Deblurring; Sharpening
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/667—Camera operation mode switching, e.g. between still and video, sport and normal or high- and low-resolution modes
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/68—Control of cameras or camera modules for stable pick-up of the scene, e.g. compensating for camera body vibrations
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/68—Control of cameras or camera modules for stable pick-up of the scene, e.g. compensating for camera body vibrations
- H04N23/682—Vibration or motion blur correction
- H04N23/683—Vibration or motion blur correction performed by a processor, e.g. controlling the readout of an image memory
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
- H04N7/181—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a plurality of remote sources
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20172—Image enhancement details
- G06T2207/20201—Motion blur correction
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/68—Control of cameras or camera modules for stable pick-up of the scene, e.g. compensating for camera body vibrations
- H04N23/681—Motion detection
- H04N23/6812—Motion detection based on additional sensors, e.g. acceleration sensors
Definitions
- the present invention relates to an image processing device and an image processing method.
- The prior art discloses drive means for changing the imaging direction of imaging means, comprising an imaging optical system and an imaging element, by rotating the imaging means; image correction means for correcting, by image processing, the captured image captured by the imaging means; and control means for acquiring a detection signal from detection means that detects vibration of the imaging means.
- The control means executes a first control that corrects image blur of the captured image by controlling the drive means, and a second control that corrects, by controlling the image correction means, the image blur caused by the vibration remaining after the first control (Patent Document 1).
- Patent Document 1 has the following problem: when the captured image contains a portion that appears stationary because it shakes together with the vibrating imaging means, performing image blur correction on the entire captured image also applies correction processing to that stationary portion, for which correction is not needed, so the stationary portion moves unnaturally.
- The problem to be solved by the present invention is to provide an image processing apparatus and an image processing method that can perform blur correction processing on an image that shakes due to the behavior of a moving body during movement.
- In the present invention, it is determined whether or not a captured image includes scenery outside a moving body; if it is determined that it does, blur correction processing is performed on the captured image, based on behavior data indicating the behavior of the moving body during movement, so as to cancel the shake of the captured image caused by the behavior. The above problem is thereby solved.
- According to the present invention, it is possible to suppress shaking of the image caused by the behavior of the moving body during movement.
- FIG. 1 is a block diagram of an image processing system according to this embodiment.
- FIG. 2 is a diagram showing an example of an image captured by an in-vehicle camera.
- FIG. 3 is a diagram showing an example of an image captured by an in-vehicle camera.
- FIG. 4 is a sequence chart showing an example of the data flow in the image processing system of this embodiment.
- FIG. 5 is a flow chart showing an example of the procedure of the image processing method in the image processing apparatus of this embodiment.
- FIG. 1 is a block diagram showing an example configuration of an image processing system 10 including an image processing apparatus 100 according to this embodiment.
- the image processing system 10 includes an image processing device 100, a vehicle 200, and a terminal device 300 in this embodiment.
- the image processing device 100 is a server capable of communicating with the vehicle 200 and the terminal device 300 and exchanging information.
- Because the image processing system 10 performs computationally heavy processing on the server side via the network, and because algorithm changes are easy to make there, it can provide fast and versatile services.
- The image processing device 100 acquires an image of the interior and/or exterior of the vehicle 200 from the vehicle 200, performs image processing on the acquired image as necessary, and transmits the image to the terminal device 300 so that a second user can view the interior and/or exterior of the vehicle 200.
- Terminal device 300 is located in a remote space away from vehicle 200 .
- The remote space is a place away from the vehicle 200, such as the place where the image processing apparatus 100 is installed or the location of the second user.
- the vehicle 200 is taken as an example of a moving body, but the moving body is not limited to the vehicle 200, and may be a moving body such as a ship or an airplane.
- a first user is an occupant on board the vehicle 200 .
- the first user is the driver of the vehicle 200, but may be an occupant in the front passenger seat or the rear seat.
- The second user wears the display device 340 of the terminal device 300 over his or her eyes and, by looking at the display screen of the display device 340, can enjoy a drive in virtual space together with the first user riding in the vehicle 200. The second user can also use the terminal device 300 to converse with the first user.
- the image processing device 100 includes a controller 110 and a communication device 120 .
- Controller 110 comprises a computer with hardware and software.
- This computer includes a ROM storing a program, a CPU executing the program stored in the ROM, and a RAM functioning as an accessible storage device.
- an MPU, DSP, ASIC, FPGA, or the like can be used instead of or together with the CPU.
- the controller 110 includes at least an image acquisition unit 111, a behavior data acquisition unit 112, a determination unit 113, an image processing unit 114, a transmission unit 115, and a reception unit 116 as functional blocks.
- Each functional block executes each function through the cooperation of hardware and software for realizing each function or executing each process.
- In this embodiment, the functions of the controller 110 are divided into six blocks, and the function of each functional block will be described; however, the functions may instead be divided into five or fewer blocks, or into seven or more functional blocks.
- the controller 110 acquires captured images captured by the in-vehicle camera 221 and/or the exterior camera 222 from the running vehicle 200, and acquires behavior data indicating the behavior of the vehicle 200 during running.
- the controller 110 Based on the behavior data, the controller 110 performs blur correction processing on the captured image so as to cancel the shaking of the captured image due to the behavior of the vehicle 200 , and transmits the captured image after the blur correction processing to the terminal device 300 .
- Each functional block included in the controller 110 will be described below.
- The image acquisition unit 111 acquires images captured by the in-vehicle camera 221 and/or the exterior camera 222.
- the captured image includes a vehicle interior image (moving body interior image) representing the interior of the vehicle such as a dashboard and pillars, and/or a landscape image representing scenery including buildings and nature outside the vehicle.
- In this embodiment, the image captured by the in-vehicle camera 221 includes the scenery image and the vehicle interior image.
- FIG. 2(a) is an example of an image captured by the in-vehicle camera 221.
- In FIG. 2(a), the image within area A, showing the scenery, is the scenery image, and the image within area B, showing the pillars and the dashboard, is the vehicle interior image.
- the behavior data acquisition unit 112 acquires behavior data that indicates the behavior of the vehicle when it is running (moving).
- the behavior is the vibration of vehicle 200 during running.
- the behavior data is data detected by behavior detection sensor 240 in vehicle 200 .
- the behavior of the vehicle 200 is represented by vertical movement (vibration) of the vehicle 200, rotation of the vehicle 200 about the vertical axis of the vehicle 200, inclination of the vehicle 200, and the like.
- the vertical direction of vehicle 200 is normal to the running surface of vehicle 200 .
- the data detected by the behavior detection sensor 240 is vibration wave data representing the vertical motion of the vehicle 200 and the rotation of the vehicle 200 about the vertical direction.
- When the data detected by the behavior detection sensor 240 includes data indicating vertical motion, the behavior data acquisition unit 112 identifies the vertical movement of the vehicle 200; when it includes data indicating rotation about the vertical direction, the unit identifies that rotation. The behavior data acquisition unit 112 may identify both the vertical movement and the rotation about the vertical axis of the vehicle 200. Note that the behavior data acquisition unit 112 may acquire, as behavior data, images captured by the in-vehicle camera 221 and/or the exterior camera 222 instead of the data detected by the behavior detection sensor 240.
- the determination unit 113 determines whether or not the vehicle behaves based on the behavior data acquired by the behavior data acquisition unit 112 . Specifically, the determination unit 113 identifies vibration waves having a specific frequency from the behavior data, and compares the amplitude value (magnitude of amplitude) of the identified vibration waves with a predetermined amplitude threshold. Then, when the amplitude value of the vibration wave is equal to or greater than the amplitude threshold value, the determination unit 113 determines that the vehicle behaves.
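The determination described above (isolate vibration components in a specific frequency band, then compare their amplitude with a threshold) can be sketched as follows. This is a minimal illustration, not the patented implementation: the frequency band and the amplitude threshold are hypothetical placeholders, since the specification leaves both to experimental or empirical values.

```python
import numpy as np

def vehicle_behaves(vibration, sample_rate_hz, band=(0.5, 8.0), amp_threshold=0.05):
    """Decide whether the vehicle 'behaves': identify vibration-wave
    components within a specific frequency band and compare their
    amplitude with a predetermined amplitude threshold.

    vibration: 1-D array of, e.g., vertical displacement samples.
    band / amp_threshold are illustrative values only (assumptions).
    """
    n = len(vibration)
    spectrum = np.fft.rfft(vibration - np.mean(vibration))
    freqs = np.fft.rfftfreq(n, d=1.0 / sample_rate_hz)
    amplitude = 2.0 * np.abs(spectrum) / n  # per-component sinusoid amplitude
    in_band = (freqs >= band[0]) & (freqs <= band[1])
    return bool(np.any(amplitude[in_band] >= amp_threshold))
```

With these placeholder values, a 2 Hz vibration whose amplitude exceeds the threshold would be judged as vehicle behavior, while a much weaker vibration would not.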
- the specific frequency of the vibration wave is a frequency in a preset specific frequency band, and the specific frequency band is a vibration frequency range that affects human motion sickness due to image shake.
- The amplitude threshold indicates the lower limit of vibration amplitude that causes VR sickness, and is determined from experimental or empirical values ("VR sickness" is described later). The determination unit 113 also determines whether or not the captured image acquired by the image acquisition unit 111 includes scenery outside the vehicle 200. Since the in-vehicle camera 221 and the exterior camera 222 are fixed to the vehicle 200, their captured images shake according to the behavior of the vehicle 200 when the vehicle behaves. In addition, since the vehicle interior image shows parts of the vehicle such as the pillars, it vibrates together with the vehicle 200.
- The actual scenery outside, on the other hand, does not vibrate with the behavior of the vehicle 200.
- Consequently, although the images captured by the vehicle interior camera 221 and the vehicle exterior camera 222 shake in accordance with the behavior of the vehicle 200, the vehicle interior image shakes in synchronization with the cameras and therefore does not appear to shake within the captured image.
- The scenery image, which does not shake together with the vehicle 200, appears to vibrate. That is, when there is behavior of the vehicle 200, the scenery image appears to shake while the vehicle interior image does not. In the example of FIG. 2, the image in area A shakes and the image in area B does not. Also, when there is behavior of the vehicle, the captured image of the vehicle exterior camera 222 shakes as a whole.
- When such a shaking image is displayed, the second user may feel "VR sickness".
- Because the second user is in a remote location, he or she does not directly feel the shaking of the vehicle 200.
- The image therefore shakes while no vibration is felt, so what is seen and what is felt do not match, producing a state like motion sickness. Such a state is "VR sickness".
- Note that the determination unit 113 may determine whether or not the vehicle behaves based on the captured image. In that case, the determination unit 113 identifies the movement of the vehicle 200 using image movement information, such as optical flow, that can be acquired from images captured by the in-vehicle camera 221 and/or the exterior camera 222, and determines whether the movement of the vehicle 200 is due to behavior. For example, if the motion of the vehicle 200 is periodic, it can be estimated that the motion is not caused by the running of the vehicle 200 but is vibration received from the road surface or vibration caused by the engine of the vehicle 200. The determination unit 113 determines whether or not there is behavior of the vehicle 200 by identifying, from the captured image, the periodic movement caused by the behavior of the vehicle 200.
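As a sketch of how such a periodicity check might work, the following hypothetical example treats the per-frame vertical image displacement (e.g. the mean vertical optical-flow component per frame) as a 1-D signal and looks for a strong peak in its normalized autocorrelation; a strong peak at a nonzero lag suggests periodic vibration rather than travel motion. The signal representation and the threshold are assumptions, not taken from the patent.

```python
import numpy as np

def motion_is_periodic(dy_per_frame, min_autocorr=0.6):
    """Rough periodicity test for a per-frame vertical image displacement
    signal. A strong normalized-autocorrelation peak at some nonzero lag
    suggests road-surface or engine vibration rather than travel motion.
    """
    x = np.asarray(dy_per_frame, dtype=float)
    x = x - x.mean()
    denom = np.dot(x, x)
    if denom == 0.0:
        return False  # no motion at all
    n = len(x)
    # Normalized autocorrelation for lags 1 .. n//2 - 1.
    ac = np.array([np.dot(x[:n - k], x[k:]) / denom for k in range(1, n // 2)])
    return bool(ac.max() >= min_autocorr)
```

A sinusoidal displacement trace scores close to 1 at its period, while uncorrelated frame-to-frame noise stays well below the (assumed) threshold.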
- Based on the determination result of the determination unit 113 and the behavior data acquired by the behavior data acquisition unit 112, the image processing unit 114 performs blur correction processing on the captured image so as to cancel the shaking of the captured image due to the behavior of the vehicle 200.
- The image processing unit 114 performs blur correction processing on the captured image so as to cancel the fluctuation of the scenery image.
- the image processing unit 114 performs blur correction processing on the entire captured image.
- the blur correction process is a process of processing a scenery image so as to cancel the shake of the scenery image.
- A known image processing method for correcting blur may be used for the blur correction processing. For example, when the scenery image shakes at a predetermined cycle, the image processing unit 114 performs blur correction processing by applying an opposite-phase displacement to the scenery image. As a result, image shaking caused by the behavior of the vehicle 200 can be reduced.
- The image processing unit 114 calculates the blur amount and blur period of the image caused by the shaking of the scenery image from the vibration amplitude and vibration frequency of the vehicle 200 identified by the behavior data acquisition unit 112, and corrects the position of the scenery image in accordance with the calculated blur period so as to apply a displacement that returns the calculated blur amount.
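A minimal sketch of this opposite-phase correction, under the assumption (not stated in the patent) that the shake can be modeled as a vertical sinusoid whose amplitude and frequency come from the behavior data:

```python
import numpy as np

def antiphase_shift(frame, t, amp_px, freq_hz):
    """Cancel a modeled vertical shake by applying the opposite displacement.

    Assumed shake model: offset(t) = amp_px * sin(2*pi*freq_hz*t) pixels.
    frame: H x W array; t: capture time in seconds. np.roll stands in for
    a proper sub-pixel warp in a real implementation.
    """
    offset = int(round(amp_px * np.sin(2.0 * np.pi * freq_hz * t)))
    return np.roll(frame, -offset, axis=0)
```

At the peak of the modeled shake the frame is shifted back by the full blur amount; between peaks the applied displacement follows the blur period with opposite phase.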
- FIG. 2(b) is an example of an image captured by the in-vehicle camera 221 when the vehicle 200 behaves.
- FIGS. 2(a) and 2(b) are images in which the vehicle 200 is in the same position and the same scenery is seen through the windshield.
- FIG. 2B shows a captured image in a state in which the landscape image is shaken due to the behavior of the vehicle 200.
- In FIG. 2(b), the scenery image is rotated clockwise due to the shaking.
- In FIG. 2(b), the image of the portion indicated by arrow P is omitted in order to illustrate the displacement of the scenery image caused by the shaking; in practice, scenery appears in that portion as well.
- the image processing unit 114 corrects the captured image by rotating the scenery image counterclockwise by the displacement indicated by the arrow P.
- In the example of FIG. 2(b), the blur correction processing is performed by rotating the scenery image; however, depending on the shake, the position of the image may instead be corrected by translation.
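For the rotational case, the counterclockwise correction by the displacement of arrow P can be sketched as an image rotation about the frame center. This is a hedged illustration using nearest-neighbor resampling in pure NumPy; a real implementation would use a proper warping routine. Here a positive angle rotates the image clockwise in image coordinates (y pointing down), so passing the negative of the shake angle cancels a clockwise shake.

```python
import numpy as np

def rotate_about_center(img, angle_rad):
    """Nearest-neighbor rotation of a 2-D image about its center.
    Positive angle_rad = clockwise in image coordinates; pass -shake_angle
    to apply the counterclockwise correction of FIG. 2(b)."""
    h, w = img.shape
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    out = np.zeros_like(img)
    cos_a, sin_a = np.cos(angle_rad), np.sin(angle_rad)
    ys, xs = np.mgrid[0:h, 0:w]
    # Inverse mapping: sample the source at the un-rotated coordinate.
    src_x = cos_a * (xs - cx) + sin_a * (ys - cy) + cx
    src_y = -sin_a * (xs - cx) + cos_a * (ys - cy) + cy
    sx = np.round(src_x).astype(int)
    sy = np.round(src_y).astype(int)
    valid = (sx >= 0) & (sx < w) & (sy >= 0) & (sy < h)
    out[valid] = img[sy[valid], sx[valid]]
    return out
```

Pixels whose source coordinate falls outside the frame are left at zero, which is one reason a boundary gap (discussed below for FIG. 3) can appear after correction.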
- The image processing unit 114 can correct the deviation between the scenery image after blur correction processing and the vehicle interior image by performing image processing on the boundary portion between them.
- FIG. 3 is an example of an image captured by the in-vehicle camera 221. FIG. 3(a) is a captured image in which the deviation between the scenery image after blur correction processing and the vehicle interior image has not been corrected. FIG. 3(b) is a captured image in which that deviation has been corrected. As shown in FIG. 3(a), the scenery image shakes due to the behavior of the vehicle 200, and blur correction processing is performed on the scenery image so as to cancel the shaking.
- the image processing unit 114 identifies the vehicle interior image and the scenery image from the captured image.
- In the example of FIG. 3(a), the image processing unit 114 performs blur correction processing such that the scenery image moves diagonally toward the upper right of the figure.
- the image processing unit 114 identifies pillars included in the vehicle interior image as boundaries.
- The boundary between the vehicle interior image and the scenery image is therefore the boundary between the image that has undergone blur correction processing and the image that has not.
- Since a gap appears at this boundary, the image processing unit 114 performs image processing on the pillars so as to fill the gap. As shown in FIG. 3(b), the image processing unit 114 thickens the pillars so as to fill the gap. Note that, not limited to the pillars, the image processing unit 114 may perform image processing that enlarges the boundary portion of the vehicle interior image so that the gap at the boundary between the image that has undergone blur correction processing and the image that has not is eliminated.
- Here, the boundary portion is the boundary between the scenery image and the vehicle interior image.
- Conversely, the image processing unit 114 may perform image processing that enlarges the boundary portion of the scenery image so as to fill the gap at the boundary between the image that has undergone blur correction processing and the image that has not. Further, the image processing unit 114 may process not only the size but also the color of the boundary portion so that the difference between the corrected and uncorrected portions of the captured image becomes less noticeable.
- After the blur correction processing, the image processing unit 114 synthesizes the scenery image that has undergone the blur correction processing with the vehicle interior image that has not. Note that when the entire captured image is a scenery image and blur correction processing has been performed on it, the image processing unit 114 need not synthesize images. When the vehicle interior image has undergone image processing to fill the gap at the boundary with the uncorrected image, the image processing unit 114 synthesizes the scenery image after blur correction processing with the vehicle interior image after that image processing.
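The thickening-and-synthesis steps above can be sketched as follows. The 4-neighbor dilation standing in for "thickening the pillars" is an illustrative choice, and the interior mask and iteration count are hypothetical; the patent does not prescribe a particular algorithm.

```python
import numpy as np

def dilate_mask(mask, iters=1):
    """Thicken a boolean interior mask by 4-neighbor dilation, mimicking
    the 'thicken the pillars to hide the boundary gap' step of FIG. 3(b)."""
    m = mask.copy()
    for _ in range(iters):
        grown = m.copy()
        grown[1:, :] |= m[:-1, :]
        grown[:-1, :] |= m[1:, :]
        grown[:, 1:] |= m[:, :-1]
        grown[:, :-1] |= m[:, 1:]
        m = grown
    return m

def composite(corrected_scenery, interior, interior_mask, thicken=1):
    """Overlay the uncorrected interior (with a thickened boundary) onto
    the blur-corrected scenery so that no gap shows at the seam."""
    mask = dilate_mask(interior_mask, thicken)
    return np.where(mask, interior, corrected_scenery)
```

In this sketch, every pixel under the (thickened) interior mask keeps the uncorrected in-vehicle image, and all other pixels take the blur-corrected scenery.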
- When the image processing unit 114 performs blur correction processing on the captured image, it stores the image data after the blur correction processing in memory.
- the image data after the blur correction processing is data of the captured image that has undergone the synthesis processing.
- When the vehicle does not behave in a way that would cause VR sickness, the image processing unit 114 does not perform blur correction processing on the captured image and stores the uncorrected captured image data in memory.
- the transmission unit 115 transmits to the terminal device 300 via the communication device 120 a control signal for displaying the image processed by the image processing unit 114 on the terminal device 300 .
- the receiving unit 116 receives operation information input to the input device 320 by the second user.
- the operation information is an operation command or the like for acquiring a VR image with respect to the vehicle 200 .
- Vehicle 200 includes in-vehicle controller 210 , in-vehicle camera 221 , exterior camera 222 , in-vehicle communication device 230 , and behavior detection sensor 240 .
- the in-vehicle controller 210 controls the functions of each device and sensor of the vehicle 200 .
- the in-vehicle controller 210 outputs the image captured by the in-vehicle camera 221 and/or the exterior camera 222 and the detection data detected by the behavior detection sensor 240 to the in-vehicle communication device 230 .
- the in-vehicle camera 221 is fixed in the vehicle interior, captures an image of the interior and/or exterior of the vehicle from a position fixed in the vehicle interior, and outputs the captured image data to the in-vehicle controller 210 .
- the in-vehicle camera 221 images the scenery outside the vehicle through the window.
- the in-vehicle camera 221 may be a camera that can be removed from a predetermined fixed position inside the vehicle.
- a mobile terminal such as a smartphone may be brought into the vehicle, fixed to a folder or the like, and used as the in-vehicle camera 221 .
- the exterior camera 222 is fixed to the body of the vehicle 200 or the like, captures an image of the scenery outside the vehicle interior and the surroundings of the vehicle 200 , and outputs the captured image data to the in-vehicle controller 210 .
- There need not be only one vehicle interior camera 221 and one vehicle exterior camera 222; a plurality of each may be used. When a plurality of cameras are used, a plurality of images may be combined to form the vehicle interior image and the scenery image, which can improve the resolution of the captured image.
- The in-vehicle communication device 230 communicates with the communication device 120 to exchange data. Specifically, the in-vehicle communication device 230 receives a signal including an operation command for acquiring a VR image of the vehicle 200 from the communication device 120, and transmits a signal including the captured image and the behavior data to the communication device 120.
- Behavior detection sensor 240 is a sensor that detects the behavior of vehicle 200 and is provided in vehicle 200 .
- a G sensor or the like is used for the behavior detection sensor 240 .
- the behavior detection sensor 240 is not limited to the sensor provided in the vehicle 200 , and a mobile terminal with a built-in acceleration sensor may be brought into the vehicle and used as the behavior detection sensor 240 .
- Behavior detection sensor 240 which is a mobile terminal, connects to in-vehicle controller 210 via an in-vehicle network and transmits data indicating behavior to in-vehicle controller 210 .
- the terminal device 300 is a device used by the second user in a remote space, and outputs a virtual image when viewed from a predetermined position in the vehicle 200 such as the front passenger's seat.
- the terminal device 300 may be installed in a remote space, for example.
- the terminal device 300 includes a terminal controller 310 , an input device 320 , a terminal communication device 330 and a display device 340 .
- the terminal controller 310 controls the functions of each device of the terminal device 300 . Specifically, the terminal controller 310 outputs a control signal for outputting the captured image indicated by the image data received by the terminal communication device 330 from the display device 340 to the second user.
- the input device 320 is a device operated by the second user, and is used by the second user to input operation information to the image processing device 100 for acquiring a VR image viewed from a predetermined position of the vehicle 200 .
- the terminal communication device 330 communicates with the image processing device 100 and the in-vehicle communication device 230 of the vehicle 200 to exchange information. Specifically, the terminal communication device 330 receives image data from the image processing device 100 and transmits operation information to the image processing device 100 .
- the display device 340 is a device that outputs the environment inside and outside the vehicle at a predetermined position inside or outside the vehicle to the second user.
- the display device 340 is, for example, a glasses-type or goggles-type VR head-mounted display, and is worn on the head of the second user.
- the second user can visually recognize virtual reality (VR) in which the environment inside and outside the vehicle is displayed via the terminal device 300 .
- the display device 340 outputs a virtual image representing the environment inside and outside the vehicle, and virtual audio information representing audio heard from a predetermined position inside or outside the vehicle.
- the second user can see and hear the scenery as if he or she were in the vehicle 200 .
- the second user can also see the scenery outside the vehicle 200 from the captured image of the vehicle exterior camera 222 .
- the display device 340 has, for example, a non-transmissive display, a speaker, and the like.
- FIG. 4 is a sequence chart for explaining the data flow in the image processing system 10.
- In step S1, the second user operates the input device 320 to switch the VR system from off to on.
- the VR system is a system for realizing virtual reality of the environment inside and outside the vehicle by displaying images on the display device 340 and providing a space as if the second user were in the vehicle 200 .
- the terminal controller 310 transmits an operation signal for acquiring a VR image to the image processing device 100 .
- In step S2, when the controller 110 of the image processing device 100 receives the VR request signal from the communication device 120, it starts VR request processing.
- the controller 110 transmits a VR request signal to the in-vehicle communication device 230 .
- In step S3, the in-vehicle controller 210 determines whether or not to accept the VR request based on the VR request signal received by the in-vehicle communication device 230. For example, when the VR request signal is received, the in-vehicle controller 210 displays on the in-vehicle display a selection screen asking whether to accept the VR request. When the first user performs an operation to accept the VR request, the in-vehicle controller 210 determines that the VR request can be accepted.
- On the other hand, when the first user does not perform an operation to accept the VR request, the in-vehicle controller 210 determines that the VR request cannot be accepted. When it is determined that the VR request can be accepted, the in-vehicle controller 210 uses the in-vehicle communication device 230 to transmit a VR permission signal to the image processing device 100.
- In step S4, the controller 110 of the image processing device 100 receives the VR permission signal from the vehicle 200, starts the VR system, and transmits an image request signal to the vehicle 200 through the communication device 120.
- In step S5, the in-vehicle controller 210 acquires VR data from the in-vehicle camera 221, the exterior camera 222, and the behavior detection sensor 240.
- the VR data is an image captured by the in-vehicle camera 221 and/or the exterior camera 222 and data (behavior data) detected by the behavior detection sensor 240 .
- The in-vehicle controller 210 then transmits the acquired VR data to the image processing device 100.
- In step S6, the controller 110 of the image processing device 100 determines whether blur correction processing is necessary for the captured image based on the captured image and the behavior data included in the VR data. If the captured image includes a scenery image and there is behavior of the vehicle 200, the controller 110 determines that blur correction processing is necessary. If the captured image does not include a scenery image, or if the vehicle 200 does not behave, the controller 110 determines that blur correction processing is not necessary.
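The step-S6 decision reduces to a simple conjunction of the two determinations, which can be sketched as:

```python
def needs_blur_correction(has_scenery_image: bool, vehicle_behaves: bool) -> bool:
    """Blur correction is needed only when the captured image includes a
    scenery image AND the vehicle exhibits behavior (step S6)."""
    return has_scenery_image and vehicle_behaves
```

If either condition is false, the captured image is passed on without blur correction.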
- When it is determined that blur correction processing is necessary, the controller 110 performs blur correction processing on the captured image in step S7.
- In step S8, the controller 110 generates an image for VR display based on the image after blur correction processing.
- the image processing unit 114 of the controller 110 synthesizes the scenery image that has undergone blur correction processing with the vehicle interior image, and generates a screen for VR display based on the synthesized image.
- the image processing unit 114 also processes the image so that an image representing a character is included in the synthesized image.
- the image processing unit 114 generates a captured image for VR display.
- the controller 110 then transmits data including the captured image for VR display to the terminal device 300 .
- the character display is performed to protect the privacy of the first user. Instead of displaying the character, only the first user's face area may be masked.
- In step S9, the terminal controller 310 outputs the captured image for VR display received by the terminal communication device 330 to the display device 340, and the display device 340 displays the VR image.
- FIG. 5 is a flowchart showing the control flow of the controller 110.
- In step S11, the image acquisition unit 111 of the controller 110 acquires the captured image captured by the vehicle interior camera 221 and/or the vehicle exterior camera 222.
- In step S12, the behavior data acquisition unit 112 acquires behavior data indicating the behavior of the vehicle 200 when moving.
- In step S13, the determination unit 113 determines, based on the behavior data, whether the vehicle 200 exhibits behavior. When it is determined that the vehicle exhibits no behavior, the controller 110 executes the control processing in step S22, which will be described later.
- In step S14, the determination unit 113 determines whether or not the captured image includes the scenery outside the vehicle 200. Whether or not scenery is included is determined by whether or not the captured image includes a scenery image. When the captured image does not include the scenery outside the vehicle 200, the controller 110 executes the control processing in step S22, which will be described later.
- In step S15, the determination unit 113 determines whether the captured image includes an image of the interior of the vehicle 200 (vehicle interior image). When the captured image does not include the vehicle interior image, the controller 110 executes the control processing in step S20, which will be described later.
- In step S16, the image processing unit 114 identifies the scenery image and the vehicle interior image from the captured image.
- In step S17, the image processing unit 114 performs blur correction processing on the scenery image, based on the behavior data, so as to cancel the shaking of the captured image caused by the behavior.
- the image processing unit 114 does not perform blur correction processing on the vehicle interior image. That is, the image processing unit 114 performs blur correction processing only on the image that is affected by the behavior of the vehicle 200, not on the image that is unaffected by it. This makes it possible to apply different image processing, according to image shake, to the images of the inside and the outside of the vehicle 200.
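The region-selective processing can be illustrated with a toy sketch in which an "image" is a grid of pixel values and a mask marks the scenery region; only masked pixels are shifted back by the estimated shake. The names and the simple row-shift model are illustrative assumptions, not the patent's implementation:

```python
def correct_scenery_only(frame, scenery_mask, shake_dy):
    """Shift scenery pixels vertically to cancel a shake of shake_dy rows,
    leave interior pixels untouched, and recomposite the two regions."""
    h = len(frame)
    out = []
    for y in range(h):
        row = []
        for x in range(len(frame[y])):
            if scenery_mask[y][x]:
                # scenery: sample from the shaken position to cancel the motion
                src = min(max(y + shake_dy, 0), h - 1)
                row.append(frame[src][x])
            else:
                # vehicle interior: unaffected by behavior, no correction
                row.append(frame[y][x])
        out.append(row)
    return out

frame = [[1, 10], [2, 20], [3, 30]]                 # column 0 = scenery, column 1 = interior
mask = [[True, False], [True, False], [True, False]]
assert correct_scenery_only(frame, mask, shake_dy=1) == [[2, 10], [3, 20], [3, 30]]
```

The interior column passes through unchanged while the scenery column is resampled, mirroring steps S16 to S18.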
- In step S18, the image processing unit 114 synthesizes the scenery image that has undergone blur correction processing with the vehicle interior image.
- In step S19, the image processing unit 114 performs image processing so that the synthesized captured image becomes an image for VR display, and stores data of the processed image in the memory.
- When it is determined in step S15 that the captured image does not include the vehicle interior image, the image processing unit 114 performs blur correction processing on the entire captured image in step S20. In step S21, the image processing unit 114 performs image processing so that the image after blur correction processing becomes an image for VR display, and stores data of the processed image in the memory.
- In step S22, the image processing unit 114 performs image processing so that the captured image acquired by the image acquisition unit 111 becomes an image for VR display, and stores the processed image data in the memory.
- In step S23, the transmission unit 115 transmits the captured image saved in the memory to the terminal device 300 via the communication device 120.
- As described above, the image processing device 100 acquires a captured image captured by the in-vehicle camera 221 and/or the exterior camera 222 fixed to the vehicle 200, acquires behavior data indicating the behavior of the vehicle 200 when moving, and determines whether the captured image includes the scenery outside the vehicle 200. When it is determined that the captured image includes the scenery outside the vehicle 200, blur correction processing is performed on the captured image, based on the behavior data acquired by the behavior data acquisition unit 112, so as to cancel the shaking of the captured image caused by the behavior. As a result, it is possible to suppress image shaking caused by the behavior of the vehicle 200 when it is moving.
- the image processing unit 114 identifies a scenery image and a vehicle interior image from the captured image, performs blur correction processing on the scenery image, and synthesizes the scenery image processed by the blur correction processing with the vehicle interior image that has not undergone blur correction processing.
- the image processing device 100 includes a transmission unit 115 that transmits a signal including the captured image synthesized by the image processing unit 114 to the display device 340 located outside the vehicle 200.
- Thereby, the second user can feel as if he or she is actually riding in the vehicle, and an image with reduced VR motion sickness can be provided to the second user.
- Further, the behavior data acquisition unit 112 identifies the vertical movement of the vehicle 200 and/or the rotation of the vehicle 200 about the vertical direction based on the behavior data, and the image processing unit 114 performs blur correction processing on the captured image so as to cancel the shaking of the captured image caused by the movement and/or the rotation. Thereby, the behavior of the vehicle 200 can be identified with high accuracy.
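As one way to picture the cancellation, a corrective pixel offset can be derived from the measured vertical displacement and the yaw angle. This is a hedged sketch under a simple small-angle camera model; the function name, sign conventions, and the pixel-scale constants are assumptions, not values from the patent:

```python
import math

def corrective_offset(dz_m, yaw_rad, pixels_per_meter=500.0, focal_px=800.0):
    """Return the (dx, dy) pixel shift that cancels image shake caused by a
    vertical displacement dz_m (meters) and a rotation yaw_rad (radians)
    about the vehicle's vertical axis."""
    dy = -dz_m * pixels_per_meter        # vertical bounce moves the image up/down
    dx = -focal_px * math.tan(yaw_rad)   # yaw pans the scenery horizontally
    return dx, dy

dx, dy = corrective_offset(0.01, 0.0)    # 1 cm upward bounce, no rotation
assert dx == 0.0 and dy == -5.0          # shift the scenery 5 px back down
```

Applying the returned offset to the scenery region counteracts both behavior components identified from the behavior data.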
- Further, the behavior data acquisition unit 112 identifies the magnitude of the behavior and the frequency of the behavior based on the behavior data, and the image processing unit 114 performs blur correction processing on the captured image when the magnitude of the behavior having a specific frequency is equal to or greater than a threshold value. This makes it possible to perform blur correction processing on the captured image when a behavior that causes VR sickness is occurring.
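The frequency gate can be sketched with a single-bin discrete Fourier transform of the behavior samples: correction runs only when the amplitude at the frequency of interest reaches the threshold. The function names, the bin choice, and the threshold values are illustrative assumptions:

```python
import math

def amplitude_at_bin(samples, k):
    """Amplitude of the k-th DFT bin of a real-valued behavior signal."""
    n = len(samples)
    re = sum(s * math.cos(2 * math.pi * k * i / n) for i, s in enumerate(samples))
    im = sum(s * math.sin(2 * math.pi * k * i / n) for i, s in enumerate(samples))
    return 2.0 * math.hypot(re, im) / n

def should_correct(samples, k, threshold):
    """Gate blur correction on the behavior amplitude at frequency bin k."""
    return amplitude_at_bin(samples, k) >= threshold

n = 64
# Vertical-acceleration samples oscillating at bin 5 with amplitude 3
shaking = [3.0 * math.sin(2 * math.pi * 5 * i / n) for i in range(n)]
assert should_correct(shaking, k=5, threshold=2.5)        # strong enough -> correct
assert not should_correct(shaking, k=5, threshold=3.5)    # below threshold -> skip
```

In practice the "specific frequency" would be chosen in the band known to provoke VR sickness; here bin 5 merely stands in for it.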
- the image processing unit 114 thickens the window frame included in the vehicle interior image. As a result, the deviation between the scenery image after the blur correction process and the inside of the vehicle can be eliminated.
- the image processing unit 114 corrects the captured image by enlarging one of the scenery image and the vehicle interior image. As a result, the deviation between the image after blur correction processing and the image without blur correction processing can be eliminated.
- the image processing unit 114 performs color processing on the boundary portion between the landscape image after blur correction processing and the vehicle interior image. Accordingly, it is possible to provide the user with an image that does not look unnatural between an image that has been subjected to blur correction processing and an image that has not been subjected to blur correction processing.
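The color processing at the boundary can be pictured as a linear cross-fade over a few pixels, so the seam between the corrected scenery and the uncorrected interior is not visible. This is a minimal grayscale sketch; the blend width and names are assumptions:

```python
def blend_boundary(scenery_val, interior_val, width=4):
    """Return a strip of `width` pixel values fading linearly from the
    scenery color to the interior color across the seam."""
    strip = []
    for i in range(width):
        t = (i + 1) / (width + 1)   # 0 < t < 1 across the strip
        strip.append(round(scenery_val * (1 - t) + interior_val * t))
    return strip

# Fading from a bright scenery pixel (200) to a dark interior pixel (50)
assert blend_boundary(200, 50, width=4) == [170, 140, 110, 80]
```

A per-channel version of the same fade would be applied along the whole window-frame boundary.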
- Further, the image processing unit 114 does not perform blur correction processing on the captured image when the captured image is a vehicle interior image, and performs blur correction processing on the captured image when the captured image is a scenery image. As a result, blur correction processing is performed in accordance with the timing at which an image that causes VR sickness is displayed, and image shaking that causes VR sickness can be prevented.
- the image processing unit 114 has a correction mode in which blur correction processing is performed on the captured image and a non-correction mode in which blur correction processing is not performed on the captured image. Thereby, the mode with correction and the mode without correction can be selected according to the user's command.
- the image processing device 100 may distribute the images for VR display generated by the image processing unit 114 over a network. This allows the second user to view VR images online or offline.
- the image processing device 100 may be provided in the vehicle 200. That is, the in-vehicle controller 210 may have the functional blocks of the controller 110, such as the image acquisition unit 111, and the VR system may be constructed by communication between the vehicle 200 and the terminal device 300 without going through a server. Further, when the in-vehicle controller 210 has the functional blocks of the controller 110, the blur correction processing of the image processing unit 114 may be performed by a mobile terminal equipped with a camera that can be brought into the vehicle.
- when the in-vehicle camera 221 is fixed inside the vehicle via a vibration-absorbing head (camera mounting portion), the image processing unit 114 may perform the blur correction processing on the vehicle interior image and may not perform the blur correction processing on the scenery image.
- the head has a mechanism that moves a pedestal that fixes the camera according to the behavior of the vehicle 200, for example.
- the head may have a mechanism with a support member such as a tripod.
- in this case, the behavior is transmitted to interior equipment of the vehicle such as the dashboard and pillars, but not to the camera. Therefore, when the image captured by the in-vehicle camera 221 includes a scenery image, the scenery image does not shake, but the vehicle interior image shakes. In such a case, the image processing unit 114 performs blur correction processing on the vehicle interior image and does not perform blur correction processing on the scenery image. The image processing unit 114 synthesizes the vehicle interior image processed by the blur correction processing with the scenery image not processed by it, and stores data of the synthesized captured image in the memory.
- in this case, the image processing unit 114 identifies a scenery image and a vehicle interior image from the captured image, performs blur correction processing on the vehicle interior image, and synthesizes the vehicle interior image processed by the blur correction processing with the scenery image that has not been processed by it. That is, in the present embodiment, the image processing unit 114 identifies the scenery image and the vehicle interior image from the captured image, performs blur correction processing on either one of them, and synthesizes the image processed by the blur correction processing with the other image not so processed. Thereby, when viewing the synthesized image, the second user can feel as if he or she is actually riding in the vehicle, and can be provided with an image with reduced VR motion sickness.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Image Processing (AREA)
- Studio Devices (AREA)
Abstract
Description
An embodiment of the image processing device according to the present invention will be described with reference to the drawings. FIG. 1 is a block diagram showing an example of the configuration of an image processing system 10 including the image processing device 100 according to the present embodiment. As shown in FIG. 1, in the present embodiment, the image processing system 10 includes the image processing device 100, a vehicle 200, and a terminal device 300. The image processing device 100 is a server capable of communicating with the vehicle 200 and the terminal device 300 to exchange information. By constructing a network among the image processing device 100, the vehicle 200, and the terminal device 300, the image processing system 10 can perform computationally intensive task processing over the network and easily modify algorithms, making it possible to provide fast and versatile services. The image processing device 100 acquires images of the interior and/or exterior of the vehicle 200 from the vehicle 200, performs image processing on the acquired images as necessary, and transmits the images to the terminal device 300 so that the images of the interior and/or exterior of the vehicle 200 can be displayed to a second user. The terminal device 300 is located in a remote space away from the vehicle 200. The remote space is away from the place where the image processing device 100 is installed and from the vehicle 200, and is, for example, the residence of the second user. In the following description of the image processing system 10, the vehicle 200 is taken as an example of a moving body, but the moving body is not limited to the vehicle 200 and may be, for example, a ship, an airplane, or another moving body.
100 … image processing device
111 … image acquisition unit
112 … behavior data acquisition unit
113 … determination unit
114 … image processing unit
115 … transmission unit
116 … reception unit
120 … communication device
200 … vehicle
300 … terminal device
Claims (11)
- An image processing device comprising: an image acquisition unit that acquires a captured image captured by an imaging device fixed to a moving body; a behavior data acquisition unit that acquires behavior data indicating the behavior of the moving body when moving; a determination unit that determines whether or not the captured image includes scenery outside the moving body; and an image processing unit that, when it is determined that the captured image includes scenery outside the moving body, performs blur correction processing on the captured image, based on the behavior data acquired by the behavior data acquisition unit, so as to cancel the shaking of the captured image caused by the behavior.
- The image processing device according to claim 1, wherein the image processing unit identifies, from the captured image, a scenery image representing the scenery outside the moving body and a moving-body interior image representing the interior of the moving body, performs the blur correction processing on either one of the scenery image and the moving-body interior image, and synthesizes, of the scenery image and the moving-body interior image, the one image processed by the blur correction processing with the other image not processed by the blur correction processing.
- The image processing device according to claim 2, further comprising a transmission unit that transmits a signal including the captured image synthesized by the image processing unit to a display device located outside the moving body.
- The image processing device according to any one of claims 1 to 3, wherein the behavior data acquisition unit identifies, based on the behavior data, vertical movement of the moving body and/or rotation of the moving body about the vertical direction of the moving body, the vertical direction of the moving body being the direction normal to the running surface of the moving body, and the image processing unit performs the blur correction processing on the captured image so as to cancel the shaking of the captured image caused by the movement and/or the rotation.
- The image processing device according to any one of claims 1 to 4, wherein the behavior data acquisition unit identifies the magnitude of the behavior and the frequency of the behavior based on the behavior data, and the image processing unit performs the blur correction processing on the captured image when the magnitude of the behavior having a specific frequency is equal to or greater than a threshold value.
- The image processing device according to any one of claims 1 to 5, wherein the image processing unit identifies, from the captured image, a scenery image representing the scenery outside the moving body and a moving-body interior image representing the interior of the moving body, performs the blur correction processing on the scenery image, and thickens a window frame of the moving body included in the moving-body interior image.
- The image processing device according to any one of claims 1 to 5, wherein the image processing unit identifies, from the captured image, a scenery image representing the scenery outside the moving body and a moving-body interior image representing the interior of the moving body, performs the blur correction processing on the scenery image, and enlarges either one of the scenery image and the moving-body interior image.
- The image processing device according to any one of claims 1 to 5, wherein the image processing unit identifies, from the captured image, a scenery image representing the scenery outside the moving body and a moving-body interior image representing the interior of the moving body, performs the blur correction processing on the scenery image, and performs color processing on a boundary portion between the scenery image after the blur correction processing and the moving-body interior image.
- The image processing device according to any one of claims 1 to 8, wherein the image processing unit does not perform the blur correction processing on the captured image when the captured image is a moving-body interior image representing the interior of the moving body, and performs the blur correction processing on the captured image when the captured image is a scenery image representing the scenery outside the moving body.
- The image processing device according to any one of claims 1 to 7, wherein the image processing unit can select between a mode in which the blur correction processing is performed on the captured image and a mode in which the blur correction processing is not performed on the captured image.
- An image processing method for performing image processing by a processor, the method comprising: acquiring a captured image captured by an imaging device fixed to a moving body; acquiring behavior data indicating the behavior of the moving body when moving; determining whether or not the captured image includes scenery outside the moving body; and, when it is determined that the captured image includes scenery outside the moving body, performing blur correction processing on the captured image, based on the behavior data, so as to cancel the shaking of the captured image caused by the behavior.
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2023506365A JPWO2022195312A1 (ja) | 2021-03-18 | 2021-03-18 | |
CN202180095604.7A CN117178561A (zh) | 2021-03-18 | 2021-03-18 | 图像处理装置和图像处理方法 |
US18/282,745 US20240163555A1 (en) | 2021-03-18 | 2021-03-18 | Image processing device and image processing method |
PCT/IB2021/000167 WO2022195312A1 (ja) | 2021-03-18 | 2021-03-18 | 画像処理装置及び画像処理方法 |
EP21931373.1A EP4311218A4 (en) | 2021-03-18 | 2021-03-18 | IMAGE PROCESSING DEVICE AND IMAGE PROCESSING METHOD |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/IB2021/000167 WO2022195312A1 (ja) | 2021-03-18 | 2021-03-18 | 画像処理装置及び画像処理方法 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022195312A1 true WO2022195312A1 (ja) | 2022-09-22 |
Family
ID=83319894
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/IB2021/000167 WO2022195312A1 (ja) | 2021-03-18 | 2021-03-18 | 画像処理装置及び画像処理方法 |
Country Status (5)
Country | Link |
---|---|
US (1) | US20240163555A1 (ja) |
EP (1) | EP4311218A4 (ja) |
JP (1) | JPWO2022195312A1 (ja) |
CN (1) | CN117178561A (ja) |
WO (1) | WO2022195312A1 (ja) |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2017028527A (ja) * | 2015-07-23 | 2017-02-02 | 株式会社Jvcケンウッド | 動画撮影装置、動画撮影方法及びプログラム |
JP2019033408A (ja) * | 2017-08-09 | 2019-02-28 | キヤノン株式会社 | 撮像装置およびその制御方法 |
JP2019092001A (ja) * | 2017-11-13 | 2019-06-13 | キヤノン株式会社 | 情報処理装置、情報処理方法及びプログラム |
JP2020129019A (ja) | 2019-02-07 | 2020-08-27 | キヤノン株式会社 | 撮像装置およびその制御方法 |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP5233631B2 (ja) * | 2008-12-11 | 2013-07-10 | ソニー株式会社 | 振れ補正装置と振れ補正方法および撮像装置 |
- 2021-03-18 CN CN202180095604.7A patent/CN117178561A/zh active Pending
- 2021-03-18 EP EP21931373.1A patent/EP4311218A4/en active Pending
- 2021-03-18 WO PCT/IB2021/000167 patent/WO2022195312A1/ja active Application Filing
- 2021-03-18 US US18/282,745 patent/US20240163555A1/en active Pending
- 2021-03-18 JP JP2023506365A patent/JPWO2022195312A1/ja active Pending
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2017028527A (ja) * | 2015-07-23 | 2017-02-02 | 株式会社Jvcケンウッド | 動画撮影装置、動画撮影方法及びプログラム |
JP2019033408A (ja) * | 2017-08-09 | 2019-02-28 | キヤノン株式会社 | 撮像装置およびその制御方法 |
JP2019092001A (ja) * | 2017-11-13 | 2019-06-13 | キヤノン株式会社 | 情報処理装置、情報処理方法及びプログラム |
JP2020129019A (ja) | 2019-02-07 | 2020-08-27 | キヤノン株式会社 | 撮像装置およびその制御方法 |
Non-Patent Citations (1)
Title |
---|
See also references of EP4311218A4 |
Also Published As
Publication number | Publication date |
---|---|
CN117178561A (zh) | 2023-12-05 |
US20240163555A1 (en) | 2024-05-16 |
EP4311218A4 (en) | 2024-05-01 |
EP4311218A1 (en) | 2024-01-24 |
JPWO2022195312A1 (ja) | 2022-09-22 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN106062826B (zh) | 图像生成装置以及图像生成方法 | |
JP5652037B2 (ja) | 模擬映像生成装置、方法、プログラム | |
EP0709816B1 (en) | Display apparatus and its control method | |
JP2017185988A (ja) | 車両用装置、車両用プログラム、フィルタ設計プログラム | |
JP2009017020A (ja) | 画像処理装置及び表示画像生成方法 | |
JP5739624B2 (ja) | 光学機器、撮像装置、及び制御方法 | |
JP2008139600A (ja) | 表示装置 | |
US7940295B2 (en) | Image display apparatus and control method thereof | |
US11887281B2 (en) | Information processing device, head-mounted display, and image displaying method | |
WO2019098198A1 (ja) | 画像生成装置、ヘッドマウントディスプレイ、画像生成システム、画像生成方法、およびプログラム | |
JP2001346200A (ja) | 画像切出し/表示システム | |
JP2021180425A (ja) | 遠隔制御システムとその遠隔作業装置、映像処理装置およびプログラム | |
WO2022195312A1 (ja) | 画像処理装置及び画像処理方法 | |
CN108058643B (zh) | 车辆后方区域图像显示装置以及存储车辆后方区域图像显示程序的计算机可读介质 | |
JP2023179496A (ja) | 画像処理装置及び画像処理方法 | |
WO2017168953A1 (ja) | 車両用装置、車両用プログラム、フィルタ設計プログラム | |
WO2017108990A1 (en) | Rear view device for a vehicle | |
US20230071690A1 (en) | Remote control system, remote operation apparatus, video image processing apparatus, and computer-readable medium | |
JP4696825B2 (ja) | 車両用死角映像表示装置 | |
JP6655751B1 (ja) | 映像表示制御装置、方法およびプログラム | |
JP2011234272A (ja) | 携帯端末装置、サーバ装置および画像補正方法 | |
CN113519166A (zh) | 远程操作指示系统和安装式设备 | |
JPWO2022195312A5 (ja) | ||
JP2024084218A (ja) | 画像処理装置、画像処理方法 | |
JP7161470B2 (ja) | 画像処理装置およびその制御方法 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 21931373 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2023506365 Country of ref document: JP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 18282745 Country of ref document: US |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2021931373 Country of ref document: EP |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
ENP | Entry into the national phase |
Ref document number: 2021931373 Country of ref document: EP Effective date: 20231018 |