WO2019000947A1 - 图像处理方法和系统、存储介质和移动系统 - Google Patents

图像处理方法和系统、存储介质和移动系统 Download PDF

Info

Publication number
WO2019000947A1
WO2019000947A1 PCT/CN2018/075057 CN2018075057W WO2019000947A1 WO 2019000947 A1 WO2019000947 A1 WO 2019000947A1 CN 2018075057 W CN2018075057 W CN 2018075057W WO 2019000947 A1 WO2019000947 A1 WO 2019000947A1
Authority
WO
WIPO (PCT)
Prior art keywords
camera
edge
image processing
image
matrix
Prior art date
Application number
PCT/CN2018/075057
Other languages
English (en)
French (fr)
Inventor
周莉
何璇
张凯亮
康锦刚
Original Assignee
京东方科技集团股份有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 京东方科技集团股份有限公司 filed Critical 京东方科技集团股份有限公司
Priority to EP18764983.5A priority Critical patent/EP3648053B1/en
Priority to US16/085,652 priority patent/US10933931B2/en
Publication of WO2019000947A1 publication Critical patent/WO2019000947A1/zh

Links

Images

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B62LAND VEHICLES FOR TRAVELLING OTHERWISE THAN ON RAILS
    • B62DMOTOR VEHICLES; TRAILERS
    • B62D57/00Vehicles characterised by having other propulsion or other ground- engaging means than wheels or endless track, alone or in addition to wheels or endless track
    • B62D57/02Vehicles characterised by having other propulsion or other ground- engaging means than wheels or endless track, alone or in addition to wheels or endless track with ground-engaging propulsion means, e.g. walking members
    • B62D57/024Vehicles characterised by having other propulsion or other ground- engaging means than wheels or endless track, alone or in addition to wheels or endless track with ground-engaging propulsion means, e.g. walking members specially adapted for moving on inclined or vertical surfaces
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/13Edge detection
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J19/00Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
    • B25J19/02Sensing devices
    • B25J19/021Optical sensing devices
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J5/00Manipulators mounted on wheels or on carriages
    • B25J5/007Manipulators mounted on wheels or on carriages mounted on wheels
    • EFIXED CONSTRUCTIONS
    • E01CONSTRUCTION OF ROADS, RAILWAYS, OR BRIDGES
    • E01DCONSTRUCTION OF BRIDGES, ELEVATED ROADWAYS OR VIADUCTS; ASSEMBLY OF BRIDGES
    • E01D19/00Structural or constructional details of bridges
    • E01D19/10Railings; Protectors against smoke or gases, e.g. of locomotives; Maintenance travellers; Fastening of pipes or cables to bridges
    • E01D19/106Movable inspection or maintenance platforms, e.g. travelling scaffolding or vehicles specially designed to provide access to the undersides of bridges
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/50Depth or shape recovery
    • G06T7/521Depth or shape recovery from laser ranging, e.g. using interferometry; from the projection of structured light
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/50Depth or shape recovery
    • G06T7/55Depth or shape recovery from multiple images
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10004Still image; Photographic image
    • G06T2207/10012Stereo images
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10028Range image; Depth image; 3D point clouds

Definitions

  • Embodiments of the present disclosure are directed to an image processing method, an image processing system, a storage medium, a mobile system including the image processing system, and a mobile system for use with the image processing system.
  • the Climbing Robot can perform functions such as helping disabled people and elderly people to go up and down stairs or carry heavy loads on stairs.
  • a wheeled stair climber is a common type of stair climber.
  • the common step size of a wheeled climber (for example, a single-wheeled walker) is fixed and the height of the climb is fixed. It cannot be automatically identified according to the height or width of the stairs. Manual assist can realize the function of climbing stairs and turning at the corner of the stairs.
  • Embodiments of the present disclosure provide an image processing method, an image processing system, a storage medium, a mobile system including the image processing system, and a mobile system used in conjunction with the image processing system, and embodiments of the present disclosure can automatically recognize steps such as stairs structure.
  • ; calculating the edge determination average absolute difference d 0 vector elements adjacent ⁇ D (
  • the image processing method further includes: capturing, according to the camera, a rotation angle ⁇ with respect to a horizontal plane when acquiring a left image and a right image of the two-dimensional depth matrix D p , and the edge to the light
  • the distance d of the heart connection determines the horizontal distance of the edge to the optical line and the vertical distance of the edge to the optical line.
  • the image processing method further includes: step A, the camera photographs the target object at different rotation angles relative to a horizontal plane to acquire a plurality of sets of left and right images; and step B, according to the plurality of groups of left
  • the image and the right image acquire a plurality of two-dimensional depth matrices; in step C, the number of edges corresponding to each two-dimensional depth matrix is determined; and in step D, the corresponding cameras corresponding to the two-dimensional depth matrices corresponding to 0 edges and one edge respectively are determined.
  • the rotation angles ⁇ 0 and ⁇ 1 in the horizontal plane; and the steps A to C in the angle range of ⁇ 0 to ⁇ 1 are repeated to determine a critical rotation angle ⁇ c of the camera with respect to the horizontal plane, wherein the critical rotation angle ⁇ c is a rotation angle corresponding to a two-dimensional depth matrix in which the number of edges corresponding to the two-dimensional depth matrix is changed between 0 and 1 as the number of edges whose change threshold is 1.
  • the image processing method further includes calculating a horizontal distance and a vertical distance between adjacent edges according to the critical rotation angle ⁇ c .
  • the image processing method further includes: respectively acquiring a left image and a right image including the target object by using a left camera and a right camera of the camera before acquiring the two-dimensional depth matrix D p ; and the left image And comparing at least one of the right image with the reference image to determine whether the target object in the at least one of the left image and the right image includes a rectilinear edge.
  • the image processing method further includes: after determining that the target object in the at least one of the left image and the right image includes a rectilinear edge, acquiring a plurality of sets of correspondence according to the left image and the right image a pixel, wherein each set of corresponding pixel points includes a left pixel point in a left image and a right pixel point in a right image corresponding to the same object point of the target object; and comparing left pixel points of each group of corresponding pixel points The magnitude of the ordinate of the right pixel is determined to determine whether the linear edge of the target object is parallel to the optical center of the left and right cameras.
  • the linear edge is parallel to the optical center line of the left camera and the right camera; at the corresponding pixel point In a case where the ordinate of the left pixel is greater than the ordinate of the right pixel, determining that the distance from the optical center of the left camera to the linear edge is greater than the distance from the optical center of the right camera to the linear edge; And if the ordinate of the left pixel point of the corresponding pixel is smaller than the ordinate of the right pixel, determining that the distance from the optical center of the left camera to the linear edge is smaller than the optical center of the right camera to the The distance from the straight edge.
  • the image processing method further includes acquiring a parallax matrix D x using the left image and the right image in a case where the linear edge is parallel to the optical line connection, wherein the parallax
  • the value of each element in the matrix D x is an absolute value of a difference between the abscissas of the left pixel point and the right pixel point of the corresponding set of pixel points corresponding to each element; and according to the parallax matrix D x ,
  • the focal length f of the left camera and the right camera, and the distance D c between the left camera and the optical center of the right camera obtain the two-dimensional depth matrix D p .
  • At least one embodiment of the present disclosure provides an image processing system including a processor and a memory for storing executable instructions that are loaded and executed by the processor: acquiring a two-dimensional shape including depth information of a target object the depth D p matrix, wherein the elements d ij is the matrix of the two-dimensional depth D p of the i-th row and j-th column element, and the depth values d ij values; calculating a two-dimensional depth of the matrix D p
  • ; calculating an average value d 0 of the absolute values of adjacent element differences in the edge determination vector ⁇ D (
  • At least one embodiment of the present disclosure provides a mobile system that includes the image processing system described above.
  • the mobile system also includes a camera and a motion control system.
  • the camera includes a left camera and a right camera and is configured to output an image signal to the image processing system.
  • the action control system includes: an active mobile device configured to move the mobile system; a passive mobile device coupled to the active mobile device and configured to be movable by the active mobile device a stair climbing device configured to drive the mobile system to implement a stair function; and a drive control device configured to control the active mobile device and the climber according to a processing result of the image processing system The action of the device.
  • the stair climbing device includes a stair wheel that includes a plurality of rod-like structures that project radially outward from a center of the stair wheel.
  • the climbing device further includes a liftable bar that is coupled to the climbing wheel.
  • the active moving device includes two active moving wheels, the two active moving wheels are equal in diameter, and the centers of the two active moving wheels are connected to the left and right cameras of the camera.
  • the lines of the heart are parallel.
  • the drive control device includes a travel drive shaft, a climbing drive shaft, an engine, and a brake, the engine being configured to drive the active mobile device through the travel drive shaft and drive the drive through the climb drive shaft A stair climbing device, the brake being configured to control braking of the active mobile device and the passive mobile device.
  • the passive moving device includes a driven moving wheel, a half shaft, and a differential, the differential being coupled to the traveling drive shaft, the half shaft to the driven moving wheel and the differential connection.
  • the motion control system further includes a chassis at a front end of the chassis, the active moving device, the passive moving device, and the climbing device being located below the chassis.
  • At least one embodiment of the present disclosure provides a mobile system for use with the image processing system described above, including a camera and a motion control system.
  • the camera includes a left camera and a right camera and is configured to output an image signal to the image processing system.
  • the action control system includes: an active mobile device configured to move the mobile system; a passive mobile device coupled to the active mobile device and configured to be movable by the active mobile device a stair climbing device configured to drive the mobile system to implement a stair climbing function, wherein the stair climbing device includes a stair wheel, the stair wheel comprising a plurality of rod-shaped structures, the plurality of rods a structure projecting radially outward from a center of the climbing wheel; and a drive control device configured to control an action of the active moving device and the climbing device according to a processing result of the image processing system .
  • FIG. 1 is a flowchart of an image processing method according to an embodiment of the present disclosure
  • FIG. 2 is a schematic diagram showing a relationship between a critical rotation angle of a camera, a width of a step, and a height of a step in the embodiment of the present disclosure
  • FIG. 3 is a schematic diagram of a triangulation principle adopted by an image processing method in an embodiment of the present disclosure
  • FIG. 4 is another flowchart of an image processing method according to an embodiment of the present disclosure.
  • FIG. 5 is a schematic structural diagram of an image processing system according to an embodiment of the present disclosure.
  • FIG. 6 is a structural block diagram of a mobile system in an embodiment of the present disclosure.
  • FIG. 7a is another structural block diagram of a mobile system in an embodiment of the present disclosure.
  • FIG. 7b is a schematic structural diagram of a mobile system in an embodiment of the present disclosure.
  • Embodiments of the present disclosure provide an image processing method, an image processing system, a storage medium, a mobile system including the image processing system, and a mobile system for use with the image processing system.
  • the image processing method or image processing system can be used to identify an object including a step such as a staircase.
  • a mobile system employing the image processing method or image processing system also has a similar automatic recognition function and can also be configured to have the function of automatically climbing a building.
  • the mobile system is a leg system of a stair climber or a stair climber.
  • At least one embodiment of the present disclosure provides an image processing method that can be used to identify a linear edge of a stepped structure such as a stair.
  • the method includes at least the following steps S01 to S05.
  • the depth value of the element refers to the distance from the object point corresponding to the element to the camera, that is, the distance from the object point to the optical connection of the left camera and the right camera of the camera.
  • d i (d i1 +d i2 +d i3 +...+d im )/m.
  • Step S05 Comparing the absolute value
  • the image processing method provided by the embodiment of the present disclosure utilizes the following properties of a stepped structure such as a staircase, taking a staircase as an example: (1) a stepped staircase appears in front of the camera and the step edge of the stairs and the left and right cameras of the camera In the case where the optical core lines are parallel, the depth values of the object points of the same step edge to the camera are the same. Therefore, the depth can be obtained by averaging each line element of the two-dimensional depth matrix D p in step S02. Vector D; (2) In the vicinity of the edge of the step, the depth value is abruptly changed. With this property, the position of the mutation can be identified by comparing the magnitude of
  • step S05 in the case where
  • the distance of the camera of the depth information that is, the distance d to the optical connection of the left camera and the right camera included in the camera.
  • the image processing party provided by at least one embodiment of the present disclosure further includes step S06: after identifying the linear edge of the target object, capturing the left for acquiring the two-dimensional depth matrix D p according to the camera.
  • the angle of rotation ⁇ of the image and the right image with respect to the horizontal plane, and the distance d from the edge to the optical line connecting the left and right cameras of the camera, the horizontal distance d*cos ⁇ of the edge to the optical line connection and the edge to light can be determined.
  • the vertical distance d*sin ⁇ of the heart line according to the horizontal distance and the vertical distance, can determine the linear movement distance and the lifting distance of the mobile system.
  • the rotation angle ⁇ of the camera with respect to the horizontal plane can be recorded while the target object is photographed to acquire the left image and the right image, thereby facilitating acquisition of the rotation angle ⁇ recorded at the time of photographing after the rectilinear edge is recognized.
  • the image processing method provided by at least one embodiment of the present invention may further include: Step A, the camera photographs the target object at different rotation angles with respect to the horizontal plane to acquire a plurality of sets of left and right images; Step B, according to Multiple sets of left and right images acquire a plurality of two-dimensional depth matrices; step C, determine the number of edges corresponding to each two-dimensional depth matrix; and step D, determine that the two-dimensional depth matrices corresponding to 0 edges and one edge respectively correspond to The rotation angles ⁇ 0 and ⁇ 1 of the camera relative to the horizontal plane; and, in the angular range of ⁇ 0 to ⁇ 1 , repeat steps A to C to determine the critical rotation angle ⁇ c of the camera with respect to the horizontal plane.
  • the critical rotation angle ⁇ c is a rotation angle corresponding to the two-dimensional depth matrix having the number of edges whose change point is 1 when the number of edges corresponding to the two-dimensional depth matrix is changed between 0 and 1.
  • the critical rotation angle ⁇ c is a rotation angle corresponding to the two-dimensional depth matrix having the number of edges whose change point is 1 when the number of edges corresponding to the two-dimensional depth matrix is changed between 0 and 1.
  • the angle of rotation of the camera can be adjusted at a selected step angle (for example, an angular change of 5°) within an initial angle range of ⁇ /2 to 45° ( ⁇ is the angle of view of the camera) (ie, camera and The angle between the horizontal planes is obtained to obtain a two-dimensional depth matrix at different rotation angles, and the angle between the steps 1 and 0 is selected by judging the number of stairs corresponding to the two-dimensional depth matrix corresponding to different rotation angles. Range; thereafter, the depth vector D is further acquired at different angles of rotation within the range of angles until a rotation angle of exactly 1 is obtained, which is the critical rotation angle ⁇ c .
  • the angle of view of the camera
  • the width and height of the step can be determined based on the critical rotation angle ⁇ c .
  • the image processing method provided by at least one embodiment of the present invention further includes: as shown in FIG. 2, the critical rotation angle ⁇ c is equal to the angle of the plane of the adjacent step edge with respect to the horizontal plane (ie, the camera is at a critical rotation angle).
  • the critical rotation angle ⁇ c is equal to the angle of the plane of the adjacent step edge with respect to the horizontal plane (ie, the camera is at a critical rotation angle).
  • the horizontal distance s (ie the width of the step) and the vertical distance h (ie the height of the step) between adjacent edges are calculated according to the critical rotation angle ⁇ c .
  • s d * cos ⁇ c
  • h d * sin ⁇ c .
  • C denotes an optical center of the left camera and the right camera of the camera which coincide with each other, and a broken line indicates a step of the stairs.
  • the rotation angle ⁇ c of the camera with respect to the horizontal plane is equal to the plane between the plane of the adjacent straight edge Ed and the horizontal plane. Angle.
  • the image processing method provided by at least one embodiment of the present disclosure further includes: respectively acquiring a left image and a right image including the target object by using a left camera and a right camera of the camera before acquiring the two-dimensional depth matrix D p ; At least one of the right image and the right image are compared with a reference image stored in the image library to determine whether the target object in at least one of the left image and the right image includes a rectilinear edge.
  • the image processing method provided by at least one embodiment of the present disclosure further includes: acquiring a plurality of sets of corresponding pixel points according to the left image and the right image after determining that the target object in at least one of the left image and the right image includes a linear edge (eg, at least three sets of corresponding pixel points), wherein each set of corresponding pixel points includes a left pixel point in a left image and a right pixel point in a right image corresponding to the same object point of the target object; thereafter, comparing each group corresponding The size of the ordinate of the left pixel and the right pixel of the pixel to determine whether the linear edge of the target object is parallel to the optical connection of the left and right cameras.
  • a linear edge eg, at least three sets of corresponding pixel points
  • FIG. 3 is a schematic diagram of a triangulation principle adopted by the image processing method in the embodiment of the present disclosure.
  • C l represents the optical center of the left camera
  • C r represents the optical center of the right camera
  • D c represents the distance between the optical centers of the left camera and the right camera
  • f represents the left camera and the right camera.
  • Focal length (the focal lengths of the two cameras are equal); P represents the object point on the rectilinear edge; P l and P r are the imaging points of the object point P on the focal plane of the camera and the two form a corresponding set of pixels, (x l , y l ) and (x r , y r ) are the coordinate points of the positions of the presentation points P l and P r in the corresponding images, respectively; d i represents the optical center connection C l of the object point P to the left camera and the right camera .
  • the linear edge is determined to be connected to the optical center of the left camera and the right camera.
  • the line C l C r is parallel; in the case where the ordinate of the left pixel of the corresponding pixel is larger than the ordinate of the right pixel, that is, in the case of y l >y r , the optical center C l of the left camera is determined
  • the distance between the straight edges is greater than the distance from the optical center C r of the right camera to the straight edge; in the case where the ordinate of the left pixel of the corresponding pixel is smaller than the ordinate of the right pixel, ie, at y l ⁇ y r case, it is determined that the optical center of the left camera C l a distance less than the straight edges to the right camera optical center C r of the distance to the straight edges.
  • the image processing method provided by at least one embodiment of the present disclosure further includes: acquiring a parallax matrix D x using a left image and a right image in a case where the linear edge is parallel to the optical line connection C l C r , wherein the parallax
  • the left image taken by the left camera can be represented by a matrix R l
  • the right image acquired by the right camera can be represented by a matrix R r
  • R l [l 11 , l 12 , l 13 ,...;l 21 ,l 22 ,l 23 ,...;...;...l n1 ,l n2 ,l n3 ,...]
  • R r [r 11 ,r 12 ,r 13 ,...;r 21 ,r 22 ,r 23 ,...;...;...r n1 , r n2 , r n3 ,...]
  • the image processing method includes the following steps S1 to S8.
  • Step S1 The left image and the right image including the target object are acquired by the camera, and the rotation angle of the camera is recorded.
  • Step S2 Compare at least one of the left image and the right image with a reference image stored in the image library.
  • Step S3 According to the comparison result, it is determined whether the target object includes a linear edge to determine whether the target object is a staircase. If the comparison result is that the target object does not include a straight edge but includes a stair turn, determine the steering displacement (also called the reverse displacement or steering angle); if the comparison result does not include a straight edge or a stair turn, then return to the step S1, continue to shoot the environment in front of the camera.
  • the steering displacement also called the reverse displacement or steering angle
  • step S41 is performed to determine whether the linear edge is parallel to the optical center line of the left camera and the right camera of the camera.
  • step S42 is performed to adjust the distance between the left camera and the right camera of the camera to the target object so that the straight edge is parallel to the optical line, and then step S1 is performed.
  • step S5 is performed: acquiring a two-dimensional depth matrix including depth information of the target object from the left image and the right image.
  • Step S6 According to the above steps S01 to S05, determine the number of linear edges obtainable by using the two-dimensional depth matrix, that is, how many elements in the two-dimensional depth matrix satisfy
  • Step S7 Repeat the above steps multiple times to make the camera take multiple shots at different rotation angles until the critical two-dimensional depth matrix and the critical rotation angle ⁇ c are determined .
  • steps refer to the related description above, and the repeated description will not be repeated.
  • Step S8 Determine the distance d from the edge of the step to the camera according to the obtained critical two-dimensional depth matrix, and calculate the width and height of each step of the staircase according to d and the critical rotation angle ⁇ c .
  • the instructions in the memory can be executed after being executed by the processor: determining the distance d i from the linear edge to the camera, determining the critical two-dimensional depth matrix D p , determining the critical rotation angle, and determining the horizontal distance from the linear edge to the camera. And the vertical distance, or to determine whether the linear edge is parallel to the camera, and so on.
  • the memory can be a semiconductor memory, a magnetic surface memory, a laser memory, a random access memory, a read only memory, a serial access memory, a non-permanent memory, a permanently memory, or any other form of memory known in the art.
  • the processor can be a general purpose processor, a digital signal processor (DSP), an application specific integrated circuit (ASIC), a field programmable gate array (FPGA), or other programmable logic device, discrete gate or transistor logic device, discrete hardware component .
  • DSP digital signal processor
  • ASIC application specific integrated circuit
  • FPGA field programmable gate array
  • a general purpose processor can be a microprocessor or any conventional processor or the like.
  • the instructions in the storage medium can be executed after being executed by the processor: determining the distance d i from the linear edge to the camera, determining the critical two-dimensional depth matrix D p , determining the critical rotation angle, determining the level of the linear edge to the camera Distance and vertical distance, or determine whether the linear edge is parallel to the camera, and so on.
  • the processor determining the distance d i from the linear edge to the camera, determining the critical two-dimensional depth matrix D p , determining the critical rotation angle, determining the level of the linear edge to the camera Distance and vertical distance, or determine whether the linear edge is parallel to the camera, and so on.
  • the storage medium can be a semiconductor memory, a magnetic surface memory, a laser memory, a random access memory, a read only memory, a serial access memory, a non-permanent memory, a permanent memory, or any other form of storage well known in the art. medium.
  • the processor can be a general purpose processor, a digital signal processor (DSP), an application specific integrated circuit (ASIC), a field programmable gate array (FPGA), or other programmable logic device, discrete gate or transistor logic device, discrete hardware component .
  • DSP digital signal processor
  • ASIC application specific integrated circuit
  • FPGA field programmable gate array
  • a general purpose processor can be a microprocessor or any conventional processor or the like.
  • At least one embodiment of the present disclosure provides a mobile system including a camera, the image processing system and the motion control system described in any of the above embodiments.
  • the camera includes a left camera and a right camera for photographing the front environment (such as a stair, or a turn of a stair, or other front environment) to obtain a left image and a right image of the front environment.
  • the front environment such as a stair, or a turn of a stair, or other front environment
  • the image processing system is configured to process images acquired by the camera, for example, including preliminary recognition and feature extraction of left and right images taken by the camera.
  • the image preliminary recognition function of the image processing system includes: performing preliminary processing such as noise reduction, and storing the collected image, and then performing preliminary recognition, for example, by comparing the acquired image with the reference image to determine whether the captured image is included or not.
  • the initially identified image may also be stored in an image database for image training to further optimize the image recognition rate.
  • the image feature extraction function of the image processing system includes: (1) extracting the horizontal distance of the linear edge of the target object to the camera, for example, extracting the depth feature of the stairs, that is, the width of a single step of the stairs, to determine the moving distance of the mobile system (ie, (2) extracting the distance from the linear edge of the target object to the camera, for example, extracting the height feature of the stairs, ie the height of a single step of the stairs, to determine the height distance of the mobile system (ie The distance moved up or down); (3) Extracting the frontal environmental features of the stair turn by means of image training and machine learning to determine the displacement of the moving system (ie, the displacement or steering angle of the turn).
  • the action control system is configured to perform mobile control of the mobile system based on processing results of the image processing system, such as features extracted by the image processing system.
  • the motion control system is used to: (1) control the moving wheel of the mobile system according to the moving distance (for example, including the active moving wheel and the driven moving wheel) to move forward or backward; (2) control the climbing wheel upward according to the height distance; Or move down; (3) control the steering wheel to move (ie, turn) according to the variable displacement; (4) brake the action wheel.
  • the action control system controls the mobile system to at least implement the functions as described in (1) and (2);
  • the action control system controls the mobile system to at least implement the function as described in (3).
  • the mobile system in the embodiment of the present disclosure can automatically recognize the front environment of the mobile system through the camera and the image processing system, and automatically acquire the step width and height of the stairs in the front environment, and can automatically realize the motion control system. Adjusting the progress of the line, the height of the climb or the angle of the turn and achieving automatic travel, so that the mobile system can be intelligently traveled in different scenes, different step widths and different heights without manual assistance.
  • a mobile system MS provided by at least one embodiment of the present disclosure includes a camera C, an image processing system P, and a motion control system MCS.
  • the camera C includes a left camera LC and a right camera RC that is coupled to the image processing system P and configured to output an image signal to the image processing system P.
  • the image processing system P can recognize the front environment based on the image signal output from the camera C, for example, identifying the depth information of the stairs (ie, the step width of the stairs), the height distance, and the variable displacement.
  • the action control system MCS includes an active mobile device AMD configured to move the mobile system MS, a passive mobile device PMD coupled to the active mobile device AMD, and configured to be movable by the active mobile device AMD a stair climbing device CD configured to drive the mobile system MS to implement a stair function; and a drive control device DCD connected to the image processing system P and configured to be controllable according to a processing result of the image processing system P
  • the actions of the active mobile device AMD and the climbing device CD can control the active mobile device AMD to realize the function of automatically adjusting the line to advance, automatically travel or realize the turning according to the processing result of the image processing system P.
  • the drive control device DCD can also control the climbing of the building according to the processing result of the image processing system P.
  • the device CD implements the function of climbing stairs.
  • the climbing function of the mobile system mentioned in the embodiment of the present disclosure is only for the stairs, and may also be for any object having a stepped structure.
  • the motion control system MCS also includes a chassis BP, and the camera C (see LC and RC) is located in front of the chassis BP to capture the front environment during the movement of the mobile system MS, the active mobile device AMD, the passive mobile device PMD and the climbing device
  • the CDs are located below the chassis BP to drive the chassis for forward and backward movement, steering movement and stair climbing.
  • Above the chassis BP can be used to carry heavy objects or people with disabilities or the elderly.
  • the drive control device DCD includes a travel drive shaft MDS, a climb drive shaft CDS, an engine E, and a brake BK1, the engine E being configured to drive the active mobile device AMD through the travel drive shaft MDS and drive the creeper device through the climbing drive shaft CDS CD, the transmitter E is configured to control the active mobile device AMD and the floor climbing device CD according to the processing result of the image processing system P, the brake BK1 being configured to control the active mobile device AMD and the passive mobile device PMD according to the processing result of the image processing system P Brake.
  • the travel drive shaft MDS is coupled to the engine E via the travel transmission MSC and transmits the power of the engine E to the active mobile device AMD; for example, the climb drive shaft CDS is connected to the engine E through the creeping transmission CSC, and the engine E is The power is transmitted to the climbing device CD.
  • the brake BK1 is coupled to the active mobile device AMD and the passive mobile device PMD via a hydraulic conduit HP or other type of transmission system.
  • the drive control device DCD may also include another brake BK2 that is coupled to the passive mobile device PMD.
  • brake BK2 is a brake drum or other type of brake.
  • the left camera LC and the right camera RC included in the camera C are both CCD (charge-coupled device) cameras or other types of cameras.
  • the climbing device CD includes a climbing wheel, and the climbing wheel includes a plurality of rod-shaped structures CWR (for example, three or more rod-shaped structures), and the plurality of rod-shaped structures CWR are outward from the center of the climbing wheel Radially extended.
  • the rod-shaped structure CWR of the climbing wheel acts on the steps of the stairs, and the rod-shaped structure is rotated correspondingly by rotating the climbing wheel, thereby driving the mobile system MS to cross the steps to realize climbing.
  • the climbing wheel includes three rod-like structures CWR, and the three rod-shaped structures CWR will equally divide the circumference.
  • the rod-shaped structure CWR can be a rod of any rigid material.
  • the climbing device CD includes two climbing wheels CW of the left climbing wheel LCW and the right climbing wheel RCW (for example, the lengths of the rod-shaped structures included in the left and right climbing wheels are equal), and the two climbing wheels
  • the arrangement direction is the same as that of the left camera LC and the right camera RC (for example, the lines connecting the centers of the left and right climbing wheels are parallel to the optical centers of the left and right cameras).
  • the use of two climbing wheels helps to keep the mobile system MS stable during the climb.
  • one of the left climbing wheel LCW and the right climbing wheel RCW is an active climbing wheel and the other is a passive climbing wheel.
  • the active climbing wheel can be flipped forward to climb the building under the control of the engine E.
  • the climbing device CD further includes a liftable bar that can be raised or lowered, and the liftable bar connects the chassis and the stair wheel for lifting and lowering the climbing wheel, thereby making the mobile system suitable for steps of different heights.
  • the liftable bar is coupled to the engine E and configured to be raised or lowered under the control of the engine E.
  • the climbing device CD includes a left liftable lever LLF connected to the left climbing wheel LCW and a right liftable lever RLF connected to the right climbing wheel RCW.
  • the active mobile device AMD includes two active moving wheels, see the left active moving wheel LAW and the right active moving wheel RAW, for implementing front, rear, left, and right movement of the mobile system on a plane.
  • the two active moving wheels are arranged at the rear of the climbing wheel to avoid the active moving wheel touching the stairs during use, which affects the climbing of the building.
  • the movement of the left and right active moving wheels can be controlled, and turning or straight traveling can be realized.
  • the processing result of the image processing system P is that the ordinates of the left pixel point and the right pixel point in the corresponding pixel point are not equal, that is, y l ⁇ y r in FIG.
  • the two active moving wheels are adopted to facilitate the control of the forward and backward movement and the steering movement of the mobile system MS, and to facilitate the smooth movement of the mobile system MS.
  • the active moving device AMD includes two active moving wheels that are equal in diameter, and the centers of the two active moving wheels are connected in parallel with the optical centers of the left camera LC and the right camera RC of the camera.
  • the step can be judged by comparing whether the left point of the left object in the left image acquired by the left camera and the right point of the right pixel in the right image acquired by the right camera are equal. Whether the edge is parallel to the optical center line of the left camera and the right camera; in the case where the two are not parallel, by setting the diameters of the two active moving wheels to be equal, it is convenient to move the left active moving wheel LAW or the right active moving wheel RAW to keep the edge of the step parallel to the line of light.
  • the passive moving device PMD includes a driven moving wheel, a half shaft HS and a differential DF located behind the active moving wheel, the differential DF is connected to the traveling drive shaft MDS, and the half shaft HS is moving the moving wheel PW and the difference Speed DF connection.
  • the differential DF is used to enable the active and driven moving wheels to rotate at different rotational speeds.
  • the half shaft HS is used to effect transmission of power between the differential DF and the driven moving wheel.
  • the passive mobile device PMD includes a left driven moving wheel LPW and a right driven moving wheel RPW arranged in the arrangement direction of the left camera LC and the right camera RC, so that the mobile system MS remains stationary.
  • At least another embodiment of the present disclosure provides a mobile system for use with image processing system P that is similar to mobile system MS as shown in Figure 7a, with the main difference being that image processing system P is not included.
  • the mobile system includes a camera and a motion control system, as shown in Figure 7a, the camera includes a left camera LC and a right camera RC, and is configured to output an image signal to the image processing system P.
  • the action control system includes: an active mobile device configured to move the mobile system MS according to a processing result of the image processing system P; a passive mobile device connected to the active mobile device, and configured to be driven by the active mobile device Moving; a climbing device configured to drive the mobile system MS to implement a stair climbing function, the stair climbing device comprising a climbing wheel (eg, including a left climbing wheel LCW and a right climbing wheel RCW), the climbing wheel including a plurality of poles a structure CWR, the plurality of rod-shaped structures CWR project radially outward from a center of the climbing wheel; and a drive control device configured to control the active moving device and the climbing device according to the processing result of the image processing system P Actions.
  • a climbing wheel eg, including a left climbing wheel LCW and a right climbing wheel RCW
  • the climbing wheel including a plurality of poles a structure CWR, the plurality of rod-shaped structures CWR project radially outward from a center of the climbing wheel
  • the drive control device includes a central processing unit CPU, a travel drive shaft MDS, a climb drive shaft CDS, an engine E, and a brake BK1, and the engine E is configured to drive the active mobile device through the travel drive shaft MDS and drive through the climb drive shaft CDS
  • the creeping device, the transmitter E is configured to control the active mobile device and the climbing device according to instructions of the central processing unit CPU
  • the brake BK1 is configured to control the braking of the active mobile device and the passive mobile device according to instructions of the central processing unit CPU.
  • the central processing unit CPU and the image processing system P may be integrated or both may use separate devices connected to each other.
  • the climbing device includes two climbing wheels CW, and the arrangement directions of the two climbing wheels CW are the same as those of the left camera LC and the right camera RC.
  • the climbing device also includes a liftable bar that can be coupled to the climbing wheel.
  • the image processing method and system provided by the embodiments of the present disclosure can be used to automatically recognize a stepped structure such as a staircase; the mobile system in the embodiment of the present disclosure can automatically recognize the front environment of the mobile system through the camera and the image processing system. And automatically obtain the step width and height of the stairs in the front environment, through the action control system, it can automatically adjust the line length, climb height or corner and realize automatic travel, which can realize different scenes, different step widths and different step heights.
  • the mobile system is intelligently traveled without the need for manual assistance.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Physics & Mathematics (AREA)
  • Mechanical Engineering (AREA)
  • Combustion & Propulsion (AREA)
  • Chemical & Material Sciences (AREA)
  • Transportation (AREA)
  • Robotics (AREA)
  • Optics & Photonics (AREA)
  • Signal Processing (AREA)
  • Multimedia (AREA)
  • Architecture (AREA)
  • Civil Engineering (AREA)
  • Structural Engineering (AREA)
  • Image Analysis (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Image Processing (AREA)

Abstract

一种图像处理方法和系统、存储介质和移动系统,该图像处理方法包括:获取包括目标物体的深度信息的二维深度矩阵Dp;计算Dp中的每一行元素的平均值以得到深度向量D=[d1,d2,d3,…,di,…,dn],其中,元素di为所述二维深度矩阵Dp的第i行元素的平均值;计算所述深度向量D中的相邻元素的差的绝对值以得到边缘确定向量ΔD=[Δd1,Δd2,Δd3,…,Δdi,…,Δdn-1],其中,元素Δdi=|di-di+1|;计算所述边缘确定向量ΔD中相邻元素差值的绝对值的平均值d0=(|Δd2-Δd1|+|Δd3-Δd2|+…+|Δdn-1-Δdn-2|)/(n-2);以及将边缘确定向量ΔD中的相邻元素差值的绝对值|Δdi-Δdi+1|与d0进行比较以识别所述目标物体的边缘,其中,在|Δdi-Δdi+1|大于d0的情况下,确定二维深度矩阵Dp中的第i行元素对应所述目标物体的边缘。

Description

图像处理方法和系统、存储介质和移动系统 技术领域
本公开的实施例涉及一种图像处理方法、图像处理系统、存储介质、包括该图像处理系统的移动系统以及与该图像处理系统配合使用的移动系统。
背景技术
爬楼机(Climbing Robot)可以实现诸如帮助残疾人和老年人上下楼梯或者在楼梯上搬运重物等功能。
轮式爬楼机是一种常见的爬楼机。然而,常见的轮式爬楼机(例如,依靠单轮行动的爬楼机)移动的步长以及爬楼的高度都是固定的,无法根据楼梯的高度或者宽度的不同进行自动识别,因此需要人工辅助才能实现爬楼的功能以及在楼梯拐弯处拐弯的功能。
发明内容
本公开实施例提供一种图像处理方法、图像处理系统、存储介质、包括该图像处理系统的移动系统以及与该图像处理系统配合使用的移动系统,本公开实施例可以自动识别例如楼梯等台阶状结构。
本公开的至少一个实施例提供一种图像处理方法,其包括:获取包括目标物体的深度信息的二维深度矩阵D p,其中,元素d ij为所述二维深度矩阵D p中的第i行和第j列元素,并且d ij的数值为深度值;计算所述二维深度矩阵D p中的每一行元素的平均值以得到深度向量D=[d 1,d 2,d 3,…,d i,…,d n],其中,元素d i为所述二维深度矩阵D p的第i行元素的平均值;计算所述深度向量D中的相邻元素的差的绝对值以得到边缘确定向量ΔD=[Δd 1,Δd 2,Δd 3,…,Δd i,…,Δd n-1],其中,元素Δd i=|d i-d i+1|;计算所述边缘确定向量ΔD中相邻元素差值的绝对值的平均值d 0=(|Δd 2-Δd 1|+|Δd 3-Δd 2|+…+|Δd n-1-Δd n-2|)/(n-2);以及将边缘确定向量ΔD中的相邻元素差值的绝对值|Δd i-Δd i+1|与d 0进行比较以识别所述目标物体的边缘,其中,在|Δd i-Δd i+1|大于d 0的情况下,确定二维深度矩阵D p中的第i行元素对应所述目标物体的边缘。
例如,在|Δd i-Δd i+1|大于d 0的情况下,确定深度向量D中的d i的数值为所述边缘到用于获取所述目标物体的深度信息的摄像机包括的左摄像头和右摄像头的光心连线的距离d。
例如,所述的图像处理方法还包括:根据所述摄像机拍摄用于获取所述二维深度矩阵D p的左图像和右图像时相对于水平面的旋转角度θ、以及所述边缘到所述光心连线的所述距离d,确定所述边缘到所述光心连线的水平距离以及所述边缘到所述光心连线的垂直距离。
例如,所述的图像处理方法还包括:步骤A,摄像机以相对于水平面的不同旋转角度对所述目标物体进行拍摄,以获取多组左图像和右图像;步骤B,根据所述多组左图像和右图像获取多个二维深度矩阵;步骤C,确定每个二维深度矩阵对应的边缘数量;步骤D,确定分别对应0个边缘和1个边缘的二维深度矩阵分别对应的摄像机相对于水平面的旋转角度θ 0和θ 1;以及在θ 0至θ 1的角度范围内重复步骤A至步骤C,以确定摄像机相对于水平面的临界旋转角度θ c,其中,所述临界旋转角度θ c是指二维深度矩阵对应的边缘数量在0和1之间进行改变时作为改变临界点的边缘数量为1的二维深度矩阵对应的旋转角度。
例如,所述的图像处理方法还包括:根据所述临界旋转角度θ c计算相邻边缘之间的水平距离和垂直距离。
例如,所述的图像处理方法还包括:在获取所述二维深度矩阵D p之前,利用摄像机的左摄像头和右摄像头分别获取包括所述目标物体的左图像和右图像;将所述左图像和右图像中的至少一个与参考图像进行比较,以确定所述左图像和所述右图像中的所述至少一个中的目标物体是否包括直线形边缘。
例如,所述的图像处理方法还包括:在确定所述左图像和所述右图像中的所述至少一个中的目标物体包括直线形边缘之后,根据所述左图像和右图像获取多组对应像素点,其中,每组对应像素点包括对应于所述目标物体的同一物点的左图像中的左像素点和右图像中的右像素点;以及比较每组对应像素点的左像素点与右像素点的纵坐标的大小,以确定所述目标物体的直线形边缘与所述左摄像头和右摄像头的光心连线是否平行。
例如,在对应像素点的左像素点的纵坐标等于右像素点的纵坐标的情况 下,确定所述直线形边缘与所述左摄像头和右摄像头的光心连线平行;在对应像素点的左像素点的纵坐标大于右像素点的纵坐标的情况下,确定所述左摄像头的光心到所述直线形边缘的距离大于所述右摄像头的光心到所述直线形边缘的距离;以及在对应像素点的左像素点的纵坐标小于右像素点的纵坐标的情况下,确定所述左摄像头的光心到所述直线形边缘的距离小于所述右摄像头的光心到所述直线形边缘的距离。
例如,所述的图像处理方法还包括:在所述直线形边缘与所述光心连线平行的情况下,利用所述左图像和所述右图像获取视差矩阵D x,其中,所述视差矩阵D x中每个元素的数值为所述每个元素对应的一组对应像素点的左像素点和右像素点的横坐标之差的绝对值;以及根据所述视差矩阵D x、所述左摄像头和右摄像头的焦距f、以及所述左摄像头和所述右摄像头的光心之间的距离D c,得到所述二维深度矩阵D p
本公开的至少一个实施例提供一种图像处理系统,其包括处理器以及存储器,用于存储可执行指令,所述指令被所述处理器加载并执行:获取包括目标物体的深度信息的二维深度矩阵D p,其中,元素d ij为所述二维深度矩阵D p中的第i行和第j列元素,并且d ij的数值为深度值;计算所述二维深度矩阵D p中的每一行元素的平均值以得到深度向量D=[d 1,d 2,d 3,…,d i,…,d n],其中,元素d i为所述二维深度矩阵D p的第i行元素的平均值;计算所述深度向量D中的相邻元素的差的绝对值以得到边缘确定向量ΔD=[Δd 1,Δd 2,Δd 3,…,Δd i,…,Δd n-1],其中,元素Δd i=|d i-d i+1|;计算所述边缘确定向量ΔD中相邻元素差值的绝对值的平均值d 0=(|Δd 2-Δd 1|+|Δd 3-Δd 2|+…+|Δd n-1-Δd n-2|)/(n-2);以及将边缘确定向量ΔD中的相邻元素差值的绝对值|Δd i-Δd i+1|与d 0进行比较以识别所述目标物体的边缘,其中,在|Δd i-Δd i+1|大于d 0的情况下,确定二维深度矩阵D p中的第i行元素对应所述目标物体的边缘。
本公开的至少一个实施例提供一种存储介质,其中存储有指令,所述指令用于被处理器加载并执行:获取包括目标物体的深度信息的二维深度矩阵D p,其中,元素d ij为所述二维深度矩阵D p中的第i行和第j列元素,并且d ij的数值为深度值;计算所述二维深度矩阵D p中的每一行元素的平均值以得到深度向量D=[d 1,d 2,d 3,…,d i,…,d n],其中,元素d i为所述二维深度矩阵D p的第i行元素的平均值;计算所述深度向量D中的相邻元素的差的绝对值 以得到边缘确定向量ΔD=[Δd 1,Δd 2,Δd 3,…,Δd i,…,Δd n-1],其中,元素Δd i=|d i-d i+1|;计算所述边缘确定向量ΔD中相邻元素差值的绝对值的平均值d 0=(|Δd 2-Δd 1|+|Δd 3-Δd 2|+…+|Δd n-1-Δd n-2|)/(n-2);以及将边缘确定向量ΔD中的相邻元素差值的绝对值|Δd i-Δd i+1|与d 0进行比较以识别所述目标物体的边缘,其中,在|Δd i-Δd i+1|大于d 0的情况下,确定二维深度矩阵D p中的第i行元素对应所述目标物体的边缘。
本公开的至少一个实施例提供一种移动系统,其包括以上所述的图像处理系统。
例如,所述的移动系统还包括摄像机和行动控制系统。所述摄像机包括左摄像头和右摄像头并且被配置为向所述图像处理系统输出图像信号。所述行动控制系统包括:主动移动装置,其被配置为可移动所述移动系统;被动移动装置,其与所述主动移动装置连接,并且被配置为可在所述主动移动装置的带动下移动;爬楼装置,其被配置为可带动所述移动系统实现爬楼功能;以及驱动控制装置,其被配置为可根据所述图像处理系统的处理结果控制所述主动移动装置和所述爬楼装置的动作。
例如,所述爬楼装置包括爬楼轮,所述爬楼轮包括多个杆状结构,所述多个杆状结构从所述爬楼轮的中心向外呈辐射状伸出。
例如,所述爬楼装置还包括可升降杆,所述可升降杆与所述爬楼轮连接。
例如,所述主动移动装置包括两个主动移动轮,所述两个主动移动轮的直径相等,并且所述两个主动移动轮的中心的连线与所述摄像机的左摄像头和右摄像头的光心的连线平行。
例如,所述驱动控制装置包括行进传动轴、爬楼传动轴、发动机和制动器,所述发动机被配置为通过所述行进传动轴驱动所述主动移动装置并且通过所述爬楼传动轴驱动所述爬楼装置,所述制动器被配置为控制所述主动移动装置和所述被动移动装置的制动。
例如,所述被动移动装置包括从动移动轮、半轴和差速器,所述差速器与所述行进传动轴连接,所述半轴将所述从动移动轮与所述差速器连接。
例如,所述行动控制系统还包括底盘,所述摄像机位于所述底盘的前端,所述主动移动装置、被动移动装置和爬楼装置位于所述底盘下方。
本公开的至少一个实施例提供一种用于与以上所述的图像处理系统配合 使用的移动系统,其包括摄像机以及行动控制系统。所述摄像机包括左摄像头和右摄像头,并且被配置为向所述图像处理系统输出图像信号。所述行动控制系统包括:主动移动装置,其被配置为可移动所述移动系统;被动移动装置,其与所述主动移动装置连接,并且被配置为可在所述主动移动装置的带动下移动;爬楼装置,其被配置为可带动所述移动系统实现爬楼功能,其中,所述爬楼装置包括爬楼轮,所述爬楼轮包括多个杆状结构,所述多个杆状结构从所述爬楼轮的中心向外呈辐射状伸出;以及驱动控制装置,其被配置为可根据所述图像处理系统的处理结果控制所述主动移动装置和所述爬楼装置的动作。
附图说明
为了更清楚地说明本公开实施例的技术方案,下面将对实施例的附图作简单地介绍,显而易见地,下面描述中的附图仅仅涉及本公开的一些实施例,而非对本公开的限制。
图1为本公开实施例提供的图像处理方法的流程图;
图2为本公开实施例中摄像机的临界旋转角度、台阶的宽度以及台阶的高度之间的关系示意图;
图3为本公开实施例中的图像处理方法采用的三角测量原理的示意图;
图4为本公开实施例提供的图像处理方法的另一流程图;
图5为本公开实施例中的图像处理系统的结构示意图;
图6为本公开实施例中的移动系统的结构框图;
图7a为本公开实施例中的移动系统的另一结构框图;
图7b为本公开实施例中的移动系统的结构示意图。
具体实施方式
为使本公开实施例的目的、技术方案和优点更加清楚,下面将结合本公开实施例的附图,对本公开实施例的技术方案进行清楚、完整地描述。显然,所描述的实施例是本公开的一部分实施例,而不是全部的实施例。基于所描述的本公开的实施例,本领域普通技术人员在无需创造性劳动的前提下所获得的所有其他实施例,都属于本公开保护的范围。
除非另外定义,本公开使用的技术术语或者科学术语应当为本公开所属领域内具有一般技能的人士所理解的通常意义。本公开中使用的“第一”、“第二”以及类似的词语并不表示任何顺序、数量或者重要性,而只是用来区分不同的组成部分。“包括”或者“包含”等类似的词语意指出现该词前面的元件或者物件涵盖出现在该词后面列举的元件或者物件及其等同,而不排除其他元件或者物件。“连接”或者“相连”等类似的词语并非限定于物理的或者机械的连接,而是可以包括电性的连接,不管是直接的还是间接的。“上”、“下”、“左”、“右”等仅用于表示相对位置关系,当被描述对象的绝对位置改变后,则该相对位置关系也可能相应地改变。
本公开实施例提供一种图像处理方法、图像处理系统、存储介质、包括该图像处理系统的移动系统以及与该图像处理系统配合使用的移动系统。该图像处理方法或图像处理系统可以用于识别例如楼梯等包括台阶的物体。采用该图像处理方法或图像处理系统的移动系统也具有类似的自动识别功能并且还可以被配置为具有自动爬楼的功能。例如,该移动系统为爬楼机或爬楼机的腿部系统。
下面结合图1至图4对本公开实施例提供的图像处理方法进行详细说明。
本公开的至少一个实施例提供一种图像处理方法,该方法可以用于识别楼梯等台阶状结构的直线形边缘。例如,如图1所示,该方法至少包括以下步骤S01至步骤S05。
步骤S01:获取包括目标物体(例如楼梯)的深度信息的二维深度矩阵D p=[d 11,d 12,d 13,…;d 21,d 22,d 23,…;…;…;d n1,d n2,d n3,…],其中,元素d ij为二维深度矩阵D p中的第i行和第j列元素,并且d ij的数值为深度值,该深度值表示该元素对应的物点到摄像机的距离。
需要说明的是,元素的深度值是指该元素对应的物点到摄像机的距离,即该物点到摄像机的左摄像头和右摄像头的光心连线的距离。
步骤S02:计算二维深度矩阵D p中的每一行元素的平均值以得到深度向量D=[d 1,d 2,d 3,…,d i,…,d n],其中,元素d i为二维深度矩阵D p的第i行元素的平均值,假设第i行共有m个元素,则d i=(d i1+d i2+d i3+…+d im)/m。
步骤S03:计算深度向量D中的相邻元素的差的绝对值以得到边缘确定向量ΔD=[Δd 1,Δd 2,Δd 3,…,Δd i,…,Δd n-1],其中,元素Δd i=|d i-d i+1|。
步骤S04:计算边缘确定向量ΔD中相邻元素差值的绝对值的平均值d 0=(|Δd 2-Δd 1|+|Δd 3-Δd 2|+…+|Δd n-1-Δd n-2|)/(n-2)。
步骤S05:将边缘确定向量ΔD中的相邻元素差值的绝对值|Δd i-Δd i+1|与d 0进行比较以识别目标物体的边缘,其中,在|Δd i-Δd i+1|大于d 0的情况下,确定出二维深度矩阵D p中的第i行元素对应所述目标物体的边缘。
本公开实施例提供的图像处理方法,利用了楼梯等台阶状结构的以下性质,以楼梯为例:(1)在摄像机的前面出现阶梯状楼梯并且楼梯的台阶边缘与摄像机的左、右摄像头的光心连线平行的情况下,同一台阶边缘的各物点到摄像机的深度值是一样的,因此,可以在步骤S02中采用对二维深度矩阵D p的每一行元素取平均的方式获得深度向量D;(2)在台阶边缘附近,深度值发生突变,利用此性质,可以在步骤S05中通过比较|Δd i-Δd i+1|与d 0的大小来识别发生突变的位置,从而识别台阶边缘的位置。
例如,在步骤S05中,在|Δd i-Δd i+1|大于d 0的情况下,还可以确定出深度向量D中的d i的数值为所述直线形边缘到用于获取目标物体的深度信息的摄像机的距离,即到摄像机包括的左摄像头和右摄像头的光心连线的距离d。
例如,如图1所示,本公开的至少一个实施例提供的图像处理方还包括步骤S06:在识别到目标物体的直线形边缘之后,根据摄像机拍摄用于获取二维深度矩阵D p的左图像和右图像时相对于水平面的旋转角度θ、以及该边缘到摄像机左右摄像头光心连线的上述距离d,可以确定出该边缘到光心连线的水平距离d*cosθ以及该边缘到光心连线的垂直距离d*sinθ,根据该水平距离和该垂直距离可以确定移动系统的直线移动距离和升降距离。
例如,在对目标物体进行拍摄以获取左图像和右图像的同时可以记录摄像机相对于水平面的旋转角度θ,从而有利于在识别到所述直线形边缘后获取拍摄时所记录的旋转角度θ。
例如,本发明的至少一个实施例提供的图像处理方法还可以包括:步骤A,摄像机以相对于水平面的不同旋转角度对目标物体进行拍摄,以获取多组左图像和右图像;步骤B,根据多组左图像和右图像获取多个二维深度矩阵;步骤C,确定每个二维深度矩阵对应的边缘数量;步骤D,确定分别对应0个边缘和1个边缘的二维深度矩阵分别对应的摄像机相对于水平面的旋转角度θ 0和θ 1;以及,在θ 0至θ 1的角度范围内重复步骤A至步骤C,以确 定摄像机相对于水平面的临界旋转角度θ c。临界旋转角度θ c是指二维深度矩阵对应的边缘数量在0和1之间进行改变时作为改变临界点的边缘数量为1的二维深度矩阵对应的旋转角度。以摄像机位于楼梯下方为例,摄像机相对于水平面的旋转角度逐渐增大的过程中,根据对应于不同旋转角度的二维深度矩阵所得到的边缘数量从0逐渐增大,在边缘数量逐渐变化的过程中存在边缘数量刚好变为1这一临界点,该临界点对应的二维深度矩阵为临界二维深度矩阵,并且该临界点对应的旋转角度为临界旋转角度θ c
例如,可以在α/2至45°的初始角度范围内(α为摄像机的视场角),以选定的步进角度(例如5°的角度变化量)调整摄像机的旋转角度(即摄像机与水平面之间的夹角),以获取不同旋转角度下的二维深度矩阵,通过判断不同旋转角度对应的二维深度矩阵所对应的楼梯阶梯数,选择在阶梯数为1和0之间的角度范围;之后,在该角度范围内进一步以不同的旋转角度获取深度向量D,直到得到阶梯数刚好为1时的旋转角度,该旋转角度即为临界旋转角度θ c
根据临界旋转角度θ c可以确定出台阶的宽度和高度。例如,本发明的至少一个实施例提供的图像处理方法还包括:如图2所示,临界旋转角度θ c与相邻的台阶边缘所在平面相对于水平面的夹角相等(即摄像机处于临界旋转角度θ c时其成像面与相邻台阶边缘所在平面平行),因此,根据临界旋转角度θ c计算相邻边缘之间的水平距离s(即台阶的宽度)和垂直距离h(即台阶的高度),其中,s=d*cosθ c,h=d*sinθ c。在图2中,C表示摄像机的左摄像头和右摄像头的彼此重合的光心,折线表示楼梯的台阶,摄像机相对于水平面的旋转角度θ c等于相邻的直线边缘Ed所在平面与水平面之间的夹角。
例如,本公开的至少一个实施例提供的图像处理方法还包括:在获取二维深度矩阵D p之前,利用摄像机的左摄像头和右摄像头分别获取包括目标物体的左图像和右图像;将左图像和右图像中的至少一个与存储在图像库中的参考图像进行比较,以确定左图像和右图像中的至少一个中的目标物体是否包括直线形边缘。
例如,本公开的至少一个实施例提供的图像处理方法还包括:在确定左图像和右图像中的至少一个中的目标物体包括直线形边缘之后,根据左图像和右图像获取多组对应像素点(例如,至少三组对应像素点),其中,每组 对应像素点包括对应于目标物体的同一物点的左图像中的左像素点和右图像中的右像素点;之后,比较每组对应像素点的左像素点与右像素点的纵坐标的大小,以确定目标物体的直线形边缘与左摄像头和右摄像头的光心连线是否平行。
图3为本公开实施例中的图像处理方法采用的三角测量原理的示意图。例如,如图3所示,C l表示左摄像头的光心;C r表示右摄像头的光心;D c表示左摄像头和右摄像头的光心之间的距离;f表示左摄像头和右摄像头的焦距(两个摄像头的焦距相等);P表示直线形边缘上的物点;P l和P r为物点P在摄像机焦距平面上的成像点并且二者构成一组对应像素点,(x l,y l)和(x r,y r)分别为呈现点P l和P r在对应图像中的位置的坐标点;d i表示物点P到左摄像头和右摄像头的光心连线C lC r的距离,d i=fD c/d x,d x=|x l-x r|。
例如,在对应像素点的左像素点的纵坐标等于右像素点的纵坐标的情况下,即在y l=y r的情况下,确定出直线形边缘与左摄像头和右摄像头的光心连线C lC r平行;在对应像素点的左像素点的纵坐标大于右像素点的纵坐标的情况下,即在y l>y r的情况下,确定出左摄像头的光心C l到直线形边缘的距离大于右摄像头的光心C r到直线形边缘的距离;在对应像素点的左像素点的纵坐标小于右像素点的纵坐标的情况下,即在y l<y r的情况下,确定出左摄像头的光心C l到直线形边缘的距离小于右摄像头的光心C r到直线形边缘的距离。
例如,本公开的至少一个实施例提供的图像处理方法还包括:在直线形边缘与光心连线C lC r平行的情况下,利用左图像和右图像获取视差矩阵D x,其中,视差矩阵D x中每个元素d x的数值为相应的一组对应像素点的左像素点和右像素点的横坐标之差的绝对值,也就是说,d x=|x l-x r|;利用公式d i=fD c/d x,并且根据视差矩阵D x、左摄像头和右摄像头的焦距f、以及左摄像头和右摄像头的光心之间的距离D c,可以得到二维深度矩阵D p
例如,左摄像头拍摄的左图像可以用矩阵R l表示,右摄像头获取的右图像可以用矩阵R r表示,R l=[l 11,l 12,l 13,…;l 21,l 22,l 23,…;…;…l n1,l n2,l n3,…],R r=[r 11,r 12,r 13,…;r 21,r 22,r 23,…;…;…r n1,r n2,r n3,…],其中,R l和R r中各元素的数值为灰度值;在直线形边缘与左、右摄像头的光心连线平行的情况下,通过对R l和R r中的元素进行匹配,将对应于相同物点的像素点提取出来,并且根据 公式d x=|x l-x r|,可得到视差矩阵D x
下面以本公开实施例用于识别楼梯为例并且结合图4,对本公开实施例提供的图像处理方法进行说明。例如,该图像处理方法包括以下步骤S1至步骤S8。
步骤S1:利用摄像机获取包括目标物体的左图像和右图像,并且记录摄像机的旋转角度。
步骤S2:将左图像和右图像中的至少一个与存储在图像库里的参考图像进行比较。
步骤S3:根据比较结果,确定目标物体是否包括直线形边缘,以确定目标物体是否为楼梯。若比较结果为目标物体不包括直线形边缘但包括楼梯拐弯,则确定转向位移(也称为变向位移或者转向角度);若比较结果既不包括直线形边缘也不包括楼梯拐弯,则返回步骤S1,继续拍摄摄像机的前方环境。
若比较结果为目标物体包括直线形边缘,则进行步骤S41:判断直线形边缘与摄像机的左摄像头和右摄像头的光心连线是否平行。
若直线形边缘与光心连线不平行,则进行步骤S42,即调整摄像机的左摄像头和右摄像头到目标物体的距离以使直线形边缘与光心连线平行,之后进行步骤S1。
若直线形边缘与光心连线平行,则进行步骤S5:根据左图像和右图像获取包括目标物体的深度信息的二维深度矩阵。
步骤S6:根据以上步骤S01至步骤S05,确定利用二维深度矩阵可得到的直线形边缘的个数,即二维深度矩阵中有多少个元素满足如以上步骤S05所述的|Δd i-Δd i+1|大于d 0这一边缘位置判定条件。
步骤S7:多次重复以上步骤,使摄像机以不同的旋转角度进行多次拍摄,直到确定出临界二维深度矩阵及临界旋转角度θ c。该步骤可参见以上相关描述,重复之处不再赘述。
步骤S8:根据得到的临界二维深度矩阵确定台阶边缘到摄像机的距离d,并且根据d和临界旋转角度θ c计算楼梯的每个台阶的宽度和高度。
本公开的至少一个实施例还提供一种图像处理系统,如图5所示,该图像处理系统包括:处理器;存储器,用于存储可执行指令,指令被处理器加 载并执行以下步骤:获取包括目标物体的深度信息的二维深度矩阵D p,其中,元素d ij为二维深度矩阵D p中的第i行和第j列元素,并且d ij的数值为深度值;计算二维深度矩阵D p中的每一行元素的平均值以得到深度向量D=[d 1,d 2,d 3,…,d i,…,d n],其中,元素d i为二维深度矩阵D p的第i行元素的平均值;计算深度向量D中的相邻元素的差的绝对值以得到边缘确定向量ΔD=[Δd 1,Δd 2,Δd 3,…,Δd i,…,Δd n-1],其中,元素Δd i=|d i-d i+1|;计算边缘确定向量ΔD中相邻元素差值的绝对值的平均值d 0=(|Δd 2-Δd 1|+|Δd 3-Δd 2|+…+|Δd n-1-Δd n-2|)/(n-2);以及将边缘确定向量ΔD中的相邻元素差值的绝对值|Δd i-Δd i+1|与d 0进行比较以识别目标物体的边缘,其中,在|Δd i-Δd i+1|大于d 0的情况下,确定二维深度矩阵D p中的第i行元素对应目标物体的边缘。
例如,存储器中的指令在被处理器运行后还可执行:确定直线形边缘到摄像机的距离d i、确定临界二维深度矩阵D p、确定临界旋转角度、确定直线形边缘到摄像机的水平距离和垂直距离、或者确定直线形边缘与摄像机是否平行,等等。这些步骤的实现方式可参照以上图像处理方法的实施例中的相关描述,重复之处不再赘述。
例如,该存储器可以是半导体存储器、磁表面存储器、激光存储器、随机存储器、只读存储器、串行访问存储器、非永久记忆的存储器、永久性记忆的存储器或者本领域熟知的任何其它形式的存储器。
例如,处理器可以是通用处理器、数字信号处理器(DSP)、专用集成电路(ASIC)、现场可编程门阵列(FPGA)或者其他可编程逻辑器件、分立门或者晶体管逻辑器件、分立硬件组件。通用处理器可以是微处理器或者任何常规的处理器等。
本公开的至少一个实施例还提供一种存储介质,其中存储有指令,该指令用于被处理器加载并执行:获取包括目标物体的深度信息的二维深度矩阵D p,其中,元素d ij为二维深度矩阵D p中的第i行和第j列元素,并且d ij的数值为深度值;计算二维深度矩阵D p中的每一行元素的平均值以得到深度向量D=[d 1,d 2,d 3,…,d i,…,d n],其中,元素d i为二维深度矩阵D p的第i行元素的平均值;计算深度向量D中的相邻元素的差的绝对值以得到边缘确定向量ΔD=[Δd 1,Δd 2,Δd 3,…,Δd i,…,Δd n-1],其中,元素Δd i=|d i-d i+1|;计算边缘确定向量ΔD中相邻元素差值的绝对值的平均值d 0= (|Δd 2-Δd 1|+|Δd 3-Δd 2|+…+|Δd n-1-Δd n-2|)/(n-2);以及将边缘确定向量ΔD中的相邻元素差值的绝对值|Δd i-Δd i+1|与d 0进行比较以识别目标物体的边缘,其中,在|Δd i-Δd i+1|大于d 0的情况下,确定二维深度矩阵D p中的第i行元素对应目标物体的边缘。
例如,存储介质中的指令在被处理器运行后还可执行:确定直线形边缘到摄像机的距离d i、确定临界二维深度矩阵D p、确定临界旋转角度、确定直线形边缘到摄像机的水平距离和垂直距离、或者确定直线形边缘与摄像机是否平行,等等。这些步骤的实现方式可参照以上图像处理方法的实施例中的相关描述,重复之处不再赘述。
例如,该存储介质可以是半导体存储器、磁表面存储器、激光存储器、随机存储器、只读存储器、串行访问存储器、非永久记忆的存储器、永久性记忆的存储器或者本领域熟知的任何其它形式的存储介质。
例如,处理器可以是通用处理器、数字信号处理器(DSP)、专用集成电路(ASIC)、现场可编程门阵列(FPGA)或者其他可编程逻辑器件、分立门或者晶体管逻辑器件、分立硬件组件。通用处理器可以是微处理器或者任何常规的处理器等。
下面以用于识别楼梯为例并结合图6、图7a和图7b,对本公开实施例提供的移动系统进行详细说明。
例如,如图6所示,本公开的至少一个实施例提供一种移动系统,其包括摄像机、以上任一实施例所述的图像处理系统和行动控制系统这三部分。
摄像机包括左摄像头和右摄像头,用于对前方环境(例如楼梯、或楼梯的拐弯、或其它前方环境)进行拍摄,以获取该前方环境的左图像和右图像。
图像处理系统用于对摄像机采集的图像进行处理,例如包括对摄像机拍摄的左图像和右图像进行初步识别和特征提取。
图像处理系统的图像初步识别功能包括:将采集的图像进行例如减少噪声等初步处理后进行存储,之后进行初步识别,例如,通过将采集的图像与参考图像进行比较以确定采集的图像中是否包括楼梯或楼梯的拐弯,由此实现初步识别。在一些实施例中,也可以将经过初步识别的图像存储到图像数据库中,用于图像训练,以进一步优化图像识别率。
图像处理系统的图像特征提取功能包括:(1)提取目标物体的直线形边 缘到摄像机的水平距离,例如提取楼梯的深度特征,即楼梯的单个台阶的宽度,以确定移动系统的移动距离(即向前或向后移动的距离);(2)提取目标物体的直线形边缘到摄像机的垂直距离,例如提取楼梯的高度特征,即楼梯的单个台阶的高度,以确定移动系统的高度距离(即向上或向下移动的距离);(3)通过图像训练和机器学习的方式提取关于楼梯拐弯的前方环境特征,以确定移动系统的变向位移(即拐弯的位移或者转向角度)。
行动控制系统用于根据图像处理系统的处理结果(例如图像处理系统提取的特征)对移动系统进行移动控制。例如,行动控制系统用于:(1)根据移动距离控制移动系统的行动轮(例如包括主动移动轮和从动移动轮)向前或向后移动;(2)根据高度距离控制爬楼轮向上或向下移动;(3)根据变向位移控制行动轮转向移动(即拐弯);(4)对行动轮进行制动。例如,在图像处理系统的处理结果为目标物体包括直线形边缘时(例如目标物体为楼梯时),则行动控制系统控制该移动系统至少实现如(1)和(2)所述的功能;在图像处理系统的处理结果为目标物体为楼梯转弯时,则行动控制系统控制该移动系统至少实现如(3)所述的功能。
由图6可以看出,本公开实施例中的移动系统通过摄像机和图像处理系统可以自动识别移动系统的前方环境,并且自动获取前方环境中楼梯的台阶宽度和高度,通过行动控制系统可以实现自动调节行进步长、爬楼高度或拐弯角度并且实现自动行进,从而可以实现在不同场景、不同阶梯宽度和不同高度的情况下使该移动系统智能地行进,而不需要人工辅助。
例如,如图7a和图7b所示,本公开的至少一个实施例提供的移动系统MS包括摄像机C、图像处理系统P和行动控制系统MCS。摄像机C包括左摄像头LC和右摄像头RC,摄像机C与图像处理系统P连接并且被配置为向图像处理系统P输出图像信号。图像处理系统P根据摄像机C输出的图像信号可以识别前方环境,例如识别楼梯的深度信息(即楼梯的台阶宽度)、高度距离和变向位移。行动控制系统MCS包括:主动移动装置AMD,其被配置为可移动所述移动系统MS;被动移动装置PMD,其与主动移动装置AMD连接,并且被配置为可在主动移动装置AMD的带动下移动;爬楼装置CD,其被配置为可带动所述移动系统MS实现爬楼功能;以及驱动控制装置DCD,其与图像处理系统P连接,并且被配置为可根据图像处理系统P的处 理结果控制主动移动装置AMD和爬楼装置CD的动作。驱动控制装置DCD可根据图像处理系统P的处理结果控制主动移动装置AMD实现自动调节行进步长、自动行进或者实现拐弯的功能,驱动控制装置DCD还可以根据图像处理系统P的处理结果控制爬楼装置CD实现爬楼的功能。
需要说明的是,本公开实施例中提及的移动系统的爬楼功能并仅针对楼梯,也可以针对任意的具有台阶状结构的物体。
例如,行动控制系统MCS还包括底盘BP,摄像机C(参见LC和RC)位于底盘BP前方以在移动系统MS移动过程中对前方环境进行拍摄,主动移动装置AMD、被动移动装置PMD和爬楼装置CD都位于底盘BP下方以带动底盘实现前后移动、转向移动和爬楼功能。底盘BP上方可用于承载重物或残疾人或老年人等。
例如,驱动控制装置DCD包括行进传动轴MDS、爬楼传动轴CDS、发动机E和制动器BK1,发动机E被配置为通过行进传动轴MDS驱动主动移动装置AMD并且通过爬楼传动轴CDS驱动爬楼装置CD,发送机E被配置为根据图像处理系统P的处理结果控制主动移动装置AMD和爬楼装置CD,制动器BK1被配置为根据图像处理系统P的处理结果控制主动移动装置AMD和被动移动装置PMD的制动。
例如,行进传动轴MDS通过行进变速器MSC与发动机E连接,并且将发动机E的动力传递给主动移动装置AMD;例如,爬楼传动轴CDS与通过爬楼变速器CSC与发动机E连接,并且将发动机E的动力传递给爬楼装置CD。
例如,制动器BK1通过液压管道HP或者其他类型的传动系统与主动移动装置AMD和被动移动装置PMD连接。
例如,驱动控制装置DCD还可以包括另一制动器BK2,其与被动移动装置PMD连接。例如,制动器BK2为制动鼓或其它类型的制动器。
例如,摄像机C包括的左摄像头LC和右摄像头RC都为CCD(charge-coupled device)相机或其它类型相机。
例如,爬楼装置CD包括爬楼轮,爬楼轮包括多个杆状结构CWR(例如3个或者更多个杆状结构),该多个杆状结构CWR从爬楼轮的中心向外呈辐射状伸出。在爬楼过程中,爬楼轮的杆状结构CWR作用在楼梯的台阶上, 通过旋转爬楼轮使得该杆状结构相应地发生旋转,由此带动移动系统MS跨越台阶,实现爬楼。例如,爬楼轮包括三个杆状结构CWR,并且这三个杆状结构CWR将圆周等分。该杆状结构CWR可以为任意刚性材质的杆。
例如,爬楼装置CD包括左爬楼轮LCW和右爬楼轮RCW这两个爬楼轮CW(例如左、右爬楼轮包括的杆状结构的长度相等),并且这两个爬楼轮的排列方向与左摄像头LC和右摄像头RC的排列方向相同(例如左、右爬楼轮的中心的连线与左、右摄像头的光心连线平行)。采用两个爬楼轮有利于使移动系统MS在爬楼过程中保持平稳。例如,左爬楼轮LCW和右爬楼轮RCW中的一个为主动爬楼轮且另一个为被动爬楼轮。主动爬楼轮在发动机E的控制下可向前翻转进行爬楼的动作。
例如,爬楼装置CD还包括可升高或降低的可升降杆,可升降杆连接底盘和爬楼轮,用于实现爬楼轮的升降,从而使移动系统适用于不同高度的台阶。例如,可升降杆与发动机E连接并且被配置为在发动机E的控制下升高或者降低。例如,爬楼装置CD包括与左爬楼轮LCW连接的左可升降杆LLF和与右爬楼轮RCW连接的右可升降杆RLF。
例如,主动移动装置AMD包括两个主动移动轮,参见左主动移动轮LAW和右主动移动轮RAW,用于实现移动系统在平面上前、后、左、右移动。这两个主动移动轮都设置于爬楼轮的后方,以避免使用过程中主动移动轮触碰楼梯,影响爬楼轮爬楼。例如,根据图像处理系统P的处理结果控制左、右主动移动轮的移动,可以实现转弯或直行。例如,在图像处理系统P的处理结果为对应像素点中左像素点和右像素点的纵坐标不相等时,即图3中的y l≠y r时,可以通过控制左、右主动移动轮实现转弯;若y l>y r,则通过控制左主动移动LAW轮向前运动,可以使y l=y r;若y l<y r,则通过控制右主动移动轮RAW向前运动,可以使得y l=y r。采用两个主动移动轮,便于对移动系统MS的前后移动和转向移动进行控制,并有利于使移动系统MS平稳地移动。
例如,主动移动装置AMD包括的两个主动移动轮的直径相等,并且这两个主动移动轮的中心的连线与摄像机的左摄像头LC和右摄像头RC的光心连线平行。在本公开实施例中,通过比较楼梯台阶边缘的同一物点在左摄像头获取的左图像中的左像素点以及在右摄像头获取的右图像中的右像素点 的纵坐标是否相等,可以判断台阶边缘是否与左摄像头和右摄像头的光心连线平行;在二者不平行的情况下,通过将两个主动移动轮的直径设置为相等,便于通过移动左主动移动轮LAW或右主动移动轮RAW来使台阶边缘与光线连线保持平行。
例如,被动移动装置PMD包括位于主动移动轮的后方的从动移动轮、半轴HS和差速器DF,差速器DF与行进传动轴MDS连接,半轴HS将从动移动轮PW与差速器DF连接。差速器DF用于使主动移动轮和从动移动轮实现以不同转速转动。半轴HS用于实现在差速器DF与从动移动轮之间传递动力。例如,被动移动装置PMD包括沿左摄像头LC和右摄像头RC的排列方向排列的左从动移动轮LPW和右从动移动轮RPW,以便于移动系统MS保持平稳。
本公开的至少另一个实施例提供一种用于与图像处理系统P配合使用的移动系统,其与如图7a所示的移动系统MS类似,主要区别在于不包括图像处理系统P。该移动系统包括摄像机和行动控制系统,如图7a所示,摄像机包括左摄像头LC和右摄像头RC,并且被配置为用于向图像处理系统P输出图像信号。行动控制系统包括:主动移动装置,其被配置为可根据图像处理系统P的处理结果移动移动系统MS;被动移动装置,其与主动移动装置连接,并且被配置为可在主动移动装置的带动下移动;爬楼装置,其被配置为可带动移动系统MS实现爬楼功能,爬楼装置包括爬楼轮(例如包括左爬楼轮LCW和右爬楼轮RCW),爬楼轮包括多个杆状结构CWR,该多个杆状结构CWR从爬楼轮的中心向外呈辐射状伸出;驱动控制装置,其被配置为可根据图像处理系统P的处理结果控制主动移动装置和爬楼装置的动作。
例如,驱动控制装置包括中央处理器CPU、行进传动轴MDS、爬楼传动轴CDS、发动机E和制动器BK1,发动机E被配置为通过行进传动轴MDS驱动主动移动装置并且通过爬楼传动轴CDS驱动爬楼装置,发送机E被配置为根据中央处理器CPU的指令控制主动移动装置和爬楼装置,制动器BK1被配置为根据中央处理器CPU的指令控制主动移动装置和被动移动装置的制动。例如,在该移动系统采用图像处理系统P时,中央处理器CPU与图像处理系统P可以集成在一起或者二者分别采用彼此连接的独立器件。
例如,爬楼装置包括两个爬楼轮CW,这两个爬楼轮CW的排列方向与 左摄像头LC和右摄像头RC的排列方向相同。
例如,爬楼装置还包括可升降杆,可升降杆与爬楼轮连接。
以上图像处理系统P配合使用的移动系统中各部件的实施例方式可参照上述包括图像处理系统P的移动系统MS的实施例中的相关描述,重复之处不再赘述。
综上所述,本公开实施例提供的图像处理方法及系统可以用于自动识别诸如楼梯等台阶状结构;本公开实施例中的移动系统通过摄像机和图像处理系统可以自动识别移动系统的前方环境并且自动获取前方环境中楼梯的台阶宽度和高度,通过行动控制系统可以实现自动调节行进步长、爬楼高度或拐弯角并且实现自动行进,从而可以实现在不同场景、不同阶梯宽度和不同阶梯高度的情况下使该移动系统智能地行进,而不需要人工辅助。
以上所述仅是本公开的示范性实施方式,而非用于限制本公开的保护范围,本公开的保护范围由所附的权利要求确定。
本申请要求于2017年6月30日递交的中国专利申请第201710554504.1号的优先权,在此全文引用上述中国专利申请公开的内容以作为本申请的一部分。

Claims (20)

  1. 一种图像处理方法,包括:
    获取包括目标物体的深度信息的二维深度矩阵D p,其中,元素d ij为所述二维深度矩阵D p中的第i行和第j列元素,并且d ij的数值为深度值;
    计算所述二维深度矩阵D p中的每一行元素的平均值以得到深度向量D=[d 1,d 2,d 3,…,d i,…,d n],其中,元素d i为所述二维深度矩阵D p的第i行元素的平均值;
    计算所述深度向量D中的相邻元素的差的绝对值以得到边缘确定向量ΔD=[Δd 1,Δd 2,Δd 3,…,Δd i,…,Δd n-1],其中,元素Δd i=|d i-d i+1|;
    计算所述边缘确定向量ΔD中相邻元素差值的绝对值的平均值d 0=(|Δd 2-Δd 1|+|Δd 3-Δd 2|+…+|Δd n-1-Δd n-2|)/(n-2);以及
    将边缘确定向量ΔD中的相邻元素差值的绝对值|Δd i-Δd i+1|与d 0进行比较以识别所述目标物体的边缘,其中,在|Δd i-Δd i+1|大于d 0的情况下,确定二维深度矩阵D p中的第i行元素对应所述目标物体的边缘。
  2. 根据权利要求1所述的方法,其中,在|Δd i-Δd i+1|大于d 0的情况下,确定深度向量D中的d i的数值为所述边缘到用于获取所述目标物体的深度信息的摄像机包括的左摄像头和右摄像头的光心连线的距离d。
  3. 根据权利要求2所述的图像处理方法,还包括:
    根据所述摄像机拍摄用于获取所述二维深度矩阵D p的左图像和右图像时相对于水平面的旋转角度θ、以及所述边缘到所述光心连线的所述距离d,确定所述边缘到所述光心连线的水平距离以及所述边缘到所述光心连线的垂直距离。
  4. 根据权利要求1所述的图像处理方法,还包括:
    步骤A,摄像机以相对于水平面的不同旋转角度对所述目标物体进行拍摄,以获取多组左图像和右图像;
    步骤B,根据所述多组左图像和右图像获取多个二维深度矩阵;
    步骤C,确定每个二维深度矩阵对应的边缘数量;
    步骤D,确定分别对应0个边缘和1个边缘的二维深度矩阵分别对应的摄像机相对于水平面的旋转角度θ 0和θ 1;以及
    在θ 0至θ 1的角度范围内重复步骤A至步骤C,以确定摄像机相对于水平面的临界旋转角度θ c,其中,所述临界旋转角度θ c是指二维深度矩阵对应的边缘数量在0和1之间进行改变时作为改变临界点的边缘数量为1的二维深度矩阵对应的旋转角度。
  5. 根据权利要求4所述的图像处理方法,还包括:
    根据所述临界旋转角度θ c计算相邻边缘之间的水平距离和垂直距离。
  6. 根据权利要求1所述的图像处理方法,还包括:
    在获取所述二维深度矩阵D p之前,利用摄像机的左摄像头和右摄像头分别获取包括所述目标物体的左图像和右图像;
    将所述左图像和右图像中的至少一个与参考图像进行比较,以确定所述左图像和所述右图像中的所述至少一个中的目标物体是否包括直线形边缘。
  7. 根据权利要求6所述的图像处理方法,还包括:
    在确定所述左图像和所述右图像中的所述至少一个中的目标物体包括直线形边缘之后,根据所述左图像和右图像获取多组对应像素点,其中,每组对应像素点包括对应于所述目标物体的同一物点的左图像中的左像素点和右图像中的右像素点;以及
    比较每组对应像素点的左像素点与右像素点的纵坐标的大小,以确定所述目标物体的直线形边缘与所述左摄像头和右摄像头的光心连线是否平行。
  8. 根据权利要求7所述的图像处理方法,其中,
    在对应像素点的左像素点的纵坐标等于右像素点的纵坐标的情况下,确定所述直线形边缘与所述左摄像头和右摄像头的光心连线平行;
    在对应像素点的左像素点的纵坐标大于右像素点的纵坐标的情况下,确定所述左摄像头的光心到所述直线形边缘的距离大于所述右摄像头的光心到所述直线形边缘的距离;以及
    在对应像素点的左像素点的纵坐标小于右像素点的纵坐标的情况下,确定所述左摄像头的光心到所述直线形边缘的距离小于所述右摄像头的光心到所述直线形边缘的距离。
  9. 根据权利要求7或8所述的图像处理方法,还包括:
    在所述直线形边缘与所述光心连线平行的情况下,利用所述左图像和所述右图像获取视差矩阵D x,其中,所述视差矩阵D x中每个元素的数值为所 述每个元素对应的一组对应像素点的左像素点和右像素点的横坐标之差的绝对值;以及
    根据所述视差矩阵D x、所述左摄像头和右摄像头的焦距f、以及所述左摄像头和所述右摄像头的光心之间的距离D c,得到所述二维深度矩阵D p
  10. 一种图像处理系统,包括:
    处理器;以及
    存储器,用于存储可执行指令,所述指令被所述处理器加载并执行:
    获取包括目标物体的深度信息的二维深度矩阵D p,其中,元素d ij为所述二维深度矩阵D p中的第i行和第j列元素,并且d ij的数值为深度值;
    计算所述二维深度矩阵D p中的每一行元素的平均值以得到深度向量D=[d 1,d 2,d 3,…,d i,…,d n],其中,元素d i为所述二维深度矩阵D p的第i行元素的平均值;
    计算所述深度向量D中的相邻元素的差的绝对值以得到边缘确定向量ΔD=[Δd 1,Δd 2,Δd 3,…,Δd i,…,Δd n-1],其中,元素Δd i=|d i-d i+1|;
    计算所述边缘确定向量ΔD中相邻元素差值的绝对值的平均值d 0=(|Δd 2-Δd 1|+|Δd 3-Δd 2|+…+|Δd n-1-Δd n-2|)/(n-2);以及
    将边缘确定向量ΔD中的相邻元素差值的绝对值|Δd i-Δd i+1|与d 0进行比较以识别所述目标物体的边缘,其中,在|Δd i-Δd i+1|大于d 0的情况下,确定二维深度矩阵D p中的第i行元素对应所述目标物体的边缘。
  11. 一种存储介质,其中存储有指令,其中,所述指令用于被处理器加载并执行:
    获取包括目标物体的深度信息的二维深度矩阵D p,其中,元素d ij为所述二维深度矩阵D p中的第i行和第j列元素,并且d ij的数值为深度值;
    计算所述二维深度矩阵D p中的每一行元素的平均值以得到深度向量D=[d 1,d 2,d 3,…,d i,…,d n],其中,元素d i为所述二维深度矩阵D p的第i行元素的平均值;
    计算所述深度向量D中的相邻元素的差的绝对值以得到边缘确定向量ΔD=[Δd 1,Δd 2,Δd 3,…,Δd i,…,Δd n-1],其中,元素Δd i=|d i-d i+1|;
    计算所述边缘确定向量ΔD中相邻元素差值的绝对值的平均值d 0=(|Δd 2-Δd 1|+|Δd 3-Δd 2|+…+|Δd n-1-Δd n-2|)/(n-2);以及
    将边缘确定向量ΔD中的相邻元素差值的绝对值|Δd i-Δd i+1|与d 0进行比较以识别所述目标物体的边缘,其中,在|Δd i-Δd i+1|大于d 0的情况下,确定二维深度矩阵D p中的第i行元素对应所述目标物体的边缘。
  12. 一种移动系统,包括根据权利要求10所述的图像处理系统。
  13. 根据权利要求12所述的移动系统,还包括摄像机和行动控制系统,其中,
    所述摄像机包括左摄像头和右摄像头并且被配置为向所述图像处理系统输出图像信号;
    所述行动控制系统包括:
    主动移动装置,其被配置为可移动所述移动系统;
    被动移动装置,其与所述主动移动装置连接,并且被配置为可在所述主动移动装置的带动下移动;
    爬楼装置,其被配置为可带动所述移动系统实现爬楼功能;以及
    驱动控制装置,其被配置为可根据所述图像处理系统的处理结果控制所述主动移动装置和所述爬楼装置的动作。
  14. 根据权利要求13所述的移动系统,其中,所述爬楼装置包括爬楼轮,所述爬楼轮包括多个杆状结构,所述多个杆状结构从所述爬楼轮的中心向外呈辐射状伸出。
  15. 根据权利要求14所述的移动系统,其中,所述爬楼装置还包括可升降杆,所述可升降杆与所述爬楼轮连接。
  16. 根据权利要求13至15中任一项所述的移动系统,其中,所述主动移动装置包括两个主动移动轮,所述两个主动移动轮的直径相等,并且所述两个主动移动轮的中心的连线与所述摄像机的左摄像头和右摄像头的光心的连线平行。
  17. 根据权利要求13至16中任一项所述的移动系统,其中,所述驱动控制装置包括行进传动轴、爬楼传动轴、发动机和制动器,所述发动机被配置为通过所述行进传动轴驱动所述主动移动装置并且通过所述爬楼传动轴驱动所述爬楼装置,所述制动器被配置为控制所述主动移动装置和所述被动移动装置的制动。
  18. 根据权利要求17所述的移动系统,其中,所述被动移动装置包括从 动移动轮、半轴和差速器,所述差速器与所述行进传动轴连接,所述半轴将所述从动移动轮与所述差速器连接。
  19. 根据权利要求13至18中任一项所述的移动系统,其中,所述行动控制系统还包括底盘,所述摄像机位于所述底盘的前端,所述主动移动装置、被动移动装置和爬楼装置位于所述底盘下方。
  20. 一种用于与权利要求10所述的图像处理系统配合使用的移动系统,包括:
    摄像机,其包括左摄像头和右摄像头,并且被配置为向所述图像处理系统输出图像信号;以及
    行动控制系统,其包括:
    主动移动装置,其被配置为可移动所述移动系统;
    被动移动装置,其与所述主动移动装置连接,并且被配置为可在所述主动移动装置的带动下移动;
    爬楼装置,其被配置为可带动所述移动系统实现爬楼功能,其中,所述爬楼装置包括爬楼轮,所述爬楼轮包括多个杆状结构,所述多个杆状结构从所述爬楼轮的中心向外呈辐射状伸出;以及
    驱动控制装置,其被配置为可根据所述图像处理系统的处理结果控制所述主动移动装置和所述爬楼装置的动作。
PCT/CN2018/075057 2017-06-30 2018-02-02 图像处理方法和系统、存储介质和移动系统 WO2019000947A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
EP18764983.5A EP3648053B1 (en) 2017-06-30 2018-02-02 Image processing method and system, storage medium, and moving system
US16/085,652 US10933931B2 (en) 2017-06-30 2018-02-02 Image processing method and system, storage medium and moving system

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201710554504.1 2017-06-30
CN201710554504.1A CN109215044B (zh) 2017-06-30 2017-06-30 图像处理方法和系统、存储介质和移动系统

Publications (1)

Publication Number Publication Date
WO2019000947A1 true WO2019000947A1 (zh) 2019-01-03

Family

ID=64740341

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2018/075057 WO2019000947A1 (zh) 2017-06-30 2018-02-02 图像处理方法和系统、存储介质和移动系统

Country Status (4)

Country Link
US (1) US10933931B2 (zh)
EP (1) EP3648053B1 (zh)
CN (1) CN109215044B (zh)
WO (1) WO2019000947A1 (zh)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6996200B2 (ja) * 2017-09-29 2022-01-17 富士通株式会社 画像処理方法、画像処理装置、および画像処理プログラム
US11548151B2 (en) 2019-04-12 2023-01-10 Boston Dynamics, Inc. Robotically negotiating stairs
US11599128B2 (en) * 2020-04-22 2023-03-07 Boston Dynamics, Inc. Perception and fitting for a stair tracker
CN112116660B (zh) * 2019-06-19 2024-03-29 京东方科技集团股份有限公司 视差图校正方法、装置、终端及计算机可读介质
CN113049016B (zh) * 2021-04-29 2023-05-30 宿州学院 一种土木工程的道路桥梁自走式勘测装置
CN114241737B (zh) * 2021-11-30 2022-07-22 慧之安信息技术股份有限公司 一种基于深度学习的攀登铁塔监测方法和装置

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006260105A (ja) * 2005-03-16 2006-09-28 Matsushita Electric Works Ltd 移動装置
US20130127996A1 (en) * 2011-11-23 2013-05-23 Samsung Electronics Co., Ltd. Method of recognizing stairs in three dimensional data image
CN104331884A (zh) * 2014-10-29 2015-02-04 上海大学 四触角履带机器人爬楼梯参数获取系统和方法
CN105074600A (zh) * 2013-02-27 2015-11-18 夏普株式会社 周围环境识别装置、使用其的自主移动系统以及周围环境识别方法
CN106821692A (zh) * 2016-11-23 2017-06-13 杭州视氪科技有限公司 一种基于rgb‑d相机和立体声的视障人士楼梯检测系统及方法

Family Cites Families (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7697786B2 (en) * 2005-03-14 2010-04-13 Sarnoff Corporation Method and apparatus for detecting edges of an object
CN104011772B (zh) * 2011-10-19 2017-02-15 克朗设备公司 基于识别和跟踪图像场景中的多个对象来控制车叉
US8660362B2 (en) * 2011-11-21 2014-02-25 Microsoft Corporation Combined depth filtering and super resolution
CN102636152B (zh) * 2012-04-19 2013-12-18 慈溪思达电子科技有限公司 可移动平台的主动视觉测距系统
CN103248906B (zh) * 2013-04-17 2015-02-18 清华大学深圳研究生院 一种双目立体视频序列的深度图获取方法与系统
CN104252706B (zh) * 2013-06-27 2017-04-12 株式会社理光 特定平面的检测方法和系统
CN103400392B (zh) * 2013-08-19 2016-06-22 山东鲁能智能技术有限公司 基于变电站巡检机器人的双目视觉导航系统及方法
TWI517099B (zh) * 2013-11-15 2016-01-11 瑞昱半導體股份有限公司 影像連續邊緣偵測系統與其方法
CN103868460B (zh) * 2014-03-13 2016-10-05 桂林电子科技大学 基于视差优化算法的双目立体视觉自动测量方法
CN105354819B (zh) * 2015-09-29 2018-10-09 上海图漾信息科技有限公司 深度数据测量系统、深度数据确定方法和装置
CN105740802A (zh) * 2016-01-28 2016-07-06 北京中科慧眼科技有限公司 基于视差图的障碍物检测方法和装置及汽车驾驶辅助系统
CN106228110B (zh) * 2016-07-07 2019-09-20 浙江零跑科技有限公司 一种基于车载双目相机的障碍物及可行驶区域检测方法
CN106203390B (zh) * 2016-07-22 2019-09-24 杭州视氪科技有限公司 一种智能盲人辅助系统
CN106056107B (zh) * 2016-07-28 2021-11-16 福建农林大学 一种基于双目视觉避桩控制方法
CN106814734A (zh) * 2016-11-30 2017-06-09 北京贝虎机器人技术有限公司 使用计算设备控制自主移动式设备的方法及系统

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006260105A (ja) * 2005-03-16 2006-09-28 Matsushita Electric Works Ltd 移動装置
US20130127996A1 (en) * 2011-11-23 2013-05-23 Samsung Electronics Co., Ltd. Method of recognizing stairs in three dimensional data image
CN105074600A (zh) * 2013-02-27 2015-11-18 夏普株式会社 周围环境识别装置、使用其的自主移动系统以及周围环境识别方法
CN104331884A (zh) * 2014-10-29 2015-02-04 上海大学 四触角履带机器人爬楼梯参数获取系统和方法
CN106821692A (zh) * 2016-11-23 2017-06-13 杭州视氪科技有限公司 一种基于rgb‑d相机和立体声的视障人士楼梯检测系统及方法

Also Published As

Publication number Publication date
US10933931B2 (en) 2021-03-02
CN109215044A (zh) 2019-01-15
EP3648053A4 (en) 2021-04-28
EP3648053B1 (en) 2023-10-11
CN109215044B (zh) 2020-12-15
US20190256159A1 (en) 2019-08-22
EP3648053A1 (en) 2020-05-06

Similar Documents

Publication Publication Date Title
WO2019000947A1 (zh) 图像处理方法和系统、存储介质和移动系统
WO2021004312A1 (zh) 一种基于双目立体视觉系统的车辆智能测轨迹方法
DE60308782T2 (de) Vorrichtung und Methode zur Hinderniserkennung
WO2020258286A1 (zh) 图像处理方法、装置、拍摄装置和可移动平台
WO2018024006A1 (zh) 一种聚焦型光场相机的渲染方法和系统
US10821897B2 (en) Method and device for adjusting driver assistance apparatus automatically for personalization and calibration according to driver&#39;s status
WO2019015493A1 (zh) 用于智能车库的基于光学图像的车辆识别系统
WO2010124497A1 (zh) 一种运动检测方法、装置和系统
CN104361628A (zh) 一种基于航空倾斜摄影测量的三维实景建模系统
JP4084857B2 (ja) 自動車の前方監視システムにおけるイメージセンサの縦横比設定方法
CN107084680A (zh) 一种基于机器单目视觉的目标深度测量方法
CN104469170B (zh) 双目摄像装置、图像处理方法及装置
WO2021204267A1 (zh) 身份识别
CN103824303A (zh) 基于被摄物的位置、方向调整图像透视畸变的方法和装置
CN108154536A (zh) 二维平面迭代的相机标定法
CN113673584A (zh) 一种图像检测方法及相关装置
JP6337504B2 (ja) 画像処理装置、移動体、ロボット、機器制御方法およびプログラム
TWI532619B (zh) Dual Image Obstacle Avoidance Path Planning Navigation Control Method
CN113724335B (zh) 一种基于单目相机的三维目标定位方法及系统
Sui et al. Extrinsic calibration of camera and 3D laser sensor system
CN106556958A (zh) 距离选通成像的自动聚焦方法
CN108090930A (zh) 基于双目立体相机的障碍物视觉检测系统及方法
JP2008287436A5 (zh)
JP3447461B2 (ja) 移動障害物検出装置及びその方法
KR101781172B1 (ko) 영상 정합 장치 및 방법

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18764983

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: 2018764983

Country of ref document: EP

Effective date: 20200130