CN111376833A - Method, apparatus and computer program product for monitoring an object


Info

Publication number
CN111376833A
Authority
CN
China
Prior art keywords
information
value
attribute
vehicle
determining
Prior art date
Legal status
Pending
Application number
CN201811623957.6A
Other languages
Chinese (zh)
Inventor
李小中
Current Assignee
Qoros Automotive Co Ltd
Original Assignee
Qoros Automotive Co Ltd
Priority date
Filing date
Publication date
Application filed by Qoros Automotive Co Ltd
Priority to CN201811623957.6A
Publication of CN111376833A
Legal status: Pending

Classifications

    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B60 - VEHICLES IN GENERAL
    • B60R - VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R1/00 - Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B60 - VEHICLES IN GENERAL
    • B60R - VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R2300/00 - Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B60R2300/30 - Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle, characterised by the type of image processing
    • B60R2300/301 - Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle, characterised by the type of image processing, combining image information with other obstacle sensor information, e.g. using RADAR/LIDAR/SONAR sensors for estimating risk of collision
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B60 - VEHICLES IN GENERAL
    • B60R - VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R2300/00 - Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B60R2300/80 - Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle, characterised by the intended use of the viewing arrangement
    • B60R2300/8093 - Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle, characterised by the intended use of the viewing arrangement, for obstacle warning

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Mechanical Engineering (AREA)
  • Traffic Control Systems (AREA)

Abstract

The present invention relates to a method, apparatus and computer program product for monitoring an object. The method includes receiving first information associated with an attribute of a monitored object from a radar of a vehicle. The method also includes receiving second information associated with the attribute of the monitored object from a camera of the vehicle. The method further includes determining target information for the attribute of the monitored object based on the first information and the second information. By using the method, the accuracy with which the vehicle measures the distance to surrounding objects is improved.

Description

Method, apparatus and computer program product for monitoring an object
Technical Field
The present invention relates to the field of monitoring, and in particular to a method, apparatus and computer program product for monitoring an object.
Background
With the development of society, more and more families use vehicles as a means of travel. As vehicle use increases, vehicle functions also continue to improve with advances in technology. For example, many automobiles are now equipped with display devices. The display device in a vehicle can show the user a great deal of useful information, such as the rear-view image when reversing and various functional parameters of the vehicle. In this way, the user can easily learn information related to the vehicle.
As vehicle use has increased, many driver-assistance devices have come into use in vehicles. For example, when driving, particularly at night, an infrared night-vision system or a camera is generally used to acquire information about objects around the vehicle, and the acquired information is then displayed on a display device in the vehicle. However, how to better acquire information about objects around the vehicle has become a pressing problem.
Disclosure of Invention
The invention provides a method, an apparatus and a computer program product for monitoring an object.
According to a first aspect of the present disclosure, a method for monitoring an object is provided. The method includes receiving first information associated with an attribute of a monitored object from a radar of a vehicle. The method also includes receiving second information associated with the attribute of the monitored object from a camera of the vehicle. The method further includes determining target information for the attribute of the monitored object based on the first information and the second information.
According to a second aspect of the present disclosure, an apparatus for monitoring an object is provided. The apparatus includes: a first receiving module configured to receive first information associated with an attribute of a monitored object from a radar of a vehicle; a second receiving module configured to receive second information associated with the attribute of the monitored object from a camera of the vehicle; and a first determination module configured to determine target information for the attribute of the monitored object based on the first information and the second information.
In a third aspect of the disclosure, an electronic device is provided that includes one or more processors and a storage device. The storage device stores one or more programs which, when executed by the one or more processors, cause the one or more processors to perform the method according to the first aspect of the disclosure.
In a fourth aspect of the present disclosure, a computer-readable medium is provided on which a computer program is stored which, when executed by a processor, implements the method according to the first aspect of the present disclosure.
It should be understood that this summary is not intended to identify key or essential features of the embodiments of the present disclosure, nor is it intended to limit the scope of the present disclosure. Other features of the present disclosure will become apparent from the following description.
Drawings
The foregoing and other objects, features and advantages of the disclosure will be apparent from the following more particular descriptions of exemplary embodiments of the disclosure as illustrated in the accompanying drawings wherein like reference numbers generally represent like parts throughout the exemplary embodiments of the disclosure.
FIG. 1 illustrates a schematic diagram of an example environment 100 in which apparatuses and/or methods according to embodiments of the present disclosure may be implemented;
FIG. 2 illustrates a flow diagram of a method 200 for monitoring an object according to an embodiment of the present disclosure;
FIG. 3 illustrates a schematic diagram 300 showing monitoring results according to an embodiment of the present disclosure;
FIG. 4 illustrates a block diagram of an apparatus 400 for monitoring an object according to an embodiment of the present disclosure; and
FIG. 5 illustrates a schematic block diagram of an example device 500 suitable for implementing embodiments of the present disclosure.
Like or corresponding reference characters designate like or corresponding parts throughout the several views.
Detailed Description
Embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While certain embodiments of the present disclosure are shown in the drawings, it should be understood that the present disclosure may be embodied in various forms and should not be construed as limited to the embodiments set forth here; rather, these embodiments are provided so that the present disclosure will be understood more thoroughly and completely. It should be understood that the drawings and embodiments of the disclosure are for illustration purposes only and are not intended to limit the scope of the disclosure.
In describing embodiments of the present disclosure, the term "include" and its variants are to be read as open-ended, i.e., "including but not limited to". The term "based on" should be understood as "based at least in part on". The term "one embodiment" or "the embodiment" should be understood as "at least one embodiment". The terms "first", "second", and the like may refer to different or the same objects. Other explicit and implicit definitions may also be included below.
The principles of the present disclosure will be described below with reference to a number of example embodiments shown in the drawings. While the preferred embodiments of the present disclosure have been illustrated in the accompanying drawings, it is to be understood that these embodiments are described merely for the purpose of enabling those skilled in the art to better understand and to practice the present disclosure, and are not intended to limit the scope of the present disclosure in any way.
During vehicle operation, especially at night, infrared imaging devices or cameras are typically used to capture objects around the vehicle, for example objects in front of the vehicle such as a vehicle traveling ahead, a pedestrian, or a stationary obstacle. However, the distance-measurement accuracy of infrared imaging devices and cameras is low, and their all-weather performance is poor: performance may be degraded in fog, snow, and sandstorms. In addition, infrared imaging devices and cameras convey limited information, often only the outline of an object.
To address the above problems, the present disclosure provides a method of monitoring an object. In this method, information about objects around the vehicle is acquired by a radar and a camera (e.g., a forward-looking camera) separately, and attribute information of those objects is then determined based on the information acquired by both the radar and the camera. The relevant object information is then displayed on a display device within the vehicle. With this method, the distance-measurement accuracy for objects around the vehicle is improved, the all-weather operating capability of the vehicle is enhanced, and more object information can be displayed.
Fig. 1 illustrates a schematic diagram of an example environment 100 in which devices and/or methods according to embodiments of the disclosure may be implemented.
In FIG. 1, an example environment 100 has a vehicle therein that includes a camera 102, a display module 104, a radar 106, and a control module 108.
In the vehicle, the camera 102 is used to acquire image data of the vehicle's surroundings. There may be one or more cameras 102; only one camera 102 is depicted in FIG. 1 as an example. The foregoing example is provided merely to illustrate the disclosure and is not intended to limit it. One skilled in the art may use any suitable number of cameras 102 as needed.
The camera 102 may be of any suitable type. In one example, the camera 102 may be an in-vehicle camera. Alternatively or additionally, to obtain data about the area in front of the vehicle, the camera 102 may be an Advanced Driver Assistance System (ADAS) forward-looking camera. In another example, the vehicle has multiple cameras, which may or may not be of the same type. The foregoing examples are provided merely to illustrate the disclosure and are not intended to limit it.
The camera 102 may be used to acquire image data associated with a monitored object. In one example, the number of monitored objects and their attribute information may be determined from the image data. Alternatively or additionally, the attributes of the monitored object may include distance, speed, azimuth, and profile. The distance represents the straight-line distance between the monitored object and the vehicle. The speed indicates how fast the monitored object travels. The azimuth represents the angular relationship between the monitored object and the vehicle. The profile represents the shape and size of the monitored object. The foregoing examples are provided merely to illustrate the disclosure and are not intended to limit it. One skilled in the art may obtain any information related to the monitored object as needed.
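For illustration only, the attribute set described above could be modeled as a simple data structure. This is a hypothetical sketch in Python, not part of the patent; all names and units are assumptions:

    from dataclasses import dataclass
    from typing import Optional, Tuple

    @dataclass
    class MonitoredObject:
        """One monitored object and its attributes (illustrative sketch)."""
        distance_m: float                              # straight-line distance from the vehicle
        speed_mps: float                               # travel speed of the object
        azimuth_deg: float                             # angular relationship to the vehicle
        profile: Optional[Tuple[float, float]] = None  # (width, height); camera-derived outline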
In the vehicle, the radar 106 is used to acquire data about monitored objects around the vehicle. There may be one or more radars 106; only one radar 106 is depicted in FIG. 1 as an example. The foregoing example is provided merely to illustrate the disclosure and is not intended to limit it. One skilled in the art may use any suitable number of radars.
The radar 106 may be of any suitable type. In one example, the radar 106 is a millimeter-wave radar, which has a relatively large measurement range and can detect objects relatively far from the vehicle. In another example, the radar 106 is a lidar, which also has a relatively large measurement range. In yet another example, there may be multiple radars 106, which may or may not be of the same type. The foregoing examples are provided merely to illustrate the disclosure and are not intended to limit it. Any suitable type of radar may be used as needed.
The control module 108 is configured to process data from the camera 102 and the radar 106 and to send the processed data to the display module 104 for display. The control module 108 may include a hardware processor, including but not limited to a central processing unit (CPU), a field-programmable gate array (FPGA), a complex programmable logic device (CPLD), an application-specific integrated circuit (ASIC), a system on a chip (SoC), or a combination thereof.
The display module 104 is used to display the data processed by the control module 108, presenting monitored-object information to a user in the vehicle. In one example, the display module 104 may be a display device within the vehicle. In another example, the display module 104 may be a display device coupled to the vehicle. The foregoing examples are provided merely to illustrate the disclosure and are not intended to limit it. Further, the vehicle may omit the display module 104; in future autonomous driving, the vehicle may not need to display the information determined by the control module 108.
The placement of the components in FIG. 1 is merely an example, not a limitation of the present disclosure; the components may be placed at any suitable location based on actual needs.
FIG. 1 above describes an example environment 100 in which the present disclosure may be applied; the method for monitoring an object is described in detail below in conjunction with FIG. 2.
After the vehicle is started, the radar 106 and the camera 102 may be turned on to obtain data about the vehicle's surroundings. At block 202, the control module 108 receives information associated with an attribute of the monitored object from the radar 106 of the vehicle. For convenience, the information associated with the attribute of the monitored object obtained by the radar 106 is referred to as first information in the following description.
In one example, the first information includes attribute information of the monitored object. Alternatively or additionally, the attribute of the monitored object comprises one or more of distance, speed, and azimuth. In another example, any attribute information associated with the monitored object may be obtained based on actual needs. The above examples are intended only to illustrate the present disclosure and not to limit it.
In another example, in addition to the information associated with the attributes of the monitored objects, the control module 108 obtains from the radar 106 other information associated with the monitored objects, e.g., the number of monitored objects.
At block 204, information associated with the attribute of the monitored object is received from the camera 102 of the vehicle. For convenience, the information associated with the attribute of the monitored object obtained by the camera 102 is referred to as second information in the following description.
In one example, the second information includes attribute information of the monitored object. Alternatively or additionally, the attributes of the monitored object include one or more of distance, speed, azimuth, and profile. Alternatively or additionally, the profile includes the shape and size of the monitored object. In another example, any attribute information associated with the monitored object may be obtained based on actual needs. The above examples are intended only to illustrate the present disclosure and not to limit it.
In another example, in addition to the information associated with the attributes of the monitored objects, the control module 108 obtains from the camera 102 other information associated with the monitored objects, e.g., the number of monitored objects.
At block 206, target information for the attribute of the monitored object is determined based on the first information and the second information. After obtaining the first and second information from the radar 106 and the camera 102, respectively, the control module 108 determines the information associated with the attribute of the monitored object from the data provided by both.
By combining the first information obtained by the radar 106 with the second information obtained by the camera 102 to derive the target information for the attribute of the monitored object, high accuracy can be achieved; for example, the ranging accuracy is improved. The all-weather operating capability is also improved, and more information about the monitored object can be displayed.
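The overall flow of blocks 202 to 206 might look like the following minimal sketch. It assumes the sensor readings have already been reduced to per-attribute scalar values, and it uses a plain average as a stand-in for the fusion logic detailed further below; none of the names come from the patent:

    def monitor_step(first_info, second_info):
        """Fuse per-attribute readings from the radar (block 202) and the
        camera (block 204) into target information (block 206)."""
        target = {}
        for attr in set(first_info) | set(second_info):
            values = [v for v in (first_info.get(attr), second_info.get(attr))
                      if v is not None]
            target[attr] = sum(values) / len(values)  # placeholder fusion
        return target

    # Example: both sensors report a distance; only the radar reports a speed.
    print(monitor_step({"distance": 41.8, "speed": 2.1}, {"distance": 43.0}))
    # -> {'distance': 42.4, 'speed': 2.1} (key order may vary)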
After receiving the first information and the second information, the control module 108 may perform a rationality check on them to avoid noise caused by erroneous data in either.
The rationality check of the first or second information may be performed by various suitable methods. In one example, the first or second information may be compared with the average of previously obtained information; if the deviation from that average exceeds a predetermined threshold in either direction, the data is considered erroneous.
In another example, the first or second information may be compared with previously obtained information about the monitored object; if the difference exceeds a predetermined threshold, the first or second information is considered erroneous.
In yet another example, the first or second information may be compared with predetermined parameters of the radar 106 or the camera 102; if those parameters are exceeded, the information is considered erroneous. For example, the detection range of the radar 106 may be set as a predetermined threshold, and if the measured distance is greater than that threshold, the data is erroneous.
Alternatively or additionally, the above methods may be combined, or different rationality-check methods may be employed for different attributes. The above examples are intended only to illustrate how the rationality check may be performed and do not limit the disclosure. One skilled in the art may choose any suitable rationality-check method as needed.
Through the rationality check, inaccurate data can be removed, improving the accuracy of the data. Computational efficiency is also improved, because less inaccurate data needs to be processed.
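The patent leaves the concrete checks open; as a minimal sketch, assuming scalar readings and illustrative thresholds (neither is a value from the patent), the checks above could be combined like this:

    def is_rational(value, history, sensor_max, rel_threshold=0.3):
        """Rationality check: reject a reading that exceeds the sensor's
        predetermined parameter (e.g. the radar's detection range) or that
        deviates too far from previously obtained readings."""
        if value > sensor_max:          # compare against a sensor parameter
            return False
        if history:                     # compare against earlier information
            avg = sum(history) / len(history)
            if avg > 0 and abs(value - avg) / avg > rel_threshold:
                return False
        return True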
The control module 108 determines the first and second information associated with an attribute of the monitored object based on the data obtained from the radar 106 and the camera 102, or on the data that has passed the rationality check. In one example, the control module 108 may match the second information to the first information (or vice versa) for the same monitored object based on the distance of the object measured by the radar 106 and the distance measured by the camera 102. The above example is intended only to illustrate the present disclosure and not to limit it; those skilled in the art may associate the first and second information of a monitored object by any suitable method, for example by the object's position.
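One possible association step, reusing the illustrative MonitoredObject sketch above and an assumed gating threshold, is shown below; the patent does not prescribe this particular matching rule:

    def associate(radar_objs, camera_objs, max_gap_m=2.0):
        """Pair each radar detection with the camera detection whose measured
        distance is closest, within a gating threshold (assumed value)."""
        pairs = []
        for r in radar_objs:
            best = min(camera_objs,
                       key=lambda c: abs(c.distance_m - r.distance_m),
                       default=None)
            if best is not None and abs(best.distance_m - r.distance_m) <= max_gap_m:
                pairs.append((r, best))
        return pairs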
In determining the target information, if only the first information associated with the attribute of the monitored object is obtained and no second information is obtained, the first information is taken as the target information for the attribute. Likewise, if only the second information is obtained and no first information is obtained, the second information is taken as the target information.
If both the first information and the second information associated with the attribute of the monitored object are obtained, the target information for the attribute may be determined based on both.
In one example, the first information includes a first value of the attribute and the second information includes a second value of the attribute, and a third value of the attribute is determined based on the first and second values. In one example, the third value may be determined by averaging the first and second values. Alternatively or additionally, a weighted average of the first and second values may be used. The foregoing examples are provided merely to illustrate the disclosure and are not intended to limit it; the skilled person may use any suitable method to determine the third value.
It is then checked whether the third value is valid. In one example, it is determined whether the difference between the third value and the first value is less than a first threshold; if so, the third value is valid. The first threshold may be set to any suitable size as needed. In another example, it is determined whether the difference between the third value and the second value is less than a second threshold; if so, the third value is valid. The second threshold may likewise be set as needed. In response to the third value being valid, the third value is determined to be the target value for the attribute. Verifying the validity of the third value improves the accuracy of the result.
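Taken together, the fallback, averaging, and validity steps above might be sketched as follows; the weight and thresholds are illustrative assumptions, not patent values:

    def fuse_attribute(first, second, w_radar=0.7, thr_first=1.0, thr_second=1.0):
        """Determine a target value for one attribute from a first (radar)
        value and a second (camera) value, as described above."""
        if first is None:               # only second information available
            return second
        if second is None:              # only first information available
            return first
        third = w_radar * first + (1.0 - w_radar) * second  # weighted average
        # The third value is valid if it lies close enough to either source.
        if abs(third - first) < thr_first or abs(third - second) < thr_second:
            return third
        return None                     # third value invalid; no target value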
In another example of determining the target information, the first information is determined to be the target information if the attribute is at least one of distance, speed, or azimuth, and the second information is determined to be the target information if the attribute is at least one of shape or size. This improves the accuracy of the attribute information, because the radar measures distance, speed, and azimuth more accurately, while the camera measures shape and size more accurately.
After the target information is determined, its validity is verified using whichever of the first and second information was not selected as the target information. For example, if the target information is the first information, its validity is verified with the second information; if the target information is the second information, its validity is verified with the first information. In this verification, the target information is considered valid if its difference from the other information is smaller than a predetermined threshold. Verifying the validity of the target information improves its accuracy.
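This source-selection strategy with cross-validation could be sketched as below, treating all attribute values as scalars for simplicity; the threshold and the attribute groupings' names are assumed for illustration:

    RADAR_ATTRS = {"distance", "speed", "azimuth"}   # radar is more accurate here
    CAMERA_ATTRS = {"shape", "size"}                 # the camera is more accurate here

    def select_target(attr, first, second, threshold=1.0):
        """Pick the source that is more accurate for the attribute, then
        cross-check it against the other source, as described above."""
        if attr in RADAR_ATTRS:
            target, other = first, second
        elif attr in CAMERA_ATTRS:
            target, other = second, first
        else:
            return first if first is not None else second  # no known preference
        if target is None:
            return other
        if other is not None and abs(target - other) >= threshold:
            return None  # cross-validation failed: reject the target information
        return target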
In the present disclosure, the monitored object and its attribute information may be displayed to the user based on the determined target information. A display area needs to be determined before displaying. In one example, the display area is set to the entire lane in front of the vehicle plus a 10% margin around the lane. The above example is intended only to illustrate the present disclosure and not to limit it; one skilled in the art can set any suitable display area as needed.
After the display area is determined, the monitored objects and the target information associated with their attributes are displayed within the display area. In one example, at least one of the speed, distance, shape, azimuth, and size of a monitored object in the lane ahead of the vehicle is displayed.
As an example, FIG. 3 shows a schematic view of the monitored objects in front of the vehicle as determined by the above-described method. The monitored objects in front of the vehicle 316 include stationary obstacles 302, 308, and 312, whose distances from the vehicle 316 are displayed. The vehicles ahead, 304 and 310, are also shown, together with their distance from the vehicle 316 and their travel speed. In addition, pedestrians and other moving objects 306 and 314 are displayed, together with their distance from the vehicle 316 and their travel speed. To distinguish different kinds of objects, the profile data acquired by the camera may be compared with preset data to determine the type of the monitored object. FIG. 3 is merely an example and does not limit the present disclosure; the monitored objects and their attribute information may be displayed in any suitable form.
In one example, a degree of risk of the monitored object may also be determined. When the target information includes the distance and azimuth between the vehicle and the monitored object, the lateral and longitudinal distances of the monitored object from the vehicle are determined from that distance and azimuth.
The degree of risk of the monitored object is then determined from these lateral and longitudinal distances. For example, the lateral and longitudinal distances may be used to classify the monitored object as potentially hazardous or not. Alternatively or additionally, potentially hazardous objects may be further graded by risk level based on their speed: the first risk level corresponds to objects that are close to the vehicle, stationary, or crossing in front of it; the second level corresponds to objects farther from the vehicle that are stationary; and the third level corresponds to objects farther from the vehicle that are in motion. Objects without potential hazard may include obstacles outside the lane, such as trees or utility poles.
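As a worked sketch of this step, the measured range and azimuth can be resolved trigonometrically into lateral and longitudinal components, which then drive the grading. Every numeric cut-off below is an illustrative assumption, not a value from the patent:

    import math

    def decompose(distance_m, azimuth_deg):
        """Resolve range and azimuth into lateral and longitudinal distances,
        assuming the azimuth is measured from the vehicle's heading."""
        az = math.radians(azimuth_deg)
        return distance_m * math.sin(az), distance_m * math.cos(az)

    def risk_level(lateral_m, longitudinal_m, speed_mps,
                   lane_half_width_m=2.0, near_m=30.0):
        """Three-level grading as described above: 0 = no potential hazard,
        1 = close/stationary/crossing, 2 = far and stationary, 3 = far and moving."""
        if abs(lateral_m) > lane_half_width_m:
            return 0  # outside the lane, e.g. trees or utility poles
        if longitudinal_m < near_m:
            return 1  # close to the vehicle
        return 2 if speed_mps == 0 else 3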
The risk level is displayed in association with the monitored object on the display module 104. By determining the degree of risk of the monitored object and displaying it to the user, safety prompts can be provided in advance, improving the user's safety while driving the vehicle.
FIG. 4 shows a schematic block diagram of an apparatus 400 for monitoring an object according to an embodiment of the present disclosure. As shown in FIG. 4, the apparatus 400 includes a first receiving module 410 configured to receive first information associated with an attribute of a monitored object from the radar 106 of a vehicle. The apparatus 400 further includes a second receiving module 420 configured to receive second information associated with the attribute of the monitored object from the camera 102 of the vehicle. The apparatus 400 further comprises a first determination module 430 configured to determine target information for the attribute of the monitored object based on the first information and the second information.
In some embodiments, the first information comprises a first value of the attribute and the second information comprises a second value of the attribute, wherein the first determining module comprises: a second determination module configured to determine a third value of the attribute based on the first value and the second value; a third determination module configured to determine whether the third value is valid; and a fourth determination module configured to determine the third value as the target value of the attribute in response to the third value being valid.
In some embodiments, the third determining module comprises a first comparing module configured to determine whether a difference between the third value and the first value is less than a first threshold; or a second comparison module configured to determine whether a difference between the third value and the second value is less than a second threshold.
In some embodiments, the first determining module comprises: a fifth determination module configured to determine the first information as target information in response to the attribute being at least one of a distance, a velocity, or an azimuth; and a sixth determination module configured to determine the second information as the target information in response to the attribute being at least one of a shape or a size.
In some embodiments, the apparatus 400 further comprises: a first verification module configured to verify validity of the target information using information different from the target information among the first information and the second information.
In some embodiments, the apparatus 400 further includes a first display module configured to display the target information in association with the monitored object through a display device of the vehicle.
In some embodiments, the target information includes a distance and an azimuth angle between the vehicle and the monitored object, and the apparatus 400 further includes: a seventh determining module configured to determine a lateral distance and a longitudinal distance of the monitored object from the vehicle based on the distance and the azimuth; an eighth determination module configured to determine a degree of risk of the monitored object based on the lateral distance and the longitudinal distance; and a second display module configured to display the degree of risk in association with the monitored object.
FIG. 5 illustrates a schematic block diagram of an example device 500 that can be used to implement embodiments of the present disclosure. For example, any of the components 102, 104, 106, and 108 shown in FIG. 1 may be implemented by the device 500. As shown, the device 500 includes a central processing unit (CPU) 501 that can perform various appropriate actions and processes in accordance with computer program instructions stored in a read-only memory (ROM) 502 or loaded from a storage unit 508 into a random access memory (RAM) 503. The RAM 503 can also store various programs and data required for the operation of the device 500. The CPU 501, the ROM 502, and the RAM 503 are connected to one another via a bus 504. An input/output (I/O) interface 505 is also connected to the bus 504.
A number of components in the device 500 are connected to the I/O interface 505, including: an input unit 506 such as a keyboard, a mouse, a key, a switch, or the like; an output unit 507 such as various types of displays, speakers, and the like; a storage unit 508, such as a magnetic disk, optical disk, or the like; and a communication unit 509 such as a network card, modem, wireless communication transceiver, etc. The communication unit 509 allows the device 500 to exchange information/data with other devices through a computer network such as the internet and/or various telecommunication networks.
The various procedures and processes described above, such as the method 200, may be performed by the processing unit 501. For example, in some embodiments, the method 200 may be implemented as a computer software program tangibly embodied in a machine-readable medium, such as the storage unit 508. In some embodiments, part or all of the computer program may be loaded and/or installed onto the device 500 via the ROM 502 and/or the communication unit 509. When the computer program is loaded into the RAM 503 and executed by the CPU 501, one or more of the acts of the method 200 described above may be performed.
The present disclosure may be methods, apparatus, systems, and/or computer program products. The computer program product may include a computer-readable storage medium having computer-readable program instructions embodied thereon for carrying out various aspects of the present disclosure.
The computer-readable storage medium may be a tangible device that can retain and store instructions for use by an instruction execution device. The computer-readable storage medium may be, for example, but is not limited to, an electronic storage device, a magnetic storage device, an optical storage device, an electromagnetic storage device, a semiconductor storage device, or any suitable combination of the foregoing. More specific examples (a non-exhaustive list) of the computer-readable storage medium include the following: a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), a static random access memory (SRAM), a portable compact disc read-only memory (CD-ROM), a digital versatile disc (DVD), a memory stick, a floppy disk, a mechanically encoded device such as punch cards or raised structures in a groove having instructions recorded thereon, and any suitable combination of the foregoing. A computer-readable storage medium, as used herein, is not to be construed as being transitory signals per se, such as radio waves or other freely propagating electromagnetic waves, electromagnetic waves propagating through a waveguide or other transmission media (e.g., light pulses passing through a fiber-optic cable), or electrical signals transmitted through a wire.
The computer-readable program instructions described herein may be downloaded from a computer-readable storage medium to a respective computing/processing device, or to an external computer or external storage device via a network, such as the internet, a local area network, a wide area network, and/or a wireless network. The network may include copper transmission cables, fiber optic transmission, wireless transmission, routers, firewalls, switches, gateway computers and/or edge servers. The network adapter card or network interface in each computing/processing device receives computer-readable program instructions from the network and forwards the computer-readable program instructions for storage in a computer-readable storage medium in the respective computing/processing device.
The computer program instructions for carrying out operations of the present disclosure may be assembler instructions, instruction-set-architecture (ISA) instructions, machine instructions, microcode, firmware instructions, state-setting data, or source code or object code written in any combination of one or more programming languages, including an object-oriented programming language such as Smalltalk or C++, and conventional procedural programming languages such as the "C" programming language or similar programming languages. The computer-readable program instructions may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the latter case, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider). In some embodiments, electronic circuitry, such as programmable logic circuitry, field-programmable gate arrays (FPGA), or programmable logic arrays (PLA), may execute the computer-readable program instructions by utilizing state information of the computer-readable program instructions to personalize the electronic circuitry, in order to carry out aspects of the present disclosure.
Various aspects of the present disclosure are described herein with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the disclosure. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer-readable program instructions.
These computer-readable program instructions may be provided to a processing unit of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processing unit of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks. These computer-readable program instructions may also be stored in a computer-readable storage medium that can direct a computer, programmable data processing apparatus, and/or other devices to function in a particular manner, such that the computer-readable medium storing the instructions comprises an article of manufacture including instructions which implement the function/act specified in the flowchart and/or block diagram block or blocks.
The computer readable program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus or other devices to produce a computer implemented process such that the instructions which execute on the computer, other programmable apparatus or other devices implement the functions/acts specified in the flowchart and/or block diagram block or blocks.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of instructions, which comprises one or more executable instructions for implementing the specified logical function(s). In some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
Having described embodiments of the present disclosure, the foregoing description is exemplary, not exhaustive, and is not limited to the disclosed embodiments. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the described embodiments. The terminology used herein was chosen to best explain the principles of the embodiments, their practical application, or technical improvements over technologies in the marketplace, or to enable others of ordinary skill in the art to understand the embodiments disclosed herein.

Claims (16)

1. A method for monitoring an object, comprising:
receiving first information associated with a property of a monitored object from a radar of a vehicle;
receiving second information associated with the attribute of the monitored object from a camera of the vehicle; and
determining target information of the attribute of the monitored object based on the first information and the second information.
2. The method of claim 1, wherein the first information comprises a first value of the attribute, the second information comprises a second value of the attribute, and wherein determining the target information comprises:
determining a third value of the attribute based on the first value and the second value;
determining whether the third value is valid; and
in response to the third value being valid, determining the third value as a target value for the attribute.
3. The method of claim 2, wherein determining whether the third value is valid comprises:
determining whether a difference between the third value and the first value is less than a first threshold; or
determining whether a difference between the third value and the second value is less than a second threshold.
4. The method of claim 1, wherein determining the target information comprises:
determining the first information as the target information in response to the attribute being at least one of a distance, a velocity, or an azimuth; and
determining the second information as the target information in response to the attribute being at least one of a shape or a size.
5. The method of claim 4, further comprising:
verifying the validity of the target information using information different from the target information in the first information and the second information.
6. The method of claim 1, further comprising:
displaying, by a display device of the vehicle, the target information in association with the monitoring object.
7. The method of claim 6, the target information comprising a distance and an azimuth between the vehicle and the monitored object, the method further comprising:
determining a lateral distance and a longitudinal distance of the monitoring object from the vehicle based on the distance and the azimuth;
determining a risk level of the monitored object based on the lateral distance and the longitudinal distance; and
displaying the degree of risk in association with the monitored object.
8. An apparatus for monitoring an object, comprising:
a first receiving module configured to receive first information associated with a property of a monitored object from a radar of a vehicle;
a second receiving module configured to receive second information associated with the attribute of the monitored object from a camera of the vehicle; and
a first determination module configured to determine target information of the attribute of the monitoring object based on the first information and the second information.
9. The apparatus of claim 8, wherein the first information comprises a first value of the attribute and the second information comprises a second value of the attribute, and wherein the first determination module comprises:
a second determination module configured to determine a third value of the attribute based on the first value and the second value;
a third determination module configured to determine whether the third value is valid; and
a fourth determination module configured to determine the third value as a target value for the attribute in response to the third value being valid.
10. The apparatus of claim 9, wherein the third determining module comprises:
a first comparison module configured to determine whether a difference between the third value and the first value is less than a first threshold; or
a second comparison module configured to determine whether a difference between the third value and the second value is less than a second threshold.
11. The apparatus of claim 8, wherein the first determining module comprises:
a fifth determination module configured to determine the first information as the target information in response to the attribute being at least one of a distance, a velocity, or an azimuth; and
a sixth determination module configured to determine the second information as the target information in response to the attribute being at least one of a shape or a size.
12. The apparatus of claim 11, further comprising:
a first verification module configured to verify validity of the target information using information different from the target information among the first information and the second information.
13. The apparatus of claim 8, further comprising:
a first display module configured to display the target information in association with the monitoring object through a display device of the vehicle.
14. The apparatus of claim 13, the target information comprising a distance and an azimuth between the vehicle and the monitored object, the apparatus further comprising:
a seventh determining module configured to determine a lateral distance and a longitudinal distance of the monitoring object from the vehicle based on the distance and the azimuth;
an eighth determination module configured to determine a degree of risk of the monitored object based on the lateral distance and the longitudinal distance; and
a second display module configured to display the degree of risk in association with the monitored object.
15. An electronic device, comprising:
one or more processors; and
storage means for storing one or more programs which, when executed by the one or more processors, cause the one or more processors to carry out the method according to any one of claims 1-7.
16. A computer-readable storage medium on which a computer program is stored which, when executed by a processor, implements the method according to any one of claims 1-7.
CN201811623957.6A 2018-12-28 2018-12-28 Method, apparatus and computer program product for monitoring an object Pending CN111376833A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811623957.6A CN111376833A (en) 2018-12-28 2018-12-28 Method, apparatus and computer program product for monitoring an object

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811623957.6A CN111376833A (en) 2018-12-28 2018-12-28 Method, apparatus and computer program product for monitoring an object

Publications (1)

Publication Number Publication Date
CN111376833A (en) 2020-07-07

Family

ID=71212751

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811623957.6A Pending CN111376833A (en) 2018-12-28 2018-12-28 Method, apparatus and computer program product for monitoring an object

Country Status (1)

Country Link
CN (1) CN111376833A (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1940591A (en) * 2005-09-26 2007-04-04 通用汽车环球科技运作公司 System and method of target tracking using sensor fusion
CN101241188A (en) * 2007-02-06 2008-08-13 通用汽车环球科技运作公司 Collision avoidance system and method of detecting overpass locations using data fusion
US7991550B2 (en) * 2006-02-03 2011-08-02 GM Global Technology Operations LLC Method and apparatus for on-vehicle calibration and orientation of object-tracking systems
CN102609953A (en) * 2010-12-02 2012-07-25 通用汽车环球科技运作有限责任公司 Multi-object appearance-enhanced fusion of camera and range sensor data
CN105792110A (en) * 2016-03-30 2016-07-20 上海申腾信息技术有限公司 Data fusion and intelligent searching processing method for multiple data sources

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication (application publication date: 20200707)