CN112525267A - In-vehicle environment detection method and device, electronic equipment and storage medium

Info

Publication number
CN112525267A
CN112525267A (application CN202011515813.6A)
Authority
CN
China
Prior art keywords
vehicle
unmanned vehicle
information
interior
determined
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202011515813.6A
Other languages
Chinese (zh)
Inventor
王小刚
余程鹏
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nanjing Leading Technology Co Ltd
Original Assignee
Nanjing Leading Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nanjing Leading Technology Co Ltd filed Critical Nanjing Leading Technology Co Ltd
Priority to CN202011515813.6A
Publication of CN112525267A
Legal status: Pending

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01D: MEASURING NOT SPECIALLY ADAPTED FOR A SPECIFIC VARIABLE; ARRANGEMENTS FOR MEASURING TWO OR MORE VARIABLES NOT COVERED IN A SINGLE OTHER SUBCLASS; TARIFF METERING APPARATUS; MEASURING OR TESTING NOT OTHERWISE PROVIDED FOR
    • G01D 21/00: Measuring or testing not otherwise provided for
    • G01D 21/02: Measuring two or more variables by means not covered by a single other subclass
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00: Arrangements for image or video recognition or understanding
    • G06V 10/20: Image preprocessing
    • G06V 10/25: Determination of region of interest [ROI] or a volume of interest [VOI]
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00: Scenes; Scene-specific elements
    • G06V 20/50: Context or environment of the image
    • G06V 20/59: Context or environment of the image inside of a vehicle, e.g. relating to seat occupancy, driver state or inner lighting conditions

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Traffic Control Systems (AREA)

Abstract

The application discloses an in-vehicle environment detection method and device, an electronic device and a storage medium, belonging to the technical field of vehicle operation. The method includes: when it is determined that an unmanned vehicle enters an idle (no-load) driving state from a passenger-carrying driving state, obtaining in-vehicle environment information, where the in-vehicle environment information at least includes in-vehicle image information of the unmanned vehicle; determining, according to the in-vehicle image information, whether the cleanliness of the interior of the unmanned vehicle reaches the standard; and if not, sending warning information indicating that the in-vehicle environment of the unmanned vehicle is to be improved. In this way, the in-vehicle environment of the unmanned vehicle can be known in time after a passenger gets off and improved when necessary, so the riding comfort of the unmanned vehicle can be improved, and the online operation effect of the unmanned vehicle can thereby be improved.

Description

In-vehicle environment detection method and device, electronic equipment and storage medium
Technical Field
The present disclosure relates to the field of vehicle operation technologies, and in particular, to a method and an apparatus for detecting an environment inside a vehicle, an electronic device, and a storage medium.
Background
With the rapid development of Internet technology, many vehicle operators have shifted from offline to online operation, and unmanned ride-hailing (network-booked) vehicles have emerged, bringing great convenience to passengers who hail them.
However, an unmanned ride-hailing vehicle only provides a vehicle without a driver, and the passengers who book it vary widely. After one passenger has used the vehicle, how to ensure its riding comfort so that the next passenger's use is not affected is therefore a problem that urgently needs to be solved.
Disclosure of Invention
The embodiment of the application provides a method and a device for detecting an in-vehicle environment, electronic equipment and a storage medium, which are used for improving the online operation effect of an unmanned vehicle.
In a first aspect, an embodiment of the present application provides a method for detecting an environment in a vehicle, including:
when the unmanned vehicle is determined to enter an idle running state from a passenger-carrying running state, obtaining in-vehicle environment information, wherein the in-vehicle environment information at least comprises in-vehicle image information of the unmanned vehicle;
determining whether the cleanliness of the interior of the unmanned vehicle reaches the standard or not according to the in-vehicle image information;
and if the cleanliness of the interior of the unmanned vehicle is determined not to reach the standard, sending warning information that the environment in the unmanned vehicle is to be improved.
In one possible embodiment, the in-vehicle environment information further includes in-vehicle smell information of the unmanned vehicle, and the method further includes:
determining whether the interior of the unmanned vehicle has peculiar smell or not according to the in-vehicle smell information;
and if the cleanliness of the interior of the unmanned vehicle is determined not to reach the standard or the interior of the unmanned vehicle is determined to have peculiar smell, sending the alarm information.
In one possible embodiment, after determining that the unmanned vehicle has an interior odor, the method further comprises:
controlling the unmanned vehicle to ventilate, and controlling the unmanned vehicle to stop ventilating after the ventilation time of the unmanned vehicle reaches a preset time;
re-acquiring in-vehicle smell information of the unmanned vehicle;
determining whether the interior of the unmanned vehicle has peculiar smell or not according to the newly acquired in-vehicle smell information;
and if it is determined that the peculiar smell is still present inside the unmanned vehicle, sending the alarm information.
In one possible embodiment, the concentration of at least one odor is contained in any in-vehicle odor information, and whether the interior of the unmanned vehicle has an odor is determined according to the following steps:
if the concentration of any smell is determined to exceed the concentration threshold of the smell, determining that peculiar smell exists in the unmanned vehicle;
determining that no odor is present in the unmanned vehicle if it is determined that the concentration of each odor does not exceed the threshold concentration for that odor.
In one possible embodiment, the in-vehicle environment information further includes in-vehicle volume information of the unmanned vehicle, further including:
determining whether noise exists in the unmanned vehicle according to the volume information in the vehicle;
and if it is determined that the cleanliness of the interior of the unmanned vehicle does not reach the standard, that there is a peculiar smell inside the unmanned vehicle, or that there is noise inside the unmanned vehicle, sending the alarm information.
In one possible embodiment, the method further comprises:
obtaining scene representation information used for representing a driving scene of the unmanned vehicle;
determining a driving scene of the unmanned vehicle according to the scene representation information;
determining whether there is noise inside the unmanned vehicle according to the in-vehicle volume information, including:
if the volume inside the unmanned vehicle is determined to exceed the volume threshold corresponding to the determined running scene according to the volume information inside the unmanned vehicle, determining that noise exists inside the unmanned vehicle; and if the volume in the unmanned vehicle does not exceed the volume threshold corresponding to the driving scene according to the volume information in the vehicle, determining that no noise exists in the unmanned vehicle.
In one possible embodiment, the method further comprises:
acquiring weather information of the location of the unmanned vehicle;
and if the weather where the unmanned vehicle is located is determined to be not the appointed weather according to the weather information, acquiring the volume information in the unmanned vehicle, wherein the influence degree of the appointed weather on the volume in the unmanned vehicle is higher than the preset degree.
In one possible embodiment, the method further comprises:
acquiring in-vehicle image information of the unmanned vehicle in a passenger-carrying driving state;
determining whether the cleanliness of the interior of the unmanned vehicle reaches the standard or not according to the in-vehicle image information;
and if it is determined that the cleanliness of the interior of the unmanned vehicle does not reach the standard, controlling the unmanned vehicle to remind the passenger to improve the cleanliness of the interior of the unmanned vehicle.
In one possible embodiment, any piece of in-vehicle image information includes at least two in-vehicle partial images, and determining whether the cleanliness of the interior of the unmanned vehicle reaches the standard according to the in-vehicle image information includes:
correcting and splicing the in-vehicle partial images in the in-vehicle image information to obtain an in-vehicle overall image at a preset viewing angle;
inputting the in-vehicle overall image into an anomaly detection model for abnormal-region detection to obtain an abnormal region in the in-vehicle overall image, wherein the anomaly detection model is obtained by learning pixel characteristics of abnormal regions in in-vehicle overall images whose cleanliness does not reach the standard;
if it is determined that the number of pixels in the abnormal region of the in-vehicle overall image is greater than a set value, determining that the cleanliness of the interior of the unmanned vehicle does not reach the standard; and if the number of pixels in the abnormal region of the in-vehicle overall image is not greater than the set value, determining that the cleanliness of the interior of the unmanned vehicle reaches the standard.
In a second aspect, an embodiment of the present application provides an in-vehicle environment detection apparatus, including:
an obtaining module, configured to obtain in-vehicle environment information when it is determined that the unmanned vehicle enters an idle driving state from a passenger-carrying driving state, where the in-vehicle environment information at least includes in-vehicle image information of the unmanned vehicle;
the determining module is used for determining whether the cleanliness of the interior of the unmanned vehicle reaches the standard or not according to the in-vehicle image information;
and the sending module is used for sending warning information that the environment in the unmanned vehicle needs to be improved if the cleanliness in the unmanned vehicle is determined not to reach the standard.
In one possible embodiment, the in-vehicle environment information further includes in-vehicle smell information of the unmanned vehicle,
the determining module is further used for determining whether the interior of the unmanned vehicle has peculiar smell according to the in-vehicle smell information;
the sending module is further used for sending the warning information if the cleanliness of the interior of the unmanned vehicle is determined not to reach the standard or the interior of the unmanned vehicle is determined to have peculiar smell.
In one possible embodiment, the method further comprises:
the control module is used for controlling the unmanned vehicle to ventilate after determining that peculiar smell exists in the unmanned vehicle, and controlling the unmanned vehicle to stop ventilating after determining that the ventilation time of the unmanned vehicle reaches a preset time;
the acquisition module is further used for re-acquiring the in-vehicle smell information of the unmanned vehicle;
the determining module is further used for determining whether the interior of the unmanned vehicle has peculiar smell according to the re-acquired in-vehicle smell information;
the sending module is further configured to send the warning message if it is determined that the interior of the unmanned vehicle still has the peculiar smell.
In one possible embodiment, the concentration of at least one odor is contained in any in-vehicle odor information,
the determining module is specifically configured to determine that there is an odor inside the unmanned vehicle if it is determined that the concentration of any one of the odors exceeds a concentration threshold of the odor; determining that no odor is present in the unmanned vehicle if it is determined that the concentration of each odor does not exceed the threshold concentration for that odor.
In one possible embodiment, the in-vehicle environment information further includes in-vehicle volume information of the unmanned vehicle,
the determining module is further used for determining whether noise exists in the unmanned vehicle according to the volume information in the vehicle;
the sending module is further used for sending the warning information if the cleanliness of the interior of the unmanned vehicle is determined not to reach the standard, the peculiar smell is determined in the interior of the unmanned vehicle, or the noise is determined in the interior of the unmanned vehicle.
In a possible implementation manner, the obtaining module is further configured to obtain scene representation information for representing a driving scene of the unmanned vehicle;
the determining module is specifically configured to determine a driving scene of the unmanned vehicle according to the scene characterization information; if the volume inside the unmanned vehicle is determined to exceed the volume threshold corresponding to the determined running scene according to the volume information inside the unmanned vehicle, determining that noise exists inside the unmanned vehicle; and if the volume in the unmanned vehicle does not exceed the volume threshold corresponding to the driving scene according to the volume information in the vehicle, determining that no noise exists in the unmanned vehicle.
In a possible implementation manner, the obtaining module is further configured to obtain weather information of a location where the unmanned vehicle is located;
the sending module is further configured to acquire the volume information in the unmanned vehicle if it is determined according to the weather information that the weather where the unmanned vehicle is located is not the designated weather, wherein the influence degree of the designated weather on the volume in the unmanned vehicle is higher than a preset degree.
In a possible implementation manner, the obtaining module is further configured to obtain in-vehicle image information of the unmanned vehicle in a passenger-carrying driving state;
the determining module is further used for determining whether the cleanliness of the interior of the unmanned vehicle reaches the standard or not according to the in-vehicle image information;
the sending module is further used for controlling the unmanned vehicle to remind passengers of improving the cleanliness of the interior of the unmanned vehicle if the fact that the cleanliness of the interior of the unmanned vehicle does not reach the standard is determined.
In one possible embodiment, any one of the in-vehicle image information includes at least two in-vehicle part images,
the determining module is specifically configured to correct and splice the in-vehicle partial images in the in-vehicle image information to obtain an in-vehicle overall image at a preset viewing angle; input the in-vehicle overall image into an anomaly detection model for abnormal-region detection to obtain an abnormal region in the in-vehicle overall image, wherein the anomaly detection model is obtained by learning pixel characteristics of abnormal regions in in-vehicle overall images whose cleanliness does not reach the standard; if it is determined that the number of pixels in the abnormal region of the in-vehicle overall image is greater than a set value, determine that the cleanliness of the interior of the unmanned vehicle does not reach the standard; and if the number of pixels in the abnormal region of the in-vehicle overall image is not greater than the set value, determine that the cleanliness of the interior of the unmanned vehicle reaches the standard.
In a third aspect, an embodiment of the present application provides an electronic device, including: at least one processor, and a memory communicatively coupled to the at least one processor, wherein:
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the in-vehicle environment detection method described above.
In a fourth aspect, an embodiment of the present application provides a storage medium; when instructions in the storage medium are executed by a processor of an electronic device, the electronic device is capable of executing the in-vehicle environment detection method described above.
In the embodiments of the application, when it is determined that the unmanned vehicle enters the idle (no-load) driving state from the passenger-carrying driving state, in-vehicle environment information is obtained, the in-vehicle environment information at least including in-vehicle image information of the unmanned vehicle. Whether the cleanliness of the interior of the unmanned vehicle reaches the standard is then determined according to the in-vehicle image information, and if not, warning information that the in-vehicle environment of the unmanned vehicle is to be improved is sent. In this way, the in-vehicle environment of the unmanned vehicle can be known in time after a passenger gets off and improved when necessary, so the riding comfort of the unmanned vehicle can be improved, and the online operation effect of the unmanned vehicle can thereby be improved.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the application and together with the description serve to explain the application and not to limit the application. In the drawings:
fig. 1 is a schematic view of an application scenario of a method for detecting an in-vehicle environment according to an embodiment of the present application;
fig. 2 is a flowchart of a method for detecting an in-vehicle environment according to an embodiment of the present disclosure;
fig. 3 is a flowchart of another method for detecting an environment in a vehicle according to an embodiment of the present disclosure;
fig. 4 is a flowchart of another method for detecting an environment in a vehicle according to an embodiment of the present disclosure;
fig. 5 is a flowchart of another method for detecting an environment in a vehicle according to an embodiment of the present application;
fig. 6 is a schematic structural diagram illustrating a comprehensive judgment of riding comfort of an unmanned vehicle by using visual perception, olfactory perception and auditory perception according to an embodiment of the present application;
FIG. 7 is a flowchart of a method for visually inspecting the internal cleanliness of an unmanned vehicle according to an embodiment of the present application;
fig. 8 is a schematic structural diagram of an in-vehicle environment detection apparatus according to an embodiment of the present disclosure;
fig. 9 is a schematic hardware structure diagram of an electronic device for implementing a method for detecting an in-vehicle environment according to an embodiment of the present disclosure.
Detailed Description
In order to improve the online operation effect of the unmanned vehicle, the embodiment of the application provides a method and a device for detecting an in-vehicle environment, an electronic device and a storage medium.
The preferred embodiments of the present application will be described below with reference to the accompanying drawings of the specification, it should be understood that the preferred embodiments described herein are merely for illustrating and explaining the present application, and are not intended to limit the present application, and that the embodiments and features of the embodiments in the present application may be combined with each other without conflict.
The in-vehicle environment detection method provided in the embodiments of the present application is applicable to all scenarios in which the in-vehicle environment needs to be detected; the solution of the embodiments is described below by taking an unmanned ride-hailing vehicle as an example.
Fig. 1 is an application scenario diagram of an in-vehicle environment detection method provided in an embodiment of the present application, including a terminal 11, a server 12, and an unmanned vehicle 13, where:
the terminal 11, such as a mobile phone, an iPad, or the like, has an application installed therein for reserving an unmanned vehicle. The passenger can control the terminal to send a vehicle reservation request to the server through the application program, and the departure place and the destination which are designated by the passenger can be included in the vehicle reservation request.
After receiving the vehicle reservation request transmitted by any terminal, the server 12 may select one unmanned vehicle from the unmanned vehicles in the idle running state, for example, select an unmanned vehicle closest to the departure point, according to the departure point and the destination in the vehicle reservation request, then send vehicle description information of the selected unmanned vehicle, such as a vehicle identifier, to the corresponding terminal, and may transmit a scheduling instruction to the selected unmanned vehicle, where the scheduling instruction includes the departure point and the destination in the vehicle reservation request.
The unmanned vehicle 13, after receiving the scheduling command sent by the server, can automatically travel to the corresponding departure place to pick up the passenger, and send the passenger to the destination.
In this case, the scheduling command may further include a travel route from the current position to the departure point and from the departure point to the destination.
The server 12 may obtain the in-vehicle environment information of the unmanned vehicle when determining that any unmanned vehicle enters the no-load driving state from the passenger-carrying driving state, for example, when receiving a message sent by the unmanned vehicle to get off, and determine whether the in-vehicle environment of the unmanned vehicle may affect the riding experience of the next passenger according to the obtained in-vehicle environment information, and if it is determined that the in-vehicle environment of the unmanned vehicle may affect the riding experience of the next passenger, may send warning information that the in-vehicle environment of the unmanned vehicle is to be improved to remind an operator to perform timely processing, thereby ensuring the online operation effect of the unmanned vehicle.
In addition, the moment of entering the idle driving state from the passenger-carrying driving state may also be determined by the unmanned vehicle itself. When the unmanned vehicle determines that it has entered the idle driving state from the passenger-carrying driving state, it obtains the in-vehicle environment information and determines, according to the obtained in-vehicle environment information, whether the in-vehicle environment may affect the riding experience of the next passenger. If so, it may send the warning information that the in-vehicle environment of the unmanned vehicle is to be improved, so as to remind an operator to handle it in time and thereby ensure the online operation effect of the unmanned vehicle.
That is, the execution subject of the embodiment of the present application may be a server, or may be an unmanned vehicle, or some steps may be executed on the unmanned vehicle side, and some steps may be executed on the server side. The technical solution of the present application is described below with reference to specific embodiments.
Fig. 2 is a flowchart of a method for detecting an in-vehicle environment according to an embodiment of the present application, including the following steps:
s201: and when the unmanned vehicle is determined to enter an idle running state from a passenger-carrying running state, obtaining the in-vehicle environment information, wherein the in-vehicle environment information at least comprises the in-vehicle image information of the unmanned vehicle.
For example, after the unmanned vehicle reaches the destination designated by the passenger and the passenger gets off, it is determined that the unmanned vehicle enters the idle driving state from the passenger-carrying driving state. For another example, if the passenger gets off and finishes the corresponding ride-hailing order before the unmanned vehicle reaches the designated destination, it is likewise determined that the unmanned vehicle enters the idle driving state from the passenger-carrying driving state.
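As a minimal sketch of how this state transition might be detected in software, the following Python snippet checks two assumed ride-state fields (arrival at the destination, or early completion of the order); the class and field names are illustrative only and are not defined by the patent.

```python
from dataclasses import dataclass

@dataclass
class RideState:
    """Assumed per-vehicle state snapshot; field names are illustrative, not from the patent."""
    carrying_passenger: bool
    arrived_at_destination: bool
    order_finished: bool

def entered_idle_from_passenger(prev: RideState, curr: RideState) -> bool:
    """True when the vehicle was carrying a passenger and the ride has now ended,
    either by reaching the destination or by the passenger finishing the order early."""
    ride_ended = curr.arrived_at_destination or curr.order_finished
    return prev.carrying_passenger and not curr.carrying_passenger and ride_ended
```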
In a specific implementation, a plurality of panoramic cameras can be installed inside the unmanned vehicle and used to collect in-vehicle image information in real time or periodically. Because the image acquisition area of each panoramic camera is limited, each panoramic camera captures a partial image of the vehicle interior at a time.
S202: and determining whether the cleanliness of the interior of the unmanned vehicle reaches the standard or not according to the image information in the vehicle.
In practical applications, the images acquired by the panoramic cameras are distorted to some degree. To accurately determine whether the cleanliness of the interior of the unmanned vehicle reaches the standard, the in-vehicle partial images in the in-vehicle image information may first be corrected (for example, distortion-corrected) and spliced to obtain an in-vehicle overall image at a preset viewing angle, such as an overall image of the interior seen from the roof. The in-vehicle overall image is then input into an anomaly detection model for abnormal-region detection to obtain the abnormal regions in the in-vehicle overall image. If the number of pixels in the abnormal regions is determined to be greater than a set value, it is determined that the cleanliness of the interior of the unmanned vehicle does not reach the standard; if it is not greater than the set value, it is determined that the cleanliness reaches the standard. The anomaly detection model is obtained by learning the pixel characteristics of abnormal regions in in-vehicle overall images whose cleanliness does not reach the standard.
The above process is described below with reference to specific embodiments.
In a specific implementation, a large number of in-vehicle images can be collected by the panoramic cameras. The in-vehicle images collected at the same moment are distortion-corrected according to the intrinsic parameters of the panoramic cameras and then spliced, so as to obtain an in-vehicle overall image at a specified viewing angle.
Further, the regions of each in-vehicle overall image A whose cleanliness does not reach the standard, such as dirty or messy regions, are manually labeled. The pixels in the labeled regions of image A are set to white and the pixels in the unlabeled regions to black, yielding a binary image B. A deep learning model is built on a ResNet50 backbone network and trained with the in-vehicle overall image A as model input and the binary image B as model output, thereby obtaining the anomaly detection model, as sketched below.
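The following is a hedged training sketch of the procedure just described. The patent only states that a ResNet50-based network maps the stitched cabin image A to the binary label image B; torchvision's FCN with a ResNet-50 backbone is used here as one plausible realization, and the hyperparameters are assumptions rather than values from the patent.

```python
import torch
from torch import nn
from torch.utils.data import DataLoader, Dataset
from torchvision.models.segmentation import fcn_resnet50

def train_anomaly_detector(dataset: Dataset, epochs: int = 10, lr: float = 1e-4) -> nn.Module:
    """dataset yields (image, mask) pairs: image is a float tensor [3, H, W] (the stitched
    cabin image A), mask is a long tensor [H, W] with 1 = dirty/messy pixel, 0 = clean
    (the binary image B)."""
    model = fcn_resnet50(weights=None, num_classes=2)   # ResNet-50 backbone, 2 classes
    optimizer = torch.optim.Adam(model.parameters(), lr=lr)
    criterion = nn.CrossEntropyLoss()
    loader = DataLoader(dataset, batch_size=4, shuffle=True)
    model.train()
    for _ in range(epochs):
        for images, masks in loader:
            optimizer.zero_grad()
            logits = model(images)["out"]               # [N, 2, H, W] per-pixel scores
            loss = criterion(logits, masks)
            loss.backward()
            optimizer.step()
    return model
```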
Subsequently, after the in-vehicle image information of the unmanned vehicle is obtained, the in-vehicle partial images in it are corrected and spliced to obtain an in-vehicle overall image at the preset viewing angle, and the in-vehicle overall image is input into the anomaly detection model to obtain a binary image, in which the white region is the abnormal region of the in-vehicle overall image. If the number of pixels contained in the white region of the obtained binary image is greater than the set value, it is determined that the cleanliness of the interior of the unmanned vehicle does not reach the standard; if it is not greater than the set value, it is determined that the cleanliness of the interior of the unmanned vehicle reaches the standard.
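A corresponding inference sketch, continuing the assumptions above, is given below: each per-camera frame is undistorted with its intrinsics, the frames are combined into one cabin-wide image (a naive horizontal concatenation stands in for real view-angle stitching), the model predicts the abnormal-region mask, and the mask's pixel count is compared with a set value. The camera parameters and PIXEL_THRESHOLD are placeholders, not values from the patent.

```python
import cv2
import torch

PIXEL_THRESHOLD = 5000  # assumed set value for "cleanliness does not reach the standard"

def cabin_cleanliness_ok(frames, camera_matrices, dist_coeffs, model) -> bool:
    """frames: list of same-height BGR images from the in-vehicle panoramic cameras;
    camera_matrices / dist_coeffs: per-camera intrinsics used for distortion correction."""
    corrected = [cv2.undistort(f, K, d)
                 for f, K, d in zip(frames, camera_matrices, dist_coeffs)]
    panorama = cv2.hconcat(corrected)        # simplistic stand-in for real stitching
    tensor = (torch.from_numpy(panorama).permute(2, 0, 1).float() / 255.0).unsqueeze(0)
    model.eval()
    with torch.no_grad():
        logits = model(tensor)["out"]        # [1, 2, H, W]
    abnormal_mask = logits.argmax(dim=1).squeeze(0).numpy()  # 1 marks abnormal pixels
    return int(abnormal_mask.sum()) <= PIXEL_THRESHOLD
```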
S203: and if the cleanliness of the interior of the unmanned vehicle is determined to be not up to the standard, sending warning information that the environment in the unmanned vehicle is to be improved.
In practical applications, in addition to the in-vehicle image information of the unmanned vehicle, the in-vehicle environment of the unmanned vehicle may be determined by additionally considering the in-vehicle odor information of the unmanned vehicle. Fig. 3 is a flowchart of another method for detecting an in-vehicle environment according to an embodiment of the present application, including the following steps:
s301: and when the unmanned vehicle is determined to enter an idle running state from a passenger-carrying running state, obtaining the in-vehicle environment information, wherein the in-vehicle environment information comprises in-vehicle image information and in-vehicle smell information of the unmanned vehicle.
In a specific implementation, various types of odor sensors can be installed inside the unmanned vehicle, each type being used to collect the concentration of one odor. For example, some odor sensors collect the alcohol concentration and some collect the formaldehyde concentration. Moreover, different odors may be given different concentration thresholds.
S302: and determining whether the cleanliness of the interior of the unmanned vehicle reaches the standard or not according to the image information in the vehicle.
The implementation of this step can be referred to as the implementation of S202, and is not described herein again.
S303: and determining whether the interior of the unmanned vehicle has peculiar smell or not according to the information of the peculiar smell in the vehicle.
In a specific implementation, if it is determined that the concentration of any odor exceeds the concentration threshold of that odor, it is determined that there is a peculiar smell inside the unmanned vehicle; if the concentration of every odor does not exceed its corresponding concentration threshold, it is determined that there is no peculiar smell inside the unmanned vehicle.
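A hedged sketch of this odor check is given below: the in-vehicle smell information is treated as a mapping from odor name to measured concentration, with a per-odor threshold table; the odor names, units and threshold values are placeholders rather than values from the patent.

```python
ODOR_THRESHOLDS = {"alcohol": 0.10, "formaldehyde": 0.08}  # assumed odors and thresholds

def has_odor(odor_readings: dict) -> bool:
    """True if any measured concentration exceeds that odor's own threshold;
    unknown odors are ignored (treated as having an infinite threshold)."""
    return any(concentration > ODOR_THRESHOLDS.get(name, float("inf"))
               for name, concentration in odor_readings.items())
```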
S304: and if the cleanliness of the interior of the unmanned vehicle is determined to be not up to the standard or the peculiar smell in the interior of the unmanned vehicle is determined, sending alarm information that the environment in the unmanned vehicle needs to be improved.
It should be noted that there is no strict sequence relationship between S302 and S303.
In addition, after it is determined that there is a peculiar smell inside the unmanned vehicle, the unmanned vehicle can be controlled to ventilate, for example by opening a window and/or the air conditioner. After the ventilation time reaches a preset time, the unmanned vehicle is controlled to stop ventilating, and the in-vehicle smell information of the unmanned vehicle is acquired again. If it is determined according to the re-acquired in-vehicle smell information that there is still a peculiar smell inside the unmanned vehicle, the warning information that the in-vehicle environment of the unmanned vehicle is to be improved is sent.
In this way, some in-vehicle odor problems can be resolved automatically, and the warning information is sent only for odor problems that ventilation cannot solve, which reduces the workload of operators and improves the intelligence of the online operation of the unmanned vehicle. In addition, the warning information may carry the warning reason, such as that the in-vehicle cleanliness does not reach the standard or that the concentration of a certain odor exceeds its threshold, so that operators can understand the in-vehicle situation as fully as possible and formulate a correct response strategy as soon as possible.
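A sketch of this ventilate-then-recheck flow is shown below; the vehicle-control and sensor calls (open_windows, close_windows, read_odor_sensors, send_alarm) are assumed interfaces for illustration, not APIs named by the patent, and the preset ventilation time is an arbitrary placeholder.

```python
import time

VENTILATION_SECONDS = 10 * 60  # assumed preset ventilation time

def handle_odor(vehicle) -> None:
    """Ventilate, wait for the preset time, re-check the odor, and alarm only if it persists."""
    vehicle.open_windows()                  # start ventilation (window and/or air conditioner)
    time.sleep(VENTILATION_SECONDS)
    vehicle.close_windows()                 # stop ventilation after the preset time
    readings = vehicle.read_odor_sensors()  # re-acquire in-vehicle smell information
    if has_odor(readings):                  # reuses the check from the previous sketch
        vehicle.send_alarm("in-vehicle environment to be improved: residual odor")
```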
In practical application, the in-vehicle environment of the unmanned vehicle can be determined by taking the in-vehicle volume information of the unmanned vehicle into consideration. Fig. 4 is a flowchart of another method for detecting an in-vehicle environment according to an embodiment of the present application, including the following steps:
s401: and when the unmanned vehicle is determined to enter an idle running state from a passenger-carrying running state, obtaining the in-vehicle environment information, wherein the in-vehicle environment information comprises in-vehicle image information, in-vehicle smell information and in-vehicle volume information of the unmanned vehicle.
During specific implementation, volume acquisition equipment can be further installed inside the unmanned vehicle, so that volume information in the vehicle can be acquired when needed.
S402: and determining whether the cleanliness of the interior of the unmanned vehicle reaches the standard or not according to the image information in the vehicle.
The implementation of this step can be referred to as the implementation of S202, and is not described herein again.
S403: and determining whether the interior of the unmanned vehicle has peculiar smell or not according to the information of the peculiar smell in the vehicle.
S404: and determining whether the interior of the unmanned vehicle has noise or not according to the volume information in the vehicle.
For example, if it is determined that the volume inside the unmanned vehicle exceeds a set value according to the volume information inside the vehicle, it is determined that noise exists inside the unmanned vehicle; and if the volume in the unmanned vehicle does not exceed the set value according to the volume information in the vehicle, determining that no noise exists in the unmanned vehicle.
S405: and if the cleanliness of the interior of the unmanned vehicle is determined not to reach the standard, peculiar smell is determined in the interior of the unmanned vehicle or noise is determined in the interior of the unmanned vehicle, sending warning information that the environment in the unmanned vehicle needs to be improved.
It should be noted that there is no strict sequence relationship between the above-mentioned S402-S404.
Considering that there are various driving scenes of the unmanned vehicle, and the volume difference inside the unmanned vehicle is large in different scenes, in order to determine whether there is noise inside the unmanned vehicle more accurately, the driving scene of the unmanned vehicle may be divided into a plurality of driving scenes, and different volume thresholds may be set for each driving scene.
For example, scenario 1: windows closed, driving speed 50 km/h on an urban road; the statistical average noise value is 65 dB, so the volume threshold is set to 75 dB.
As another example, scenario 2: windows closed, driving speed 60 km/h on an expressway; the statistical average noise value is 75 dB, so the volume threshold is set to 80 dB.
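As a small illustration, the two example scenes above can be kept in a per-scene threshold table and checked against the measured cabin volume; the scene keys are assumed names, while the 75 dB and 80 dB values come from the examples in the text.

```python
VOLUME_THRESHOLDS_DB = {
    "urban_windows_closed_50kmh": 75.0,    # scenario 1 from the text
    "highway_windows_closed_60kmh": 80.0,  # scenario 2 from the text
}

def has_noise(scene: str, measured_volume_db: float) -> bool:
    """Noise is flagged only when the in-vehicle volume exceeds the threshold
    of the driving scene the unmanned vehicle is currently in."""
    return measured_volume_db > VOLUME_THRESHOLDS_DB[scene]
```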
Fig. 5 is a flowchart of another method for detecting an environment in a vehicle according to an embodiment of the present application, including the following steps:
s501: when the fact that the unmanned vehicle enters the no-load driving state from the passenger-carrying driving state is determined, scene representation information and in-vehicle environment information used for representing the driving scene of the unmanned vehicle are obtained, wherein the in-vehicle environment information comprises in-vehicle image information, in-vehicle smell information and in-vehicle volume information of the unmanned vehicle.
S502: and determining whether the cleanliness of the interior of the unmanned vehicle reaches the standard or not according to the image information in the vehicle.
S503: and determining whether the interior of the unmanned vehicle has peculiar smell or not according to the information of the peculiar smell in the vehicle.
S504: and determining the driving scene of the unmanned vehicle according to the scene representation information.
S505: if the volume inside the unmanned vehicle exceeds the volume threshold corresponding to the determined running scene according to the volume information in the vehicle, determining that noise exists inside the unmanned vehicle; and if the volume inside the unmanned vehicle is determined not to exceed the volume threshold corresponding to the driving scene according to the volume information inside the unmanned vehicle, determining that no noise exists inside the unmanned vehicle.
S506: and if the cleanliness of the interior of the unmanned vehicle is determined not to reach the standard, peculiar smell is determined in the interior of the unmanned vehicle or noise is determined in the interior of the unmanned vehicle, sending warning information that the environment in the unmanned vehicle needs to be improved.
It should be noted that there is no strict sequence relationship among S502, S503, and S504-S505.
In addition, weather factors have a large influence on the volume inside the unmanned vehicle, and in weather such as rainstorms or typhoons it may be unreasonable to take the in-vehicle volume into account. So that the in-vehicle volume contributes positively to assessing the riding comfort of the unmanned vehicle, the in-vehicle volume may be considered only in weather such as sunny or cloudy days.
In a specific implementation, in any of the above embodiments that consider the in-vehicle volume information of the unmanned vehicle, weather information of the location of the unmanned vehicle may also be obtained when it is determined that the unmanned vehicle enters the idle driving state from the passenger-carrying driving state. If it is determined according to the weather information that the weather at the location of the unmanned vehicle is not a designated weather, the in-vehicle volume information of the unmanned vehicle is acquired; if it is determined that the weather is the designated weather, the in-vehicle volume information is not acquired, that is, the influence of the in-vehicle volume on riding comfort is not considered. The designated weather is weather whose degree of influence on the in-vehicle volume is higher than a preset degree, such as rainstorms or gales.
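A minimal sketch of this weather gate is given below: the cabin volume is collected only when the current weather is not one of the designated high-impact conditions. The weather labels and the sensor call are assumptions for illustration.

```python
DESIGNATED_WEATHER = {"rainstorm", "typhoon", "gale"}  # assumed weather whose impact exceeds the preset degree

def maybe_collect_volume(weather: str, vehicle):
    """Returns the in-vehicle volume reading in dB, or None when volume is not considered."""
    if weather in DESIGNATED_WEATHER:
        return None                         # skip the acoustic check in this weather
    return vehicle.read_cabin_volume_db()   # assumed sensor interface
```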
In addition, in any of the above embodiments, in-vehicle image information of the unmanned vehicle in the passenger-carrying driving state may also be acquired, and whether the cleanliness of the interior of the unmanned vehicle reaches the standard is determined according to the acquired in-vehicle image information. If the cleanliness does not reach the standard, the unmanned vehicle may further be controlled to remind the passenger to improve the cleanliness of the interior, that is, to remind the passenger to tidy up the interior of the unmanned vehicle in time.
The technical solution of the present application is described below with reference to specific embodiments.
The embodiments of the application mainly aim at monitoring the environment inside the unmanned vehicle during operation, so as to judge whether the in-vehicle environment is suitable for the next trip. The process mainly comprises two stages. In the first stage, before the unmanned vehicle reaches the destination designated by the current passenger, a visual perception algorithm is used to detect abnormal regions inside the vehicle (that is, regions whose cleanliness does not reach the standard), and the passenger is informed of the abnormal regions by voice. In the second stage, after the unmanned vehicle arrives at the destination, the passenger gets off and the unmanned vehicle cruises empty (that is, it enters the idle driving state from the passenger-carrying driving state), the riding comfort of the unmanned vehicle is comprehensively judged using visual, olfactory and auditory perception; when conditions that affect the riding experience are found, such as a dirty or messy interior, a peculiar smell or loud noise, they can be reported to the Internet of Vehicles platform for processing.
Fig. 6 is a schematic structural view of comprehensively judging the riding comfort of an unmanned vehicle by using visual, olfactory and auditory perception according to an embodiment of the present disclosure. The visual perception module analyzes the in-vehicle image information to determine whether there is a visual abnormality inside the vehicle (that is, whether there is a region whose cleanliness does not reach the standard); if so, it uploads the in-vehicle image information so that an operator can confirm the abnormality, and if the judgement turns out to be wrong, the cause of the misjudgement can be analyzed and the visual algorithm of the module improved. The olfactory perception module analyzes the in-vehicle smell information to determine whether the smell is abnormal (that is, whether there is a peculiar smell); if so, it uploads the in-vehicle smell information for confirmation, and misjudgements are likewise analyzed to improve the olfactory algorithm. The auditory perception module analyzes the in-vehicle volume information to determine whether there is noise inside the vehicle (that is, whether the in-vehicle volume exceeds the set volume threshold); if so, it uploads the in-vehicle volume information for confirmation, and misjudgements are analyzed to improve the auditory algorithm.
Each module is described below.
1. And a visual perception module.
Referring to fig. 7, 360-degree panoramic cameras may be installed in the unmanned vehicle to observe the in-vehicle environment from all directions. After the panoramic camera data of the unmanned vehicle is obtained, distortion correction and splicing may be performed on it to obtain an in-vehicle overall image at a specified viewing angle. The anomaly detection model then performs abnormal-region detection on the in-vehicle overall image to obtain its abnormal regions, and the abnormal regions are evaluated for cleanliness, for example to determine whether the interior is dirty or messy. When it is determined that the interior is dirty or messy, the panoramic camera data may be uploaded; when it is determined that it is not, the flow ends. Subsequently, the uploaded in-vehicle image information can be verified and analyzed: if it is confirmed that an abnormality such as dirtiness or mess caused by the passenger does exist, the passenger can be marked for the violation, which affects the passenger's subsequent rides; if it is determined that no abnormality exists, the in-vehicle image information can be collected to optimize the anomaly detection model and make it more accurate.
2. And an olfactory perception module.
When the concentration of any odor exceeds the concentration threshold of that odor, it is determined that there is a peculiar smell inside the unmanned vehicle, and the in-vehicle smell information can then be carried in the alarm information and reported to the Internet of Vehicles platform.
Further, when the Internet of Vehicles platform receives the alarm information and determines that there is a peculiar smell inside the unmanned vehicle, it can control the unmanned vehicle to ventilate, for example by opening a window or turning on the in-vehicle air conditioner. After a preset period, it controls the unmanned vehicle to stop ventilating and acquires the in-vehicle smell information again. If it determines according to the re-acquired in-vehicle smell information that the peculiar smell is still present, it notifies an operator to handle it in time; if it determines that there is no longer a peculiar smell inside the unmanned vehicle, the alarm can be cleared.
3. And a hearing perception module.
In practical applications, when the unmanned vehicle cruises empty there is noise inside the vehicle, and excessive noise affects passenger comfort. The in-vehicle volume of the unmanned vehicle therefore needs to be measured and the in-vehicle noise controlled within an appropriate range. The noise-measuring device can be a precision sound level meter, which can be arranged near head height in the middle of the rear row of the unmanned vehicle.
There are many in-vehicle noise sources: vehicle speed, whether the windows are open, road conditions, the road surface and the like all influence noise detection. Because noise detection has so many influencing factors, a big-data statistical approach can be adopted to establish noise baselines:
Vehicle speed dimension: noise values at low speed (<30 km/h), medium speed (30-60 km/h) and high speed (>70 km/h);
Window dimension: noise values for different window-opening positions and different numbers of open windows;
Road condition dimension: noise values in tunnels, on expressways, on congested urban roads and the like, according to the information provided by GPS;
Weather dimension: when the weather is abnormal, such as excessive wind speed or heavy rain, no noise measurement is carried out.
By analyzing the statistical big data, the standard noise value of the unmanned vehicle in each scene is obtained. For example, scenario 1: windows closed, vehicle speed 50 km/h on an urban road, statistical average noise value 65 dB; the threshold can then be set to 75 dB. Subsequently, if the in-vehicle volume measured in scenario 1 exceeds 75 dB while the unmanned vehicle cruises empty, it is reported to the Internet of Vehicles platform for noise analysis.
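A hedged sketch of deriving such per-scene baselines from logged measurements is shown below; the +10 dB margin matches the scenario 1 example (65 dB average, 75 dB threshold) but is an assumption as a general rule, and the sample format is illustrative.

```python
from collections import defaultdict
from statistics import mean

def derive_thresholds(samples, margin_db: float = 10.0) -> dict:
    """samples: iterable of (scene_key, noise_db) pairs logged across the speed,
    window, road-condition and weather dimensions; returns scene_key -> volume threshold."""
    by_scene = defaultdict(list)
    for scene, noise_db in samples:
        by_scene[scene].append(noise_db)
    return {scene: mean(values) + margin_db for scene, values in by_scene.items()}
```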
When the method provided in the embodiments of the present application is implemented in software or hardware or a combination of software and hardware, a plurality of functional modules may be included in the electronic device, and each functional module may include software, hardware or a combination of software and hardware.
Fig. 8 is a schematic structural diagram of an in-vehicle environment detection apparatus according to an embodiment of the present application, and includes an obtaining module 801, a determining module 802, and a sending module 803.
An obtaining module 801, configured to obtain in-vehicle environment information when it is determined that an unmanned vehicle enters an idle running state from a passenger-carrying running state, where the in-vehicle environment information at least includes in-vehicle image information of the unmanned vehicle;
a determining module 802, configured to determine whether a cleanliness of an interior of the unmanned vehicle reaches a standard according to the in-vehicle image information;
a sending module 803, configured to send warning information that the in-vehicle environment of the unmanned vehicle is to be improved if it is determined that the cleanliness of the inside of the unmanned vehicle does not meet the standard.
In one possible embodiment, the in-vehicle environment information further includes in-vehicle smell information of the unmanned vehicle,
the determining module 802 is further configured to determine whether there is a peculiar smell inside the unmanned vehicle according to the in-vehicle smell information;
the sending module 803 is further configured to send the warning message if it is determined that the cleanliness of the interior of the unmanned vehicle does not reach the standard or it is determined that there is a peculiar smell in the interior of the unmanned vehicle.
In one possible embodiment, the method further comprises:
the control module 804 is used for controlling the unmanned vehicle to ventilate after determining that peculiar smell exists in the unmanned vehicle, and controlling the unmanned vehicle to stop ventilating after determining that the ventilation time of the unmanned vehicle reaches a preset time;
the obtaining module 801 is further configured to obtain interior smell information of the unmanned vehicle again;
the determining module 802 is further configured to determine whether there is a peculiar smell inside the unmanned vehicle according to the re-acquired in-vehicle smell information;
the sending module 803 is further configured to send the warning message if it is determined that the interior of the unmanned vehicle still has the odor.
In one possible embodiment, the concentration of at least one odor is contained in any in-vehicle odor information,
the determining module 802 is specifically configured to determine that there is an odor inside the unmanned vehicle if it is determined that the concentration of any one of the odors exceeds the concentration threshold of the odor; determining that no odor is present in the unmanned vehicle if it is determined that the concentration of each odor does not exceed the threshold concentration for that odor.
In one possible embodiment, the in-vehicle environment information further includes in-vehicle volume information of the unmanned vehicle,
the determining module 802 is further configured to determine whether noise exists inside the unmanned vehicle according to the volume information inside the unmanned vehicle;
the sending module 803 is further configured to send the warning message if it is determined that the cleanliness of the interior of the unmanned vehicle does not reach the standard, it is determined that there is a peculiar smell in the interior of the unmanned vehicle, or it is determined that there is noise in the interior of the unmanned vehicle.
In a possible implementation manner, the obtaining module 801 is further configured to obtain scene representation information for representing a driving scene of the unmanned vehicle;
the determining module 802 is specifically configured to determine a driving scene of the unmanned vehicle according to the scene representation information; if the volume inside the unmanned vehicle is determined to exceed the volume threshold corresponding to the determined running scene according to the volume information inside the unmanned vehicle, determining that noise exists inside the unmanned vehicle; and if the volume in the unmanned vehicle does not exceed the volume threshold corresponding to the driving scene according to the volume information in the vehicle, determining that no noise exists in the unmanned vehicle.
In a possible implementation, the obtaining module 801 is further configured to obtain weather information of a location where the unmanned vehicle is located;
the sending module 803 is further configured to, if it is determined according to the weather information that the weather where the unmanned vehicle is located is not the specified weather, obtain the in-vehicle volume information of the unmanned vehicle, where the influence degree of the specified weather on the in-vehicle volume of the unmanned vehicle is higher than a preset degree.
In a possible implementation manner, the obtaining module 801 is further configured to obtain in-vehicle image information of the unmanned vehicle in a passenger-carrying driving state;
the determining module 802 is further configured to determine whether the cleanliness of the interior of the unmanned vehicle reaches a standard according to the in-vehicle image information;
the sending module 803 is further configured to control the unmanned vehicle to remind a passenger to improve the cleanliness of the interior of the unmanned vehicle if it is determined that the cleanliness of the interior of the unmanned vehicle does not reach the standard.
In one possible embodiment, any one of the in-vehicle image information includes at least two in-vehicle part images,
the determining module 802 is specifically configured to correct and splice the in-vehicle partial images in the in-vehicle image information to obtain an in-vehicle overall image at a preset viewing angle; input the in-vehicle overall image into an anomaly detection model for abnormal-region detection to obtain an abnormal region in the in-vehicle overall image, wherein the anomaly detection model is obtained by learning pixel characteristics of abnormal regions in in-vehicle overall images whose cleanliness does not reach the standard; if it is determined that the number of pixels in the abnormal region of the in-vehicle overall image is greater than a set value, determine that the cleanliness of the interior of the unmanned vehicle does not reach the standard; and if the number of pixels in the abnormal region of the in-vehicle overall image is not greater than the set value, determine that the cleanliness of the interior of the unmanned vehicle reaches the standard.
The division of the modules in the embodiments of the present application is schematic and represents only one way of dividing logical functions; in actual implementation, other division manners are possible. In addition, the functional modules in the embodiments of the present application may be integrated into one processor, may exist alone physically, or two or more modules may be integrated into one module. The modules may be coupled to each other through interfaces, which are typically electrical communication interfaces, although mechanical or other forms of interfaces are not excluded. Thus, modules described as separate components may or may not be physically separate, and may be located in one place or distributed in different locations on the same or different devices. The integrated module may be implemented in the form of hardware or in the form of a software functional module.
Fig. 9 is a schematic structural diagram of an electronic device according to an embodiment of the present application. The electronic device includes a transceiver 901 and a processor 902, where the processor 902 may be a Central Processing Unit (CPU), a microprocessor, an application-specific integrated circuit, a programmable logic circuit, a large-scale integrated circuit, or a digital processing unit. The transceiver 901 is used for data transmission and reception between the electronic device and other devices.
The electronic device may further comprise a memory 903 for storing software instructions executed by the processor 902; the memory 903 may also store other data required by the electronic device, such as identification information of the electronic device, encryption information of the electronic device, and user data. The memory 903 may be a volatile memory, such as a Random-Access Memory (RAM); the memory 903 may also be a non-volatile memory, such as a Read-Only Memory (ROM), a flash memory, a Hard Disk Drive (HDD) or a Solid-State Drive (SSD); or the memory 903 may be any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer, without being limited thereto. The memory 903 may also be a combination of the above memories.
The specific connection medium among the processor 902, the memory 903 and the transceiver 901 is not limited in the embodiments of the present application. In the embodiments of the present application, the memory 903, the processor 902 and the transceiver 901 are connected through the bus 904 in Fig. 9 only for purposes of explanation; the bus is shown by a thick line in Fig. 9, and the connection manner between other components is merely illustrative and not limiting. The bus may be divided into an address bus, a data bus, a control bus, and so on. For ease of illustration, only one thick line is shown in Fig. 9, but this does not indicate that there is only one bus or only one type of bus.
The processor 902 may be dedicated hardware or a processor running software. When the processor 902 runs software, the processor 902 reads the software instructions stored in the memory 903 and, driven by the software instructions, executes the in-vehicle environment detection method referred to in the foregoing embodiments.
An embodiment of the present application further provides a storage medium. When instructions in the storage medium are executed by a processor of an electronic device, the electronic device is enabled to execute the in-vehicle environment detection method in the foregoing embodiments.
In some possible embodiments, the various aspects of the in-vehicle environment detection method provided in this application may also be implemented in the form of a program product, where the program product includes program code, and when the program product is run on an electronic device, the program code is configured to cause the electronic device to execute the in-vehicle environment detection method in the foregoing embodiments.
The program product may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. A readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium include: an electrical connection having one or more wires, a portable Disk, a hard Disk, a RAM, a ROM, an Erasable Programmable Read-Only Memory (EPROM), a flash Memory, an optical fiber, a Compact Disk Read-Only Memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
The program product for in-vehicle environment detection in the embodiments of the present application may take the form of a CD-ROM, include program code, and run on a computing device. However, the program product of the present application is not limited thereto; in this document, a readable storage medium may be any tangible medium that can contain or store a program for use by or in connection with an instruction execution system, apparatus, or device.
A readable signal medium may include a propagated data signal with readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A readable signal medium may also be any readable medium that is not a readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, Radio Frequency (RF), etc., or any suitable combination of the foregoing.
Program code for carrying out operations of the present application may be written in any combination of one or more programming languages, including object-oriented programming languages such as Java and C++, and conventional procedural programming languages such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computing device, partly on the user's device, as a stand-alone software package, partly on the user's computing device and partly on a remote computing device, or entirely on the remote computing device or server. In situations involving remote computing devices, the remote computing devices may be connected to the user's computing device over any kind of network, such as a Local Area Network (LAN) or a Wide Area Network (WAN), or may be connected to an external computing device (for example, over the Internet using an Internet service provider).
It should be noted that although several units or sub-units of the apparatus are mentioned in the above detailed description, such division is merely exemplary and not mandatory. Indeed, according to embodiments of the application, the features and functions of two or more units described above may be embodied in one unit. Conversely, the features and functions of one unit described above may be further divided into and embodied by a plurality of units.
Further, while the operations of the methods of the present application are depicted in the drawings in a particular order, this does not require or imply that these operations must be performed in this particular order, or that all of the illustrated operations must be performed, to achieve desirable results. Additionally or alternatively, certain steps may be omitted, multiple steps combined into one step execution, and/or one step broken down into multiple step executions.
As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
While the preferred embodiments of the present application have been described, additional variations and modifications in those embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. Therefore, it is intended that the appended claims be interpreted as including preferred embodiments and all alterations and modifications as fall within the scope of the application.
It will be apparent to those skilled in the art that various changes and modifications may be made in the present application without departing from the spirit and scope of the application. Thus, if such modifications and variations of the present application fall within the scope of the claims of the present application and their equivalents, the present application is intended to include such modifications and variations as well.

Claims (20)

1. An in-vehicle environment detection method is characterized by comprising the following steps:
when the unmanned vehicle is determined to enter an idle running state from a passenger-carrying running state, obtaining in-vehicle environment information, wherein the in-vehicle environment information at least comprises in-vehicle image information of the unmanned vehicle;
determining whether the cleanliness of the interior of the unmanned vehicle reaches the standard or not according to the in-vehicle image information;
and if the cleanliness of the interior of the unmanned vehicle is determined not to reach the standard, sending warning information that the environment in the unmanned vehicle is to be improved.
2. The method of claim 1, wherein the in-vehicle environment information further includes in-vehicle smell information of the unmanned vehicle, further comprising:
determining whether the interior of the unmanned vehicle has peculiar smell or not according to the in-vehicle smell information;
and if it is determined that the cleanliness of the interior of the unmanned vehicle does not reach the standard or that there is a peculiar smell inside the unmanned vehicle, sending the warning information.
3. The method of claim 2, further comprising, after determining that there is a peculiar smell inside the unmanned vehicle:
controlling the unmanned vehicle to ventilate, and controlling the unmanned vehicle to stop ventilating after the ventilation time of the unmanned vehicle reaches a preset time;
re-acquiring in-vehicle smell information of the unmanned vehicle;
determining whether the interior of the unmanned vehicle has peculiar smell or not according to the newly acquired in-vehicle smell information;
and if it is determined that there is still a peculiar smell inside the unmanned vehicle, sending the warning information.
4. The method according to claim 2 or 3, wherein any piece of in-vehicle smell information includes the concentration of at least one smell, and whether there is a peculiar smell inside the unmanned vehicle is determined according to the following steps:
if it is determined that the concentration of any smell exceeds the concentration threshold of that smell, determining that a peculiar smell exists inside the unmanned vehicle;
and if it is determined that the concentration of each smell does not exceed the concentration threshold of that smell, determining that no peculiar smell exists inside the unmanned vehicle.
5. The method of any of claims 1-3, wherein the in-vehicle environment information further comprises in-vehicle volume information for the unmanned vehicle, further comprising:
determining whether noise exists in the unmanned vehicle according to the volume information in the vehicle;
and if it is determined that the cleanliness of the interior of the unmanned vehicle does not reach the standard, that there is a peculiar smell inside the unmanned vehicle, or that there is noise inside the unmanned vehicle, sending the warning information.
6. The method of claim 5, further comprising:
obtaining scene representation information used for representing the driving scene of the unmanned vehicle;
determining a driving scene of the unmanned vehicle according to the scene representation information;
determining whether there is noise inside the unmanned vehicle according to the in-vehicle volume information, including:
if it is determined, according to the in-vehicle volume information, that the volume inside the unmanned vehicle exceeds the volume threshold corresponding to the determined driving scene, determining that noise exists inside the unmanned vehicle; and if it is determined, according to the in-vehicle volume information, that the volume inside the unmanned vehicle does not exceed the volume threshold corresponding to the determined driving scene, determining that no noise exists inside the unmanned vehicle.
7. The method of claim 5, further comprising:
acquiring weather information of the location of the unmanned vehicle;
and if it is determined according to the weather information that the weather at the location of the unmanned vehicle is not the specified weather, acquiring the in-vehicle volume information of the unmanned vehicle, wherein the influence of the specified weather on the in-vehicle volume of the unmanned vehicle is higher than a preset degree.
8. The method of claim 1, further comprising:
acquiring in-vehicle image information of the unmanned vehicle in a passenger-carrying driving state;
determining whether the cleanliness of the interior of the unmanned vehicle reaches the standard or not according to the in-vehicle image information;
and if it is determined that the cleanliness of the interior of the unmanned vehicle does not reach the standard, controlling the unmanned vehicle to remind passengers to improve the cleanliness of the interior of the unmanned vehicle.
9. The method of claim 1 or 8, wherein any piece of in-vehicle image information includes at least two in-vehicle partial images, and determining whether the cleanliness of the interior of the unmanned vehicle reaches the standard according to the in-vehicle image information comprises:
correcting and splicing the in-vehicle partial images in the in-vehicle image information to obtain a whole in-vehicle image at a preset viewing angle;
inputting the whole in-vehicle image into an abnormality detection model for abnormal-region detection to obtain an abnormal region in the whole in-vehicle image, wherein the abnormality detection model is obtained by learning pixel characteristics of abnormal regions in whole in-vehicle images whose cleanliness does not reach the standard;
if it is determined that the number of pixels in the abnormal region in the whole in-vehicle image is greater than a set value, determining that the cleanliness of the interior of the unmanned vehicle does not reach the standard; and if the number of pixels in the abnormal region in the whole in-vehicle image is not greater than the set value, determining that the cleanliness of the interior of the unmanned vehicle reaches the standard.
10. An in-vehicle environment detection device, characterized by comprising:
an acquisition module, configured to acquire in-vehicle environment information when it is determined that the unmanned vehicle enters an idle running state from a passenger-carrying running state, wherein the in-vehicle environment information at least comprises in-vehicle image information of the unmanned vehicle;
the determining module is used for determining whether the cleanliness of the interior of the unmanned vehicle reaches the standard or not according to the in-vehicle image information;
and the sending module is used for sending warning information that the environment in the unmanned vehicle needs to be improved if the cleanliness in the unmanned vehicle is determined not to reach the standard.
11. The apparatus according to claim 10, wherein the in-vehicle environmental information further includes in-vehicle smell information of the unmanned vehicle,
the determining module is further used for determining whether the interior of the unmanned vehicle has peculiar smell according to the in-vehicle smell information;
the sending module is further used for sending the warning information if it is determined that the cleanliness of the interior of the unmanned vehicle does not reach the standard or that there is a peculiar smell inside the unmanned vehicle.
12. The apparatus of claim 11, further comprising:
the control module is used for controlling the unmanned vehicle to ventilate after it is determined that a peculiar smell exists inside the unmanned vehicle, and controlling the unmanned vehicle to stop ventilating after it is determined that the ventilation time of the unmanned vehicle reaches a preset time;
the acquisition module is further used for re-acquiring the in-vehicle smell information of the unmanned vehicle;
the determining module is further used for determining whether there is a peculiar smell inside the unmanned vehicle according to the re-acquired in-vehicle smell information;
the sending module is further configured to send the warning information if it is determined that there is still a peculiar smell inside the unmanned vehicle.
13. The apparatus according to claim 11 or 12, wherein the concentration of at least one smell is included in any in-vehicle smell information,
the determining module is specifically configured to determine that a peculiar smell exists inside the unmanned vehicle if it is determined that the concentration of any smell exceeds the concentration threshold of that smell; and determine that no peculiar smell exists inside the unmanned vehicle if it is determined that the concentration of each smell does not exceed the concentration threshold of that smell.
14. The apparatus of any of claims 10-12, wherein the in-vehicle environment information further comprises in-vehicle volume information of the unmanned vehicle,
the determining module is further used for determining whether noise exists in the unmanned vehicle according to the volume information in the vehicle;
the sending module is further used for sending the warning information if it is determined that the cleanliness of the interior of the unmanned vehicle does not reach the standard, that there is a peculiar smell inside the unmanned vehicle, or that there is noise inside the unmanned vehicle.
15. The apparatus of claim 14,
the acquisition module is further used for acquiring scene representation information used for representing the driving scene of the unmanned vehicle;
the determining module is specifically configured to determine the driving scene of the unmanned vehicle according to the scene representation information; if it is determined, according to the in-vehicle volume information, that the volume inside the unmanned vehicle exceeds the volume threshold corresponding to the determined driving scene, determine that noise exists inside the unmanned vehicle; and if it is determined, according to the in-vehicle volume information, that the volume inside the unmanned vehicle does not exceed the volume threshold corresponding to the determined driving scene, determine that no noise exists inside the unmanned vehicle.
16. The apparatus of claim 14,
the acquisition module is further used for acquiring weather information of the location of the unmanned vehicle;
the sending module is further configured to acquire the in-vehicle volume information of the unmanned vehicle if it is determined according to the weather information that the weather at the location of the unmanned vehicle is not the specified weather, wherein the influence of the specified weather on the in-vehicle volume of the unmanned vehicle is higher than a preset degree.
17. The apparatus of claim 10,
the acquisition module is further used for acquiring the in-vehicle image information of the unmanned vehicle when the unmanned vehicle is in a passenger-carrying driving state;
the determining module is further used for determining whether the cleanliness of the interior of the unmanned vehicle reaches the standard or not according to the in-vehicle image information;
the sending module is further used for controlling the unmanned vehicle to remind passengers to improve the cleanliness of the interior of the unmanned vehicle if it is determined that the cleanliness of the interior of the unmanned vehicle does not reach the standard.
18. The apparatus according to claim 10 or 17, wherein any piece of the in-vehicle image information includes at least two in-vehicle partial images,
the determining module is specifically configured to correct and splice the in-vehicle partial images in the in-vehicle image information to obtain a whole in-vehicle image at a preset viewing angle; input the whole in-vehicle image into an abnormality detection model for abnormal-region detection to obtain an abnormal region in the whole in-vehicle image, wherein the abnormality detection model is obtained by learning pixel characteristics of abnormal regions in whole in-vehicle images whose cleanliness does not reach the standard; if it is determined that the number of pixels in the abnormal region in the whole in-vehicle image is greater than a set value, determine that the cleanliness of the interior of the unmanned vehicle does not reach the standard; and if the number of pixels in the abnormal region in the whole in-vehicle image is not greater than the set value, determine that the cleanliness of the interior of the unmanned vehicle reaches the standard.
19. An electronic device, comprising: at least one processor, and a memory communicatively coupled to the at least one processor, wherein:
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method of any one of claims 1-9.
20. A storage medium, wherein instructions in the storage medium, when executed by a processor of an electronic device, enable the electronic device to perform the method of any of claims 1-9.
CN202011515813.6A 2020-12-21 2020-12-21 In-vehicle environment detection method and device, electronic equipment and storage medium Pending CN112525267A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011515813.6A CN112525267A (en) 2020-12-21 2020-12-21 In-vehicle environment detection method and device, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011515813.6A CN112525267A (en) 2020-12-21 2020-12-21 In-vehicle environment detection method and device, electronic equipment and storage medium

Publications (1)

Publication Number Publication Date
CN112525267A true CN112525267A (en) 2021-03-19

Family

ID=75001975

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011515813.6A Pending CN112525267A (en) 2020-12-21 2020-12-21 In-vehicle environment detection method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN112525267A (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010243338A (en) * 2009-04-07 2010-10-28 Honda Motor Co Ltd Abnormal noise determining device
CN105974912A (en) * 2016-04-28 2016-09-28 百度在线网络技术(北京)有限公司 Alarm method of unmanned vehicle and apparatus thereof
CN106251189A (en) * 2016-07-19 2016-12-21 河南步用车科技有限公司 A kind of automobile timesharing rent method based on Internet of Things
JP2016218568A (en) * 2015-05-15 2016-12-22 株式会社デンソー Event Detection Device
US20180225890A1 (en) * 2017-02-03 2018-08-09 Ford Global Technologies, Llc System And Method For Assessing The Interior Of An Autonomous Vehicle
CN110356201A (en) * 2018-03-26 2019-10-22 本田技研工业株式会社 Internal environment adjustment device and the vehicle and internal environment method of adjustment for having it
CN110871787A (en) * 2018-09-04 2020-03-10 通用汽车环球科技运作有限责任公司 Method and apparatus for internal noise sensing for efficient noise and vibration performance

Similar Documents

Publication Publication Date Title
US10226982B2 (en) Automatic vehicle climate control based on predicted air quality
US11361556B2 (en) Deterioration diagnosis device, deterioration diagnosis system, deterioration diagnosis method, and storage medium for storing program
US10542211B2 (en) Camera subsystem evaluation using sensor report integration
CN111399481B (en) Automatic driving scene information collection and remote upgrading method and system
CN108569097A (en) Control method, control device and the vehicle recycled in vehicle air conditioning
DE102014216954A1 (en) System and method for providing weather information
CN105187790B (en) Method, device and system for monitoring working state of vehicle-mounted terminal
CN113535743A (en) Real-time updating method and device for unmanned map, electronic equipment and storage medium
CN112769877A (en) Group fog early warning method, cloud server, vehicle and medium
CN111126835B (en) Public vehicle management method based on Beidou satellite positioning
CN115273480A (en) Traffic early warning system and method based on big data analysis
CN114966631A (en) Fault diagnosis and processing method and device for vehicle-mounted laser radar, medium and vehicle
EP3854624B1 (en) Method and system for measuring the energy behaviour of a transport network, and computer program therefor
CN112525267A (en) In-vehicle environment detection method and device, electronic equipment and storage medium
CN112036332A (en) Passenger density detection system and detection method for public transport
CN116989809A (en) Navigation information updating method and device, electronic equipment and storage medium
CN114419875A (en) Vehicle travel segmentation method and device and storage medium
CN113401132B (en) Driving model updating method and device and electronic equipment
CN112557057B (en) Method and device for supervising test operation of automatic driving automobile road and vehicle-mounted terminal
CN114968189A (en) Platform for perception system development of an autopilot system
CN114132144A (en) Method, device and equipment for controlling internal and external circulation of automobile air conditioner and storage medium
US11590982B1 (en) Trip based characterization using micro prediction determinations
CN112766746A (en) Traffic accident recognition method and device, electronic equipment and storage medium
US11030830B1 (en) Customized operating point
CN116386156B (en) ETC terminal fault processing method for high-speed toll station

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication
Application publication date: 20210319