CN113841187A - Signal processing device, signal processing method, program, and imaging device - Google Patents

Signal processing device, signal processing method, program, and imaging device

Info

Publication number
CN113841187A
Authority
CN
China
Prior art keywords
unit
text information
vehicle
signal processing
abnormality
Prior art date
Legal status
Pending
Application number
CN202080037215.4A
Other languages
Chinese (zh)
Inventor
星野和弘
加藤康之
Current Assignee
Sony Semiconductor Solutions Corp
Sony Group Corp
Original Assignee
Sony Semiconductor Solutions Corp
Sony Group Corp
Priority date
Filing date
Publication date
Application filed by Sony Semiconductor Solutions Corp, Sony Group Corp filed Critical Sony Semiconductor Solutions Corp
Publication of CN113841187A

Classifications

    • G PHYSICS
    • G07 CHECKING-DEVICES
    • G07C TIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
    • G07C5/00 Registering or indicating the working of vehicles
    • G07C5/08 Registering or indicating performance data other than driving, working, idle, or waiting time, with or without registering driving, working, idle or waiting time
    • G07C5/0841 Registering performance data
    • G07C5/085 Registering performance data using electronic data carriers
    • G07C5/0866 Registering performance data using electronic data carriers, the electronic data carrier being a digital video recorder in combination with a video camera
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/01 Detecting movement of traffic to be counted or controlled
    • G08G1/0104 Measuring and analyzing of parameters relative to traffic conditions
    • G08G1/0108 Measuring and analyzing of parameters relative to traffic conditions based on the source of data
    • G08G1/0112 Measuring and analyzing of parameters relative to traffic conditions based on the source of data from the vehicle, e.g. floating car data [FCD]
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60R VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R11/00 Arrangements for holding or mounting articles, not otherwise provided for
    • B60R11/04 Mounting of cameras operative during drive; Arrangement of controls thereof relative to the vehicle
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/50 Context or environment of the image
    • G06V20/56 Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V20/58 Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
    • G PHYSICS
    • G07 CHECKING-DEVICES
    • G07C TIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
    • G07C5/00 Registering or indicating the working of vehicles
    • G07C5/008 Registering or indicating the working of vehicles communicating information to a remotely located station
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/01 Detecting movement of traffic to be counted or controlled
    • G08G1/0104 Measuring and analyzing of parameters relative to traffic conditions
    • G08G1/0137 Measuring and analyzing of parameters relative to traffic conditions for specific applications
    • G PHYSICS
    • G08 SIGNALLING
    • G08B SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B25/00 Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems
    • G08B25/01 Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems characterised by the transmission medium
    • G08B25/016 Personal emergency signalling and security systems

Abstract

The present technology relates to a signal processing device, a signal processing method, a program, and an imaging device capable of promptly notifying an abnormality such as an accident. A signal processing apparatus includes: an identifying unit that identifies the content of a captured image imaged by an imaging unit; a text information generating unit that generates text information including data representing the identified content of the captured image in characters; and a transmission control unit that controls transmission of the text information. The present technology can be applied to, for example, a system that notifies an abnormality of a vehicle.

Description

Signal processing device, signal processing method, program, and imaging device
Technical Field
The present technology relates to a signal processing device, a signal processing method, a program, and an imaging device, and more particularly to a signal processing device, a signal processing method, a program, and an imaging device capable of promptly notifying an abnormality.
Background
In recent years, the situation at an accident scene has been imaged and recorded by a drive recorder or a monitoring camera, and the recorded images have been used to analyze the cause of the accident and the like.
Further, it has conventionally been proposed to display an area including an object in an image with a rectangular frame and, in a case where the area is specified, to extract feature point data of the object, search an information database based on the feature point data, and display the obtained information related to the object (see, for example, Patent Document 1).
Reference list
Patent document
Patent Document 1: Japanese Patent Application Laid-Open No. 2017-
Disclosure of Invention
Problems to be solved by the invention
However, while a drive recorder or a monitoring camera records an image of an accident scene, it does not notify a police department, a hospital, or the like of the occurrence or situation of the accident. Further, although it is conceivable to transmit the captured image to a police department, a hospital, or the like, the image would need to be analyzed there, so communication of the occurrence or situation of the accident is delayed.
Meanwhile, the invention disclosed in Patent Document 1 does not consider notifying the occurrence or situation of an accident.
The present technology has been conceived in view of such circumstances, and its object is to promptly notify an abnormality such as an accident.
Solution to the problem
A signal processing apparatus according to an aspect of the present technology includes: an identifying unit that identifies the content of a captured image imaged by an imaging unit; a text information generating unit that generates text information including data representing the identified content of the captured image in characters; and a transmission control unit that controls transmission of the text information.
A signal processing method according to an aspect of the present technology includes: identifying the content of a captured image imaged by an imaging unit; generating text information including data representing the identified content of the captured image in characters; and controlling transmission of the text information.
A program according to an aspect of the present technology causes a computer to execute a process including: identifying the content of a captured image imaged by an imaging unit; generating text information including data representing the identified content of the captured image in characters; and controlling transmission of the text information.
According to an aspect of the present technology, the content of a captured image imaged by an imaging unit is identified, text information including data representing the identified content of the captured image in characters is generated, and transmission of the text information is controlled.
Drawings
Fig. 1 is a block diagram illustrating an exemplary configuration of a vehicle control system to which the present technology is applied.
Fig. 2 is a block diagram illustrating a first embodiment of a signal processing system.
Fig. 3 is a flowchart for explaining the first embodiment of the exception notification processing.
Fig. 4 is a block diagram illustrating a second embodiment of a signal processing system.
Fig. 5 is a flowchart for explaining the second embodiment of the exception notification processing.
Fig. 6 is a diagram illustrating an exemplary configuration of a computer.
Detailed Description
Hereinafter, embodiments for implementing the present technology will be described. The description will be given in the following order.
1. Exemplary configuration of a vehicle control System
2. First embodiment
3. Second embodiment
4. Modification example
5. Others
<1. exemplary configuration of vehicle control System >
Fig. 1 is a block diagram illustrating an exemplary schematic functional configuration of a vehicle control system 100, which is an example of a mobile body control system to which the present technology can be applied.
Note that, hereinafter, the vehicle 10 provided with the vehicle control system 100 will be referred to as the host vehicle when it is to be distinguished from other vehicles.
The vehicle control system 100 includes an input unit 101, a data acquisition unit 102, a communication unit 103, an in-vehicle device 104, an output control unit 105, an output unit 106, a drive system control unit 107, a drive system 108, a body system control unit 109, a body system 110, a storage unit 111, and an automatic driving control unit 112. The input unit 101, the data acquisition unit 102, the communication unit 103, the output control unit 105, the drive system control unit 107, the body system control unit 109, the storage unit 111, and the automatic driving control unit 112 are connected to each other via a communication network 121. The communication network 121 includes, for example, an in-vehicle communication network, a bus, or the like conforming to any standard such as a Controller Area Network (CAN), a Local Interconnect Network (LIN), a Local Area Network (LAN), or FlexRay (registered trademark). Note that the units of the vehicle control system 100 may be directly connected without the communication network 121.
Note that, hereinafter, in the case where each unit of the vehicle control system 100 performs communication via the communication network 121, description of the communication network 121 will be omitted. For example, in the case where the input unit 101 and the automatic driving control unit 112 communicate with each other via the communication network 121, it is simply described that the input unit 101 and the automatic driving control unit 112 communicate with each other.
The input unit 101 includes devices for an occupant to input various types of data, instructions, and the like. For example, the input unit 101 includes operation devices such as a touch panel, buttons, a microphone, switches, and a joystick, as well as operation devices that allow input by a method other than manual operation, such as voice or gesture. Further, for example, the input unit 101 may be a remote control device using infrared rays or other radio waves, or an externally connected device such as a mobile device or a wearable device compatible with the operation of the vehicle control system 100. The input unit 101 generates an input signal based on data, instructions, and the like input by the occupant, and supplies it to the respective units of the vehicle control system 100.
The data acquisition unit 102 includes various sensors and the like that acquire data to be used for processing of the vehicle control system 100 and supply the acquired data to the respective units of the vehicle control system 100.
For example, the data acquisition unit 102 includes various sensors for detecting the state of the host vehicle and the like. Specifically, for example, the data acquisition unit 102 includes a gyro sensor, an acceleration sensor, an Inertial Measurement Unit (IMU), sensors for detecting, for example, an operation amount of an accelerator pedal, an operation amount of a brake pedal, a steering angle of a steering wheel, an engine speed, a motor speed, a wheel rotation speed, and the like.
Further, for example, the data acquisition unit 102 includes various sensors for detecting information associated with the exterior of the host vehicle. Specifically, for example, the data acquisition unit 102 includes an imaging device such as a time-of-flight (ToF) camera, a stereo camera, a monocular camera, an infrared camera, and other cameras. Further, for example, the data acquisition unit 102 includes an environmental sensor for detecting weather, meteorological phenomena, or the like, and a surrounding information detection sensor for detecting an object around the host vehicle. The environmental sensors include, for example, a raindrop sensor, a fog sensor, a sunshine sensor, a snow sensor, and the like. The surrounding information detection sensor includes, for example, an ultrasonic sensor, a radar, a light detection and ranging/laser imaging detection and ranging (LiDAR), a sonar, and the like.
Further, for example, the data acquisition unit 102 includes various sensors for detecting the current position of the host vehicle. Specifically, for example, the data acquisition unit 102 includes a Global Navigation Satellite System (GNSS) receiver that receives GNSS signals from GNSS satellites, and the like.
Further, for example, the data acquisition unit 102 includes various sensors for detecting in-vehicle information. Specifically, for example, the data acquisition unit 102 includes an imaging device that images the driver, a biosensor that detects biological information of the driver, a microphone that collects sound inside the vehicle, and the like. For example, a biosensor is provided on a seat surface, a steering wheel, or the like, and detects biological information of an occupant seated on the seat or a driver holding the steering wheel.
The communication unit 103 communicates with the in-vehicle device 104 and various devices, servers, base stations, and the like outside the vehicle, transmits data supplied from each unit of the vehicle control system 100, and supplies the received data to each unit of the vehicle control system 100. Note that the communication protocol supported by the communication unit 103 is not particularly limited, and the communication unit 103 can support a plurality of types of communication protocols.
For example, the communication unit 103 wirelessly communicates with the in-vehicle device 104 using wireless LAN, bluetooth (registered trademark), Near Field Communication (NFC), Wireless Universal Serial Bus (WUSB), or the like. Further, for example, the communication unit 103 performs wired communication with the in-vehicle device 104 via a connection terminal (cable if necessary) (not shown) using a Universal Serial Bus (USB), a high-definition multimedia interface (HDMI) (registered trademark), a mobile high-definition link (MHL), or the like.
Further, for example, the communication unit 103 communicates with a device (e.g., an application server or a control server) existing on an external network (e.g., the internet, a cloud network, or a company private network) via a base station or an access point. Further, for example, the communication unit 103 communicates with a terminal (e.g., a terminal of a pedestrian or a shop, or a Machine Type Communication (MTC) terminal) existing in the vicinity of the host vehicle using peer-to-peer (P2P) technology. Further, for example, the communication unit 103 performs vehicle-to-everything (V2X) communication such as vehicle-to-vehicle communication, vehicle-to-infrastructure communication, vehicle-to-home communication, and vehicle-to-pedestrian communication. Further, for example, the communication unit 103 includes a beacon receiving unit that receives radio waves or electromagnetic waves transmitted from wireless stations or the like installed on roads, and obtains information such as the current position, congestion, traffic restrictions, or required time.
The in-vehicle device 104 includes, for example, a mobile device or a wearable device owned by a passenger, an information device carried in or attached to a host vehicle, a navigation apparatus that searches for a route to an arbitrary destination, and the like.
The output control unit 105 controls output of various types of information directed to an occupant of the host vehicle or the outside of the host vehicle. For example, the output control unit 105 generates an output signal including at least one of visual information (e.g., image data) or auditory information (e.g., voice data) and supplies them to the output unit 106, thereby controlling output of the visual information and auditory information from the output unit 106. Specifically, for example, the output control unit 105 synthesizes image data imaged by different imaging devices of the data acquisition unit 102 to generate a top view image, a panoramic image, and the like, and outputs an output signal including the generated images to the output unit 106. Further, for example, the output control unit 105 generates voice data including a warning sound, a warning message, or the like for a hazard such as a collision, a contact, or an entrance into an unsafe zone, and supplies an output signal including the generated voice data to the output unit 106.
The output unit 106 includes a device capable of outputting visual information or auditory information to an occupant of the host vehicle or to the outside of the host vehicle. For example, the output unit 106 includes a display device, an instrument panel, an audio speaker, headphones, a wearable device such as a glasses-type display worn by an occupant, a projector, a lamp, and the like. The display device included in the output unit 106 may be, in addition to a device having a normal display, a device that displays visual information in the field of view of the driver, such as a head-up display, a transmissive display, or a device having an augmented reality (AR) display function.
The drive system control unit 107 generates various control signals and supplies them to the drive system 108, thereby controlling the drive system 108. Further, the drive system control unit 107 supplies a control signal and an error signal to each unit of the drive system 108 as necessary, thereby performing notification of a control state and abnormality of the drive system 108, and the like.
The drive system 108 includes various devices associated with the drive system of the host vehicle. For example, the drive system 108 includes: a driving force generating apparatus for generating a driving force such as an internal combustion engine or a driving motor, a driving force transmitting mechanism for transmitting the driving force to wheels, a steering mechanism for adjusting a steering angle, a braking apparatus for generating a braking force, an anti-lock brake system (ABS), an Electronic Stability Control (ESC), an electric power steering apparatus, and the like.
The body system control unit 109 generates various control signals and supplies them to the body system 110, thereby controlling the body system 110. Further, the body system control unit 109 supplies a control signal and an error signal to each unit of the body system 110 as necessary, thereby performing notification of the control state of the body system 110, an abnormality, and the like.
The body system 110 includes various body-system devices mounted on the vehicle body. For example, the body system 110 includes a keyless entry system, a smart key system, a power window device, a power seat, a steering wheel, an air conditioner, an airbag, a seat belt, various lamps (e.g., headlamps, rear lamps, brake lamps, blinkers, fog lamps, etc.), and the like.
The storage unit 111 includes, for example, a Read Only Memory (ROM), a Random Access Memory (RAM), a magnetic storage device such as a Hard Disk Drive (HDD), a semiconductor storage device, an optical storage device, a magneto-optical storage device, and the like. The storage unit 111 stores various programs, data, and the like to be used by each unit of the vehicle control system 100. For example, the storage unit 111 stores map data such as a three-dimensional high-precision map (such as a dynamic map), a global map that is lower in precision than the high-precision map and covers a wide area, and a local map that includes information around the host vehicle.
The automatic driving control unit 112 performs control related to automatic driving such as autonomous traveling or driving support. Specifically, for example, the automatic driving control unit 112 performs cooperative control intended to realize functions of an Advanced Driver Assistance System (ADAS), including collision avoidance or impact mitigation for the host vehicle, following traveling based on inter-vehicle distance, constant-speed traveling, collision warning for the host vehicle, lane departure warning for the host vehicle, and the like. Further, for example, the automatic driving control unit 112 performs cooperative control intended for automatic driving or the like in which the vehicle travels autonomously without depending on the operation of the driver. The automatic driving control unit 112 includes a detection unit 131, a self-position estimation unit 132, a situation analysis unit 133, a planning unit 134, and an operation control unit 135.
The detection unit 131 detects various types of information required to control the automated driving. The detection unit 131 includes a vehicle external information detection unit 141, an in-vehicle information detection unit 142, and a vehicle state detection unit 143.
The vehicle external information detection unit 141 detects information external to the host vehicle based on data or signals from the units of the vehicle control system 100. For example, the vehicle external information detection unit 141 performs detection processing, recognition processing, and tracking processing on objects around the host vehicle, and detection processing of the distance to such objects. Examples of objects to be detected include vehicles, people, obstacles, structures, roads, traffic lights, traffic signs, and road signs. Further, for example, the vehicle external information detection unit 141 detects the environment around the host vehicle. Examples of the surrounding environment to be detected include weather, ambient temperature, humidity, brightness, and road surface condition. The vehicle external information detection unit 141 supplies data indicating the detection processing result to the self-position estimation unit 132, the map analysis unit 151, the traffic rule recognition unit 152, and the situation recognition unit 153 of the situation analysis unit 133, the emergency avoidance unit 171 of the operation control unit 135, and the like.
The in-vehicle information detection unit 142 detects the in-vehicle information based on data or signals from the respective units of the vehicle control system 100. For example, the in-vehicle information detection unit 142 performs authentication processing and recognition processing of the driver, detection processing of the driver state, detection processing of the occupant, detection processing of the in-vehicle environment, and the like. Examples of driver states to be detected include physical condition, wakefulness level, concentration level, fatigue level, direction of sight and intoxication level. Examples of the in-vehicle environment to be detected include ambient temperature, humidity, brightness, and odor. The in-vehicle information detection unit 142 supplies data indicating the detection processing result to the situation recognition unit 153 of the situation analysis unit 133, the emergency avoidance unit 171 of the operation control unit 135, and the like.
The vehicle state detection unit 143 detects the host vehicle state based on data or signals from the units of the vehicle control system 100. Examples of the host vehicle state to be detected include a speed, an acceleration level, a steering angle, presence/absence of an abnormality and contents of an abnormality, a driving operation state, a position and inclination of an electric seat, a door lock state, an air bag state, a magnitude of an impact from the outside, and a state of other on-vehicle devices. The vehicle state detection unit 143 supplies data indicating the detection processing result to the situation recognition unit 153 of the situation analysis unit 133, the emergency avoidance unit 171 of the operation control unit 135, and the like.
The self-position estimation unit 132 estimates the position, orientation, and the like of the host vehicle based on data or signals from the units of the vehicle control system 100, such as the vehicle external information detection unit 141 and the situation recognition unit 153 of the situation analysis unit 133. Further, the self-position estimation unit 132 generates a local map to be used for estimating the self position (hereinafter, referred to as a self-position estimation map) as necessary. The self-position estimation map is a high-precision map using a technique such as simultaneous localization and mapping (SLAM). The self-position estimation unit 132 supplies data indicating the estimation processing result to the map analysis unit 151, the traffic rule recognition unit 152, the situation recognition unit 153, and the like of the situation analysis unit 133. Further, the self-position estimating unit 132 causes the storage unit 111 to store the self-position estimation map.
The situation analysis unit 133 performs analysis processing of the host vehicle and the surrounding situation. The situation analysis unit 133 includes a map analysis unit 151, a traffic rule recognition unit 152, a situation recognition unit 153, and a situation prediction unit 154.
The map analysis unit 151 analyzes various maps stored in the storage unit 111 using data or signals from the units of the vehicle control system 100 such as the self-position estimation unit 132 and the vehicle external information detection unit 141 as necessary, and constructs a map including information necessary for processing automatic driving. The map analysis unit 151 supplies the constructed map to the traffic rule recognition unit 152, the situation recognition unit 153, the situation prediction unit 154, and the route planning unit 161, the action planning unit 162, the operation planning unit 163, and the like of the planning unit 134.
The traffic rule recognition unit 152 recognizes traffic rules around the host vehicle based on data or signals from the respective units of the vehicle control system 100, such as the self-position estimation unit 132, the vehicle external information detection unit 141, and the map analysis unit 151. Through this recognition processing, for example, the positions and states of traffic signals around the host vehicle, the content of traffic regulations around the host vehicle, lanes in which the vehicle can travel, and the like are recognized. The traffic rule recognition unit 152 supplies data indicating the recognition processing result to the situation prediction unit 154 and the like.
The situation recognition unit 153 recognizes situations related to the host vehicle based on data or signals from the respective units of the vehicle control system 100, such as the self-position estimation unit 132, the vehicle external information detection unit 141, the in-vehicle information detection unit 142, the vehicle state detection unit 143, and the map analysis unit 151. For example, the situation recognition unit 153 recognizes the situation of the host vehicle, the situation around the host vehicle, the situation of the driver of the host vehicle, and the like. Further, the situation recognition unit 153 generates a local map (hereinafter referred to as a situation recognition map) to be used for recognizing the situation around the host vehicle as necessary. The situation recognition map is, for example, an occupancy grid map.
Examples of the host vehicle situation to be recognized include a position, a posture, and a movement (e.g., a velocity, an acceleration level, a moving direction, etc.) of the host vehicle, and presence/absence of an abnormality and the content of the abnormality. Examples of situations around the host vehicle to be identified include: the type and location of the surrounding stationary objects, the type, location and movement (e.g., speed, acceleration level, direction of movement, etc.) of the surrounding moving objects, the configuration of the surrounding road and the state of the road surface, as well as the surrounding weather, ambient temperature, humidity and brightness. Examples of the driver state to be recognized include a physical condition, a wakefulness level, a concentration level, a fatigue level, a movement of sight line, and a driving operation.
The situation recognizing unit 153 supplies data (including a situation recognition map, if necessary) indicating the recognition processing result to the self-position estimating unit 132, the situation predicting unit 154, and the like. Further, the situation recognition unit 153 causes the storage unit 111 to store the situation recognition map.
The situation prediction unit 154 predicts a situation related to the host vehicle based on data or signals from the respective units of the vehicle control system 100, such as the map analysis unit 151, the traffic regulation recognition unit 152, and the situation recognition unit 153. For example, the situation prediction unit 154 predicts the situation of the host vehicle, the situation around the host vehicle, the situation of the driver, and the like.
Examples of the host vehicle situation to be predicted include the behavior of the host vehicle, the occurrence of an abnormality, and the travelable distance. Examples of the situation around the host vehicle to be predicted include behavior of a moving object around the host vehicle, a change in signal state, and a change in environment such as weather. Examples of driver situations to be predicted include the behavior and physical condition of the driver.
The situation prediction unit 154 supplies data indicating the result of the prediction processing to the route planning unit 161, the action planning unit 162, the operation planning unit 163, and the like of the planning unit 134 together with data from the traffic rule recognition unit 152 and the situation recognition unit 153.
The route planning unit 161 plans a route to a destination based on data or signals from the respective units of the vehicle control system 100 such as the map analysis unit 151 and the situation prediction unit 154. For example, the route planning unit 161 sets a route from the current position to a specified destination based on the global map. Further, the route planning unit 161 appropriately changes the route based on situations such as congestion, an accident, traffic control and construction, the physical condition of the driver, and the like, for example. The route planning unit 161 supplies data indicating a planned route to the action planning unit 162 and the like.
The action planning unit 162 plans an action of the host vehicle to safely travel within a planned time on the route planned by the route planning unit 161 based on data or signals from the units of the vehicle control system 100 such as the map analysis unit 151 and the situation prediction unit 154. For example, the action planning unit 162 plans start, stop, traveling direction (e.g., forward movement, backward movement, left turn, right turn, direction change, etc.), traveling lane, traveling speed, passing, and the like. The action planning unit 162 supplies data indicating a planned action of the host vehicle to the operation planning unit 163 and the like.
The operation planning unit 163 plans the operation of the host vehicle to implement the action planned by the action planning unit 162 based on data or signals from the respective units of the vehicle control system 100, such as the map analysis unit 151 and the situation prediction unit 154. For example, the operation planning unit 163 plans acceleration, deceleration, a travel locus, and the like. The operation planning unit 163 supplies data indicating the planned operation of the host vehicle to the acceleration/deceleration control unit 172, the direction control unit 173, and the like of the operation control unit 135.
The operation control unit 135 controls the operation of the host vehicle. The operation control unit 135 includes an emergency avoidance unit 171, an acceleration/deceleration control unit 172, and a direction control unit 173.
The emergency avoidance unit 171 detects an emergency such as a collision, a contact, an entrance into an unsafe zone, an abnormality of the driver, or an abnormality of the vehicle based on the detection results of the vehicle external information detection unit 141, the in-vehicle information detection unit 142, and the vehicle state detection unit 143. In a case where the emergency avoidance unit 171 has detected the occurrence of an emergency, it plans an operation of the host vehicle for avoiding the emergency, such as a sudden stop or a sharp turn. The emergency avoidance unit 171 supplies data indicating the planned operation of the host vehicle to the acceleration/deceleration control unit 172, the direction control unit 173, and the like.
The acceleration/deceleration control unit 172 performs acceleration/deceleration control to realize the operation of the host vehicle planned by the operation planning unit 163 or the emergency avoidance unit 171. For example, the acceleration/deceleration control unit 172 calculates a control target value of the driving force generation device or the brake device to achieve a planned acceleration, deceleration, or sudden stop, and supplies a control command indicating the calculated control target value to the drive system control unit 107.
The direction control unit 173 performs direction control to realize the operation of the host vehicle planned by the operation planning unit 163 or the emergency avoidance unit 171. For example, the direction control unit 173 calculates a control target value of the steering mechanism to realize a travel locus or a sharp turn planned by the operation planning unit 163 or the emergency avoidance unit 171, and supplies a control command indicating the calculated control target value to the drive system control unit 107.
<2. first embodiment >
Next, a first embodiment of the present technology will be described with reference to fig. 2 and 3.
< exemplary configuration of Signal processing System 201 >
Fig. 2 illustrates an exemplary configuration of a signal processing system 201 to which the present technique is applied.
The signal processing system 201 is a system that recognizes the content of a captured image, detects an abnormality based on the recognition result or the like, and, in a case where an abnormality has been detected, transmits text information including data representing the recognition result or the like in characters (hereinafter referred to as character data) to a predetermined notification destination.
Note that the character data includes, for example, not only so-called text data but also data obtained by converting data expressed in characters or the like into an image.
Further, fig. 2 illustrates an exemplary case in which the signal processing system 201 is provided in the vehicle 10 and detects an abnormality (e.g., an accident, an abnormality of the driver, etc.) of at least one of the vehicle 10 or the surroundings of the vehicle 10.
The signal processing system 201 includes an imaging unit 211, a receiving unit 212, a signal processing unit 213, a transmitting unit 214, and a storage unit 215.
The imaging unit 211 images at least one of the surroundings or the interior of the vehicle 10, for example. The imaging unit 211 supplies image data including an image that has been captured (hereinafter, referred to as a captured image) to the signal processing unit 213, and causes the storage unit 215 to store the image data. For example, the imaging unit 211 constitutes part of the data acquisition unit 102 of the vehicle control system 100.
The receiving unit 212 receives data for abnormality detection and text information generation from the outside of the vehicle and the inside of the vehicle via the communication network 121, and supplies the received data to the signal processing unit 213. For example, the receiving unit 212 constitutes part of the communication unit 103 of the vehicle control system 100 and part of the communication unit (not shown) of the automatic driving control unit 112.
The signal processing unit 213 detects an abnormality based on the image data and the received data, and in the case where an abnormality has been detected, generates text information to supply it to the transmission unit 214. The signal processing unit 213 constitutes, for example, parts of the detection unit 131 and the situation recognition unit 153 of the automatic driving control unit 112 of the vehicle control system 100, and includes a recognition unit 221, a text information generation unit 222, an abnormality detection unit 223, and a transmission control unit 224.
The recognition unit 221 recognizes the content of the captured image, and supplies recognition data indicating the recognition result to the text information generation unit 222 and the abnormality detection unit 223. For example, the recognition unit 221 uses a recognition model constructed by machine learning such as deep learning.
The text information generating unit 222 generates text information including character data representing the content of the captured image (identification data) and the content of the received data, and causes the storage unit 215 to store it.
The abnormality detection unit 223 detects an abnormality based on the identification data and the received data, and supplies data indicating the detection result to the transmission control unit 224.
The transmission control unit 224 controls transmission of the text information by the transmission unit 214 based on the abnormality detection result.
The transmission unit 214 transmits the text information to a predetermined notification destination outside the vehicle under the control of the transmission control unit 224. Note that the communication method of the transmitting unit 214 is not particularly limited. For example, the transmitting unit 214 constitutes part of the communication unit 103 of the vehicle control system 100.
The storage unit 215 constitutes part of the storage unit 111 of the vehicle control system 100.
< Exception Notification processing >
Next, the abnormality notification processing to be executed by the signal processing system 201 will be described with reference to the flowchart of fig. 3.
This processing is started, for example, when the power of the signal processing system 201 is turned on, and is ended when the power is turned off.
In step S1, the imaging unit 211 starts imaging processing. Specifically, the imaging unit 211 starts imaging, supplies image data including the obtained captured image to the recognition unit 221, and also starts processing of causing the storage unit 215 to store the image data. Note that the image data stored in the storage unit 215 is erased, for example, after a predetermined time (e.g., one hour) has elapsed.
In step S2, the recognition unit 221 starts the recognition processing. Specifically, the recognition unit 221 recognizes the content of the captured image, and starts the process of supplying the recognition data indicating the recognition result to the text information generation unit 222 and the abnormality detection unit 223.
Examples of the content of the captured image to be recognized include information associated with an abnormality to be detected by the abnormality detection unit 223 (for example, information to be used for detection and analysis of an abnormality).
For example, in the case where the captured image is an image obtained by imaging the surroundings of the vehicle 10, the characteristics and state of the surrounding vehicle, the characteristics and position of the driver of the surrounding vehicle, the characteristics and position of the surrounding pedestrians (including two-wheeled vehicles), the surrounding situation, and the like will be recognized.
Examples of the characteristics of the surrounding vehicles include the type of vehicle, color, and contents of the vehicle number plate.
Examples of the state of the surrounding vehicle include a speed and a traveling direction.
Examples of characteristics of drivers and pedestrians of surrounding vehicles include gender, age, body type, hair style, skin tone, clothing, and accessories (e.g., hat, glasses, etc.). Note that, for example, personal information obtained by performing face recognition or the like based on a captured image may be included.
Examples of the surrounding situation include weather, road surface state, presence/absence of an obstacle, presence/absence of an accident, and the situation of an accident. Examples of accident situations include the type of accident (e.g., single-vehicle accident, property damage accident, personal injury accident, etc.), the presence/absence of injured people, the vehicle damage situation, and the presence/absence of a fire.
Further, for example, in the case where the captured image is an image obtained by imaging the interior of the vehicle 10, the characteristics, state, and the like of the driver of the vehicle 10 will be recognized.
For example, the characteristics of the driver of the vehicle 10 are similar to the characteristics of the drivers of the surrounding vehicles of the vehicle 10 described above.
Examples of the state of the driver of the vehicle 10 include the physical condition, wakefulness level (e.g., presence/absence of dozing), concentration level, fatigue level, line-of-sight direction, intoxication level (e.g., possibility of drinking), and whether a seat belt is fastened. Note that the state of the driver is recognized by, for example, a driver monitoring system (DMS) or the like. For example, the possibility of drinking can be recognized from pupil saccades or the like.
In step S3, the reception unit 212 starts data reception. Specifically, the receiving unit 212 starts processing of receiving data from the outside of the vehicle and the inside of the vehicle via the communication network 121 and supplying it to the text information generating unit 222 and the abnormality detecting unit 223.
Examples of the received data include information associated with an abnormality to be detected by the abnormality detection unit 223 (e.g., information to be used for detection and analysis of an abnormality). For example, the data received from outside the vehicle includes data received by the communication unit 103 from the in-vehicle device 104, a device existing on an external network, a terminal and a base station existing in the vicinity of the vehicle 10, another vehicle, a pedestrian, a road-attached device, a home, and the like. Examples of the data received from the vehicle interior include data indicating the results of the detection processing performed by the above-described vehicle exterior information detection unit 141, the in-vehicle information detection unit 142, and the vehicle state detection unit 143, and voice data of the vehicle 10 interior obtained by a microphone included in the input unit 101.
In step S4, the abnormality detection unit 223 starts detecting an abnormality based on the identification data and the received data.
For example, the abnormality detection unit 223 detects an accident involving the vehicle 10 based on the state of an airbag of the vehicle 10, the magnitude of an impact on the vehicle 10 from the outside, and the like. Further, for example, the abnormality detection unit 223 detects an accident around the vehicle 10 based on information associated with the situation around the vehicle 10. Note that an accident around the vehicle 10 does not necessarily involve the vehicle 10, and may include, for example, an accident between other vehicles. Further, for example, the abnormality detection unit 223 starts detecting an abnormality of the driver based on information associated with the state of the driver. Examples of the driver's abnormality to be detected include dozing, a drunken state, syncope, cramping, and bleeding.
In step S5, the text information generating unit 222 starts generating text information. Specifically, the text information generating unit 222 starts processing of generating text information including character data representing at least one of the content of the captured image (identification data), the content of data received from outside the vehicle, or the content of data received from inside the vehicle, and of causing the storage unit 215 to store the text information. With this arrangement, the text information is continuously generated regardless of the abnormality detection result. Note that the text information stored in the storage unit 215 is erased, for example, after a predetermined time (e.g., one minute) has elapsed.
Examples of the text information include information associated with an abnormality to be detected by the abnormality detecting unit 223.
Examples of information associated with an anomaly include information indicating the content of the anomaly, information indicating the risk of the anomaly, and information to be used to analyze the anomaly.
Specifically, for example, the text information includes character data representing the characteristics and states of the surrounding vehicles of the vehicle 10 described above, the characteristics of the drivers of the surrounding vehicles, the characteristics and positions of the surrounding pedestrians, the surrounding situations, and the characteristics and states of the drivers of the vehicle 10.
Note that, for example, in a case where an accident has occurred, the text information may also include character data representing information (for example, the contents of the vehicle number plate) associated with surrounding vehicles other than the vehicle that caused the accident. With this arrangement, it becomes possible, for example, to later collect witness information about the accident from the drivers of those surrounding vehicles and the like.
For example, the text information includes information associated with the vehicle 10, such as character data representing characteristics and states of the vehicle 10. For example, the characteristics and states of the vehicle 10 are similar to those of surrounding vehicles of the vehicle 10 described above.
For example, in the case where an accident occurs, the text information includes character data representing information associated with the situation of the accident. Examples of accident situations include time of occurrence, place of occurrence, type of accident, presence/absence of injured person, vehicle damage situation, and presence/absence of fire occurrence.
For example, the text information includes character data representing the content of speech obtained by performing voice recognition on voice data from the interior of the vehicle 10.
In step S6, the abnormality detection unit 223 determines whether an abnormality is detected based on the result of the abnormality detection processing. The determination processing of step S6 is repeatedly executed until it is determined that an abnormality is detected. Then, in a case where it is determined that the abnormality is detected, the process proceeds to step S7.
In step S7, the signal processing system 201 starts transmission of text information. Specifically, the abnormality detection unit 223 notifies the transmission control unit 224 of the occurrence of an abnormality.
The transmission control unit 224 reads, from the storage unit 215, the text information generated during the period from a predetermined time (for example, 10 seconds) before the abnormality was detected until the abnormality was detected, and transmits it to a predetermined notification destination via the transmission unit 214. Further, the transmission control unit 224 starts processing of reading the latest text information generated by the text information generating unit 222 from the storage unit 215 and transmitting it to the predetermined notification destination.
For example, the notification destination is set as a predetermined center. Then, for example, the text information is transmitted from the center to various relevant places such as a police department, a hospital, an insurance company, and a security company, or notification based on the text information is performed as necessary. Note that, for example, the notification destination may be directly set as each relevant place.
In step S8, the abnormality detection unit 223 determines whether the abnormality has ended based on the result of the abnormality detection processing. The determination processing of step S8 is repeatedly executed until it is determined that the abnormality has ended, and in the case where it is determined that the abnormality has ended, the processing proceeds to step S9.
In step S9, the signal processing system 201 stops transmitting the text information. Specifically, the abnormality detection unit 223 notifies the transmission control unit 224 of the end of the abnormality.
The transmission control unit 224 stops the transmission of the text information.
Note that, for example, the transmission control unit 224 may continue transmission of the text information for a predetermined time after determining that the abnormality has ended.
Thereafter, the process returns to step S6, and the process of step S6 and subsequent processes are executed.
As described above, in a case where an accident, an abnormality of the driver, or the like occurs, text information including character data representing information associated with the abnormality that has occurred is transmitted to a predetermined notification destination.
With this arrangement, it becomes possible to use the text information at the notification destination and the transfer destination without analyzing images and the like. As a result, the occurrence and situation of an abnormality can be grasped quickly, and action against the abnormality can be taken. For example, if there are injured people, an ambulance can immediately head to the accident site. For example, in the event of a fire, a fire engine can immediately head to the accident site. For example, in a case where a vehicle involved in the accident has fled, the police can immediately take tracking or enforcement measures.
<3. second embodiment >
Next, a second embodiment of the present technology will be described with reference to fig. 4 and 5.
In the second embodiment, generation of text information is started or stopped as necessary.
< exemplary configuration of Signal processing System 301 >
Fig. 4 illustrates an exemplary configuration of a signal processing system 301 to which the present technique is applied. Note that, in a manner similar to fig. 2, fig. 4 illustrates an exemplary case in which the signal processing system 301 is provided in the vehicle 10 and detects an abnormality (e.g., an accident, an abnormality of the driver, etc.) of at least one of the vehicle 10 or the surroundings of the vehicle 10. Further, in the figure, portions corresponding to those of the signal processing system 201 of fig. 2 are denoted by the same reference symbols, and description thereof will be omitted as appropriate.
The signal processing system 301 is the same as the signal processing system 201 in that the imaging unit 211, the receiving unit 212, the transmitting unit 214, and the storage unit 215 are included, and is different in that the signal processing unit 311 is included instead of the signal processing unit 213. The signal processing unit 311 is the same as the signal processing unit 213 in that the recognition unit 221 is included, and is different in that the abnormality detection unit 321, the text information generation unit 322, and the transmission control unit 323 are included instead of the abnormality detection unit 223, the text information generation unit 222, and the transmission control unit 224.
The abnormality detection unit 321 is the same as the abnormality detection unit 223 of the signal processing system 201 in that an abnormality is detected based on the identification data and the received data, and is different in that a sign of an abnormality is also detected. The abnormality detection unit 321 supplies data indicating the detection results to the text information generating unit 322.
In a similar manner to the text information generating unit 222, the text information generating unit 322 generates text information based on the identification data and the received data. However, unlike the text information generating unit 222, the text information generating unit 322 starts or stops the generation of the text information based on the sign of the abnormality and the detection result of the abnormality. The text information generating unit 322 supplies the generated text information to the transmission control unit 323, and causes the storage unit 215 to store the text information.
In the case where the transmission control unit 323 obtains text information from the text information generating unit 322, it transmits the obtained text information to a predetermined notification destination via the transmitting unit 214.
< Exception Notification processing >
Next, the abnormality notification processing to be executed by the signal processing system 301 will be described with reference to the flowchart of fig. 5.
This process is started when the power of the signal processing system 301 is on, for example, and is ended when the power is off.
In steps S101 to S103, similar processing to that in steps S1 to S3 of fig. 3 is performed.
In step S104, the abnormality detection unit 321 starts detecting an abnormality. Specifically, in a manner similar to the processing of the abnormality detection unit 223 in step S4 of Fig. 3, the abnormality detection unit 321 starts detecting an abnormality, and also starts detecting signs of an abnormality.
Examples of the signs of abnormality to be detected include risk factors causing an accident and operations for avoiding an accident. Examples of risk factors that cause an accident include unsafe driving of the vehicle 10 and surrounding vehicles, dangerous pedestrians (including two-wheeled vehicles), driver's abnormalities, and surrounding unsafe situations.
Examples of unsafe driving of the vehicle 10 and surrounding vehicles include drowsy driving, drunk driving, driving without lights, inattentive driving, meandering driving, reckless driving, ignoring traffic signals, tailgating, speeding, skidding, sudden starts, sudden acceleration, sudden braking, and sudden steering.
Examples of dangerous pedestrians include pedestrians who are darting out (or may dart out) into the road, pedestrians in blind spots of the driver of the vehicle 10, pedestrians ignoring traffic lights, pedestrians walking on the roadway, and pedestrians who are wandering.
Examples of unsafe surrounding situations include earthquakes, heavy fog, floods, storms, snowstorms, fires, rock falls, obstacles, road collapses, and frozen road surfaces.
Examples of the operation for avoiding the accident include sudden braking and sudden steering.
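Note that, purely as an illustration, the categories of signs exemplified above could be organized as an enumeration such as the following; the type and its members are hypothetical groupings, not part of the embodiment.

```python
from enum import Enum, auto

class AbnormalitySign(Enum):
    # Risk factors that may cause an accident
    UNSAFE_DRIVING = auto()        # drowsy driving, tailgating, speeding, ...
    DANGEROUS_PEDESTRIAN = auto()  # darting out, walking on the roadway, ...
    DRIVER_ABNORMALITY = auto()
    UNSAFE_SURROUNDINGS = auto()   # heavy fog, rock falls, frozen roads, ...
    # Operations for avoiding an accident
    SUDDEN_BRAKING = auto()
    SUDDEN_STEERING = auto()
```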
In step S105, the abnormality detection unit 321 determines whether a sign of abnormality is detected. In a case where it is determined that the sign of the abnormality is not detected, the process proceeds to step S106.
In step S106, it is determined whether an abnormality is detected in a manner similar to the process of step S6 in fig. 3. In a case where it is determined that the abnormality is not detected, the process returns to step S105.
Thereafter, the processing of steps S105 and S106 is repeatedly executed until it is determined in step S105 that a sign of abnormality is detected or it is determined in step S106 that an abnormality is detected.
On the other hand, in the case where it is determined in step S105 that the sign of the abnormality is detected, that is, in the case where the risk of occurrence of the abnormality increases, the process of step S106 is skipped, and the process proceeds to step S107.
Further, in a case where it is determined in step S106 that an abnormality is detected, the process also proceeds to step S107. This corresponds to a case where an abnormality is suddenly detected without any preceding sign.
In step S107, the signal processing system 301 starts generation and transmission of text information. Specifically, the abnormality detection unit 321 notifies the text information generating unit 322 that a sign of an abnormality or an abnormality has been detected.
In a manner similar to the processing of the text information generating unit 222 in step S5 of fig. 3, the text information generating unit 322 starts generating text information. Further, the text information generating unit 322 starts a process of supplying the generated text information to the transmission control unit 323 and causing the storage unit 215 to store the text information. Note that the text information stored in the storage unit 215 is erased after a predetermined time (for example, one minute).
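Note that a minimal sketch of this time-limited storage behavior is given below, assuming a one-minute retention period; the class and its interface are hypothetical.

```python
import collections
import time

class TextInformationStore:
    """Stores text information and erases entries older than a
    predetermined retention time (assumed here to be one minute)."""

    def __init__(self, retention_seconds: float = 60.0):
        self._retention = retention_seconds
        self._entries = collections.deque()  # (timestamp, text) pairs

    def store(self, text: str) -> None:
        self._entries.append((time.monotonic(), text))
        self._prune()

    def _prune(self) -> None:
        # Drop entries whose age exceeds the retention period.
        now = time.monotonic()
        while self._entries and now - self._entries[0][0] > self._retention:
            self._entries.popleft()
```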
Note that, in a case where a sign of an abnormality is detected, the text information includes, for example, character data representing information associated with the sign of the abnormality. Examples of the information associated with the sign of the abnormality include the content of the sign, and the time and place at which the sign occurred.
For example, when information associated with unsafe driving, which is one sign of an abnormality, is included in the text information, accident analysis accuracy is improved in a case where an accident occurs, making it possible to identify the cause of the accident and the like accurately.
The transmission control unit 323 starts processing of transmitting the text information obtained from the text information generating unit 322 to a predetermined notification destination via the transmitting unit 214.
In step S108, the abnormality detection unit 321 determines whether the sign of the abnormality or the abnormality has ended. This determination process is repeatedly executed until it is determined that the sign of the abnormality or the abnormality has ended. Then, in a case where it is determined that the sign of the abnormality or the abnormality has ended, the process proceeds to step S109. This includes: a case where an abnormality is detected after a sign of the abnormality is detected, and thereafter the abnormality is no longer detected; a case where, after a sign of the abnormality is detected, neither the sign nor an abnormality is detected; and a case where an abnormality is detected without any sign of the abnormality, and thereafter the abnormality is no longer detected.
In step S109, the signal processing system 301 stops the generation and transmission of the text information. Specifically, the abnormality detection unit 321 notifies the text information generating unit 322 that the sign of the abnormality or the abnormality has ended.
The text information generating unit 322 stops generating the text information. The transmission control unit 323 stops the process of transmitting the text information.
Note that, for example, the text information generating unit 322 and the transmission control unit 323 may continue the generation and transmission of the text information for a predetermined time after it is determined that the sign of the abnormality or the abnormality has ended.
Thereafter, the process returns to step S105, and the process of step S105 and subsequent processes are executed.
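Note that, putting steps S105 to S109 together, the control flow of fig. 5 could be sketched as the following loop; the detector, generator, transmitter, and storage objects are hypothetical stand-ins for the units described above, and the grace period reflects the optional continuation described above.

```python
import time

def abnormality_notification_loop(detector, generator, transmitter, storage,
                                  grace_seconds: float = 5.0) -> None:
    """Hypothetical rendering of steps S105-S109 of fig. 5: generation and
    transmission of text information start when a sign of an abnormality or
    an abnormality is detected, and stop once both have ended."""
    active = False    # True while text information is generated/transmitted
    ended_at = None   # time at which the sign/abnormality was seen to end
    while True:
        sign, abnormality = detector.poll()          # steps S105 and S106
        if not active and (sign or abnormality):
            active = True                            # step S107: start
        if active:
            text = generator.generate()
            storage.store(text)                      # erased after ~1 minute
            transmitter.transmit(text)
            if sign or abnormality:
                ended_at = None
            else:                                    # step S108: ended?
                if ended_at is None:
                    ended_at = time.monotonic()
                elif time.monotonic() - ended_at >= grace_seconds:
                    active, ended_at = False, None   # step S109: stop
        time.sleep(0.1)                              # then return to S105
```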
As described above, since a sign of an abnormality is detected and text information is generated only after the risk of occurrence of an abnormality increases, the processing load of the signal processing system 301 can be reduced.
Further, text information is generated and transmitted not only in a case where an abnormality is detected but also in a case where a sign of an abnormality is detected. With this arrangement, the notification destination of the text information can prepare in advance for the occurrence of an abnormality, which makes a quick response to the occurrence of an abnormality possible. Further, the abnormality can be analyzed in greater detail and more accurately.
<4. Modified examples>
Hereinafter, a modification of the embodiment according to the present technology described above will be described.
The signal processing system 201 and the signal processing system 301 may each be configured by, for example, one semiconductor chip or a plurality of semiconductor chips.
For example, the imaging unit 211 of the signal processing system 201 may be provided in an image sensor, and the other units may be provided in another semiconductor chip (e.g., a semiconductor chip for an advanced driver assistance system (ADAS)). For example, part (e.g., the recognition unit 221) or all of the imaging unit 211 and the signal processing unit 213 may be provided in the image sensor, and the other units may be provided in another semiconductor chip (e.g., a semiconductor chip for ADAS). Further, for example, the entire signal processing system 201 may be configured as one image sensor.
Similarly, for example, the imaging unit 211 of the signal processing system 301 may be provided in an image sensor, and the other units may be provided in another semiconductor chip (e.g., a semiconductor chip for ADAS). For example, part (e.g., the recognition unit 221) or all of the imaging unit 211 and the signal processing unit 311 may be provided in the image sensor, and the other units may be provided in another semiconductor chip (e.g., a semiconductor chip for ADAS). Further, for example, the entire signal processing system 301 may be configured as one image sensor.
Further, for example, the signal processing system 201 and the signal processing system 301 may each be configured by one device, or by a plurality of devices having different housings.
For example, the signal processing system 201 may be configured as one imaging device. Alternatively, for example, the imaging unit 211 of the signal processing system 201 may be provided in an imaging device, and the other units may be provided in an ADAS electronic control unit (ECU) of the vehicle.
Similarly, for example, the signal processing system 301 may be configured as one imaging device. For example, the imaging unit 211 of the signal processing system 301 may be provided in an imaging device, and the other units may be provided in the ADAS ECU of the vehicle.
Further, for example, in the first embodiment, in a similar manner to the second embodiment, the generation of text information may be started in a case where a sign of an abnormality is detected, and may be stopped in a case where the sign of the abnormality or the abnormality has ended.
Further, for example, in the second embodiment, in a similar manner to the first embodiment, transmission of text information may be started in a case where an abnormality is detected. Further, for example, in a manner similar to the first embodiment, text information generated during a period from a predetermined time before the abnormality is detected until the abnormality is detected may be transmitted.
Further, for example, transmission of the text information may be stopped once the text information has been transmitted for a predetermined time after the abnormality is detected, regardless of whether the abnormality has ended.
Further, for example, in a case where the vehicle 10 cannot transmit text information to the notification destination due to a malfunction or the like, and it is possible to communicate with surrounding vehicles by short-range communication, the text information may be transmitted to a surrounding vehicle, and the surrounding vehicle may transmit the text information to the notification destination on behalf of the vehicle 10.
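Note that one possible sketch of this fallback is shown below; all interfaces are hypothetical, and the short-range links stand in for communication with surrounding vehicles.

```python
def transmit_with_fallback(text: str, direct_link, short_range_links) -> bool:
    """Try the direct path to the notification destination; otherwise ask a
    surrounding vehicle, reached by short-range communication, to relay."""
    if direct_link.available():
        direct_link.send(text)
        return True
    for link in short_range_links:  # e.g., vehicle-to-vehicle links
        if link.available():
            link.send({"relay_to_notification_destination": True,
                       "payload": text})
            return True
    return False  # no path found; the text remains in the storage unit
```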
Further, for example, the signal processing system 201 and the signal processing system 301 may be installed at a fixed place and used to monitor an abnormality such as a traffic accident in a predetermined monitoring area. Assumed examples of the target monitoring area include intersections, highways, and railroad crossings.
In this case, the text information includes, for example, character data representing information associated with the situation of the monitoring area. Examples of the information associated with the situation of the monitoring area include vehicles, drivers, pedestrians, weather, the road surface state, the presence or absence of obstacles, the presence or absence of an accident, the situation of any accident in the monitoring area, and the content of voice recognition of voice data in the monitoring area.
Further, the signal processing system 201 and the signal processing system 301 may be provided in a moving body other than a vehicle, and may be used to give notification of various types of abnormality of the moving body. Assumed examples of the target moving body include a motorcycle, a bicycle, a personal mobility device, an airplane, a ship, a construction machine, and an agricultural machine (agricultural tractor). Further, for example, a moving body that is remotely driven (operated) without a user aboard, such as a drone or a robot, is also included. Assumed examples of abnormalities to be notified include accidents, falls, damage, and malfunctions.
In this case, the text information includes, for example, character data representing information associated with the moving body, the driver of the moving body (in a case where a driver is present), and the situation of the abnormality (for example, an accident or the like), and character data representing the content of voice recognition of voice data in the moving body. Further, in a case where an accident involving the moving body occurs and there is another party to the accident, the text information includes, for example, character data representing information associated with the other party to the accident.
Further, the signal processing system 201 and the signal processing system 301 may be disposed in a predetermined monitoring area, and may be used for crime prevention, disaster prevention, and the like. Assumed examples of the target monitoring area include various facilities (e.g., shops, companies, schools, factories, stations, airports, warehouses, etc.), business locations, streets, parking lots, residences, and places where natural disasters are assumed to occur. Assumed examples of abnormalities to be notified include entry of suspicious persons, theft, vandalism, suspicious behavior, fire, and natural disasters (e.g., floods, tsunamis, volcanic eruptions, etc.).
In this case, the text information includes, for example, character data representing information associated with the situation of the monitoring area. Examples of the information associated with the situation of the monitoring area include persons, objects, weather, the presence or absence of an abnormality, the situation of any abnormality in the monitoring area, and the content of voice recognition of voice data in the monitoring area.
Further, for example, the content of the text information may be changed according to the situation. Further, for example, the text information may be transmitted multiple times.
Further, for example, the above-described processing may be performed using only image data without using the received data.
Further, the text information may be used, for example, in a dynamic map used for autonomous driving. The dynamic map includes, for example, static information that changes little over time, such as road surfaces, lanes, and structures; quasi-static information such as traffic control schedules and road construction schedules; quasi-dynamic information such as accidents and congestion; and dynamic information such as surrounding vehicles and traffic signal information. Then, for example, the quasi-dynamic information and the like held at the center serving as the notification destination are updated using the text information.
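Note that the layered structure described above could be sketched as follows; the container and the update function are hypothetical illustrations of how the center serving as the notification destination might use received text information.

```python
from dataclasses import dataclass, field

@dataclass
class DynamicMap:
    """Hypothetical container for the four layers of a dynamic map."""
    static: dict = field(default_factory=dict)        # road surfaces, lanes, structures
    quasi_static: dict = field(default_factory=dict)  # traffic control / construction schedules
    quasi_dynamic: dict = field(default_factory=dict) # accidents, congestion
    dynamic: dict = field(default_factory=dict)       # surrounding vehicles, signals

def update_quasi_dynamic(dynamic_map: DynamicMap, place: str, text: str) -> None:
    # The center serving as the notification destination could parse received
    # text information and refresh the quasi-dynamic layer for that place.
    dynamic_map.quasi_dynamic[place] = text
```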
<5. Others>
<Exemplary computer configuration>
The series of processes described above may be executed by hardware or software. In a case where the series of processes is executed by software, a program included in the software is installed in a computer. Here, examples of the computer include a computer incorporated in dedicated hardware, a general-purpose personal computer capable of executing various functions with various programs installed, and the like.
Fig. 6 is a block diagram illustrating an exemplary hardware configuration of a computer that executes the above-described series of processes using a program.
In the computer 1000, a Central Processing Unit (CPU) 1001, a Read Only Memory (ROM) 1002, and a Random Access Memory (RAM) 1003 are coupled to each other via a bus 1004.
An input/output interface 1005 is further connected to the bus 1004. An input unit 1006, an output unit 1007, a recording unit 1008, a communication unit 1009, and a drive 1010 are connected to the input/output interface 1005.
The input unit 1006 includes an input switch, a button, a microphone, an image pickup device, and the like. The output unit 1007 includes a display, a speaker, and the like. The recording unit 1008 includes a hard disk, a nonvolatile memory, and the like. The communication unit 1009 includes a network interface and the like. The drive 1010 drives a removable medium 1011 such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory.
In the computer 1000 configured as described above, for example, the CPU 1001 loads a program stored in the recording unit 1008 into the RAM 1003 via the input/output interface 1005 and the bus 1004 and executes the program, thereby performing the series of processes described above.
The program to be executed by the computer 1000 (the CPU 1001) may be provided by being recorded in the removable medium 1011 as a package medium or the like, for example. Further, the program may be provided through a wired or wireless transmission medium such as a local area network, the Internet, or digital satellite broadcasting.
In the computer 1000, by attaching the removable medium 1011 to the drive 1010, the program can be installed in the recording unit 1008 via the input/output interface 1005. Further, the program may be received by the communication unit 1009 via a wired or wireless transmission medium and installed in the recording unit 1008. In addition, the program may be installed in the ROM 1002 or the recording unit 1008 in advance.
Note that the program to be executed by the computer may be a program that performs processing in a time-series manner according to the order described in this specification, or may be a program that performs processing in parallel or at necessary timing such as when making a call.
Further, in this specification, a system means a set of a plurality of constituent elements (devices, modules (parts), and the like), and it does not matter whether all the constituent elements are in the same housing. Therefore, a plurality of devices accommodated in separate housings and connected through a network, and one device in which a plurality of modules are accommodated in one housing, are both systems.
Further, the embodiments of the present technology are not limited to the above-described embodiments, and various modifications may be made without departing from the gist of the present technology.
For example, the present technology may employ a configuration of cloud computing in which one function is shared and joint-processed by a plurality of devices via a network.
Further, the steps described in the above flowcharts may be performed by one device or shared by a plurality of devices.
Further, in the case where a plurality of processes are included in one step, the plurality of processes included in this step may be executed by one device or shared by a plurality of devices.
<Exemplary configuration combinations>
The present technology can also adopt the following configuration.
(1) A signal processing apparatus comprising:
an identifying unit that identifies content of the captured image imaged by the imaging unit;
a text information generating unit that generates text information including data representing the content of the recognized captured image in characters; and
a transmission control unit that controls transmission of the text information.
(2) The signal processing apparatus according to the above (1), wherein,
the signal processing apparatus is provided in a vehicle, and
the text information generating unit generates the text information based on the identified content of the captured image and the content of data received from at least one of the interior of the vehicle or the exterior of the vehicle.
(3) The signal processing apparatus according to the above (2), wherein,
the text information includes data that represents information associated with an abnormality of at least one of the vehicle or a periphery of the vehicle in characters.
(4) The signal processing apparatus according to the above (3), wherein,
the information associated with the abnormality includes at least one of a characteristic of another vehicle around the vehicle, a state of the another vehicle, a characteristic of a driver of the another vehicle, a situation of an accident, a characteristic of the vehicle, a state of the vehicle, a characteristic of a driver of the vehicle, or a state of a driver of the vehicle.
(5) The signal processing apparatus according to the above (4), wherein,
the characteristic of the other vehicle includes a content of a vehicle number plate of the other vehicle.
(6) The signal processing apparatus according to any one of the above (1) to (5), further comprising:
an abnormality detection unit that detects an abnormality based on the identified content of the captured image, wherein,
the transmission control unit controls transmission of the text information based on a detection result of the abnormality.
(7) The signal processing apparatus according to the above (6), wherein,
the transmission control unit starts transmission of the text information in a case where the abnormality is detected.
(8) The signal processing device according to the above (7), wherein,
the text information generating unit continuously generates the text information regardless of the detection result of the abnormality, and
in a case where the abnormality is detected, the transmission control unit starts transmission of the text information and also transmits the text information generated during a period from a predetermined time before the abnormality is detected until the abnormality is detected.
(9) The signal processing device according to the above (7), wherein,
the text information generation unit starts generation of the text information in a case where the sign of the abnormality is detected.
(10) The signal processing apparatus according to the above (6), wherein,
the text information generating unit starts generation of the text information in a case where a sign of the abnormality is detected, and
the transmission control unit starts transmission of the text information in a case where the sign of the abnormality is detected.
(11) The signal processing device according to the above (10), wherein,
the text information includes data representing information associated with the sign of the abnormality in characters.
(12) The signal processing device according to the above (10), wherein,
the signal processing apparatus is provided in a vehicle, and
the sign of the abnormality includes at least one of a risk factor for an accident of the vehicle or an operation of the vehicle for avoiding an accident.
(13) The signal processing apparatus according to any one of the above (6) to (12), wherein,
the text information includes data representing information associated with the abnormality in characters.
(14) The signal processing apparatus according to any one of the above (6) to (13), further comprising:
a receiving unit that receives data including information associated with the abnormality, wherein
the text information further includes data representing the content of the received data in characters.
(15) The signal processing device according to the above (14), wherein,
the abnormality detection unit further detects the abnormality based on the received data.
(16) The signal processing apparatus according to the above (14) or (15), wherein,
the received data includes voice data, and
the text information includes data representing contents of speech recognition of the speech data in characters.
(17) The signal processing apparatus according to the above (1), wherein,
the imaging unit images a predetermined monitoring area, and
the text information includes data representing information associated with a situation of the monitoring area in characters.
(18) The signal processing apparatus according to any one of the above (1) to (17), further comprising:
the imaging unit.
(19) The signal processing device according to the above (18), further comprising:
an image sensor including the imaging unit and the recognition unit.
(20) A signal processing method, comprising:
identifying content of a captured image imaged by an imaging unit;
generating text information including data representing content of the recognized captured image in characters; and
controlling the sending of the text information.
(21) A program for causing a computer to execute a process comprising the steps of:
identifying content of a captured image imaged by an imaging unit;
generating text information including data representing content of the recognized captured image in characters; and
controlling the sending of the text information.
(22) An imaging device comprising:
an imaging unit;
an identifying unit that identifies content of the captured image imaged by the imaging unit;
a text information generating unit that generates text information including data representing the content of the recognized captured image in characters; and
a transmission control unit that controls transmission of the text information.
Note that the effects described herein are merely examples and are not limiting, and additional effects may be included.
REFERENCE SIGNS LIST
10 vehicle
100 vehicle control system
101 input unit
102 data acquisition unit
103 communication unit
141 vehicle external information detecting unit
142 in-vehicle information detection unit
143 vehicle state detecting unit
153 situation recognition unit
201 signal processing system
211 imaging unit
212 receiving unit
213 Signal processing Unit
214 sending unit
221 identification unit
222 text information generating unit
223 abnormality detection unit
224 transmission control unit
301 signal processing system
311 signal processing unit
321 abnormality detection unit
322 text information generating unit
323 transmission control unit

Claims (22)

1. A signal processing apparatus comprising:
an identifying unit that identifies content of the captured image imaged by the imaging unit;
a text information generating unit that generates text information including data representing the content of the recognized captured image in characters; and
a transmission control unit that controls transmission of the text information.
2. The signal processing apparatus according to claim 1,
the signal processing apparatus is provided in a vehicle, and
the text information generating unit generates the text information based on the identified content of the captured image and the content of data received from at least one of the interior of the vehicle or the exterior of the vehicle.
3. The signal processing apparatus according to claim 2,
the text information includes data that represents information associated with an abnormality of at least one of the vehicle or a periphery of the vehicle in characters.
4. The signal processing apparatus according to claim 3,
the information associated with the abnormality includes at least one of a characteristic of another vehicle around the vehicle, a state of the another vehicle, a characteristic of a driver of the another vehicle, a situation of an accident, a characteristic of the vehicle, a state of the vehicle, a characteristic of a driver of the vehicle, or a state of a driver of the vehicle.
5. The signal processing apparatus according to claim 4,
the characteristic of the other vehicle includes a content of a vehicle number plate of the other vehicle.
6. The signal processing apparatus of claim 1, further comprising:
an abnormality detection unit that detects an abnormality based on the identified content of the captured image, wherein,
the transmission control unit controls transmission of the text information based on a detection result of the abnormality.
7. The signal processing apparatus according to claim 6,
the transmission control unit starts transmission of the text information in a case where the abnormality is detected.
8. The signal processing apparatus according to claim 7,
the text information generating unit continuously generates the text information regardless of the detection result of the abnormality, and
in a case where the abnormality is detected, the transmission control unit starts transmission of the text information and also transmits the text information generated during a period from a predetermined time before the abnormality is detected until the abnormality is detected.
9. The signal processing apparatus according to claim 7,
the text information generation unit starts generation of the text information in a case where the sign of the abnormality is detected.
10. The signal processing apparatus according to claim 6,
the text information generating unit starts generation of the text information in a case where a sign of the abnormality is detected, and
the transmission control unit starts transmission of the text information in a case where the sign of the abnormality is detected.
11. The signal processing apparatus according to claim 10,
the text information includes data representing information associated with the sign of the abnormality in characters.
12. The signal processing apparatus according to claim 10,
the signal processing apparatus is provided in a vehicle, and
the sign of the abnormality includes at least one of a risk factor for an accident of the vehicle or an operation of the vehicle for avoiding an accident.
13. The signal processing apparatus according to claim 6,
the text information includes data representing information associated with the abnormality in characters.
14. The signal processing apparatus of claim 6, further comprising:
a receiving unit that receives data including information associated with the abnormality, wherein
the text information further includes data representing the content of the received data in characters.
15. The signal processing apparatus according to claim 14,
the abnormality detection unit further detects the abnormality based on the received data.
16. The signal processing apparatus according to claim 14,
the received data includes voice data, and
the text information includes data representing contents of speech recognition of the speech data in characters.
17. The signal processing apparatus according to claim 1,
the imaging unit images a predetermined monitoring area, and
the text information includes data representing information associated with a situation of the monitoring area in characters.
18. The signal processing apparatus of claim 1, further comprising:
the imaging unit.
19. The signal processing apparatus of claim 18, further comprising:
an image sensor including the imaging unit and the recognition unit.
20. A signal processing method, comprising:
identifying content of a captured image imaged by an imaging unit;
generating text information including data representing content of the recognized captured image in characters; and
controlling the sending of the text information.
21. A program for causing a computer to execute a process comprising the steps of:
identifying content of a captured image imaged by an imaging unit;
generating text information including data representing content of the recognized captured image in characters; and
controlling the sending of the text information.
22. An imaging device comprising:
an imaging unit;
an identifying unit that identifies content of the captured image imaged by the imaging unit;
a text information generating unit that generates text information including data representing the content of the recognized captured image in characters; and
a transmission control unit that controls transmission of the text information.
CN202080037215.4A 2019-05-28 2020-05-15 Signal processing device, signal processing method, program, and imaging device Pending CN113841187A (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP2019-099011 2019-05-28
JP2019099011 2019-05-28
PCT/JP2020/019373 WO2020241292A1 (en) 2019-05-28 2020-05-15 Signal processing device, signal processing method, program, and imaging device

Publications (1)

Publication Number Publication Date
CN113841187A true CN113841187A (en) 2021-12-24

Family

ID=73553442

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202080037215.4A Pending CN113841187A (en) 2019-05-28 2020-05-15 Signal processing device, signal processing method, program, and imaging device

Country Status (5)

Country Link
US (1) US20220309848A1 (en)
JP (1) JP7367014B2 (en)
CN (1) CN113841187A (en)
DE (1) DE112020002741T5 (en)
WO (1) WO2020241292A1 (en)

Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020115423A1 (en) * 2001-02-19 2002-08-22 Yasuhiko Hatae Emergency information notifying system, and apparatus, method and moving object utilizing the emergency information notifying system
JP2004217188A (en) * 2003-01-17 2004-08-05 Matsushita Electric Ind Co Ltd On-vehicle display device and display method
JP2005338934A (en) * 2004-05-24 2005-12-08 Nissan Motor Co Ltd In-vehicle communication device
JP2007172483A (en) * 2005-12-26 2007-07-05 Kayaba Ind Co Ltd Drive recorder
KR20090081459A (en) * 2008-01-24 2009-07-29 주식회사 토페스 Traffic Condition Information Service System
JP2011079350A (en) * 2009-10-02 2011-04-21 Toyota Motor Corp Failure detecting device for vehicle, electronic control unit, failure detection method for vehicle
JP2012095040A (en) * 2010-10-26 2012-05-17 Nippon Seiki Co Ltd Imaging device
JP2014123303A (en) * 2012-12-21 2014-07-03 Secom Co Ltd Monitoring system
JP2015207049A (en) * 2014-04-17 2015-11-19 株式会社デンソー Vehicle accident situation prediction device, vehicle accident situation prediction system and vehicle accident notification device
CN107161097A (en) * 2017-04-06 2017-09-15 南京航空航天大学 Vehicle running intelligent security system based on triones navigation system
CN207022040U (en) * 2015-12-29 2018-02-16 昶洧新能源汽车发展有限公司 Mobile unit and service provider computer
CN207097257U (en) * 2015-12-29 2018-03-13 昶洧新能源汽车发展有限公司 Onboard system

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006120137A (en) * 2001-02-19 2006-05-11 Hitachi Kokusai Electric Inc Image information reporting system
JP2017090220A (en) 2015-11-09 2017-05-25 トヨタ自動車株式会社 Radar device


Also Published As

Publication number Publication date
WO2020241292A1 (en) 2020-12-03
JPWO2020241292A1 (en) 2020-12-03
US20220309848A1 (en) 2022-09-29
JP7367014B2 (en) 2023-10-23
DE112020002741T5 (en) 2022-03-03


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination