WO2020065708A1 - Computer system, dangerous driving vehicle notification method, and program - Google Patents

Computer system, dangerous driving vehicle notification method, and program

Info

Publication number
WO2020065708A1
WO2020065708A1 (PCT/JP2018/035343)
Authority
WO
WIPO (PCT)
Prior art keywords
vehicle
dangerous driving
camera device
image
distributed ledger
Prior art date
Application number
PCT/JP2018/035343
Other languages
English (en)
Japanese (ja)
Inventor
篤 古城
将仁 谷口
Original Assignee
株式会社ウフル
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社ウフル filed Critical 株式会社ウフル
Priority to PCT/JP2018/035343 priority Critical patent/WO2020065708A1/fr
Priority to JP2020547623A priority patent/JPWO2020065708A1/ja
Publication of WO2020065708A1 publication Critical patent/WO2020065708A1/fr

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06Q: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00: Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q50/40: Business processes related to the transportation industry
    • G: PHYSICS
    • G08: SIGNALLING
    • G08G: TRAFFIC CONTROL SYSTEMS
    • G08G1/00: Traffic control systems for road vehicles
    • G08G1/16: Anti-collision systems
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00: Television systems
    • H04N7/18: Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast

Definitions

  • The present invention relates to a computer system including a camera device provided in a vehicle and a distributed ledger, a method of notifying of a dangerously driven vehicle, and a program.
  • The present invention relates to IoT (Internet of Things), and its technical field corresponds to G06Q or the like in the IPC classification.
  • In recent years, dangerous driving such as tailgating (road rage driving) has become a social problem.
  • Examples of such dangerous driving include abnormally close approach, persistent following and chasing, abrupt cutting in, unnecessary sudden braking, honking and headlight flashing, pulling alongside (squeezing), and shouting at other drivers. There is therefore a need for a technique for identifying and reporting a vehicle engaged in such dangerous driving.
  • As related art, a configuration has been disclosed in which the presence of a dangerous vehicle is notified based on the degree of danger of a stopped vehicle located in the vicinity, rather than on dangerous driving itself (see Patent Document 1).
  • In this configuration, a stopped vehicle located in the vicinity is photographed, and the degree of danger is determined based on the photographed image.
  • However, Patent Document 1 merely determines whether a stopped vehicle is a dangerous vehicle; it does not determine whether a vehicle traveling around the own vehicle is a dangerous vehicle, nor does it judge whether that vehicle is engaged in dangerous driving such as tailgating.
  • An object of the present invention is to provide a computer system, a dangerous driving vehicle notification method, and a program that can easily notify that a dangerous driving vehicle is present in the vicinity while guaranteeing the trustworthiness of the recorded data.
  • The present invention provides the following solutions.
  • The present invention provides a computer system including a camera device provided in a vehicle and a distributed ledger, the computer system comprising: acquisition means for acquiring an image of another vehicle in the vicinity taken by the camera device; analyzing means for analyzing the image to specify the vehicle number of the other vehicle in the vicinity; determining means for analyzing the image to determine whether the other vehicle in the vicinity is engaged in dangerous driving; recording means for, when it is determined that the other vehicle is engaged in dangerous driving, recording the specified vehicle number in the distributed ledger as a dangerous driving vehicle; detecting means for analyzing the image to detect whether or not the other vehicle in the vicinity has been replaced by a different vehicle; inquiring means for, when it is detected that the vehicle has been replaced, querying the distributed ledger for the vehicle number of the replacing vehicle; and notifying means for, when the query shows that the vehicle number of the replacing vehicle is recorded in the distributed ledger, notifying a user terminal that a dangerous driving vehicle is present in the vicinity.
  • According to the present invention, a computer system including a camera device provided in a vehicle and a distributed ledger acquires an image of another vehicle in the vicinity taken by the camera device, analyzes the image to identify the vehicle number of the other vehicle in the vicinity, analyzes the image to determine whether the other vehicle in the vicinity is engaged in dangerous driving, and, when it determines that the other vehicle is engaged in dangerous driving, records the identified vehicle number in the distributed ledger as a dangerous driving vehicle.
  • The computer system further analyzes the image to detect whether or not the other vehicle in the vicinity has been replaced by a different vehicle and, when it detects such a replacement, queries the distributed ledger for the vehicle number of the replacing vehicle. If, as a result of the query, the vehicle number of the replacing vehicle is recorded in the distributed ledger, the computer system notifies a user terminal that a dangerous driving vehicle is present in the vicinity.
  • The present invention is described in the category of a computer system, but other categories such as a method and a program exhibit the same functions and effects.
  • According to the present invention, it is possible to provide a computer system, a dangerous driving vehicle notification method, and a program that can easily notify that a dangerous driving vehicle is present in the vicinity while ensuring the trustworthiness of the recorded data.
  • The system is realized by the camera devices provided in the vehicles and the distributed ledger, without the intervention of a server, so the recorded data is less likely to be falsified.
  • FIG. 1 is a diagram showing an outline of a dangerous driving vehicle notification system 1.
  • FIG. 2 is an overall configuration diagram of the dangerous driving vehicle notification system 1.
  • FIG. 3 is a flowchart showing a dangerous driving vehicle recording process executed by the camera device 10.
  • FIG. 4 is a flowchart illustrating a first dangerous driving vehicle notification process executed by the camera device 10.
  • FIG. 5 is a flowchart illustrating a second dangerously-driving vehicle notification process executed by the camera device 10.
  • FIG. 6 is a diagram illustrating an example of a notification screen in which the camera device 10 notifies a user terminal that a dangerous driving vehicle is present in the vicinity.
  • FIG. 7 is a diagram illustrating an example of a notification screen in which the camera device 10 notifies a user terminal that a dangerous driving vehicle is present in the vicinity.
  • FIG. 8 is a diagram illustrating an example of a notification screen in which the camera device 10 notifies the user terminal that the dangerously-driving vehicle exists in its own traveling direction.
  • FIG. 9 is a diagram illustrating an example of a notification screen in which the camera device 10 notifies the user terminal that the dangerous driving vehicle exists on the route to the destination.
  • FIG. 1 is a diagram for explaining an outline of a dangerous driving vehicle notification system 1 according to a preferred embodiment of the present invention.
  • the dangerous driving vehicle notification system 1 is a computer system that includes a camera device 10 and a distributed ledger and notifies a dangerous driving vehicle.
  • The dangerous driving vehicle notification system 1 may also include other terminals such as a user terminal owned by a user (not shown), for example an in-vehicle terminal such as a car navigation system or a mobile terminal such as a smartphone or tablet.
  • The camera device 10 is connected, via a public line network or the like, to camera devices 10 provided in other vehicles and to user terminals so as to be able to perform data communication, and executes the necessary data transmission and reception. The camera device 10 also shares the distributed ledger and performs the necessary data queries, recording, and the like.
  • the camera device 10 captures an image such as a moving image or a still image of another vehicle (hereinafter, referred to as another vehicle) around the vehicle in which the camera device 10 is provided, and acquires the image. At this time, the camera device 10 may acquire time information and position information at which the image was taken. The camera device 10 acquires a shooting time as time information from a built-in timer or the like. Further, the camera device 10 acquires its own position as position information from GPS (Global Positioning System) or the like.
  • the camera device 10 may collect not only images but also sounds from other vehicles using a sound collecting device such as a microphone.
  • the camera device 10 analyzes the image and extracts feature points (shape, contour, etc.) or feature amounts (statistical values such as average, variance, and histogram of pixel values).
  • The camera device 10 specifies the vehicle number of the other vehicle (the automobile registration number mark, vehicle number mark, or sign, that is, the license plate) based on the extracted feature points or feature amounts. Further, based on the extracted feature points or feature amounts, the camera device 10 identifies the speed difference, distance, and positional relationship between the vehicle provided with the camera device and the other vehicle, the time period during which the other vehicle has been in the vicinity, the traveling trajectory of the other vehicle, the direction of its headlights, whether they are lit, and the like.
  • The camera device 10 determines whether or not the other vehicle is engaged in dangerous driving based on the result of the image analysis.
  • For example, the camera device 10 makes this determination based on at least one of the following criteria: whether the other vehicle has stayed in the vicinity at or above a predetermined speed for a predetermined time or more, whether the inter-vehicle distance is equal to or less than a predetermined value, the presence or absence of meandering (weaving), the presence or absence of pulling alongside (squeezing), the presence or absence of an unnecessary sudden stop, and the presence or absence of tailgating.
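  • As an illustration only, a rule-based check of this kind could be sketched as follows in Python; the observation fields and the threshold values are assumptions made for this example and are not specified by the present disclosure.

```python
from dataclasses import dataclass

@dataclass
class Observation:
    """Items identified for the other vehicle from the analyzed images (illustrative)."""
    speed_diff_kmh: float     # speed difference relative to the own vehicle
    gap_m: float              # inter-vehicle distance in metres
    seconds_nearby: float     # time the other vehicle has stayed in the vicinity
    meandering: bool          # weaving detected from the traveling trajectory
    pulling_alongside: bool   # squeezing / driving alongside detected
    sudden_stop: bool         # unnecessary sudden stop detected

# Illustrative thresholds; the disclosure only speaks of "predetermined" values.
MIN_SPEED_DIFF_KMH = 30.0
MIN_SECONDS_NEARBY = 60.0
MAX_GAP_M = 5.0

def is_dangerous_driving(obs: Observation) -> bool:
    """Return True if at least one dangerous-driving criterion is met."""
    return any([
        obs.speed_diff_kmh >= MIN_SPEED_DIFF_KMH and obs.seconds_nearby >= MIN_SECONDS_NEARBY,
        obs.gap_m <= MAX_GAP_M,
        obs.meandering,
        obs.pulling_alongside,
        obs.sudden_stop,
    ])
```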
  • When the camera device 10 determines that the other vehicle is engaged in dangerous driving, it records the vehicle number specified as a result of the image analysis in the distributed ledger as a dangerous driving vehicle.
  • The distributed ledger may be, for example, a blockchain, IOTA, or the like.
  • The camera device 10 combines the newly determined vehicle number with the data recorded in the past and records the result in the distributed ledger.
  • Recording in the distributed ledger may be permitted only for camera devices 10 that have been authenticated in advance. Further, the camera device 10 may issue a token to the user associated with the camera device 10 when it newly records data in the distributed ledger.
  • The camera device 10 may record the time information and the position information in the distributed ledger together with the vehicle number.
  • The camera device 10 analyzes each newly obtained image and detects whether or not the other vehicle in the vicinity has been replaced by a different vehicle.
  • The camera device 10 determines whether or not the other vehicle has been replaced by identifying the vehicle number in the newly acquired image and comparing it with the vehicle numbers identified in the images acquired so far.
  • When the camera device 10 detects that the other vehicle has been replaced, it queries the distributed ledger for the vehicle number of the replacing vehicle at the timing of the replacement.
  • That is, the camera device 10 specifies the vehicle number of the replacing vehicle and checks the vehicle number specified this time against the vehicle numbers recorded in the distributed ledger.
  • If, as a result of the query, this vehicle number is recorded in the distributed ledger, the camera device 10 notifies the user terminal that a dangerous driving vehicle is present in the vicinity.
  • The camera device 10 notifies, for example via a car navigation system, that a dangerous driving vehicle is present in the vicinity by outputting a sound or displaying a message on a display unit.
  • The camera device 10 may also notify, for example via a smartphone application, that a dangerous driving vehicle is present in the vicinity by outputting a sound or displaying a message on a display unit. If the camera device 10 has acquired time information and position information along with the image, it may make the notification by displaying the position where the dangerous driving vehicle exists on a map displayed by the user terminal, according to the time information and the position information.
  • The camera device 10 may also display, on the map displayed by the user terminal, a dangerous driving vehicle located in its own traveling direction or on the route to the destination.
  • The camera device 10 may perform this notification by consuming the token. In this case, the notification is sent to the user terminal associated with the user to whom the token was issued, that is, to the user holding the token.
  • The distributed ledger is realized by distributed ledger technology and is a ledger shared and managed by the nodes (the camera devices 10).
  • Data is recorded in a single ledger shared by the nodes; each new record is added as a block chained to the previous ones, and every node shares these blocks.
  • the camera device 10 captures an image of another vehicle, such as a moving image or a still image, and acquires the captured image (step S01).
  • The camera device 10 photographs, as the vehicle number, the license plate of the other vehicle, together with part or all of the vehicle.
  • the camera device 10 may not only acquire an image but also collect sound emitted from another vehicle by a sound collecting device such as a microphone. Further, the camera device 10 may acquire time information (acquired from a built-in timer or the like) and position information (acquired from a GPS or the like) at the time of capturing an image.
  • the camera device 10 analyzes the acquired image to extract feature points or feature amounts in the image.
  • the camera device 10 specifies a vehicle number (a license plate in this example) of another vehicle based on the extracted feature points or feature amounts (step S02).
  • Based on the feature points or feature amounts in the image, the camera device 10 may also identify the distance to the other vehicle (for example, the inter-vehicle distance and changes in it, used to identify unnecessary sudden stops and the like), the positional relationship with the other vehicle (used to identify driving alongside, abrupt cutting in, unnecessary sudden stops, and the like), the speed and speed difference of the other vehicle (used to identify meandering, changes in the inter-vehicle distance, abrupt cutting in, unnecessary sudden stops, and the like), changes in the headlights (used to identify headlight flashing and the like), and the traveling trajectory (used to identify meandering and the like).
  • These identified items mainly correspond to dangerous driving. That is, as a result of the image analysis, the camera device 10 identifies whether or not the other vehicle is driving in a manner that corresponds to dangerous driving.
  • When the camera device 10 has acquired sound, it analyzes the sound with a spectrum analyzer or the like and recognizes it based on the sound waveform. The camera device 10 then identifies abnormal sounds (such as a horn or shouting) from the recognized sound. That is, as a result of the sound recognition, the camera device 10 similarly identifies whether or not the other vehicle is engaged in dangerous driving.
  • Next, the camera device 10 determines, as a result of the image analysis, whether or not the other vehicle is driving in a manner that corresponds to dangerous driving (step S03).
  • For example, the camera device 10 makes this determination based on at least one of the following criteria: the other vehicle travels at or above a predetermined speed, stays in the vicinity for a predetermined time or more, keeps an inter-vehicle distance of not more than a predetermined value, meanders, drives alongside, stops suddenly, or tailgates.
  • The camera device 10 is not limited to these criteria, and may determine whether or not the vehicle is a dangerous driving vehicle based on other criteria such as headlight flashing or abrupt cutting in.
  • In addition to the examples above, the camera device 10 may also determine whether or not the vehicle is a dangerous driving vehicle based on the identified abnormal sounds.
  • When the camera device 10 determines that dangerous driving is being performed, it records the identified vehicle number of the dangerous driving vehicle in the distributed ledger as a dangerous driving vehicle (step S04). For example, the camera device 10 combines the newly determined vehicle number with the data recorded in the past and records the result in the distributed ledger. Each node shares the updated distributed ledger.
  • Only camera devices 10 that have been authenticated in advance may be permitted to record in the distributed ledger. Further, when time information and position information have been acquired, the camera device 10 may record them together with the vehicle number. When recording the vehicle number in the distributed ledger, the camera device 10 may also issue a token to the user associated with the camera device 10.
  • When the camera device 10 acquires a new image, it analyzes the newly acquired image and extracts its feature points or feature amounts. The camera device 10 specifies the vehicle number of the other vehicle in this image based on the extracted feature points or feature amounts. The camera device 10 then detects whether or not the other vehicle has been replaced by comparing the vehicle number specified this time with the vehicle numbers specified in the images acquired so far (step S05). That is, it detects a replacement when the vehicle number specified this time differs from the vehicle numbers specified in the previous images.
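  • A minimal sketch of this replacement check in step S05, assuming the license-plate strings have already been read from the images (the class and helper names are hypothetical):

```python
from typing import Optional

class ReplacementDetector:
    """Detects that the other vehicle has been replaced by comparing plate numbers."""

    def __init__(self) -> None:
        self.last_plate: Optional[str] = None

    def update(self, plate: Optional[str]) -> bool:
        """Feed the plate number read from the newest image.

        Returns True when the plate differs from the previously seen one,
        i.e. the other vehicle has been replaced (step S05).
        """
        if plate is None:      # the plate could not be read from this frame
            return False
        replaced = self.last_plate is not None and plate != self.last_plate
        self.last_plate = plate
        return replaced

# Usage sketch:
# detector = ReplacementDetector()
# for frame_plate in ["品川 330 あ 12-34", "品川 330 あ 12-34", "横浜 500 さ 56-78"]:
#     if detector.update(frame_plate):
#         print("other vehicle replaced; query the distributed ledger for", frame_plate)
```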
  • the camera device 10 may have acquired time information and position information at the time of capturing the image.
  • When the camera device 10 detects that the other vehicle has been replaced, it queries the distributed ledger for the vehicle number of the replacing vehicle (the newly specified vehicle number) at the timing when the replacement is detected (step S06).
  • If, as a result of the query, this vehicle number is recorded in the distributed ledger, the camera device 10 notifies the user terminal (in-vehicle terminal or mobile terminal) that a dangerous driving vehicle is present in the vicinity (step S07).
  • the camera device 10 notifies the user by displaying a notification screen indicating that the dangerous driving vehicle is present in the vicinity on the display unit or outputting a notification sound, for example, via a car navigation system.
  • the camera device 10 notifies the user by displaying a notification screen indicating that the dangerously-driving vehicle is present in the vicinity on the display unit or outputting a notification sound, for example, via a smartphone application.
  • If the camera device 10 has acquired time information and position information, it may make the notification by displaying the position where the dangerous driving vehicle exists on a map displayed by the user terminal, according to the time information and the position information.
  • Further, based on the position information and time information of a dangerous driving vehicle identified by another camera device 10 serving as another node, the camera device 10 may make the notification by displaying, on the map displayed by the user terminal, a dangerous driving vehicle located in its own traveling direction or on the route to the destination.
  • the camera device 10 may notify that a dangerously driven vehicle is present in the vicinity by consuming the token.
  • the user terminal associated with the user to whom the token has been issued is notified after the token is consumed.
  • FIG. 2 is a diagram showing a system configuration of a dangerous driving vehicle notification system 1 according to a preferred embodiment of the present invention.
  • the dangerous driving vehicle notification system 1 is a computer system that includes a camera device 10 and a distributed ledger provided in the vehicle, and notifies a dangerous driving vehicle.
  • the camera device 10 is connected to a camera device 10 provided in another vehicle and a user terminal owned by a user so as to be able to perform data communication via a public line network or the like.
  • The dangerous driving vehicle notification system 1 may also include other terminals such as a user terminal owned by a user (not shown), for example an in-vehicle terminal such as a car navigation system or a mobile terminal such as a smartphone or tablet.
  • The camera device 10 is connected, via a public line network or the like, to camera devices 10 provided in other vehicles and to user terminals so as to be able to perform data communication, and executes the necessary data transmission and reception. The camera device 10 also shares the distributed ledger and performs the necessary data queries, recording, and the like.
  • the distributed ledger is a ledger shared and managed by each node as described above.
  • The camera device 10 includes, as a control unit, a CPU (Central Processing Unit), RAM (Random Access Memory), ROM (Read Only Memory), and the like, and includes, as a communication unit, a device enabling communication with a user terminal or another camera device 10, for example a Wi-Fi (Wireless Fidelity) device compliant with IEEE 802.11.
  • the camera device 10 includes, as a recording unit, a data storage unit such as a hard disk, a semiconductor memory, a recording medium, and a memory card.
  • the camera device 10 includes, as a processing unit, various devices that execute various processes.
  • In the camera device 10, the control unit reads a predetermined program and, in cooperation with the communication unit, realizes the position information acquisition module 20 and the notification module 21. Further, in the camera device 10, the control unit reads a predetermined program and, in cooperation with the recording unit, realizes the recording module 30. In addition, in the camera device 10, the control unit reads a predetermined program and, in cooperation with the processing unit, realizes the image capturing module 40, the sound collection module 41, the time information acquisition module 42, the image analysis module 43, the identification module 44, the voice recognition module 45, the dangerous driving determination module 46, the token issuing module 47, the other-vehicle replacement detection module 48, and the inquiry module 49.
  • FIG. 3 is a diagram illustrating a flowchart of the dangerous driving vehicle recording process executed by the camera device 10. The processing executed by each module described above will be described together with this processing.
  • First, the photographing module 40 captures an image, such as a moving image or a still image, of another vehicle around the vehicle in which the camera device 10 is provided (for example, a vehicle at any position within 360 degrees around the vehicle, or a vehicle in front of or behind it; hereinafter referred to as the other vehicle), thereby obtaining an image of the other vehicle (step S10).
  • The photographing module 40 photographs the vehicle number (license plate) of the other vehicle and part or all of the other vehicle.
  • the camera device 10 acquires an image of another vehicle by capturing an image of another vehicle by the imaging module 40.
  • the sound collection module 41 collects sound (for example, horn sound, occupant sound) emitted from another vehicle by a sound collection device such as a microphone (step S11).
  • The camera device 10 acquires the sound of the other vehicle by having the sound collection module 41 collect it.
  • Note that step S11 may be omitted.
  • In that case, the camera device 10 executes the processes described below with the audio-related processing omitted.
  • the time information acquisition module 42 acquires time information (step S12).
  • the time information acquisition module 42 acquires, as time information, the time (date and date and time) at which the image was captured from a timer or the like incorporated therein.
  • the position information acquisition module 20 acquires its own position information (Step S13).
  • the position information acquisition module 20 acquires information on its current position from GPS or the like as position information.
  • the position information acquisition module 20 acquires information on the current position of the vehicle provided with the position information acquisition module 20 by acquiring the position of itself (camera device 10) as position information.
  • steps S12 and S13 described above may be omitted.
  • the camera device 10 may execute a process in which processes corresponding to time information and position information are omitted in processes to be described later.
  • the image analysis module 43 performs image analysis on the acquired image (Step S14).
  • the image analysis module 43 performs image analysis by extracting feature points or feature amounts of the image.
  • the image analysis module 43 extracts shapes, contours, and the like as feature points. Further, the image analysis module 43 extracts a statistical value such as an average, a variance, and a histogram of pixel values as the feature amount.
  • the image analysis module 43 extracts a feature point or a feature amount to extract the other vehicle and the vehicle number of the other vehicle reflected in the image.
  • the identification module 44 identifies the extracted data relating to the other vehicle as the other vehicle image data based on the extracted feature points or feature amounts (step S15).
  • The identification module 44 specifies, as the other-vehicle image data, the vehicle number, the speed of the other vehicle, the speed difference between the own vehicle and the other vehicle, the distance and positional relationship between the own vehicle and the other vehicle, the time during which the other vehicle has been around the own vehicle, the direction of the other vehicle's headlights and whether they are lit, the traveling trajectory of the other vehicle, and the like.
  • The identification module 44 specifies the character string written on the license plate as the vehicle number.
  • The identification module 44 identifies the speed, speed difference, distance, positional relationship, headlight direction, whether the headlights are lit, traveling trajectory, and the like based on the state of the front, rear, or side of the other vehicle in the image, or on changes in that state.
  • The identification module 44 specifies, from the capturing times of the images, the time during which the other vehicle has been present in the vicinity.
  • That is, the identification module 44 specifies, as the other-vehicle image data, the vehicle number and items that correspond to various types of dangerous driving (another vehicle approaching and following at or above a predetermined speed, an abnormally short inter-vehicle distance, meandering, driving alongside, unnecessary sudden stops, abrupt cutting in, tailgating, and so on).
  • The identification module 44 is not limited to the examples described above and may specify other items.
  • The identification module 44 may also be configured to specify at least one of the items instead of all of the items described above. In this case, the camera device 10 uses the specified items in the processing described below.
  • The speed of the other vehicle, or the speed difference between the own vehicle and the other vehicle, is an item that may correspond to a sudden stop, driving alongside, abrupt cutting in, tailgating, or the like.
  • The distance or positional relationship between the own vehicle and the other vehicle is an item that may correspond to a sudden stop, driving alongside, abrupt cutting in, tailgating, meandering, or the like.
  • The time during which the other vehicle has been around the own vehicle is an item that may correspond to tailgating, persistent following, or the like.
  • The direction of the other vehicle's headlights and whether they are lit are items that may correspond to tailgating (for example, headlight harassment) or the like.
  • The traveling trajectory of the other vehicle is an item that may correspond to a sudden stop, driving alongside, abrupt cutting in, tailgating, meandering, or the like.
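  • The correspondence between the identified items and the dangerous-driving behaviors they may indicate, as listed above, can be restated as a small lookup table. This is only an illustrative summary; the key and category names are paraphrases introduced for the example.

```python
# Each identified item mapped to the dangerous-driving behaviors it may indicate.
ITEM_TO_CANDIDATE_BEHAVIORS = {
    "speed_or_speed_difference": {"sudden_stop", "driving_alongside", "cutting_in", "tailgating"},
    "distance_or_positional_relationship": {"sudden_stop", "driving_alongside", "cutting_in",
                                            "tailgating", "meandering"},
    "time_spent_nearby": {"tailgating", "persistent_following"},
    "headlight_direction_and_lighting": {"tailgating_headlight_harassment"},
    "traveling_trajectory": {"sudden_stop", "driving_alongside", "cutting_in",
                             "tailgating", "meandering"},
}

def candidate_behaviors(identified_items: set[str]) -> set[str]:
    """Union of dangerous-driving behaviors suggested by the identified items."""
    behaviors: set[str] = set()
    for item in identified_items:
        behaviors |= ITEM_TO_CANDIDATE_BEHAVIORS.get(item, set())
    return behaviors
```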
  • the voice recognition module 45 performs voice recognition on the obtained voice (step S16).
  • The voice recognition module 45 analyzes the acquired sound with a spectrum analyzer or the like and recognizes it. At this time, the voice recognition module 45 recognizes abnormal sounds (a horn, shouting, or the like) coming from the other vehicle.
  • The identification module 44 identifies data on the recognized abnormal sounds as the other-vehicle audio data (step S17). In step S17, the identification module 44 identifies the presence or absence of a horn, the presence or absence of shouting, and the like as the other-vehicle audio data.
  • That is, the identification module 44 specifies, as the other-vehicle audio data, items that correspond to various types of dangerous driving (for example, tailgating).
  • The identification module 44 is not limited to the examples described above and may specify other items.
  • The identification module 44 may also be configured to specify any one of the items instead of all of the items described above. In this case, the camera device 10 uses the specified items in the processing described below.
  • The presence or absence of a horn from the other vehicle, or the presence or absence of shouting, is an item that may correspond to tailgating.
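  • As one possible illustration of how an abnormal sound such as a horn might be flagged from the collected audio, the sketch below looks for a sustained dominant tone using an FFT. The disclosure only refers to a spectrum analyzer or the like; the frequency band, frame length, and duration used here are assumptions.

```python
import numpy as np

def dominant_frequency(frame: np.ndarray, sample_rate: int) -> float:
    """Return the frequency (Hz) with the largest magnitude in the frame."""
    spectrum = np.abs(np.fft.rfft(frame * np.hanning(len(frame))))
    freqs = np.fft.rfftfreq(len(frame), d=1.0 / sample_rate)
    return float(freqs[int(np.argmax(spectrum))])

def looks_like_horn(audio: np.ndarray, sample_rate: int,
                    band=(300.0, 800.0), min_seconds=0.5) -> bool:
    """Very rough horn heuristic: a dominant tone stays inside `band`
    for at least `min_seconds` of consecutive 50 ms frames."""
    frame_len = int(0.05 * sample_rate)
    needed = int(min_seconds / 0.05)
    run = 0
    for start in range(0, len(audio) - frame_len, frame_len):
        f0 = dominant_frequency(audio[start:start + frame_len], sample_rate)
        run = run + 1 if band[0] <= f0 <= band[1] else 0
        if run >= needed:
            return True
    return False
```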
  • the dangerous driving determination module 46 determines whether or not another vehicle is a dangerous driving vehicle that is performing a driving corresponding to the dangerous driving based on the results of the image analysis and the voice recognition (step S18).
  • For example, the dangerous driving determination module 46 determines whether the other vehicle is performing driving that corresponds to any of the following: traveling at or above a predetermined speed, staying in the vicinity for a predetermined time or more, keeping an inter-vehicle distance equal to or less than a predetermined threshold, meandering, driving alongside, stopping suddenly unnecessarily, or tailgating.
  • When the identified items correspond to such driving, the dangerous driving determination module 46 determines that the other vehicle is a dangerous driving vehicle. For example, when the items identified by the identification module 44 are the speed difference and the time, the other vehicle is determined to be a dangerous driving vehicle because its driving corresponds to tailgating. Similarly, whatever the number of identified items, if they correspond to at least one of the driving behaviors described above, the dangerous driving determination module 46 determines that the other vehicle is a dangerous driving vehicle.
  • Alternatively, the dangerous driving determination module 46 may set a score in advance for each of the items identified by the identification module 44, determine that another vehicle whose total score is equal to or greater than a predetermined threshold is a dangerous driving vehicle, and determine that another vehicle whose total does not reach this threshold is a safely driven vehicle that is not engaged in dangerous driving.
  • In this case, an equal score may be allocated to each item (for example, a score of "1" each for the speed, speed difference, distance, positional relationship, time, headlight direction and lighting, traveling trajectory, and horn or shouting), and the scores are added according to the number of identified items.
  • For example, when three items are identified, the dangerous driving determination module 46 calculates a score of "3". If this score is equal to or greater than the predetermined threshold, the dangerous driving determination module 46 determines that the other vehicle is a dangerous driving vehicle.
  • Similarly, the dangerous driving determination module 46 may set a score in advance for each of the items identified by the identification module 44, determine that another vehicle whose total score is equal to or greater than a predetermined threshold is a dangerous driving vehicle, and determine that another vehicle whose total does not reach this threshold is a safely driven vehicle that is not engaged in dangerous driving.
  • In this case, the scores are assigned unequally to the items (for example, the speed and speed difference are scored "3", the distance and positional relationship are scored "3", the headlight direction and lighting are scored "1", the traveling trajectory is scored "2", and a horn or shouting is scored "5"; behaviors that indicate more dangerous driving are given a higher score, and those that do not are given a lower score).
  • The scores are added according to the content of the identified items. For example, when the speed difference, the distance, and a horn are identified, the dangerous driving determination module 46 calculates a score of "11". When this score is equal to or greater than the predetermined threshold, the dangerous driving determination module 46 determines that the other vehicle is a dangerous driving vehicle.
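  • A sketch of this weighted scoring, using the example weights given above (speed or speed difference 3, distance or positional relationship 3, headlights 1, trajectory 2, horn or shouting 5). The threshold value and the item keys are assumptions for the example.

```python
# Example weights taken from the description above; the threshold is illustrative.
ITEM_SCORES = {
    "speed_or_speed_difference": 3,
    "distance_or_positional_relationship": 3,
    "headlight_direction_and_lighting": 1,
    "traveling_trajectory": 2,
    "horn_or_shouting": 5,
}
SCORE_THRESHOLD = 6  # assumed "predetermined threshold"

def is_dangerous_by_score(identified_items: set[str]) -> bool:
    total = sum(ITEM_SCORES.get(item, 0) for item in identified_items)
    return total >= SCORE_THRESHOLD

# e.g. {"speed_or_speed_difference", "distance_or_positional_relationship", "horn_or_shouting"}
# scores 3 + 3 + 5 = 11, which meets the threshold, so the vehicle is judged dangerous.
```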
  • the dangerous driving determination module 46 may be configured to determine whether another vehicle is a dangerous driving vehicle by a method other than the example described above.
  • In step S18, when the dangerous driving determination module 46 determines that the other vehicle is not a dangerous driving vehicle (step S18: NO), the camera device 10 ends this processing.
  • In step S18, when the dangerous driving determination module 46 determines that the other vehicle is a dangerous driving vehicle (step S18: YES), the recording module 30 records the identified vehicle number, time information, and position information of the other vehicle in the distributed ledger as dangerous vehicle data (step S19).
  • In step S19, the recording module 30 combines the dangerous vehicle data previously recorded in the distributed ledger with the dangerous vehicle data of the other vehicle newly determined this time, and records the result in the distributed ledger. The other nodes share the newly updated distributed ledger.
  • For example, the recording module 30 generates a hash value by combining the dangerous vehicle data of the other vehicle newly determined this time with the dangerous vehicle data previously recorded in the distributed ledger, and records this hash value in the distributed ledger.
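  • A minimal sketch of the hash chaining described here: each new dangerous-vehicle record is combined with the hash of the previous record before being appended, so later tampering breaks the chain and can be noticed by the other nodes. The record fields, the JSON encoding, and SHA-256 are assumptions made for the example.

```python
import hashlib
import json
from typing import List, Optional

def _hash_block(prev_hash: str, payload: dict) -> str:
    body = json.dumps(payload, sort_keys=True, ensure_ascii=False)
    return hashlib.sha256((prev_hash + body).encode("utf-8")).hexdigest()

class DangerousVehicleLedger:
    """Toy append-only ledger shared by the camera-device nodes."""

    def __init__(self) -> None:
        self.blocks: List[dict] = []

    def record(self, plate: str, timestamp: float, position: tuple, score: int) -> dict:
        prev_hash = self.blocks[-1]["hash"] if self.blocks else "0" * 64
        payload = {"plate": plate, "timestamp": timestamp,
                   "position": position, "score": score}
        block = {"payload": payload, "prev_hash": prev_hash,
                 "hash": _hash_block(prev_hash, payload)}
        self.blocks.append(block)
        return block

    def find(self, plate: str) -> Optional[dict]:
        """Return the most recent record for the given plate, if any."""
        for block in reversed(self.blocks):
            if block["payload"]["plate"] == plate:
                return block
        return None

# Usage sketch:
# ledger = DangerousVehicleLedger()
# ledger.record("品川 330 あ 12-34", 1695600000.0, (35.68, 139.76), score=11)
# assert ledger.find("品川 330 あ 12-34") is not None
```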
  • Only camera devices 10 that have been authenticated in advance may be permitted to record in the distributed ledger. That is, the recording module 30 may be allowed to record in the distributed ledger only when its camera device 10 has been authenticated in advance.
  • As an authentication method for the camera device 10, the user who owns the camera device 10 registers, via a dedicated application, an authentication site, or a prior application, identification information about the user (name, telephone number, identifier of the user terminal to be used, number of the vehicle in which the camera device 10 is installed, and so on) and a password, whereby the camera device 10 is authenticated as a device that is allowed to record in the distributed ledger.
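  • A sketch of the pre-registration check described here, in which only a device whose owner has registered identification details and a password may write to the distributed ledger. The in-memory store and salted SHA-256 hashing are assumptions for the example; a real deployment would use a proper credential service.

```python
import hashlib
import secrets

# device_id -> (salt, password_hash, owner details) registered via the
# dedicated application or authentication site described above.
_REGISTERED_DEVICES: dict[str, tuple[str, str, dict]] = {}

def register_device(device_id: str, password: str, owner: dict) -> None:
    salt = secrets.token_hex(16)
    pw_hash = hashlib.sha256((salt + password).encode("utf-8")).hexdigest()
    _REGISTERED_DEVICES[device_id] = (salt, pw_hash, owner)

def is_authenticated(device_id: str, password: str) -> bool:
    """Only authenticated camera devices may record in the distributed ledger."""
    entry = _REGISTERED_DEVICES.get(device_id)
    if entry is None:
        return False
    salt, pw_hash, _owner = entry
    return hashlib.sha256((salt + password).encode("utf-8")).hexdigest() == pw_hash

# Usage sketch:
# register_device("camera-001", "s3cret", {"name": "example user", "vehicle_no": "品川 330 あ 12-34"})
# assert is_authenticated("camera-001", "s3cret")
```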
  • the token issuing module 47 issues a token to the user who has recorded the dangerous vehicle data in the distributed ledger (step S20).
  • the token issuing module 47 issues a token to the user previously linked to the camera device 10. This token is issued each time dangerous vehicle data is recorded in the distributed ledger or every predetermined number of times.
  • An example of a user associated with the camera device 10 is realized by associating the identifier of the user with the identifier of the camera device 10 in advance.
  • Note that step S20 may be omitted.
  • In that case, the camera device 10 executes the processes described later with the token-related processing omitted.
  • the camera device 10 may notify that the dangerously-driving vehicle is in the vicinity.
  • the camera device 10 outputs a dangerous driving vehicle notification indicating that the dangerous driving vehicle is nearby to a user terminal or the like used by the user associated with the camera device 10.
  • the user terminal may receive the dangerous driving vehicle notification, display a notification screen described later on its own display unit, or emit a notification sound via a speaker or the like.
  • the camera device 10 executes the notification by consuming the issued token.
  • FIG. 4 is a diagram illustrating a flowchart of the first dangerous driving vehicle notification process executed by the camera device 10. The processing executed by each module described above will be described together with this processing.
  • the photographing module 40 photographs an image of another vehicle (step S30).
  • the processing in step S30 is the same as the processing in step S10 described above.
  • the time information acquisition module 42 acquires time information (step S31).
  • the processing in step S31 is the same as the processing in step S12 described above.
  • the position information acquisition module 20 acquires its own position information (step S32).
  • the processing in step S32 is the same as the processing in step S13 described above.
  • The image analysis module 43 performs image analysis on the acquired image (step S33).
  • the process in step S33 is the same as the process in step S14 described above.
  • the identification module 44 identifies the other vehicle image data based on the extracted feature points or feature amounts (Step S34).
  • the process of step S34 is substantially the same as the process of step S15 described above, but the other vehicle image data to be specified may be only the vehicle number.
  • The other-vehicle replacement detection module 48 compares the other-vehicle image data of the image acquired this time with the other-vehicle image data of the images acquired so far (for example, images acquired during the dangerous driving vehicle recording process described above, or images acquired at earlier times), and detects whether or not the other vehicle has been replaced (step S35). In step S35, the other-vehicle replacement detection module 48 compares the vehicle numbers in the respective sets of other-vehicle image data and detects whether or not the other vehicle has been replaced based on whether they differ.
  • In step S35, when the other-vehicle replacement detection module 48 does not detect that the other vehicle has been replaced (step S35: NO), the camera device 10 ends this processing.
  • In step S35, when the other-vehicle replacement detection module 48 detects that the other vehicle has been replaced (step S35: YES), the inquiry module 49 queries the distributed ledger for the vehicle number of the replacing vehicle (step S36).
  • In step S36, the inquiry module 49 checks, for example, the vehicle number of the replacing vehicle against the vehicle numbers of other vehicles previously recorded in the distributed ledger.
  • Alternatively, the inquiry module 49 checks the hash value of the vehicle number of the other vehicle against the hash values recorded in the distributed ledger.
  • As a result of the query, the inquiry module 49 determines whether or not the vehicle number of the other vehicle is recorded in the distributed ledger (step S37). In step S37, when the inquiry module 49 determines that the vehicle number is not recorded in the distributed ledger (step S37: NO), the camera device 10 ends this processing.
  • In step S37, when the inquiry module 49 determines that this vehicle number is recorded in the distributed ledger (step S37: YES), the notification module 21 notifies the user that a dangerous driving vehicle is present in the vicinity (step S38). In step S38, the notification module 21 sends this notification to the user terminal associated with the user.
  • The notification module 21 identifies the user terminal associated with the user by referring to a database or the like recorded in advance by the recording module 30, in which the identifier of the user is associated with the identifier of the user terminal, and notifies the identified user terminal that a dangerous driving vehicle is present around the user.
  • The notification module 21 notifies the user, for example, by notifying an in-vehicle terminal or a mobile terminal, as the user terminal, that a dangerous driving vehicle is present in the vicinity. At this time, the notification module 21 does so by displaying the message on the screen of the user terminal, outputting a sound, or both.
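  • A sketch of this lookup-and-notify step: the user identifier recorded in advance is mapped to a terminal identifier, and the message is pushed to that terminal. The mapping table and the `send_push` callable are placeholders; the disclosure does not specify a delivery mechanism.

```python
from typing import Callable

# user identifier -> user-terminal identifier, recorded in advance by the recording module.
USER_TO_TERMINAL = {"user-42": "terminal-abc123"}

def notify_dangerous_vehicle_nearby(user_id: str,
                                    send_push: Callable[[str, str], None]) -> bool:
    """Look up the user's terminal and send the 'dangerous vehicle nearby' message.

    `send_push(terminal_id, message)` stands in for whatever delivery channel
    the user terminal actually supports (car-navigation API, app push, ...).
    """
    terminal_id = USER_TO_TERMINAL.get(user_id)
    if terminal_id is None:
        return False
    send_push(terminal_id, "A dangerous driving vehicle is present around you. Drive carefully.")
    return True

# Usage sketch:
# notify_dangerous_vehicle_nearby("user-42", send_push=lambda terminal, msg: print(terminal, msg))
```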
  • the notification module 21 notifies the user that the dangerously driving vehicle is present around the user by consuming the token issued when the vehicle number of the dangerously driving vehicle is recorded in the distributed ledger.
  • the notification module 21 consumes one token when executing this notification to the user once.
  • the consumption of the token is not necessarily constant, and may be variable.
  • For example, by recording the score together with the dangerous vehicle data, the amount of tokens consumed may be changed according to the value of this score.
  • A vehicle with a higher score is a more malicious dangerous driving vehicle, so the token consumption for notifying about its vehicle number is set lower.
  • A vehicle with a lower score is a less malicious dangerous driving vehicle, so the token consumption for notifying about its vehicle number is set higher.
  • The token consumption may also be set to 0, or the relationship may be reversed.
  • the notification module 21 may notify the user terminal without requiring the token.
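  • A sketch that combines the token handling described above: a token is issued to the associated user each time dangerous-vehicle data is recorded, and a variable number of tokens is consumed per notification depending on the recorded score, with fewer tokens (down to zero) consumed for higher, more malicious scores. The cost table and amounts are assumptions for the example.

```python
class TokenAccount:
    """Per-user token balance used to pay for dangerous-vehicle notifications."""

    def __init__(self) -> None:
        self.balance = 0

    def issue(self, amount: int = 1) -> None:
        """Called when the user's camera device records dangerous-vehicle data (step S20)."""
        self.balance += amount

    def cost_for_score(self, score: int) -> int:
        # Higher score = more malicious vehicle = cheaper (or free) notification.
        if score >= 10:
            return 0
        if score >= 5:
            return 1
        return 2

    def consume_for_notification(self, score: int) -> bool:
        cost = self.cost_for_score(score)
        if self.balance < cost:
            return False          # not enough tokens; the notification is skipped
        self.balance -= cost
        return True

# Usage sketch:
# account = TokenAccount()
# account.issue()                       # token issued at recording time
# account.consume_for_notification(11)  # high score -> cost 0 in this sketch
```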
  • FIG. 6 is a diagram illustrating an example of a notification screen in which the notification module 21 notifies the user terminal that a dangerous driving vehicle is present in the vicinity.
  • the user terminal 100 displays the content notified by the notification module 21 on its own display unit.
  • the user terminal 100 displays, as a notification screen, a text on the display unit of the user terminal 100 indicating that the dangerous driving vehicle is present in the vicinity. For example, the user terminal 100 displays "Dangerous driving vehicle is present around you. Drive carefully.” Also, at this time, the user terminal 100 converts the text to be displayed into a voice, and outputs the text after the voice conversion using a speaker or the like.
  • the content of the notification executed by the notification module 21 may be either display on the display unit of the user terminal or output by voice.
  • FIG. 7 is a diagram illustrating an example of a notification screen in which the notification module 21 notifies the user terminal that a dangerous driving vehicle is present in the vicinity.
  • the user terminal 100 displays the content notified by the notification module 21 on its own display unit.
  • the user terminal 100 displays the positions of the vehicle 110 provided with the camera device 10 and the dangerous driving vehicle 120 on a map as a notification screen.
  • the notification module 21 notifies the time information and the position information of the vehicle 110 by including them in the notification content.
  • Based on the current position of the vehicle 110 and the time information and position information specified for the dangerous driving vehicle 120, the user terminal 100 displays on a map the current position of the vehicle 110 and the current position of the dangerous driving vehicle 120, together with the text 130 described above. At this time, the above-mentioned sound may also be output.
  • The notification executed by the notification module 21 may only display the dangerous driving vehicle on the map, without any text display or voice output on the display unit of the user terminal. Alternatively, the notification may be either a display on the display unit of the user terminal or a voice output.
  • the above is the first dangerous driving vehicle notification processing.
  • FIG. 5 is a diagram illustrating a flowchart of the second dangerous driving vehicle notification process executed by the camera device 10. The processing executed by each module described above will be described together with this processing.
  • the time information acquisition module 42 acquires time information (step S40).
  • the processing in step S40 is the same as the processing in step S12 described above.
  • The position information acquisition module 20 acquires its own position information (step S41).
  • the process in step S41 is the same as the process in step S13 described above.
  • The inquiry module 49 queries the distributed ledger with the time information and position information acquired this time (step S42). In step S42, the inquiry module 49 checks, for example, the acquired time information and position information against the time information and position information of other vehicles recorded in the distributed ledger. Alternatively, the inquiry module 49 checks the hash values of the time information and position information against the hash values recorded in the distributed ledger.
  • As a result of the query, the inquiry module 49 determines whether a dangerous driving vehicle is in its own traveling direction or on the route to the destination (step S43). In step S43, the inquiry module 49 determines, based on the position information and time information of the dangerous driving vehicle, whether there is a dangerous driving vehicle that satisfies the condition. For example, it determines whether the dangerous driving vehicle is within a predetermined range in its own traveling direction, and whether the dangerous driving vehicle is on the route to the destination within the time required to reach the destination.
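  • A sketch of the spatial part of the check in step S43: "in the traveling direction" is treated as lying near a point a fixed distance ahead along the current heading, and "on the route" as lying within a corridor around the route waypoints. The distance approximation and all thresholds are assumptions; the time condition (whether the record is recent enough relative to the remaining travel time) would be an additional filter on the record's timestamp.

```python
import math
from typing import Iterable, Tuple

LatLon = Tuple[float, float]
METERS_PER_DEG = 111_320.0  # rough length of one degree of latitude

def _distance_m(a: LatLon, b: LatLon) -> float:
    """Approximate ground distance in metres (equirectangular approximation)."""
    dy = (b[0] - a[0]) * METERS_PER_DEG
    dx = (b[1] - a[1]) * METERS_PER_DEG * math.cos(math.radians((a[0] + b[0]) / 2.0))
    return math.hypot(dx, dy)

def in_traveling_direction(own: LatLon, heading_deg: float, danger: LatLon,
                           lookahead_m: float = 2_000, corridor_m: float = 500) -> bool:
    """True if the dangerous vehicle lies near a point `lookahead_m` ahead of us."""
    h = math.radians(heading_deg)  # heading measured clockwise from north
    ahead = (own[0] + (lookahead_m * math.cos(h)) / METERS_PER_DEG,
             own[1] + (lookahead_m * math.sin(h)) / (METERS_PER_DEG * math.cos(math.radians(own[0]))))
    return _distance_m(ahead, danger) <= corridor_m

def on_route(route: Iterable[LatLon], danger: LatLon, corridor_m: float = 500) -> bool:
    """True if the dangerous vehicle is within `corridor_m` of any route waypoint."""
    return any(_distance_m(p, danger) <= corridor_m for p in route)
```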
  • In step S43, when the inquiry module 49 determines that no dangerous driving vehicle is in its own traveling direction or on the route to the destination (step S43: NO), the camera device 10 ends this processing.
  • In step S43, when the inquiry module 49 determines that a dangerous driving vehicle is in its own traveling direction or on the route to the destination (step S43: YES), the notification module 21 notifies the user that a dangerous driving vehicle is in its traveling direction or on the route to the destination (step S44).
  • In step S44, the notification module 21 identifies the user terminal associated with the user by referring to a database or the like recorded in advance by the recording module 30, in which the identifier of the user is associated with the identifier of the user terminal, and notifies the identified user terminal that the dangerous driving vehicle is in its own traveling direction or on the route to the destination.
  • The notification module 21 notifies the user of this by notifying the user terminal that the dangerous driving vehicle is in its own traveling direction or on the route to the destination. At this time, the notification module 21 does so by displaying the message on the screen of the user terminal, outputting a sound, or both.
  • As in the processing of step S38 described above, the notification module 21 consumes the token issued when the vehicle number of the dangerous driving vehicle was recorded in the distributed ledger, and thereby notifies the user that the dangerous driving vehicle is in its own traveling direction or on the route to the destination.
  • FIG. 8 is a diagram illustrating an example of a notification screen in which the notification module 21 notifies the user terminal that the dangerously-driving vehicle exists in its own traveling direction.
  • the user terminal 100 displays the content notified by the notification module 21 on its own display unit.
  • the user terminal 100 displays, as a notification screen, a text on the display unit of the user terminal 100 indicating that a dangerous driving vehicle exists in the traveling direction of the vehicle provided with the camera device 10. For example, the user terminal 100 displays "Dangerous driving vehicle exists in your traveling direction. Drive carefully.” Also, at this time, the user terminal 100 converts the text to be displayed into a voice, and outputs the text after the voice conversion using a speaker or the like.
  • the notification module 21 also notifies the user terminal when the dangerous driving vehicle is on the route to the destination, as in FIG. 8 described above. At this time, for example, the user terminal 100 displays "Dangerous driving vehicle exists on the route to your destination. Drive carefully.” Also, at this time, the user terminal 100 converts the text to be displayed into a voice, and outputs the text after the voice conversion using a speaker or the like.
  • the content of the notification executed by the notification module 21 may be either display on the display unit of the user terminal or output by voice.
  • FIG. 9 is a diagram illustrating an example of a notification screen in which the notification module 21 notifies the user terminal that the dangerously driving vehicle is present on the route to the destination.
  • the user terminal 100 displays the content notified by the notification module 21 on its own display unit.
  • the user terminal 100 displays, on the map, the vehicle 110 provided with the camera device 10, the destination 200, the route 210 to the destination, and the position of the dangerous driving vehicle 120 as a notification screen.
  • the notification module 21 notifies the time information and the position information of the vehicle 110 by including them in the notification content.
  • Based on the current position of the vehicle 110 and the time information and position information specified for the dangerous driving vehicle 120, the user terminal 100 displays on a map the current position of the vehicle 110, the destination 200, the route 210 to the destination, and the current position of the dangerous driving vehicle 120, together with the text 220 described above. At this time, the sound described above may also be output.
  • the notification module 21 also notifies the user terminal when the dangerous driving vehicle exists in its own traveling direction, as in FIG. 9 described above. At this time, for example, the user terminal 100 displays the vehicle 110, the traveling direction, the current position of the dangerous driving vehicle, and the above-described text on a map. At this time, the sound described above may be output.
  • The notification executed by the notification module 21 may only display the dangerous driving vehicle on the map, without any text display or voice output on the display unit of the user terminal. Alternatively, the notification may be either a display on the display unit of the user terminal or a voice output.
  • the means and functions described above are implemented when a computer (including a CPU, an information processing device, and various terminals) reads and executes a predetermined program.
  • the program is provided, for example, in the form of being provided from a computer via a network (SaaS: Software as a Service).
  • the program is provided in a form recorded on a computer-readable recording medium such as a flexible disk, a CD (eg, a CD-ROM), and a DVD (eg, a DVD-ROM, a DVD-RAM).
  • the computer reads the program from the recording medium, transfers the program to an internal recording device or an external recording device, records the program, and executes the program.
  • the program may be recorded in advance on a recording device (recording medium) such as a magnetic disk, an optical disk, or a magneto-optical disk, and may be provided to the computer from the recording device via a communication line.

Landscapes

  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • Physics & Mathematics (AREA)
  • General Health & Medical Sciences (AREA)
  • Primary Health Care (AREA)
  • Health & Medical Sciences (AREA)
  • Economics (AREA)
  • Multimedia (AREA)
  • Human Resources & Organizations (AREA)
  • Marketing (AREA)
  • Signal Processing (AREA)
  • Strategic Management (AREA)
  • Tourism & Hospitality (AREA)
  • General Business, Economics & Management (AREA)
  • Theoretical Computer Science (AREA)
  • Traffic Control Systems (AREA)
  • Closed-Circuit Television Systems (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

A computer system comprising a camera device (10) provided in a vehicle and a distributed ledger, the computer system: acquiring images, taken by the camera device (10), of another vehicle in the vicinity; analyzing the images to identify the vehicle number of the nearby other vehicle; analyzing the images to determine whether or not the nearby other vehicle is being driven dangerously; recording the identified vehicle number in the distributed ledger as a dangerously driven vehicle if it is determined that the nearby other vehicle is being driven dangerously; analyzing the images to detect whether or not another vehicle has switched places with the nearby other vehicle; if it is detected that another vehicle has switched places with the nearby other vehicle, checking the distributed ledger for the vehicle number of the vehicle that switched places with it; and notifying a user terminal that the dangerously driven vehicle is nearby if the result of the check indicates that the vehicle number of the vehicle that switched places with the nearby other vehicle is recorded in the distributed ledger.
PCT/JP2018/035343 2018-09-25 2018-09-25 Computer system, dangerous driving vehicle notification method, and program WO2020065708A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/JP2018/035343 WO2020065708A1 (fr) 2018-09-25 2018-09-25 Computer system, dangerous driving vehicle notification method, and program
JP2020547623A JPWO2020065708A1 (ja) 2018-09-25 2018-09-25 Computer system, dangerous driving vehicle notification method, and program

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2018/035343 WO2020065708A1 (fr) 2018-09-25 2018-09-25 Computer system, dangerous driving vehicle notification method, and program

Publications (1)

Publication Number Publication Date
WO2020065708A1 true WO2020065708A1 (fr) 2020-04-02

Family

ID=69950384

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/035343 WO2020065708A1 (fr) 2018-09-25 2018-09-25 Computer system, dangerous driving vehicle notification method, and program

Country Status (2)

Country Link
JP (1) JPWO2020065708A1 (fr)
WO (1) WO2020065708A1 (fr)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112256033A (zh) * 2020-10-30 2021-01-22 安徽江淮汽车集团股份有限公司 驾驶状态识别方法、设备、存储介质及装置
CN112863175A (zh) * 2020-12-31 2021-05-28 平安科技(深圳)有限公司 汽车道路监测数据处理方法、装置、设备及存储介质
WO2023084814A1 (fr) * 2021-11-10 2023-05-19 日本電信電話株式会社 Système de communication, serveur, procédé de communication et programme de communication

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2011197947A (ja) * 2010-03-18 2011-10-06 Fujitsu Ltd 車両間隔管理装置、車両間隔管理方法および車両間隔管理プログラム
JP2014203398A (ja) * 2013-04-09 2014-10-27 株式会社デンソー 危険車両通知装置、危険車両通知プログラム、危険車両通知プログラムを記録した記録媒体
JP2016139281A (ja) * 2015-01-28 2016-08-04 三菱自動車工業株式会社 運転支援装置
US20170046952A1 (en) * 2015-08-12 2017-02-16 Inventec (Pudong) Technology Corp. Dangerous Vehicle Warning Method and Dangerous Vehicle Warning System
JP2017151546A (ja) * 2016-02-22 2017-08-31 パナソニックIpマネジメント株式会社 安全運転支援装置、および、制御方法
CN107870983A (zh) * 2017-09-30 2018-04-03 深圳市易成自动驾驶技术有限公司 基于区块链的车辆违章信息管理方法、区块链及存储介质

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006119767A (ja) * 2004-10-19 2006-05-11 Nec Corp 違反車両通報システム、取締サーバ、及び違反車両通報プログラム
JPWO2007080921A1 (ja) * 2006-01-13 2009-06-11 日本電気株式会社 情報記録システム、情報記録装置、情報記録方法及び情報収集プログラム
US10024684B2 (en) * 2014-12-02 2018-07-17 Operr Technologies, Inc. Method and system for avoidance of accidents
JPWO2016113986A1 (ja) * 2015-01-14 2017-06-22 オムロン株式会社 通報受付システム及び通報受付方法
KR101906709B1 (ko) * 2017-01-24 2018-10-10 인하대학교 산학협력단 커넥티드 블랙박스

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2011197947A (ja) * 2010-03-18 2011-10-06 Fujitsu Ltd 車両間隔管理装置、車両間隔管理方法および車両間隔管理プログラム
JP2014203398A (ja) * 2013-04-09 2014-10-27 株式会社デンソー 危険車両通知装置、危険車両通知プログラム、危険車両通知プログラムを記録した記録媒体
JP2016139281A (ja) * 2015-01-28 2016-08-04 三菱自動車工業株式会社 運転支援装置
US20170046952A1 (en) * 2015-08-12 2017-02-16 Inventec (Pudong) Technology Corp. Dangerous Vehicle Warning Method and Dangerous Vehicle Warning System
JP2017151546A (ja) * 2016-02-22 2017-08-31 パナソニックIpマネジメント株式会社 安全運転支援装置、および、制御方法
CN107870983A (zh) * 2017-09-30 2018-04-03 深圳市易成自动驾驶技术有限公司 基于区块链的车辆违章信息管理方法、区块链及存储介质

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112256033A (zh) * 2020-10-30 2021-01-22 安徽江淮汽车集团股份有限公司 驾驶状态识别方法、设备、存储介质及装置
CN112863175A (zh) * 2020-12-31 2021-05-28 平安科技(深圳)有限公司 汽车道路监测数据处理方法、装置、设备及存储介质
CN112863175B (zh) * 2020-12-31 2022-11-22 平安科技(深圳)有限公司 汽车道路监测数据处理方法、装置、设备及存储介质
WO2023084814A1 (fr) * 2021-11-10 2023-05-19 日本電信電話株式会社 Système de communication, serveur, procédé de communication et programme de communication

Also Published As

Publication number Publication date
JPWO2020065708A1 (ja) 2021-08-30

Similar Documents

Publication Publication Date Title
JP4268208B2 (ja) 車両画像データ生成プログラムおよび車両画像データ生成装置
WO2020065708A1 (fr) Système informatique, procédé de notification de conduite imprudente de véhicule et programme
CN107705552B (zh) 一种应急车道占用行为检测方法、装置及系统
JP2016146162A (ja) 運転判定装置、運転判定プログラム、演算システム、検知装置、検知システム、検知方法及びプログラム
JP2022508551A (ja) ビデオ監視及びオブジェクト認識
CN110580808B (zh) 一种信息处理方法、装置、电子设备及智能交通系统
KR102085489B1 (ko) 하이브리드 음주측정관리 시스템
US9536157B2 (en) Method for identification of a projected symbol on a street in a vehicle, apparatus and vehicle
CN108182815A (zh) 车辆智能提醒方法
Hakim et al. Implementation of an image processing based smart parking system using Haar-Cascade method
WO2016201867A1 (fr) Procédé et appareil d'identification de mise en réseau de véhicules m2m
CN109697473A (zh) 一种施工隧道车辆违章的检测方法、计算机装置以及计算机可读存储介质
CN111985304A (zh) 巡防告警方法、系统、终端设备及存储介质
CN109325755A (zh) 基于汽车轮毂的电子计费系统
JP2022084726A (ja) 運転支援装置、方法及びプログラム
US10147013B2 (en) Method and apparatus for crowdsourced vehicle identification
US20210118243A1 (en) Vehicular communications through identifiers and online systems
CN112419366B (zh) 车辆的追踪方法、系统及计算机可读存储介质
KR102250077B1 (ko) 차량번호판 인식률을 모니터링 하는 방법, 장치 및 컴퓨터 판독가능 저장 매체
WO2021075277A1 (fr) Dispositif de traitement d'informations, procédé et programme
TW201720692A (zh) 辨識交通號誌以提示駕駛人之系統
CN106778891B (zh) 一种基于行车影像分析的智能通信方法
CN112417922B (zh) 目标识别方法和装置
JP2020042489A (ja) ナンバープレート識別装置、ナンバープレート識別方法及びプログラム
TWI768453B (zh) 停車管理方法、管理裝置及存儲介質

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18935310

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2020547623

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18935310

Country of ref document: EP

Kind code of ref document: A1