CN117022307A - Method and device for controlling vehicle based on wearable equipment - Google Patents

Info

Publication number
CN117022307A
Authority
CN
China
Prior art keywords
vehicle
pedestrian
wearable device
identification
information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202311095534.2A
Other languages
Chinese (zh)
Inventor
余凯
王怀章
王超银
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Anting Horizon Intelligent Transportation Technology Co ltd
Original Assignee
Shanghai Anting Horizon Intelligent Transportation Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Shanghai Anting Horizon Intelligent Transportation Technology Co ltd filed Critical Shanghai Anting Horizon Intelligent Transportation Technology Co ltd
Priority to CN202311095534.2A priority Critical patent/CN117022307A/en
Publication of CN117022307A publication Critical patent/CN117022307A/en

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00 Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/0098 Details of control systems ensuring comfort, safety or stability not otherwise provided for
    • G PHYSICS
    • G08 SIGNALLING
    • G08C TRANSMISSION SYSTEMS FOR MEASURED VALUES, CONTROL OR SIMILAR SIGNALS
    • G08C17/00 Arrangements for transmitting signals characterised by the use of a wireless electrical link
    • G08C17/02 Arrangements for transmitting signals characterised by the use of a wireless electrical link using a radio link
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2554/00 Input parameters relating to objects
    • B60W2554/40 Dynamic objects, e.g. animals, windblown objects
    • B60W2554/402 Type
    • B60W2554/4029 Pedestrians

Abstract

The embodiment of the disclosure discloses a method and a device for controlling a vehicle based on a wearable device, a computer-readable storage medium, and an electronic device. The method for controlling the vehicle based on the wearable device comprises the following steps: acquiring sensing information about the current environment collected by a pedestrian's wearable device; and, based on the sensing information, adjusting a physical parameter of an identification on the wearable device, the physical parameter being used to generate a prompt for vehicles around the pedestrian and comprising at least one of: color, brightness, shape. According to the embodiment of the disclosure, the wearable device automatically adjusts the identification so that the vehicle obtains a clear and reliable identification image, improving the accuracy of recognition by the vehicle. In addition, the wearable device can indicate the pedestrian's environment to the vehicle, so that the vehicle can adjust its driving behavior in a targeted manner according to the recognized identification, reducing the risk of dangerous situations for both the pedestrian and the vehicle.

Description

Method and device for controlling vehicle based on wearable equipment
This application is a divisional application of the patent application with application number CN 202111109541.4, entitled "Vehicle control method and method for controlling a vehicle based on a wearable device".
Technical Field
The disclosure relates to the field of computer technology, and in particular to a vehicle control method and apparatus, a method and apparatus for controlling a vehicle based on a wearable device, a computer-readable storage medium, and an electronic device.
Background
As the most promising technology for future smart cities and a strategic direction for the development of the automotive industry, intelligent driving faces great challenges. Pedestrian detection, as a key component of intelligent driving in the smart city, strongly supports automatic driving applications and advances the construction of intelligent transportation.
Current pedestrian detection methods generally acquire a pedestrian's position via sensors mounted on the vehicle or at intersections and the like, or recognize the pedestrian's position from captured images using computer vision techniques.
Disclosure of Invention
Embodiments of the present disclosure provide a vehicle control method and apparatus, a method and apparatus for controlling a vehicle based on a wearable device, a computer-readable storage medium, and an electronic device.
Embodiments of the present disclosure provide a vehicle control method including: determining the identification of a pedestrian's wearable device based on an environment image acquired by a vehicle during driving; determining a control strategy corresponding to the identification and used for controlling the vehicle; and controlling the vehicle to perform the driving behavior corresponding to the control strategy.
According to another aspect of an embodiment of the present disclosure, there is provided a method of controlling a vehicle based on a wearable device, the method comprising: acquiring sensing information about the current environment collected by a pedestrian's wearable device; and, based on the sensing information, adjusting a physical parameter of the identification on the wearable device, the physical parameter being used to generate a prompt for vehicles around the pedestrian.
According to another aspect of an embodiment of the present disclosure, there is provided a vehicle control apparatus including: a first determining module configured to determine the identification of a pedestrian's wearable device based on an environment image acquired by the vehicle during driving; a second determining module configured to determine a control strategy corresponding to the identification and used for controlling the vehicle; and a first control module configured to control the vehicle to perform the driving behavior corresponding to the control strategy.
According to another aspect of an embodiment of the present disclosure, there is provided an apparatus for controlling a vehicle based on a wearable device, the apparatus including: an acquisition module configured to acquire sensing information about the current environment collected by the pedestrian's wearable device; and an adjusting module configured to adjust, based on the sensing information, a physical parameter of the identification on the wearable device, the physical parameter being used to generate a prompt for vehicles around the pedestrian.
According to another aspect of the embodiments of the present disclosure, there is provided a computer-readable storage medium storing a computer program for executing the above-described vehicle control method or a method of controlling a vehicle based on a wearable device.
According to another aspect of an embodiment of the present disclosure, there is provided an electronic device including: a processor; a memory for storing processor-executable instructions; and the processor is used for reading the executable instructions from the memory and executing the instructions to realize the vehicle control method or the method for controlling the vehicle based on the wearable device.
According to the vehicle control method and apparatus, the method and apparatus for controlling a vehicle based on a wearable device, the computer-readable storage medium, and the electronic device of the embodiments of the present disclosure, the identification of a pedestrian's wearable device is determined from an environment image acquired by the vehicle during driving; a control strategy corresponding to the identification and used for controlling the vehicle is then determined; and the vehicle is controlled to perform the driving behavior corresponding to the control strategy. No large number of sensors needs to be deployed on the vehicle or the road, and no complex pedestrian recognition algorithm is needed to identify pedestrians from images: the pedestrian's position can be detected simply by recognizing the identification on the pedestrian from the captured environment image. This reduces hardware cost, reduces the data processing load of image recognition, alleviates the problem of data redundancy, and thereby improves the efficiency of pedestrian detection, the responsiveness of the vehicle's driving behavior to pedestrian detection results, and the safety of intelligent driving. The wearable device can automatically adjust the identification so that the vehicle obtains a clear and reliable identification image, improving the accuracy of recognition by the vehicle. The wearable device can also adapt the identification's physical parameters to the pedestrian's environment and thereby indicate that environment to the vehicle, so that the vehicle can adjust its driving behavior in a targeted manner according to the identification, reducing the risk of dangerous situations for both the pedestrian and the vehicle.
The technical scheme of the present disclosure is described in further detail below through the accompanying drawings and examples.
Drawings
The above and other objects, features and advantages of the present disclosure will become more apparent by describing embodiments thereof in more detail with reference to the accompanying drawings. The accompanying drawings are included to provide a further understanding of embodiments of the disclosure, and are incorporated in and constitute a part of this specification, illustrate embodiments of the disclosure and together with the description serve to explain the disclosure, without limitation to the disclosure. In the drawings, like reference numerals generally refer to like parts or steps.
Fig. 1A is a system diagram to which the present disclosure is applicable.
Fig. 1B is a schematic diagram of an application scenario of a system architecture to which the present disclosure is applicable.
Fig. 1C, 1D are exemplary schematic diagrams of an identification on a wearable device provided by an exemplary embodiment of the present disclosure.
Fig. 2 is a flow chart of a vehicle control method provided in an exemplary embodiment of the present disclosure.
Fig. 3 is a flow chart of a vehicle control method provided in another exemplary embodiment of the present disclosure.
Fig. 4 is a flow chart of a vehicle control method provided in yet another exemplary embodiment of the present disclosure.
Fig. 5A is a schematic diagram illustrating identification of a male child provided in an exemplary embodiment of the present disclosure.
Fig. 5B is a schematic diagram illustrating identification of a female child provided in an exemplary embodiment of the present disclosure.
Fig. 5C is a schematic diagram illustrating identification of a male elderly person provided in an exemplary embodiment of the present disclosure.
Fig. 5D is a schematic diagram illustrating identification of a female elderly person provided in an exemplary embodiment of the present disclosure.
Fig. 6 is a flow chart of a method for controlling a vehicle based on a wearable device according to an exemplary embodiment of the present disclosure.
Fig. 7 is a flow chart of a method of controlling a vehicle based on a wearable device provided in another exemplary embodiment of the present disclosure.
Fig. 8 is a schematic structural view of a vehicle control apparatus provided in an exemplary embodiment of the present disclosure.
Fig. 9 is a schematic structural view of a vehicle control apparatus provided in another exemplary embodiment of the present disclosure.
Fig. 10 is a schematic structural view of an apparatus for controlling a vehicle based on a wearable device according to an exemplary embodiment of the present disclosure.
Fig. 11 is a schematic structural view of an apparatus for controlling a vehicle based on a wearable device according to another exemplary embodiment of the present disclosure.
Fig. 12 is a block diagram of an electronic device provided in an exemplary embodiment of the present disclosure.
Detailed Description
Hereinafter, example embodiments according to the present disclosure will be described in detail with reference to the accompanying drawings. It should be apparent that the described embodiments are only some of the embodiments of the present disclosure and not all of the embodiments of the present disclosure, and that the present disclosure is not limited by the example embodiments described herein.
It should be noted that: the relative arrangement of the components and steps, numerical expressions and numerical values set forth in these embodiments do not limit the scope of the present disclosure unless it is specifically stated otherwise.
It will be appreciated by those of skill in the art that the terms "first," "second," etc. in embodiments of the present disclosure are used merely to distinguish between different steps, devices or modules, etc., and do not represent any particular technical meaning nor necessarily logical order between them.
It should also be understood that in embodiments of the present disclosure, "plurality" may refer to two or more, and "at least one" may refer to one, two or more.
It should also be appreciated that any component, data, or structure referred to in the presently disclosed embodiments may be generally understood as one or more without explicit limitation or the contrary in the context.
In addition, the term "and/or" in this disclosure merely describes an association relationship between associated objects, indicating that three relationships may exist; for example, A and/or B may indicate: A exists alone, A and B exist together, or B exists alone. In addition, the character "/" in this disclosure generally indicates that the associated objects before and after it are in an "or" relationship.
It should also be understood that the description of the various embodiments of the present disclosure emphasizes the differences between the various embodiments, and that the same or similar features may be referred to each other, and for brevity, will not be described in detail.
Meanwhile, it should be understood that the sizes of the respective parts shown in the drawings are not drawn in actual scale for convenience of description.
The following description of at least one exemplary embodiment is merely illustrative in nature and is in no way intended to limit the disclosure, its application, or uses.
Techniques, methods, and apparatus known to one of ordinary skill in the relevant art may not be discussed in detail, but are intended to be part of the specification where appropriate.
It should be noted that: like reference numerals and letters denote like items in the following figures, and thus once an item is defined in one figure, no further discussion thereof is necessary in subsequent figures.
Embodiments of the present disclosure may be applicable to electronic devices such as terminal devices, computer systems, servers, etc., which may operate with numerous other general purpose or special purpose computing system environments or configurations. Examples of well known terminal devices, computing systems, environments, and/or configurations that may be suitable for use with the terminal device, computer system, server, or other electronic device include, but are not limited to: personal computer systems, server computer systems, thin clients, thick clients, hand-held or laptop devices, microprocessor-based systems, set-top boxes, programmable consumer electronics, network personal computers, minicomputer systems, mainframe computer systems, and distributed cloud computing technology environments that include any of the above systems, and the like.
Electronic devices such as terminal devices, computer systems, servers, etc. may be described in the general context of computer system-executable instructions, such as program modules, being executed by a computer system. Generally, program modules may include routines, programs, objects, components, logic, data structures, etc., that perform particular tasks or implement particular abstract data types. The computer system/server may be implemented in a distributed cloud computing environment in which tasks are performed by remote processing devices that are linked through a communications network. In a distributed cloud computing environment, program modules may be located in both local and remote computing system storage media including memory storage devices.
Summary of the application
Currently, methods by which a vehicle detects pedestrians generally deploy pedestrian-detection sensors on the vehicle or at intersections, but the coverage of each such sensor is small. To guarantee detection quality and meet pedestrian-recognition requirements, a large number of sensors often has to be deployed on vehicles, at intersections, and the like to build a fully covering sensor network; the resulting redundant nodes cause information redundancy, mutual interference, and wasted hardware cost. Conversely, if too few sensor nodes are deployed in the detection area, detection blind zones may exist, causing missed detections. Moreover, differences in pedestrians' postures and clothing lead to missed and false detections, seriously degrading pedestrian detection accuracy and creating safety hazards.
Therefore, it is necessary to design a new pedestrian recognition method that can be deployed with fewer sensor nodes, reduce information redundancy, and still guarantee the accuracy of overall pedestrian detection.
Exemplary System
Fig. 1A illustrates an exemplary system architecture 100 of a vehicle control method and apparatus, a method and apparatus for controlling a vehicle based on a wearable device, to which embodiments of the present disclosure may be applied.
As shown in fig. 1A, the system architecture 100 may include a vehicle 101, a network 102, a server 103, and a wearable device 104 on a pedestrian. The vehicle 101 is provided with a camera 1011, and the wearable device 104 is provided with an identification; the camera 1011 can capture the surroundings of the vehicle, and the captured environment image includes the identification on the wearable device 104. The number and type of cameras 1011 may be set arbitrarily; for example, the camera 1011 may be a 360-degree panoramic camera, in which case the environment image is a panoramic image. The vehicle 101 may include a terminal device 1012, and the network 102 provides the medium for communication links between the terminal device 1012 and the server 103.
A user may interact with the server 103 via the network 102 using the terminal device 1012 to receive or send messages, etc. Various communication client applications such as an image recognition application, a navigation class application, a search class application, an instant messaging tool, etc. may be installed on the terminal device 1012.
The terminal device 1012 may be various electronic devices including, but not limited to, an in-vehicle terminal (e.g., a center control device of a vehicle), a mobile phone, a notebook computer, a digital broadcast receiver, a PDA (personal digital assistant), a PAD (tablet computer), a PMP (portable multimedia player), and the like.
The server 103 may be a server providing various services, such as a background image server that recognizes images uploaded by the terminal device 1012. The background image server may recognize the received image, generate a signal for controlling driving behavior of the vehicle according to the recognition result, and feed back to the terminal device 1012.
The wearable device 104 can collect sensing information through a camera, a range finder, and the like arranged on it, and can adjust the physical parameters of the identification based on the sensing information, so that the identification can be clearly captured by the camera 1011 on the vehicle.
It should be noted that, the vehicle control method provided by the embodiment of the present disclosure may be executed by the server 103 or may be executed by the terminal device 1012, and accordingly, the vehicle control apparatus may be provided in the server 103 or may be provided in the terminal device 1012. The method of controlling a vehicle based on a wearable device provided by the embodiments of the present disclosure is generally performed by the wearable device 104, and accordingly, the apparatus of controlling a vehicle based on a wearable device is generally disposed in the wearable device 104.
It should be understood that the number of vehicles 101, cameras 1011, terminal devices 1012, networks 102, servers 103, and wearable devices 104 in fig. 1A are merely illustrative. There may be any number of vehicles 101, cameras 1011, terminal devices 1012, networks 102, servers 103, and wearable devices 104, as desired for implementation. When the execution subject of the vehicle control method is the terminal device 1012, the system architecture described above may not include the network 102 and the server 103, but only the vehicle 101, the camera 1011, the terminal device 1012, and the wearable device 104.
As shown in fig. 1B, which is a schematic diagram of an application scenario of the system architecture, a vehicle 101 is traveling on a road and a pedestrian 105 appears in front of the vehicle. The pedestrian 105 carries a wearable device; for example, the wearable device is a garment printed with a specific identification representing the identity of the pedestrian 105. As shown in fig. 1C and fig. 1D, the pedestrian's clothing serves as the wearable device: the wearable device shown in fig. 1C is printed with an identification having a circular pattern, and the wearable device shown in fig. 1D is printed with an identification having an "x" pattern. When the camera on the vehicle 101 captures the identification on the pedestrian's clothing, the terminal device on the vehicle executes the above-described vehicle control method. By recognizing the position and movement track of the identification, it determines that the pedestrian 105 is currently walking in the direction indicated by the broken-line arrow in the figure and that this path will intersect the travel track of the vehicle 101. The terminal device determines a control strategy for the vehicle based on the recognition result and controls the vehicle to perform the driving behavior corresponding to the control strategy, for example, decelerating and bypassing the pedestrian in the direction indicated by the solid arrow in the figure.
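The patent does not specify how the intersection of the pedestrian's and the vehicle's tracks is predicted. A minimal sketch, assuming constant-velocity extrapolation of both tracks and a closest-approach test (the function name, safety radius, and horizon are illustrative placeholders):

```python
import math

def predict_collision(ped_pos, ped_vel, veh_pos, veh_vel,
                      safety_radius=2.0, horizon_s=5.0):
    """Constant-velocity closest-approach test between the pedestrian's
    track and the vehicle's travel track.

    Positions are (x, y) in metres, velocities in m/s. Returns True if
    the extrapolated tracks come within safety_radius of each other
    within the time horizon.
    """
    # Position and velocity of the pedestrian relative to the vehicle.
    rx, ry = ped_pos[0] - veh_pos[0], ped_pos[1] - veh_pos[1]
    vx, vy = ped_vel[0] - veh_vel[0], ped_vel[1] - veh_vel[1]
    v2 = vx * vx + vy * vy
    if v2 == 0.0:
        t_star = 0.0  # no relative motion: separation never changes
    else:
        # Time of closest approach, clamped to the prediction horizon.
        t_star = max(0.0, min(horizon_s, -(rx * vx + ry * vy) / v2))
    return math.hypot(rx + vx * t_star, ry + vy * t_star) < safety_radius
```

For instance, a pedestrian 20 m ahead and 3 m to the side, crossing at 1.5 m/s while the vehicle closes at 10 m/s, would be flagged as a conflict under these assumed parameters.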
Meanwhile, the wearable device on the pedestrian 105 may include various sensors, such as a camera and a range finder. The sensors collect sensing information about the surroundings of the pedestrian 105; based on this information, the wearable device executes the above-described method of controlling a vehicle based on a wearable device and adjusts the physical parameters of the identification to generate a prompt for vehicles around the pedestrian 105.
Exemplary method
Fig. 2 is a flow chart of a vehicle control method provided in an exemplary embodiment of the present disclosure. The present embodiment may be applied to an electronic device (such as the terminal device 1012 or the server 103 shown in fig. 1A), and as shown in fig. 2, the method includes the following steps:
Step 201, determining the identification of the pedestrian's wearable device based on the environment image acquired by the vehicle during driving.
In this embodiment, the electronic device may determine the identification of the pedestrian's wearable device based on the environment image acquired by the vehicle during driving. The environment image may be acquired by a camera 1011 as shown in fig. 1A and may include a pedestrian, where the pedestrian wears a wearable device 104 as shown in fig. 1A on which an identification is disposed. The identification can take the form of graphics, symbols, characters, two-dimensional codes, and the like in various shapes. As shown in fig. 1C and fig. 1D, the pedestrian's clothing serves as the wearable device: the wearable device shown in fig. 1C is printed with an identification having a circular pattern, and the wearable device shown in fig. 1D is printed with an identification having an "x" pattern.
The electronic device may recognize the identification from the environment image using an existing image recognition method (e.g., a neural-network-based method), and may obtain attributes such as the distance between the identification and the vehicle (e.g., determined from the identification's coordinates in the image and pre-calibrated camera parameters) and the relative speed between the identification and the vehicle. Since the identification is on a wearable device worn by the pedestrian, the position of the recognized identification can be taken as the position of the pedestrian.
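The patent attributes the distance computation to pre-calibrated camera parameters without giving a formula. One common realization is the pinhole projection model, sketched below under the assumption that the identification's physical size is known (the function name and values are illustrative):

```python
def estimate_distance_m(marker_height_m, marker_height_px, focal_length_px):
    """Pinhole-model range estimate: distance = f * H / h, where f is
    the calibrated focal length in pixels, H the identification's
    physical height in metres, and h its height in the captured image
    in pixels."""
    if marker_height_px <= 0:
        raise ValueError("identification not visible in the image")
    return focal_length_px * marker_height_m / marker_height_px
```

For example, a 0.3 m identification imaged at 60 px by a camera with a 1000 px focal length yields an estimated range of 5 m.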
Step 202, determining a control strategy for controlling the vehicle corresponding to the identification.
In this embodiment, the electronic device may determine the control strategy corresponding to the identification and used for controlling the vehicle. Specifically, the control strategy of the vehicle is determined according to the recognized position, speed, and other information of the identification. There are multiple kinds of control strategies: for example, when the relative speed between the identification and the vehicle is greater than a preset speed and the distance is less than a preset distance, the control strategy is a braking strategy; based on information such as the moving direction and moving speed of the identification and of the vehicle, it can be predicted whether the vehicle will collide with the pedestrian, and if a collision is predicted, the control strategy is a deceleration strategy.
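These rules can be condensed into a small decision function. The thresholds and strategy names below are assumptions for the sketch, not values specified in the patent:

```python
def select_strategy(rel_speed_mps, distance_m, collision_predicted,
                    speed_threshold=8.0, distance_threshold=15.0):
    """Map measurements of the identification to a control strategy.

    Rule order mirrors the text: braking takes priority when the
    identification is both fast-closing and near; otherwise a
    predicted collision triggers deceleration.
    """
    if rel_speed_mps > speed_threshold and distance_m < distance_threshold:
        return "brake"
    if collision_predicted:
        return "decelerate"
    return "maintain"
```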
Step 203, controlling the vehicle to perform the driving behavior corresponding to the control strategy.
In the present embodiment, the electronic apparatus may control the vehicle to perform driving behavior corresponding to the control strategy. For example, when the control strategy is a braking strategy, the electronic device may send a braking signal to a control system of the vehicle, and the control system of the vehicle may control braking of the vehicle according to the braking signal. For another example, when the control strategy is a deceleration strategy, the electronic device may send a deceleration signal to a control system of the vehicle, and the control system of the vehicle may control the vehicle to decelerate according to the deceleration signal.
According to the method provided by the embodiments of the present disclosure, the identification of a pedestrian's wearable device is determined from the environment image acquired by the vehicle during driving; a control strategy corresponding to the identification and used for controlling the vehicle is then determined; and the vehicle is controlled to perform the driving behavior corresponding to the control strategy. No large number of sensors needs to be deployed on the vehicle or the road, and no complex pedestrian recognition algorithm is needed to identify pedestrians from images: the pedestrian's position can be detected simply by recognizing the identification on the pedestrian from the captured environment image. This reduces hardware cost, reduces the data processing load of image recognition, alleviates the problem of data redundancy, and thereby improves the efficiency of pedestrian detection, the responsiveness of the vehicle's driving behavior to pedestrian detection results, and the safety of intelligent driving.
In some alternative implementations, step 202 may be performed as follows:
First, the physical parameter of the identification is determined.
As an example, the physical parameter may include, but is not limited to, at least one of: color, brightness, shape, speed of movement, distance from the vehicle, etc.
The electronic device can determine the color, brightness, shape, and the like of the identification by recognizing the identification image captured by the camera 1011 shown in fig. 1A. The distance between the vehicle and the identification may also be determined using image-recognition-based methods (e.g., depth map recognition). The movement speed of the identification is determined from the change in the distances determined at different moments. In addition to image-recognition-based methods, other sensors (e.g., laser radar, ultrasonic rangefinder, etc.) may be used to determine physical parameters of the identification such as its distance from the vehicle and its speed.
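The speed-from-distance-change step just described can be sketched as a finite-difference estimate; the function name and sample values are hypothetical:

```python
def movement_speed(distances_m, timestamps_s):
    """Finite-difference estimate of the identification's radial speed
    from distances measured at successive moments, averaged over all
    consecutive pairs. A negative value means the identification is
    approaching the vehicle."""
    if len(distances_m) < 2 or len(distances_m) != len(timestamps_s):
        raise ValueError("need at least two matched (distance, time) samples")
    rates = [
        (distances_m[i] - distances_m[i - 1]) / (timestamps_s[i] - timestamps_s[i - 1])
        for i in range(1, len(distances_m))
    ]
    return sum(rates) / len(rates)
```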
Then, a control strategy for controlling the vehicle is determined based on the physical parameter.
The physical parameter of the identification may vary. For example, the identification may be displayed by an LED array, a flexible display, or the like, so that its physical parameters can be adjusted, and the control strategy can be adjusted in response to changes in a physical parameter: different colors of the identification correspond to different control strategies, and different brightnesses correspond to different control strategies. As an example, the brightness of the identification may change with the illumination conditions around the pedestrian. When illumination is poor, the brightness of the identification increases, the difference between the identification's brightness and its surroundings in the captured environment image becomes large, and the control strategy is to drive at reduced speed.
In this way, the physical parameters of the identification are recognized and the vehicle's control strategy is determined from them, so that the control strategy is adjusted in a targeted manner according to the characteristics of the identification, improving the accuracy of vehicle control.
In some alternative implementations, the electronic device can determine the identified physical parameter as follows:
first, the shape and color of the logo are identified.
Specifically, the electronic apparatus may recognize the identification image photographed by the camera 1011 as shown in fig. 1A to determine the shape and color of the identification.
The identified physical parameter is then determined based on the shape and color.
Specifically, the shape and color may be determined as the physical parameter of the logo, and the brightness of the logo image may be determined as the physical parameter from the RGB components of the color.
For example, a distance sensing device may be provided on the wearable device and the identification may be displayed by the LED array. When the pedestrian is sensed to be close to the vehicle, the color of the identification can change to red; the electronic device recognizes the red identification, determines that the control strategy of the vehicle is a braking or deceleration strategy, and then sends out a control signal for braking or decelerating the vehicle. For another example, different types of pedestrians are provided with different identifications; for example, the elderly, children, and adults can respectively correspond to identifications with different shapes, and the electronic device can recognize the type of identification, determine the identity of the pedestrian, and further determine different control strategies. For example, when the pedestrian is determined to be elderly by recognizing the shape of the identification, the control strategy may be: reducing speed, raising the horn volume, flashing warning lights, etc.
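The color- and shape-based strategy selection described above can be sketched as follows. This is an illustrative sketch, not the patent's implementation: the color names, shapes, and action labels are assumptions.

```python
# Illustrative sketch of mapping a recognized identification's color and
# shape to vehicle control actions. Color names, shapes, and action
# labels are assumptions, not from the patent.

# Shape of the identification encodes the pedestrian's identity type.
SHAPE_TO_IDENTITY = {
    "triangle": "elderly",
    "circle": "child",
    "square": "adult",
}

def choose_strategy(color, shape):
    """Return the list of control actions for a recognized identification."""
    actions = []
    if color == "red":  # the wearable turned the sign red: pedestrian is close
        actions.append("brake")
    identity = SHAPE_TO_IDENTITY.get(shape, "unknown")
    if identity == "elderly":
        # Per the text: reduce speed, raise horn volume, flash warning lights.
        actions += ["decelerate", "raise_horn_volume", "flash_warning_lights"]
    elif identity == "child":
        actions += ["decelerate", "low_volume_horn"]
    return actions

print(choose_strategy("red", "triangle"))
# → ['brake', 'decelerate', 'raise_horn_volume', 'flash_warning_lights']
```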
According to this implementation, the physical parameters are embodied as shape and color, so that recognition of the identification is simpler and the efficiency of determining the control strategy is improved.
In some alternative implementations, as shown in fig. 3, step 203 may be performed as follows:
step 2031, predicting a motion track of a pedestrian based on physical parameters to obtain at least one predicted track.
The physical parameter in this embodiment may include information such as the identified moving speed, moving direction, and the current distance from the vehicle. Based on the moving speed and the moving direction, a moving track of the pedestrian for a period of time (for example, 3 seconds) in the future can be calculated.
Step 2032, determining a current motion state of the vehicle.
The current motion state of the vehicle may include information such as a running speed, a running direction, a current position (e.g., latitude and longitude information) of the vehicle, and the like.
Step 2033, predicting a dangerous state between the vehicle and the pedestrian based on the at least one predicted trajectory and the motion state.
As an example, the dangerous state may indicate whether the vehicle collides with a pedestrian. Specifically, when a predicted trajectory intersecting a future movement trajectory of the vehicle exists in the at least one predicted trajectory, the dangerous state may be represented by information indicating that the vehicle is about to collide with the pedestrian. When a predicted trajectory intersecting a future movement trajectory of the vehicle does not exist in the at least one predicted trajectory, the dangerous state may be represented by information indicating that the vehicle does not collide with the pedestrian.
Step 2034, controlling the vehicle to perform a risk avoidance operation based on the dangerous state.
As an example, when the dangerous state indicates that the pedestrian will collide with the vehicle, the vehicle is controlled to brake urgently. For another example, when the dangerous state indicates that the pedestrian and the vehicle will not collide but the distance between them is smaller than a preset distance, the vehicle is controlled to slow down.
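Steps 2031 through 2034 can be sketched as a simple constant-velocity trajectory check. The 3-second horizon from the text is kept; the sampling step, safety radius, and the constant-velocity motion model are illustrative assumptions.

```python
# Hedged sketch of steps 2031-2034: extrapolate pedestrian and vehicle
# positions from recognized speed/direction and flag a dangerous state
# when the two paths come within a safety radius inside the horizon.
import math

def predict_track(pos, speed, heading_rad, horizon_s=3.0, step_s=0.1):
    """Constant-velocity trajectory: list of (x, y) sample points."""
    vx = speed * math.cos(heading_rad)
    vy = speed * math.sin(heading_rad)
    n = int(horizon_s / step_s)
    return [(pos[0] + vx * i * step_s, pos[1] + vy * i * step_s)
            for i in range(n + 1)]

def is_dangerous(ped_track, veh_track, safety_radius_m=2.0):
    """Dangerous state: any simultaneous sample pair within the radius."""
    return any(math.dist(p, v) < safety_radius_m
               for p, v in zip(ped_track, veh_track))

# Pedestrian walking toward the road at 1.5 m/s; vehicle approaching at 10 m/s.
ped = predict_track((0.0, 4.0), 1.5, -math.pi / 2)   # heading toward y = 0
veh = predict_track((-30.0, 0.0), 10.0, 0.0)         # driving along the x-axis
print(is_dangerous(ped, veh))
# → True
```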
According to the method, the motion trail of the pedestrian is predicted from the recognized physical parameters to determine the dangerous state between the vehicle and the pedestrian. Because recognizing the identification is simple and highly accurate, the vehicle can be controlled to perform the risk avoidance operation in time when a collision with the pedestrian may occur, further improving the driving safety of the vehicle.
In some alternative implementations, after step 2033 above, the electronic device may also perform the steps of:
If the dangerous state indicates that a dangerous situation will occur between the pedestrian and the vehicle, and it is detected that the vehicle is being manually controlled to continue running, the vehicle is controlled to perform the risk avoidance operation and a warning operation.
Specifically, in some special situations, for example when the driver of the vehicle is driving under the influence of alcohol or drugs, or driving dangerously for psychological reasons, if the electronic device detects that the dangerous state indicates that a dangerous situation will occur between the pedestrian and the vehicle, the electronic device automatically disables the current manual driving state of the vehicle. Even if the driver attempts to manually control the vehicle, the electronic device sends an instruction causing the vehicle's control system to take the risk avoidance operation while simultaneously performing a warning operation. As an example, the warning operation may include issuing a warning message to the driver, locking the doors, sending a warning message to the police, etc.
According to the method, when a dangerous situation may occur between a pedestrian and the vehicle, if the driver shows intent to maliciously strike the pedestrian, the driver's authority to manually control the vehicle is revoked, and the vehicle is automatically controlled to avoid the danger and issue warnings. This further reduces the risk of collision between the vehicle and the pedestrian, and at the same time restrains drivers who intend to maliciously strike pedestrians.
In some alternative implementations, step 202 may be performed as follows:
first, communication instruction information transmitted by an identification is received.
The identification may comprise a device having environment sensing and communication functions. The identification and a communication interface included in the electronic device can establish a communication connection through Bluetooth, 5G communication, wireless broadcasting, and the like. The electronic device may be provided on the vehicle, or may be a separate device or server capable of communicating with the vehicle.
As an example, a device such as a camera, an ultrasonic range finder, a laser range finder, a brightness sensor, etc. may be provided in the sign, and these devices may sense the environment around the pedestrian, and according to the environmental sensing result, send communication instruction information including the sensing result to the above-mentioned electronic device.
Then, a control strategy for controlling the vehicle is determined based on the communication instruction information.
As an example, the communication indication information may include distance information indicating a distance between the pedestrian and the vehicle, and the electronic device may adjust a control strategy of the vehicle (e.g., decelerating or braking when the distance is closer, and normally traveling when the distance is farther) based on the distance information. For another example, the communication indication information may include illumination information indicating a current illumination condition, and when the illumination information indicates that the current illumination condition is poor, the electronic device may adjust a control policy of the vehicle to: turning on the lamp, decelerating, etc. For example, the vehicle may be controlled to turn on the marker light, the low beam light, the high beam light, and the fog light accordingly according to the magnitude of the ambient light indicated by the illumination information. And meanwhile, the vehicle is controlled to be decelerated according to preset corresponding deceleration conditions based on the ambient brightness indicated by the illumination information (for example, the lower the ambient brightness is, the lower the running speed is).
Alternatively, the communication indication information may be combined with the identified recognition result to determine the control strategy. For example, when the identification result of the identification is normal, the control strategy is determined by the identification result of the identification, and when the identification fails, the control strategy is determined by the communication instruction information. Or when the current running environment is poor (such as the conditions of low illuminance, low visibility and the like are determined by means of image recognition, sensor sensing and the like, or the weather conditions are determined to be poor by acquiring weather forecast and the like), if the communication indication information and the identification recognition result both indicate no collision risk, the vehicle can normally run, otherwise, the vehicle is slowed down or stopped. Further, the specific degree of deceleration may be adjusted based on the distance information included in the communication indication information, for example, the closer the distance is, the higher the degree of deceleration is, the farther the distance is, and the lower the degree of deceleration is.
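The fallback between the recognition result and the communication indication information described above might be sketched as follows. The field names (`risk`, `distance_m`) and the 10 m threshold are assumptions for illustration only.

```python
# Hedged sketch of the combined decision logic: prefer the visual
# recognition result, fall back to the communication indication when
# recognition fails, and scale the response by the reported distance.
def decide(recognition_ok, recognition_risk, comm_info):
    if recognition_ok:
        risk = recognition_risk
    else:                          # recognition failed: trust the message
        risk = comm_info.get("risk", True)
    if not risk:
        return "drive_normally"
    distance = comm_info.get("distance_m", 0.0)
    # The closer the pedestrian, the stronger the deceleration.
    return "brake" if distance < 10.0 else "decelerate"

print(decide(recognition_ok=False, recognition_risk=None,
             comm_info={"risk": True, "distance_m": 5.0}))
# → brake
```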
According to the method, by receiving the communication indication information actively sent by the identification, the vehicle can parse from it various kinds of information for adjusting the control strategy. This enables more ways of adjusting the control strategy and provides an additional guarantee beyond adjusting the strategy through recognition of the identification, further improving the driving safety of the vehicle.
In some alternative implementations, as shown in fig. 4, step 202 may include the following sub-steps:
step 2021, determining identity attributes of the pedestrian based on the identification in the environment image.
In particular, the identifications worn by pedestrians of different identities have different characteristics. For example, the elderly correspond to an identification indicating the elderly, children correspond to an identification indicating children, cleaners correspond to an identification indicating cleaners, and the like. The electronic device may recognize the characteristics of the identification and determine the identity attribute corresponding to it, where the identity attribute may include, for example, the pedestrian's sex and age bracket.
As shown in fig. 5A to 5D, the symbol in fig. 5A represents a male child, the symbol in fig. 5B represents a female child, the symbol in fig. 5C represents a male elderly person, and the symbol in fig. 5D represents a female elderly person.
Step 2022, determining a vehicle control level corresponding to the identity attribute.
In particular, different identity attributes correspond to different vehicle control levels, e.g., adult identity corresponds to a first control level, child identity corresponds to a second control level, and geriatric identity corresponds to a third control level.
Step 2023, based on the vehicle control level, determines a control strategy for controlling the vehicle corresponding to the vehicle control level.
Continuing with the example in step 2022: if the vehicle control level is the first control level and the pedestrian is close to the vehicle, then under the corresponding control strategy the speaker in the vehicle may emit no warning tone and the exterior or interior lights may not blink. If the vehicle control level is the second control level and the pedestrian is close to the vehicle, then under the corresponding control strategy the speaker may emit a prompt tone indicating that the pedestrian is a child, the exterior or interior lights may blink, and the horn may sound at a relatively low volume. If the vehicle control level is the third control level and the pedestrian is close to the vehicle, the speaker may emit a prompt tone indicating that the pedestrian is elderly, the exterior or interior lights may blink, and the horn may sound at a relatively high volume.
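The level-based warning behavior above can be sketched as a small lookup. The level numbers follow the example in step 2022; the action names are illustrative assumptions.

```python
# Hedged sketch of steps 2021-2023: identity attribute -> control level
# -> warning actions. Action names are assumptions, not the patent's API.
LEVELS = {"adult": 1, "child": 2, "elderly": 3}

def warning_actions(identity, close_to_vehicle):
    level = LEVELS.get(identity, 1)
    if not close_to_vehicle or level == 1:
        return []                               # first level: no warning
    if level == 2:
        return ["announce_child", "flash_lights", "low_volume_horn"]
    return ["announce_elderly", "flash_lights", "high_volume_horn"]

print(warning_actions("elderly", close_to_vehicle=True))
# → ['announce_elderly', 'flash_lights', 'high_volume_horn']
```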
In one application scenario, if the vehicle operates in a park or similar area, the operations for displaying various effects outside the vehicle may be adjusted according to the identity attribute of the pedestrian. For example, when the identity attribute indicates that the pedestrian is a child, the control level of the vehicle may be adjusted to a child level: a speaker outside the vehicle plays children's songs, the blinking rhythm of the exterior lights is increased, or children's videos are displayed on a display screen provided outside the vehicle. When the identity attribute indicates that the pedestrian is elderly, the control level may be adjusted to an elderly level: a speaker outside the vehicle plays songs with a relaxed rhythm, or videos suited to the elderly are displayed on the exterior display screen.
According to the method, the identity attribute of the pedestrian is determined by recognizing the identification on the pedestrian, and the control strategy of the vehicle is adjusted according to that attribute. The identity attributes of pedestrians around the vehicle are thus determined accurately and efficiently, the control strategy becomes more targeted, and the risk of dangerous situations between pedestrians and the vehicle is further reduced.
Fig. 6 is a flow chart of a method for controlling a vehicle based on a wearable device according to an exemplary embodiment of the present disclosure. The present embodiment may be applied to the wearable device 104 shown in fig. 1A, and as shown in fig. 6, the method includes the following steps:
step 601, acquiring sensing information aiming at the current environment by the wearable equipment of the pedestrian.
In this embodiment, the wearable device of the pedestrian may acquire sensing information for the current environment. The sensing information may include at least one kind. For example, a camera may be disposed on the wearable device, and the sensing information may include an image captured by the camera; a rangefinder (e.g., an ultrasonic, laser, or infrared rangefinder) may be disposed on the wearable device, and the sensing information may include distance information obtained by the rangefinder measuring the distance to objects around the pedestrian; a brightness sensor may be disposed on the wearable device, and the sensing information may include illumination information obtained by the sensor sensing the illumination conditions of the pedestrian's environment.
The wearable device may be provided on the pedestrian in various forms, for example on the clothing of the pedestrian, or on a backpack, a hat, etc.
Step 602, adjusting a physical parameter of an identification on a wearable device based on the sensing information.
In this embodiment, the wearable device may adjust, based on the sensing information, a physical parameter of the identifier on the wearable device, where the physical parameter is used to generate a prompt for a vehicle around the pedestrian.
Wherein the physical parameter may include, but is not limited to, at least one of: color, brightness, shape, etc. The identified physical parameter may be adjusted. For example, the identification may be presented by an LED array, display screen, or the like disposed on the wearable device. The physical parameter of the identification can be changed according to the difference of the sensing information. For example, when the illumination information included in the sensing information indicates that the illumination condition around the pedestrian is poor, the wearable device may control the brightness of the logo to increase. When the sensing information includes distance information indicating that the distance of the pedestrian from the vehicle is less than a preset distance, the wearable device may control the sign to change color (e.g., to red), change shape, blink, etc.
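The adjustment of the identification's physical parameters in step 602 might look like the following sketch; the lux and distance thresholds and the parameter names are assumptions, not values from the patent.

```python
# Hedged sketch of step 602 on the wearable side: brighten the LED
# identification in poor light, and turn it red and blinking when the
# vehicle is sensed closer than a preset distance.
def adjust_identification(lux, distance_m=None):
    params = {"color": "green", "brightness": 0.4, "blink": False}
    if lux < 50.0:                       # dark surroundings: brighter sign
        params["brightness"] = 1.0
    if distance_m is not None and distance_m < 15.0:
        params["color"] = "red"          # vehicle close: prompt it to slow down
        params["blink"] = True
    return params

print(adjust_identification(lux=20.0, distance_m=10.0))
# → {'color': 'red', 'brightness': 1.0, 'blink': True}
```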
The vehicle may perform the method described in the corresponding embodiment of fig. 2 above, namely, to identify the above-mentioned identifier from the environmental image captured by the camera included therein, and to control the driving behavior of the vehicle according to the physical parameter of the identifier.
According to the method provided by this embodiment of the disclosure, the sensing information for the current environment collected by the pedestrian's wearable device is acquired, and the physical parameters of the identification on the wearable device are adjusted based on that sensing information. This realizes automatic adjustment of the identification by the wearable device, so that the vehicle obtains a clear and reliable identification image, improving the accuracy of recognition by the vehicle. In addition, the wearable device can adapt the physical parameters of the identification to the pedestrian's environment in a targeted manner and prompt the vehicle about that environment, so that the vehicle can adjust its driving behavior accordingly, reducing the risk of dangerous situations between pedestrians and vehicles.
In some alternative implementations, step 602 may be performed as follows:
first, based on the sensed information, a distance between the pedestrian and the vehicle is determined.
As an example, the sensing information may include an image captured by a camera on the wearable device; the wearable device may identify a vehicle in the image and determine the distance between the pedestrian and the vehicle based on characteristics such as the vehicle's position and size in the image. Alternatively, the sensing information may include distance information obtained by a rangefinder on the wearable device ranging the vehicle.
The identified physical parameter on the wearable device is then adjusted based on the distance.
As an example, when the distance is equal to or less than the preset distance threshold, the shape of the mark may be adjusted to a preset shape (for example, the mark of the "x" symbol shown in fig. 1D), or the color of the mark may be adjusted to a preset color (for example, a circular mark of which the color is red shown in fig. 1C), or the brightness may be adjusted to a preset brightness, or the mark may be controlled to blink.
Alternatively, based on the distances determined at different times, the relative speed between the pedestrian and the vehicle may be determined, and if the vehicle is predicted to collide with the pedestrian according to the determined relative speed, the identified physical parameter on the wearable device may be adjusted, where the physical parameter indicates that the collision is likely.
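The optional relative-speed check can be sketched via a time-to-collision estimate computed from two distance samples. The 3-second threshold is an illustrative assumption.

```python
# Hedged sketch: approximate the closing speed from two distance samples
# and predict a collision when the time-to-collision is under a threshold.
def collision_predicted(d_prev_m, d_now_m, dt_s, ttc_threshold_s=3.0):
    closing_speed = (d_prev_m - d_now_m) / dt_s   # m/s, positive if approaching
    if closing_speed <= 0:
        return False                              # not getting closer
    time_to_collision = d_now_m / closing_speed
    return time_to_collision < ttc_threshold_s

print(collision_predicted(d_prev_m=20.0, d_now_m=15.0, dt_s=0.5))
# → True
```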
According to the method, the distance between the pedestrian and the vehicle is determined, the physical parameters of the mark are adjusted according to the distance, prompt can be sent to the vehicle in time, and the vehicle can make corresponding driving behaviors according to the adjusted mark, so that the risk of dangerous situations between the pedestrian and the vehicle is reduced.
In some optional implementations, after determining the distance between the pedestrian and the vehicle based on the sensed information, the method may further include:
First, communication instruction information for controlling the vehicle is generated based on the distance.
As an example, when the above-mentioned distance is less than or equal to a preset distance, communication instruction information for controlling deceleration or braking of the vehicle may be transmitted to the vehicle. Alternatively, the relative speed between the pedestrian and the vehicle may be calculated according to the distances at different times, and if it is predicted that the vehicle and the pedestrian may collide according to the determined relative speed, communication instruction information for controlling deceleration or braking of the vehicle may be transmitted.
Then, the communication instruction information is transmitted to the vehicle.
After the vehicle receives the communication indication information, the corresponding driving behavior can be adjusted.
According to the method, the distance between the wearable device and the vehicle is sensed, communication indication information for controlling the vehicle is sent to the vehicle according to the change of the distance, so that driving behavior of the vehicle can be controlled by pedestrians, and when the vehicle fails to detect the pedestrians, the vehicle can still avoid collision risks, so that safety of the pedestrians is further guaranteed.
In some alternative implementations, step 602 may be performed as follows:
based on the sensed information, the brightness of the identification on the wearable device is adjusted.
The sensing information may include illumination information, which may be collected by a brightness sensor disposed on the wearable device sensing the illumination environment around the pedestrian. For example, when the pedestrian is in a dark environment, such as at night or on a cloudy day, the illumination information collected by the brightness sensor indicates poor illumination conditions, and the wearable device controls the brightness of the identification to increase.
According to the method, the illumination environment around the pedestrian is automatically sensed, and the brightness of the mark is automatically adjusted according to the change of the illumination environment, so that the brightness of the mark can be increased in the dark-light environment, and the mark can be identified more easily by a vehicle.
In some alternative implementations, the method further includes:
first, the charge of a battery on a wearable device is determined.
Wherein, the battery is a rechargeable battery.
And then, in response to determining that the electric quantity is smaller than or equal to a preset electric quantity threshold value, starting the solar power generation equipment on the wearable equipment to charge the battery.
The solar power generation equipment can comprise a hard solar panel or a flexible solar film, and the flexible solar film can be arranged on the surfaces of clothes, hats, backpacks and the like of pedestrians, so that the solar power generation equipment has the characteristics of being lighter and not easy to damage.
According to the implementation mode, the solar power generation device is arranged on the wearable equipment, so that the battery of the wearable equipment can be charged in time, the sign display fault caused by power interruption is avoided, and the risk of collision between pedestrians and vehicles is reduced.
In some alternative implementations, as shown in fig. 7, the method may further include the steps of:
step 603, determining obstacles around the pedestrian based on the sensing information.
As an example, the sensing information may include an image photographed by a camera provided on the wearable device, and the wearable device may recognize the type of the obstacle from the image using an existing image recognition method. For another example, the sensing information may further include distance information sensed by devices such as an infrared range finder and an ultrasonic range finder, and if it is sensed that a distance between an object and a pedestrian is smaller than a preset distance, it is determined that an obstacle exists around the pedestrian.
At step 604, a risk attribute of the obstacle is determined.
As an example, the hazard attribute may correspond to the type of obstacle. For example, when the obstacle is identified as a vehicle by the method described in step 603, the hazard attribute may indicate that the current degree of danger is high; if the obstacle is a stationary facility (e.g., a railing, a lamp post, a wall, etc.), the hazard attribute may indicate a moderate degree of danger; if the obstacle is another pedestrian, who has the awareness to actively avoid danger, the hazard attribute may indicate that the current degree of danger is low.
As another example, if the obstacle is determined by a distance sensed by the rangefinder described in the above example, the distance may correspond to the hazard attribute. For example, when the pedestrian is farther from the obstacle, the hazard attribute may indicate a lower current degree of danger, and when the pedestrian is closer to the obstacle, the hazard attribute may indicate a higher current degree of danger.
Step 605, based on the dangerous attribute, outputting prompt information for representing that dangerous objects exist around the pedestrian.
Specifically, corresponding prompt information can be output according to the degree of danger represented by the hazard attribute. For example, when the hazard attribute indicates a high current degree of danger, the display brightness of the identification may be increased, a louder alert sound may be output, or the color of the identification may be changed to red. When the hazard attribute indicates a low current degree of danger, a low-volume alert sound may be output, the color of the identification may be changed to yellow, and so on.
For another example, the wearable device may include a positioning device, the prompt information may further include current position information of the pedestrian, the wearable device may send the position information to the vehicle, and the vehicle may further adjust driving behavior according to the position of the pedestrian.
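Steps 603 through 605 can be sketched as a mapping from obstacle type (or sensed distance) to a danger level and prompt actions. The type names, level labels, distance threshold, and action names are all illustrative assumptions.

```python
# Hedged sketch of steps 603-605: obstacle type or distance -> danger
# level -> prompt actions, per the examples in the text.
DANGER_BY_TYPE = {"vehicle": "high", "fixture": "medium", "pedestrian": "low"}

def prompt_for(obstacle_type=None, distance_m=None):
    if obstacle_type is not None:
        level = DANGER_BY_TYPE.get(obstacle_type, "medium")
    elif distance_m is not None:
        level = "high" if distance_m < 2.0 else "low"
    else:
        return []                                 # no obstacle sensed
    if level == "high":
        return ["increase_sign_brightness", "loud_alert", "sign_color_red"]
    if level == "medium":
        return ["alert", "sign_color_yellow"]
    return ["soft_alert", "sign_color_yellow"]

print(prompt_for(obstacle_type="vehicle"))
# → ['increase_sign_brightness', 'loud_alert', 'sign_color_red']
```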
According to the method, the obstacle around the pedestrian is determined, the dangerous attribute is determined according to the obstacle, the prompt information is output, the pedestrian can be timely reminded when the obstacle is sensed, and the risk of collision between the pedestrian and the obstacle is reduced.
Exemplary apparatus
Fig. 8 is a schematic structural view of a vehicle control apparatus provided in an exemplary embodiment of the present disclosure. The present embodiment is applicable to an electronic apparatus. As shown in fig. 8, the vehicle control apparatus includes: a first determining module 801, configured to determine an identification of a pedestrian's wearable device based on an environment image acquired by a vehicle during driving; a second determining module 802, configured to determine a control strategy, corresponding to the identification, for controlling the vehicle; a first control module 803, configured to control the vehicle to perform driving behavior corresponding to the control strategy.
In this embodiment, the first determining module 801 may determine the identity of the wearable device of the pedestrian based on the environmental image acquired by the vehicle during the driving process. Wherein the ambient image may be acquired by a camera 1011 as shown in fig. 1A. The environment image may include a pedestrian, where the pedestrian has a wearable device 104 as shown in fig. 1A, and an identifier is disposed on the wearable device 104. The identification can be in the form of graphics, symbols, characters, two-dimensional codes and the like with various shapes.
The first determining module 801 may identify the identifier from the environmental image according to an existing image identification method (for example, an image identification method based on a neural network), and may obtain the distance between the identifier and the vehicle (for example, determine the distance between the identifier and the camera according to the coordinates of the identifier in the image and the camera parameters calibrated in advance, and further obtain the attribute such as the distance between the identifier and the vehicle), the relative speed between the identifier and the vehicle, and the like. Since the tag is on a wearable device on the pedestrian, the location of the identified tag can be taken as the location of the pedestrian.
In this embodiment, the second determining module 802 may determine a control strategy, corresponding to the identification, for controlling the vehicle. Specifically, the control strategy is determined according to the recognized information of the identification, such as its position and speed. There are multiple kinds of control strategies. For example, when the relative speed between the identification and the vehicle is greater than a preset speed and the distance is less than a preset distance, the control strategy is a braking strategy; based on information such as the moving direction and moving speed of the identification and the traveling direction and traveling speed of the vehicle, whether the vehicle will collide with the pedestrian can be predicted, and if a collision is predicted, the control strategy is a deceleration strategy.
In this embodiment, the first control module 803 may control the vehicle to perform driving behavior corresponding to the control strategy. For example, when the control strategy is a braking strategy, the first control module 803 may send a braking signal to a control system of the vehicle, and the control system of the vehicle may control the vehicle to brake according to the braking signal. For another example, when the control strategy is a deceleration strategy, the first control module 803 may send a deceleration signal to a control system of the vehicle, and the control system of the vehicle may control the vehicle to decelerate according to the deceleration signal.
Referring to fig. 9, fig. 9 is a schematic structural view of a vehicle control apparatus provided in another exemplary embodiment of the present disclosure.
In some alternative implementations, the second determining module 802 may include: a first determining unit 8021 for determining the identified physical parameter; a second determining unit 8022 for determining a control strategy for controlling the vehicle based on the physical parameter.
In some alternative implementations, the first determining unit 8021 may include: an identification subunit 80211 for identifying the shape and color of the identification; a determination subunit 80212 for determining the identified physical parameter based on the shape and the color.
In some alternative implementations, the first control module 803 may include: the first prediction unit 8031 is configured to predict a motion track of a pedestrian based on a physical parameter, so as to obtain at least one predicted track; a third determining unit 8032 for determining a current movement state of the vehicle; a second prediction unit 8033 for predicting a dangerous state between the vehicle and the pedestrian according to at least one predicted trajectory and a movement state; a control unit 8034 for controlling the vehicle to perform a risk avoidance operation based on the dangerous state.
In some alternative implementations, the apparatus may further include: a second control module 804 for controlling the vehicle to perform the danger avoidance operation and a warning operation if the dangerous state indicates that a dangerous situation occurs between the pedestrian and the vehicle and the vehicle is detected to be manually controlled to continue running.
In some alternative implementations, the second determining module 802 may include: a receiving unit 8023 for receiving communication instruction information sent by the identifier; a fourth determination unit 8024 for determining a control strategy for controlling the vehicle based on the communication instruction information.
In some alternative implementations, the second determining module 802 may include: a fifth determining unit 8025 for determining an identity attribute of the pedestrian based on the identification in the environment image; a sixth determining unit 8026, configured to determine a vehicle control level corresponding to the identity attribute; a seventh determining unit 8027 for determining a control strategy for controlling the vehicle, which corresponds to the vehicle control level, based on the vehicle control level.
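Units 8025–8027 describe a two-stage mapping: identity attribute → vehicle control level → control strategy. A minimal sketch follows; the attribute names, level ordering, and strategy names are assumptions for illustration only:

```python
# Hypothetical identity attributes mapped to vehicle control levels
# (higher level = more caution required).
IDENTITY_TO_LEVEL = {
    "child": 3,
    "elderly": 2,
    "adult": 1,
}

# Hypothetical control strategies per level.
LEVEL_TO_STRATEGY = {
    3: "brake",
    2: "decelerate",
    1: "maintain_with_monitoring",
}

def strategy_for(identity_attribute):
    # Unknown attributes fall back to the lowest-caution level.
    level = IDENTITY_TO_LEVEL.get(identity_attribute, 1)
    return LEVEL_TO_STRATEGY[level]

print(strategy_for("child"))   # brake
print(strategy_for("adult"))
```

Keeping the two mappings separate mirrors units 8026 and 8027: the level can be tuned per identity attribute without touching the strategies, and vice versa.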
According to the vehicle control apparatus provided by the embodiments of the present disclosure, the identification on the wearable device of a pedestrian is determined from the environment image acquired by the vehicle during driving, the control strategy corresponding to the identification and used for controlling the vehicle is then determined, and the vehicle is controlled to perform the driving behavior corresponding to the control strategy. A large number of sensors need not be arranged on the vehicle or the road, and no complex pedestrian recognition algorithm is required to identify the pedestrian from the image; the position of the pedestrian can be detected simply by recognizing the identification on the pedestrian from the captured environment image. This reduces hardware cost, reduces the data processing amount of image recognition, and alleviates the problem of data redundancy, thereby improving the efficiency of pedestrian detection, the responsiveness of the vehicle in adjusting its driving behavior according to the pedestrian detection result, and the safety of intelligent driving.
Fig. 10 is a schematic structural view of an apparatus for controlling a vehicle based on a wearable device according to an exemplary embodiment of the present disclosure. The present embodiment is applicable to an electronic device. As shown in fig. 10, the apparatus includes: an acquiring module 1001 configured to acquire sensing information, collected by a wearable device of a pedestrian, for the current environment; and an adjustment module 1002 configured to adjust, based on the sensing information, a physical parameter of an identification on the wearable device, where the physical parameter is used to generate a prompt for vehicles around the pedestrian.
In this embodiment, the acquiring module 1001 may acquire the sensing information for the current environment collected by the wearable device of the pedestrian. The sensing information may include at least one kind. For example, a camera may be disposed on the wearable device, and the sensing information may include an image captured by the camera; a rangefinder (such as an ultrasonic, laser, or infrared rangefinder) may be disposed on the wearable device, and the sensing information may include distance information obtained by the rangefinder measuring distances to objects around the pedestrian; a brightness sensor may be disposed on the wearable device, and the sensing information may include illumination information obtained by the brightness sensor sensing the illumination conditions of the environment where the pedestrian is located.
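The sensor readings named above could be aggregated into a single sensing-information record as sketched below. The `SensingInfo` structure, field names, and units are assumptions introduced for the example, not part of the disclosure:

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class SensingInfo:
    image: Optional[bytes] = None                           # frame from an on-device camera
    distances_m: List[float] = field(default_factory=list)  # rangefinder readings, meters
    illuminance_lux: Optional[float] = None                 # brightness sensor reading, lux

def collect(camera=None, rangefinder=None, light_sensor=None):
    # Each sensor is optional: the wearable device may carry any subset,
    # so the sensing information includes "at least one kind".
    return SensingInfo(
        image=camera() if camera else None,
        distances_m=rangefinder() if rangefinder else [],
        illuminance_lux=light_sensor() if light_sensor else None,
    )

info = collect(rangefinder=lambda: [12.5, 30.0], light_sensor=lambda: 8.0)
print(info.distances_m, info.illuminance_lux)
```

Passing sensors as callables keeps the aggregation independent of which hardware is actually present on a given garment or backpack.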
The wearable device may be provided on the pedestrian in various forms, for example on the clothing of the pedestrian, or on a backpack, a hat, etc.
In this embodiment, the adjustment module 1002 may adjust the physical parameter of the identifier on the wearable device, based on the sensing information, where the physical parameter is used to generate a prompt for a vehicle around the pedestrian.
The physical parameter may include, but is not limited to, at least one of: color, brightness, shape, etc. For example, the identification may be presented by an LED array, a display screen, or the like disposed on the wearable device, and its physical parameter can be changed according to the sensing information. For example, when the sensing information includes illumination information indicating poor illumination conditions around the pedestrian, the adjustment module 1002 may increase the brightness of the identification. When the sensing information includes distance information indicating that the distance between the pedestrian and the vehicle is less than a preset distance, the adjustment module 1002 may control the identification to change color (e.g., to red), change shape, blink, etc.
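The adjustment rules just described might be sketched as follows. The specific thresholds, default parameter values, and color choices are illustrative assumptions; only the rule shape (dim light → brighter, nearby vehicle → red and blinking) comes from the text:

```python
def adjust_identification(illuminance_lux, distance_m,
                          low_light_lux=50.0, preset_distance_m=20.0):
    # Start from nominal physical parameters of the identification.
    params = {"color": "yellow", "brightness": 0.5, "blink": False}
    if illuminance_lux is not None and illuminance_lux < low_light_lux:
        params["brightness"] = 1.0   # poor illumination: increase brightness
    if distance_m is not None and distance_m < preset_distance_m:
        params["color"] = "red"      # vehicle within preset distance: warn it
        params["blink"] = True
    return params

print(adjust_identification(illuminance_lux=8.0, distance_m=15.0))
```

Returning a parameter dictionary rather than driving hardware directly keeps the rule logic testable; a separate layer would push the values to the LED array or display screen.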
The vehicle may perform the method described above in the corresponding embodiment of fig. 2, that is, identify the above-mentioned identification from the environment image captured by its camera and control its driving behavior according to the physical parameter of the identification.
Referring to fig. 11, fig. 11 is a schematic structural view of an apparatus for controlling a vehicle based on a wearable device according to another exemplary embodiment of the present disclosure.
In some alternative implementations, the adjustment module 1002 may include: an eighth determining unit 10021 for determining the distance between the pedestrian and the vehicle based on the sensing information; and a first adjustment unit 10022 for adjusting the physical parameter of the identification on the wearable device based on the distance.
In some alternative implementations, the apparatus may further include: a generation module 1003 for generating communication instruction information for controlling the vehicle based on the distance; and a transmitting module 1004 for transmitting the communication instruction information to the vehicle.
In some alternative implementations, the adjustment module 1002 may include: a second adjusting unit 10023 is configured to adjust the brightness of the identifier on the wearable device based on the sensing information.
In some alternative implementations, the apparatus may further include: a third determining module 1005 for determining an amount of power of a battery on the wearable device; and a charging module 1006, configured to, in response to determining that the electric quantity is less than or equal to a preset electric quantity threshold, turn on the solar power generation device on the wearable device to charge the battery.
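The power-management behavior of modules 1005 and 1006 can be sketched in a few lines. The threshold value is an assumed number; the disclosure only specifies "less than or equal to a preset electric quantity threshold":

```python
def update_charging(battery_level, solar_on, threshold=0.2):
    # Return whether the solar power generation device should now be on:
    # turn it on when the battery falls to or below the preset threshold,
    # otherwise leave its current state unchanged.
    if battery_level <= threshold:
        return True
    return solar_on

print(update_charging(0.15, False))  # True
print(update_charging(0.8, False))   # False
```

Leaving the solar device on once enabled (rather than toggling it off above the threshold) avoids rapid on/off cycling near the threshold; a real implementation might add hysteresis instead.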
In some alternative implementations, the apparatus may further include: a fourth determining module 1007 for determining obstacles around the pedestrian based on the sensing information; a fifth determining module 1008 for determining a hazard attribute of the obstacle; and an output module 1009 for outputting, based on the hazard attribute, prompt information indicating that dangerous objects exist around the pedestrian.
According to the apparatus for controlling a vehicle based on a wearable device provided by the embodiments of the present disclosure, the sensing information for the current environment collected by the wearable device of a pedestrian is acquired, and the physical parameter of the identification on the wearable device is adjusted based on the sensing information, so that the wearable device adjusts the identification automatically, the vehicle obtains a clear and reliable identification image, and the accuracy of vehicle recognition is improved. In addition, the wearable device can adapt the physical parameter of the identification to the environment of the pedestrian in a targeted manner and prompt the vehicle about that environment, so that the vehicle can adjust its driving behavior accordingly, reducing the risk of dangerous situations between the pedestrian and the vehicle.
Exemplary electronic device
Next, an electronic device according to an embodiment of the present disclosure is described with reference to fig. 12. The electronic device may be the terminal device 1012, the server 103, or the wearable device 104 shown in fig. 1A, or a stand-alone device independent of them that communicates with the terminal device 1012, the server 103, or the wearable device 104 to receive acquired input signals from them.
Fig. 12 illustrates a block diagram of an electronic device according to an embodiment of the disclosure.
As shown in fig. 12, the electronic device 1200 includes one or more processors 1201 and memory 1202.
The processor 1201 may be a Central Processing Unit (CPU) or other form of processing unit having data processing and/or instruction execution capabilities, and may control other components in the electronic device 1200 to perform desired functions.
The memory 1202 may include one or more computer program products, which may include various forms of computer-readable storage media, such as volatile memory and/or non-volatile memory. The volatile memory may include, for example, random access memory (RAM) and/or cache memory. The non-volatile memory may include, for example, read-only memory (ROM), a hard disk, flash memory, and the like. One or more computer program instructions may be stored on the computer-readable storage medium, and the processor 1201 may execute the program instructions to implement the vehicle control method or the method of controlling a vehicle based on a wearable device of the various embodiments of the present disclosure described above, and/or other desired functions. Various contents such as the environment image and the sensing information may also be stored in the computer-readable storage medium.
In one example, the electronic device 1200 may further include: an input device 1203 and an output device 1204, which are interconnected via a bus system and/or other forms of connection mechanism (not shown).
For example, when the electronic device is the terminal device 1012, the server 103, or the wearable device 104, the input means 1203 may be a device such as a camera, a mouse, or a keyboard for inputting images, various commands, and the like. When the electronic device is a stand-alone device, the input means 1203 may be a communication network connector for receiving the input images and various commands from the terminal device 1012, the server 103, or the wearable device 104.
The output device 1204 may output various information to the outside, including instructions for controlling the vehicle. The output device 1204 may include, for example, a display, speakers, a printer, and a communication network and remote output devices connected thereto, etc.
Of course, only those components of the electronic device 1200 that are relevant to the present disclosure are shown in fig. 12; components such as buses and input/output interfaces are omitted for simplicity. In addition, the electronic device 1200 may include any other suitable components depending on the particular application.
Exemplary computer program product and computer readable storage Medium
In addition to the methods and devices described above, embodiments of the present disclosure may also be a computer program product comprising computer program instructions which, when executed by a processor, cause the processor to perform steps in a vehicle control method or a method of controlling a vehicle based on a wearable device according to various embodiments of the present disclosure described in the "exemplary methods" section of the present description.
The computer program product may write program code for performing the operations of embodiments of the present disclosure in any combination of one or more programming languages, including object-oriented programming languages such as Java and C++, and conventional procedural programming languages such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computing device, partly on the user's device as a stand-alone software package, partly on the user's computing device and partly on a remote computing device, or entirely on the remote computing device or server.
Furthermore, embodiments of the present disclosure may also be a computer-readable storage medium, having stored thereon computer program instructions, which when executed by a processor, cause the processor to perform steps in a vehicle control method or a method of controlling a vehicle based on a wearable device according to various embodiments of the present disclosure described in the above "exemplary method" section of the present disclosure.
The computer readable storage medium may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. The readable storage medium may include, for example, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium include the following: an electrical connection having one or more wires, a portable disk, a hard disk, random access memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
The basic principles of the present disclosure have been described above in connection with specific embodiments, however, it should be noted that the advantages, benefits, effects, etc. mentioned in the present disclosure are merely examples and not limiting, and these advantages, benefits, effects, etc. are not to be considered as necessarily possessed by the various embodiments of the present disclosure. Furthermore, the specific details disclosed herein are for purposes of illustration and understanding only, and are not intended to be limiting, since the disclosure is not necessarily limited to practice with the specific details described.
In this specification, each embodiment is described in a progressive manner, and each embodiment is mainly described in a different manner from other embodiments, so that the same or similar parts between the embodiments are mutually referred to. For system embodiments, the description is relatively simple as it essentially corresponds to method embodiments, and reference should be made to the description of method embodiments for relevant points.
The block diagrams of the devices, apparatuses, and systems referred to in this disclosure are merely illustrative examples and are not intended to require or imply that connections, arrangements, or configurations must be made in the manner shown in the block diagrams. As will be appreciated by one of skill in the art, these devices, apparatuses, and systems may be connected, arranged, or configured in any manner. Words such as "including," "comprising," and "having" are open-ended terms that mean "including but not limited to" and may be used interchangeably therewith. The terms "or" and "and" as used herein refer to, and may be used interchangeably with, the term "and/or," unless the context clearly indicates otherwise. The term "such as" as used herein refers to, and may be used interchangeably with, the phrase "such as but not limited to."
The methods and apparatus of the present disclosure may be implemented in a number of ways. For example, the methods and apparatus of the present disclosure may be implemented by software, hardware, firmware, or any combination of software, hardware, firmware. The above-described sequence of steps for the method is for illustration only, and the steps of the method of the present disclosure are not limited to the sequence specifically described above unless specifically stated otherwise. Furthermore, in some embodiments, the present disclosure may also be implemented as programs recorded in a recording medium, the programs including machine-readable instructions for implementing the methods according to the present disclosure. Thus, the present disclosure also covers a recording medium storing a program for executing the method according to the present disclosure.
It is also noted that in the apparatus, devices, and methods of the present disclosure, the components or steps may be decomposed and/or recombined. Such decompositions and/or recombinations should be considered equivalents of the present disclosure.
The previous description of the disclosed aspects is provided to enable any person skilled in the art to make or use the present disclosure. Various modifications to these aspects will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other aspects without departing from the scope of the disclosure. Thus, the present disclosure is not intended to be limited to the aspects shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.
The foregoing description has been presented for purposes of illustration and description. Furthermore, this description is not intended to limit the embodiments of the disclosure to the form disclosed herein. Although a number of example aspects and embodiments have been discussed above, a person of ordinary skill in the art will recognize certain variations, modifications, alterations, additions, and subcombinations thereof.

Claims (10)

1. A method of controlling a vehicle based on a wearable device, comprising:
acquiring sensing information, collected by a wearable device of a pedestrian, for the current environment;
based on the sensing information, adjusting a physical parameter of an identification on the wearable device, the physical parameter being used to generate a prompt for vehicles around the pedestrian, the physical parameter comprising at least one of: color, brightness, shape.
2. The method of claim 1, wherein the adjusting the identified physical parameter on the wearable device based on the sensed information comprises:
determining a distance between the pedestrian and the vehicle based on the sensed information;
based on the distance, a physical parameter of an identification on the wearable device is adjusted.
3. The method of claim 1, wherein the adjusting the identified physical parameter on the wearable device based on the sensed information comprises:
Based on the sensed information, adjusting the brightness of the identification on the wearable device.
4. The method of claim 1, wherein the method further comprises:
determining obstacles around the pedestrian based on the sensing information;
determining a hazard attribute of the obstacle;
and outputting, based on the hazard attribute, prompt information indicating that dangerous objects exist around the pedestrian.
5. An apparatus for controlling a vehicle based on a wearable device, comprising:
the acquisition module is used for acquiring sensing information, collected by the wearable device of the pedestrian, for the current environment;
an adjustment module for adjusting, based on the sensing information, a physical parameter of an identification on the wearable device, the physical parameter being used to generate a prompt for a vehicle around the pedestrian, the physical parameter comprising at least one of: color, brightness, shape.
6. The apparatus of claim 5, wherein the adjustment module comprises:
an eighth determining unit configured to determine a distance between the pedestrian and the vehicle based on the sensing information;
and the first adjusting unit is used for adjusting the physical parameters of the mark on the wearable equipment based on the distance.
7. The apparatus of claim 5, wherein the adjustment module comprises:
and the second adjusting unit is used for adjusting the brightness of the mark on the wearable equipment based on the sensing information.
8. The apparatus of claim 5, wherein the apparatus further comprises:
a fourth determining module for determining obstacles around the pedestrian based on the sensing information;
a fifth determining module, configured to determine a dangerous attribute of the obstacle;
and the output module is used for outputting, based on the hazard attribute, prompt information indicating that dangerous objects exist around the pedestrian.
9. A computer readable storage medium storing a computer program for performing the method of any one of the preceding claims 1-4.
10. An electronic device, the electronic device comprising:
a processor;
a memory for storing the processor-executable instructions;
the processor is configured to read the executable instructions from the memory and execute the instructions to implement the method of any of the preceding claims 1-4.
CN202311095534.2A 2021-09-22 2021-09-22 Method and device for controlling vehicle based on wearable equipment Pending CN117022307A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311095534.2A CN117022307A (en) 2021-09-22 2021-09-22 Method and device for controlling vehicle based on wearable equipment

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202111109541.4A CN113771869B (en) 2021-09-22 2021-09-22 Vehicle control method and method for controlling vehicle based on wearable device
CN202311095534.2A CN117022307A (en) 2021-09-22 2021-09-22 Method and device for controlling vehicle based on wearable equipment

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
CN202111109541.4A Division CN113771869B (en) 2021-09-22 2021-09-22 Vehicle control method and method for controlling vehicle based on wearable device

Publications (1)

Publication Number Publication Date
CN117022307A true CN117022307A (en) 2023-11-10

Family

ID=78852520

Family Applications (2)

Application Number Title Priority Date Filing Date
CN202311095534.2A Pending CN117022307A (en) 2021-09-22 2021-09-22 Method and device for controlling vehicle based on wearable equipment
CN202111109541.4A Active CN113771869B (en) 2021-09-22 2021-09-22 Vehicle control method and method for controlling vehicle based on wearable device

Family Applications After (1)

Application Number Title Priority Date Filing Date
CN202111109541.4A Active CN113771869B (en) 2021-09-22 2021-09-22 Vehicle control method and method for controlling vehicle based on wearable device

Country Status (1)

Country Link
CN (2) CN117022307A (en)

Family Cites Families (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0383478A1 (en) * 1989-02-13 1990-08-22 D. Alfred Owens Safety wear
US20100063652A1 (en) * 2008-09-11 2010-03-11 Noel Wayne Anderson Garment for Use Near Autonomous Machines
WO2015013240A1 (en) * 2013-07-25 2015-01-29 Elwha Llc Systems for preventing collisions of vehicles with pedestrians
JP6429368B2 (en) * 2013-08-02 2018-11-28 本田技研工業株式会社 Inter-vehicle communication system and method
CN103927904B (en) * 2014-04-08 2017-02-01 中国科学院合肥物质科学研究院 Early warning method of pedestrian anti-collision early warning system using smartphone
CN104346955A (en) * 2014-10-16 2015-02-11 浙江吉利汽车研究院有限公司 Man-vehicle communication-based pedestrian collision avoiding method and collision avoiding system
CN105225474A (en) * 2015-08-19 2016-01-06 奇瑞汽车股份有限公司 Based on the traffic collision early warning system of intelligent wearable device
CN106364485A (en) * 2016-08-30 2017-02-01 戴姆勒股份公司 Operation method for communication equipment, communication equipment for vehicle and wearable equipment
JP6271674B1 (en) * 2016-10-20 2018-01-31 パナソニック株式会社 Pedestrian communication system, in-vehicle terminal device, pedestrian terminal device, and safe driving support method
CN106740573B (en) * 2016-12-22 2020-06-12 深圳市元征科技股份有限公司 Vehicle early warning method based on intelligent wearable device and intelligent wearable device
CN107440182A (en) * 2017-07-28 2017-12-08 刘皓麟 Intelligent wearable device
CN207473802U (en) * 2017-11-20 2018-06-08 贺建国 A kind of intelligent and safe caution system
CN110341722A (en) * 2019-07-25 2019-10-18 百度在线网络技术(北京)有限公司 Running method and device, electronic equipment, the readable medium of automatic driving vehicle
US11485377B2 (en) * 2020-02-06 2022-11-01 Toyota Motor Engineering & Manufacturing North America, Inc. Vehicular cooperative perception for identifying a connected vehicle to aid a pedestrian
CN111294564A (en) * 2020-03-03 2020-06-16 维沃移动通信有限公司 Information display method and wearable device
CN113223325A (en) * 2021-03-26 2021-08-06 南京市德赛西威汽车电子有限公司 Method for safely passing signal-lamp-free intersection

Also Published As

Publication number Publication date
CN113771869A (en) 2021-12-10
CN113771869B (en) 2024-03-15


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination