WO2022194246A9 - Control method, device and system

Control method, device and system

Info

Publication number
WO2022194246A9
Authority
WO
WIPO (PCT)
Prior art keywords
behavior
data
terminal
control device
indicate
Prior art date
2021-03-18
Application number
PCT/CN2022/081453
Other languages
English (en)
French (fr)
Other versions
WO2022194246A1 (zh)
Inventor
杨凡
彭康
张桂成
Original Assignee
华为技术有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 华为技术有限公司 (Huawei Technologies Co., Ltd.)
Publication of WO2022194246A1 publication Critical patent/WO2022194246A1/zh
Publication of WO2022194246A9 publication Critical patent/WO2022194246A9/zh

Classifications

    • G - PHYSICS
    • G08 - SIGNALLING
    • G08G - TRAFFIC CONTROL SYSTEMS
    • G08G1/00 - Traffic control systems for road vehicles
    • G08G1/09 - Arrangements for giving variable traffic instructions
    • G08G1/0962 - Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
    • G08G1/0967 - Systems involving transmission of highway information, e.g. weather, speed limits
    • G08G1/096708 - Systems involving transmission of highway information, e.g. weather, speed limits where the received information might be used to generate an automatic action on the vehicle control
    • G08G1/096725 - Systems involving transmission of highway information, e.g. weather, speed limits where the received information might be used to generate an automatic action on the vehicle control where the received information generates an automatic action on the vehicle control
    • G08G1/096766 - Systems involving transmission of highway information, e.g. weather, speed limits where the system is characterised by the origin of the information transmission
    • G08G1/096791 - Systems involving transmission of highway information, e.g. weather, speed limits where the system is characterised by the origin of the information transmission where the origin of the information is another vehicle

Definitions

  • the present application relates to the technical field of intelligent driving, and more particularly, to a control method, device and system in the technical field of intelligent driving.
  • Lights and horn sounds are important tools for inter-vehicle communication.
  • Existing smart cars can perceive lights and/or horn sounds through sensors installed on the car, determine changes in the behavior of surrounding vehicles, and assist driving based on those changes, thereby improving driving safety.
  • However, in some situations during driving, for example when the lights and/or horn sounds go unnoticed, the lights and horn of a vehicle may fail to serve this communication role, which brings certain safety risks to intelligent driving and autonomous driving and may even lead to traffic accidents; therefore, this has a considerable impact on the driving safety of smart cars.
  • the present application provides a control method, device and system, which can improve driving safety.
  • an embodiment of the present application provides a control method, the method including: acquiring first behavior indication data of a first terminal, where the first behavior indication data is used to indicate a first behavior of the first terminal; determining, according to the first behavior indication data, a first behavior purpose of the first behavior; and, based on the first behavior purpose, reminding a user and/or controlling a second terminal to perform a first operation.
  • the method may be applicable to an application scenario in which at least one first terminal reminds the second terminal through a first behavior in unmanned driving, automatic driving, intelligent driving or networked driving.
  • this embodiment of the present application does not limit the first behavior.
  • the first behavior may include a light-flashing behavior and/or a honking behavior.
  • the first behavior may also include a speed-changing behavior and/or a lane-changing behavior.
  • the method may be applied to a control system, and the control system may include the central control device, an audio collection device, and a video collection device.
  • the control system may also include a network service device.
  • It should be noted that the control method may be executed by the central control device or by the network service device, which is not limited in this embodiment of the present application.
  • In the first aspect, only execution of the control method by the central control device is taken as an example.
  • For the process in which the network service device executes the control method, reference may be made to the process executed by the central control device; details are not repeated in this embodiment of the present application.
  • In the control method provided by this embodiment of the present application, the central control device can determine the first behavior purpose of the first terminal based on the first behavior indication data and, in response to the first behavior purpose, control the second terminal to perform the first operation. That is to say, the central control device on the second terminal can judge the first behavior purpose (or intention) of the first terminal based on the first behavior indication data of the first terminal and give timely feedback based on that purpose (or intention), which can improve driving safety.
  • this embodiment of the present application does not limit specific content included in the first behavior indication data.
  • the first behavior indication data may include at least one of a first image, first audio, image feature data, and audio feature data.
  • the first image contains the light of the first terminal.
  • the first audio contains the horn sound of the first terminal.
  • the image feature data is used to indicate the position, color, brightness, flashing frequency and/or number of flashes of the light.
  • the audio feature data is used to indicate the direction, duration, frequency and/or number of honks of the horn sound.
  • In the control method provided by this embodiment of the present application, the audio feature data and the image feature data can describe the light and horn sound of the first terminal more comprehensively, which can improve the accuracy of determining the first behavior.
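  • As an illustration only, the feature data described above could be represented by simple structures like the following sketch; the field names and types are assumptions made here for clarity rather than part of the embodiment.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class ImageFeatureData:
    """Illustrative container for the image feature data of a detected light."""
    position: Tuple[float, float]               # position of the light in the frame (x, y)
    color: str                                  # e.g. "white", "amber", "red"
    brightness: float                           # normalized brightness, 0.0 to 1.0
    flash_frequency_hz: Optional[float] = None  # flashes per second, if flashing
    flash_count: Optional[int] = None           # number of flashes observed

@dataclass
class AudioFeatureData:
    """Illustrative container for the audio feature data of a detected horn sound."""
    direction_deg: float                        # bearing of the sound source, in degrees
    duration_s: float                           # length of each honk, in seconds
    frequency_hz: float                         # dominant frequency of the horn sound
    honk_count: int = 1                         # number of honks observed
```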
  • the central control device may acquire the first behavior indication data in various ways, which is not limited in this embodiment of the present application.
  • the central control device may receive the first image collected by the image collection device.
  • the central control device may receive the first audio collected by an audio collection device.
  • the image feature data is obtained by performing feature extraction on the first image
  • the audio feature data is obtained by performing feature extraction on the first audio.
  • feature extraction methods refer to existing related technologies. This embodiment of the present application does not limit it.
  • the central control device may perform feature extraction on the first image to obtain the image feature data.
  • the central control device may perform feature extraction on the first audio to obtain the audio feature data.
  • the central control device may determine the first behavior purpose based on the first behavior indication data and a preset mapping relationship A, where the mapping relationship A is used to indicate the correspondence between behavior indication data and behavior purposes.
  • the embodiment of the present application does not limit the specific form of the mapping relationship A.
  • the mapping relationship A may be a mapping table A.
  • the central control device obtains the mapping relationship by searching the mapping table, which can improve the calculation speed.
  • the mapping relationship A may be a behavior purpose recognition model A.
  • In the control method provided by this embodiment of the present application, the central control device computes the mapping through the behavior purpose recognition model, which can improve the calculation accuracy.
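  • The two forms of the mapping relationship A can be viewed as interchangeable implementations behind one query interface: a preconfigured table for speed, or a learned model for accuracy. The sketch below illustrates this trade-off under that assumption; the class names and the model's predict() call are hypothetical and not part of the embodiment.

```python
from typing import Dict, Optional, Protocol

class MappingRelationshipA(Protocol):
    """Hypothetical common interface: behavior indication key -> behavior purpose."""
    def infer_purpose(self, behavior_key: str) -> Optional[str]: ...

class MappingTableA:
    """Table form of mapping relationship A: fast lookup of preconfigured entries."""
    def __init__(self, table: Dict[str, str]) -> None:
        self._table = table

    def infer_purpose(self, behavior_key: str) -> Optional[str]:
        return self._table.get(behavior_key)

class BehaviorPurposeModelA:
    """Model form of mapping relationship A: a trained classifier (stubbed here)."""
    def __init__(self, model) -> None:
        self._model = model  # any object exposing predict(), assumed for illustration

    def infer_purpose(self, behavior_key: str) -> Optional[str]:
        return self._model.predict(behavior_key)
```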
  • the central control device may obtain the mapping relationship A in various ways, which is not limited in this embodiment of the present application.
  • the central control device may generate the mapping relationship A by itself.
  • the central control device may receive the mapping relationship A from other devices.
  • the central control device may preconfigure the behavior purpose recognition model.
  • the central control device may determine the first behavior purpose based on the first behavior indication data and reference data, where the reference data includes at least one of the following: driving state data, environment data, road condition data and weather data. The driving state data is used to indicate the driving state of the second terminal, the environment data is used to indicate the environment outside and/or inside the second terminal, and the road condition data is used to indicate the road conditions around the second terminal.
  • In the control method provided by this embodiment of the present application, the central control device determines the first behavior purpose by combining the reference data with the first behavior indication data, which can improve the accuracy of determining the first behavior purpose.
  • the central control device may obtain the reference data in various ways, which is not limited in this embodiment of the present application.
  • the central control device may receive the reference data from other devices.
  • the central control device may determine the reference data based on the image collected by the image collection device and/or the audio collection device.
  • the central control device determines the first behavior purpose based on the first behavior indication data, the reference data, and a preset mapping relationship B, where the mapping relationship B is used to indicate the correspondence between behavior indication data and behavior purposes under different reference data.
  • mapping relationship B can refer to the specific form and acquisition method of the mapping relationship A, and will not be repeated here to avoid repetition.
  • the central control device reminds the user of the first behavior purpose and/or provides further operation suggestions for the user in response to the first behavior purpose.
  • the central control device may remind the user in various ways, which are not limited in this embodiment of the present application.
  • the central control device may remind the purpose of the first behavior and/or the operation suggestion through prompt text.
  • In the control method provided by this embodiment of the present application, the central control device reminds the user of the first behavior purpose and/or the operation suggestion through the prompt text, which can give the user sufficient driving freedom and improve driving flexibility while improving driving safety.
  • the central control device may broadcast the first behavior purpose and/or the operation suggestion by voice.
  • the central control device broadcasts the first behavior purpose and/or the operation suggestion through voice, which can give the user enough driving freedom, and improve driving flexibility while improving driving safety . In addition, it is possible to avoid situations where the user does not see text reminders on the screen.
  • the central control device controls the second terminal to perform the first operation in response to the first behavior purpose.
  • the central control device controls the second terminal to perform the first operation, and can respond to the first behavior of the first terminal in time, thereby improving driving safety.
  • it can be applied to unmanned driving application scenarios.
  • the central control device may determine the first operation corresponding to the first behavior purpose based on the first behavior purpose.
  • the central control device may determine the first operation corresponding to the first behavior purpose based on the first behavior purpose in various ways, which is not limited in this embodiment of the present application.
  • the central control device may determine the first operation based on the first behavior purpose and a preset mapping relationship C, where the mapping relationship C is used to indicate the correspondence between behavior purpose and operation .
  • mapping relationship C can refer to the specific form and acquisition method of the mapping relationship A. To avoid repetition, details are not repeated here.
  • the central control device may send a control request to the user, where the control request is used to ask the user whether to perform the first operation; control the second terminal to perform the first operation after receiving a first instruction from the user; or stop controlling the second terminal to perform the first operation after receiving a second instruction from the user.
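  • A minimal sketch of this confirm-before-acting logic, assuming the user's reply arrives as a simple string; the callable parameters and reply values are illustrative assumptions.

```python
from typing import Callable

def handle_control_request(ask_user: Callable[[str], str],
                           execute: Callable[[str], None],
                           first_operation: str) -> bool:
    """Send a control request, then execute or stop based on the user's reply.

    Returns True if the second terminal was controlled to perform the operation.
    """
    reply = ask_user(f"Perform operation '{first_operation}'?")
    if reply == "confirm":       # first instruction from the user: go ahead
        execute(first_operation)
        return True
    return False                 # second instruction (or no confirmation): stop
```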
  • the central control device controls the second terminal according to the actual needs of the user, so as to improve the driving flexibility while improving the driving safety.
  • the central control device reminds the user of the first behavior purpose in response to the first behavior purpose, and controls the second terminal to perform the first operation.
  • the method may further include: the central control device may acquire a second operation actually performed by the second terminal within a preset period of time; and if the second operation is not exactly the same as the first operation, update the mapping relationship A and the mapping relationship B.
  • the central control device may send update information to the device that generates each mapping relationship (such as the mapping relationship A or the mapping relationship B), where the update information is used to indicate the correspondence between the first behavior purpose and the second operation.
  • the control method provided in the embodiment of the present application continuously updates each mapping relationship according to the actual situation, and can improve the accuracy of behavior purpose identification.
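  • A simplified sketch of this update step, assuming the mapping is held locally as a dictionary from behavior key to expected operation; in the embodiment the update information may instead be sent to the device that generated the mapping, which is not modelled here.

```python
from typing import Dict

def maybe_update_mapping(mapping: Dict[str, str], behavior_key: str,
                         predicted_operation: str, actual_operation: str) -> bool:
    """Rewrite a mapping entry when the operation the second terminal actually
    performed differs from the operation that was predicted for it."""
    if actual_operation != predicted_operation:
        mapping[behavior_key] = actual_operation  # learn the observed correspondence
        return True
    return False
```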
  • the embodiment of the present application further provides a control method, which may include: the network service device acquires first behavior indication data of the first terminal, where the first behavior indication data is used to indicate a first behavior of the first terminal; the network service device determines a first behavior purpose of the first behavior according to the first behavior indication data; the network service device sends indication information to the central control device, where the indication information is used to indicate the first behavior purpose; and, in response to the indication information, the central control device reminds the user of the first behavior purpose and/or controls the second terminal to perform the first operation.
  • the network service device may receive the first behavior indication data from the central control device.
  • In the control method provided by this embodiment of the present application, the central control device itself does not need to determine the first behavior purpose based on the first behavior indication data; this determination is instead performed by the network service device.
  • In this way, the computing capability requirements and the computation load of the central control device can be reduced.
  • Moreover, since the network service device is set on the cloud side, statistics and calculations can be performed based on big data, which can improve the accuracy of judging the first behavior purpose and thereby improve driving safety.
  • The embodiment of the present application also provides a control method, which may include: the central control device acquires first behavior indication data of the first terminal, where the first behavior indication data is used to indicate a first behavior of the first terminal; the central control device determines a first behavior purpose of the first behavior according to the first behavior indication data; the central control device sends first indication information to the network service device, where the first indication information is used to indicate the first behavior purpose; the network service device sends second indication information to the central control device based on the first behavior purpose indicated by the first indication information, where the second indication information is used to indicate reminding the user of the first behavior purpose and/or controlling the second terminal to perform the first operation; and, in response to the second indication information, the central control device reminds the user of the first behavior purpose and/or controls the second terminal to perform the first operation.
  • In the control method provided by this embodiment of the present application, the central control device itself does not need to determine the first operation based on the first behavior purpose; this determination is instead performed by the network service device.
  • In this way, the computing capability requirements and the computation load of the central control device can be reduced.
  • Moreover, since the network service device is set on the cloud side, statistics and calculations can be performed based on big data, which can improve the accuracy of determining the first operation and thereby improve driving safety.
  • the embodiment of the present application further provides a central control device, configured to execute the method implemented by the central control device in the foregoing aspects or any possible implementation manners of the aspects.
  • the central control device may include a unit for executing the method executed by the central control device in any possible implementation manner of the foregoing aspects or aspects.
  • the embodiment of the present application further provides a network service device, configured to execute the method implemented by the network service device in the foregoing aspects or any possible implementation manners of the aspects.
  • the network service device may include a unit for performing the method performed by the network service device in the above aspects or any possible implementation manners of the aspects.
  • the embodiment of the present application further provides a central control device, which includes: a memory, at least one processor, a transceiver, and instructions stored in the memory and executable on the processor. Further, the memory, the processor and the communication interface communicate with each other through an internal connection path. Executing the instruction by the at least one processor causes the central control device to implement the method executed by the central control device in any possible implementation of the above aspects or aspects.
  • the central control device described in the fourth aspect or the sixth aspect may be integrated into the terminal.
  • the embodiment of the present application further provides a network service device, which includes: a memory, at least one processor, a transceiver, and instructions stored in the memory and operable on the processor. Further, the memory, the processor and the communication interface communicate with each other through an internal connection path. Executing the instruction by the at least one processor causes the network service device to implement the method performed by the network service device in the above aspects or any possible implementation manners of the aspects.
  • the network service device described in the fifth aspect or the seventh aspect may be integrated into a server, such as a cloud server.
  • the present application further provides a computer-readable storage medium for storing a computer program, where the computer program includes instructions for implementing the methods described in the above aspects or any possible implementation thereof.
  • the present application further provides a computer program product including instructions, which, when run on a computer, cause the computer to implement the methods described in the above aspects or any possible implementation thereof.
  • the present application further provides a chip, including: an input interface, an output interface, and at least one processor.
  • the chip device further includes a memory.
  • the at least one processor is used to execute the code in the memory, and when the at least one processor executes the code, the chip implements the methods described in the above aspects or any possible implementation thereof.
  • FIG. 1 is a schematic diagram of an application scenario 100 to which a control method provided in an embodiment of the present application is applicable;
  • FIG. 2 is a schematic block diagram of a control system 200 to which the control method provided in the embodiment of the present application is applied;
  • FIG. 3 is a schematic flowchart of a control method 300 provided by an embodiment of the present application.
  • FIG. 4 is a schematic flowchart of a control method 400 provided by an embodiment of the present application.
  • FIG. 5 is a schematic block diagram of a control device 500 provided by an embodiment of the present application.
  • FIG. 6 is a schematic block diagram of a control device 600 provided by an embodiment of the present application.
  • FIG. 7 is a schematic block diagram of a chip 700 provided by an embodiment of the present application.
  • Fig. 1 shows a schematic diagram of an application scenario 100 to which a control method provided by an embodiment of the present application is applicable.
  • As shown in Fig. 1, the control method provided by the embodiment of the present application can be applied to an application scenario in unmanned driving, automatic driving, intelligent driving or networked driving in which at least one first terminal (the first terminal 110 shown in Fig. 1) reminds a second terminal 120 through a first behavior.
  • The first terminal 110 may be a motor vehicle (such as an unmanned vehicle, a smart vehicle, an electric vehicle, a digital vehicle, etc.), a drone, a rail vehicle, or a traffic light, etc., capable of emitting light and/or honking to the outside world.
  • The second terminal 120 may be a motor vehicle (such as an unmanned vehicle, a smart vehicle, an electric vehicle, a digital vehicle, etc.), a drone, a rail car, a bicycle, or a traffic light, etc., capable of receiving the light and/or horn sound of another terminal.
  • the distance between the first terminal 110 and the second terminal 120 may be smaller than a preset first threshold, that is, the first terminal 110 may be located within a circle centered on the second terminal 120 with the first threshold as its radius.
  • the first terminal 110 may be located near the second terminal 120; for example, the light of the first terminal 110 can be observed and/or the horn of the first terminal 110 can be heard at the location of the second terminal 120.
  • For example, when the first terminal 110 is driving behind the second terminal 120, if the first terminal 110 finds that the trunk of the second terminal 120 is not closed or the like, the first terminal 110 can remind the second terminal 120 by honking.
  • the first terminal 110 can remind the second terminal by turning on the left turn signal.
  • Vehicle 2 can detect the lights and/or horn sounds of external vehicles through various sensors installed on the vehicle, and remind the car owner of the light-flashing behavior and/or honking behavior of vehicle 1. However, during driving the owner of vehicle 2 may not notice the light and/or horn sound of vehicle 1 from outside (for example, the owner of vehicle 2 lowers his or her head to adjust the air conditioner and does not notice the lights of vehicle 1, or does not notice the horn of vehicle 1 because of the active noise reduction function in the car). In such cases, the car owner may not receive the reminder from vehicle 1, and therefore driving safety is poor.
  • It should be noted that this embodiment of the present application may also be applicable to an application scenario in which the first terminal 110 reminds the second terminal through other behaviors, for example, a speed-changing behavior (including speeding up or slowing down), a lane-changing behavior, or a stopping behavior of the first terminal 110; for the application scenario of reminding the second terminal 120 through such behaviors, reference may be made to the application scenario of reminding through the light-flashing behavior and/or honking behavior.
  • Fig. 2 shows a schematic block diagram of a control system 200 to which the control method provided by the embodiment of the present application is applied.
  • the control system 200 may include an image acquisition device 210 and/or an audio acquisition device 220 , and a central control device 230 , wherein the central control device 230 may communicate with the image acquisition device 210 and the audio acquisition device 220 respectively.
  • the image acquisition device 210 is configured to acquire a first image, the first image includes the light of the first terminal, that is, the first image can indicate the flashing behavior of the first terminal.
  • the image acquisition device 210 may be a camera.
  • the first image may be a frame of image or an image stream, which is not limited in this embodiment of the present application.
  • the first image is an image stream
  • a dynamic flashing process of lights can be obtained.
  • the audio collection device 220 is used to collect the first audio, which includes the horn sound of the first terminal; that is, the first audio can indicate the honking behavior of the first terminal.
  • the audio collection device 220 may be an audio sensor.
  • the central control device 230 is used to: obtain the first behavior indication data of the first terminal, where the first behavior indication data is used to indicate the first behavior of the first terminal, and the first behavior may include a light-flashing behavior and/or a honking behavior; determine a first behavior purpose of the first behavior based on the first behavior indication data; and, in response to the first behavior purpose, control the second terminal to perform a first operation.
  • the central control device 230 may be a central control system.
  • In this way, the central control device can comprehensively analyze and judge the first behavior purpose of the first terminal based on the light-flashing behavior and/or honking behavior of the first terminal.
  • the first operation may include at least one of a vehicle speed control operation (such as a braking operation, a start operation, an acceleration operation or a deceleration operation, etc.), a light-flashing operation, a honking operation, and a lane-change operation, which is not limited in this embodiment of the present application.
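  • Purely as an illustration, the candidate first operations listed above could be enumerated as follows; the names are assumptions made here and not terms from the embodiment.

```python
from enum import Enum, auto

class FirstOperation(Enum):
    """Illustrative enumeration of the operations mentioned above."""
    BRAKE = auto()        # vehicle speed control: braking
    START = auto()        # vehicle speed control: starting
    ACCELERATE = auto()   # vehicle speed control: speeding up
    DECELERATE = auto()   # vehicle speed control: slowing down
    FLASH_LIGHTS = auto()
    HONK_HORN = auto()
    CHANGE_LANE = auto()
```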
  • the embodiment of the present application does not limit the specific forms of the image acquisition device 210 , the audio acquisition device 220 and the central control device 230 .
  • the image acquisition device 210, the audio acquisition device 220, and the central control device 230 may be three independent devices, and these three devices may be respectively installed or integrated in the second terminal.
  • Alternatively, the image acquisition device 210 and the audio acquisition device 220 may be integrated into one device, the central control device 230 is an independent device, and this device and the central control device 230 are respectively installed or integrated in the second terminal.
  • Alternatively, the image acquisition device 210 and the central control device 230 may be integrated into one device, the audio acquisition device 220 is an independent device, and this device and the audio acquisition device 220 are respectively installed or integrated in the second terminal.
  • Alternatively, the audio collection device 220 and the central control device 230 may be integrated into one device, the image collection device 210 is an independent device, and this device and the image collection device 210 are respectively installed or integrated in the second terminal.
  • Regardless of how the image acquisition device 210, the audio acquisition device 220, and the central control device 230 are deployed, for the sake of clarity, the following description uses the image acquisition device, the audio acquisition device, and the central control device as examples.
  • the communication between the central control device 230 and the image acquisition device 210 (or the audio acquisition device 220) may be performed in various ways, which is not limited in this embodiment of the present application.
  • the central control device 230 may communicate with the image acquisition device 210 (or the audio acquisition device 220 ) in a wired manner, which is not limited in this embodiment of the present application.
  • the above-mentioned wired manner may be to realize communication through a data line connection or through an internal bus connection.
  • the central control device 230 may communicate with the image collection device 210 (or the audio collection device 220 ) in a wireless manner.
  • the above-mentioned wireless method may be to realize communication through a communication network
  • the communication network may be a local area network, or a wide area network accessed through a relay device, or include both a local area network and a wide area network.
  • For example, the communication network may be a wireless fidelity (Wi-Fi) hotspot network, a Wi-Fi peer-to-peer (P2P) network, a Bluetooth network, a ZigBee network, a near field communication (NFC) network, or a possible future general short-distance communication network.
  • For another example, the communication network may be a third-generation mobile communication technology (3G) network, a fourth-generation mobile communication technology (4G) network, a fifth-generation mobile communication technology (5G) network, a public land mobile network (PLMN), or the Internet, which is not limited in this embodiment of the present application.
  • In the control method provided by this embodiment of the present application, the central control device can determine the first behavior purpose of the first terminal based on the first behavior indication data and, in response to the first behavior purpose, control the second terminal to perform the first operation. That is to say, the central control device on the second terminal can judge the first behavior purpose (or intention) of the first terminal based on the first behavior indication data of the first terminal and give timely feedback based on that purpose (or intention), which can improve driving safety.
  • control system 200 may further include a network service device 240, and the network service device 240 may communicate with the image collection device 210, the audio collection device 220 and the central control device 230 respectively.
  • the network service apparatus 240 may be a computer device having communication and computing capabilities.
  • the network service device 240 is used to: obtain the first behavior indication data; determine the first behavior purpose based on the first behavior indication data; and send indication information, where the indication information is used to instruct the second terminal to perform the first operation.
  • the central control device 230 is configured to control the second terminal to perform the first operation in response to the indication information.
  • the network service device 240 may be set locally or on the cloud side, which is not limited in this embodiment of the present application.
  • For example, when the network service device 240 is set on the cloud side, the network service device 240 can communicate with the image acquisition device 210 (or the audio acquisition device 220 or the central control device 230) in a wireless manner.
  • the network service device 240 may be a server on the cloud side.
  • For example, when the network service device 240 is configured locally, the network service device 240 may be installed in or integrated into the second terminal as an independent device.
  • Alternatively, when the network service device 240 is configured locally, the network service device 240 may be integrated with at least one of the image acquisition device 210, the audio acquisition device 220, and the central control device 230 into one device, and that device is installed in or integrated into the second terminal.
  • In this way, the central control device 230 itself does not need to determine the first behavior purpose based on the first behavior indication data; this determination is performed by the network service device 240 instead. This can reduce the capability requirements and computation load of the central control device 230, and, when the network service device 240 is set on the cloud side, statistics and calculations can be performed based on big data, which can improve the accuracy of judging the first behavior purpose and thereby improve driving safety.
  • Fig. 3 shows a schematic flowchart of a control method 300 provided by an embodiment of the present application. As shown in Fig. 3, the method 300 can be applied to the application scenario 100 shown in Fig. 1 and to the control system 200 shown in Fig. 2, and may be executed by the central control device in the control system 200.
  • S310: the central control device acquires first behavior indication data of the first terminal, where the first behavior indication data is used to indicate a first behavior of the first terminal, and the first behavior includes a light-flashing behavior and/or a honking behavior.
  • first terminals there may be one or more first terminals, which is not limited in this embodiment of the present application.
  • this embodiment of the present application does not limit specific content included in the first behavior indication data.
  • the first behavior indication data may include at least one of a first image, first audio, image feature data, and audio feature data.
  • the first image contains the light of the first terminal.
  • the first audio contains the horn sound of the first terminal.
  • the image feature data is used to indicate the position, color, brightness, flashing frequency and/or number of flashes of the light.
  • the audio feature data is used to indicate the direction, duration, frequency and/or number of honks of the horn sound.
  • the central control device may acquire the first behavior indication data in various ways, which is not limited in this embodiment of the present application.
  • the central control device may receive the first image collected by the image collection device.
  • the central control device may receive the first audio collected by an audio collection device.
  • the image feature data is obtained by performing feature extraction on the first image
  • the audio feature data is obtained by performing feature extraction on the first audio.
  • feature extraction methods refer to existing related technologies. This embodiment of the present application does not limit it.
  • the central control device may perform feature extraction on the first image to obtain the image feature data.
  • the central control device may perform feature extraction on the first audio to obtain the audio feature data.
  • S320: the central control device determines a first behavior purpose of the first behavior according to the first behavior indication data.
  • the central control device may determine the first behavior purpose based on the first behavior indication data and a preset mapping relationship A, where the mapping relationship A is used to indicate the correspondence between behavior indication data and behavior purposes.
  • the embodiment of the present application does not limit the specific form of the mapping relationship A.
  • the mapping relationship A may be a mapping table A.
  • For example, the mapping relationship A may be a mapping table as shown in Table 1 below, in which the first row indicates the data items.

  Table 1

  | Behavior indication data | Behavior purpose |
  | --- | --- |
  | Fog lights flash three times quickly and the horn sounds twice | Reminder of low visibility ahead |
  | The left turn signal flashes on and off alternately 4 times and the horn sounds once | Reminder of overtaking |
  | Long horn blast for 10 seconds | Reminder of an oncoming vehicle on the curve |
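  • As an illustration only, Table 1 could be held as a plain dictionary and queried directly; the shorthand keys below are assumptions standing in for the behavior indication data.

```python
from typing import Optional

# Hypothetical encoding of Table 1: behavior indication data -> behavior purpose
MAPPING_TABLE_A = {
    "fog_lights_3_quick_flashes_2_honks": "low visibility ahead",
    "left_turn_signal_4_flashes_1_honk": "overtaking",
    "long_honk_10s": "oncoming vehicle on the curve",
}

def lookup_behavior_purpose(behavior_key: str) -> Optional[str]:
    """Return the behavior purpose for a behavior indication key, if preconfigured."""
    return MAPPING_TABLE_A.get(behavior_key)

print(lookup_behavior_purpose("long_honk_10s"))  # -> "oncoming vehicle on the curve"
```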
  • the mapping relationship A may be a behavior purpose recognition model A.
  • the central control device may obtain the mapping relationship A in various ways, which is not limited in this embodiment of the present application.
  • the central control device may generate the mapping relationship A by itself.
  • the central control device can train the behavior purpose recognition model A.
  • the central control device may receive the mapping relationship A from other devices.
  • the central control device may receive the behavior purpose recognition model A from the model training device, and the model training device is used for training to obtain the behavior purpose recognition model.
  • the central control device may receive the mapping table A (or the behavior purpose identification model A) from the cloud network service device.
  • the central control device may preconfigure the behavior purpose identification model.
  • S320 may include: the central control device determines the first behavior purpose based on the first behavior indication data and reference data, where the reference data includes at least one of the following: driving state data, environment data, road condition data and weather data. The driving state data is used to indicate the driving state of the second terminal, the environment data is used to indicate the environment outside and/or inside the second terminal, and the road condition data is used to indicate the road conditions around the second terminal.
  • For example, the driving state data may include the current vehicle speed, the current position, whether the vehicle is braking, whether it is accelerating or decelerating, and so on.
  • the external environment data may include points of interest (POI) around the second terminal.
  • the internal environment data may include whether the music in the car is turned on, the volume of the music, how far the car windows are open, and so on.
  • the road condition data may include whether it is currently at a crossroad, whether there is a curve ahead, whether it is currently on a one-way street, and so on.
  • weather data may include sunny days, rainy days, and snowy days.
  • the central control device may obtain the reference data in various ways, which is not limited in this embodiment of the present application.
  • the central control device may receive the reference data from other devices.
  • the central control device can receive the reference data from the navigation device.
  • the central control device may determine the reference data based on the image collected by the image collection device and/or the audio collection device.
  • the central control device determines the first behavior purpose based on the first behavior indication data, the reference data, and a preset mapping relationship B, where the mapping relationship B is used to indicate the correspondence between behavior indication data and behavior purposes under different reference data.
  • mapping relationship B can refer to the specific form and acquisition method of the mapping relationship A, and will not be repeated here to avoid repetition.
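  • A sketch of how the mapping relationship B might extend the lookup with a coarse reference-data context; the context labels and table entries here are illustrative assumptions rather than values given in the embodiment.

```python
from typing import Optional

# Hypothetical mapping B: (reference-data context, behavior key) -> behavior purpose
MAPPING_TABLE_B = {
    ("foggy_weather", "fog_lights_3_quick_flashes_2_honks"): "low visibility ahead",
    ("curve_ahead", "long_honk_10s"): "oncoming vehicle on the curve",
    ("adjacent_lane_occupied", "left_turn_signal_4_flashes_1_honk"): "overtaking",
}

def lookup_purpose_with_reference(context: str, behavior_key: str) -> Optional[str]:
    """Combine reference data (context) with behavior indication data for the lookup."""
    return MAPPING_TABLE_B.get((context, behavior_key))
```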
  • S330: in response to the first behavior purpose, the central control device reminds the user and/or controls the second terminal to perform the first operation.
  • the central control device reminds the user of the first behavior purpose and/or provides further operation suggestions for the user in response to the first behavior purpose.
  • the central control device may remind the user in various ways, which are not limited in this embodiment of the present application.
  • the central control device may broadcast the first behavior purpose and/or the operation suggestion by voice.
  • the central control device may remind the purpose of the first behavior and/or the operation suggestion through prompt text.
  • the central control device controls the second terminal to perform the first operation in response to the first behavior purpose.
  • the central control device may determine the first operation corresponding to the first behavior purpose based on the first behavior purpose.
  • the central control device may determine the first operation corresponding to the first behavior purpose based on the first behavior purpose in various ways, which is not limited in this embodiment of the present application.
  • the central control device may determine the first operation based on the first behavior purpose and a preset mapping relationship C, where the mapping relationship C is used to indicate the correspondence between behavior purpose and operation .
  • mapping relationship C can refer to the specific form and acquisition method of the mapping relationship A, and will not be repeated here to avoid repetition.
  • For example, the mapping relationship C may be a mapping table as shown in Table 2 below.

  Table 2

  | First behavior purpose | First operation |
  | --- | --- |
  | Reminder of low visibility ahead | Slow down and flash the fog lights three times quickly |
  | Reminder of overtaking | Keep driving in the current lane |
  | Reminder of an oncoming vehicle on the curve | Slow down and honk the horn for 10 seconds |
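  • Continuing the illustration, Table 2 could likewise be encoded as a dictionary so that the purpose determined from the first behavior directly selects an operation; the wording of the values is paraphrased and hypothetical.

```python
from typing import Optional

# Hypothetical encoding of Table 2: behavior purpose -> first operation
MAPPING_TABLE_C = {
    "low visibility ahead": "slow down and flash the fog lights three times quickly",
    "overtaking": "keep driving in the current lane",
    "oncoming vehicle on the curve": "slow down and honk the horn for 10 seconds",
}

def lookup_first_operation(behavior_purpose: str) -> Optional[str]:
    """Return the first operation associated with a behavior purpose, if any."""
    return MAPPING_TABLE_C.get(behavior_purpose)

print(lookup_first_operation("oncoming vehicle on the curve"))
```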
  • the central control device may send a control request to the user, where the control request is used to ask the user whether to perform the first operation; control the second terminal to perform the first operation after receiving a first instruction from the user; or stop controlling the second terminal to perform the first operation after receiving a second instruction from the user.
  • the central control device reminds the user of the first behavior purpose in response to the first behavior purpose, and controls the second terminal to perform the first operation.
  • the central control device may acquire a second operation actually performed by the second terminal within a preset period of time; if the second operation is not exactly the same as the first operation, update the mapping relationship A and the mapping relationship B.
  • the central control device may send update information to the device that generates each mapping relationship (such as the mapping relationship A or the mapping relationship B), where the update information is used to indicate the correspondence between the first behavior purpose and the second operation.
  • the foregoing method 300 may also be executed by a network service device.
  • For example, when S320 and/or S330 are executed by the network service device, the computing capability requirements and computation load of the central control device can be reduced.
  • FIG. 4 shows a schematic flowchart of a control method 400 provided by an embodiment of the present application.
  • The method 400 may be applicable to the application scenario 100 shown in FIG. 1 and may be applied to the control system 200 shown in FIG. 2.
  • the network service device acquires first behavior indication data of the first terminal, where the first behavior indication data is used to indicate a first behavior of the first terminal.
  • the network service device may receive the first behavior indication data from the central control device.
  • the network service device determines a first behavior purpose of the first behavior according to the first behavior indication data.
  • the network service device sends indication information to the central control device, where the indication information is used to indicate the purpose of the first action; correspondingly, the central control device receives the indication information from the network service device.
  • the central control device reminds the user of the purpose of the first behavior and/or controls the second terminal to perform the first operation.
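  • A rough sketch of the exchange in the method 400 with the transport layer abstracted away; the JSON message format and the callback parameters are assumptions made for illustration only.

```python
import json
from typing import Callable, Optional

def network_service_side(behavior_indication_data: dict,
                         infer_purpose: Callable[[dict], Optional[str]]) -> str:
    """Cloud side: receive the first behavior indication data, determine the first
    behavior purpose, and return indication information for the central control device."""
    purpose = infer_purpose(behavior_indication_data)
    return json.dumps({"first_behavior_purpose": purpose})

def central_control_side(indication_information: str,
                         remind_user: Callable[[str], None],
                         perform_operation: Callable[[str], None]) -> None:
    """Vehicle side: on receiving the indication information, remind the user of the
    purpose and/or control the second terminal to perform the first operation."""
    purpose = json.loads(indication_information).get("first_behavior_purpose")
    if purpose:
        remind_user(f"Nearby vehicle intent: {purpose}")
        perform_operation(purpose)
```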
  • The control method provided by the embodiments of the present application has been described above with reference to FIG. 3 and FIG. 4; the control device for executing the control method is described below with reference to FIG. 5 to FIG. 7.
  • The control device may be the central control device described in the embodiment of the method 300 above, capable of executing the method implemented by the central control device in the method 300; or the control device may be the network service device described in the embodiment of the method 400 above, capable of executing the method implemented by the network service device in the method 400.
  • the central control device or the network service device includes corresponding hardware and/or software modules for performing various functions.
  • the present application can be implemented in the form of hardware or in the form of a combination of hardware and computer software. Whether a certain function is executed by hardware or by computer software driving hardware depends on the specific application and design constraints of the technical solution. Those skilled in the art may use different methods to implement the described functions for each specific application in combination with the embodiments, but such implementation should not be regarded as exceeding the scope of the present application.
  • the functional modules of the central control device or the network service device can be divided according to the above method example, for example, each functional module can be divided corresponding to each function, or two or more functions can be integrated into one processing module .
  • the above integrated modules may be implemented in the form of hardware. It should be noted that the division of modules in this embodiment is schematic, and is only a logical function division, and there may be other division methods in actual implementation.
  • Fig. 5 shows a possible schematic composition diagram of the control device (such as a terminal) or the network service device (such as a server) involved in the above embodiments. As shown in Fig. 5, the apparatus 500 may include a transceiver unit 510 and a processing unit 520.
  • The processing unit 520 may control the transceiver unit 510 to implement the method performed by the network service device or the terminal in the foregoing method embodiments, and/or other processes used in the technologies described herein.
  • the apparatus 500 may include a processing unit, a storage unit and a communication unit.
  • the processing unit may be used to control and manage the actions of the apparatus 500, for example, may be used to support the apparatus 500 to execute the steps performed by the above-mentioned units.
  • the storage unit can be used to support the device 500 to execute and store program codes and data, and the like.
  • the communication unit may be used to support communication of the apparatus 500 with other devices.
  • the processing unit may be a processor or a controller. It can implement or execute the various illustrative logical blocks, modules and circuits described in connection with the present disclosure.
  • the processor can also be a combination for implementing computing functions, such as a combination of one or more microprocessors, or a combination of a digital signal processor (DSP) and a microprocessor, and the like.
  • the storage unit may be a memory.
  • the communication unit may be a device that interacts with other electronic devices, such as a radio frequency circuit, a Bluetooth chip, and a Wi-Fi chip.
  • the control device or the network service device involved in this embodiment may be the apparatus 600 having the structure shown in FIG. 6.
  • the apparatus 600 includes a processor 610 and a transceiver 620, and the processor 610 and the transceiver 620 communicate with each other through an internal connection path.
  • Related functions implemented by the processing unit 520 in FIG. 5 may be implemented by the processor 610
  • related functions implemented by the transceiver unit 510 may be implemented by the processor 610 controlling the transceiver 620 .
  • the apparatus 600 may further include a memory 630, and the processor 610, the transceiver 620, and the memory 630 communicate with each other through an internal connection path.
  • the relevant functions implemented by the storage unit described in FIG. 5 may be implemented by the memory 630 .
  • the embodiment of the present application also provides a computer storage medium, the computer storage medium stores computer instructions, and when the computer instructions are run on the electronic device, the electronic device executes the above related method steps to implement the control method in the above embodiment.
  • the embodiment of the present application also provides a computer program product.
  • When the computer program product is run on a computer, it causes the computer to execute the above-mentioned related steps, so as to implement the control method in the above embodiments.
  • The embodiment of the present application also provides an apparatus, which may specifically be a chip, a component or a module. The apparatus may include a processor and a memory that are connected, where the memory is used to store computer-executable instructions, and when the apparatus runs, the processor can execute the computer-executable instructions stored in the memory, so that the chip executes the control methods in the above method embodiments.
  • FIG. 7 shows a schematic structural diagram of a chip 700 .
  • Chip 700 includes one or more processors 710 and interface circuitry 720 .
  • Optionally, the chip 700 may further include a bus 730.
  • the processor 710 may be an integrated circuit chip and has a signal processing capability. In the implementation process, each step of the above method may be implemented by an integrated logic circuit of hardware in the processor 710 or instructions in the form of software.
  • the aforementioned processor 710 may be a general processor, DSP, ASIC, FPGA or other programmable logic devices, discrete gate or transistor logic devices, or discrete hardware components. Various methods and steps disclosed in the embodiments of the present application may be implemented or executed.
  • a general-purpose processor may be a microprocessor, or the processor may be any conventional processor, or the like.
  • the interface circuit 720 can be used for sending or receiving data, instructions or information.
  • the processor 710 can process the data, instructions or other information received by the interface circuit 720 , and can send the processing completion information through the interface circuit 720 .
  • the chip further includes a memory, which may include a read-only memory and a random access memory, and provides operation instructions and data to the processor.
  • Optionally, a portion of the memory may also include a non-volatile random access memory (NVRAM).
  • the memory stores executable software modules or data structures
  • the processor can execute corresponding operations by calling operation instructions stored in the memory (the operation instructions can be stored in the operating system).
  • the chip may be used in the control device or the network service device involved in the embodiments of the present application.
  • the interface circuit 720 may be used to output the execution result of the processor 710 .
  • processor 710 and the interface circuit 720 can be realized by hardware design, software design, or a combination of software and hardware, which is not limited here.
  • the embodiment of the present application also provides a terminal, the terminal includes the device 500 described in FIG. 5 , the device 600 described in FIG. 6 , or the chip 700 described in FIG. 7 .
  • the embodiment of the present application also provides a server, where the server includes the apparatus 500 described in FIG. 5, the apparatus 600 described in FIG. 6, or the chip 700 described in FIG. 7.
  • The control device, computer storage medium, computer program product, chip, terminal and server provided in the embodiments are all used to execute the corresponding methods provided above; therefore, for the beneficial effects they can achieve, reference may be made to the beneficial effects of the corresponding methods provided above, and details are not repeated here.
  • It should be understood that the sequence numbers of the above processes do not imply an order of execution; the execution order of the processes should be determined by their functions and internal logic, and should not constitute any limitation on the implementation of the embodiments of the present application.
  • the disclosed systems, devices and methods may be implemented in other ways.
  • the device embodiments described above are only illustrative.
  • the division of the units is only a logical function division. In actual implementation, there may be other division methods.
  • For example, multiple units or components may be combined or integrated into another system, or some features may be ignored or not implemented.
  • the mutual coupling or direct coupling or communication connection shown or discussed may be through some interfaces, and the indirect coupling or communication connection of devices or units may be in electrical, mechanical or other forms.
  • the units described as separate components may or may not be physically separated, and the components shown as units may or may not be physical units, that is, they may be located in one place, or may be distributed to multiple network units. Part or all of the units can be selected according to actual needs to achieve the purpose of the solution of this embodiment.
  • each functional unit in each embodiment of the present application may be integrated into one processing unit, each unit may exist separately physically, or two or more units may be integrated into one unit.
  • if the functions described above are implemented in the form of software functional units and sold or used as independent products, they may be stored in a computer-readable storage medium.
  • the technical solution of the present application, in essence, or the part that contributes over the prior art, or a part of the technical solution, may be embodied in the form of a software product; the computer software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) to execute all or part of the steps of the methods described in the various embodiments of the present application.
  • the aforementioned storage medium includes various media that can store program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk or an optical disc.

Landscapes

  • Life Sciences & Earth Sciences (AREA)
  • Atmospheric Sciences (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Traffic Control Systems (AREA)

Abstract

本申请实施例提供的控制方法、装置和系统,能够提高行驶的安全性。该方法可以包括:获取第一终端的第一行为指示数据,该第一行为指示数据用于指示该第一终端的第一行为;根据该第一行为指示数据,确定该第一行为的第一行为目的;基于该第一行为目的,对用户进行提醒和/或控制第二终端执行第一操作。

Description

控制方法、装置和系统
本申请要求于2021年03月18日递交的申请号为202110291392.1、申请名称为“控制方法、装置和系统”的中国专利申请的优先权,其全部内容通过引用结合在本申请中。
技术领域
本申请涉及智能驾驶技术领域,并且更具体地,涉及智能驾驶技术领域中的控制方法、装置和系统。
背景技术
随着社会的发展和科技的进步,智能汽车正在逐步进入人们的日常生活。由于智能汽车在行驶过程中面对的行驶环境(如周边车辆行为的变化)具有多样性和多变性,因此,智能汽车上安装有越来越多的传感器,以感知周边车辆行为的变化。
灯光和鸣笛音等都是车辆间交流的重要工具,现有的智能汽车可以通过车上安装的传感器感知灯光和/或鸣笛音,判断出周围车辆行为的变化,并基于周围车辆行为的变化辅助驾驶,从而提高驾驶的安全性。
然而,在行驶过程的某些情况下,如未注意到灯光和/或鸣笛音的情况下,车辆的灯光和鸣笛音可能无法起到交流的作用,这样会给智能驾驶和无人驾驶带来了一定的安全风险,甚至可能发生交通事故,因此,会给智能汽车的行驶安全性造成较大影响。
发明内容
本申请提供了一种控制方法、装置和系统,能够提高行驶的安全性。
第一方面,本申请实施例提供了一种控制方法,该方法包括:获取第一终端的第一行为指示数据,该第一行为指示数据用于指示该第一终端的第一行为;根据该第一行为指示数据,确定该第一行为的第一行为目的;基于该第一行为目的,对用户进行提醒和/或控制第二终端执行第一操作。
在一种可能的实现方式中,该方法可以适用于无人驾驶、自动驾驶、智能驾驶或网联驾驶中至少一个第一终端通过第一行为对第二终端进行提醒的应用场景。
可选地,本申请实施例对该第一行为不作限定。
在一种可能的实现方式中,该第一行为可以包括闪灯行为和/或鸣笛行为。
在另一种可能的实现方式中,该第一行为还可以包括变速行为和/或变道行为。
在一种可能的实现方式中,该方法可以应用于控制系统,该控制系统可以包括该中控装置、音频采集装置和视频采集装置。可选地,该控制系统还可以包括网络服务装置。
需要说明的是,该控制方法可以由中控装置或该网络服务装置执行,本申请实施例对此不做限定。第一方面中仅以该控制方法由中控装置执行为例进行介绍,该控制方法由该网络服务装置执行的过程可以参考由中控装置执行的过程,本申请实施例不再赘述。
本申请实施例提供的控制方法,中控装置可以基于第一行为指示数据,确定第一终端 的第一行为目的;并响应于该第一行为目的,控制第二终端执行第一操作,也就是说,第二终端上的中控装置可以基于第一终端的第一行为指示数据,判断第一终端的第一行为目的(或意图),并基于该第一行为目的(或意图)给出及时的反馈,这样能够提高行驶的安全性。
可选地,本申请实施例对第一行为指示数据包括的具体内容不做限定。
在一种可能的实现方式中,该第一行为指示数据可以包括第一图像、第一音频、图像特征数据和音频特征数据中的至少一项,该第一图像中包含该第一终端的灯光,该第一音频中包含该第一终端的鸣笛音,该图像特征数据用于指示该灯光的位置、颜色、亮度、闪光频率和/或闪光次数,该音频特征数据用于指示该鸣笛音的方位、长短、频率和/或次数。
本申请实施例提供的控制方法,音频特征数据和图像特征数据能够更全面的描述该第一终端的灯光和鸣笛音的情况,能够提高确定第一行为的精确性。
可选地,该中控装置可以通过多种方式获取该第一行为指示数据,本申请实施例对此不作限定。
在一种可能的实现方式中,该中控装置可以接收图像采集装置采集的该第一图像。
在另一种可能的实现方式中,该中控装置可以接收音频采集装置采集的该第一音频。
需要说明的是,该图像特征数据是对该第一图像进行特征提取得到的,该音频特征数据是对该第一音频进行特征提取得到的,具体的特征提取方式可以参考现有的相关技术,本申请实施例对此不作限定。
在一种可能的实现方式中,该中控装置可以对该第一图像进行特征提取得到该图像特征数据。
在另一种可能的实现方式中,该中控装置可以对该第一音频进行特征提取得到该音频特征数据。
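A minimal sketch of how the first behavior indication data described above could be organized in software is given below. The container and field names (LightFeatureData, HornFeatureData, BehaviorIndicationData) are illustrative assumptions made for this sketch, not terms defined by the application.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class LightFeatureData:
    """Image feature data describing the first terminal's lights (assumed layout)."""
    position: str              # e.g. "left_turn_signal", "fog_lamp"
    color: str                 # e.g. "amber", "white"
    brightness: float          # normalized 0.0 - 1.0
    flash_frequency_hz: float
    flash_count: int

@dataclass
class HornFeatureData:
    """Audio feature data describing the first terminal's horn sound (assumed layout)."""
    direction_deg: float       # bearing of the sound relative to the second terminal
    duration_s: float
    frequency_hz: float
    count: int

@dataclass
class BehaviorIndicationData:
    """First behavior indication data: any subset of the four items."""
    first_image: Optional[bytes] = None       # frame(s) containing the lights
    first_audio: Optional[bytes] = None       # audio containing the horn sound
    light_features: Optional[LightFeatureData] = None
    horn_features: Optional[HornFeatureData] = None

# Usage example with invented values.
sample = BehaviorIndicationData(
    light_features=LightFeatureData("fog_lamp", "white", 0.9, 2.0, 3),
    horn_features=HornFeatureData(direction_deg=180.0, duration_s=0.3,
                                  frequency_hz=420.0, count=2),
)
print(sample.light_features.flash_count)   # -> 3
```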
在一种可能的实现方式中,该中控装置可以基于该第一行为指示数据和预设的映射关系A,确定该第一行为目的,该映射关系A用于指示行为指示数据与行为目的之间的对应关系。
可选地,本申请实施例对该映射关系A的具体形式不做限定。
在一种可能的实现方式中,该映射关系A可以为映射表A。
本申请实施例提供的控制方法,该中控装置通过映射表查找得到映射关系,能够提高计算速度。
在另一种可能的实现方式中,该映射关系A可以为行为目的识别模型A。
本申请实施例提供的控制方法,该中控装置通过行为目的识别模型计算得到映射关系,能够提高计算精确度。
可选地,该中控装置可以通过多种方式获取该映射关系A,本申请实施例对此不作限定。
在一种可能的实现方式中,该中控装置可以自己生成该映射关系A。
在另一种可能的实现方式中,该中控装置可以接收来自其他装置的该映射关系A。
在又一种可能的实现方式中,该中控装置可以预先配置该行为目的识别模型。
可选地,该中控装置可以基于该第一行为指示数据和参考数据,确定该第一行为目的,该参考数据包括以下数据中的至少一项:行驶状态数据、环境数据、路况数据和天气数据, 该行驶状态数据用于指示该第二终端的行驶状态,该环境数据用于指示该第二终端外部和/或内部的环境,该路况数据用于指示该第二终端周围的路况。
本申请实施例提供的控制方法,该中控装置结合参考数据和第一行为指数数据,确定第一行为目的,能够提高确定第一行为目的的精确性。
可选地,中控装置可以通过多种方式获取该参考数据,本申请实施例对此不作限定。
在一种可能的实现方式中,该中控装置可以接收来自其它装置的该参考数据。
在一种可能的实现方式中,该中控装置可以基于图像采集装置采集的图像和/或音频采集装置确定该参考数据。
可选地,该中控装置基于该第一行为指示数据、参考数据和预设的映射关系B,确定该第一行为目的,该映射关系B用于指示不同的参考数据下行为指示数据与行为目的之间的对应关系。
需要说明的是,该映射关系B的具体形式和获取方式可以参考映射关系A的具体形式和获取方式,为避免重复,此处不再赘述。
在一种可能的实现方式中,该中控装置响应于该第一行为目的,提醒用户该第一行为目的和/或为用户提供的进一步操作建议。
可选地,该中控装置可以通过多种方式对该用户进行提醒,本申请实施例对此不做限定。
在一种可能的实现方式中,该中控装置可以通过提示文字提醒该第一行为目的和/或该操作建议。
本申请实施例提供的控制方法,该中控装置通过提示文字提醒该第一行为目的和/或该操作建议,能够给用户足够的驾驶自由度,在提高驾驶的安全性的同时提高驾驶的灵活性。
在另一种可能的实现方式中,该中控装置可以语音播报该第一行为目的和/或该操作建议。
本申请实施例提供的控制方法,该中控装置通过语音播报该第一行为目的和/或该操作建议,能够给用户足够的驾驶自由度,在提高驾驶的安全性的同时提高驾驶的灵活性。此外,可以避免由于用户没看到屏幕上的文字提醒的情况。
在另一种可能的实现方式中,该中控装置响应于该第一行为目的,控制第二终端执行第一操作。
本申请实施例提供的控制方法,该中控装置控制第二终端执行该第一操作,能够及时对第一终端的第一行为作出反应,从而提高驾驶安全性。此外,可以适用于无人驾驶的应用场景。
可选地,在该响应于该第一行为目的,控制第二终端执行第一操作之前,该中控装置可以基于该第一行为目的,确定该第一行为目的对应的该第一操作。
可选地,该中控装置可以通过多种方式,基于该第一行为目的,确定该第一行为目的对应的该第一操作,本申请实施例对此不作限定。
在一种可能的实现方式中,该中控装置可以基于该第一行为目的和预设的映射关系C,确定该第一操作,该映射关系C用于指示行为目的和操作之间的对应关系。
需要说明的是，该映射关系C的具体形式和获取方式可以参考映射关系A的具体形式和获取方式，为避免重复，此处不再赘述。
可选地,在该中控装置控制该第二终端执行该第一操作之前,该中控装置可以向用户发送控制请求,该控制请求用于询问用户是否执行该第一操作;当获取到用户的第一指令后控制该第二终端执行该第一操作;或当获取到用户的第二指令后终止控制该第二终端执行该第一操作。
本申请实施例提供的控制方法,该中控装置结合用户的实际需求控制该第二终端,在提高驾驶的安全性的同时提高驾驶的灵活性。
在又一种可能的实现方式中,该中控装置响应于该第一行为目的,提醒用户该第一行为目的,并控制第二终端执行第一操作。
可选地,该方法还可以包括:该中控装置可以获取该第二终端在预设时间段内实际执行的第二操作;若该第二操作与该第一操作不完全相同,则更新该映射关系A和映射关系B。
可选地,该中控装置可以向生成各映射关系(如映射关系A或映射关系B)的装置发送更新信息,该更新信息用于指示该第一行为目的和该第二操作的对应关系。
本申请实施例提供的控制方法,不断根据实际情况更新各映射关系,能够提升行为目的识别的准确率。
第二方面,本申请实施例还提供一种控制方法,该方法可以包括:网络服务装置获取第一终端的第一行为指示数据,该第一行为指示数据用于指示该第一终端的第一行为;该网络服务装置根据该第一行为指示数据,确定该第一行为的第一行为目的;该网络服务装置向该中控装置发送指示信息,该指示信息用于指示该第一行为目的;该中控装置响应于该第一指示信息,提醒用户该第一行为目的和/或控制第二终端执行第一操作。
在一种可能的实现方式中,该网络服务装置可以接收来自该中控装置的该第一行为指示数据。
在一种可能的实现方式中,该中控装置响应于该第一指示信息,提醒用户该第一行为目的和/或控制第二终端执行第一操作,包括:该中控装置响应于该第一指示信息,确定该第一操作,并控制该第二终端执行该第一操作。
需要说明的是,第二方面中未介绍的步骤可以参考第一方面中相应步骤的介绍。
采用本申请实施例提供的控制系统,中控装置自身无需基于该第一行为指示数据确定该第一行为,转为由网络服务装置执行,这样,能够降低中控装置的计算能力和计算量,并且网络服务装置设置在云侧时可以基于大数据进行统计和计算,能够提高判断第一行为目的的准确性,从而提高行驶的安全性。
第三方面,本申请实施例还提供一种控制方法,该方法可以包括:中控装置获取第一终端的第一行为指示数据,该第一行为指示数据用于指示该第一终端的第一行为;该中控装置根据该第一行为指示数据,确定该第一行为的第一行为目的;该中控装置向网络服务装置发送第一指示信息,该第一指示信息用于指示该第一行为目的;该网络服务装置基于该第一指示信息指示的该第一行为目的,向该中控装置发送第二指示信息,该第二指示信息用于指示提醒用户该第一行为目的和/或控制第二终端执行第一操作;该中控装置响应于该第二指示信息,提醒用户该第一行为目的和/或控制该第二终端执行第一操作。
需要说明的是,第二方面中未介绍的步骤可以参考第一方面中相应步骤的介绍。
采用本申请实施例提供的控制系统,中控装置自身无需基于该第一行为目的确定该第一操作,转为由网络服务装置执行,这样,能够降低中控装置的计算能力和计算量,并且网络服务装置设置在云侧时可以基于大数据进行统计和计算,能够提高判断第一操作的准确性,从而提高行驶的安全性。
第四方面,本申请实施例还提供一种中控装置,用于执行上述各方面或各方面的任意可能的实现方式中由中控装置实现的方法。具体地,该中控装置可以包括用于执行上述各方面或各方面的任意可能的实现方式中由中控装置执行的方法的单元。
第五方面,本申请实施例还提供一种网络服务装置,用于执行上述各方面或各方面的任意可能的实现方式中由网络服务装置实现的方法。具体地,该网络服务装置可以包括用于执行上述各方面或各方面的任意可能的实现方式中由网络服务装置执行的方法的单元。
第六方面,本申请实施例还提供一种中控装置,该装置包括:存储器、至少一个处理器、收发器及存储在该存储器上并可在该处理器上运行的指令。进一步,该存储器、该处理器以及该通信接口之间通过内部连接通路互相通信。所述至少一个处理器执行该指令使得该中控装置实现上述各方面或各方面的任意可能的实现方式中由中控装置执行的方法。
可选地,该第四方面或该第六方面中所述的中控装置可以集成于终端。
第七方面,本申请实施例还提供一种网络服务装置,该装置包括:存储器、至少一个处理器、收发器及存储在该存储器上并可在该处理器上运行的指令。进一步,该存储器、该处理器以及该通信接口之间通过内部连接通路互相通信。所述至少一个处理器执行该指令使得该网络服务装置实现上述各方面或各方面的任意可能的实现方式中由网络服务装置执行的方法。
可选地,该第五方面或该第七方面中所述的网络服务装置可以集成于服务器,如云服务器。
第八方面,本申请还提供一种计算机可读存储介质,用于存储计算机程序,该计算机程序包括用于实现上述各方面或其任意可能的实现方式中所述的方法。
第九方面,本申请还提供一种包含指令的计算机程序产品,当其在计算机上运行时,使得计算机实现上述各个方面或其任意可能的实现方式中所述的方法。
第十方面,本申请还提供一种芯片,包括:输入接口、输出接口、至少一个处理器。可选的,所述芯片装置还包括存储器。该至少一个处理器用于执行该存储器中的代码,当该至少一个处理器执行该代码时,该芯片实现上述各方面或其任意可能的实现方式中所述的方法。
附图说明
图1为本申请实施例提供的控制方法所适用的应用场景100的示意图;
图2为本申请实施例提供的控制方法所应用的控制系统200的示意性框图;
图3为本申请实施例提供的控制方法300的示意性流程图;
图4为本申请实施例提供的控制方法400的示意性流程图;
图5为本申请实施例提供的控制装置500的示意性框图;
图6为本申请实施例提供的控制装置600的示意性框图;
图7为本申请实施例提供的芯片700的示意性框图。
具体实施方式
下面将结合本申请实施例中的附图,对本申请实施例中的技术方案进行描述。
图1示出了本申请实施例提供的控制方法所适用的应用场景100的示意图。如图1所示,本申请实施例提供的控制方法可以适用于无人驾驶、自动驾驶、智能驾驶或网联驾驶中至少一个第一终端(图1中示出了第一终端110)通过闪灯行为和/或鸣笛行为对第二终端120进行提醒的应用场景。
可选地,第一终端110可以为机动车辆(如无人车、智能车、电动车、数字汽车等)、无人机、轨道车、或交通灯等,能够向外界发出灯光和/或鸣笛音的终端。
可选地,第二终端120可以为机动车辆(如无人车、智能车、电动车、数字汽车等)、无人机、轨道车、自行车或交通灯等,能够接收到外界发出的灯光和/或鸣笛音的终端。
在一种可能的实现方式中,第一终端110与第二终端120之间的距离可以小于预设的第一阈值,即第一终端110可以位于以第二终端120为中心该第一阈值为半径的范围内。
换句话说,第一终端110可以位于第二终端120附近,如在第二终端120所在位置可以观察到第一终端110的灯光和/或可以接收到第一终端110的鸣笛音。
例如:第一终端110在第二终端120的后方行驶,若第一终端110发现第二终端120的后备箱没有关上等异常情况,第一终端110可以通过鸣笛对第二终端120进行提醒。
又如:在某些禁止鸣笛的路段,若第一终端110想要超过第二终端120,第一终端110可以通过打开左转向灯对第二终端进行提醒。
以第一终端110为车辆1,第二终端120为车辆2为例,在现有技术中,车辆2可以通过车上安装的多种传感器对外界车辆的灯光和/或鸣笛音进行检测,并向车主提醒车辆1的闪灯行为和/或鸣笛音行为,然而,在驾驶过程中存在车辆2的车主没有注意到来自外界的车辆1的灯光和/或鸣笛音(如车辆2的车主低头调节空调没有注意到车辆1的灯光,或者又如由于车内的主动降噪功能导致车主没有注意到车辆1的鸣笛音)的情况,在上述情况下,可能导致车主无法及时获取车辆1的提醒,因此,行驶的安全性较差。
需要说明的是,本申请实施例中仅以第一终端110通过闪灯行为和/或鸣笛行为对第二终端120进行提醒的应用场景,但本申请实施例不限于此。
可选地,本申请实施例还可以适用于第一终端110通过其他行为对第二终端进行提醒的应用场景,例如,第一终端110通过变速(包括提速或降速)行为、变道行为、停靠行为等对第二终端120进行提醒的应用场景,通过其他行为提醒的应用场景可以参考通过闪灯行为和/或鸣笛行为提醒的应用场景。
图2示出了本申请实施例提供的控制方法所应用的控制系统200的示意性框图。如图2所示,控制系统200可以包括图像采集装置210和/或音频采集装置220,以及中控装置230,其中,中控装置230可以分别与图像采集装置210和音频采集装置220通信。
图像采集装置210用于采集第一图像,该第一图像中包含第一终端的灯光,即该第一图像能够指示该第一终端的闪灯行为。
例如:图像采集装置210可以为摄像头。
可选地,该第一图像可以为一帧图像或图像流,本申请实施例对此不作限定。
需要说明的是,当该第一图像为图像流时,可以得到灯光动态的闪灯过程。
音频采集装置220用于采集第一音频，该第一音频中包含该第一终端的鸣笛音，即该第一音频能够指示该第一终端的鸣笛行为；
例如:音频采集装置220可以为音频传感器。
中控装置230用于获取该第一终端的第一行为指示数据,该第一行为指示数据用于指示该第一终端的第一行为,该第一行为可以包括闪灯行为和/或鸣笛行为;基于该第一行为指示数据,确定该第一行为的第一行为目的;响应于该第一行为目的,控制第二终端执行第一操作。
例如:中控装置230可以为中控系统。
需要说明的是,由于第一终端的灯光的位置、颜色、闪灯频率和/或闪灯次数不同所代表的闪灯行为目的可能不同,类似地,第一终端的鸣笛音的方位、长短、频率和/或次数不同所代表的鸣笛行为也可能不同,因此,中控装置可以基于该第一终端的闪灯行为和/或鸣笛行为综合分析和判断第一终端的第一行为目的。
可选地,该第一操作可以包括车速控制操作(如制动操作、启动操作、加速操作或减速操作等)、闪灯操作、鸣笛操作和变道操作中的至少一项,本申请实施例对此不作限定。
可选地,本申请实施例对图像采集装置210、音频采集装置220和中控装置230的具体形态不做限定。
在一种可能的实现方式中,图像采集装置210、音频采集装置220和中控装置230可以是三个独立的装置,这三个装置可以分别安装或集成在第二终端中。
在另一种可能的实现方式中,图像采集装置210和音频采集装置220可以集成在一个设备中,中控装置230是一个独立的装置,该设备和中控装置230分别安装或集成在第二终端中。
在又一种可能的实现方式中,图像采集装置210和中控装置230可以集成在一个设备中,音频采集装置220是一个独立的装置,该设备和音频采集装置220分别安装或集成在第二终端中。
在又一种可能的实现方式中,音频采集装置220和中控装置230可以集成在一个设备中,图像采集装置210是一个独立的装置,该设备和图像采集装置210分别安装或集成在第二终端中。
需要说明的是,无论图像采集装置210、音频采集装置220和中控装置230的具体形态如何,为清楚起见,在下文的描述中,都以图像采集装置、音频采集装置和中控装置来进行描述。
可选地,中控装置230与图像采集装置210(或音频采集装置220)之间可以通过多种方式进行通信,本申请实施例对此不作限定。
在一种可能的实现方式中,中控装置230可以通过有线方式与图像采集装置210(或音频采集装置220)通信,本申请实施例对此不作限定。
例如:上述有线方式可以为通过数据线连接、或通过内部总线连接实现通信。
在另一种可能的实现方式中,中控装置230可以通过无线方式与图像采集装置210(或音频采集装置220)进行通信。
例如:上述无线方式可以为通过通信网络实现通信,该通信网络可以是局域网,也可以是通过中继(relay)设备转接的广域网,或者包括局域网和广域网。当该通信网络为局域网时,示例性的,该通信网络可以是无线保真(wireless fidelity,Wifi)热点网络、wifi 对等(peer-to-peer,P2P)网络、蓝牙(bluetooth)网络、zigbee网络、近场通信(near field communication,NFC)网或者未来可能的通用短距离通信网络等。当该通信网络为广域网时,示例性的,该通信网络可以是第三代移动通信技术(3rd-generation wireless telephone technology,3G)网络、第四代移动通信技术(the 4th generation mobile communication technology,4G)网络、第五代移动通信技术(5th-generation mobile communication technology,5G)网络、公共陆地移动网络(public land mobile network,PLMN)或因特网(Internet)等,本申请实施例对此不作限定。
本申请实施例提供的控制系统,中控装置可以基于第一行为指示数据,确定第一终端的第一行为目的;并响应于该第一行为目的,控制第二终端执行第一操作,也就是说,第二终端上的中控装置可以基于第一终端的第一行为指示数据,判断第一终端的第一行为目的(或意图),并基于该第一行为目的(或意图)给出及时的反馈,这样能够提高行驶的安全性。
可选地,控制系统200还可以包括网络服务装置240,网络服务装置240可以分别与图像采集装置210、音频采集装置220和中控装置230通信。
在一种可能的实现方式中,网络服务装置240可以为具有通信和计算能力的计算机设备。
在一种可能的实现方式中,网络服务装置240用于获取该第一行为指示数据;基于该第一行为指示数据,确定该第一行为目的;基于该第一行为目的,向中控装置230发送指示信息,该指示信息用于指示第二终端执行该第一操作。相应地,中控装置230用于响应于该指示信息,控制第二终端执行该第一操作。
可选地,网络服务装置240可以设置在本地或者云侧,本申请实施例对此不作限定。
在一种可能的实现方式中,网络服务装置240设置在云侧时,网络服务装置240可以通过无线方式与图像采集装置210(或音频采集装置220或中控装置230)通信。
例如:该网络服务装置240可以为云侧的服务器。
在另一种可能的实现方式中,网络服务装置240设置在本地时,网络服务装置240可以是一个独立的装置安装或集成在第二终端中。
在又一种可能的实现方式中,网络服务装置240设置在本地时,网络服务装置240可以与图像采集装置210、音频采集装置220和中控装置230中的至少一种集成在一个设备中,该设备安装或集成在第二终端中。
采用本申请实施例提供的控制系统,中控装置230自身无需基于该第一行为指示数据确定该第一行为目的,转为由网络服务装置240执行,这样,能够降低中控装置230的能力要求和计算量,并且网络服务装置240设置在云侧时可以进行大数据统计和计算,能够提高判断第一行为目的的准确性,从而提高行驶的安全性。
图3示出了本申请实施例提供的控制方法300的示意性流程图,如图3所示,该方法300可以适用于图1所示的应用场景100,可以应用于如图2所示的控制系统200,并可以由控制系统200中的中控装置执行。
S310,中控装置获取第一终端的第一行为指示数据,该第一行为指示数据用于指示该第一终端的第一行为,该第一行为包括闪灯行为和/或鸣笛行为。
可选地,该第一终端的数量可以为一个或多个,本申请实施例对此不作限定。
可选地,本申请实施例对第一行为指示数据包括的具体内容不做限定。
在一种可能的实现方式中,该第一行为指示数据可以包括第一图像、第一音频、图像特征数据和音频特征数据中的至少一项,该第一图像中包含该第一终端的灯光,该第一音频中包含该第一终端的鸣笛音,该图像特征数据用于指示该灯光的位置、颜色、亮度、闪光频率和/或闪光次数,该音频特征数据用于指示该鸣笛音的方位、长短、频率和/或次数。
可选地,该中控装置可以通过多种方式获取该第一行为指示数据,本申请实施例对此不作限定。
在一种可能的实现方式中,该中控装置可以接收图像采集装置采集的该第一图像。
在另一种可能的实现方式中,该中控装置可以接收音频采集装置采集的该第一音频。
需要说明的是,该图像特征数据是对该第一图像进行特征提取得到的,该音频特征数据是对该第一音频进行特征提取得到的,具体的特征提取方式可以参考现有的相关技术,本申请实施例对此不作限定。
在一种可能的实现方式中,该中控装置可以对该第一图像进行特征提取得到该图像特征数据。
在另一种可能的实现方式中,该中控装置可以对该第一音频进行特征提取得到该音频特征数据。
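To make the feature-extraction step concrete, the sketch below derives flash count and flash frequency from a per-frame lamp-brightness sequence, and horn-burst durations from an audio amplitude envelope. The thresholds, sampling rates, and function names are assumptions for illustration; the application does not prescribe a particular extraction algorithm.

```python
from typing import List, Tuple

def count_flashes(brightness: List[float], frame_rate_hz: float,
                  on_threshold: float = 0.6) -> Tuple[int, float]:
    """Count lamp flashes and estimate flash frequency from per-frame brightness.

    brightness: normalized lamp brightness per video frame (0.0 - 1.0).
    Returns (flash_count, flash_frequency_hz). Illustrative only.
    """
    flashes = 0
    previously_on = False
    for value in brightness:
        currently_on = value >= on_threshold
        if currently_on and not previously_on:   # rising edge = one flash
            flashes += 1
        previously_on = currently_on
    duration_s = len(brightness) / frame_rate_hz if frame_rate_hz > 0 else 0.0
    frequency = flashes / duration_s if duration_s > 0 else 0.0
    return flashes, frequency

def horn_segments(envelope: List[float], sample_rate_hz: float,
                  on_threshold: float = 0.3) -> List[float]:
    """Return the duration (seconds) of each horn burst found in an amplitude
    envelope sampled at sample_rate_hz. Illustrative only."""
    durations = []
    run = 0
    for value in envelope:
        if value >= on_threshold:
            run += 1
        elif run:
            durations.append(run / sample_rate_hz)
            run = 0
    if run:
        durations.append(run / sample_rate_hz)
    return durations

# Example: three short flashes at 1.5 Hz and two horn bursts.
flashes, freq = count_flashes([0, 1, 0, 1, 0, 1, 0, 0], frame_rate_hz=4.0)
bursts = horn_segments([0, 0.8, 0.8, 0, 0, 0.9, 0.9, 0.9, 0], sample_rate_hz=10.0)
print(flashes, round(freq, 2), bursts)   # 3 1.5 [0.2, 0.3]
```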
S320,该中控装置根据该第一行为指示数据,确定该第一行为的第一行为目的。
在一种可能的实现方式中,该中控装置可以基于该第一行为指示数据和预设的映射关系A,确定该第一行为目的,该映射关系A用于指示行为指示数据与行为目的之间的对应关系。
可选地,本申请实施例对该映射关系A的具体形式不做限定。
在一种可能的实现方式中,该映射关系A可以为映射表A。
例如:该映射关系A可以为如下表一所示的映射表。
表一
第一行为指示数据 第一行为目的
雾灯快速闪灯三下,鸣笛两下 提醒前方可见度低
左转向灯明暗交替闪灯4下,鸣笛一下 提醒即将超车
鸣笛长音10秒 提醒弯道来车
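As an illustration only, mapping table A (Table 1 above) could be held as a plain dictionary keyed on a normalized textual signature of the flash and horn behavior; the signature strings and the English renderings of the purposes are assumptions made for this sketch.

```python
# Mapping table A: behavior-indication signature -> behavior purpose
# (entries mirror Table 1 in the description).
MAPPING_TABLE_A = {
    ("fog_lamp_fast_flash_x3", "horn_x2"): "warn: low visibility ahead",
    ("left_turn_signal_alt_flash_x4", "horn_x1"): "warn: about to overtake",
    (None, "horn_long_10s"): "warn: oncoming vehicle at the bend",
}

def lookup_purpose(light_signature, horn_signature):
    """Return the behavior purpose for a (light, horn) signature pair,
    or None if the pair is not in the table."""
    return MAPPING_TABLE_A.get((light_signature, horn_signature))

print(lookup_purpose("fog_lamp_fast_flash_x3", "horn_x2"))
# -> 'warn: low visibility ahead'
```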
在另一种可能的实现方式中,该映射关系A可以为行为目的识别模型A。
可选地,该中控装置可以通过多种方式获取该映射关系A,本申请实施例对此不作限定。
在一种可能的实现方式中,该中控装置可以自己生成该映射关系A。
例如:该中控装置可以训练该行为目的识别模型A。
在另一种可能的实现方式中,该中控装置可以接收来自其他装置的该映射关系A。
例如:该中控装置可以接收来自模型训练装置的该行为目的识别模型A,该模型训练装置用于训练得到该行为目的识别模型。
又例如:该中控装置可以接收来自云网络服务装置的该映射表A(或该行为目的识别模型A)。
在又一种可能的实现方式中,该中控装置可以预先配置该行为目的识别模型。
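Where a behavior-purpose recognition model is used instead of a lookup table, it could be any trained classifier over the extracted features. The stand-in below is a deliberately simple nearest-neighbor "model" in pure Python, meant only to show the interface (feature vector in, behavior purpose out); it is not the model described in the application, and its reference examples are invented.

```python
import math

# Feature vector: (flash_count, flash_frequency_hz, horn_count, horn_duration_s)
TRAINING_EXAMPLES = [
    ((3, 2.0, 2, 0.3), "warn: low visibility ahead"),
    ((4, 1.0, 1, 0.2), "warn: about to overtake"),
    ((0, 0.0, 1, 10.0), "warn: oncoming vehicle at the bend"),
]

def predict_purpose(features):
    """Return the purpose of the nearest training example (Euclidean distance)."""
    best_purpose, best_dist = None, math.inf
    for example, purpose in TRAINING_EXAMPLES:
        dist = math.dist(features, example)
        if dist < best_dist:
            best_purpose, best_dist = purpose, dist
    return best_purpose

print(predict_purpose((3, 2.1, 2, 0.25)))   # -> 'warn: low visibility ahead'
```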
可选地,S320可以包括:该中控装置基于该第一行为指示数据和参考数据,确定该第一行为目的,该参考数据包括以下数据中的至少一项:行驶状态数据、环境数据、路况数据和天气数据,该行驶状态数据用于指示该第二终端的行驶状态,该环境数据用于指示该第二终端外部和/或内部的环境,该路况数据用于指示该第二终端周围的路况。
例如:行驶状态数据可以包括当前的车速、当前的位置、是否制动、是否加速和减速等。
又如:外部环境数据可以包括该第二终端周围的兴趣点(point of interest,POI),内部环境数据可以包括车内音乐是否开启、音乐声音大小、车窗的打开情况等。
又如:路况数据可以包括当前是否处于十字路口、前方是否有弯道和当前是否处于单行道等。
又如:天气数据可以包括晴天、雨天和雪天等。
可选地,中控装置可以通过多种方式获取该参考数据,本申请实施例对此不作限定。
在一种可能的实现方式中,该中控装置可以接收来自其它装置的该参考数据。
例如:该中控装置可以接收来自导航装置的该参考数据。
在一种可能的实现方式中,该中控装置可以基于图像采集装置采集的图像和/或音频采集装置确定该参考数据。
可选地,该中控装置基于该第一行为指示数据、参考数据和预设的映射关系B,确定该第一行为目的,该映射关系B用于指示不同的参考数据下行为指示数据与行为目的之间的对应关系。
需要说明的是,该映射关系B的具体形式和获取方式可以参考映射关系A的具体形式和获取方式,为避免重复,此处不再赘述。
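Mapping relationship B conditions the same lookup on reference data (driving state, environment, road conditions, weather). A sketch under that reading, with the key layout and field values chosen purely for illustration:

```python
# Mapping table B: (reference context, behavior signature) -> behavior purpose.
MAPPING_TABLE_B = {
    # On a no-horn road section, a left-turn-signal flash alone signals overtaking.
    (("road", "no_horn_zone"), ("left_turn_signal_alt_flash_x4", None)):
        "warn: about to overtake",
    # In fog, fast fog-lamp flashing plus the horn signals low visibility ahead.
    (("weather", "fog"), ("fog_lamp_fast_flash_x3", "horn_x2")):
        "warn: low visibility ahead",
}

def lookup_purpose_with_context(context, light_signature, horn_signature):
    """context: e.g. ("weather", "fog") or ("road", "no_horn_zone")."""
    return MAPPING_TABLE_B.get((context, (light_signature, horn_signature)))

print(lookup_purpose_with_context(("road", "no_horn_zone"),
                                  "left_turn_signal_alt_flash_x4", None))
# -> 'warn: about to overtake'
```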
S330,该中控装置响应于该第一行为目的,对用户进行提醒和/或控制第二终端执行第一操作。
在一种可能的实现方式中,该中控装置响应于该第一行为目的,提醒用户该第一行为目的和/或为用户提供的进一步操作建议。
可选地,该中控装置可以通过多种方式对该用户进行提醒,本申请实施例对此不做限定。
在一种可能的实现方式中,该中控装置可以语音播报该第一行为目的和/或该操作建议。
在另一种可能的实现方式中,该中控装置可以通过提示文字提醒该第一行为目的和/或该操作建议。
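The reminder step might look like the sketch below; show_text and speak stand in for whatever display and text-to-speech interfaces the head unit actually exposes, and are assumptions here rather than real APIs.

```python
def show_text(message: str) -> None:
    # Placeholder for the in-vehicle display API.
    print(f"[SCREEN] {message}")

def speak(message: str) -> None:
    # Placeholder for the in-vehicle text-to-speech API.
    print(f"[VOICE]  {message}")

def remind_user(purpose: str, suggestion: str = "", use_voice: bool = True) -> None:
    """Remind the user of the first behavior purpose and, optionally, a suggested
    follow-up operation."""
    message = purpose if not suggestion else f"{purpose}. Suggested action: {suggestion}"
    show_text(message)
    if use_voice:   # a voice broadcast avoids the user missing the on-screen prompt
        speak(message)

remind_user("Vehicle behind warns: about to overtake", "keep to the current lane")
```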
在另一种可能的实现方式中,该中控装置响应于该第一行为目的,控制第二终端执行第一操作。
可选地,在该响应于该第一行为目的,控制第二终端执行第一操作之前,该中控装置可以基于该第一行为目的,确定该第一行为目的对应的该第一操作。
可选地,该中控装置可以通过多种方式,基于该第一行为目的,确定该第一行为目的对应的该第一操作,本申请实施例对此不作限定。
在一种可能的实现方式中,该中控装置可以基于该第一行为目的和预设的映射关系C,确定该第一操作,该映射关系C用于指示行为目的和操作之间的对应关系。
需要说明的是,该映射关系C的具体形式和获取方式可以参考映射关系A的具体形式和获取方式,为避免重复,此处不再赘述。
例如:该映射关系C可以为如下表二所示的映射表。
表二
第一行为目的 第一操作
提醒前方可见度低 减速,雾灯快速闪三下
提醒即将超车 在当前车道行驶
提醒弯道来车 减速,鸣笛长音10秒
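Mapping table C (Table 2 above) can be represented the same way, from behavior purpose to the first operation the second terminal should perform; the operation encodings below are illustrative assumptions.

```python
# Mapping table C: behavior purpose -> first operation (mirrors Table 2).
MAPPING_TABLE_C = {
    "warn: low visibility ahead": ["decelerate", "fog_lamp_fast_flash_x3"],
    "warn: about to overtake": ["keep_current_lane"],
    "warn: oncoming vehicle at the bend": ["decelerate", "horn_long_10s"],
}

def operation_for_purpose(purpose):
    """Return the list of actions making up the first operation, or [] if unknown."""
    return MAPPING_TABLE_C.get(purpose, [])

print(operation_for_purpose("warn: oncoming vehicle at the bend"))
# -> ['decelerate', 'horn_long_10s']
```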
可选地,在该中控装置控制该第二终端执行该第一操作之前,该中控装置可以向用户发送控制请求,该控制请求用于询问用户是否执行该第一操作;当获取到用户的第一指令后控制该第二终端执行该第一操作;或当获取到用户的第二指令后终止控制该第二终端执行该第一操作。
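The confirmation step described above, in which the central control device asks the user before executing the first operation, could be sketched as follows; ask_user and execute are placeholders for the real HMI and vehicle-control interfaces.

```python
from typing import Callable, List

def confirm_and_execute(operation: List[str],
                        ask_user: Callable[[str], bool],
                        execute: Callable[[str], None]) -> bool:
    """Ask the user whether to execute the operation; run it only on approval.

    Returns True if the operation was executed, False if it was aborted.
    """
    approved = ask_user(f"Execute operation {operation}?")
    if not approved:            # second instruction: terminate the control action
        return False
    for action in operation:    # first instruction: carry out each action in order
        execute(action)
    return True

# Example with trivial stand-ins for the HMI and actuator interfaces.
executed = confirm_and_execute(["decelerate", "horn_long_10s"],
                               ask_user=lambda prompt: True,
                               execute=lambda action: print("executing:", action))
print("executed:", executed)
```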
在又一种可能的实现方式中,该中控装置响应于该第一行为目的,提醒用户该第一行为目的,并控制第二终端执行第一操作。
可选地,在S330之后,该中控装置可以获取该第二终端在预设时间段内实际执行的第二操作;若该第二操作与该第一操作不完全相同,则更新该映射关系A和映射关系B。
可选地,该中控装置可以向生成各映射关系(如映射关系A或映射关系B)的装置发送更新信息,该更新信息用于指示该第一行为目的和该第二操作的对应关系,以便于该装置更新各映射关系,这样能够不断提升行为目的识别的准确率。
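The feedback loop compares the suggested first operation with the second operation the terminal actually performed within the preset time window, and reports any difference back to whichever device maintains the mappings. A sketch, with the update-message format chosen for illustration:

```python
from typing import Dict, List, Optional

def build_update_info(purpose: str,
                      suggested_op: List[str],
                      actual_op: List[str]) -> Optional[Dict[str, object]]:
    """If the actually executed operation differs from the suggested one,
    build the update information (purpose -> observed operation); else None."""
    if actual_op == suggested_op:
        return None
    return {"behavior_purpose": purpose, "observed_operation": actual_op}

update = build_update_info("warn: about to overtake",
                           suggested_op=["keep_current_lane"],
                           actual_op=["decelerate", "keep_current_lane"])
if update is not None:
    # In the real system this would be sent to the device that generated
    # mapping A / mapping B so it can refine the mapping.
    print("send update:", update)
```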
可选地,上述方法300也可以由网络服务装置执行。其中,S320和/或S330由网络服务装置执行,可以降低中控装置的计算能力和计算量。
在一种可能的实现方式中,图4示出了本申请实施例提供的控制方法400的示意性流程图,如图4所示,该方法400可以适用于图1所示的应用场景100,可以应用于如图2所示的控制系统200。
S410,网络服务装置获取第一终端的第一行为指示数据,该第一行为指示数据用于指示该第一终端的第一行为。
在一种可能的实现方式中,该网络服务装置可以接收来自中控装置的该第一行为指示数据。
S420,该网络服务装置根据该第一行为指示数据,确定该第一行为的第一行为目的。
S430,该网络服务装置向中控装置发送指示信息,该指示信息用于指示该第一行为目的;相应地,该中控装置接收来自该网络服务装置的该指示信息。
S440,该中控装置响应于该指示信息,提醒用户该第一行为目的和/或控制该第二终端执行第一操作。
需要说明的是,S410~S440中未介绍的部分可以参考S310~S330中的相应介绍,为避免重复,此处不再赘述。
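Method 400 moves the purpose determination to the network service device. The sketch below imitates that split with two plain functions exchanging dictionaries; in a real deployment these would be messages over whatever link connects the terminal to the (possibly cloud-side) server, and all names and the toy decision rule are assumptions.

```python
# --- network service device side (S410-S430) --------------------------------
def network_service_determine_purpose(indication_data: dict) -> dict:
    """Determine the first behavior purpose from the behavior indication data
    and return indication information for the central control device."""
    if indication_data.get("horn") == "horn_long_10s":
        purpose = "warn: oncoming vehicle at the bend"
    else:
        purpose = "unknown"
    return {"first_behavior_purpose": purpose}

# --- central control device side (S440) --------------------------------------
def central_control_handle(indication_info: dict) -> None:
    purpose = indication_info["first_behavior_purpose"]
    print("remind user:", purpose)   # and/or control the second terminal

# One round trip: upload indication data, receive the purpose, act on it.
reply = network_service_determine_purpose({"light": None, "horn": "horn_long_10s"})
central_control_handle(reply)
```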
上面结合图3和图4介绍了本申请实施例提供控制方法,下面将结合图5至图7介绍用于执行上述控制方法的控制装置。
需要说明的是，控制装置可以为上述方法300实施例中所述的中控装置，能够执行上述方法300中由中控装置所实现的方法；或者，控制装置可以为上述方法400实施例中所述的网络服务装置，能够执行上述方法400中由网络服务装置所实现的方法。
可以理解的是,中控装置或网络服务装置为了实现上述功能,其包含了执行各个功能相应的硬件和/或软件模块。结合本文中所公开的实施例描述的各示例的算法步骤,本申请能够以硬件或硬件和计算机软件的结合形式来实现。某个功能究竟以硬件还是计算机软件驱动硬件的方式来执行,取决于技术方案的特定应用和设计约束条件。本领域技术人员可以结合实施例对每个特定的应用来使用不同方法来实现所描述的功能,但是这种实现不应认为超出本申请的范围。
本实施例可以根据上述方法示例对中控装置或网络服务装置进行功能模块的划分,例如,可以对应各个功能划分各个功能模块,也可以将两个或两个以上的功能集成在一个处理模块中。上述集成的模块可以采用硬件的形式实现。需要说明的是,本实施例中对模块的划分是示意性的,仅仅为一种逻辑功能划分,实际实现时可以有另外的划分方式。
在采用对应各个功能划分各个功能模块的情况下,图5示出了上述实施例中涉及的控制装置(如终端)或网络服务装置(如服务器)的一种可能的组成示意图,如图5所示,该装置500可以包括:收发单元510和处理单元520。
其中，处理单元520可以控制收发单元510实现上述方法300或方法400实施例中由中控装置或网络服务装置执行的方法，和/或用于本文所描述的技术的其他过程。
需要说明的是,上述方法实施例涉及的各步骤的所有相关内容均可以援引到对应功能模块的功能描述,在此不再赘述。
在采用集成的单元的情况下,装置500可以包括处理单元、存储单元和通信单元。其中,处理单元可以用于对装置500的动作进行控制管理,例如,可以用于支持装置500执行上述各个单元执行的步骤。存储单元可以用于支持装置500执行存储程序代码和数据等。通信单元可以用于支持装置500与其他设备的通信。
其中,处理单元可以是处理器或控制器。其可以实现或执行结合本申请公开内容所描述的各种示例性的逻辑方框,模块和电路。处理器也可以是实现计算功能的组合,例如包含一个或多个微处理器组合,数字信号处理(digital signal processing,DSP)和微处理器的组合等等。存储单元可以是存储器。通信单元具体可以为射频电路、蓝牙芯片、Wi-Fi芯片等与其他电子设备交互的设备。
在一种可能的实现方式中,本实施例所涉及的控制装置或网络服务装置可以为具有图6所示结构的装置600,该装置600可以是终端的结构示意图,也可以是服务器的结构示意图,该装置600包括处理器610和收发器620,该处理器610和收发器620通过内部连接通路互相通信。图5中的处理单元520所实现的相关功能可以由处理器610来实现,收发单元510所实现的相关功能可以由处理器610控制收发器620来实现。
可选地,该装置600还可以包括存储器630,该处理器610、该收发器620和该存储器630通过内部连接通路互相通信。图5中所述的存储单元所实现的相关功能可以由存储器630来实现。
本申请实施例还提供一种计算机存储介质,该计算机存储介质中存储有计算机指令,当该计算机指令在电子设备上运行时,使得电子设备执行上述相关方法步骤实现上述实施例中的控制方法。
本申请实施例还提供了一种计算机程序产品,当该计算机程序产品在计算机上运行 时,使得计算机执行上述相关步骤,以实现上述实施例中的控制方法。
另外,本申请实施例还提供一种装置,这个装置具体可以是芯片,组件或模块,该装置可包括相连的处理器和存储器;其中,存储器用于存储计算机执行指令,当装置运行时,处理器可执行存储器存储的计算机执行指令,以使芯片执行上述各方法实施例中的控制方法。
图7示出了一种芯片700的结构示意图。芯片700包括一个或多个处理器710以及接口电路720。可选的,所述芯片700还可以包含总线730。其中:
处理器710可能是一种集成电路芯片,具有信号的处理能力。在实现过程中,上述方法的各步骤可以通过处理器710中的硬件的集成逻辑电路或者软件形式的指令完成。上述的处理器710可以是通用处理器、DSP、ASIC、FPGA或者其它可编程逻辑器件、分立门或者晶体管逻辑器件、分立硬件组件。可以实现或者执行本申请实施例中的公开的各方法、步骤。通用处理器可以是微处理器或者该处理器也可以是任何常规的处理器等。
接口电路720可以用于数据、指令或者信息的发送或者接收,处理器710可以利用接口电路720接收的数据、指令或者其它信息,进行加工,可以将加工完成信息通过接口电路720发送出去。
可选的,芯片还包括存储器,存储器可以包括只读存储器和随机存取存储器,并向处理器提供操作指令和数据。存储器的一部分还可以包括非易失性随机存取存储器
(non-volatile random access memory,NVRAM)。
可选的,存储器存储了可执行软件模块或者数据结构,处理器可以通过调用存储器存储的操作指令(该操作指令可存储在操作系统中),执行相应的操作。
可选的,芯片可以使用在本申请实施例涉及的控制装置或接入控制装置中。可选的,接口电路720可用于输出处理器710的执行结果。关于本申请的一个或多个实施例提供的控制方法可参考前述各个实施例,这里不再赘述。
需要说明的,处理器710、接口电路720各自对应的功能既可以通过硬件设计实现,也可以通过软件设计来实现,还可以通过软硬件结合的方式来实现,这里不作限制。
本申请实施例还提供一种终端,该终端包含上述图5中所述的装置500、图6中所述的装置600或图7中所述的芯片700。
本申请实施例还提供一种服务器，该服务器包含上述图5中所述的装置500、图6中所述的装置600或图7中所述的芯片700。
需要说明的是,本实施例提供的控制装置、计算机存储介质、计算机程序产品、芯片、终端或服务器均用于执行上文所提供的对应的方法,因此,其所能达到的有益效果可参考上文所提供的对应的方法中的有益效果,此处不再赘述。
应理解,在本申请的各种实施例中,上述各过程的序号的大小并不意味着执行顺序的先后,各过程的执行顺序应以其功能和内在逻辑确定,而不应对本申请实施例的实施过程构成任何限定。
本领域普通技术人员可以意识到,结合本文中所公开的实施例描述的各示例的单元及算法步骤,能够以电子硬件、或者计算机软件和电子硬件的结合来实现。这些功能究竟以硬件还是软件方式来执行,取决于技术方案的特定应用和设计约束条件。专业技术人员可以对每个特定的应用来使用不同方法来实现所描述的功能,但是这种实现不应认为超出本 申请的范围。
所属领域的技术人员可以清楚地了解到,为描述的方便和简洁,上述描述的系统、装置和单元的具体工作过程,可以参考前述方法实施例中的对应过程,在此不再赘述。
在本申请所提供的几个实施例中,应该理解到,所揭露的系统、装置和方法,可以通过其它的方式实现。例如,以上所描述的装置实施例仅仅是示意性的,例如,所述单元的划分,仅仅为一种逻辑功能划分,实际实现时可以有另外的划分方式,例如多个单元或组件可以结合或者可以集成到另一个系统,或一些特征可以忽略,或不执行。另一点,所显示或讨论的相互之间的耦合或直接耦合或通信连接可以是通过一些接口,装置或单元的间接耦合或通信连接,可以是电性,机械或其它的形式。
所述作为分离部件说明的单元可以是或者也可以不是物理上分开的,作为单元显示的部件可以是或者也可以不是物理单元,即可以位于一个地方,或者也可以分布到多个网络单元上。可以根据实际的需要选择其中的部分或者全部单元来实现本实施例方案的目的。
另外,在本申请各个实施例中的各功能单元可以集成在一个处理单元中,也可以是各个单元单独物理存在,也可以两个或两个以上单元集成在一个单元中。
所述功能如果以软件功能单元的形式实现并作为独立的产品销售或使用时,可以存储在一个计算机可读取存储介质中。基于这样的理解,本申请的技术方案本质上或者说对现有技术做出贡献的部分或者该技术方案的部分可以以软件产品的形式体现出来,该计算机软件产品存储在一个存储介质中,包括若干指令用以使得一台计算机设备(可以是个人计算机,服务器,或者网络设备等)执行本申请各个实施例所述方法的全部或部分步骤。而前述的存储介质包括:U盘、移动硬盘、只读存储器(Read Only Memory,ROM)、随机存取存储器(Random Access Memory,RAM)、磁碟或者光盘等各种可以存储程序代码的介质。
以上所述,仅为本申请的具体实施方式,但本申请的保护范围并不局限于此,任何熟悉本技术领域的技术人员在本申请揭露的技术范围内,可轻易想到变化或替换,都应涵盖在本申请的保护范围之内。因此,本申请的保护范围应所述以权利要求的保护范围为准。

Claims (24)

  1. 一种控制方法,其特征在于,包括:
    获取第一终端的第一行为指示数据,所述第一行为指示数据用于指示所述第一终端的第一行为;
    根据所述第一行为指示数据,确定所述第一行为的第一行为目的;
    基于所述第一行为目的,对用户进行提醒和/或控制第二终端执行第一操作。
  2. 根据权利要求1所述的方法,其特征在于,在所述根据所述第一行为指示数据,确定所述第一行为的第一行为目的之前,所述方法还包括:
    获取参考数据,所述参考数据包括以下数据中的至少一项:行驶状态数据、环境数据、路况数据、天气数据,所述行驶状态数据用于指示所述第二终端的行驶状态,所述环境数据用于指示所述第二终端外部和/或内部的环境,所述路况数据用于指示所述第二终端周围的路况;
    所述根据所述第一行为指示数据,确定所述第一行为的第一行为目的,包括:
    根据所述第一行为指示数据和所述参考数据,确定所述第一行为目的。
  3. 根据权利要求1所述的方法,其特征在于,所述根据所述第一行为指示数据,确定所述第一行为目的,包括:
    将所述第一行为指示数据输入行为目的识别模型,确定所述第一行为目的,所述行为目的识别模型用于指示行为指示数据和行为目的之间的映射关系。
  4. 根据权利要求1-3中任一项所述的方法,其特征在于,在所述基于所述第一行为目的,控制第二终端执行第一操作之前,所述方法还包括:
    基于所述第一行为目的,确定所述第一行为目的对应的所述第一操作。
  5. 根据权利要求4所述的方法,其特征在于,所述基于所述第一行为目的,获得所述第一行为目的对应的所述第一操作,包括:
    基于所述第一行为目的查询映射表,确定所述映射表中与所述第一行为目的对应的所述第一操作,其中,所述映射表中包括至少一个行为目的以及所述至少一个行为目的中每个行为目的对应的操作,所述至少一个行为目的包括所述第一行为目的。
  6. 根据权利要求1-5中任一项所述的方法,其特征在于,所述第一操作包括车速控制操作、闪灯操作、鸣笛操作和变道操作中的至少一项,所述车速控制操作包括制动操作、启动操作、加速操作或减速操作。
  7. 根据权利要求1-6中任一项所述的方法,其特征在于,所述控制所述第二终端执行第一操作,包括:
    向所述第二终端发送指示信息,所述指示信息用于指示所述第二终端执行所述第一操作。
  8. 根据权利要求1-7中任一项所述的方法,其特征在于,所述第一行为包括闪灯行为和/或鸣笛行为。
  9. 根据权利要求1-8中任一项所述的方法,其特征在于,所述第一行为指示数据包括以下数据中的至少一项:第一图像、第一音频、图像特征数据和音频特征数据,其中,所述第一图像中包含所述第一终端的灯光,所述第一音频中包含所述第一终端的鸣笛音,所 述图像特征数据用于指示所述灯光的位置、颜色、亮度、闪光频率和/或闪光次数,所述音频特征数据用于指示所述鸣笛音的方位、长短、频率和/或次数。
  10. 一种控制装置,其特征在于,包括:
    收发单元,用于获取第一终端的第一行为指示数据,所述第一行为指示数据用于指示所述第一终端的第一行为;
    处理单元,用于根据所述收发单元获取的所述第一行为指示数据,确定所述第一行为的第一行为目的;基于所述第一行为目的,对用户进行提醒和/或控制第二终端执行第一操作。
  11. 根据权利要求10所述的装置,其特征在于,所述处理单元具体用于:
    获取参考数据,所述参考数据包括以下数据中的至少一项:行驶状态数据、环境数据、路况数据、天气数据,所述行驶状态数据用于指示所述第二终端的行驶状态,所述环境数据用于指示所述第二终端外部和/或内部的环境,所述路况数据用于指示所述第二终端周围的路况;
    根据所述第一行为指示数据和所述参考数据,确定所述第一行为目的。
  12. 根据权利要求10所述的装置,其特征在于,所述处理单元具体用于:
    将所述第一行为指示数据输入行为目的识别模型,确定所述第一行为目的,所述行为目的识别模型用于指示行为指示数据和行为目的之间的映射关系。
  13. 根据权利要求10-12中任一项所述的装置,其特征在于,所述处理单元还用于:
    在所述基于所述第一行为目的,控制第二终端执行第一操作之前,基于所述第一行为目的,确定所述第一行为目的对应的所述第一操作。
  14. 根据权利要求13所述的装置,其特征在于,所述处理单元具体用于:
    基于所述第一行为目的查询映射表,确定所述映射表中与所述第一行为目的对应的所述第一操作,其中,所述映射表中包括至少一个行为目的以及所述至少一个行为目的中每个行为目的对应的操作,所述至少一个行为目的包括所述第一行为目的。
  15. 根据权利要求10-14中任一项所述的装置,其特征在于,所述第一操作包括车速控制操作、闪灯操作、鸣笛操作和变道操作中的至少一项,所述车速控制操作包括制动操作、启动操作、加速操作或减速操作。
  16. 根据权利要求10-15中任一项所述的装置，其特征在于，所述处理单元用于：
    控制所述收发单元向所述第二终端发送指示信息,所述指示信息用于指示所述第二终端执行所述第一操作。
  17. 根据权利要求10-16中任一项所述的装置,其特征在于,所述第一行为包括闪灯行为和/或鸣笛行为。
  18. 根据权利要求10-17中任一项所述的装置,其特征在于,所述第一行为指示数据包括以下数据中的至少一项:第一图像、第一音频、图像特征数据和音频特征数据,其中,所述第一图像中包含所述第一终端的灯光,所述第一音频中包含所述第一终端的鸣笛音,所述图像特征数据用于指示所述灯光的位置、颜色、亮度、闪光频率和/或闪光次数,所述音频特征数据用于指示所述鸣笛音的方位、长短、频率和/或次数。
  19. 一种控制装置,包括至少一个处理器和收发器,所述至少一个处理器和所述收发器耦合,其特征在于,所述至少一个处理器执行程序或指令时,所述控制装置实现如权利 要求1-9中任一项所述的方法。
  20. 一种终端,其特征在于,所述终端包括如权利要求10-18中任一项所述的控制装置或如权利要求19所述的控制装置。
  21. 一种芯片,包括至少一个处理器以及接口电路,所述接口电路用于为所述至少一个处理器提供数据、指令或者信息的发送或接收,其特征在于,当所述至少一个处理器执行程序代码或者指令时,实现上述权利要求1-9中任一项所述的方法。
  22. 一种计算机可读存储介质,用于存储计算机程序,其特征在于,所述计算机程序包括用于实现上述权利要求1-9中任一项所述的方法的指令。
  23. 一种计算机程序产品,所述计算机程序产品中包含指令,其特征在于,当所述指令在计算机或处理器上运行时,使得所述计算机或所述处理器实现上述权利要求1-9中任一项所述的方法。
  24. 一种车辆，其特征在于，所述车辆包括如权利要求10-18中任一项所述的控制装置，或如权利要求19所述的控制装置，或如权利要求20所述的终端。
PCT/CN2022/081453 2021-03-18 2022-03-17 控制方法、装置和系统 WO2022194246A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202110291392.1 2021-03-18
CN202110291392.1A CN115116247A (zh) 2021-03-18 2021-03-18 控制方法、装置和系统

Publications (2)

Publication Number Publication Date
WO2022194246A1 WO2022194246A1 (zh) 2022-09-22
WO2022194246A9 true WO2022194246A9 (zh) 2022-11-24

Family

ID=83322098

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/081453 WO2022194246A1 (zh) 2021-03-18 2022-03-17 控制方法、装置和系统

Country Status (2)

Country Link
CN (1) CN115116247A (zh)
WO (1) WO2022194246A1 (zh)

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107839593A (zh) * 2016-09-18 2018-03-27 西华大学 汽车后车超车提示系统
CN109741609B (zh) * 2019-02-25 2021-05-04 南京理工大学 一种基于麦克风阵列的机动车鸣笛声监测方法
CN110097775A (zh) * 2019-04-29 2019-08-06 大众问问(北京)信息科技有限公司 一种行车信息提醒方法、装置及系统
CN110718093B (zh) * 2019-10-16 2021-11-16 联想(北京)有限公司 针对车辆鸣笛的处理方法和第一车辆

Also Published As

Publication number Publication date
CN115116247A (zh) 2022-09-27
WO2022194246A1 (zh) 2022-09-22

Similar Documents

Publication Publication Date Title
US10699569B2 (en) Information processing apparatus, information processing method, and program
US10625674B2 (en) System and method for generation of a preventive alert
CN107415956B (zh) 用于检测和传送非连接车辆的打滑的系统和方法
JP2019194071A (ja) 合流行動システム及び合流車両のための方法
JP2019192233A (ja) 合流行動システム及び本線車両のための方法
KR102613792B1 (ko) 촬상 장치, 화상 처리 장치 및 화상 처리 방법
JP6935800B2 (ja) 車両制御装置、車両制御方法、および移動体
JP2019500658A (ja) 車両に安全に追い付けるように運転を支援するシステムおよび方法
CN111278702B (zh) 车辆控制装置、具有该车辆控制装置的车辆以及控制方法
US20220141426A1 (en) Electronic device and method for processing data received from in-vehicle electronic device
CN110576808B (zh) 车辆、车机设备及其基于人工智能的场景信息推送方法
WO2021065626A1 (ja) 交通制御システム、交通制御方法及び制御装置
WO2021187039A1 (ja) 情報処理装置、情報処理方法及びコンピュータプログラム
WO2023232046A1 (zh) 基于车联网的车辆控制方法及控制系统
CN108847887B (zh) 一种lifi通信方法、可读存储介质和一种车载终端
WO2021070768A1 (ja) 情報処理装置、および情報処理システム、並びに情報処理方法
WO2022194246A9 (zh) 控制方法、装置和系统
US11383641B2 (en) System and method for a remote vehicle light check
WO2021098220A1 (zh) 一种控制方法及相关设备
US20220319308A1 (en) Smart traffic assistant systems and methods
JP7110914B2 (ja) 情報処理装置、プログラム、および情報処理方法
JP2012256138A (ja) 携帯端末装置およびこれを備えた運転評価システム
WO2023132055A1 (ja) 評価装置、評価方法、及びプログラム
JP7514612B2 (ja) 車両動作適応システムおよび方法
EP4368450A1 (en) Vehicle light control method, lighting system, and vehicle

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22770599

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 22770599

Country of ref document: EP

Kind code of ref document: A1