CN115116247A - Control method, device and system - Google Patents

Control method, device and system

Info

Publication number
CN115116247A
Authority
CN
China
Prior art keywords
behavior
terminal
data
indicating
action
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110291392.1A
Other languages
Chinese (zh)
Inventor
杨凡
彭康
张桂成
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd filed Critical Huawei Technologies Co Ltd
Priority to CN202110291392.1A priority Critical patent/CN115116247A/en
Priority to PCT/CN2022/081453 priority patent/WO2022194246A1/en
Publication of CN115116247A publication Critical patent/CN115116247A/en
Pending legal-status Critical Current


Classifications

    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G 1/00 Traffic control systems for road vehicles
    • G08G 1/09 Arrangements for giving variable traffic instructions
    • G08G 1/0962 Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
    • G08G 1/0967 Systems involving transmission of highway information, e.g. weather, speed limits
    • G08G 1/096708 Systems involving transmission of highway information where the received information might be used to generate an automatic action on the vehicle control
    • G08G 1/096725 Systems involving transmission of highway information where the received information generates an automatic action on the vehicle control
    • G08G 1/096766 Systems involving transmission of highway information where the system is characterised by the origin of the information transmission
    • G08G 1/096791 Systems involving transmission of highway information where the origin of the information is another vehicle

Landscapes

  • Life Sciences & Earth Sciences (AREA)
  • Atmospheric Sciences (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Traffic Control Systems (AREA)

Abstract

A control method, a control device, and a control system that can improve driving safety. The method may include: acquiring first behavior indication data of a first terminal, where the first behavior indication data indicates a first behavior of the first terminal; determining a first behavior purpose of the first behavior from the first behavior indication data; and, based on the first behavior purpose, reminding a user and/or controlling a second terminal to execute a first operation.

Description

Control method, device and system
Technical Field
The present application relates to the field of smart driving technology, and more particularly, to a control method, apparatus, and system in the field of smart driving technology.
Background
With the development of society and the advancement of technology, smart vehicles are gradually entering people's daily lives. Because a smart car faces varied and changing driving environments (e.g., changes in the behavior of surrounding vehicles) while on the road, more and more sensors are installed on smart cars to sense such behavior changes.
Lights and whistling sounds are important tools for communication between vehicles. An existing smart car can sense lights and/or whistling sounds through on-board sensors, judge changes in the behavior of surrounding vehicles, and assist driving based on those changes, thereby improving driving safety.
However, in some driving situations, such as when the lights and/or the whistle go unnoticed, a vehicle's lights and whistle may fail to communicate anything. This can pose a safety risk to intelligent driving and unmanned driving, may even cause a traffic accident, and can therefore greatly affect the driving safety of the smart car.
Disclosure of Invention
The application provides a control method, a control device, and a control system that can improve driving safety.
In a first aspect, an embodiment of the present application provides a control method. The method includes: acquiring first behavior indication data of a first terminal, where the first behavior indication data indicates a first behavior of the first terminal; determining a first behavior purpose of the first behavior from the first behavior indication data; and, based on the first behavior purpose, reminding a user and/or controlling a second terminal to execute a first operation.
In a possible implementation manner, the method can be applied to application scenarios in unmanned driving, automatic driving, intelligent driving, or internet-connected driving in which at least one first terminal reminds a second terminal through a first behavior.
Optionally, the embodiment of the present application does not limit the first behavior.
In one possible implementation, the first behavior may include a flashing behavior and/or a whistling behavior.
In another possible implementation, the first behavior may also include a shifting behavior and/or a lane-change behavior.
In one possible implementation, the method may be applied to a control system, and the control system may include a central control device, an audio capture device, and an image capture device. Optionally, the control system may further include a network service device.
It should be noted that the control method may be executed by either the central control device or the network service device; the embodiment of the present application is not limited in this respect. In the first aspect, the control method is described with the central control device as the executing body; the process executed by the network service device may refer to the process executed by the central control device and is not described in detail in this embodiment.
According to the control method provided by the embodiment of the application, the central control device can determine the first behavior purpose of the first terminal based on the first behavior indication data and, in response to that purpose, control the second terminal to execute the first operation. In other words, the central control device on the second terminal can judge the first behavior purpose (or intention) of the first terminal from the first terminal's behavior indication data and give timely feedback based on it, which can improve driving safety.
Optionally, the embodiment of the present application does not limit specific contents included in the first behavior indicating data.
In a possible implementation manner, the first behavior indication data may include at least one of: a first image containing a light of the first terminal; a first audio containing a whistling sound of the first terminal; image feature data indicating the position, color, brightness, flashing frequency, and/or flash count of the light; and audio feature data indicating the orientation, length, frequency, and/or count of the whistling sound.
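As a concrete illustration only, the first behavior indication data described above might be modeled as a small set of records. All class and field names below are assumptions made for this sketch, not terms defined by the application:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ImageFeatureData:
    """Features extracted from the first image (hypothetical field names)."""
    position: str = ""             # position of the light, e.g. "rear-left"
    color: str = ""                # e.g. "amber"
    brightness: float = 0.0        # normalized to 0..1
    flash_frequency_hz: float = 0.0
    flash_count: int = 0

@dataclass
class AudioFeatureData:
    """Features extracted from the first audio (hypothetical field names)."""
    orientation_deg: float = 0.0   # direction the whistle came from
    duration_s: float = 0.0        # length of the whistle
    frequency_hz: float = 0.0
    whistle_count: int = 0

@dataclass
class BehaviorIndicationData:
    """First behavior indication data: raw media and/or extracted features."""
    first_image: Optional[bytes] = None
    first_audio: Optional[bytes] = None
    image_features: Optional[ImageFeatureData] = None
    audio_features: Optional[AudioFeatureData] = None
```

Any subset of the four fields may be present, matching the "at least one of" wording above.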
According to the control method provided by the embodiment of the application, the audio feature data and the image feature data describe the light and whistle conditions of the first terminal more comprehensively, which can improve the accuracy of determining the first behavior purpose.
Optionally, the central control device may acquire the first behavior indication data in multiple ways, which is not limited in this embodiment of the present application.
In a possible implementation manner, the central control device may receive the first image acquired by the image acquisition device.
In another possible implementation manner, the central control device may receive the first audio collected by the audio collecting device.
It should be noted that the image feature data is obtained by performing feature extraction on the first image, and the audio feature data is obtained by performing feature extraction on the first audio, and a specific feature extraction manner may refer to the existing related art, which is not limited in this embodiment of the present application.
In a possible implementation manner, the central control device may perform feature extraction on the first image to obtain the image feature data.
In another possible implementation manner, the central control device may perform feature extraction on the first audio to obtain the audio feature data.
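The feature extraction itself is left to existing techniques, but one simple piece of it, estimating the flashing frequency from a per-frame light-on/off sequence, can be sketched as follows (the function name and the on/off-sequence input are assumptions of this sketch):

```python
def estimate_flash_frequency(light_on, fps):
    """Estimate the flashing frequency in Hz from a per-frame on/off sequence.

    light_on: list of booleans, one per video frame (True = light detected on).
    fps: frame rate of the image stream.
    """
    if fps <= 0 or not light_on:
        return 0.0
    # Count off->on transitions; each marks the start of one flash cycle.
    cycles = sum(1 for prev, cur in zip(light_on, light_on[1:]) if cur and not prev)
    duration_s = len(light_on) / fps
    return cycles / duration_s
```

For a 2-second clip at 30 fps with the light alternating 5 frames on and 5 frames off, this yields a rate in the few-hertz range typical of a turn signal.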
In a possible implementation manner, the central control device may determine the first behavior purpose based on the first behavior indication data and a preset mapping relationship a, where the mapping relationship a is used for indicating a correspondence relationship between the behavior indication data and the behavior purpose.
Optionally, the embodiment of the present application does not limit the specific form of the mapping relationship a.
In one possible implementation, the mapping relationship a may be a mapping table a.
According to the control method provided by the embodiment of the application, the central control device obtains the behavior purpose by looking it up in the mapping table, which can improve calculation speed.
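A minimal sketch of what such a mapping table A could look like; the idea of keying on a (signal, pattern) pair and all the purpose labels are invented for illustration only:

```python
# Hypothetical mapping table A: (signal, pattern) -> behavior purpose.
MAPPING_TABLE_A = {
    ("left_turn_light", "flashing"): "intends_to_turn_left_or_overtake",
    ("right_turn_light", "flashing"): "intends_to_turn_right_or_pull_over",
    ("high_beam", "double_flash"): "requests_right_of_way",
    ("whistle", "short_single"): "calls_for_attention",
    ("whistle", "long_continuous"): "warns_of_danger",
}

def lookup_behavior_purpose(signal, pattern):
    """Table-lookup variant of determining the first behavior purpose."""
    return MAPPING_TABLE_A.get((signal, pattern), "unknown")
```

A dictionary lookup like this is O(1), which is the speed advantage the table form offers over running a recognition model.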
In another possible implementation, the mapping relationship A may be a behavior purpose recognition model A.
According to the control method provided by the embodiment of the application, the central control device obtains the behavior purpose through calculation with the behavior purpose recognition model, which can improve calculation accuracy.
Optionally, the central control device may obtain the mapping relationship a in a plurality of ways, which is not limited in this embodiment of the present application.
In a possible implementation manner, the central control device may generate the mapping relationship a by itself.
In another possible implementation manner, the central control device may receive the mapping relationship a from other devices.
In yet another possible implementation manner, the central control device may pre-configure the behavior purpose recognition model.
Optionally, the central control device may determine the first behavior purpose based on the first behavior indication data and reference data, the reference data including at least one of the following: driving state data indicating the driving state of the second terminal; environment data indicating the external and/or internal environment of the second terminal; and road condition data indicating the road conditions around the second terminal.
According to the control method provided by the embodiment of the application, the central control device determines the first behavior purpose by combining the reference data with the first behavior indication data, which can improve the accuracy of determining the first behavior purpose.
Optionally, the central control device may obtain the reference data in various ways, which is not limited in this embodiment of the application.
In one possible implementation, the central control device may receive the reference data from other devices.
In one possible implementation, the central control device may determine the reference data based on images captured by the image capturing device and/or audio captured by the audio capturing device.
Optionally, the central control device determines the first behavior purpose based on the first behavior indication data, the reference data, and a preset mapping relationship B, where the mapping relationship B indicates the correspondence between behavior indication data and behavior purposes under different reference data.
It should be noted that the specific form and the obtaining manner of the mapping relationship B may refer to the specific form and the obtaining manner of the mapping relationship a, and are not described herein again to avoid repetition.
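To make the role of the reference data concrete, here is a toy sketch of a mapping B in which the same behavior maps to different purposes under different reference data. All dictionary keys, signal names, and purpose labels are assumptions made for illustration:

```python
def determine_purpose_with_context(signal, pattern, reference):
    """Toy mapping B: behavior indication plus reference data -> behavior purpose.

    reference: dict of reference data, e.g. {"road": "no_horn_zone",
    "time": "night"} (hypothetical keys standing in for road condition,
    environment, and driving state data).
    """
    # The same flashing behavior can carry different purposes in context.
    if signal == "left_turn_light" and reference.get("road") == "no_horn_zone":
        # In a no-whistling section, a left-turn light may signal overtaking.
        return "requests_overtaking"
    if signal == "high_beam" and reference.get("time") == "night":
        return "requests_dimming_headlights"
    if signal == "whistle" and pattern == "long_continuous":
        return "warns_of_danger"
    return "unknown"
```

The point of the context parameter is exactly the accuracy gain described above: without the `road` entry, a flashing left-turn light alone is ambiguous between turning and overtaking.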
In one possible implementation, the central control device, in response to the first behavior purpose, reminds the user of the first behavior purpose and/or provides further operation suggestions to the user.
Optionally, the central control device may remind the user in various ways, which is not limited in the embodiment of the present application.
In a possible implementation manner, the central control device may remind the first action purpose and/or the operation suggestion through prompt words.
According to the control method provided by the embodiment of the application, the central control device presents the first behavior purpose and/or the operation suggestion through prompt text, which can give the user sufficient driving freedom while improving both driving safety and driving flexibility.
In another possible implementation, the central control device may voice-report the first action purpose and/or the operation suggestion.
According to the control method provided by the embodiment of the application, the central control device broadcasts the first behavior purpose and/or the operation suggestion by voice, which can give the user sufficient driving freedom while improving both driving safety and driving flexibility. In addition, this avoids situations where the user does not see a text reminder on the screen.
In another possible implementation manner, the central control device controls the second terminal to execute the first operation in response to the first action purpose.
According to the control method provided by the embodiment of the application, the central control device controls the second terminal to execute the first operation and can react in time to the first behavior of the first terminal, thereby improving driving safety. In addition, the method can be applied to unmanned-driving application scenarios.
Optionally, before controlling the second terminal to perform the first operation in response to the first behavior purpose, the central control device may determine, based on the first behavior purpose, the first operation corresponding to the first behavior purpose.
Optionally, the central control device may determine, in multiple ways and based on the first behavioral purpose, the first operation corresponding to the first behavioral purpose, which is not limited in this embodiment of the application.
In a possible implementation manner, the central control device may determine the first operation based on the first action purpose and a preset mapping relationship C, where the mapping relationship C is used to indicate a correspondence relationship between the action purpose and the operation.
It should be noted that the specific form and the obtaining manner of the mapping relationship C may refer to the specific form and the obtaining manner of the mapping relationship a, and are not described herein again to avoid repetition.
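As an illustration, a mapping C from behavior purposes to operations might look like the following; the purpose labels and operation names are invented for this sketch:

```python
# Hypothetical mapping table C: behavior purpose -> first operation.
MAPPING_TABLE_C = {
    "warns_of_danger": "brake",
    "requests_overtaking": "keep_lane_and_hold_speed",
    "requests_right_of_way": "decelerate_and_yield",
    "calls_for_attention": "remind_user",
}

def select_first_operation(purpose):
    """Pick the first operation for a behavior purpose.

    Falls back to merely reminding the user when the purpose is unknown,
    a conservative default assumed here rather than stated by the source.
    """
    return MAPPING_TABLE_C.get(purpose, "remind_user")
```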
Optionally, before controlling the second terminal to execute the first operation, the central control device may send a control request to the user, where the control request asks the user whether to execute the first operation. After acquiring a first instruction from the user, the central control device controls the second terminal to execute the first operation; or, after acquiring a second instruction from the user, it terminates control of the second terminal to execute the first operation.
According to the control method provided by the embodiment of the application, the central control device controls the second terminal according to the actual requirements of the user, so that the driving safety is improved, and the driving flexibility is improved.
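The confirm-before-executing flow described above might be sketched as follows; `ask_user` stands in for whatever prompt mechanism (text or voice) the central control device uses, an assumption of this sketch:

```python
def execute_with_confirmation(operation, ask_user):
    """Send a control request asking whether to execute the operation.

    A truthy answer models the first (confirm) instruction and executes;
    a falsy answer models the second (reject) instruction and terminates.
    """
    confirmed = ask_user(f"Execute operation '{operation}'?")
    if confirmed:
        return ("executed", operation)
    return ("terminated", None)
```

For example, `execute_with_confirmation("brake", lambda q: True)` models a user who confirms, and the braking operation goes ahead.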
In another possible implementation manner, the central control device, in response to the first behavior purpose, reminds the user of the first behavior purpose and controls the second terminal to execute the first operation.
Optionally, the method may further include: the central control device can acquire a second operation actually executed by the second terminal within a preset time period; if the second operation is not identical to the first operation, the mapping relationship A and the mapping relationship B are updated.
Optionally, the central control device may send update information to the device that generates each mapping relationship (e.g., mapping relationship A or mapping relationship B), where the update information indicates the correspondence between the first behavior purpose and the second operation.
By continuously updating each mapping relationship according to the actual situation, the control method provided by the embodiment of the application can improve the accuracy of behavior purpose identification.
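The update step, comparing the predicted first operation with the second operation the terminal actually executed and correcting the mapping on mismatch, can be sketched like this (storing the mapping as an in-place dict is an assumption of the sketch):

```python
def reconcile_mapping(mapping, purpose, predicted_op, actual_op):
    """Update a purpose -> operation mapping when the actually executed
    operation differs from the predicted one.

    Returns True if the mapping was updated, False if prediction and
    reality already agreed.
    """
    if actual_op != predicted_op:
        mapping[purpose] = actual_op
        return True
    return False
```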
In a second aspect, an embodiment of the present application further provides a control method. The method may include: the network service device acquires first behavior indication data of a first terminal, where the first behavior indication data indicates a first behavior of the first terminal; the network service device determines a first behavior purpose of the first behavior from the first behavior indication data; the network service device sends first indication information to the central control device, where the first indication information indicates the first behavior purpose; and the central control device, in response to the first indication information, reminds a user of the first behavior purpose and/or controls the second terminal to execute a first operation.
In one possible implementation, the network service device may receive the first behavior indication data from the central control device.
In a possible implementation manner, the central control device, in response to the first indication information, reminding the user of the first behavior purpose and/or controlling the second terminal to perform the first operation includes: the central control device, in response to the first indication information, determines the first operation and controls the second terminal to execute it.
It should be noted that steps not described in the second aspect may refer to the description of corresponding steps in the first aspect.
With the control system provided by the embodiment of the application, the central control device does not need to determine the first behavior purpose based on the first behavior indication data; this is instead done by the network service device, which reduces the computing capability and computation required of the central control device. Moreover, when the network service device is arranged on the cloud side, it can perform statistics and calculation based on big data, which can improve the accuracy of judging the first behavior purpose and thereby improve driving safety.
In a third aspect, an embodiment of the present application further provides a control method. The method may include: a central control device acquires first behavior indication data of a first terminal, where the first behavior indication data indicates a first behavior of the first terminal; the central control device determines a first behavior purpose of the first behavior from the first behavior indication data; the central control device sends first indication information to a network service device, where the first indication information indicates the first behavior purpose; the network service device, based on the first behavior purpose indicated by the first indication information, sends second indication information to the central control device, where the second indication information indicates reminding a user of the first behavior purpose and/or controlling a second terminal to execute a first operation; and the central control device, in response to the second indication information, reminds the user of the first behavior purpose and/or controls the second terminal to execute the first operation.
It should be noted that steps not described in the third aspect may refer to the description of the corresponding steps in the first aspect.
With the control system provided by the embodiment of the application, the central control device does not need to determine the first operation based on the first behavior purpose; this is instead done by the network service device, which reduces the computing capability and computation required of the central control device. Moreover, when the network service device is arranged on the cloud side, it can perform statistics and calculation based on big data, which can improve the accuracy of determining the first operation and thereby improve driving safety.
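The division of labor in the third aspect, where the central control device identifies the purpose and the network service device decides the operation, might be sketched as two cooperating functions. The function names, the decision table, and the direct function call standing in for the network round trip are all assumptions of this sketch:

```python
def network_service_decide(first_purpose):
    """Cloud side: map the reported behavior purpose to second indication
    information (whether to remind the user, and which operation to run)."""
    decision_table = {
        "warns_of_danger": ("remind_user", "brake"),
        "requests_right_of_way": ("remind_user", "decelerate_and_yield"),
    }
    return decision_table.get(first_purpose, ("remind_user", None))

def central_control_handle(first_purpose, decide=network_service_decide):
    """Vehicle side: report the first behavior purpose, then act on the
    second indication information that comes back."""
    remind, operation = decide(first_purpose)  # stands in for the round trip
    actions = [remind]
    if operation is not None:
        actions.append(f"execute:{operation}")
    return actions
```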
In a fourth aspect, an embodiment of the present application further provides a central control apparatus, configured to execute the method implemented by the central control apparatus in any possible implementation manner of the foregoing aspects or aspects. In particular, the central control device may comprise means for performing the method performed by the central control device in any possible implementation of the above aspects or aspects.
In a fifth aspect, an embodiment of the present application further provides a network service apparatus, configured to execute the method implemented by the network service apparatus in any possible implementation manner of the foregoing aspects or aspects. In particular, the network serving means may comprise means for performing the method performed by the network serving means in any possible implementation of the above aspects or aspects.
In a sixth aspect, an embodiment of the present application further provides a central control device, where the central control device includes: a memory, at least one processor, a transceiver, and instructions stored on the memory and executable on the processor. Further, the memory, the processor and the communication interface are in communication with each other via an internal connection path. The at least one processor executing the instructions causes the central control apparatus to implement the method performed by the central control apparatus in any possible implementation of the above aspects or aspects.
Optionally, the central control device in the fourth aspect or the sixth aspect may be integrated in a terminal.
In a seventh aspect, an embodiment of the present application further provides a network service apparatus, where the apparatus includes: a memory, at least one processor, a transceiver, and instructions stored on the memory and executable on the processor. Further, the memory, the processor and the communication interface are in communication with each other via an internal connection path. Execution of the instructions by the at least one processor causes the network serving apparatus to implement the method performed by the network serving apparatus in any possible implementation of the above-described aspects or aspects.
Alternatively, the network service apparatus described in the fifth aspect or the seventh aspect may be integrated in a server, such as a cloud server.
In an eighth aspect, the present application also provides a computer-readable storage medium for storing a computer program comprising instructions for implementing the method described in the above aspects or any possible implementation thereof.
In a ninth aspect, the present application also provides a computer program product comprising instructions which, when run on a computer, cause the computer to carry out the method described in the above aspects or any possible implementation thereof.
In a tenth aspect, the present application further provides a chip, including: input interface, output interface, at least one processor. Optionally, the chip device further includes a memory. The at least one processor is configured to execute the code in the memory, and when the code is executed by the at least one processor, the chip implements the method described in the above aspects or any possible implementation thereof.
Drawings
Fig. 1 is a schematic diagram of an application scenario 100 to which a control method provided in an embodiment of the present application is applied;
fig. 2 is a schematic block diagram of a control system 200 to which the control method provided in the embodiment of the present application is applied;
FIG. 3 is a schematic flow chart diagram of a control method 300 provided by an embodiment of the present application;
FIG. 4 is a schematic flow chart diagram of a control method 400 provided by an embodiment of the present application;
fig. 5 is a schematic block diagram of a control device 500 provided in an embodiment of the present application;
fig. 6 is a schematic block diagram of a control device 600 according to an embodiment of the present application;
fig. 7 is a schematic block diagram of a chip 700 provided in an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described below with reference to the drawings in the embodiments of the present application.
Fig. 1 illustrates a schematic diagram of an application scenario 100 to which the control method provided in the embodiment of the present application is applied. As shown in fig. 1, the control method provided by the embodiment of the present application may be applied to application scenarios in unmanned driving, autonomous driving, smart driving, or internet-connected driving in which at least one first terminal (the first terminal 110 is shown in fig. 1) reminds the second terminal 120 through a flashing behavior and/or a whistling behavior.
Alternatively, the first terminal 110 may be a motor vehicle (e.g., an unmanned vehicle, a smart car, an electric vehicle, a digital car, etc.), a drone, a rail car, a traffic light, or the like, that is, any terminal capable of emitting lights and/or whistling sounds to the outside.
Alternatively, the second terminal 120 may be a motor vehicle (e.g., an unmanned vehicle, a smart car, an electric vehicle, a digital car, etc.), a drone, a rail car, a bicycle, a traffic light, or the like, that is, any terminal capable of perceiving lights and/or whistling sounds emitted from the outside.
In a possible implementation manner, the distance between the first terminal 110 and the second terminal 120 may be smaller than a preset first threshold, that is, the first terminal 110 may be located within a range centered on the second terminal 120 and taking the first threshold as a radius.
In other words, the first terminal 110 may be located near the second terminal 120, such as where the lights of the first terminal 110 may be observed and/or the whistle of the first terminal 110 may be received at the location of the second terminal 120.
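The range condition, the first terminal lying within a circle of radius equal to the first threshold centered on the second terminal, is a plain Euclidean-distance check; representing positions as (x, y) coordinates in meters is an assumption of this sketch:

```python
import math

def within_alert_range(first_pos, second_pos, first_threshold_m):
    """True if the first terminal lies inside the circle centered on the
    second terminal with radius first_threshold_m.

    first_pos, second_pos: (x, y) positions in meters.
    """
    dx = first_pos[0] - second_pos[0]
    dy = first_pos[1] - second_pos[1]
    return math.hypot(dx, dy) < first_threshold_m
```

For example, a first terminal 50 m away passes a 60 m threshold but fails a 40 m one.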
For example: the first terminal 110 drives behind the second terminal 120, and if the first terminal 110 finds an abnormal situation such as the trunk of the second terminal 120 is not closed, the first terminal 110 may alert the second terminal 120 by whistling.
For another example: on road sections where whistling is prohibited, if the first terminal 110 wants to overtake the second terminal 120, the first terminal 110 may alert the second terminal by turning on the left turn light.
Taking the first terminal 110 as vehicle 1 and the second terminal 120 as vehicle 2 as an example: in the prior art, vehicle 2 can detect the lights and/or whistle of an external vehicle through various on-board sensors and remind its owner of vehicle 1's flashing behavior and/or whistling behavior. However, the owner of vehicle 2 may not notice the lights and/or whistle of vehicle 1 while driving (for example, the owner may be looking down to adjust the air conditioner and miss vehicle 1's lights, or may miss vehicle 1's whistle because of an active noise reduction function inside the vehicle). In such situations, the owner may not receive vehicle 1's reminder in time, and driving safety suffers.
It should be noted that the embodiment of the present application is described using only the application scenario in which the first terminal 110 reminds the second terminal 120 through a flashing behavior and/or a whistling behavior as an example; the embodiment of the present application is not limited thereto.
Optionally, the embodiment of the present application may also be applied to application scenarios in which the first terminal 110 reminds the second terminal through other behaviors, for example, a speed-change (acceleration or deceleration) behavior, a lane-change behavior, a pulling-over behavior, and the like; application scenarios using such other behaviors may refer to the application scenario using flashing and/or whistling behaviors.
Fig. 2 shows a schematic block diagram of a control system 200 to which the control method provided by the embodiment of the present application is applied. As shown in fig. 2, the control system 200 may include an image capturing device 210 and/or an audio capturing device 220, and a central control device 230, wherein the central control device 230 may be in communication with the image capturing device 210 and the audio capturing device 220, respectively.
The image capturing device 210 is configured to capture a first image, where the first image includes a light of the first terminal, that is, the first image is capable of indicating a flashing behavior of the first terminal.
For example: the image capturing device 210 may be a camera.
Optionally, the first image may be a frame image or an image stream, which is not limited in this embodiment of the application.
It should be noted that when the first image is an image stream, a flashing process of the light dynamics can be obtained.
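The dynamic flashing process mentioned above can be illustrated with a minimal sketch. This is a hypothetical example, not from the patent: it assumes each frame of the image stream has already been classified as "light on" or "light off" by an upstream detector (not shown), and derives the number of flashes and the flashing frequency from off-to-on transitions.

```python
# Hypothetical sketch: recovering flash count and flashing frequency from
# an image stream, given per-frame light-on/off states from an assumed
# upstream detector.

def flash_events(light_on_per_frame, fps):
    """Return (flash_count, flash_frequency_hz) from per-frame on/off states."""
    count = 0
    for prev, cur in zip(light_on_per_frame, light_on_per_frame[1:]):
        if not prev and cur:  # an off -> on transition counts as one flash
            count += 1
    duration_s = len(light_on_per_frame) / fps
    freq = count / duration_s if duration_s > 0 else 0.0
    return count, freq

# e.g. 1 second of video at 10 fps containing three off->on transitions
states = [0, 1, 0, 1, 0, 1, 0, 0, 0, 0]
```

A single frame image, by contrast, cannot yield these temporal features, which is why the image-stream case is called out above.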
The audio collecting device 220 is configured to collect a first audio, where the first audio includes a whistle sound of the first terminal, that is, the first audio is capable of indicating a whistling behavior of the first terminal.
for example: the audio capture device 220 may be an audio sensor.
The central control device 230 is configured to obtain first behavior indicating data of the first terminal, where the first behavior indicating data is used to indicate a first behavior of the first terminal, and the first behavior may include a flashing behavior and/or a whistling behavior; determine a first behavior purpose of the first behavior based on the first behavior indicating data; and control the second terminal to perform a first operation in response to the first behavior purpose.
For example: the central control device 230 may be a central control system.
It should be noted that, since the flashing behavior purposes represented by different positions, colors, flashing frequencies and/or numbers of flashes of the lights of the first terminal may differ, and similarly, the whistling behavior purposes represented by different directions, durations, frequencies and/or numbers of the whistle sounds of the first terminal may also differ, the central control device may comprehensively analyze and determine the first behavior purpose of the first terminal based on the flashing behavior and/or the whistling behavior of the first terminal.
Optionally, the first operation may include at least one of a vehicle speed control operation (such as a braking operation, a starting operation, an accelerating operation or a decelerating operation), a flashing operation, a whistle operation, and a lane changing operation, which is not limited in the embodiment of the present application.
Optionally, the embodiment of the present application does not limit the specific forms of the image capturing device 210, the audio capturing device 220, and the central control device 230.
In one possible implementation, the image capturing device 210, the audio capturing device 220, and the center control device 230 may be three independent devices, which may be separately installed or integrated in the second terminal.
In another possible implementation manner, the image capturing device 210 and the audio capturing device 220 may be integrated into one apparatus, and the central control device 230 is a separate device, and the apparatus and the central control device 230 are separately installed or integrated into the second terminal.
In yet another possible implementation manner, the image capturing device 210 and the central control device 230 may be integrated into one apparatus, and the audio capturing device 220 is a separate device, and the apparatus and the audio capturing device 220 are separately installed or integrated into the second terminal.
In yet another possible implementation manner, the audio capture device 220 and the central control device 230 may be integrated into one apparatus, and the image capture device 210 is a separate device, and the apparatus and the image capture device 210 are separately installed or integrated into the second terminal.
It should be noted that, regardless of the specific forms of the image capturing device 210, the audio capturing device 220 and the central control device 230, for clarity the following description uses the terms image capturing device, audio capturing device and central control device.
Optionally, the central control device 230 and the image capturing device 210 (or the audio capturing device 220) may communicate in various ways, which is not limited in this embodiment of the application.
In a possible implementation manner, the central control device 230 may communicate with the image capturing device 210 (or the audio capturing device 220) in a wired manner, which is not limited in this embodiment.
For example: the wired mode can be that the communication is realized through data line connection or internal bus connection.
In another possible implementation, the central control device 230 may communicate with the image capturing device 210 (or the audio capturing device 220) in a wireless manner.
For example: the wireless mode may be communication through a communication network, where the communication network may be a local area network, a wide area network switched through a relay device, or a combination of a local area network and a wide area network. When the communication network is a local area network, it may be, for example, a wireless fidelity (Wi-Fi) hotspot network, a Wi-Fi peer-to-peer (P2P) network, a Bluetooth network, a ZigBee network, a near field communication (NFC) network, or a possible future universal short-range communication network. When the communication network is a wide area network, it may be, for example, a third-generation mobile communication technology (3G) network, a fourth-generation mobile communication technology (4G) network, a fifth-generation mobile communication technology (5G) network, a public land mobile network (PLMN), or the Internet, which is not limited in this embodiment of the present application.
In the control system provided by the embodiment of the application, the central control device may determine the first behavior purpose of the first terminal based on the first behavior indicating data and, in response to the first behavior purpose, control the second terminal to perform the first operation. That is, the central control device on the second terminal may determine the first behavior purpose (or intention) of the first terminal based on the first behavior indicating data of the first terminal and give timely feedback based on that purpose, which can improve driving safety.
Optionally, the control system 200 may further include a network service device 240, and the network service device 240 may be in communication with the image capturing device 210, the audio capturing device 220, and the central control device 230, respectively.
In one possible implementation, the network service device 240 may be a computer device with communication and computing capabilities.
In a possible implementation manner, the network service device 240 is configured to obtain the first behavior indicating data; determine the first behavior purpose based on the first behavior indicating data; and, based on the first behavior purpose, send indication information for instructing the second terminal to perform the first operation to the central control device 230. Accordingly, the central control device 230 is configured to control the second terminal to perform the first operation in response to the indication information.
Optionally, the network service device 240 may be disposed on a local or cloud side, which is not limited in this embodiment of the application.
In one possible implementation manner, when the network service device 240 is disposed on the cloud side, the network service device 240 may communicate with the image capturing device 210 (or the audio capturing device 220 or the central control device 230) in a wireless manner.
For example: the network service device 240 may be a server on the cloud side.
In another possible implementation manner, when the network service device 240 is locally provided, the network service device 240 may be a separate device installed or integrated in the second terminal.
In yet another possible implementation manner, when the network service device 240 is provided locally, the network service device 240 may be integrated with at least one of the image capturing device 210, the audio capturing device 220 and the central control device 230 in a device installed or integrated in the second terminal.
With the control system provided in the embodiment of the present application, the central control device 230 does not need to determine the first behavior purpose based on the first behavior indicating data itself; instead, the determination is performed by the network service device 240, which reduces the capability requirement and computation load of the central control device 230. Moreover, when disposed on the cloud side, the network service device 240 can perform big-data statistics and computation, which can improve the accuracy of determining the first behavior purpose and thereby improve driving safety.
Fig. 3 shows a schematic flowchart of a control method 300 provided in an embodiment of the present application, and as shown in fig. 3, the method 300 may be applied to the application scenario 100 shown in fig. 1, may be applied to the control system 200 shown in fig. 2, and may be executed by a central control device in the control system 200.
S310, the central control device obtains first behavior indicating data of the first terminal, the first behavior indicating data are used for indicating first behaviors of the first terminal, and the first behaviors comprise flashing light behaviors and/or whistling behaviors.
Optionally, the number of the first terminals may be one or more, which is not limited in this embodiment of the application.
Optionally, the embodiment of the present application does not limit specific contents included in the first behavior indicating data.
In a possible implementation manner, the first behavior indicating data may include at least one of the following: a first image containing a light of the first terminal, a first audio containing a whistle sound of the first terminal, image feature data indicating a position, a color, a brightness, a flashing frequency and/or a number of flashes of the light, and audio feature data indicating an orientation, a duration, a frequency and/or a number of the whistle sounds.
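The composition of the first behavior indicating data described above can be illustrated with a small container type. This is a hypothetical sketch; the field names are illustrative and do not come from the patent.

```python
# Hypothetical container for the first behavior indicating data: raw
# image/audio and/or the feature data extracted from them. Any subset
# of the fields may be present, matching "at least one of" above.
from dataclasses import dataclass
from typing import Optional

@dataclass
class BehaviorIndicatingData:
    first_image: Optional[bytes] = None    # raw frame(s) containing the light
    first_audio: Optional[bytes] = None    # raw audio containing the whistle
    image_features: Optional[dict] = None  # e.g. position, color, flash frequency/count
    audio_features: Optional[dict] = None  # e.g. orientation, duration, whistle count

data = BehaviorIndicatingData(image_features={"color": "yellow", "flash_count": 3})
```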
Optionally, the central control device may obtain the first behavior indication data in a plurality of ways, which is not limited in this embodiment of the present application.
In a possible implementation manner, the central control device may receive the first image acquired by the image acquisition device.
In another possible implementation manner, the central control device may receive the first audio collected by the audio collecting device.
It should be noted that the image feature data is obtained by performing feature extraction on the first image, and the audio feature data is obtained by performing feature extraction on the first audio, and a specific feature extraction manner may refer to the existing related art, which is not limited in this embodiment of the present application.
In a possible implementation manner, the central control device may perform feature extraction on the first image to obtain the image feature data.
In another possible implementation manner, the central control device may perform feature extraction on the first audio to obtain the audio feature data.
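As a hedged illustration of the audio feature extraction mentioned above, the sketch below counts whistle bursts from an amplitude envelope with a simple threshold. This is an assumption-laden toy, not the patent's method: a real implementation would rely on spectral analysis; this only shows how a "number of whistles" feature could be derived.

```python
# Hypothetical sketch of one audio feature: counting whistle bursts as
# rising edges of the amplitude envelope over a fixed threshold.

def count_whistles(envelope, threshold=0.5):
    count = 0
    above = False
    for level in envelope:
        if level >= threshold and not above:
            count += 1  # rising edge = start of one whistle burst
        above = level >= threshold
    return count

# two bursts separated by a quiet gap
env = [0.1, 0.8, 0.9, 0.2, 0.7, 0.1]
```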
And S320, the central control device determines a first behavior purpose of the first behavior according to the first behavior indicating data.
In a possible implementation manner, the central control device may determine the first behavior purpose based on the first behavior indication data and a preset mapping relationship a, where the mapping relationship a is used for indicating a correspondence relationship between the behavior indication data and the behavior purpose.
Optionally, the embodiment of the present application does not limit the specific form of the mapping relationship a.
In one possible implementation, the mapping relationship a may be a mapping table a.
For example: the mapping relationship A may be a mapping table as shown in Table 1 below.

Table 1

| First behavior indicating data | First behavior purpose |
| --- | --- |
| Fog lamp flashes three times, whistle sounds twice | Reminder: low visibility ahead |
| Left turn signal on, lights flash alternately bright and dark 4 times, whistle sounds once | Reminder: overtaking |
| Long whistle of 10 seconds | Reminder: oncoming vehicle at a curve |
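When the mapping relationship A takes the form of a mapping table, the lookup is a plain key-to-value association. The sketch below is a hypothetical encoding; the feature keys and purpose strings are illustrative only, not defined by the patent.

```python
# Hypothetical encoding of mapping table A as a dictionary keyed on
# extracted behavior features; unknown combinations fall through.

MAPPING_A = {
    ("fog_lamp_flash_3", "whistle_2"): "low visibility ahead",
    ("left_turn_flash_4", "whistle_1"): "overtaking",
    ("whistle_long_10s",): "oncoming vehicle at curve",
}

def behavior_purpose(features):
    """Look up the first behavior purpose for a list of feature keys."""
    return MAPPING_A.get(tuple(features), "unknown")
```

A learned behavior purpose recognition model, mentioned next, would replace this exact-match lookup with a classifier over the same features.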
In another possible implementation, the mapping relationship A may be a behavior purpose recognition model A.
Optionally, the central control device may obtain the mapping relationship a in a plurality of ways, which is not limited in this embodiment of the present application.
In a possible implementation manner, the central control device may generate the mapping relationship a by itself.
For example: the central control device can train the behavior purpose recognition model A.
In another possible implementation manner, the central control device may receive the mapping relationship a from other devices.
For example: the central control device can receive the behavior purpose recognition model A from a model training device, and the model training device is used for training to obtain the behavior purpose recognition model.
Another example is: the central control device may receive the mapping table a (or the behavior purpose recognition model a) from the cloud network service device.
In yet another possible implementation manner, the central control device may pre-configure the behavior purpose recognition model.
Optionally, S320 may include: the central control device determines the first behavior purpose based on the first behavior indicating data and reference data, where the reference data includes at least one of the following: driving state data indicating a driving state of the second terminal, environment data indicating an environment outside and/or inside the second terminal, and road condition data indicating road conditions around the second terminal.
For example: the driving state data may include a current vehicle speed, a current position, whether to brake, whether to accelerate and decelerate, and the like.
For another example: the external environment data may include points of interest (POIs) around the second terminal, and the internal environment data may include whether music in the vehicle is on, a sound volume of the music, an opening condition of a window, and the like.
For another example: the road condition data may include whether it is currently at an intersection, whether there is a curve ahead, whether it is currently on a one-way road, and the like.
For another example: the environment data may further include weather data, such as sunny, rainy or snowy conditions.
Optionally, the central control device may obtain the reference data in various ways, which is not limited in this embodiment of the application.
In one possible implementation, the central control device may receive the reference data from other devices.
For example: the central control device can receive the reference data from the navigation device.
In another possible implementation, the central control device may determine the reference data based on an image acquired by the image acquisition device and/or audio acquired by the audio acquisition device.
Optionally, the central control device determines the first behavior purpose based on the first behavior indicating data, the reference data and a preset mapping relationship B, where the mapping relationship B is used to indicate the correspondence between behavior indicating data and behavior purposes under different reference data.
It should be noted that the specific form and the obtaining manner of the mapping relationship B may refer to the specific form and the obtaining manner of the mapping relationship a, and are not described herein again to avoid repetition.
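The role of the reference data in mapping relationship B can be sketched as follows: the same behavior indicating data maps to different purposes under different reference data (here, road condition). This is a hypothetical illustration; the keys and purpose strings are invented for the example.

```python
# Hypothetical sketch of mapping relationship B: the behavior purpose
# depends on both the behavior feature and the reference data.

MAPPING_B = {
    ("whistle_long_10s", "curve_ahead"): "oncoming vehicle at curve",
    ("whistle_long_10s", "intersection"): "request to yield at intersection",
}

def purpose_with_reference(feature, road_condition):
    return MAPPING_B.get((feature, road_condition), "unknown")
```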
S330, in response to the first behavior purpose, the central control device reminds the user and/or controls the second terminal to execute the first operation.
In one possible implementation, in response to the first behavior purpose, the central control device reminds the user of the first behavior purpose and/or provides a further operation suggestion to the user.
Optionally, the central control device may remind the user in various ways, which is not limited in the embodiment of the present application.
In one possible implementation, the central control device may broadcast the first behavior purpose and/or the operation suggestion by voice.

In another possible implementation manner, the central control device may present the first behavior purpose and/or the operation suggestion through on-screen prompt text.
In another possible implementation manner, the central control device controls the second terminal to execute the first operation in response to the first behavior purpose.

Optionally, before controlling the second terminal to perform the first operation in response to the first behavior purpose, the central control device may determine, based on the first behavior purpose, the first operation corresponding to the first behavior purpose.

Optionally, the central control device may determine the first operation corresponding to the first behavior purpose in various ways, which is not limited in this embodiment of the application.

In a possible implementation manner, the central control device may determine the first operation based on the first behavior purpose and a preset mapping relationship C, where the mapping relationship C is used to indicate the correspondence between behavior purposes and operations.
It should be noted that the specific form and the obtaining manner of the mapping relationship C may refer to the specific form and the obtaining manner of the mapping relationship a, and are not described herein again to avoid repetition.
For example: the mapping relationship C may be a mapping table as shown in Table 2 below.

Table 2

| First behavior purpose | First operation |
| --- | --- |
| Reminder: low visibility ahead | Decelerate; flash fog lamp rapidly |
| Reminder: overtaking | Keep driving in the current lane |
| Reminder: oncoming vehicle at a curve | Decelerate; whistle for 10 seconds |
Optionally, before the central control device controls the second terminal to perform the first operation, the central control device may send a control request to the user, where the control request is used to ask the user whether to perform the first operation; after acquiring a first instruction of the user, the central control device controls the second terminal to execute the first operation; or, after acquiring a second instruction of the user, the central control device terminates controlling the second terminal to execute the first operation.
In another possible implementation manner, in response to the first behavior purpose, the central control device both reminds the user of the first behavior purpose and controls the second terminal to execute the first operation.
Optionally, after S330, the central control device may acquire a second operation actually performed by the second terminal within a preset time period; if the second operation is not identical to the first operation, the mapping relationship A and/or the mapping relationship B is updated.
Optionally, the central control device may send update information to the device that generates each mapping relationship (e.g., mapping relationship A or mapping relationship B), where the update information is used to indicate the correspondence between the first behavior purpose and the second operation, so that that device updates its mapping relationship; this can continuously improve the accuracy of behavior purpose identification.
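The feedback step above can be sketched minimally: when the actually performed operation differs from the suggested one, an update record is produced for the mapping's owner. This is a hypothetical illustration with invented field names.

```python
# Hypothetical sketch of the update-information step: emit a correction
# only when the actual (second) operation differs from the suggested
# (first) operation.

def make_update(purpose, suggested_op, actual_op):
    if actual_op != suggested_op:
        return {"behavior_purpose": purpose, "operation": actual_op}
    return None  # mapping already agrees with observed behavior
```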
Alternatively, the method 300 may be performed by a network service device, with S320 and/or S330 executed by the network service device, which can reduce the capability requirement and computation load of the central control device.
In a possible implementation manner, fig. 4 shows a schematic flowchart of a control method 400 provided in an embodiment of the present application, and as shown in fig. 4, the method 400 may be applied to the application scenario 100 shown in fig. 1, and may be applied to the control system 200 shown in fig. 2.
S410, the network service device obtains first behavior indication data of the first terminal, where the first behavior indication data is used to indicate a first behavior of the first terminal.
In one possible implementation, the network service device may receive the first behavior indication data from the central control device.
S420, the network service device determines a first behavior purpose of the first behavior based on the first behavior indication data.

S430, the network service device sends indication information to the central control device, where the indication information is used to indicate the first behavior purpose; accordingly, the central control device receives the indication information from the network service device.

S440, in response to the indication information, the central control device reminds the user of the first behavior purpose and/or controls the second terminal to execute the first operation.
It should be noted that, for parts not described in S410 to S440, reference may be made to corresponding descriptions in S310 to S330, and details are not described here again to avoid repetition.
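The S410-S440 split of work between the two devices can be sketched as two small functions. This is a hypothetical illustration: the message format and function names are invented, and the mapping is passed in rather than trained.

```python
# Hypothetical sketch of the method 400 flow: the network service device
# determines the behavior purpose (S420) and builds the indication
# information (S430); the central control device reacts to it (S440).

def network_service(indicating_data, mapping):
    purpose = mapping.get(indicating_data, "unknown")  # S420
    return {"first_behavior_purpose": purpose}         # S430: indication info

def central_control(indication):
    return f"remind user: {indication['first_behavior_purpose']}"  # S440

mapping_a = {"whistle_long_10s": "oncoming vehicle at curve"}
```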
The control method provided by the embodiment of the present application is described above with reference to fig. 3 and 4, and the control device for executing the control method will be described below with reference to fig. 5 to 7.
It should be noted that the control device may be the central control device described in the embodiment of the method 300, and can execute the method implemented by the central control device in the method 300; alternatively, the control device may be the network service device described in the embodiment of the method 400, and may be capable of executing the method implemented by the network service device in the method 400.
It is understood that the central control device or the network service device includes hardware and/or software modules for performing the above functions. The present application is capable of being implemented in hardware, or in a combination of hardware and computer software, in conjunction with the exemplary algorithm steps described in connection with the embodiments disclosed herein. Whether a function is performed by hardware or by computer software driving hardware depends upon the particular application and design constraints of the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In this embodiment, the central control device or the network service device may be divided into functional modules according to the above method example, for example, each functional module may be divided corresponding to each function, or two or more functions may be integrated into one processing module. The integrated module may be implemented in the form of hardware. It should be noted that the division of the modules in this embodiment is schematic, and is only a logic function division, and there may be another division manner in actual implementation.
In the case of dividing each function module by corresponding functions, fig. 5 shows a possible composition diagram of the control device (e.g. terminal) or the network service device (e.g. server) involved in the above embodiments, and as shown in fig. 5, the device 500 may include: a transceiving unit 510 and a processing unit 520.
Wherein the processing unit 520 may control the transceiving unit 510 to implement the method performed by the central control device or the network service device in the embodiments of the method 300 and the method 400 above, and/or other processes for the techniques described herein.
It should be noted that all relevant contents of each step related to the above method embodiment may be referred to the functional description of the corresponding functional module, and are not described herein again.
In case an integrated unit is employed, the apparatus 500 may comprise a processing unit, a storage unit and a communication unit. The processing unit may be configured to control and manage operations of the apparatus 500, and for example, may be configured to support the apparatus 500 to perform steps performed by the above units. The memory unit may be used to support the apparatus 500 in executing stored program codes and data, etc. The communication unit may be used to support the communication of the apparatus 500 with other devices.
Wherein the processing unit may be a processor or a controller. It may implement or perform the various illustrative logical blocks, modules, and circuits described in connection with the disclosure. A processor may also be a combination implementing computing functions, e.g., a combination of one or more microprocessors, or a combination of a digital signal processor (DSP) and a microprocessor. The storage unit may be a memory. The communication unit may specifically be a radio frequency circuit, a Bluetooth chip, a Wi-Fi chip, or another device that interacts with other electronic devices.
In a possible implementation manner, the control device or the network service device according to this embodiment may be the device 600 with the structure shown in fig. 6, the device 600 may be a schematic structural diagram of a terminal, and may also be a schematic structural diagram of a server, and the device 600 includes a processor 610 and a transceiver 620, and the processor 610 and the transceiver 620 communicate with each other through an internal connection path. The related functions implemented by the processing unit 520 in fig. 5 may be implemented by the processor 610, and the related functions implemented by the transceiver unit 510 may be implemented by the processor 610 controlling the transceiver 620.
Optionally, the apparatus 600 may further comprise a memory 630, and the processor 610, the transceiver 620 and the memory 630 may communicate with each other through an internal connection path. The associated functions implemented by the memory unit depicted in fig. 5 may be implemented by the memory 630.
The embodiment of the present application further provides a computer storage medium, where a computer instruction is stored in the computer storage medium, and when the computer instruction runs on an electronic device, the electronic device is enabled to execute the relevant method steps to implement the control method in the foregoing embodiment.
The embodiment of the present application further provides a computer program product, which when running on a computer, causes the computer to execute the above related steps to implement the control method in the above embodiment.
In addition, the embodiment of the present application further provides an apparatus, which may specifically be a chip, a component or a module, and the apparatus may include a processor and a memory connected to each other; the memory is used for storing computer execution instructions, and when the device runs, the processor can execute the computer execution instructions stored in the memory, so that the chip can execute the control method in the above-mentioned method embodiments.
Fig. 7 shows a schematic diagram of a chip 700. Chip 700 includes one or more processors 710 and interface circuits 720. Optionally, the chip 700 may further include a bus 730. Wherein:
processor 710 may be an integrated circuit chip having signal processing capabilities. In implementation, the steps of the above methods may be performed by integrated logic circuits of hardware in the processor 710 or by instructions in the form of software. The processor 710 may be a general purpose processor, a DSP, an ASIC, an FPGA or other programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component, and may implement or perform the methods and steps disclosed in the embodiments of the present application. A general purpose processor may be a microprocessor, or the processor may be any conventional processor or the like.
The interface circuit 720 may be used for transmitting or receiving data, instructions or information, and the processor 710 may perform processing by using the data, instructions or other information received by the interface circuit 720, and may transmit the processing completion information through the interface circuit 720.
Optionally, the chip further comprises a memory, which may include read only memory and random access memory, and provides operating instructions and data to the processor. The portion of memory may also include non-volatile random access memory (NVRAM).
Alternatively, the memory stores executable software modules or data structures, and the processor may perform corresponding operations by calling operating instructions stored in the memory (which may be stored in an operating system).
Alternatively, the chip may be used in the control device or the network service device according to the embodiments of the present application. Optionally, the interface circuit 720 may be used to output the results of execution by the processor 710. For the control method provided in one or more embodiments of the present application, reference may be made to the foregoing embodiments, which are not described herein again.
It should be noted that the functions corresponding to the processor 710 and the interface circuit 720 may be implemented by hardware design, may also be implemented by software design, and may also be implemented by a combination of software and hardware, which is not limited herein.
The embodiment of the present application further provides a terminal, where the terminal includes the apparatus 500 described in fig. 5, the apparatus 600 described in fig. 6, or the chip 700 described in fig. 7.
An embodiment of the present application further provides a server, where the server includes the apparatus 500 described in fig. 5, the apparatus 600 described in fig. 6, or the chip 700 described in fig. 7.
It should be noted that the control device, the computer storage medium, the computer program product, the chip, the terminal, or the server provided in this embodiment are all configured to execute the corresponding method provided above, and therefore, the beneficial effects achieved by the control device, the computer storage medium, the computer program product, the chip, the terminal, or the server may refer to the beneficial effects in the corresponding method provided above, and are not described herein again.
It should be understood that, in the various embodiments of the present application, the sequence numbers of the above-mentioned processes do not mean the execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation to the implementation process of the embodiments of the present application.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described systems, apparatuses and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the several embodiments provided in the present application, it should be understood that the disclosed system, apparatus, and method may be implemented in other ways. The apparatus embodiments described above are merely illustrative. For example, the division into units is only one form of logical division, and other divisions are possible in practice: a plurality of units or components may be combined or integrated into another system, or some features may be omitted or not performed. In addition, the shown or discussed mutual couplings, direct couplings, or communication connections may be indirect couplings or communication connections through interfaces, devices, or units, and may be electrical, mechanical, or in other forms.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, each unit may exist alone physically, or two or more units may be integrated into one unit.
The functions, if implemented in the form of software functional units and sold or used as an independent product, may be stored in a computer-readable storage medium. Based on this understanding, the technical solutions of the present application essentially, or the part thereof contributing to the prior art, may be embodied in the form of a software product. The software product is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to perform all or some of the steps of the methods described in the embodiments of the present application. The foregoing storage medium includes any medium capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.
The above description covers only specific embodiments of the present application, but the protection scope of the present application is not limited thereto. Any variation or replacement readily conceivable by a person skilled in the art within the technical scope disclosed in the present application shall fall within the protection scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (23)

1. A control method, comprising:
acquiring first behavior indicating data of a first terminal, wherein the first behavior indicating data is used for indicating a first behavior of the first terminal;
determining a first behavior purpose of the first behavior according to the first behavior indicating data;
and based on the first behavior purpose, reminding a user and/or controlling a second terminal to perform a first operation.
2. The method of claim 1, wherein before the determining a first behavior purpose of the first behavior according to the first behavior indicating data, the method further comprises:
acquiring reference data, the reference data comprising at least one of driving state data, environment data, and road condition data, wherein the driving state data is used for indicating a driving state of the second terminal, the environment data is used for indicating an environment outside and/or inside the second terminal, and the road condition data is used for indicating a road condition around the second terminal; and
the determining a first behavior purpose of the first behavior according to the first behavior indicating data comprises:
determining the first behavior purpose according to the first behavior indicating data and the reference data.
3. The method of claim 1, wherein the determining a first behavior purpose of the first behavior according to the first behavior indicating data comprises:
inputting the first behavior indicating data into a behavior purpose recognition model to determine the first behavior purpose, wherein the behavior purpose recognition model is used for indicating a mapping relationship between behavior indicating data and behavior purposes.
4. The method according to any one of claims 1-3, wherein before the controlling the second terminal to perform the first operation based on the first behavior purpose, the method further comprises:
determining, based on the first behavior purpose, the first operation corresponding to the first behavior purpose.
5. The method of claim 4, wherein the determining, based on the first behavior purpose, the first operation corresponding to the first behavior purpose comprises:
querying a mapping table based on the first behavior purpose, and determining, in the mapping table, the first operation corresponding to the first behavior purpose, wherein the mapping table comprises at least one behavior purpose and an operation corresponding to each of the at least one behavior purpose, and the at least one behavior purpose comprises the first behavior purpose.
6. The method of any of claims 1-5, wherein the first operation comprises at least one of a vehicle speed control operation, a flashing light operation, a whistling operation, and a lane-changing operation, the vehicle speed control operation comprising a braking operation, a starting operation, an accelerating operation, or a decelerating operation.
7. The method according to any of claims 1-6, wherein said controlling the second terminal to perform a first operation comprises:
sending indication information to the second terminal, wherein the indication information is used for instructing the second terminal to perform the first operation.
8. The method according to any one of claims 1-7, wherein the first behavior comprises a flashing behavior and/or a whistling behavior.
9. The method according to any one of claims 1-8, wherein the first behavior indicating data comprises at least one of the following: a first image, first audio, image feature data, and audio feature data, wherein the first image comprises light of the first terminal, the first audio comprises a whistle sound of the first terminal, the image feature data is used for indicating a position, a color, a brightness, a flashing frequency, and/or a number of flashes of the light, and the audio feature data is used for indicating a position, a duration, a frequency, and/or a number of occurrences of the whistle sound.
10. A control device, characterized by comprising:
a transceiver unit, configured to acquire first behavior indication data of a first terminal, where the first behavior indication data is used to indicate a first behavior of the first terminal;
a processing unit, configured to determine a first behavior purpose of the first behavior according to the first behavior indicating data acquired by the transceiver unit, and to remind a user and/or control a second terminal to perform a first operation based on the first behavior purpose.
11. The apparatus according to claim 10, wherein the processing unit is specifically configured to:
acquire reference data, the reference data comprising at least one of driving state data, environment data, and road condition data, wherein the driving state data is used for indicating a driving state of the second terminal, the environment data is used for indicating an environment outside and/or inside the second terminal, and the road condition data is used for indicating a road condition around the second terminal; and
determine the first behavior purpose according to the first behavior indicating data and the reference data.
12. The apparatus according to claim 10, wherein the processing unit is specifically configured to:
input the first behavior indicating data into a behavior purpose recognition model to determine the first behavior purpose, wherein the behavior purpose recognition model is used for indicating a mapping relationship between behavior indicating data and behavior purposes.
13. The apparatus according to any of claims 10-12, wherein the processing unit is further configured to:
before the second terminal is controlled to perform a first operation based on the first behavior purpose, determine, based on the first behavior purpose, the first operation corresponding to the first behavior purpose.
14. The apparatus according to claim 13, wherein the processing unit is specifically configured to:
query a mapping table based on the first behavior purpose, and determine, in the mapping table, the first operation corresponding to the first behavior purpose, wherein the mapping table comprises at least one behavior purpose and an operation corresponding to each of the at least one behavior purpose, and the at least one behavior purpose comprises the first behavior purpose.
15. The apparatus according to any one of claims 10-14, characterized in that the first operation comprises at least one of a vehicle speed control operation, a flashing light operation, a whistling operation, and a lane-changing operation, the vehicle speed control operation comprising a braking operation, a starting operation, an accelerating operation, or a decelerating operation.
16. The apparatus according to any one of claims 10-15, wherein the processing unit is configured to:
control the transceiver unit to send indication information to the second terminal, wherein the indication information is used for instructing the second terminal to perform the first operation.
17. The apparatus according to any one of claims 10-16, wherein the first behavior comprises a flashing behavior and/or a whistling behavior.
18. The apparatus according to any one of claims 10-17, wherein the first behavior indicating data comprises at least one of the following: a first image, first audio, image feature data, and audio feature data, wherein the first image comprises light of the first terminal, the first audio comprises a whistle sound of the first terminal, the image feature data is used for indicating a position, a color, a brightness, a flashing frequency, and/or a number of flashes of the light, and the audio feature data is used for indicating a position, a duration, a frequency, and/or a number of occurrences of the whistle sound.
19. A control apparatus comprising at least one processor and a transceiver, the at least one processor and the transceiver being coupled, wherein the control apparatus implements the method of any one of claims 1-9 when the at least one processor executes a program or instructions.
20. A terminal, characterized in that the terminal comprises the control device according to any one of claims 10-18 or the control device according to claim 19.
21. A chip, comprising at least one processor and an interface circuit, the interface circuit being configured to transmit or receive data, instructions, or information for the at least one processor, characterized in that the at least one processor, when executing program code or instructions, implements the method of any one of claims 1-9.
22. A computer-readable storage medium for storing a computer program, characterized in that the computer program comprises instructions for implementing the method of any of the preceding claims 1-9.
23. A computer program product comprising instructions which, when run on a computer or processor, cause the computer or processor to carry out the method of any one of claims 1 to 9.
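The control flow defined abstractly in claims 1, 3, and 5 — acquire behavior-indicating data of a first terminal, infer the behavior purpose, then look up the corresponding operation for the second terminal in a mapping table — can be sketched in code. The sketch below is purely illustrative and is not the patented implementation: all names (`BehaviorData`, `recognize_purpose`, `OPERATION_TABLE`) and the specific purposes and operations are hypothetical stand-ins, and the simple rule function stands in for the behavior purpose recognition model of claim 3.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class BehaviorData:
    """First-behavior indicating data (cf. claim 9): light/whistle features."""
    flash_count: int = 0      # number of light flashes observed
    whistle_count: int = 0    # number of whistle sounds observed

def recognize_purpose(data: BehaviorData) -> Optional[str]:
    """Stand-in for the behavior purpose recognition model of claim 3:
    maps behavior-indicating data to an inferred behavior purpose."""
    if data.flash_count >= 2:
        return "request_overtake"   # repeated flashing: rear vehicle asks to pass
    if data.whistle_count >= 1:
        return "warn_proximity"     # whistle: warning about closeness
    return None

# Mapping table of claim 5: behavior purpose -> operation for the second terminal.
OPERATION_TABLE = {
    "request_overtake": "lane_change",
    "warn_proximity": "decelerate",
}

def control(data: BehaviorData) -> Optional[str]:
    """Claim 1 flow: determine the behavior purpose, then select the operation."""
    purpose = recognize_purpose(data)
    if purpose is None:
        return None
    return OPERATION_TABLE.get(purpose)  # claim 5: query the mapping table
```

For example, `control(BehaviorData(flash_count=3))` infers the purpose `"request_overtake"` and selects the operation `"lane_change"` for the second terminal.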
CN202110291392.1A 2021-03-18 2021-03-18 Control method, device and system Pending CN115116247A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202110291392.1A CN115116247A (en) 2021-03-18 2021-03-18 Control method, device and system
PCT/CN2022/081453 WO2022194246A1 (en) 2021-03-18 2022-03-17 Control method, apparatus and system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110291392.1A CN115116247A (en) 2021-03-18 2021-03-18 Control method, device and system

Publications (1)

Publication Number Publication Date
CN115116247A true CN115116247A (en) 2022-09-27

Family

ID=83322098

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110291392.1A Pending CN115116247A (en) 2021-03-18 2021-03-18 Control method, device and system

Country Status (2)

Country Link
CN (1) CN115116247A (en)
WO (1) WO2022194246A1 (en)

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107839593A (en) * 2016-09-18 2018-03-27 西华大学 Automobile rear car is overtaken other vehicles prompt system
CN109741609B (en) * 2019-02-25 2021-05-04 南京理工大学 Motor vehicle whistling monitoring method based on microphone array
CN110097775A (en) * 2019-04-29 2019-08-06 大众问问(北京)信息科技有限公司 A kind of running information based reminding method, apparatus and system
CN110718093B (en) * 2019-10-16 2021-11-16 联想(北京)有限公司 Processing method for vehicle whistle and first vehicle

Also Published As

Publication number Publication date
WO2022194246A1 (en) 2022-09-22
WO2022194246A9 (en) 2022-11-24


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination