CN113515060A - Control script processing method and device, electronic equipment and storage medium - Google Patents

Info

Publication number
CN113515060A
CN113515060A (application CN202110758337.9A)
Authority
CN
China
Prior art keywords
vehicle
specified
personality
target vehicle
information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202110758337.9A
Other languages
Chinese (zh)
Other versions
CN113515060B (en)
Inventor
黄超超
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Jiayu Intelligent Technology Co ltd
Original Assignee
Shanghai Xianta Intelligent Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Xianta Intelligent Technology Co Ltd filed Critical Shanghai Xianta Intelligent Technology Co Ltd
Priority to CN202110758337.9A priority Critical patent/CN113515060B/en
Publication of CN113515060A publication Critical patent/CN113515060A/en
Application granted granted Critical
Publication of CN113515060B publication Critical patent/CN113515060B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05B - CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B19/00 - Programme-control systems
    • G05B19/02 - Programme-control systems electric
    • G05B19/04 - Programme control other than numerical control, i.e. in sequence controllers or logic controllers
    • Y - GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 - TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P - CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00 - Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/02 - Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention provides a control script processing method and device for a vehicle, an electronic device, and a storage medium. The control script processing method includes: establishing a triggering relation between a specified monitoring event and a specified control result to obtain a current control script representing the triggering relation; determining release range information, wherein the release range information includes a specified personality; and releasing the current control script to a target vehicle according to the release range information, so that the target vehicle executes the specified control result when the specified monitoring event is monitored, wherein the current personality of a target robot in the target vehicle matches the specified personality.

Description

Control script processing method and device, electronic equipment and storage medium
Technical Field
The present invention relates to the field of vehicles, and in particular, to a control script processing method and apparatus, an electronic device, and a storage medium.
Background
A vehicle is usually configured with an on-board device, through which signal processing and human-machine interaction can be realized. The on-board device includes a robot. Based on a control script, the items that the robot needs to monitor in the vehicle and the control results it needs to execute can be defined.
However, in the related art, control scripts are written in advance, which makes it difficult to meet changing requirements during use; meanwhile, the control scripts of all robots are essentially the same and can hardly match differing requirements.
Disclosure of Invention
The invention provides a control script processing method and device, an electronic device, and a storage medium, aiming to address the difficulty of meeting such requirements.
According to a first aspect of the present invention, there is provided a control script processing method of a vehicle, including:
establishing a triggering relation between a specified monitoring event and a specified control result to obtain a current control script representing the triggering relation;
determining release range information, wherein the release range information includes a specified personality;
releasing the current control script to a target vehicle according to the release range information, so that the target vehicle executes the specified control result when the specified monitoring event is monitored; wherein the current personality of a target robot in the target vehicle matches the specified personality.
Optionally, before determining the release range information, the method further includes:
if selected or input vehicle description information is acquired, determining a vehicle profile according to the vehicle description information;
if selected or input user description information is acquired, determining a user profile according to the user description information;
acquiring the specified personality;
and the determining of the release range information includes:
determining the release range information according to the specified personality and the vehicle profile and/or the user profile.
Optionally, the releasing the current control script to the target vehicle according to the release range information includes:
obtaining vehicle information of candidate vehicles, user information of users associated with the candidate vehicles, and the robot personality of the robot in each candidate vehicle;
comparing the release range information with the vehicle information, the user information, and the robot personality, and selecting the target vehicle from the candidate vehicles;
and releasing the current control script to the target vehicle.
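The candidate-selection step above can be sketched as follows. This is a minimal, illustrative Python sketch only; every field name (`brand`, `age`, `robot_personality`, and so on) is an assumption for illustration and is not defined by the patent.

```python
# Illustrative sketch: filter candidate vehicles by comparing each candidate's
# vehicle info, associated-user info, and robot personality against the
# release range information, then keep the matches as target vehicles.

def select_target_vehicles(candidates, release_range):
    """Return candidates whose attributes fall inside the release range."""
    targets = []
    for v in candidates:
        if (v["brand"] in release_range["brands"]
                and v["user"]["age"] in release_range["age_range"]
                and v["robot_personality"] in release_range["personalities"]):
            targets.append(v)
    return targets

candidates = [
    {"id": "B", "brand": "X", "user": {"age": 30},
     "robot_personality": "cute_panda"},
    {"id": "C", "brand": "Y", "user": {"age": 55},
     "robot_personality": "sensitive_introvert"},
]
release_range = {"brands": {"X"}, "age_range": range(18, 40),
                 "personalities": {"cute_panda"}}
print([v["id"] for v in select_target_vehicles(candidates, release_range)])
# ['B']
```

The control script would then be released only to the vehicles returned by this filter.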
Optionally, the step of establishing a triggering relationship between the specified monitoring event and the specified control result to obtain a current control script representing the triggering relationship includes:
acquiring a specified monitoring event and a specified control result;
according to the specified monitoring event and the specified control result, a current display interface representing the triggering relation is displayed externally; the current display interface comprises:
an event display element characterizing the specified monitoring event;
a result display element characterizing the specified control result; and
a logical identifier characterizing the triggering relation;
and forming the current control script after a script confirmation instruction for the current display interface is acquired.
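A "current control script" encoding such a triggering relation could be represented, for example, as a small structured document. The JSON shape and field names below are purely illustrative assumptions; the patent does not define a concrete serialization.

```python
# Assumed, minimal representation of a control script: one specified
# monitoring event triggering one or more specified control results.
import json

current_control_script = {
    "trigger": {
        "event": {"type": "arrive_at_area", "area_id": "school_zone_01"},
        "results": [
            {"type": "play_voice", "content": "Approaching a school zone"},
            {"type": "robot_motion", "part": "hand", "motion": "wave"},
        ],
    }
}
print(json.dumps(current_control_script, indent=2))
```

Such a structure maps directly onto the display interface described above: the event entry corresponds to the event display element, each result entry to a result display element, and the nesting to the logical identifier of the triggering relation.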
Optionally, the current personality is a personality currently configured by the target robot.
Optionally, the current personality is all configurable personalities of the target robot.
Optionally, the current personality is a personality that the target robot has been configured for.
Optionally, the specified monitoring event includes at least one of:
the target vehicle arrives at a designated area;
the current weather of the area where the target vehicle is located matches specified weather information;
a designated vehicle signal appears at the target vehicle; the designated vehicle signal characterizes a driving behavior of the target vehicle;
a specified traffic state occurs on the road where the target vehicle is currently located;
an image acquisition device of the target vehicle detects a specific picture;
a sound collection device of the target vehicle detects specific voice content;
a human-computer interaction part of the target vehicle receives specified information fed back by the user;
a sensor of the target vehicle detects that detection information enters a specified interval.
Optionally, the specified control result includes at least one of:
a display unit of the target vehicle displays a specified screen;
a sound unit of the target vehicle plays specified voice content;
a change in a window state of the target vehicle;
a change in the state of a vehicle component of the target vehicle; the vehicle component includes at least one of a vehicle door, a vehicle window, a windscreen wiper, a vehicle lamp, an air conditioner, a vehicle seat control part, a vehicle mirror control part, and an image acquisition part;
controlling a movable component of the target robot to produce a designated motion;
adjusting an emotion value of at least one emotion dimension of the target robot.
According to a second aspect of the present invention, there is provided a control script processing apparatus comprising:
the script establishing module is used for establishing a triggering relation between a specified monitoring event and a specified control result to obtain a current control script representing the triggering relation;
the range determining module is used for determining release range information, wherein the release range information includes a specified personality;
the release module is used for releasing the current control script to a target vehicle according to the release range information, so that the target vehicle executes the specified control result when the specified monitoring event is monitored; wherein the current personality of the target robot in the target vehicle matches the specified personality.
According to a third aspect of the invention, there is provided an electronic device comprising a processor and a memory,
the memory is used for storing code and related data;
the processor is configured to execute the code in the memory to implement the control script processing method according to the first aspect and its optional aspects.
According to a fourth aspect of the present invention, there is provided a storage medium having stored thereon a computer program which, when executed by a processor, implements the control script processing method relating to the first aspect and its alternatives.
In the control script processing method and device, the electronic device, and the storage medium, the triggering relation between the specified monitoring event and the specified control result can be established in the background to obtain the current control script representing the triggering relation. This forms a mechanism for establishing and releasing control scripts, provides a basis for updating and writing control scripts, and thus helps match and meet updating requirements during the use of the vehicle.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and for those skilled in the art, other drawings can be obtained according to these drawings without creative efforts.
FIG. 1 is a first flowchart illustrating a control script processing method according to an embodiment of the present invention;
FIG. 2 is a flowchart illustrating step S101 according to an embodiment of the present invention;
FIG. 3 is a flowchart illustrating a control script processing method according to an embodiment of the invention;
FIG. 4 is a flowchart illustrating step S103 according to an embodiment of the present invention;
FIG. 5 is a first diagram illustrating program modules for controlling a script processing apparatus according to an embodiment of the present invention;
FIG. 6 is a second schematic diagram of program modules of the control script processing apparatus in an embodiment of the present invention;
fig. 7 is a schematic structural diagram of an electronic device in an embodiment of the invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The terms "first," "second," "third," "fourth," and the like in the description and in the claims, as well as in the drawings, if any, are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the invention described herein are capable of operation in sequences other than those illustrated or described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
The technical solution of the present invention will be described in detail below with specific examples. The following several specific embodiments may be combined with each other, and details of the same or similar concepts or processes may not be repeated in some embodiments.
The control script processing method and device for the vehicle provided by the embodiments of the present invention can be applied in a background, or in a control device communicating with the background; the background and the control device may be, for example, the electronic device 30 mentioned later, and the background can be configured to communicate directly or indirectly with each vehicle (which can be understood as a candidate vehicle). In addition, the background or control device can be configured with a human-machine interaction interface and human-machine interaction components to carry out at least part of the execution processes of the control script processing method and device described below.
Referring to fig. 1, in an embodiment of the present invention, a method for processing a control script of a vehicle includes:
s101: establishing a triggering relation between a specified monitoring event and a specified control result to obtain a current control script representing the triggering relation;
s102: determining release range information;
s103: according to the release range information, releasing the current control script to a target vehicle so as to enable: the target vehicle is capable of executing the specified control result when the specified monitoring event is monitored.
The release range information can be understood as any information describing the vehicles to which the current control script can be released, and specifically includes a specified personality, i.e., one or more designated personalities of the robot.
In some embodiments, the release range information may further include:
a vehicle profile of the target vehicle; and/or a user profile of a user associated with the target vehicle.
The vehicle profile may include, for example but not limited to, ranges of the vehicle's brand, model, production time, production place, etc.; the user profile may include, for example but not limited to, the user's age, occupation, income, address, user rating, etc.
In other schemes, the release range information may further include desired ranges of the brand, model, production time, production place, state, etc. of other on-board devices of the target vehicle (including, but not limited to, an air conditioner, an image acquisition unit, entertainment devices, etc.).
The personality can be any information describing the anthropomorphic character of the robot. In one example, personalities may include categories such as: introverted, extroverted, cute, etc. Further, based on the above example, personalities may be subdivided by gender, for example: a sensitive introverted young lady, a lively extroverted young man, a cute panda, etc. In one example, personalities may also be divided by age: a kindly grandmother, a sharp-tongued uncle, a cute little girl, etc. Personalities may be defined in any manner without departing from the scope of the embodiments of the invention.
Wherein the current personality of the target robot in the target vehicle matches the specified personality.
In some examples, a match may mean that the current personality is the same as the specified personality.
In other examples, a match may also mean that the current personality is similar to or associated with the specified personality. Similarity and association can be defined in any manner: which personalities are similar or associated can be defined in advance to obtain personality definition information, and whether personalities match can then be judged based on this personality definition information. For example, a sensitive introverted young lady may be defined as similar to a sensitive introverted young man, and/or a sensitive introverted young lady and a lively extroverted young lady may be defined as associated personalities.
In this way, the set of vehicles to which a control script can be released is effectively expanded based on the specified personality, which reduces operational burden and allows control scripts to be released across the various possible extensions of a personality. After a personality is specified, even if a new personality is created (defined) during subsequent maintenance, whether a vehicle qualifies as a release target can still be judged against the specified personality, without re-specifying a personality for each control script. For example: suppose a control script X is released to vehicles with personality A1, and the robot of a vehicle B changes from personality B to personality A2 (which may or may not be newly created). Since neither personality before nor after the change is A1, the control script X ordinarily could not be released to vehicle B; releasing X to vehicle B would then require adding A2 as a specified personality of X or creating a new control script. In the above embodiment, however, the control script X can be released to vehicle B automatically based on the similarity or association between personality A1 and personality A2.
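The personality-matching rule (identical, similar, or associated) can be sketched as below. The similarity and association tables are illustrative assumptions standing in for the "personality definition information" described above.

```python
# Illustrative sketch: a current personality matches a specified personality
# if they are identical, or if the pair is predefined as similar/associated.

SIMILAR = {("sensitive_introvert", "sensitive_introvert_v2")}
ASSOCIATED = {("sensitive_introvert", "lively_extrovert")}

def normalize(pair):
    # Treat (a, b) and (b, a) as the same pair.
    return tuple(sorted(pair))

def personality_matches(current, specified):
    if current == specified:
        return True
    pair = normalize((current, specified))
    return (pair in {normalize(p) for p in SIMILAR}
            or pair in {normalize(p) for p in ASSOCIATED})

print(personality_matches("sensitive_introvert_v2", "sensitive_introvert"))
# True
print(personality_matches("cute_panda", "sensitive_introvert"))
# False
```

Under this scheme, adding a new personality only requires extending the definition tables; existing control scripts need no re-specification.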
In one example, the current personality is the personality currently configured on the target robot. This ensures that the release of the control script matches the current personality of the target robot in the target vehicle.
In another example, the current personality comprises all configurable personalities of the target robot. This ensures that the released control script matches every personality the target vehicle could adopt, avoiding re-releasing the control script each time the personality of the target vehicle is switched.
In yet another example, the current personality comprises the personalities that the target robot has already been configured with. Since already-configured personalities are usually the ones most likely to be used and switched to by the target robot of the target vehicle, this ensures the released control script matches the personalities the target vehicle is likely to adopt, avoids re-releasing the control script after each personality switch, and avoids the burden of too many control scripts being released to the target vehicle.
Furthermore, each control script may be released only once to each target vehicle, avoiding repeated transmission of the control script.
In some embodiments, the current personality may be determined by the vehicle based on the accessories connected to the robot. In one example, the robot may have a changeable appearance: different appearances correspond to different accessories (the accessories may be wearable accessories of the robot, such as a hat or glasses; body parts of the robot itself, such as ears or a tail; or any other accessories that can be fitted to the robot), so the current personality can be determined from the connected accessories. In some schemes, if an accessory has data storage capability, the stored personality can be read from the accessory as the current personality. This scheme lets the vehicle adapt the current personality according to demand and preference.
In other schemes, the current personality may be determined by the vehicle in response to a personality-specifying operation of the user, or may be a personality pre-stored in the robot.
The specified monitoring event can be any event that can be monitored by the vehicle. For example, the specified monitoring event includes at least one of:
the target vehicle arrives at a designated area; the designated area may be understood as a preset area range, for example a road, street, business district, park, or a region of a city, province, or country, or an area corresponding to a POI, for example a house, shop, mailbox, bus station, tourist attraction, school, etc.;
the current weather of the area where the target vehicle is located matches specified weather information; the current weather can be acquired from the network or through various sensors, such as a temperature sensor or humidity sensor, and the specified weather information includes at least one of: a specified temperature, humidity, wind level, air quality, and weather category, where the weather category can be, for example, sunny, cloudy, light rain, heavy rain, or thunderstorm;
a designated vehicle signal appears at the target vehicle; the designated vehicle signal characterizes a driving behavior of the target vehicle, and can involve dimensions such as sudden braking, engine shutoff, acceleration, deceleration, vehicle starting, the number of people in the vehicle reaching a preset number, the speed reaching a speed threshold, sudden turning, the acceleration duration reaching a corresponding preset threshold, etc.;
a specified traffic state occurs on the road where the target vehicle is currently located; the traffic state can be obtained over the network from one or more map operators, or obtained through a positioning device, and the specified traffic state can include a specified road congestion level, a specified road speed-limit level, a specified accident-occurrence level, etc.;
an image acquisition device of the target vehicle detects a specific picture; the image acquisition device can be a camera, an image sensor, etc., the designated range can be the range that the image acquisition device can capture, and different ranges can be preset for a specific picture. The specific picture can be a single picture, for example a designated person, animal, or object appearing in the designated range, a designated expression appearing on a person in the designated range, or a designated building or landscape being captured; it can also be a continuously changing sequence of pictures, for example a person or animal in the designated range performing a designated action;
a sound collection device of the target vehicle detects specific voice content; the sound collection device can be a microphone collecting sound within a specific range, and the specific voice content can be understood as specific voice content collected through voice interaction, or as sound in the environment actively collected by the robot that matches the specific voice content;
the human-computer interaction part of the target vehicle receives the specified information fed back by the user; the human-computer interaction part can be interaction equipment such as a touch screen, voice interaction equipment, visual interaction equipment and the like;
a sensor of the target vehicle detects that detection information enters a specified interval; the sensors therein may be, for example, odor sensors, air quality sensors, etc., to monitor odor, air quality, temperature, humidity, etc., within a given environment.
The specified control result may be any result that can cause the target vehicle (particularly the robot therein) to change. In a specific example, the specified control result may include at least one of:
a display unit of the target vehicle displays a specified screen; the display unit can be, for example, a display screen and a touch screen of a vehicle machine of a target vehicle, or a display screen and a touch screen of a robot therein;
the sound unit of the target vehicle plays the specified voice content; the sound unit can be any device or combination of devices capable of playing sound;
a change in a state of a vehicle component of the target vehicle; the vehicle part comprises at least one of a vehicle door, a vehicle window, a windscreen wiper, a vehicle lamp, an air conditioner, a vehicle seat control part, a vehicle mirror control part and an image acquisition part;
controlling a movable component of the target robot to produce a designated motion; the movable parts of the robot may be, for example, the head, body, limbs, eyes, ears, mouth, nose, hands, feet, etc.; the exterior of the robot may also be dressed differently to change its appearance, with different dressings corresponding to fitting different accessories (which may be wearable accessories of the robot, such as hats or glasses), in which case the movable part of the robot is the fitted accessory;
adjusting the emotion values of a plurality of emotion dimensions of the target robot; the emotion dimensions can be, for example, pleasure, depression, anger, fear, jealousy, love, etc.; further, the pleasure dimension can be divided into joyful and excited dimensions, and emotions such as urgency can be added. One control script may correspond to adjusting the emotion value of one emotion dimension or of a plurality of emotion dimensions. A further control result can be realized based on changes in the emotion dimensions; for example, when the distribution of the emotion values, or a value itself, satisfies a predefined condition, a control result corresponding to that condition may be executed, whose content can be understood with reference to the specified control result.
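The emotion-adjustment result can be sketched as below: a script applies deltas to several emotion dimensions, and a further control result fires once a predefined condition on the values is satisfied. The dimension names, clamping range, and threshold are illustrative assumptions.

```python
# Illustrative sketch: adjust emotion values across dimensions, then check a
# predefined condition that triggers a further control result.

def adjust(emotions, deltas, pleasure_threshold=0.8):
    for dim, delta in deltas.items():
        # Clamp each dimension's value into [0, 1].
        emotions[dim] = max(0.0, min(1.0, emotions[dim] + delta))
    if emotions["pleasure"] >= pleasure_threshold:
        return "play_cheerful_animation"   # further control result
    return None

emotions = {"pleasure": 0.4, "fear": 0.1, "anger": 0.0}
print(adjust(emotions, {"pleasure": 0.5, "fear": -0.1}))
# play_cheerful_animation
```

A single control script could carry one such delta map per trigger, covering both the one-dimension and multi-dimension cases described above.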
The triggering relation between the specified monitoring event and the specified control result can be understood as follows: after the specified monitoring event is monitored, execution of the specified control result is triggered. The relation may be many monitoring events triggering one control result, one monitoring event triggering one control result, or one monitoring event triggering many control results.
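The many-to-one and one-to-many cases of the triggering relation can be sketched with a script holding a list of events and a list of results. The all-events-must-occur semantics below is an assumption for illustration; the patent does not fix how multiple events combine.

```python
# Illustrative sketch: evaluate a triggering relation. Here, all listed
# monitoring events must have occurred for the listed control results to fire
# (many events -> one or many results); a single-element event list gives the
# one-event cases.

def evaluate(script, occurred_events):
    if all(e in occurred_events for e in script["events"]):
        return script["results"]
    return []

script = {"events": ["raining", "window_open"],
          "results": ["close_window", "play_voice_reminder"]}
print(evaluate(script, {"raining", "window_open"}))
# ['close_window', 'play_voice_reminder']
print(evaluate(script, {"raining"}))
# []
```

An any-event (disjunctive) combination would swap `all` for `any`; either choice fits the cardinalities the text describes.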
In the above scheme based on steps S101 to S103, the triggering relation between the specified monitoring event and the specified control result can be established in the background to obtain the current control script representing the triggering relation, forming a mechanism for establishing and releasing control scripts. This provides a basis for updating and writing control scripts and thus helps match and meet updating requirements during the use of the vehicle. Meanwhile, the control script of each vehicle can be released based on personality, so as to meet the control requirements of robots with the specified personality in a targeted manner, and the released scripts can fully embody the individuation and differentiation of different robots (and the vehicles using them).
In one embodiment, referring to fig. 2, step S101 may include:
s1011: acquiring a specified monitoring event and a specified control result;
s1012: according to the specified monitoring event and the specified control result, a current display interface representing the triggering relation is displayed externally;
s1013: and after a script confirmation instruction for the current display interface is acquired, forming the current control script.
In step S101, a selection and/or input interface for monitoring events may be entered by, for example, clicking a button; in this interface, the user may select a monitoring event and adjust its corresponding parameters, thereby obtaining the specified monitoring event through the interface.
Similarly, a selection and/or input interface for control results may be entered by, for example, clicking a button; in this interface, the user may select a control result and adjust its corresponding parameters, thereby obtaining the specified control result through the interface.
The current display interface comprises:
an event display element characterizing the specified monitoring event;
a result display element characterizing the specified control result; and
a logical identifier characterizing the triggering relation.
The display elements may be, for example, boxes with graphics and/or text inside (which may be determined based on the specified control result and the specified monitoring event), and the logical identifier may be, for example, a connecting line between the boxes. In other examples, other display manners may be used.
The script confirmation instruction may be, for example, information formed by a user clicking a confirmation button in the interface.
In the above scheme, the content of the control script to be formed is displayed externally in a visual manner. After the display, the user can judge whether the displayed content is the required control script, and can also modify or re-establish the control script on that basis. This ensures the accuracy and operability of the establishment process, and makes the control script more intuitive and easier to understand, modify and verify.
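The interface state of steps S1011 to S1013 might map to a stored script as sketched below. The class and field names are illustrative assumptions, not part of the disclosed implementation.

```python
# Minimal sketch: the visual interface holds an event element, a result
# element and a logic identifier; confirmation freezes them into the
# current control script. All names are hypothetical.

from dataclasses import dataclass

@dataclass
class DisplayInterface:
    event_element: str    # box labelled with the specified monitoring event
    result_element: str   # box labelled with the specified control result
    logic_id: str = "->"  # connecting line characterizing the trigger relation

def confirm(interface: DisplayInterface) -> dict:
    """S1013: upon the user's script confirmation instruction, the
    displayed trigger relationship becomes the current control script."""
    return {
        "on_event": interface.event_element,
        "do": interface.result_element,
        "relation": interface.logic_id,
    }

ui = DisplayInterface("vehicle arrives at designated area",
                      "sound unit plays specified voice content")
script = confirm(ui)
print(script["on_event"])  # vehicle arrives at designated area
```

Because the script is only formed after `confirm`, the user can inspect or modify the displayed elements first, matching the accuracy and operability point made above.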
In one embodiment, referring to fig. 3, before step S102, the method further includes:
S104: determining whether selected or input vehicle description information is acquired;
S106: determining whether selected or input user description information is acquired;
if the determination in step S104 is yes, step S105 may be executed: determining the vehicle portrait according to the vehicle description information;
if the determination in step S106 is yes, step S107 may be executed: determining the user portrait according to the user description information;
S108: acquiring the designated personality.
further, by the above steps, the designated personality may be acquired, and the user figure and/or the vehicle figure may be acquired, and the step S102 may further include:
s1021: and determining the distribution range information according to the designated personality and the vehicle portrait and/or the user portrait.
The vehicle portrait and the user portrait may be understood with reference to the foregoing description.
The vehicle description information may be any information describing the vehicle, and the user description information may be any information describing the user. The description information may be part or all of the corresponding vehicle portrait or user portrait, or may instead be separate information used to specify that portrait.
Any portrait of the vehicle or the user in the art, and any information identifying such a portrait, falls within the scope of the above description. In addition, the designated personality, the vehicle description information and the user description information may be any information input or selected by a person. For example, options and labels for various personalities, vehicle description information and user description information may be provided in an interface, and one or more of them may be selected as the designated personality, the vehicle description information and the user description information; alternatively, the personality, the vehicle description information and the user description information may be input in a user-defined manner.
In this scheme, the user portrait, the vehicle portrait and the designated personality jointly determine the range for issuing the control script, so that the specified range can be accurately matched to the users, vehicles and personalities that may have the corresponding requirements, ensuring accurate matching of requirements.
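Steps S104 to S108 and S1021 amount to assembling release range information from a mandatory designated personality plus whichever portraits were provided. The sketch below illustrates this; the keys and the helper name are assumptions.

```python
# Sketch of S1021: the designated personality is mandatory, while the
# vehicle portrait and/or user portrait are attached only when the
# corresponding description information was selected or input
# (steps S104/S106). All names are hypothetical.

def determine_release_range(designated_personality,
                            vehicle_portrait=None,
                            user_portrait=None):
    info = {"personality": designated_personality}
    if vehicle_portrait is not None:   # S105 produced a vehicle portrait
        info["vehicle_portrait"] = vehicle_portrait
    if user_portrait is not None:      # S107 produced a user portrait
        info["user_portrait"] = user_portrait
    return info

rng = determine_release_range("lively", vehicle_portrait={"model": "SUV"})
print(sorted(rng))  # ['personality', 'vehicle_portrait']
```

The optional arguments mirror the "and/or" in S1021: the release range information may carry the vehicle portrait, the user portrait, both, or neither, but always carries the designated personality.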
In one embodiment, referring to fig. 4, step S103 may include:
S1031: acquiring vehicle information of candidate vehicles, user information of users associated with the candidate vehicles, and the robot personality of the robot in each candidate vehicle;
S1032: comparing the release range information with the vehicle information, the user information and the robot personality, and selecting the target vehicle from the candidate vehicles;
S1033: issuing the current control script to the target vehicle.
The vehicle information can be understood with reference to the foregoing vehicle description information and the information in the vehicle portrait; the user information can be understood with reference to the user description information and the information in the user portrait.
The comparison may be performed, for example, as follows:
if the delivery range information includes a vehicle representation but does not include a user representation, then: if the vehicle information of the candidate vehicle conforms to the vehicle image in the release range information and the specified personality is matched with the current personality of the robot in the candidate vehicle, determining that the candidate vehicle is the target vehicle;
if the distribution range information includes the user representation but not the vehicle representation, then: if the user information of the candidate vehicle conforms to the user portrait in the release range information and the designated personality is matched with the current personality of the robot in the candidate vehicle, determining the candidate vehicle as a target vehicle;
if the distribution range information comprises a vehicle portrait and a user portrait, then: and if the vehicle information of the candidate vehicle conforms to the vehicle portrait in the release range information, the user information of the candidate vehicle conforms to the user portrait in the release range information, and the specified personality is matched with the current personality of the robot in the candidate vehicle, determining that the candidate vehicle is the target vehicle.
Therefore, in this scheme, all target vehicles among the candidate vehicles that are suitable for issuing the current control script can be accurately found based on the above comparison, so that the control script is issued in a targeted manner. The script issued on this basis fully embodies the individuation and differentiation of different robots (and of vehicles using different robots), different users and different vehicles.
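The three comparison cases above reduce to one rule: the designated personality must always match, and each portrait that is present in the release range information must also match. A sketch under that reading, with all field names as illustrative assumptions:

```python
# Sketch of S1031-S1033: select target vehicles from candidate vehicles
# by comparing the release range information with each candidate's
# vehicle information, user information and robot personality.
# All field names are hypothetical.

def matches_portrait(info: dict, portrait: dict) -> bool:
    # A portrait matches when every portrait attribute appears in the
    # candidate's information with the same value.
    return all(info.get(k) == v for k, v in portrait.items())

def select_targets(release_range: dict, candidates: list) -> list:
    targets = []
    for c in candidates:
        # The designated personality must match in every case.
        if c["robot_personality"] != release_range["personality"]:
            continue
        vp = release_range.get("vehicle_portrait")
        up = release_range.get("user_portrait")
        if vp is not None and not matches_portrait(c["vehicle_info"], vp):
            continue
        if up is not None and not matches_portrait(c["user_info"], up):
            continue
        targets.append(c["id"])
    return targets

candidates = [
    {"id": "V1", "robot_personality": "lively",
     "vehicle_info": {"model": "SUV"}, "user_info": {"age": "young"}},
    {"id": "V2", "robot_personality": "lively",
     "vehicle_info": {"model": "sedan"}, "user_info": {"age": "young"}},
]
rng = {"personality": "lively", "vehicle_portrait": {"model": "SUV"}}
print(select_targets(rng, candidates))  # ['V1']
```

When neither portrait is present, the rule degenerates to personality matching alone, which is consistent with claim 1, where the release range information comprises only the designated personality.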
Referring to fig. 5, an embodiment of the present invention provides a control script processing apparatus 2, including:
the script establishing module 201 is configured to establish a triggering relationship between a specified monitoring event and a specified control result, and obtain a current control script representing the triggering relationship;
a range determining module 202, configured to determine release range information, where the release range information includes a designated personality;
the issuing module 203 is configured to issue the current control script to the target vehicle according to the issuing range information, so that: the target vehicle can execute the specified control result when the specified monitoring event is monitored; wherein the current personality of the target robot in the target vehicle matches the specified personality.
Optionally, referring to fig. 6, the control script processing apparatus 2 further includes:
a vehicle portrait determining module 204, configured to determine the vehicle portrait according to the vehicle description information if selected or input vehicle description information is acquired;
a user portrait determining module 205, configured to determine the user portrait according to the user description information if selected or input user description information is acquired;
a personality obtaining module 206, configured to obtain the designated personality;
the range determining module 202 is specifically configured to: and determining the distribution range information according to the designated personality and the vehicle portrait and/or the user portrait.
Optionally, the publishing module 203 is specifically configured to:
obtaining vehicle information of candidate vehicles, user information of users associated with the candidate vehicles, and the robot personality of the robot in each candidate vehicle;
comparing the release range information with the vehicle information, the user information and the robot personality, and selecting the target vehicle from the candidate vehicles;
and issuing the current control script to the target vehicle.
Optionally, the script creating module 201 is specifically configured to:
acquiring a specified monitoring event and a specified control result;
according to the specified monitoring event and the specified control result, a current display interface representing the triggering relation is displayed externally; the current display interface comprises:
an event display element characterizing the specified monitoring event;
a result display element that characterizes the specified control result, and:
characterizing a logical identification of the triggering relationship;
and after a script confirmation instruction for the current display interface is acquired, forming the current control script.
Referring to fig. 7, an electronic device 40 is provided, comprising:
a processor 41; and
a memory 42 for storing executable instructions of the processor;
wherein the processor 41 is configured to perform the above-mentioned method via execution of the executable instructions.
The processor 41 is capable of communicating with the memory 42 via the bus 43.
Embodiments of the present invention also provide a computer-readable storage medium, on which a computer program is stored, which when executed by a processor implements the above-mentioned method.
Those of ordinary skill in the art will understand that all or part of the steps of the above method embodiments may be implemented by hardware associated with program instructions. The program may be stored in a computer-readable storage medium; when executed, the program performs the steps of the above method embodiments. The aforementioned storage medium includes various media capable of storing program code, such as a ROM, a RAM, a magnetic disk or an optical disk.
Finally, it should be noted that the above embodiments are merely intended to illustrate the technical solutions of the present invention, not to limit them. Although the invention has been described in detail with reference to the foregoing embodiments, those skilled in the art will understand that the technical solutions described in the foregoing embodiments may still be modified, or some or all of their technical features may be equivalently replaced, without such modifications or replacements departing from the scope of the technical solutions of the embodiments of the present invention.

Claims (12)

1. A control script processing method for a vehicle, comprising:
establishing a triggering relation between a specified monitoring event and a specified control result to obtain a current control script representing the triggering relation;
determining release range information, wherein the release range information comprises a designated personality;
issuing the current control script to a target vehicle according to the release range information, so that the target vehicle executes the specified control result when the specified monitoring event is monitored; wherein the current personality of a target robot in the target vehicle matches the designated personality.
2. The control script processing method according to claim 1,
before the determining the distribution range information, the method further comprises:
if the selected or input vehicle description information is acquired, determining the vehicle portrait according to the vehicle description information;
if the selected or input user description information is acquired, determining the user portrait according to the user description information;
acquiring the designated personality;
the determining release range information comprises:
and determining the distribution range information according to the designated personality and the vehicle portrait and/or the user portrait.
3. The control script processing method according to claim 2,
the issuing the current control script to the target vehicle according to the issuing range information comprises:
obtaining vehicle information of a candidate vehicle, user information of a user associated with the candidate vehicle, and a robot personality of a robot in the candidate vehicle,
comparing the release range information with the vehicle information, the user information and the robot personality, and selecting the target vehicle from the candidate vehicles;
and issuing the current control script to the target vehicle.
4. The method according to claim 1, wherein the establishing a trigger relationship between a specific monitoring event and a specific control result to obtain a current control script representing the trigger relationship comprises:
acquiring a specified monitoring event and a specified control result;
according to the specified monitoring event and the specified control result, a current display interface representing the triggering relation is displayed externally; the current display interface comprises:
an event display element characterizing the specified monitoring event;
a result display element that characterizes the specified control result, and:
characterizing a logical identification of the triggering relationship;
and after a script confirmation instruction for the current display interface is acquired, forming the current control script.
5. The control script processing method of claim 1, wherein the current personality is a personality currently configured for the target robot.
6. The control script processing method of claim 1, wherein the current personality is all configurable personalities of the target robot.
7. The control script processing method of claim 1, wherein the current personality is a personality that the target robot has been configured for.
8. The control script processing method according to any one of claims 1 to 6, wherein the specified monitoring event includes at least one of:
the target vehicle arrives at a designated area;
the current weather of the area where the target vehicle is located is designated weather information;
a designated vehicle signal is present at the target vehicle; the designated vehicle signal characterizes a driving behavior for the target vehicle;
the road where the target vehicle is located has a specified traffic state;
an image acquisition device of the target vehicle detects a specified picture;
a sound collection device of the target vehicle detects specified voice content;
the human-computer interaction part of the target vehicle receives the specified information fed back by the user;
the sensor of the target vehicle detects that the detection information enters a specified section.
9. The control script processing method according to any one of claims 1 to 6, wherein the specified control result includes at least one of:
a display unit of the target vehicle displays a specified screen;
the sound unit of the target vehicle plays the specified voice content;
a change in a state of a vehicle component of the target vehicle; the vehicle part comprises at least one of a vehicle door, a vehicle window, a windscreen wiper, a vehicle lamp, an air conditioner, a vehicle seat control part, a vehicle mirror control part and an image acquisition part;
controlling a movable component of the target robot to generate a designated motion;
adjusting an emotion value of at least one emotion dimension of the target robot.
10. A control script processing apparatus, comprising:
the script establishing module is used for establishing a triggering relation between a specified monitoring event and a specified control result to obtain a current control script representing the triggering relation;
the range determining module is used for determining release range information, wherein the release range information comprises a designated personality;
the issuing module is used for issuing the current control script to a target vehicle according to the release range information, so that the target vehicle executes the specified control result when the specified monitoring event is monitored; wherein the current personality of a target robot in the target vehicle matches the designated personality.
11. An electronic device, comprising a processor and a memory,
the memory is used for storing codes and related data;
the processor, configured to execute the code in the memory to implement the control script processing method of any one of claims 1 to 9.
12. A storage medium having stored thereon a computer program which, when executed by a processor, implements the control script processing method of any one of claims 1 to 9.
CN202110758337.9A 2021-07-05 2021-07-05 Control script processing method and device, electronic equipment and storage medium Active CN113515060B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110758337.9A CN113515060B (en) 2021-07-05 2021-07-05 Control script processing method and device, electronic equipment and storage medium


Publications (2)

Publication Number Publication Date
CN113515060A true CN113515060A (en) 2021-10-19
CN113515060B CN113515060B (en) 2023-04-21

Family

ID=78066570

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110758337.9A Active CN113515060B (en) 2021-07-05 2021-07-05 Control script processing method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN113515060B (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102085842A (en) * 2009-12-07 2011-06-08 福特全球技术公司 Motor vehicle and method of operating occupant safety system of the same
CN103051648A (en) * 2011-10-14 2013-04-17 上海博泰悦臻网络技术服务有限公司 Communication method and system among vehicles and server for communication among vehicles
CN103786061A (en) * 2014-01-13 2014-05-14 郭海锋 Vehicular robot device and system
CN104199321A (en) * 2014-08-07 2014-12-10 刘松珍 Emotion interacting type vehicle-mounted robot
CN107554450A (en) * 2017-08-29 2018-01-09 三星电子(中国)研发中心 The method and apparatus for adjusting vehicle
CN109240718A (en) * 2018-09-27 2019-01-18 武汉旖旎科技有限公司 It is a kind of intelligence lattice build and analysis system and its working method
US10313219B1 (en) * 2016-11-08 2019-06-04 Sprint Communications Company L.P. Predictive intelligent processor balancing in streaming mobile communication device data processing


Also Published As

Publication number Publication date
CN113515060B (en) 2023-04-21

Similar Documents

Publication Publication Date Title
US11243613B2 (en) Smart tutorial for gesture control system
US11034362B2 (en) Portable personalization
EP3727962B1 (en) Method and system for human-like vehicle control prediction in autonomous driving vehicles
CN105966405A (en) Driver distraction detection system
CN108827307B (en) Navigation method, navigation device, terminal and computer readable storage medium
CN107146611B (en) Voice response method and device and intelligent equipment
CN107415938A (en) Based on occupant position and notice control autonomous vehicle function and output
CN104902081B (en) Control method of flight mode and mobile terminal
CN104933885B (en) The method, apparatus and smart machine of intelligent reminding road conditions
CN108735203A (en) Voice interactive method, terminal and computer-readable medium
CN105022294B (en) Portable communication device, system and method for work with driving condition man-machine interface
CN110390932A (en) Method of speech processing and its equipment based on recognition of face
CN106427840A (en) Method of self-adaptive vehicle driving mode and terminal
CN112130547B (en) Vehicle interaction method and device
CN110875940A (en) Application program calling method, device and equipment based on virtual robot
CN109383364A (en) On-vehicle parts automatic adjustment
CN110871813A (en) Control method and device of virtual robot, vehicle, equipment and storage medium
CN113515060A (en) Control script processing method and device, electronic equipment and storage medium
CN111625094B (en) Interaction method and device of intelligent rearview mirror, electronic equipment and storage medium
US20220335292A1 (en) Information processing device, information processing method, and program
CN113180427A (en) Multifunctional intelligent mirror
EP3126934A1 (en) Systems and methods for the detection of implicit gestures
CN109582271B (en) Method, device and equipment for dynamically setting TTS (text to speech) playing parameters
CN109878290B (en) AI technology-based intelligent vehicle-mounted air conditioner control method and system
CN113459100B (en) Processing method, device, equipment and medium based on robot personality

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20231121

Address after: Floors 3-7, Building T3, No. 377 Songhong Road, Changning District, Shanghai, 200000

Patentee after: Shanghai Jiayu Intelligent Technology Co.,Ltd.

Address before: 200050 room 8041, 1033 Changning Road, Changning District, Shanghai (nominal Floor 9)

Patentee before: Shanghai xianta Intelligent Technology Co.,Ltd.
