WO2023103699A1 - Interaction method and apparatus, electronic device, and storage medium - Google Patents

Interaction method and apparatus, electronic device, and storage medium

Info

Publication number
WO2023103699A1
Authority
WO
WIPO (PCT)
Prior art keywords
electronic device
information
user
content
preset
Prior art date
Application number
PCT/CN2022/130876
Other languages
English (en)
Chinese (zh)
Inventor
罗升阳
Original Assignee
杭州逗酷软件科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 杭州逗酷软件科技有限公司
Publication of WO2023103699A1


Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 - Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/90 - Details of database functions independent of the retrieved data types
    • G06F16/95 - Retrieval from the web
    • G06F16/953 - Querying, e.g. by the use of web search engines
    • G06F16/9535 - Search customisation based on user profiles and personalisation
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 - Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/90 - Details of database functions independent of the retrieved data types
    • G06F16/95 - Retrieval from the web
    • G06F16/953 - Querying, e.g. by the use of web search engines
    • G06F16/9537 - Spatial or temporal dependent retrieval, e.g. spatiotemporal queries
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04M - TELEPHONIC COMMUNICATION
    • H04M1/00 - Substation equipment, e.g. for use by subscribers
    • H04M1/72 - Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724 - User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72403 - User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04M - TELEPHONIC COMMUNICATION
    • H04M1/00 - Substation equipment, e.g. for use by subscribers
    • H04M1/72 - Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724 - User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72448 - User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions
    • H04M1/72454 - User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions according to context-related or environment-related conditions
    • Y - GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 - TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D - CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D30/00 - Reducing energy consumption in communication networks
    • Y02D30/70 - Reducing energy consumption in communication networks in wireless communication networks

Definitions

  • the present application relates to the technical field of electronic equipment, and more specifically, to an interaction method, device, electronic equipment, and storage medium.
  • the present application proposes an interaction method, device, electronic equipment and storage medium to solve the above problems.
  • the embodiment of the present application provides an interaction method, which is applied to an electronic device, and the method includes: in response to the received interaction information meeting the preset condition, starting the context awareness function of the electronic device; identifying the current context in which the electronic device is located, and determining the user intention based on the current context; and pushing content based on the user intention.
  • the embodiment of the present application provides an interaction apparatus, which is applied to an electronic device, and the apparatus includes: a function activation module, configured to start the context awareness function of the electronic device in response to the received interaction information satisfying the preset condition; a user intention identification module, configured to identify the current context of the electronic device and determine the user intention based on the current context; and a content push module, configured to push content based on the user intention.
  • the embodiment of the present application provides an electronic device, including a memory and a processor, the memory is coupled to the processor, the memory stores instructions, and when the instructions are executed by the processor, the processor executes the above method.
  • the embodiment of the present application provides a computer-readable storage medium, where program code is stored in the computer-readable storage medium, and the program code can be invoked by a processor to execute the above method.
  • FIG. 1 shows a schematic flowchart of an interaction method provided by an embodiment of the present application
  • FIG. 2 shows a schematic flowchart of an interaction method provided by an embodiment of the present application
  • FIG. 3 shows a schematic flowchart of an interaction method provided by an embodiment of the present application
  • FIG. 4 shows a schematic flowchart of an interaction method provided by an embodiment of the present application
  • FIG. 5 shows a schematic flowchart of step S440 of the interaction method shown in FIG. 4 of the present application
  • FIG. 6 shows a schematic flowchart of an interaction method provided by an embodiment of the present application.
  • FIG. 7 shows a schematic flowchart of an interaction method provided by an embodiment of the present application.
  • FIG. 8 shows a schematic flowchart of an interaction method provided by an embodiment of the present application.
  • FIG. 9 shows a schematic flowchart of an interaction method provided by an embodiment of the present application.
  • FIG. 10 shows a schematic flowchart of an interaction method provided by an embodiment of the present application.
  • Fig. 11 shows a module block diagram of an interaction device provided by an embodiment of the present application.
  • FIG. 12 shows a block diagram of an electronic device for performing an interaction method according to an embodiment of the present application
  • Fig. 13 shows a storage unit for storing or carrying program codes for realizing the interaction method according to an embodiment of the present application.
  • electronic devices generally push content to users.
  • electronic devices can be equipped with a context awareness function.
  • when the context awareness function is turned off, the electronic device will not push content to the user.
  • when the context awareness function is turned on, the electronic device will push content to the user.
  • the situation awareness function of electronic devices is always on; therefore, electronic devices will push content to users from time to time, and the user experience is poor.
  • after long-term research, the inventor proposed the interaction method, apparatus, electronic device and storage medium provided by the embodiments of the present application, which activate the situation awareness function of the electronic device through interaction information, so as to reduce the resource requirements of the electronic device and reduce its power consumption.
  • the specific interaction method is described in detail in the subsequent embodiments.
  • FIG. 1 shows a schematic flowchart of an interaction method provided by an embodiment of the present application.
  • the method is used for starting the situation awareness function of the electronic device through the interaction information, so as to reduce the resource requirement of the electronic device and reduce the power consumption of the electronic device.
  • the interaction method is applied to the interaction device 200 shown in FIG. 11 and the electronic device 100 ( FIG. 12 ) configured with the interaction device 200 .
  • the following will take an electronic device as an example to illustrate the specific process of this embodiment.
  • the electronic device applied in this embodiment may include a smart phone, a tablet computer, a wearable electronic device, etc., which is not limited here.
  • the process shown in Figure 1 will be described in detail below, and the interaction method may specifically include the following steps:
  • Step S110 In response to the received interaction information meeting the preset condition, activate the context awareness function of the electronic device.
  • the electronic device may detect the interaction information input by the user.
  • the electronic device can detect the interaction information input by the user in real time, at a preset time interval, or at a preset time point; the detection can also be triggered by environmental factors, or performed according to other preset rules, etc., which is not limited here.
  • taking the case where the detection of the interaction information input by the user is triggered by environmental factors as an example: when the detected ambient temperature reaches a preset temperature, the interaction information input by the user can be detected; when the detected ambient humidity reaches a preset humidity, the interaction information input by the user can be detected; and when the detected environmental location is within a preset range, the interaction information input by the user can also be detected, etc., which is not limited here.
  • the input interaction information detected by the electronic device may include: voice interaction information, text interaction information, touch interaction information, etc., which is not limited herein.
  • context awareness refers to an electronic device acquiring information about the user's environment by means of sensors and related technologies, and performing corresponding actions.
  • context awareness can collect data through the sensors of the electronic device, perform feature statistics through machine learning to perceive basic contexts such as environment, time, activity, transportation, location, and nearby devices, and then use an inference engine to obtain advanced contexts through reasoning. For example, an advanced context of ordering a meal can be inferred from the time and location in the basic contexts; an advanced context of visiting a sightseeing attraction can be inferred from the location in the basic contexts; and an advanced context of screen projection can be inferred from the nearby devices in the basic contexts, etc., which is not limited here.
  • the context awareness function includes: a position awareness function, a vehicle awareness function, an activity state awareness function, a device state awareness function, a device attitude awareness function, a nearby device awareness function, an environmental state awareness function, a time awareness function, etc., which is not limited here.
  • the position awareness function can locate the position of the electronic device through the Global Positioning System (GPS), the BeiDou positioning system, or other positioning technologies, which is not limited here.
  • the vehicle perception function can calculate the change speed of the electronic device on the x-axis and y-axis through the acceleration sensor of the electronic device, and combine machine learning and other technologies to determine whether the electronic device is in a driving state.
  • the types of vehicles can be distinguished through sound recognition, for example, cars, buses, trains, airplanes, etc., since different vehicles correspond to different environmental noises.
  • the activity state perception function can calculate the change speed of the electronic device on the x-axis and y-axis through the acceleration sensor of the electronic device, and combine machine learning and other technologies to determine whether the user corresponding to the electronic device is in a static state, a walking state, a running state, etc., which is not limited here.
  • the device state awareness function may acquire the state of the electronic device through the operating system of the electronic device, for example, whether the electronic device is connected to an audio playback device, whether the wireless module of the electronic device is connected, whether the screen of the electronic device is turned on, etc., which is not limited here.
  • the device posture perception function can use the acceleration sensor, gyroscope, magnetometer and other sensors of the electronic device, combined with machine learning and other technologies, to evaluate whether the electronic device is facing up or down, and whether it is placed on a table, in a pocket, in a backpack, etc., which is not limited here.
  • the nearby device sensing function can identify nearby devices through broadcasts such as Bluetooth and WiFi sent by nearby devices.
  • the nearby devices may include, for example, smart phones, smart TVs, smart watches, smart earphones, smart cars, etc., which are not limited herein.
  • the environmental state sensing function may identify the current environmental state of the electronic device through sensors such as the barometer, thermometer, and ambient light sensor of the electronic device.
  • the time awareness function can obtain the current system time of the electronic device and calculate the date, day of the week and other information to determine whether it is a working day or a weekend, and whether it is morning, afternoon, evening or late night, etc., and can then determine whether it is a holiday according to the country or region to which the device belongs.
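  • The following is a minimal, illustrative sketch (not the patented implementation) of how basic contexts perceived by the awareness functions above might be mapped to an advanced context by a simple rule engine; the class, field names and rules are hypothetical.

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import List, Optional

@dataclass
class BasicContext:
    """Basic contexts perceived from sensors (hypothetical fields)."""
    time: datetime
    location: str = "unknown"          # e.g. "subway_station", "restaurant_area"
    activity: str = "unknown"          # e.g. "still", "walking", "running"
    nearby_devices: List[str] = field(default_factory=list)

def infer_advanced_context(ctx: BasicContext) -> Optional[str]:
    """Toy rule engine: combine basic contexts into an advanced context."""
    if ctx.location == "restaurant_area" and 11 <= ctx.time.hour <= 13:
        return "order_meal"            # time + location -> meal ordering
    if ctx.location == "subway_station" and ctx.activity == "walking":
        return "take_subway"           # location + activity -> ride the subway
    if "smart_tv" in ctx.nearby_devices:
        return "screen_projection"     # nearby device -> projection
    return None

# Example: walking near a subway station maps to the "take_subway" context.
print(infer_advanced_context(BasicContext(datetime.now(), "subway_station", "walking")))
```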
  • the electronic device may preset and store a preset condition, and the preset condition is used as a basis for judging the interaction information detected by the electronic device. Therefore, in this embodiment, when the electronic device detects the input interaction information, it may compare the input interaction information with a preset condition to determine whether the interaction information satisfies the preset condition.
  • when the interaction information satisfies the preset condition, the situation awareness function of the electronic device may be activated. Wherein, the electronic device can push content to the user when the context awareness function is turned on.
  • when the interaction information does not satisfy the preset condition, the situation awareness function of the electronic device may be kept in an off state. Wherein, the electronic device may not push content to the user when the context awareness function is turned off.
  • the preset condition may include preset voice information, preset touch information, preset text information, etc., which are not limited here.
  • Step S120 Identify the current situation of the electronic device, and determine the user's intention based on the current situation.
  • after the context awareness function of the electronic device is started, the electronic device can detect its current situation, and after detecting the current situation, it can identify the current situation and obtain the user intention corresponding to the electronic device.
  • detecting the current context of the electronic device may be detecting the basic context of the electronic device, for example, detecting the current environment, current time, current activity, current vehicle, current location and/or currently nearby devices.
  • the advanced context can be inferred by the rule engine based on the basic context, and the user intention corresponding to the electronic device can be determined based on the high-level context obtained through reasoning.
  • Step S130 Push content based on the user intention.
  • the electronic device may push content based on the user's intention. For example, after it is determined that the user intends to take the subway, the subway code can be pushed to the user.
  • this embodiment can rely on the sensor of a single electronic device to complete the situation awareness of a single electronic device. Therefore, it can be applied to the scenarios of a single electronic device and multiple electronic devices, and can also be applied to indoor and outdoor scenarios.
  • a simple and natural multi-modal active expression technology can be used to prevent continuous context awareness from running in the system background of the electronic device and consuming resources of the electronic device.
  • the interaction method provided by an embodiment of the present application activates the context awareness function of the electronic device in response to the received interaction information satisfying the preset condition, identifies the current context of the electronic device, determines the user intention based on the current context, and pushes content based on the user intention, so that the situation awareness function of the electronic device is activated through interaction information, thereby reducing the resource requirements of the electronic device and reducing its power consumption.
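  • As a rough illustration of the flow of steps S110 to S130, the self-contained Python sketch below stubs out the sensing and pushing steps with hypothetical placeholder functions; it is not the actual implementation of the method.

```python
from typing import Optional

def meets_preset_condition(interaction_info: dict) -> bool:
    # Step S110 (check): a hypothetical preset condition, e.g. a shake, tap or voice input
    return interaction_info.get("type") in {"shake", "tap", "voice"}

def identify_current_context() -> dict:
    # Step S120 (first half): the sensed basic context, stubbed with fixed values here
    return {"location": "subway_station", "activity": "walking"}

def determine_user_intention(context: dict) -> Optional[str]:
    # Step S120 (second half): a toy rule mapping context to a user intention
    if context.get("location") == "subway_station":
        return "take_subway"
    return None

def push_content(intention: str) -> None:
    # Step S130: push content matching the intention (e.g. the subway ride code)
    print(f"pushing content for intention: {intention}")

def handle_interaction(interaction_info: dict) -> None:
    # Step S110: start context awareness only when the preset condition is met;
    # otherwise the awareness function stays off and nothing is pushed.
    if not meets_preset_condition(interaction_info):
        return
    context = identify_current_context()
    intention = determine_user_intention(context)
    if intention is not None:
        push_content(intention)

handle_interaction({"type": "shake"})   # prints: pushing content for intention: take_subway
```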
  • FIG. 2 shows a schematic flowchart of an interaction method provided by an embodiment of the present application.
  • the method is applied to the above-mentioned electronic equipment, and the flow shown in Figure 2 will be described in detail below, and the interaction method may specifically include the following steps:
  • Step S210 In response to the received interaction information meeting the preset condition, activate the context awareness function of the electronic device.
  • Step S220 Identify the current situation of the electronic device, and determine the user's intention based on the current situation.
  • for the specific description of step S210 to step S220, please refer to step S110 to step S120, which will not be repeated here.
  • Step S230 Based on the user intention, output prompt information, wherein the prompt information is used to prompt selection of whether to execute content push.
  • the electronic device may output prompt information for prompting whether to perform content push based on the user's intention, so that the user can select content push according to the prompt information.
  • specifically, the electronic device may determine its degree of certainty about the user intention.
  • when the degree of certainty is lower than a preset degree of certainty, indicating that the user intention is likely to be inaccurate, the prompt information may be output to avoid pushing wrong content.
  • when the degree of certainty is higher than the preset degree of certainty, indicating that the user intention is likely to be accurate, the content may be pushed directly according to the user intention without outputting the prompt information, so as to improve the user experience.
  • the prompt information may include voice prompt information, text prompt information, color prompt information, vibration prompt information, etc., which is not limited herein.
  • Step S240 Push the content when the input confirmation information is received.
  • the electronic device may monitor whether information input based on the prompt information is received.
  • when the confirmation information input based on the prompt information is received, it means that the user approves the user intention, that is, agrees that the electronic device pushes the content based on the user intention, and then the content push can be performed.
  • when the denial information input based on the prompt information is received, it means that the user does not approve the user intention, that is, does not agree that the electronic device pushes the content based on the user intention, and the content push may not be performed.
  • the interaction method provided by an embodiment of the present application activates the context awareness function of the electronic device in response to the received interaction information satisfying the preset condition, identifies the current context of the electronic device, determines the user intention based on the current context, outputs prompt information based on the user intention for prompting the user to choose whether to perform content push, and pushes the content when the input confirmation information is received.
  • this embodiment also outputs prompt information when the user's intention cannot be completely determined, and determines whether to execute content push according to the user's choice, so as to improve the accuracy of content push.
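  • A minimal sketch of the prompt-and-confirm logic of steps S230 to S240, under the assumption that the degree of certainty is available as a number between 0 and 1; the threshold value and callback names are illustrative only.

```python
from typing import Callable

PRESET_CERTAINTY = 0.8   # hypothetical preset degree of certainty

def push_with_confirmation(intention: str,
                           certainty: float,
                           ask_user: Callable[[str], bool],
                           push_content: Callable[[str], None]) -> None:
    """Push directly when confident; otherwise prompt and wait for confirmation."""
    if certainty >= PRESET_CERTAINTY:
        push_content(intention)                       # confident: push without prompting
        return
    # Step S230: output prompt information asking whether to perform the push.
    if ask_user(f"Push content for '{intention}'?"):  # Step S240: confirmation received
        push_content(intention)
    # Denial information received: do not push.

# Example usage with trivial callbacks.
push_with_confirmation("take_subway", 0.6,
                       ask_user=lambda msg: True,
                       push_content=lambda i: print("pushed:", i))
```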
  • FIG. 3 shows a schematic flowchart of an interaction method provided by an embodiment of the present application.
  • the method is applied to the above-mentioned electronic equipment, and the flow shown in Figure 3 will be described in detail below, and the interaction method may specifically include the following steps:
  • Step S310 In response to the received interaction information meeting the preset condition, activate the context awareness function of the electronic device.
  • Step S320 Identify the current context of the electronic device, and determine the user's intention based on the current context.
  • for the specific description of step S310 to step S320, please refer to step S110 to step S120, which will not be repeated here.
  • Step S330 Based on the user intention, output prompt information, wherein the prompt information is used to prompt to choose whether to perform content push.
  • for the specific description of step S330, please refer to step S230, which will not be repeated here.
  • Step S340 When an input voice command is received, recognize the voice command based on voice recognition technology, and obtain voice information included in the voice command.
  • the electronic device may monitor whether information input based on the prompt information is received.
  • when a voice command input based on the prompt information is received, the voice command may be recognized based on voice recognition technology, so as to obtain the voice information included in the voice command.
  • an electronic device may include a sound pickup device (eg, a microphone).
  • the sound pickup device of the electronic device may be in a normally-on state, or may be controlled to be turned on after outputting a prompt message, which is not limited here.
  • Step S350 When the voice information includes preset voice information, push the content.
  • the electronic device may be preset and stored with preset voice information, and the preset voice information is used as a basis for judging the voice information contained in the voice instruction received by the electronic device. Therefore, in this embodiment, after the voice information is obtained, the voice information may be compared with the preset voice information to determine whether the voice information includes the preset voice information. Wherein, when it is determined that the voice information includes the preset voice information, it means that the user approves the user intention, that is, agrees that the electronic device pushes content based on the user intention, and then the content push can be performed. Wherein, when it is determined that the voice information does not include the preset voice information, it means that the user does not approve the user intention, that is, does not agree that the electronic device pushes content based on the user intention, and the content push may not be performed.
  • the preset voice information may include “yes”, “agree”, “execute”, “push”, etc., which are not limited herein.
  • the interaction method provided by an embodiment of the present application activates the context awareness function of the electronic device in response to the received interaction information satisfying the preset condition, identifies the current context of the electronic device, determines the user intention based on the current context, and outputs prompt information based on the user intention for prompting the user to choose whether to execute content push; when an input voice command is received, the voice command is recognized based on voice recognition technology to obtain the voice information contained in the voice command, and when the voice information includes the preset voice information, the content is pushed.
  • this embodiment also outputs prompt information when the user's intention cannot be completely determined, and determines whether to execute content push according to the user's voice command, so as to improve the accuracy of content push.
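  • The keyword check of steps S340 to S350 might look like the sketch below, where the speech-to-text step is assumed to have already produced a plain text string and the preset phrases are illustrative.

```python
PRESET_VOICE_PHRASES = {"yes", "agree", "execute", "push"}   # hypothetical preset voice information

def handle_voice_command(recognized_text: str, push_content) -> bool:
    """Push the content if the recognized text contains a preset confirmation phrase."""
    text = recognized_text.lower()
    if any(phrase in text for phrase in PRESET_VOICE_PHRASES):
        push_content()          # voice information includes preset voice information
        return True
    return False                # no preset phrase: the push is not performed

handle_voice_command("yes, please push it", lambda: print("content pushed"))
```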
  • FIG. 4 shows a schematic flowchart of an interaction method provided by an embodiment of the present application.
  • the method is applied to the above-mentioned electronic equipment, and the flow shown in Figure 4 will be described in detail below, and the interaction method may specifically include the following steps:
  • Step S410 In response to the received interaction information meeting the preset condition, activate the context awareness function of the electronic device.
  • Step S420 Identify the current situation of the electronic device, and determine the user's intention based on the current situation.
  • for the specific description of step S410 to step S420, please refer to step S110 to step S120, which will not be repeated here.
  • Step S430 Based on the user intention, output prompt information, wherein the prompt information is used to prompt to choose whether to perform content push.
  • for the specific description of step S430, please refer to step S230, which will not be repeated here.
  • Step S440 When a first tap operation acting on the electronic device is detected, detect a tap position corresponding to the first tap operation.
  • the electronic device may monitor whether information input based on the prompt information is received. Wherein, when a tap operation (first tap operation) input based on the prompt information is detected, a tap position corresponding to the first tap operation may be detected.
  • the electronic device may include a touch sensor. After outputting the prompt information, the electronic device can detect a touch operation acting on the electronic device through the touch sensor, and when the first tapping operation acting on the electronic device is detected, the tapping position corresponding to the first tapping operation can be detected.
  • the electronic device may include a pressure sensor. After outputting the prompt information, the electronic device can detect a touch operation acting on the electronic device through the pressure sensor, and when the first tapping operation acting on the electronic device is detected, the tapping position corresponding to the first tapping operation can be detected.
  • FIG. 5 shows a schematic flowchart of step S440 of the interaction method identified in FIG. 4 of the present application.
  • the process shown in Figure 5 will be described in detail below, and the method may specifically include the following steps:
  • Step S441 When a first tap operation acting on the electronic device is detected, acquire a tap sound corresponding to the first tap operation.
  • the electronic device may monitor whether information input based on the prompt information is received. Wherein, when a tap operation (first tap operation) input based on the prompt information is detected, a tap sound corresponding to the first tap operation may be detected.
  • the electronic device can perform feature statistics on the sounds generated when tapping different positions of the electronic device by means of machine learning, so as to identify different positions of the electronic device tapped by the user through different tapping sounds.
  • the electronic device can perform feature statistics, by means of machine learning, on the different sounds generated when the front and the back of the electronic device are tapped, so as to identify whether the user tapped the front or the back of the electronic device through the different tapping sounds.
  • Step S442 Based on the tapping sound, determine a tapping position corresponding to the first tapping operation.
  • the electronic device may determine the tapping position corresponding to the first tapping operation based on the tapping sound. Wherein, after the electronic device detects the tapping sound corresponding to the first tapping operation, it can compare the tapping sound with the pre-stored tapping sounds of different positions, and when it determines that the tapping sound matches the pre-stored tapping sound of a certain position, it can determine that the tapping position corresponding to the first tapping operation is that position.
  • Step S450 Push the content when the tap position satisfies the preset tap position.
  • the electronic device may be preset and stored with a preset tap position, and the preset tap position is used as a basis for judging the tap position corresponding to the first tap operation acting on the electronic device. Therefore, in this embodiment, after the tapping position is obtained, the tapping position may be compared with the preset tapping position to determine whether the tapping position satisfies the preset tapping position. Wherein, when it is determined that the tap position satisfies the preset tap position, it means that the user approves the user intention, that is, agrees that the electronic device pushes content based on the user intention, and then the content push can be performed. Wherein, when it is determined that the tap position does not satisfy the preset tap position, it means that the user does not approve the user intention, that is, does not agree that the electronic device pushes content based on the user intention, and the content push may not be performed.
  • the preset tapping position may include the front of the electronic device, may include the back of the electronic device, may include the side of the electronic device, etc., which is not limited herein.
  • the interaction method provided by an embodiment of the present application activates the context awareness function of the electronic device in response to the received interaction information meeting the preset condition, identifies the current context of the electronic device, determines the user intention based on the current context, and outputs prompt information based on the user intention, wherein the prompt information is used to prompt the user to select whether to perform content push; when a first tap operation acting on the electronic device is detected, the tap position corresponding to the first tap operation is detected, and when the tap position satisfies the preset tap position, the content is pushed.
  • this embodiment also outputs prompt information when the user's intention cannot be completely determined, and determines whether to execute content push according to the tap position corresponding to the tap operation, so as to improve the accuracy of content push.
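  • As a toy illustration of steps S441, S442 and S450, the sketch below matches a single hypothetical sound feature against pre-stored per-position templates; a real system would extract richer audio features and use a trained classifier as described above.

```python
PRESET_TAP_POSITION = "back"          # hypothetical preset tap position

TAP_SOUND_TEMPLATES = {               # hypothetical pre-stored sound features per position
    "front": 0.30,
    "back": 0.75,
}

def classify_tap_position(sound_feature: float) -> str:
    """Step S442: the position whose stored template is closest to the measured feature."""
    return min(TAP_SOUND_TEMPLATES,
               key=lambda pos: abs(TAP_SOUND_TEMPLATES[pos] - sound_feature))

def on_first_tap(sound_feature: float, push_content) -> None:
    """Steps S441 + S450: classify the tap position and push if it matches the preset."""
    if classify_tap_position(sound_feature) == PRESET_TAP_POSITION:
        push_content()

on_first_tap(0.8, lambda: print("content pushed (back tap)"))
```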
  • FIG. 6 shows a schematic flowchart of an interaction method provided by an embodiment of the present application.
  • the method is applied to the above-mentioned electronic equipment, and the following will describe the process shown in Figure 6 in detail, and the interaction method may specifically include the following steps:
  • Step S510 In response to the received interaction information meeting the preset condition, activate the context awareness function of the electronic device.
  • Step S520 Identify the current context of the electronic device, and determine the user's intention based on the current context.
  • for the specific description of step S510 to step S520, please refer to step S110 to step S120, which will not be repeated here.
  • Step S530 Based on the user intention, output prompt information, wherein the prompt information is used to prompt to choose whether to perform content push.
  • for the specific description of step S530, please refer to step S230, which will not be repeated here.
  • Step S540 When a second tap operation acting on the electronic device is detected, detect the number of consecutive taps corresponding to the second tap operation.
  • the electronic device may monitor whether information input based on the prompt information is received. Wherein, when a tap operation (second tap operation) input based on the prompt information is detected, the number of consecutive taps corresponding to the second tap operation may be detected.
  • the electronic device may include a touch sensor. After outputting the prompt information, the electronic device can detect a touch operation acting on the electronic device through the touch sensor, and when the second tapping operation acting on the electronic device is detected, the number of consecutive taps corresponding to the second tapping operation can be detected.
  • the electronic device may include a pressure sensor. After outputting the prompt information, the electronic device can detect a touch operation acting on the electronic device through the pressure sensor, and when the second tapping operation acting on the electronic device is detected, the number of consecutive taps corresponding to the second tapping operation can be detected.
  • the number of consecutive taps may refer to the total number of taps within a certain preset time period.
  • the electronic device may monitor whether information input based on the prompt information is received. Wherein, when a tap operation (second tap operation) input based on the prompt information is detected, a tap sound corresponding to the second tap operation may be detected.
  • the electronic device can perform feature statistics, by means of machine learning, on the sounds generated when the electronic device is tapped different numbers of times in succession, so as to identify, through the different tapping sounds, how many consecutive times the user tapped the electronic device.
  • the electronic device can use machine learning to perform feature statistics on the different sounds generated when the electronic device is tapped once, twice, or three times in a row, so as to identify the consecutive times the user taps the electronic device through different tapping sounds.
  • the electronic device may determine the number of consecutive tappings corresponding to the second tapping operation based on the tapping sound.
  • it can compare the tapping sound with the pre-stored tapping sounds corresponding to different numbers of consecutive taps, and when it determines that the tapping sound matches the pre-stored tapping sound of a certain number of consecutive taps, it can determine that the number of consecutive taps corresponding to the second tap operation is that number.
  • Step S550 Push the content when the number of consecutive taps meets the preset number of taps.
  • the electronic device may be preset and stored with a preset number of taps, and the preset number of taps is used as a basis for judging the number of consecutive taps corresponding to the second tap operation acting on the electronic device. Therefore, in this embodiment, after the number of consecutive taps is obtained, the number of consecutive taps can be compared with the preset number of taps to determine whether the number of consecutive taps meets the preset number of taps. Wherein, when it is determined that the number of consecutive taps satisfies the preset number of taps, it means that the user approves the user intention, that is, agrees that the electronic device pushes content based on the user intention, and then the content push can be performed.
  • when it is determined that the number of consecutive taps does not satisfy the preset number of taps, it means that the user does not approve the user intention, that is, does not agree that the electronic device pushes content based on the user intention, and the content push may not be performed.
  • the preset number of taps may include one, two, three times, etc., which is not limited here.
  • the interaction method provided by an embodiment of the present application activates the context awareness function of the electronic device in response to the received interaction information meeting the preset condition, identifies the current context of the electronic device, determines the user intention based on the current context, and outputs prompt information based on the user intention, wherein the prompt information is used to prompt the user to select whether to perform content push; when a second tap operation acting on the electronic device is detected, the number of consecutive taps corresponding to the second tap operation is detected, and when the number of consecutive taps meets the preset number of taps, the content is pushed.
  • this embodiment also outputs prompt information when the user's intention cannot be completely determined, and determines whether to perform content push according to the number of consecutive taps corresponding to the tap operation, so as to improve the accuracy of content push .
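  • A small sketch of steps S540 and S550: taps arriving within a preset time window are counted and compared with the preset number of taps; the window length, preset count and timestamp representation are assumptions.

```python
from typing import List

TAP_WINDOW_SECONDS = 1.0   # hypothetical preset time period for "consecutive" taps
PRESET_TAP_COUNT = 2       # hypothetical preset number of taps

def count_consecutive_taps(tap_timestamps: List[float]) -> int:
    """Number of taps within the preset window ending at the most recent tap."""
    if not tap_timestamps:
        return 0
    last = tap_timestamps[-1]
    return sum(1 for t in tap_timestamps if last - t <= TAP_WINDOW_SECONDS)

def on_second_tap_operation(tap_timestamps: List[float], push_content) -> None:
    if count_consecutive_taps(tap_timestamps) == PRESET_TAP_COUNT:
        push_content()

on_second_tap_operation([10.00, 10.35], lambda: print("content pushed (double tap)"))
```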
  • FIG. 7 shows a schematic flowchart of an interaction method provided by an embodiment of the present application. The method is applied to the above-mentioned electronic equipment, and the flow shown in FIG. 7 will be described in detail below.
  • the interaction method may specifically include the following steps:
  • Step S610 In response to the received interaction information meeting the preset condition, activate the context awareness function of the electronic device.
  • Step S620 Identify the current situation of the electronic device, and determine the user's intention based on the current situation.
  • for the specific description of step S610 to step S620, please refer to step S110 to step S120, which will not be repeated here.
  • Step S630 Based on the user intention, output prompt information, wherein the prompt information is used to prompt selection of whether to execute content push.
  • for the specific description of step S630, please refer to step S230, which will not be repeated here.
  • Step S640 When an image is captured by the camera, the image is recognized based on an image recognition technology, and image information corresponding to the image is obtained.
  • the electronic device may include a camera.
  • the electronic device may monitor whether information input based on the prompt information is received.
  • when an image is collected by the camera based on the prompt information, the image may be recognized based on image recognition technology, so as to obtain the image information corresponding to the image.
  • Step S650 Push the content when the image information includes preset image information.
  • the electronic device may be preset and stored with preset image information, and the preset image information is used as a basis for judging the image information corresponding to the image collected by the electronic device. Therefore, in this embodiment, after the image information is obtained, the image information may be compared with the preset image information to determine whether the image information includes the preset image information. Wherein, when the image information includes the preset image information, it means that the user approves the user intention, that is, agrees that the electronic device pushes the content based on the user intention, and then the content push can be performed. Wherein, when the image information does not include the preset image information, it means that the user does not approve the user intention, that is, does not agree that the electronic device pushes content based on the user intention, and the content push may not be performed.
  • the preset image information may include “nod image information”, “like image information”, etc., which are not limited here.
  • the interaction method provided by an embodiment of the present application activates the context awareness function of the electronic device in response to the received interaction information meeting the preset condition, identifies the current context of the electronic device, determines the user intention based on the current context, and outputs prompt information based on the user intention, wherein the prompt information is used to prompt the user to choose whether to perform content push; when an image is collected by the camera, the image is recognized based on image recognition technology to obtain the image information corresponding to the image, and when the image information includes the preset image information, the content push is performed.
  • this embodiment also outputs prompt information when the user's intention cannot be completely determined, and determines whether to execute content push according to the image information collected by the camera, so as to improve the accuracy of content push.
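  • The comparison of steps S640 and S650 might be sketched as below; the image recognizer itself is out of scope here and is represented by the label it outputs, and the label names are illustrative.

```python
PRESET_IMAGE_LABELS = {"nod", "thumbs_up"}   # hypothetical preset image information

def on_camera_image(recognized_label: str, push_content) -> None:
    """Push the content when the recognized image label is one of the preset labels."""
    if recognized_label in PRESET_IMAGE_LABELS:
        push_content()
    # Any other label (e.g. "shake_head") is treated as disapproval: no push.

on_camera_image("nod", lambda: print("content pushed (nod detected)"))
```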
  • FIG. 8 shows a schematic flowchart of an interaction method provided by an embodiment of the present application. This method is applied to the above-mentioned electronic equipment, and the following will elaborate on the flow shown in Figure 8 in detail, and the interaction method may specifically include the following steps:
  • Step S710 Obtain the acceleration change of the electronic device based on the shaking operation.
  • the electronic device can detect a shaking operation acting on the electronic device.
  • the electronic device may detect a shaking operation acting on the electronic device through an acceleration sensor.
  • the electronic device when it detects an input shaking operation, it may acquire an acceleration change of the electronic device based on the shaking operation.
  • the electronic device can obtain the acceleration changes of the electronic device on the x-axis, y-axis and z-axis through the acceleration sensor, and determine the acceleration changes on the x-axis, y-axis and z-axis as the acceleration change of the electronic device based on the shaking operation.
  • Step S720 When the acceleration change satisfies a preset acceleration change, determine that the interaction information satisfies the preset condition, and start a situation awareness function of the electronic device.
  • the electronic device may be preset and stored with a preset acceleration change, and the preset acceleration change is used as a basis for judging the acceleration change of the electronic device. Therefore, in this embodiment, when the acceleration change of the electronic device is obtained, the acceleration change may be compared with a preset acceleration change to determine whether the acceleration change satisfies the preset acceleration change.
  • when the acceleration change satisfies the preset acceleration change, the situation awareness function of the electronic device may be activated. Wherein, the electronic device can push content to the user when the context awareness function is turned on.
  • when the acceleration change does not satisfy the preset acceleration change, the situation awareness function of the electronic device may be kept in an off state. Wherein, the electronic device may not push content to the user when the context awareness function is turned off.
  • Step S730 Identify the current situation of the electronic device, and determine the user's intention based on the current situation.
  • Step S740 Push content based on the user intention.
  • for the specific description of step S730 to step S740, please refer to step S120 to step S130, which will not be repeated here.
  • the interaction method provided by an embodiment of the present application acquires the acceleration change of the electronic device based on the shaking operation; when the acceleration change meets the preset acceleration change, it is determined that the interaction information meets the preset condition, and the context awareness function of the electronic device is activated to identify the current context of the electronic device, determine the user intention based on the current context, and push content based on the user intention.
  • this embodiment also activates the situation awareness function of the electronic device when the acceleration change based on the shaking operation of the electronic device satisfies the preset acceleration change, so as to improve the autonomy of function activation and reduce the electronic device power consumption.
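  • A minimal sketch of steps S710 and S720, assuming the acceleration change is summarized as the magnitude of the per-axis changes and compared with a preset threshold; the threshold value is an assumption.

```python
import math

PRESET_ACCELERATION_CHANGE = 15.0   # hypothetical threshold (m/s^2)

def acceleration_change(dx: float, dy: float, dz: float) -> float:
    """Magnitude of the acceleration change on the x, y and z axes."""
    return math.sqrt(dx * dx + dy * dy + dz * dz)

def on_shake(dx: float, dy: float, dz: float, start_context_awareness) -> None:
    # Step S720: the interaction information meets the preset condition only when
    # the acceleration change satisfies the preset acceleration change.
    if acceleration_change(dx, dy, dz) >= PRESET_ACCELERATION_CHANGE:
        start_context_awareness()

on_shake(12.0, 9.0, 4.0, lambda: print("context awareness started"))
```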
  • FIG. 9 shows a schematic flowchart of an interaction method provided by an embodiment of the present application.
  • the method is applied to the above-mentioned electronic equipment, and the following will describe the process shown in Figure 9 in detail, and the interaction method may specifically include the following steps:
  • Step S810 When the third tap operation satisfies a preset tap operation, determine that the interaction information satisfies the preset condition, and activate a context awareness function of the electronic device.
  • the electronic device can detect a tap operation on the electronic device.
  • the electronic device may detect a tap operation (third tap operation) acting on the electronic device through a touch sensor, a pressure sensor, or the like.
  • the electronic device may be preset and stored with a preset tap operation, and the preset tap operation is used as a basis for judging a third tap operation of the electronic device. Therefore, in this embodiment, when the third tapping operation acting on the electronic device is obtained, the third tapping operation can be compared with the preset tapping operation to determine whether the third tapping operation satisfies the Preset tap action.
  • the situation awareness function of the electronic device may be activated. Wherein, the electronic device can push content to the user when the context awareness function is turned on.
  • the situation awareness function of the electronic device may be kept in an off state. Wherein, the electronic device may not push content to the user when the context awareness function is turned off.
  • the preset tapping operation may include: tapping a certain position of the electronic device for a predetermined number of times.
  • Step S820 Identify the current situation of the electronic device, and determine the user's intention based on the current situation.
  • Step S830 Push content based on the user intention.
  • for the specific description of step S820 to step S830, please refer to step S120 to step S130, which will not be repeated here.
  • in this embodiment, when the third tap operation satisfies the preset tap operation, it is determined that the interaction information satisfies the preset condition, and the context awareness function of the electronic device is activated to identify the current context of the electronic device, determine the user intention based on the current context, and push content based on the user intention.
  • this embodiment also activates the context awareness function of the electronic device when detecting a third tap operation that satisfies the preset tap operation, so as to improve the autonomy of function startup and reduce the power consumption of the electronic device.
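  • A short sketch of step S810 under the assumption that the preset tap operation is "tap a given position a given number of times"; the position and count used here are hypothetical.

```python
PRESET_TAP_OPERATION = {"position": "back", "count": 3}   # hypothetical preset tap operation

def on_third_tap_operation(position: str, count: int, start_context_awareness) -> None:
    """Start the context awareness function only when the tap matches the preset operation."""
    if (position == PRESET_TAP_OPERATION["position"]
            and count == PRESET_TAP_OPERATION["count"]):
        start_context_awareness()   # interaction information satisfies the preset condition

on_third_tap_operation("back", 3, lambda: print("context awareness started"))
```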
  • FIG. 10 shows a schematic flowchart of an interaction method provided by an embodiment of the present application. This method is applied to the above-mentioned electronic equipment, and the flow shown in Figure 10 will be described in detail below, and the interaction method may specifically include the following steps:
  • Step S910 In response to the received interaction information meeting the preset condition, activate the context awareness function of the electronic device.
  • Step S920 Identify the current situation of the electronic device, and determine the user's intention based on the current situation.
  • for the specific description of step S910 to step S920, please refer to step S110 to step S120, which will not be repeated here.
  • Step S930 Obtain a user portrait of a user corresponding to the electronic device.
  • a user portrait of a user corresponding to the electronic device may be acquired.
  • a user portrait can be constructed based on the user's habits of using the electronic device, such as frequently used applications, and may include the user's exercise habits, entertainment habits, consumption habits, commuting habits during commuting time, sleeping habits, and the like, which are not limited here.
  • the build cycle can be preset. For example, different time periods of each day may be preset as a build cycle, one day may be preset as a build cycle, and one week may be set as a build cycle, etc., which are not limited here.
  • if the preset construction cycle is a different time period of each day, 8:00-12:00 of each day can be set as a construction cycle, 12:00-16:00 as a construction cycle, 16:00-20:00 as a construction cycle, 20:00-24:00 as a construction cycle, and 24:00-8:00 as a construction cycle.
  • the current time may be acquired, the construction cycle in which the current time falls is determined as the current construction cycle, and the user data of the user corresponding to the electronic device is collected. For example, if the preset construction cycle is a different time period of each day and the current time is 9:00, 8:00-12:00 can be used as the current construction cycle, and the user data of the user corresponding to the electronic device is collected during the 8:00-12:00 construction cycle.
  • user data of users corresponding to electronic devices in different applications, different services, different systems, etc. may be collected during the current construction cycle.
  • among the applications, it is possible to collect the user data of the user corresponding to the electronic device in chat applications, game applications, video applications, search applications, and so on, which is not limited here.
  • the collected user data of the user corresponding to the electronic device may include: the user's age, the user's gender, the user's occupation, the user's income, the user's hobbies, the applications downloaded by the user, the user's ratings of applications, etc., which are not limited here.
  • the user data of the user corresponding to the electronic device may be aggregated to obtain a user portrait of the user corresponding to the electronic device in the current construction cycle.
  • the user data collected from different applications, services, systems, etc. can be aggregated to obtain the user portrait of the corresponding user in the current construction cycle.
  • the user's basic portrait can be constructed based on the user's age, gender, income, etc. collected from different applications, services, systems, etc.
  • the user's business portrait can be constructed based on the user preferences collected from different applications, services, systems, etc., and the user's behavior portrait can be constructed based on the downloaded and installed applications collected from different applications, services, systems, etc., which are not limited here.
  • Step S940 Push the content based on the user intention and the user portrait.
  • in this embodiment, after the user intention and the user portrait are obtained, content push can be performed based on the user intention and the user portrait.
  • in this way, this embodiment combines the user portrait to identify different user intentions in the same context, realizing personalized context awareness.
  • the interaction method provided by an embodiment of the present application activates the context awareness function of the electronic device in response to the received interaction information satisfying the preset condition, identifies the current context of the electronic device, determines the user intention based on the current context, obtains the user portrait of the user corresponding to the electronic device, and pushes the content based on the user intention and the user portrait.
  • this embodiment also acquires the user portrait of the user corresponding to the electronic device, and pushes content based on user intent and user portrait, thereby improving the accuracy of content push.
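  • The sketch below illustrates steps S930 and S940: user data records collected in the current construction cycle are aggregated into a simple portrait, which is then combined with the user intention to personalize the push. The fields and the push rule are illustrative assumptions.

```python
from dataclasses import dataclass, field
from typing import Dict, List, Set

@dataclass
class UserPortrait:
    age: int = 0
    hobbies: Set[str] = field(default_factory=set)
    commutes_by_subway: bool = False

def build_portrait(records: List[Dict]) -> UserPortrait:
    """Aggregate records collected from different applications, services and systems."""
    portrait = UserPortrait()
    for record in records:
        portrait.age = record.get("age", portrait.age)
        portrait.hobbies |= set(record.get("hobbies", []))
        portrait.commutes_by_subway |= record.get("commutes_by_subway", False)
    return portrait

def push_content(intention: str, portrait: UserPortrait) -> str:
    """Combine the user intention with the user portrait to choose the pushed content."""
    if intention == "take_subway" and portrait.commutes_by_subway:
        return "push: subway ride code for the usual commuting line"
    if intention == "take_subway":
        return "push: subway ride code"
    return "push: generic content"

portrait = build_portrait([{"age": 30, "hobbies": ["music"], "commutes_by_subway": True}])
print(push_content("take_subway", portrait))
```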
  • FIG. 11 shows a module block diagram of an interaction device provided by an embodiment of the present application.
  • the interaction device 200 is applied to the above-mentioned electronic device, and will be described below with respect to the block diagram shown in FIG. 11.
  • the function activation module 210 is configured to start the situation awareness function of the electronic device in response to the received interaction information satisfying the preset condition.
  • the interaction information includes a shaking operation
  • the function activation module 210 includes: an acceleration change acquisition submodule and a first function activation submodule, wherein:
  • the acceleration change acquisition submodule is configured to acquire the acceleration change of the electronic device based on the shaking operation.
  • the first function activating submodule is configured to determine that the interaction information satisfies the preset condition when the acceleration change satisfies a preset acceleration change, and start the situation awareness function of the electronic device.
  • the interaction information includes a third tap operation
  • the function activation module 210 includes: a second function activation submodule, wherein:
  • the second function activation sub-module is configured to determine that the interaction information satisfies the preset condition when the third tap operation satisfies the preset tap operation, and activate the situation awareness function of the electronic device.
  • the user intention identification module 220 is configured to identify the current situation of the electronic device, and determine the user intention based on the current situation.
  • the content pushing module 230 is configured to push content based on the user intention.
  • the content push module 230 includes: a prompt information output submodule and a first content push submodule, wherein:
  • the prompt information output submodule is configured to output prompt information based on the user intention, wherein the prompt information is used to prompt selection of whether to execute content push.
  • the first content pushing submodule is configured to push the content when the input confirmation information is received.
  • the first content pushing submodule includes: a voice information obtaining unit and a first content pushing unit, wherein:
  • the voice information obtaining unit is configured to, when receiving an input voice command, recognize the voice command based on a voice recognition technology, and obtain voice information included in the voice command.
  • the first content pushing unit is configured to push the content when the voice information includes preset voice information.
  • the first content push submodule includes: a tap position detection unit and a second content push unit, wherein:
  • the tap position detection unit is configured to detect a tap position corresponding to the first tap operation when a first tap operation acting on the electronic device is detected.
  • the tapping position detection unit includes: a tapping sound acquisition subunit and a tapping position determination subunit, wherein:
  • the tapping sound acquisition subunit is configured to acquire a tapping sound corresponding to the first tapping operation when the first tapping operation acting on the electronic device is detected.
  • the tapping position determination subunit is configured to determine the tapping position corresponding to the first tapping operation based on the tapping sound.
  • the second content push unit is configured to push the content when the tap position satisfies a preset tap position.
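One conceivable way to realize the tap-position check is to compare a feature of the tap sound against per-position reference values, since taps on different parts of the housing tend to sound different. The sketch below is a hypothetical heuristic of that kind; the feature, the reference values, and the position labels are all assumptions, not details from the disclosure.

```kotlin
import kotlin.math.abs

// Hypothetical heuristic: classify the tap position by comparing an assumed
// scalar feature of the tap sound (a "dominant frequency") with per-position
// reference values. Everything below is illustrative.
enum class TapPosition { BACK_TOP, BACK_BOTTOM, SIDE }

private val referenceFrequencyHz = mapOf(
    TapPosition.BACK_TOP to 1800.0,
    TapPosition.BACK_BOTTOM to 1200.0,
    TapPosition.SIDE to 2600.0
)

fun classifyTapPosition(dominantFrequencyHz: Double): TapPosition =
    referenceFrequencyHz.entries
        .minByOrNull { abs(it.value - dominantFrequencyHz) }!!
        .key

fun main() {
    val position = classifyTapPosition(dominantFrequencyHz = 1750.0)
    println(position)                        // BACK_TOP
    val presetPosition = TapPosition.BACK_TOP
    println(position == presetPosition)      // true -> push the content
}
```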
  • the first content pushing submodule includes: a consecutive tapping times unit and a third content pushing unit, wherein:
  • the consecutive tapping times unit is configured to, when a second tapping operation acting on the electronic device is detected, detect the number of consecutive taps corresponding to the second tapping operation.
  • the third content pushing unit is configured to push the content when the number of consecutive taps meets a preset number of taps.
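The consecutive-tap check can be pictured as a counter that resets when taps are spaced too far apart; a minimal sketch follows, where the gap limit and the preset tap count are assumed values.

```kotlin
// Illustrative counter for consecutive taps: taps arriving within a short
// interval of the previous one belong to the same sequence, and the content
// is pushed once the count reaches an assumed preset number.
class ConsecutiveTapCounter(
    private val maxGapMillis: Long = 500,
    private val presetTapCount: Int = 3,
    private val onThresholdReached: () -> Unit
) {
    private var lastTapAt: Long? = null
    private var count: Int = 0

    fun onTap(timestampMillis: Long) {
        val last = lastTapAt
        count = if (last != null && timestampMillis - last <= maxGapMillis) count + 1 else 1
        lastTapAt = timestampMillis
        if (count >= presetTapCount) {
            count = 0
            onThresholdReached()   // number of consecutive taps meets the preset number
        }
    }
}

fun main() {
    val counter = ConsecutiveTapCounter(onThresholdReached = { println("push content") })
    counter.onTap(0); counter.onTap(300); counter.onTap(600)   // three taps in a row
}
```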
  • the first content pushing submodule includes: an image information obtaining unit and a fourth content pushing unit, wherein:
  • the image information obtaining unit is configured to, when an image is collected by the camera, recognize the image based on an image recognition technology, and obtain image information corresponding to the image.
  • the fourth content pushing unit is configured to push the content when the image information includes preset image information.
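A small sketch of the image-based confirmation follows, assuming an image-recognition component has already produced labels for the captured image; the label names and the preset set are illustrative.

```kotlin
// Illustrative check of recognized image content against preset image information.
// Turning a camera frame into labels is assumed to be handled by the device's
// image-recognition component; only the comparison step is sketched here.
val presetImageInfo = setOf("nod", "thumbs_up", "owner_face")

fun confirmsPush(recognizedLabels: Collection<String>): Boolean =
    recognizedLabels.any { it in presetImageInfo }

fun main() {
    println(confirmsPush(listOf("owner_face", "indoor_scene")))  // true -> push the content
    println(confirmsPush(listOf("empty_room")))                  // false
}
```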
  • the content push module 230 includes: a user portrait acquisition sub-module and a second content push sub-module, wherein:
  • the user portrait acquiring submodule is configured to acquire the user portrait of the user corresponding to the electronic device.
  • the second content pushing submodule is configured to push the content based on the user intention and the user portrait.
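To make the combination of user intention and user portrait concrete, the sketch below filters candidate items by intention and ranks them by overlap with the portrait's interest tags. The item structure and the scoring rule are assumptions for illustration; the disclosure does not prescribe a particular ranking scheme.

```kotlin
// Illustrative selection of content from intention plus user portrait:
// filter by intention, then rank by overlap with the portrait's interests.
data class ContentItem(val title: String, val intention: String, val tags: Set<String>)
data class Portrait(val interestTags: Set<String>)

fun pickContent(intention: String, portrait: Portrait, candidates: List<ContentItem>): ContentItem? =
    candidates
        .filter { it.intention == intention }
        .maxByOrNull { it.tags.intersect(portrait.interestTags).size }

fun main() {
    val candidates = listOf(
        ContentItem("Morning headlines", "listen_to_news_or_music", setOf("news", "politics")),
        ContentItem("Indie playlist", "listen_to_news_or_music", setOf("music", "indie"))
    )
    val portrait = Portrait(setOf("music", "running"))
    println(pickContent("listen_to_news_or_music", portrait, candidates)?.title)  // Indie playlist
}
```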
  • the coupling between the modules may be electrical, mechanical or other forms of coupling.
  • each functional module in the embodiments of the present application may be integrated into one processing module, each module may exist physically on its own, or two or more modules may be integrated into one module.
  • the above-mentioned integrated modules can be implemented in the form of hardware or in the form of software function modules.
  • FIG. 12 shows a structural block diagram of an electronic device 100 provided by an embodiment of the present application.
  • the electronic device 100 may be an electronic device capable of running application programs, such as a smartphone, a tablet computer, or an e-book reader.
  • the electronic device 100 in this application may include one or more of the following components: a processor 110, a memory 120, and one or more application programs, wherein the one or more application programs may be stored in the memory 120 and configured to be executed by the one or more processors 110, and the one or more application programs are configured to perform the methods described in the foregoing method embodiments.
  • the processor 110 may include one or more processing cores.
  • the processor 110 uses various interfaces and lines to connect various parts of the entire electronic device 100, and performs various functions of the electronic device 100 and processes data by running or executing instructions, programs, code sets or instruction sets stored in the memory 120 and by calling data stored in the memory 120.
  • the processor 110 may be implemented in hardware in at least one of the forms of Digital Signal Processing (DSP), Field-Programmable Gate Array (FPGA), and Programmable Logic Array (PLA).
  • the processor 110 may integrate one or a combination of a central processing unit (CPU), a graphics processing unit (GPU), a modem, and the like.
  • the CPU mainly handles the operating system, user interface and application programs, etc.
  • the GPU is used to render and draw the content to be displayed
  • the modem is used to handle wireless communication. It can be understood that the above-mentioned modem may not be integrated into the processor 110, and may instead be implemented by a separate communication chip.
  • the memory 120 may include random access memory (RAM), and may also include read-only memory (ROM).
  • the memory 120 may be used to store instructions, programs, codes, sets of codes, or sets of instructions.
  • the memory 120 may include a program storage area and a data storage area, wherein the program storage area may store instructions for implementing an operating system, instructions for implementing at least one function (such as a touch function, a sound playback function, an image playback function, etc.), instructions for implementing the foregoing method embodiments, and the like.
  • the data storage area may also store data created during use of the electronic device 100 (such as a phonebook, audio and video data, and chat record data) and the like.
  • FIG. 13 shows a structural block diagram of a computer-readable storage medium provided by an embodiment of the present application.
  • Program codes are stored in the computer-readable medium 300, and the program codes can be invoked by a processor to execute the methods described in the foregoing method embodiments.
  • the computer readable storage medium 300 may be an electronic memory such as flash memory, EEPROM (Electrically Erasable Programmable Read Only Memory), EPROM, hard disk, or ROM.
  • the computer-readable storage medium 300 includes a non-transitory computer-readable storage medium (non-transitory computer-readable storage medium).
  • the computer-readable storage medium 300 has a storage space for program code 310 for executing any method steps in the above methods. These program codes can be read from or written into one or more computer program products.
  • Program code 310 may, for example, be compressed in a suitable form.
  • the interaction method, apparatus, electronic device, and storage medium provided by the embodiments of the present application activate the context awareness function of the electronic device in response to the received interaction information satisfying the preset condition, identify the current context of the electronic device, determine the user intention based on the current context, and push content based on the user intention, so that the context awareness function of the electronic device is activated through interaction information, thereby reducing the resource demand and power consumption of the electronic device.

Landscapes

  • Engineering & Computer Science (AREA)
  • Databases & Information Systems (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Human Computer Interaction (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Environmental & Geological Engineering (AREA)
  • User Interface Of Digital Computer (AREA)
  • Telephone Function (AREA)

Abstract

The present invention relates to the technical field of electronic devices. Disclosed are an interaction method and apparatus, an electronic device, and a storage medium. The method is applied to an electronic device and comprises: in response to received interaction information satisfying a preset condition, activating a context awareness function of the electronic device; identifying the current context in which the electronic device is located, and then determining a user intention on the basis of the current context; and pushing content on the basis of the user intention. By means of the present invention, the context awareness function of an electronic device is activated by means of interaction information, so that the resource requirements of the electronic device are reduced, and the power consumption of the electronic device is also reduced.
PCT/CN2022/130876 2021-12-10 2022-11-09 Procédé et appareil d'interaction, et dispositif électronique et support d'informations WO2023103699A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202111507921.3A CN114285930B (zh) 2021-12-10 2021-12-10 交互方法、装置、电子设备以及存储介质
CN202111507921.3 2021-12-10

Publications (1)

Publication Number Publication Date
WO2023103699A1 true WO2023103699A1 (fr) 2023-06-15

Family

ID=80871662

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/130876 WO2023103699A1 (fr) 2021-12-10 2022-11-09 Procédé et appareil d'interaction, et dispositif électronique et support d'informations

Country Status (2)

Country Link
CN (1) CN114285930B (fr)
WO (1) WO2023103699A1 (fr)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114285930B (zh) * 2021-12-10 2024-02-23 杭州逗酷软件科技有限公司 交互方法、装置、电子设备以及存储介质

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104951077A (zh) * 2015-06-24 2015-09-30 百度在线网络技术(北京)有限公司 基于人工智能的人机交互方法、装置和终端设备
CN108320742A (zh) * 2018-01-31 2018-07-24 广东美的制冷设备有限公司 语音交互方法、智能设备及存储介质
CN109445313A (zh) * 2018-12-12 2019-03-08 仲恺农业工程学院 任务驱动的动态自适应环境感知移动机器人及系统、方法
US20190103102A1 (en) * 2017-10-04 2019-04-04 The Toronto-Dominion Bank Persona-based conversational interface personalization using social network preferences
CN111405478A (zh) * 2020-03-02 2020-07-10 Oppo广东移动通信有限公司 服务提供方法、装置、终端及存储介质
CN111797874A (zh) * 2019-04-09 2020-10-20 Oppo广东移动通信有限公司 行为预测方法、装置、存储介质及电子设备
CN111796926A (zh) * 2019-04-09 2020-10-20 Oppo广东移动通信有限公司 指令执行方法、装置、存储介质及电子设备
CN114285930A (zh) * 2021-12-10 2022-04-05 杭州逗酷软件科技有限公司 交互方法、装置、电子设备以及存储介质

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10042603B2 (en) * 2012-09-20 2018-08-07 Samsung Electronics Co., Ltd. Context aware service provision method and apparatus of user device
CN109741108A (zh) * 2018-12-29 2019-05-10 安徽云森物联网科技有限公司 基于情境感知的流式应用推荐方法、装置和电子设备
CN110445937B (zh) * 2019-09-16 2021-09-21 Oppo(重庆)智能科技有限公司 事件提醒方法及相关产品
CN113469711A (zh) * 2021-07-20 2021-10-01 阳光保险集团股份有限公司 一种智能客服交互方法、装置、电子设备及存储介质

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104951077A (zh) * 2015-06-24 2015-09-30 百度在线网络技术(北京)有限公司 基于人工智能的人机交互方法、装置和终端设备
US20190103102A1 (en) * 2017-10-04 2019-04-04 The Toronto-Dominion Bank Persona-based conversational interface personalization using social network preferences
CN108320742A (zh) * 2018-01-31 2018-07-24 广东美的制冷设备有限公司 语音交互方法、智能设备及存储介质
CN109445313A (zh) * 2018-12-12 2019-03-08 仲恺农业工程学院 任务驱动的动态自适应环境感知移动机器人及系统、方法
CN111797874A (zh) * 2019-04-09 2020-10-20 Oppo广东移动通信有限公司 行为预测方法、装置、存储介质及电子设备
CN111796926A (zh) * 2019-04-09 2020-10-20 Oppo广东移动通信有限公司 指令执行方法、装置、存储介质及电子设备
CN111405478A (zh) * 2020-03-02 2020-07-10 Oppo广东移动通信有限公司 服务提供方法、装置、终端及存储介质
CN114285930A (zh) * 2021-12-10 2022-04-05 杭州逗酷软件科技有限公司 交互方法、装置、电子设备以及存储介质

Also Published As

Publication number Publication date
CN114285930A (zh) 2022-04-05
CN114285930B (zh) 2024-02-23

Similar Documents

Publication Publication Date Title
WO2021027267A1 (fr) Procédé et appareil d'interaction parlée, terminal et support de stockage
US20220038615A1 (en) Wearable Multimedia Device and Cloud Computing Platform with Application Ecosystem
US20180137097A1 (en) Electronic device and control method therefor
US11223497B2 (en) Method and apparatus for providing notification by interworking plurality of electronic devices
KR101363201B1 (ko) 상태 인식을 이용한 휴대용 전자 장치
US20160253083A1 (en) Method and apparatus for supporting communication in electronic device
EP3358813B1 (fr) Système et procédé permettant de transmettre des informations de communication
US20150249718A1 (en) Performing actions associated with individual presence
WO2018227823A1 (fr) Procédé de génération de portrait d'utilisateur, et terminal
WO2019140702A1 (fr) Procédé et dispositif permettant de générer une image de profil d'utilisateur
WO2015043505A1 (fr) Procédé, appareil et système d'envoi et de réception d'informations de réseau social
KR20150044830A (ko) 휴대용 장치가 웨어러블 장치를 통하여 정보를 표시하는 방법 및 그 장치
US20240320417A1 (en) Annotating a collection of media content items
US11816269B1 (en) Gesture recognition for wearable multimedia device using real-time data streams
WO2023103699A1 (fr) Procédé et appareil d'interaction, et dispositif électronique et support d'informations
US11908489B2 (en) Tap to advance by subtitles
CN111862972B (zh) 语音交互服务方法、装置、设备及存储介质
WO2016052501A1 (fr) Dispositif d'interface d'utilisateur, programme et procédé de notification de contenu
CN111341317B (zh) 唤醒音频数据的评价方法、装置、电子设备及介质
CN114666433A (zh) 一种终端设备中啸叫处理方法及装置、终端
US11564069B2 (en) Recipient-based content optimization in a messaging system
KR20200100367A (ko) 루틴을 제공하기 위한 방법 및 이를 지원하는 전자 장치
WO2023125514A1 (fr) Procédé de commande de dispositif et appareil associé
CN114237453A (zh) 一种控制设备的方法、电子设备和系统

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22903122

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE