WO2020114157A1 - Interactive control method and device, storage medium, and electronic device - Google Patents

Interactive control method and device, storage medium, and electronic device

Info

Publication number
WO2020114157A1
Authority
WO
WIPO (PCT)
Prior art keywords
holographic projection
projection device
touch
interaction
virtual character
Prior art date
Application number
PCT/CN2019/114396
Other languages
English (en)
French (fr)
Inventor
乔什·达瓦·詹米
陈晓玫
邬文捷
陈镜州
Original Assignee
Tencent Technology (Shenzhen) Company Limited
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tencent Technology (Shenzhen) Company Limited
Priority to EP19891843.5A (published as EP3893099A4)
Publication of WO2020114157A1
Priority to US17/075,441 (published as US11947789B2)

Classifications

    • G: PHYSICS
    • G02: OPTICS
    • G02B: OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 30/00: Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images
    • G02B 30/50: ... the image being built up from image elements distributed over a 3D volume, e.g. voxels
    • G02B 30/56: ... by projecting aerial or floating images
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011: Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481: ... based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/04815: ... interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
    • G06F 3/0484: ... for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/04845: ... for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • G06F 3/0487: ... using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488: ... using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04883: ... for inputting data by handwriting, e.g. gesture or text
    • G03: PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03H: HOLOGRAPHIC PROCESSES OR APPARATUS
    • G03H 1/00: Holographic processes or apparatus using light, infrared or ultraviolet waves for obtaining holograms or for obtaining an image from them; details peculiar thereto
    • G03H 1/0005: Adaptation of holography to specific applications
    • G03H 2001/0061: ... in haptic applications when the observer interacts with the holobject

Definitions

  • This application relates to the field of computers, and specifically to interaction control.
  • With the development of artificial intelligence (AI), holographic projection devices can present holographically projected virtual characters that interact with users in real time.
  • The holographic projection virtual character can not only learn the user's hobbies and habits and respond to the user's instructions promptly and accurately, but can also gradually develop a unique personality.
  • Embodiments of the present application provide an interactive control method and device, a storage medium, and an electronic device, which can effectively realize interaction control between a user and a holographic projection virtual character presented by a holographic projection device.
  • In one aspect, an interactive control method is provided, including: a holographic projection device recognizes a touch operation performed on a touch panel, where the holographic projection device is used to present a holographic projection virtual character and the touch panel is disposed on the holographic projection device; the holographic projection device generates an interaction request according to the recognized touch operation, where the interaction request is used to request interaction with the holographic projection virtual character; and the holographic projection device controls the holographic projection virtual character to perform an interaction action matching the interaction request.
  • In another aspect, a holographic projection device is provided, including: a holographic projection film for presenting a holographic projection virtual character; a projection light engine for projecting the holographic projection virtual character onto the holographic projection film; a touchpad for collecting touch information generated by a touch operation; and a processor, connected to the touchpad and the projection light engine, for recognizing the touch operation according to the touch information, for generating an interaction request according to the recognized touch operation (where the interaction request is used to request interaction with the holographic projection virtual character), and for sending the interaction request to the projection light engine. The projection light engine is further used to control, according to the interaction request, the holographic projection virtual character to perform an interaction action matching the interaction request.
  • In another aspect, an interactive control device applied to a holographic projection device is provided, including: a recognition unit for recognizing a touch operation performed on a touch panel, where the holographic projection device is used to present the holographic projection virtual character and the touch panel is provided on the holographic projection device; a generating unit for generating an interaction request according to the recognized touch operation, where the interaction request is used to request interaction with the holographic projection virtual character; and a control unit for controlling the holographic projection virtual character to perform an interaction action matching the interaction request.
  • The recognition unit includes: a first acquisition module for acquiring touch information generated by the interaction between the contact corresponding to the touch operation and the touchpad; and a recognition module for recognizing the touch operation according to the touch information.
  • The recognition module includes: a first determining sub-module for determining, when it is determined according to the touch information that the number of swipe actions performed by the contact on the touch panel is greater than a first threshold, that the touch operation is a stroking operation on the holographic projection virtual character, where a swipe action is an action in which the contact moves a distance greater than a second threshold on the touch panel; and a second determining sub-module for determining, when it is determined according to the touch information that the distance the contact moves on the touch panel is less than a third threshold and the duration of the contact acting on the touch panel is less than a fourth threshold, that the touch operation is a patting operation on the holographic projection virtual character.
  • The first determining sub-module is further used to perform the following steps: acquire the first position where the contact stays on the touch panel at the first moment, and the second position where the contact stays on the touch panel at the second moment, where the time interval between the first moment and the second moment is one frame period; when it is determined that the distance between the first position and the second position is greater than the second threshold, recognize the movement of the contact from the first position to the second position as a swipe action and increment the swipe-action count by one; and when it is determined that the swipe-action count is greater than the first threshold, determine that the touch operation is a stroking operation on the holographic projection virtual character.
  • The second determining sub-module is further used to perform the following steps: acquire the third position where the contact stays on the touch panel at the third moment, and the fourth position where the contact stays on the touchpad at the fourth moment; and when it is determined that the distance between the third position and the fourth position is less than the third threshold and the time interval between the third moment and the fourth moment is less than the fourth threshold, determine that the touch operation is a patting operation on the holographic projection virtual character.
  • The second determining sub-module is further used to perform the following steps: after acquiring the third position where the contact stays on the touch panel, perform the following in each frame period until the end of the touch operation is detected: acquire the current position where the contact stays on the touch panel at the target moment in the current frame period, where the time interval between the third moment and the target moment is N frame periods, N ≥ 1, and N is an integer; when it is detected that the touch operation has not ended, acquire the target position where the contact stays on the touch panel at the target moment in the next frame period after the current frame period, take the target moment in the next frame period as the target moment in the current frame period, and take the target position where the contact stays in the next frame period as the current position; and when the end of the touch operation is detected, take the target moment in the current frame period as the fourth moment and the current position as the fourth position. The end of the touch operation is determined when the contact area of the contact on the touch panel is less than or equal to a fifth threshold.
  • The recognition unit further includes: a second acquisition module, configured to acquire, after the touch operation is recognized according to the touch information, the action area where the contact interacts with the touch panel; and a determining module, configured to determine the interaction information requested by the interaction request according to the position of the action area on the touch panel.
  • The determining module includes at least one of the following: a third determining sub-module, configured to determine the character part of the holographic projection virtual character to interact with according to the position of the action area on the touch panel, and to determine the interaction information requested by the interaction request according to the character part; and a fourth determining sub-module, configured to determine the interaction type for interacting with the holographic projection virtual character according to the position of the action area on the touch panel, and to determine the interaction information requested by the interaction request according to the interaction type.
  • The control unit includes at least one of the following: a first control module for controlling the holographic projection virtual character to play an animation matching the interaction request; and a second control module for controlling the holographic projection virtual character to play audio matching the interaction request.
  • The apparatus further includes: an adjustment unit for adjusting the emotion parameter of the holographic projection virtual character; if it is determined that the emotion parameter reaches a target value, the holographic projection device changes the character response mode of the holographic projection virtual character, and/or the holographic projection device enables a hidden skill of the holographic projection virtual character in the virtual scene.
  • In another aspect, a storage medium is provided in which a computer program is stored, where the computer program is configured to execute the above interactive control method when run.
  • In another aspect, an electronic device is provided, including a memory and a processor, where a computer program is stored in the memory and the processor is configured to execute the above interactive control method through the computer program.
  • In another aspect, a computer program product including instructions is provided which, when run on a computer, causes the computer to execute the above interactive control method.
  • In the embodiments of the present application, a touch operation performed on a touch panel provided on the holographic projection device is recognized, where the holographic projection device is used to present a holographic projection virtual character; an interaction request is generated according to the recognized touch operation, where the interaction request is used to request interaction with the holographic projection virtual character; and the holographic projection virtual character is controlled to perform an interaction action matching the interaction request. Because the touch panel receives the touch operation and the holographic projection device is operated according to it, the user can interact with the holographic projection virtual character quickly and accurately, achieving flexible, accurate, and efficient control of a virtual partner and solving the technical problem in the related art that a virtual partner can be controlled in only a single way.
  • FIG. 1 is a schematic diagram of an application environment of an optional interactive control method according to an embodiment of the present application.
  • FIG. 2 is a schematic flowchart of an optional interactive control method according to an embodiment of the present application.
  • FIG. 3 is a schematic diagram of an optional interactive control method according to an embodiment of the present application.
  • FIG. 4 is a schematic diagram of another optional interactive control method according to an embodiment of the present application.
  • FIG. 5 is a schematic flowchart of another optional interactive control method according to an embodiment of the present application.
  • FIG. 6 is a schematic flowchart of another optional interactive control method according to an embodiment of the present application.
  • FIG. 7 is a schematic diagram of another optional interactive control method according to an embodiment of the present application.
  • FIG. 8 is a schematic diagram of another optional interactive control method according to an embodiment of the present application.
  • FIG. 9 is a schematic diagram of another optional interactive control method according to an embodiment of the present application.
  • FIG. 10 is a schematic diagram of another optional interactive control method according to an embodiment of the present application.
  • FIG. 11 is a schematic structural diagram of an optional holographic projection device according to an embodiment of the present application.
  • FIG. 12 is a schematic diagram of an optional holographic projection device according to an embodiment of the present application.
  • FIG. 13 is a schematic structural diagram of an optional interactive control device according to an embodiment of the present application.
  • an interactive control method is provided.
  • The foregoing interactive control method may be, but is not limited to being, applied to the environment shown in FIG. 1.
  • the user 102 may perform a touch operation on the touch panel 104.
  • the host 106 recognizes the touch operation and generates an interaction request according to the recognition result.
  • Through the interaction request, the host 106 controls the holographic projection virtual character 110 projected by the projector 108 to perform an interaction action matching the interaction request.
  • the battery 112 is used to power the entire device.
  • FIG. 1 is only an example of a hardware structure, and does not constitute a limitation on this embodiment.
  • The touchpad 104 and the projector 108 may also be provided separately.
  • In that case, the touch panel 104 transmits the touch signal to the host 106 through a signal transmission device connected to the touch panel 104.
  • The projector 108 receives the interaction action to be projected from the host 106 through a signal receiving device.
  • In the related art, a client or a voice command is usually used to interact with a holographic projection device.
  • Such interaction supports only a single type of control command, so the holographic projection device can only respond in simple ways.
  • In contrast, the method described above acquires a touch operation through a touch panel, recognizes the touch operation, generates an interaction request based on the recognized touch operation, and performs an interaction action matching the interaction request.
  • the holographic projection device can be controlled directly through the touch operation received by the touch panel, thereby enriching the instructions for controlling the holographic projection device and improving the flexibility of interacting with the virtual character of the holographic projection.
  • the interactive control method provided in the embodiments of the present application may be applied to a holographic projection device; and, as an optional implementation manner, the interactive control method may include:
  • the holographic projection device recognizes the touch operation performed on the touch panel, wherein the holographic projection device is used to present a virtual character of holographic projection; the touch panel is disposed on the holographic projection device.
  • the holographic projection device generates an interaction request according to the identified touch operation, where the interaction request is used to request interaction with the holographic projection virtual character;
  • the holographic projection device controls the holographic projection virtual character to perform an interaction action matching the interaction request.
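  • As an illustration only, these three steps could be organized as in the following sketch; the class name, method names, and the shape of the touch information are assumptions made for exposition, not interfaces defined by this application.

```python
# Hypothetical sketch of the three-step interactive control flow above.
# All names and data shapes are illustrative assumptions.

class HolographicProjectionDevice:
    def on_touch(self, touch_info: dict) -> None:
        """Entry point called with touch information from the touch panel."""
        operation = self.recognize_touch(touch_info)         # step 1: recognize
        request = self.build_interaction_request(operation)  # step 2: generate request
        self.control_character(request)                      # step 3: control character

    def recognize_touch(self, touch_info: dict) -> str:
        # Placeholder classification; the threshold-based rules are
        # described in detail later in this document.
        return "stroke" if touch_info.get("swipes", 0) > 3 else "pat"

    def build_interaction_request(self, operation: str) -> dict:
        # The interaction request asks the virtual character to perform
        # an action matching the recognized operation.
        return {"operation": operation}

    def control_character(self, request: dict) -> None:
        # Stand-in for driving the projection light engine to play the
        # matching animation and/or audio.
        print(f"character performs action for {request['operation']}")
```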
  • the above interaction control method may be, but not limited to, applied to the process of interacting with the holographic projection virtual character in the holographic projection device.
  • the above-mentioned interaction control method can be applied to the process of interacting with the projected holographic virtual cartoon character, and can also be applied to the process of interacting with the projected holographic virtual building.
  • the following will take the application of the interactive control method to the process of interacting with the holographic virtual animation character projected by the holographic projection device as an example for description.
  • the holographic virtual character is projected by the holographic projection device
  • the user can observe the holographic virtual character.
  • The holographic projection device recognizes the touch operation, generates an interaction request, and controls the holographic virtual cartoon character to perform an interaction action matching the interaction request (for example, acting shy); the user can then watch the holographic virtual cartoon character perform the interaction action, completing the interaction between the user and the character.
  • The holographic projection device first obtains the touch operation through the touchpad and recognizes it, then generates an interaction request according to the recognized touch operation, and finally performs an interaction action matching the interaction request.
  • With this control method, the holographic projection device can be controlled directly by the touch operation received on the touch panel, which enriches the instructions for controlling the holographic projection device and improves the flexibility of interacting with the holographic projection virtual character.
  • The interaction action may be, but is not limited to, a body movement of the holographic projection virtual character and/or a vocalization of the holographic projection virtual character.
  • Controlling the holographic projection virtual character to perform the interaction action matching the interaction request may be controlling it to play an animation matching the interaction request and/or to play audio matching the interaction request.
  • Controlling the holographic projection virtual character to play audio may be, but is not limited to, animating the character's mouth while playing the audio through a sound playback device.
  • The touch operation may be recognized from the touch information generated by the interaction between the contact and the touch panel: after the touch information is acquired, the touch operation is recognized according to it.
  • the above touch information may be, but not limited to, at least one of the following: the duration of the touch action of the contact, the touch trajectory of the contact, and the touch position where the contact is located.
  • The contact may be, but is not limited to, the position of contact between a human body, or a tool used by the human body, and the touchpad.
  • The human body part may be, but is not limited to, any part of a person, for example a finger, palm, back of the hand, nail, forehead, chin, cheek, or other part.
  • There may be, but need not be, multiple contacts; for example, a palm touching the touchpad forms multiple contacts that move synchronously with the same action, while a finger touching the touch screen forms a single contact.
  • The touch information may be, but is not limited to, contact movement information or contact time information.
  • Here the contact is the position of contact between the finger and the touch screen.
  • While in contact with the touchpad, the finger may move and leave a trace on the touch screen, or it may touch the screen briefly and leave, producing a click.
  • The trace or the click can be used as touch information, and the touch operation is recognized according to this touch information.
  • The holographic projection device recognizing the touch operation according to the touch information includes the following two cases.
  • When it is determined according to the touch information that the number of swipe actions performed by the contact on the touch panel is greater than the first threshold, the holographic projection device determines that the touch operation is a stroking operation on the holographic projection virtual character, where an action in which the contact moves a distance greater than the second threshold on the touch panel is recognized as a swipe action.
  • When it is determined according to the touch information that the distance the contact moves on the touch panel is less than the third threshold and the duration of the contact acting on the touch panel is less than the fourth threshold, the holographic projection device determines that the touch operation is a patting operation on the holographic projection virtual character.
  • The first, second, third, and fourth thresholds may be, but are not limited to being, set according to empirical values. Taking a finger in contact with the touchpad as an example, as shown in FIG. 3, when the finger contacts the touchpad it forms contact A; the finger then moves from contact A to contact B over a distance a. The distance a is then compared with the thresholds: if a is greater than or equal to the second threshold, the operation is a sliding (swipe) action; if a is less than the third threshold, it is not a sliding action and must be judged according to other conditions.
  • The action duration may be, but is not limited to, the duration of a single contact; for example, if the finger touches the touchpad for 1 second, the action duration is 1 second.
  • If the operation is a sliding action, the number of sliding actions received is counted; when this number is greater than the first threshold, the holographic projection device concludes that the user has performed multiple sliding actions on the touchpad, i.e., a stroking operation. If the operation is not a sliding action and its duration is less than the fourth threshold, the holographic projection device concludes that the user has performed a single short touch on the touch panel, i.e., a patting operation.
  • The holographic projection device may determine that the touch operation is a stroking operation on the holographic projection virtual character in, but not limited to, the following way: the device obtains the first position where the contact stays on the touch panel at the first moment and the second position where the contact stays on the touch panel at the second moment, where the time interval from the first moment to the second moment is one frame period; when the distance between the first position and the second position is greater than the second threshold, the device recognizes the movement of the contact from the first position to the second position as a swipe action and increments the swipe count by one; and when the swipe count is greater than the first threshold, the device determines that the touch operation is a stroking operation on the holographic projection virtual character.
  • FIG. 4 shows an optional coordinate system for touch points on the touch screen.
  • The coordinates of point C are (x, y), where x and y are both positive numbers.
  • The case where the palm is in contact with the touch screen is described below with reference to FIG. 5.
  • The touchpad is initially in the to-be-detected state (S502).
  • When the palm contacts the touchpad, S504 determines whether the number of contacts is greater than 3; if so, the coordinates of the contacts are recorded in S506.
  • When the contacts disappear, S508 determines whether the number of contacts before disappearance was greater than 3; if so, the disappearance coordinates are recorded in S510.
  • The starting coordinate of each contact corresponds to its disappearance coordinate.
  • In S512 the distance D between the starting coordinate and the disappearance coordinate of a contact is calculated, and S514 determines whether D is greater than the second threshold; if so, the sliding-action count E is incremented by 1 in S516. S518 then determines whether E is greater than the first threshold; if so, the current action is determined to be a stroking operation. If the number of contacts in S504 or S508 is not greater than 3, or after the current action has been determined to be a stroking operation, S522 resets E to 0.
  • The threshold of 3 used above for the number of contacts can be set flexibly.
  • The above process judges whether the palm is performing a stroking operation. To judge whether a finger (or another body part) is stroking, steps S504 and S508 in FIG. 5 can be, but are not limited to being, omitted, since there is no need to check whether the number of contacts is greater than 3; the remaining steps in FIG. 5 then determine whether the current operation is a stroking operation.
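  • A minimal sketch of this FIG. 5 flow (S502 to S522) follows; the threshold values and the per-contact records of appearance and disappearance coordinates are assumptions for illustration, not values taken from this application.

```python
# Sketch of the palm-stroke detection flow of FIG. 5 (S502-S522).
# Threshold values and the contact record format are assumed.
import math

FIRST_THRESHOLD = 3      # swipe count needed for a stroking operation (assumed)
SECOND_THRESHOLD = 50.0  # minimum travel distance, in pixels, for one swipe (assumed)
MIN_CONTACTS = 3         # palm gate of S504/S508; omit this check for a finger

swipe_count = 0          # "E" in FIG. 5

def on_contact_lifetime(start_points, end_points):
    """Called with the coordinates recorded when contacts appeared (S506)
    and when they disappeared (S510); returns True for a stroking operation."""
    global swipe_count
    # S504/S508: require more than MIN_CONTACTS touch points for a palm.
    if len(start_points) <= MIN_CONTACTS or len(end_points) <= MIN_CONTACTS:
        swipe_count = 0                    # S522: reset E
        return False
    # S512: distance D between a contact's start and disappearance coordinates.
    (x0, y0), (x1, y1) = start_points[0], end_points[0]
    d = math.hypot(x1 - x0, y1 - y0)
    if d > SECOND_THRESHOLD:               # S514
        swipe_count += 1                   # S516: E = E + 1
    if swipe_count > FIRST_THRESHOLD:      # S518
        swipe_count = 0                    # S522: reset E after recognition
        return True                        # stroking operation recognized
    return False
```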
  • The holographic projection device may determine that the touch operation is a patting operation on the holographic projection virtual character in, but not limited to, the following way: the device obtains the third position where the contact stays on the touch panel at the third moment and the fourth position where the contact stays on the touchpad at the fourth moment; when the distance between the third position and the fourth position is less than the third threshold and the time interval between the third moment and the fourth moment is less than the fourth threshold, it determines that the touch operation is a patting operation on the holographic projection virtual character.
  • If L is greater than or equal to the third threshold, or T is greater than or equal to the fourth threshold, or after the current action has been judged to be a patting operation, the device returns to the to-be-detected state of S602; and if the contacts do not disappear during detection, the contact positions and the current time are recorded periodically.
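  • The corresponding patting check, detailed below as the S602 to S610 flow, could be sketched as follows; the threshold values and the function signature are assumptions for illustration.

```python
# Sketch of the patting (tap) check: short travel AND short duration.
# Threshold values are assumed examples, not taken from this application.
import math

THIRD_THRESHOLD = 10.0   # maximum travel distance, in pixels, for a pat (assumed)
FOURTH_THRESHOLD = 0.3   # maximum contact duration, in seconds, for a pat (assumed)

def is_pat(appear_pos, appear_time, vanish_pos, vanish_time):
    """appear_*: coordinates/time recorded when the contact formed;
    vanish_*: coordinates/time recorded when the contact disappeared."""
    l = math.hypot(vanish_pos[0] - appear_pos[0], vanish_pos[1] - appear_pos[1])
    t = vanish_time - appear_time
    # A pat requires L < third threshold and T < fourth threshold;
    # otherwise the device returns to the to-be-detected state.
    return l < THIRD_THRESHOLD and t < FOURTH_THRESHOLD
```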
  • After the touch operation is recognized according to the touch information, the method further includes: the holographic projection device obtains the action area where the contact interacts with the touch panel, and determines the interaction information requested by the interaction request according to the position of the action area on the touch panel.
  • The touchpad is divided into six areas A, B, C, D, E, and F, and interactions generated in different areas correspond to different interaction information.
  • The sizes and shapes of the areas may be the same or different, and the number of areas the touchpad is divided into may be any integer greater than or equal to 1.
  • The holographic projection device determining the interaction information requested by the interaction request according to the position of the action area on the touch panel includes at least one of the following two steps.
  • the holographic projection device determines the character part interacting with the holographic projection virtual character according to the position of the action area on the touch panel; and determines the interaction information requested by the interaction request according to the character part.
  • The touchpad may be divided into different areas, with different areas representing different character parts.
  • For example, the touchpad is divided into six pieces corresponding to the character's head, left arm, right arm, stomach, left leg, and right leg. Touching different areas then means interacting with different body parts of the character, for example patting the head or stroking the stomach.
  • the holographic projection device determines the interaction type to interact with the holographic projection virtual character according to the position of the action area on the touch panel; and determines the interaction information requested by the interaction request according to the interaction type.
  • The interaction type may include, but is not limited to, an interaction form or an interaction mode.
  • The interaction form may be, but is not limited to, the actions the character performs or the sound it plays, and the interaction mode may include, but is not limited to, the mode the character is currently in.
  • The modes may be, but are not limited to, a variety of modes that imitate human emotions, for example a shy mode, an angry mode, a fear mode, or a coquettish mode.
  • FIG. 9 assigns different positions of the touchpad to different forms of interaction (for example, dancing or singing), while FIG. 10 assigns different positions of the touchpad to different human-like emotion modes (for example, shy mode or coquettish mode).
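  • As an illustration of this area-to-information mapping, the sketch below assumes a 2x3 grid of regions A to F; the grid layout and every region assignment are hypothetical examples, not assignments defined by this application.

```python
# Sketch of mapping the action area's position on the touch panel to
# interaction information. All assignments below are assumed examples.

REGION_TO_BODY_PART = {            # region -> character part
    "A": "head", "B": "left arm", "C": "right arm",
    "D": "stomach", "E": "left leg", "F": "right leg",
}
REGION_TO_INTERACTION = {          # region -> interaction form or emotion mode
    "A": "dance", "B": "sing",                # interaction forms (FIG. 9 style)
    "C": "shy mode", "D": "coquettish mode",  # emotion modes (FIG. 10 style)
}

def region_of(x: float, y: float, panel_w: float, panel_h: float) -> str:
    """Name the cells of an assumed 2x3 grid A-F."""
    col = min(int(3 * x / panel_w), 2)
    row = min(int(2 * y / panel_h), 1)
    return "ABCDEF"[row * 3 + col]

def interaction_info(x: float, y: float, panel_w: float, panel_h: float) -> dict:
    region = region_of(x, y, panel_w, panel_h)
    # The interaction request may carry the character part, the interaction
    # type, or both, depending on how the panel has been configured.
    return {
        "part": REGION_TO_BODY_PART[region],
        "type": REGION_TO_INTERACTION.get(region, "default"),
    }
```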
  • The interactive control method further includes: the holographic projection device adjusts the emotion parameter of the holographic projection virtual character; if it is determined that the emotion parameter reaches a target value, the holographic projection device changes the character response mode of the holographic projection virtual character, and/or the holographic projection device enables a hidden skill of the holographic projection virtual character in the virtual scene.
  • The embodiments of the present application do not limit when "the holographic projection device adjusts the emotion parameter of the holographic projection virtual character" is executed; for example, it can be executed while the holographic projection virtual character is being controlled to perform an interaction action matching the interaction request.
  • The emotion parameter may be represented by a number, or by a number of red hearts, an energy value, and the like.
  • The hidden skills may be, but are not limited to, new animations, new sounds, or new stages. For example, with a new animation as the hidden skill, once the emotion parameter reaches a certain value the character is allowed to play the new animation; if the parameter has not reached that value, the character is not allowed to play it.
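  • A minimal sketch of this emotion-parameter mechanism, assuming an illustrative target value and asset names:

```python
# Sketch of the emotion-parameter mechanism. The target value, the step
# size, and the unlocked asset name are illustrative assumptions.

TARGET_VALUE = 10  # e.g. the number of red hearts required (assumed)

class VirtualCharacter:
    def __init__(self):
        self.emotion = 0                  # e.g. red hearts or an energy value
        self.hidden_skill_enabled = False

    def on_interaction(self, delta: int = 1) -> None:
        """Adjust the emotion parameter after an interaction."""
        self.emotion += delta
        if self.emotion >= TARGET_VALUE and not self.hidden_skill_enabled:
            # Change the character response mode and/or enable a hidden
            # skill (a new animation, sound, or stage) in the virtual scene.
            self.hidden_skill_enabled = True
            self.play("new_animation")    # hypothetical asset name

    def play(self, asset: str) -> None:
        print(f"playing {asset}")
```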
  • In summary, the holographic projection device obtains touch operations through the touch panel and recognizes them, generates an interaction request based on the recognized touch operation, and performs an interaction action matching the interaction request.
  • The holographic projection device can thus be controlled directly through the touch operation received by the touch panel, which enriches the instructions for controlling the holographic projection device and improves the flexibility of interacting with the holographic projection virtual character.
  • the holographic projection device identifying the touch operation performed on the touch panel provided on the holographic projection device includes:
  • Step 1: the holographic projection device obtains touch information generated by the interaction between the contact corresponding to the touch operation and the touch panel;
  • Step 2: the holographic projection device recognizes the touch operation according to the touch information.
  • The position where the finger touches the screen is the contact point; if the finger moves, the contact moves with it.
  • the movement trajectory of the contact point, the duration of the touch effect of the finger, and the position of the contact point are used as touch information, and the touch operation is recognized according to the touch information.
  • The touch information generated by the interaction between the contact corresponding to the touch operation and the touch panel is acquired, and the touch operation is recognized according to the touch information, thereby improving the control flexibility of the holographic projection device.
  • the holographic projection device recognizing the touch operation according to the touch information includes:
  • Step 1: when it is determined according to the touch information that the number of swipe actions performed by the contact on the touch panel is greater than the first threshold, the holographic projection device determines that the touch operation is a stroking operation on the holographic projection virtual character, where a swipe action is an action in which the contact moves a distance greater than the second threshold on the touch panel;
  • Step 2: when it is determined according to the touch information that the distance the contact moves on the touchpad is less than the third threshold and the duration of the contact on the touchpad is less than the fourth threshold, the holographic projection device determines that the touch operation is a patting operation on the holographic projection virtual character.
  • The first, second, third, and fourth thresholds may be, but are not limited to being, set according to empirical values. Taking a finger in contact with the touchpad as an example, as shown in FIG. 3, when the finger contacts the touchpad it forms contact A; the finger then moves from contact A to contact B over a distance a. The distance a is then compared with the thresholds: if a is greater than or equal to the second threshold, the operation is a sliding (swipe) action; if a is less than the third threshold, it is not a sliding action and must be judged according to other conditions.
  • The action duration may be, but is not limited to, the duration of a single contact; for example, if the finger touches the touchpad for 1 second, the action duration is 1 second.
  • If the operation is a sliding action, the number of sliding actions received is counted; when this number is greater than the first threshold, the holographic projection device concludes that the user has performed multiple sliding actions on the touchpad, i.e., a stroking operation. If the operation is not a sliding action and its duration is less than the fourth threshold, the holographic projection device concludes that the user has performed a single short touch on the touch panel, i.e., a patting operation.
  • By comparing the touch information with the first, second, third, and fourth thresholds, it is determined whether the current touch operation is a stroking operation or a patting operation, which improves the accuracy of judging the current touch operation and thereby the control flexibility of the holographic projection device.
  • The holographic projection device determining that the touch operation is a stroking operation on the holographic projection virtual character includes:
  • Step 1: the holographic projection device acquires the first position where the contact stays on the touchpad at the first moment, and the second position where the contact stays on the touchpad at the second moment, where the time interval between the first moment and the second moment is one frame period;
  • Step 2: when it is determined that the distance between the first position and the second position is greater than the second threshold, the holographic projection device recognizes the movement of the contact from the first position to the second position as a swipe action and increments the swipe-action count by one;
  • Step 3: when it is determined that the swipe-action count is greater than the first threshold, the holographic projection device determines that the touch operation is a stroking operation on the holographic projection virtual character.
  • As shown in FIG. 5, the touchpad is initially in the to-be-detected state (S502).
  • When the palm contacts the touchpad, S504 determines whether the number of contacts is greater than 3; if so, the coordinates of the contacts are recorded in S506.
  • When the contacts disappear, S508 determines whether the number of contacts before disappearance was greater than 3; if so, the disappearance coordinates are recorded in S510.
  • The starting coordinate of each contact corresponds to its disappearance coordinate.
  • In S512 the distance D between the starting coordinate and the disappearance coordinate of a contact is calculated, and S514 determines whether D is greater than the second threshold; if so, the sliding-action count E is incremented by 1 in S516. S518 then determines whether E is greater than the first threshold; if so, the current action is determined to be a stroking operation. If the number of contacts in S504 or S508 is not greater than 3, or after the current action has been determined to be a stroking operation, S522 resets E to 0.
  • The threshold of 3 used above for the number of contacts can be set flexibly.
  • The above process judges whether the palm is performing a stroking operation. To judge whether a finger (or another body part) is stroking, steps S504 and S508 in FIG. 5 can be, but are not limited to being, omitted, since there is no need to check whether the number of contacts is greater than 3; the remaining steps in FIG. 5 then determine whether the current operation is a stroking operation.
  • The above method is used to determine whether the current operation is a stroking operation, which improves the accuracy of judging the current touch operation and thereby the control flexibility of the holographic projection device.
  • The holographic projection device determining that the touch operation is a patting operation on the holographic projection virtual character includes:
  • Step 1: the holographic projection device obtains the third position where the contact stays on the touchpad at the third moment, and the fourth position where the contact stays on the touchpad at the fourth moment;
  • Step 2: when it is determined that the distance between the third position and the fourth position is less than the third threshold and the time interval between the third moment and the fourth moment is less than the fourth threshold, the holographic projection device determines that the touch operation is a patting operation on the holographic projection virtual character.
  • As shown in FIG. 6, the touchpad is initially in the to-be-detected state (S602). After the palm touches the touchpad, S604 detects whether the number of contacts is greater than 3; if so, the coordinates and formation time of one contact or of all contacts are recorded in S606. S608 determines whether the number of contacts equals 0; if so, the disappearance coordinates and disappearance times of those contacts are recorded. In S610, the distance L between the formation and disappearance coordinates of a contact, and the time interval T between its formation and disappearance, are calculated. If L is smaller than the third threshold and T is less than the fourth threshold, the current action is determined to be a patting operation.
  • The above method is used to determine whether the current operation is a patting operation, which improves the accuracy of judging the current touch operation and thereby the control flexibility of the holographic projection device.
  • The holographic projection device acquiring the fourth position where the contact stays on the touch panel at the fourth moment includes:
  • Step 1: after acquiring the third position where the contact stays on the touch panel, the holographic projection device performs the following steps in each frame period until the end of the touch operation is detected:
  • Step 2: the holographic projection device obtains the current position where the contact stays on the touch panel at the target moment in the current frame period, where the time interval from the third moment to the target moment is N frame periods, N ≥ 1, and N is an integer;
  • Step 3: when it is detected that the touch operation has not ended, the holographic projection device acquires the target position where the contact stays on the touch panel at the target moment in the next frame period after the current frame period, takes the target moment in the next frame period as the target moment in the current frame period, and takes the target position where the contact stays in the next frame period as the current position;
  • Step 4: when the end of the touch operation is detected, the holographic projection device takes the target moment in the current frame period as the fourth moment and the current position as the fourth position;
  • Step 5: the end of the touch operation is determined when the contact area of the contact on the touch panel is less than or equal to the fifth threshold.
  • The case where a finger is in contact with the touchpad is described below as an example.
  • After the finger touches the touchpad, the current position and current time of the contact are obtained. If the touch operation has not finished, the touch position and touch time of the contact in the next frame period are obtained, and so on until the touch operation ends; the touch position and touch time immediately before the end are then determined as the fourth position and the fourth moment. Since the area of contact between the finger and the touchpad increases as finger pressure increases, when the contact area decreases to an empirical value the touch operation is considered ended and the finger has left the touchpad.
  • Determining the fourth moment and the fourth position by the above method improves the accuracy of acquiring them, which further improves the control flexibility of the holographic projection device.
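  • Under the assumption that contact samples arrive once per frame period as (time, position, contact area) tuples, this tracking loop could be sketched as follows; the fifth-threshold value is an example, not one defined by this application.

```python
# Sketch of the per-frame loop that finds the fourth moment and position.
FIFTH_THRESHOLD = 2.0  # contact area at or below which the touch has ended (assumed)

def track_until_release(frames):
    """`frames` yields (time, position, contact_area) once per frame period
    after the third position was acquired; returns the last sample before
    the finger left the panel as (fourth_moment, fourth_position)."""
    current_time, current_pos = None, None
    for time, pos, area in frames:
        if area <= FIFTH_THRESHOLD:
            # The contact area shrank to the threshold: the touch operation
            # has ended, so the last recorded sample is kept.
            break
        # Touch not ended: this frame's sample becomes the current one.
        current_time, current_pos = time, pos
    return current_time, current_pos
```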
  • The method further includes:
  • Step 1: the holographic projection device obtains the action area where the contact interacts with the touch panel;
  • Step 2: the holographic projection device determines the interaction information requested by the interaction request according to the position of the action area on the touch panel.
  • The touchpad is divided into six areas A, B, C, D, E, and F, and interactions generated in different areas correspond to different interaction information.
  • The sizes and shapes of the areas may be the same or different, and the number of areas the touchpad is divided into may be any integer greater than or equal to 1.
  • The interaction information is determined according to the position, on the touch panel, of the action area where the contact interacts with the touch panel, thereby improving the control flexibility of the holographic projection device.
  • The holographic projection device determining the interaction information requested by the interaction request according to the position of the action area on the touch panel includes at least one of the following two steps:
  • the holographic projection device determines the character part of the holographic projection virtual character to interact with according to the position of the action area on the touch panel, and determines the interaction information requested by the interaction request according to the character part;
  • the holographic projection device determines the interaction type for interacting with the holographic projection virtual character according to the position of the action area on the touch panel, and determines the interaction information requested by the interaction request according to the interaction type.
  • The interaction type may include, but is not limited to, an interaction form or an interaction mode.
  • The interaction form may be, but is not limited to, the actions the character performs or the sound it plays, and the interaction mode may include, but is not limited to, the mode the character is currently in.
  • The modes may be, but are not limited to, a variety of modes that imitate human emotions, for example a shy mode, an angry mode, a fear mode, or a coquettish mode.
  • For example, the touchpad is divided into six pieces corresponding to the character's head, left arm, right arm, stomach, left leg, and right leg; touching different areas means interacting with different body parts of the character, for example patting the head or stroking the stomach. Alternatively, the division can be as shown in FIGS. 9 and 10:
  • FIG. 9 assigns different positions of the touchpad to different forms of interaction (for example, dancing or singing), and
  • FIG. 10 assigns different positions of the touchpad to different modes imitating human emotions (for example, shy mode or coquettish mode).
  • The holographic projection device controlling the holographic projection virtual character to perform an interaction action matching the interaction request includes at least one of the following:
  • the holographic projection device controls the holographic projection virtual character to play an animation matching the interaction request;
  • the holographic projection device controls the holographic projection virtual character to play audio matching the interaction request.
  • Controlling the holographic projection virtual character to play audio may be, but is not limited to, animating the character's mouth while playing the audio through a sound playback device.
  • For example, the holographic projection virtual character can be controlled to dance or perform other actions, or its mouth can be animated while audio plays, simulating the character producing the sound.
  • The interactive control method further includes:
  • the holographic projection device adjusts the emotion parameter of the holographic projection virtual character; if it is determined that the emotion parameter reaches the target value, the holographic projection device changes the character response mode of the holographic projection virtual character, and/or the holographic projection device enables the hidden skill of the holographic projection virtual character in the virtual scene.
  • The emotion parameter may be represented by a number, or by a number of red hearts, an energy value, and the like.
  • The hidden skills may be, but are not limited to, new animations, new sounds, or new stages. For example, with a new animation as the hidden skill, once the emotion parameter reaches a certain value the character is allowed to play the new animation; if the parameter has not reached that value, the character is not allowed to play it.
  • For example, the number of red hearts represents the emotion parameter. After the user interacts with the holographic projection virtual character, the number of red hearts reaches a certain value; the holographic projection virtual character can then produce new changes, for example dancing, singing, or changing clothes.
  • a holographic projection device for implementing the above interactive control method is also provided.
  • the above holographic projection device includes:
  • a touchpad 1102, used to collect touch information generated by a touch operation;
  • a processor 1104, connected to the touchpad and the projection light engine, used to recognize the touch operation according to the touch information, to generate an interaction request according to the recognized touch operation (where the interaction request is used to request interaction with the holographic projection virtual character), and to send the interaction request to the projection light engine;
  • a projection light engine 1106, used to project the holographic projection virtual character onto the holographic projection film;
  • the projection light engine 1106 is further configured to control, according to the interaction request, the holographic projection virtual character to perform an interaction action matching the interaction request.
  • the above-mentioned touchpad may be, but is not limited to being, one or more touchpads.
  • when there are multiple touchpads, each touchpad may be, but is not limited to being, mapped to a different position of the holographic projection virtual character.
  • the above-mentioned processor may include, but is not limited to: a receiving module, configured to receive the touch information collected by the touchpad 1102; a computing module, configured to compute, according to the received touch information, the interaction action corresponding to the touch information; a storage module, configured to store the interaction actions that the holographic projection virtual character may perform; and a transmitter, configured to send the animation information, corresponding to the interaction action that the holographic projection virtual object performs according to the touch information, to the projection light engine.
  • after acquiring the animation information, the projection light engine projects it onto the holographic projection film to display the animation of the holographic projection virtual character. A rough sketch of this module pipeline appears below.
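As a rough illustration of how these modules might hand data to one another, here is a hedged Python sketch. All class and method names are invented for illustration; the real device logic is not disclosed at this level of detail.

    # Hedged sketch of the receive -> compute -> store -> send pipeline
    # described above. All names are illustrative assumptions.

    class Projector:
        def project(self, animation: str) -> None:
            print(f"projecting {animation} onto the holographic film")

    class Processor:
        def __init__(self, projector: Projector) -> None:
            self.projector = projector
            # storage module: maps recognized operations to stored animations
            self.animation_store = {"stroking": "purr_animation", "patting": "fall_animation"}

        def on_touch_info(self, touch_info: dict) -> None:
            """Receiving module entry point: called with raw touchpad samples."""
            action = self.compute_action(touch_info)      # computing module
            animation = self.animation_store.get(action)  # storage module
            if animation is not None:
                self.projector.project(animation)         # transmitter -> light engine

        def compute_action(self, touch_info: dict) -> str:
            # Stand-in for the threshold-based recognition described in the method.
            return "patting" if touch_info.get("duration", 0.0) < 0.2 else "stroking"

    Processor(Projector()).on_touch_info({"duration": 0.1})  # -> fall_animation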
  • the above-mentioned holographic projection device may be, but is not limited to being, applied in the process of interacting with the holographic projection virtual character in the holographic projection device, for example, in the process of interacting with a projected holographic virtual cartoon character, or in the process of interacting with a projected holographic virtual building.
  • FIG. 12 shows two photographs of an optional holographic projection device.
  • a hand can touch the touchpad area of the holographic projection device on the left, thereby generating touch information.
  • the holographic projection device judges the touch information. For example, when short tap information is acquired, the action of the cartoon character corresponding to that touch information is looked up from the storage location according to the information, or the acquired short tap information is judged to determine the cartoon character action corresponding to it, so that the action corresponding to the short tap information is projected above the touchpad.
  • as with the cartoon character in the holographic projection device on the right of FIG. 12, when the holographic projection device receives short tap information, it projects an animation of the character being hit and falling down, thereby forming an interaction between the user and the cartoon character.
  • an interaction control apparatus for implementing the foregoing interaction control method is also provided. As shown in FIG. 13, the apparatus is applied to a holographic projection device.
  • the apparatus includes:
  • the recognition unit 1302 is configured to recognize the touch operation performed on the touchpad, where the holographic projection device is configured to present a holographic projection virtual character, and the touchpad is provided on the holographic projection device;
  • the generation unit 1304 is configured to generate an interaction request according to the recognized touch operation, where the interaction request is used to request interaction with the holographic projection virtual character;
  • the control unit 1306 is configured to control the holographic projection virtual character to perform an interaction action matching the interaction request.
  • the recognition unit may include:
  • the first acquisition module is configured to acquire the touch information generated by interaction between the contact point corresponding to the touch operation and the touchpad;
  • the above recognition module may include:
  • the first determination submodule is configured to determine, when it is determined according to the touch information that the number of swipe actions performed by the contact point on the touchpad is greater than a first threshold, that the touch operation is a stroking operation performed on the holographic projection virtual character, where a swipe action is an action in which the distance the contact point moves on the touchpad is greater than a second threshold;
  • the second determination submodule is configured to determine, when it is determined according to the touch information that the distance the contact point moves on the touchpad is less than a third threshold and the duration of the contact point acting on the touchpad is less than a fourth threshold, that the touch operation is a patting operation performed on the holographic projection virtual character.
  • the above-mentioned first determination submodule is also configured to perform the following steps:
  • the movement of the contact point from the first position to the second position is recognized as a swipe action, and the count of swipe actions is incremented by one;
  • the above-mentioned second determination submodule is also configured to perform the following steps:
  • the target moment in the current frame period is taken as the fourth moment, and the current position is taken as the fourth position;
  • the recognition unit further includes:
  • the second acquisition module is configured to acquire, after the touch operation is recognized according to the touch information, the action area where the contact point interacts with the touchpad;
  • the determination module is configured to determine, according to the position of the action area on the touchpad, the interaction information requested by the interaction request.
  • the above determination module includes at least one of the following:
  • the third determination submodule is configured to determine, according to the position of the action area on the touchpad, the character body part interacting with the holographic projection virtual character, and to determine the interaction information requested by the interaction request according to the character body part;
  • the fourth determination submodule is configured to determine, according to the position of the action area on the touchpad, the type of interaction with the holographic projection virtual character, and to determine the interaction information requested by the interaction request according to the interaction type.
  • the above control unit includes at least one of the following:
  • the first control module is configured to control the holographic projection virtual character to play an animation matching the interaction request;
  • the second control module is configured to control the holographic projection virtual character to play audio matching the interaction request.
  • the above apparatus further includes:
  • an adjustment unit, configured to adjust the emotion parameter of the holographic projection virtual character; if it is determined that the emotion parameter reaches the target value, the holographic projection device controls the character response mode of the holographic projection virtual character, and/or the holographic projection device unlocks the hidden skill of the holographic projection virtual character in the virtual scene.
  • a storage medium is provided, in which a computer program is stored, where the computer program is configured to perform the steps in any one of the above method embodiments when run.
  • the above storage medium may be configured to store a computer program for performing the steps of the above interaction control method.
  • the storage medium may include: a flash drive, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.
  • an embodiment of the present application also provides an electronic device, including a memory and a processor, where a computer program is stored in the memory, and the processor is configured to perform, through the computer program, the interaction control method provided in the foregoing embodiments.
  • an embodiment of the present application also provides a computer program product including instructions, which, when run on a server, causes the server to perform the interaction control method provided in the foregoing embodiments.
  • if the integrated units in the above embodiments are implemented in the form of software functional units and sold or used as independent products, they may be stored in the computer-readable storage medium.
  • the technical solution of the present application essentially, or the part contributing to the existing technology, or all or part of the technical solution, may be embodied in the form of a software product, and the computer software product is stored in a storage medium.
  • several instructions are included to enable one or more computer devices (which may be personal computers, servers, network devices, etc.) to perform all or some of the steps of the methods described in the embodiments of the present application.
  • a software module may reside in random access memory (RAM), read-only memory (ROM), electrically programmable ROM, electrically erasable programmable ROM, registers, a hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the art.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Optics & Photonics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

This application discloses an interaction control method and apparatus, a storage medium, and an electronic device. The method includes: recognizing a touch operation performed on a touchpad provided on a holographic projection device, where the holographic projection device is configured to present a holographic projection virtual character; generating an interaction request according to the recognized touch operation, where the interaction request is used to request interaction with the holographic projection virtual character; and controlling the holographic projection virtual character to perform an interaction action matching the interaction request. This application can effectively implement interaction control between a user and the holographic projection virtual character in a holographic projection device.

Description

Interaction control method and apparatus, storage medium, and electronic device
This application claims priority to Chinese Patent Application No. 201811475755.1, entitled "Interaction control method and apparatus, storage medium, and electronic device", filed with the Chinese Patent Office on December 4, 2018, which is incorporated herein by reference in its entirety.
Technical Field
This application relates to the field of computers, and specifically to interaction control.
Background
With the continuous development of virtual imaging technology, some application developers have begun to use artificial intelligence (AI) holographic three-dimensional projection technology to build holographic projection devices, in which a holographic projection virtual character that exchanges information with a user in real time can be presented. After interacting with the user many times, the holographic projection virtual character can not only learn the user's preferences and habits and respond promptly and accurately to the user's instructions, but can also gradually develop a unique personality.
However, how to effectively implement interaction control between a user and the holographic projection virtual character in a holographic projection device remains a technical problem to be solved.
Summary
Embodiments of this application provide an interaction control method and apparatus, a storage medium, and an electronic device, which can effectively implement interaction control between a user and the holographic projection virtual character in a holographic projection device.
According to one aspect of the embodiments of this application, an interaction control method is provided, including: recognizing, by a holographic projection device, a touch operation performed on a touchpad, where the holographic projection device is configured to present a holographic projection virtual character, and the touchpad is provided on the holographic projection device; generating, by the holographic projection device, an interaction request according to the recognized touch operation, where the interaction request is used to request interaction with the holographic projection virtual character; and controlling, by the holographic projection device, the holographic projection virtual character to perform an interaction action matching the interaction request.
According to another aspect of the embodiments of this application, a holographic projection device is further provided, including: a holographic projection film, configured to present a holographic projection virtual character; a projection light engine, configured to control projection of the holographic projection virtual character onto the holographic projection film; a touchpad, configured to collect touch information generated by a touch operation; and a processor, connected to the touchpad and the projection light engine, configured to recognize the touch operation according to the touch information, further configured to generate an interaction request according to the recognized touch operation, where the interaction request is used to request interaction with the holographic projection virtual character, and further configured to send the interaction request to the projection light engine, the projection light engine being further configured to control, according to the interaction request, the holographic projection virtual character to perform an interaction action matching the interaction request.
According to yet another aspect of the embodiments of this application, an interaction control apparatus is further provided, applied to a holographic projection device, including: a recognition unit, configured to recognize a touch operation performed on a touchpad, where the holographic projection device is configured to present a holographic projection virtual character, and the touchpad is provided on the holographic projection device; a generation unit, configured to generate an interaction request according to the recognized touch operation, where the interaction request is used to request interaction with the holographic projection virtual character; and a control unit, configured to control the holographic projection virtual character to perform an interaction action matching the interaction request.
As an optional example, the recognition unit includes: a first acquisition module, configured to acquire touch information generated by interaction between a contact point corresponding to the touch operation and the touchpad; and a recognition module, configured to recognize the touch operation according to the touch information.
As an optional example, the recognition module includes: a first determination submodule, configured to determine, when it is determined according to the touch information that the number of swipe actions performed by the contact point on the touchpad is greater than a first threshold, that the touch operation is a stroking operation performed on the holographic projection virtual character, where a swipe action is an action in which the distance the contact point moves on the touchpad is greater than a second threshold; and a second determination submodule, configured to determine, when it is determined according to the touch information that the distance the contact point moves on the touchpad is less than a third threshold and the duration of the contact point acting on the touchpad is less than a fourth threshold, that the touch operation is a patting operation performed on the holographic projection virtual character.
As an optional example, the first determination submodule is further configured to perform the following steps: acquiring a first position where the contact point stays on the touchpad at a first moment, and a second position where the contact point stays on the touchpad at a second moment, where the time interval between the first moment and the second moment is one frame period; when it is determined that the distance between the first position and the second position is greater than the second threshold, recognizing the movement of the contact point from the first position to the second position as the swipe action, and incrementing the count of swipe actions by one; and when it is determined that the number of swipe actions is greater than the first threshold, determining that the touch operation is the stroking operation performed on the holographic projection virtual character.
As an optional example, the second determination submodule is further configured to perform the following steps: acquiring a third position where the contact point stays on the touchpad at a third moment, and a fourth position where the contact point stays on the touchpad at a fourth moment; and when it is determined that the distance between the third position and the fourth position is less than the third threshold and the time interval between the third moment and the fourth moment is less than the fourth threshold, determining that the touch operation is the patting operation performed on the holographic projection virtual character.
As an optional example, the second determination submodule is further configured to perform the following steps: after the third position where the contact point stays on the touchpad is acquired, performing the following steps in each frame period until it is detected that the touch operation ends: acquiring a current position where the contact point stays on the touchpad at a target moment in the current frame period, where the time interval between the third moment and the target moment is N frame periods, N ≥ 1, and N is an integer; when it is detected that the touch operation has not ended, acquiring a target position where the contact point stays on the touchpad at the target moment in the next frame period after the current frame period, taking the target moment in the next frame period as the target moment in the current frame period, and taking the target position where the contact point stays in the next frame period as the current position; and when it is detected that the touch operation has ended, taking the target moment in the current frame period as the fourth moment, and taking the current position as the fourth position, where it is determined that the touch operation ends when the contact area of the contact point on the touchpad is less than or equal to a fifth threshold.
As an optional example, the recognition unit further includes: a second acquisition module, configured to acquire, after the touch operation is recognized according to the touch information, the action area where the contact point interacts with the touchpad; and a determination module, configured to determine, according to the position of the action area on the touchpad, the interaction information requested by the interaction request.
As an optional example, the determination module includes at least one of the following: a third determination submodule, configured to determine, according to the position of the action area on the touchpad, the character body part to interact with on the holographic projection virtual character, and determine the interaction information requested by the interaction request according to the character body part; and a fourth determination submodule, configured to determine, according to the position of the action area on the touchpad, the interaction type of the interaction with the holographic projection virtual character, and determine the interaction information requested by the interaction request according to the interaction type.
As an optional example, the control unit includes at least one of the following: a first control module, configured to control the holographic projection virtual character to play an animation matching the interaction request; and a second control module, configured to control the holographic projection virtual character to play audio matching the interaction request.
As an optional example, the apparatus further includes: an adjustment unit, configured to adjust an emotion parameter of the holographic projection virtual character, where if it is determined that the emotion parameter reaches a target value, the holographic projection device controls the character response mode of the holographic projection virtual character, and/or the holographic projection device unlocks a hidden skill of the holographic projection virtual character in the virtual scene.
According to yet another aspect of the embodiments of this application, a storage medium is further provided, storing a computer program, where the computer program is configured to perform the above interaction control method when run.
According to yet another aspect of the embodiments of this application, an electronic device is further provided, including a memory and a processor, where the memory stores a computer program, and the processor is configured to perform the above interaction control method through the computer program.
According to yet another aspect of the embodiments of this application, a computer program product including instructions is further provided, which, when run on a computer, causes the computer to perform the above interaction control method.
In the embodiments of this application, a touch operation performed on a touchpad provided on a holographic projection device is recognized, where the holographic projection device is configured to present a holographic projection virtual character; an interaction request is generated according to the recognized touch operation, where the interaction request is used to request interaction with the holographic projection virtual character; and the holographic projection virtual character is controlled to perform an interaction action matching the interaction request. In the process of controlling the holographic projection device, a touchpad is used to receive touch operations and the holographic projection device is operated according to those touch operations, so that the user can interact with the holographic projection virtual character in the holographic projection device quickly and accurately according to the touch operations received by the touchpad. This achieves the technical effect of controlling the virtual companion flexibly, accurately, and efficiently, and thereby solves the technical problem in the related art that virtual companions can be controlled in only a single way.
Brief Description of the Drawings
The accompanying drawings described herein are provided for a further understanding of this application and constitute a part of this application. The exemplary embodiments of this application and their descriptions are used to explain this application and do not constitute any improper limitation on this application. In the drawings:
FIG. 1 is a schematic diagram of the application environment of an optional interaction control method according to an embodiment of this application;
FIG. 2 is a schematic flowchart of an optional interaction control method according to an embodiment of this application;
FIG. 3 is a schematic diagram of an optional interaction control method according to an embodiment of this application;
FIG. 4 is a schematic diagram of another optional interaction control method according to an embodiment of this application;
FIG. 5 is a schematic flowchart of another optional interaction control method according to an embodiment of this application;
FIG. 6 is a schematic flowchart of yet another optional interaction control method according to an embodiment of this application;
FIG. 7 is a schematic diagram of yet another optional interaction control method according to an embodiment of this application;
FIG. 8 is a schematic diagram of yet another optional interaction control method according to an embodiment of this application;
FIG. 9 is a schematic diagram of yet another optional interaction control method according to an embodiment of this application;
FIG. 10 is a schematic diagram of yet another optional interaction control method according to an embodiment of this application;
FIG. 11 is a schematic structural diagram of an optional holographic projection device according to an embodiment of this application;
FIG. 12 is a schematic diagram of an optional holographic projection device according to an embodiment of this application;
FIG. 13 is a schematic structural diagram of an optional interaction control apparatus according to an embodiment of this application.
Detailed Description
To enable a person skilled in the art to better understand the solutions of this application, the technical solutions in the embodiments of this application are described below clearly and completely with reference to the accompanying drawings in the embodiments of this application. Apparently, the described embodiments are merely some rather than all of the embodiments of this application. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of this application without creative effort shall fall within the protection scope of this application.
It should be noted that the terms "first", "second", and so on in the specification, claims, and accompanying drawings of this application are used to distinguish similar objects and are not necessarily used to describe a particular order or sequence. It should be understood that data used in this way are interchangeable where appropriate, so that the embodiments of this application described herein can be implemented in orders other than those illustrated or described herein. Moreover, the terms "include" and "have" and any variants thereof are intended to cover non-exclusive inclusion. For example, a process, method, system, product, or device that includes a series of steps or units is not necessarily limited to those expressly listed steps or units, but may include other steps or units that are not expressly listed or that are inherent to such a process, method, product, or device.
According to one aspect of the embodiments of this application, an interaction control method is provided. Optionally, as an optional implementation, the interaction control method may be, but is not limited to being, applied in the environment shown in FIG. 1. A user 102 may perform a touch operation on a touchpad 104. After acquiring the touch operation received by the touchpad 104, a host 106 recognizes the touch operation and generates an interaction request according to the recognition result. Through the interaction request, the host 106 controls the holographic projection virtual character 110 projected by a projector 108 to perform an interaction action matching the interaction request. In this device, a battery 112 supplies power to the entire device.
It should be noted that FIG. 1 is merely an example of a hardware structure and does not constitute a limitation on this embodiment. The touchpad 104 and the projector 108 may also appear separately. In this case, the touchpad 104 sends touch signals to the host 106 through a signal sending device connected to the touchpad 104, and the projector 108 receives the interaction action to be projected from the host 106 through a signal receiving apparatus.
It should be noted that, in the related art, when a holographic projection device is used to present a holographic projection virtual character, a client or voice instructions are usually used to interact with the holographic projection device. However, the control instructions of such interaction methods are limited, and the holographic projection device can make only simple responses. In this embodiment, by contrast, a touch operation is acquired through a touchpad and recognized, an interaction request is generated according to the recognized touch operation, and an interaction action matching the interaction request is performed according to the interaction request. In this method, since the holographic projection device can be controlled directly through the touch operations received by the touchpad, the instructions for controlling the holographic projection device are enriched, and the flexibility of interacting with the holographic projection virtual character is improved.
To facilitate understanding and explanation of the interaction control method provided in the embodiments of this application, a description is given below with reference to FIG. 2. The interaction control method provided in the embodiments of this application may be applied to a holographic projection device. As an optional implementation, the interaction control method may include:
S202: The holographic projection device recognizes a touch operation performed on a touchpad, where the holographic projection device is configured to present a holographic projection virtual character, and the touchpad is provided on the holographic projection device.
S204: The holographic projection device generates an interaction request according to the recognized touch operation, where the interaction request is used to request interaction with the holographic projection virtual character.
S206: The holographic projection device controls the holographic projection virtual character to perform an interaction action matching the interaction request.
Optionally, the above interaction control method may be, but is not limited to being, applied in the process of interacting with the holographic projection virtual character in a holographic projection device. For example, the interaction control method may be applied in the process of interacting with a projected holographic virtual cartoon character, or in the process of interacting with a projected holographic virtual building.
For ease of understanding and explanation, the following description takes applying the interaction control method to the process of interacting with a holographic virtual cartoon character projected by the holographic projection device as an example. When the holographic projection device projects the holographic virtual cartoon character, the user can observe the character. After the user produces a touch operation by touching the touchpad, the holographic projection device recognizes the touch operation, generates an interaction request, and, according to the interaction request, controls the holographic virtual cartoon character to perform an interaction action matching the request (for example, acting shy). The user can then watch the holographic virtual cartoon character perform the interaction action, completing the interaction between the user and the character.
In this embodiment, the holographic projection device adopts a control method in which a touch operation is first acquired through the touchpad so as to be recognized, and an interaction request is then generated according to the recognized touch operation so that an interaction action matching the request can be performed. In this control method, since the holographic projection device can be controlled directly through the touch operations received by the touchpad, the instructions for controlling the holographic projection device are enriched, and the flexibility of interacting with the holographic projection virtual character is improved.
Optionally, the interaction action may be, but is not limited to, a body movement of the holographic projection virtual character and/or a vocalization of the holographic projection virtual character. For example, "controlling the holographic projection virtual character to perform an interaction action matching the interaction request" may be controlling the holographic projection virtual character to play an animation matching the interaction request, and/or controlling the holographic projection virtual character to play audio matching the interaction request.
Optionally, controlling the holographic projection virtual character to play audio may be, but is not limited to, controlling the mouth of the holographic projection virtual character to move while a sound playback device plays the audio.
Optionally, the touch operation may be acquired through the touch information generated by the interaction between the contact point and the touchpad. After the touch information is acquired, the touch operation is recognized according to the touch information.
Optionally, the touch information may be, but is not limited to, at least one of the following: the touch duration of the contact point, the touch trajectory of the contact point, and the touch position of the contact point.
Optionally, the contact point may be, but is not limited to, the contact position between the touchpad and a human body or a tool used by a human body. The human body part may be, but is not limited to, any part of a person, for example, a finger, palm, back of the hand, fingernail, forehead, chin, cheek, or another part. Optionally, there may be, but is not limited to being, multiple contact points. For example, when a palm touches the touchpad, multiple contact points are formed; these contact points are synchronized and move consistently. When a finger touches the touchscreen, a single contact point is formed. The touch information may be, but is not limited to, movement information of the contact point or time information of the contact point. For example, taking the contact point as the contact position between a finger and the touchscreen: while in contact with the touchpad, the finger may move, leaving a trajectory on the touchscreen; or the finger may touch the touchscreen once and leave within a short time, producing a tap operation. In this way, the trajectory or tap operation can be used as touch information so that the touch operation can be recognized from it. A small sketch of such a touch-information record follows.
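As an illustration of what such touch information might look like as a data structure, here is a small Python sketch. The field names are assumptions, chosen to mirror the three kinds of information listed above (duration, trajectory, position); the patent does not prescribe a concrete layout.

    # Illustrative container for the touch information described above.
    # Field names are assumptions; the patent only names the kinds of information.

    from dataclasses import dataclass, field
    from typing import List, Tuple

    @dataclass
    class TouchInfo:
        duration: float = 0.0                            # touch action duration (s)
        trajectory: List[Tuple[float, float]] = field(default_factory=list)  # contact path
        position: Tuple[float, float] = (0.0, 0.0)       # latest contact position

        def add_sample(self, t: float, x: float, y: float) -> None:
            """Record one per-frame sample of the contact point."""
            self.duration = t
            self.position = (x, y)
            self.trajectory.append((x, y))

    info = TouchInfo()
    info.add_sample(0.02, 10.0, 20.0)
    info.add_sample(0.04, 12.0, 21.0)
    print(info.duration, info.position, len(info.trajectory))  # 0.04 (12.0, 21.0) 2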
Optionally, the holographic projection device recognizing the touch operation according to the touch information includes:
(1) When the touch information indicates that the number of swipe actions performed by the contact point on the touchpad is greater than a first threshold, the holographic projection device determines that the touch operation is a stroking operation performed on the holographic projection virtual character, where an action in which the distance the contact point moves on the touchpad is greater than a second threshold is recognized as a swipe action;
(2) When the touch information indicates that the distance the contact point moves on the touchpad is less than a third threshold and the duration of the contact point acting on the touchpad is less than a fourth threshold, the holographic projection device determines that the touch operation is a patting operation performed on the holographic projection virtual character.
Optionally, the first threshold, second threshold, third threshold, and fourth threshold may be, but are not limited to being, set based on empirical values. Taking a contact point formed by a finger touching the touchpad as an example, as shown in FIG. 3, after the finger touches the touchpad and forms contact point A, the finger moves from contact point A to contact point B over a distance a. The distance a is then compared with the thresholds: if a is greater than or equal to the second threshold, the operation is a swipe operation; if a is less than the third threshold, the operation is not a swipe operation and must be judged according to other conditions. Optionally, the action duration may be, but is not limited to, the duration of a single contact. For example, if the finger touches the touchpad for 1 second, the action duration is 1 second.
Further, when the operation is determined to be a swipe operation and multiple swipe operations are received, the number of received swipe operations is judged. If the number of swipe operations is greater than the first threshold, the holographic projection device can conclude that the user has performed multiple swipe operations on the touchpad, and therefore that the user is performing a stroking operation. If the operation is not a swipe operation and its action duration is less than the fourth threshold, the holographic projection device can conclude that the user has performed a single short touch on the touchpad, and therefore that the user is performing a patting action.
Optionally, the holographic projection device may determine that the touch operation is a stroking operation performed on the holographic projection virtual character by, but not limited to, the following method: the holographic projection device acquires a first position where the contact point stays on the touchpad at a first moment, and a second position where the contact point stays on the touchpad at a second moment, where the time interval between the first moment and the second moment is one frame period; when the distance between the first position and the second position is greater than the second threshold, the holographic projection device recognizes the movement of the contact point from the first position to the second position as a swipe action and increments the count of swipe actions by one; and when the number of swipe actions is greater than the first threshold, the holographic projection device determines that the touch operation is a stroking operation performed on the holographic projection virtual character.
Optionally, coordinate information may be, but is not limited to being, added to the touchscreen. For example, as shown in FIG. 4, which shows the touch nodes of an optional touchscreen, a planar rectangular coordinate system is set up with the lower-left corner of the touchscreen as the origin and the touchscreen as the plane. A coordinate value can then be obtained for any point on the touchscreen; for example, point C has coordinates (x, y), where x and y are both positive numbers.
Optionally, the case where a palm touches the touchscreen is described with reference to FIG. 5. When it is determined that the touchpad is in the to-be-detected state (S502) and the palm touches the touchpad, S504 judges whether the number of contact points is greater than 3. If so, S506 records the contact point coordinates; the coordinates of one contact point may be selected, or the coordinates of all contact points may be recorded. When the contact points disappear, S508 judges whether the number of contact points before disappearance was greater than 3. If so, S510 records the coordinates where the contact points disappeared; the contact point whose start coordinates were recorded may be selected and its disappearance coordinates recorded, or the disappearance coordinates of all contact points may be recorded. The start coordinates of each contact point correspond to its disappearance coordinates. S512 computes the distance D between the start coordinates and disappearance coordinates of a contact point, and S514 judges whether D is greater than the second threshold. If so, S516 increments the swipe action count E by 1. S518 judges whether E is greater than the first threshold; if so, the currently performed action is judged to be a stroking operation. In this process, if the number of contact points in S504 or S508 is not greater than 3, or after the current action is determined to be a stroking operation, S522 is performed to reset E to 0. It should be noted that the value 3 in the judgment of whether the number of contact points is greater than 3 can be set flexibly. Also, the above process judges whether a palm is performing a stroking operation. To judge whether a finger (or another body part) is stroking, steps S504 and S508 in FIG. 5 may be, but are not limited to being, removed, so that there is no need to judge whether the number of contact points is greater than 3; the remaining steps in FIG. 5 are sufficient to determine whether the current operation is a stroking operation. A code sketch of this flow follows.
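To ground the FIG. 5 flow, the following Python sketch implements the same checks. The more-than-3-contact-points condition and the reset behavior come from the description above, while the concrete threshold values, units, and function names are illustrative assumptions (the description leaves the thresholds as empirical settings).

    # Sketch of the FIG. 5 stroking (palm) recognition flow.
    # Threshold values below are assumed examples.

    import math

    SECOND_THRESHOLD = 40.0   # minimum swipe distance (assumed units: mm)
    FIRST_THRESHOLD = 3       # swipe count needed before "stroking" is recognized

    swipe_count = 0  # E in FIG. 5; reset to 0 after a stroking operation is found

    def process_palm_gesture(start_points, end_points) -> bool:
        """Run one S504-S518 pass; return True when a stroking operation is detected.

        start_points / end_points: lists of (x, y) where the contacts appeared and
        disappeared; start_points[i] corresponds to end_points[i].
        """
        global swipe_count
        # S504/S508: a palm is assumed to produce more than 3 contact points.
        if len(start_points) <= 3 or len(end_points) <= 3:
            swipe_count = 0  # S522
            return False
        # S512: distance D between a contact's start and disappearance coordinates.
        (x0, y0), (x1, y1) = start_points[0], end_points[0]
        d = math.hypot(x1 - x0, y1 - y0)
        if d > SECOND_THRESHOLD:           # S514
            swipe_count += 1               # S516
        if swipe_count > FIRST_THRESHOLD:  # S518
            swipe_count = 0                # S522
            return True
        return False

    # Example: four palm swipes of ~100 mm each eventually register as stroking.
    starts = [(10, 10)] * 4
    ends = [(110, 10)] * 4
    for _ in range(4):
        detected = process_palm_gesture(starts, ends)
    print(detected)  # True on the fourth swipe (E exceeds the first threshold)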
Optionally, the holographic projection device may determine that the touch operation is a patting operation performed on the holographic projection virtual character by, but not limited to, the following method: the holographic projection device acquires a third position where the contact point stays on the touchpad at a third moment, and a fourth position where the contact point stays on the touchpad at a fourth moment; when the distance between the third position and the fourth position is less than the third threshold and the time interval between the third moment and the fourth moment is less than the fourth threshold, the touch operation is determined to be a patting operation performed on the holographic projection virtual character.
For example, a description is given below with reference to FIG. 6, taking the touch operation as a palm operating on the touchpad. When it is determined that the touchpad is in the to-be-detected state (S602) and the palm touches the touchpad, S604 detects whether the number of contact points is greater than 3. If so, S606 records the coordinates and creation moments of one contact point or of all contact points. S608 judges whether the number of contact points equals 0; if so, the disappearance coordinates and disappearance moments of the one contact point or of all contact points are recorded, and S610 computes the distance L between the contact point coordinates (that is, the distance between a contact point's start coordinates and disappearance coordinates) and the time interval T between the contact point's creation moment and disappearance moment. S612 judges whether L is less than the third threshold, and S614 judges whether T is less than the fourth threshold. When L is less than the third threshold and T is less than the fourth threshold, S616 can determine that the current operation is a patting operation. If L is greater than or equal to the third threshold, or T is greater than or equal to the fourth threshold, or after the current action has been judged to be a patting operation, the device returns to the to-be-detected state of S602. If the contact points do not disappear during detection, the positions of the contact points and the current moment are recorded periodically. A corresponding sketch of this check follows.
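Here is a corresponding sketch of the FIG. 6 patting check. As before, the threshold values are assumed for illustration; the description only requires that L is below the third threshold and T below the fourth.

    # Sketch of the FIG. 6 patting (tap) recognition: the contact must barely
    # move (L < third threshold) and be short-lived (T < fourth threshold).
    # Threshold values are assumed examples.

    import math

    THIRD_THRESHOLD = 10.0   # max movement distance for a pat (assumed mm)
    FOURTH_THRESHOLD = 0.3   # max contact duration for a pat (assumed seconds)

    def is_patting(start_pos, end_pos, start_time, end_time) -> bool:
        """S610-S616: compare distance L and interval T against the thresholds."""
        l = math.hypot(end_pos[0] - start_pos[0], end_pos[1] - start_pos[1])
        t = end_time - start_time
        return l < THIRD_THRESHOLD and t < FOURTH_THRESHOLD

    # Example: a contact appears at (50, 50) and vanishes 0.15 s later at (52, 51).
    print(is_patting((50, 50), (52, 51), 0.00, 0.15))  # True -> patting operation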
Optionally, after the holographic projection device recognizes the touch operation according to the touch information, the method further includes: the holographic projection device acquires the action area where the contact point interacts with the touchpad; and the holographic projection device determines, according to the position of the action area on the touchpad, the interaction information requested by the interaction request.
For example, as shown in FIG. 7, the touchpad is divided into six regions A, B, C, D, E, and F, and interactions produced in different regions correspond to different interaction information.
Optionally, after the touchpad is divided into different regions, the regions may be the same or different in size and shape, and the number of regions may be any positive integer greater than or equal to 1.
Optionally, the holographic projection device determining, according to the position of the action area on the touchpad, the interaction information requested by the interaction request includes at least one of the following two steps:
(1) The holographic projection device determines, according to the position of the action area on the touchpad, the character body part to interact with on the holographic projection virtual character, and determines the interaction information requested by the interaction request according to the character body part.
Optionally, the touchpad may be, but is not limited to being, divided into different regions, with different regions representing different character body parts. For example, as shown in FIG. 8, the touchpad is divided into six regions, each corresponding to the character's head, left arm, right arm, stomach, left leg, and right leg respectively. Touching different regions means interacting with different body parts of the character, for example, patting the head or stroking the stomach.
(2) The holographic projection device determines, according to the position of the action area on the touchpad, the interaction type of the interaction with the holographic projection virtual character, and determines the interaction information requested by the interaction request according to the interaction type.
Optionally, the interaction type may include, but is not limited to, an interaction form or an interaction mode. The interaction form may be, but is not limited to, an action performed by the character or a sound played, and the interaction mode may include, but is not limited to, the mode the character is currently in.
Optionally, the modes may be, but are not limited to, a variety of modes imitating human emotions, for example, a shy mode, an angry mode, a fear mode, a coquettish mode, and so on.
For example, FIG. 9 shows different positions of the touchpad set to perform different interaction forms (for example, dancing, singing, etc.), and FIG. 10 shows different positions of the touchpad set to different modes imitating human emotions (for example, a shy mode and a coquettish mode).
Optionally, the interaction control method further includes: the holographic projection device adjusts an emotion parameter of the holographic projection virtual character; if it is determined that the emotion parameter reaches a target value, the holographic projection device controls the character response mode of the holographic projection virtual character, and/or the holographic projection device unlocks a hidden skill of the holographic projection virtual character in the virtual scene.
It should be noted that the embodiments of this application do not limit when "the holographic projection device adjusts the emotion parameter of the holographic projection virtual character" is performed; for example, it may be performed while controlling the holographic projection virtual character to perform the interaction action matching the interaction request.
Optionally, the emotion parameter may be represented by a number, or in forms such as a number of red hearts or an energy value. The hidden skill may be, but is not limited to, a new animation, a new sound, or a new stage. For example, taking the hidden skill as a new animation, after the emotion parameter reaches a certain value, the character may be allowed to play the new animation; if the emotion parameter has not reached that value, the character is not allowed to play the new animation.
Through this embodiment, the holographic projection device acquires a touch operation through the touchpad, recognizes the touch operation, generates an interaction request according to the recognized touch operation, and performs an interaction action matching the interaction request according to the request. Since in this method the holographic projection device can be controlled directly through the touch operations received by the touchpad, the instructions for controlling the holographic projection device are enriched, and the flexibility of interacting with the holographic projection virtual character is improved.
Based on the scenario content introduced above, the different implementations of the interaction control method provided in the embodiments of this application are described in detail below in combination with that scenario content.
As an optional implementation, the holographic projection device recognizing the touch operation performed on the touchpad provided on the holographic projection device includes:
Step 1: The holographic projection device acquires the touch information generated by interaction between the contact point corresponding to the touch operation and the touchpad.
Step 2: The holographic projection device recognizes the touch operation according to the touch information.
Taking a finger interacting with the touchpad as an example, when the finger touches the touchpad, the contact location is the contact point. If the finger moves, the contact point moves with it. The movement trajectory of the contact point, the touch duration of the finger, and the position of the contact point are acquired as touch information, and the touch operation is recognized according to the touch information.
Through this embodiment, the touch information generated by the interaction between the contact point corresponding to the touch operation and the touchpad is acquired, and the touch operation is recognized according to that touch information, which improves the control flexibility of the holographic projection device.
As an optional implementation, the holographic projection device recognizing the touch operation according to the touch information includes:
Step 1: When it is determined according to the touch information that the number of swipe actions performed by the contact point on the touchpad is greater than a first threshold, the holographic projection device determines that the touch operation is a stroking operation performed on the holographic projection virtual character, where a swipe action is an action in which the distance the contact point moves on the touchpad is greater than a second threshold.
Step 2: When it is determined according to the touch information that the distance the contact point moves on the touchpad is less than a third threshold and the duration of the contact point acting on the touchpad is less than a fourth threshold, the holographic projection device determines that the touch operation is a patting operation performed on the holographic projection virtual character.
Optionally, the first threshold, second threshold, third threshold, and fourth threshold may be, but are not limited to being, set based on empirical values. Taking a contact point formed by a finger touching the touchpad as an example, as shown in FIG. 3, after the finger touches the touchpad and forms contact point A, the finger moves from contact point A to contact point B over a distance a. The distance a is then compared with the thresholds: if a is greater than or equal to the second threshold, the operation is a swipe operation; if a is less than the third threshold, the operation is not a swipe operation and must be judged according to other conditions. Optionally, the action duration may be, but is not limited to, the duration of a single contact. For example, if the finger touches the touchpad for 1 second, the action duration is 1 second.
Further, when the operation is determined to be a swipe operation and multiple swipe operations are received, the number of received swipe operations is judged. If the number of swipe operations is greater than the first threshold, the holographic projection device can conclude that the user has performed multiple swipe operations on the touchpad, and therefore that the user is performing a stroking operation. If the operation is not a swipe operation and its action duration is less than the fourth threshold, the holographic projection device can conclude that the user has performed a single short touch on the touchpad, and therefore that the user is performing a patting action.
Through this embodiment, by comparing the touch information with the first, second, third, and fourth thresholds, it is determined whether the current touch operation is a stroking operation or a patting operation, which improves the accuracy of judging the current touch operation and thus improves the control flexibility of the holographic projection device.
As an optional implementation, the holographic projection device determining that the touch operation is a stroking operation performed on the holographic projection virtual character includes:
Step 1: The holographic projection device acquires a first position where the contact point stays on the touchpad at a first moment, and a second position where the contact point stays on the touchpad at a second moment, where the time interval between the first moment and the second moment is one frame period.
Step 2: When it is determined that the distance between the first position and the second position is greater than the second threshold, the holographic projection device recognizes the movement of the contact point from the first position to the second position as a swipe action and increments the count of swipe actions by one.
Step 3: When it is determined that the number of swipe actions is greater than the first threshold, the holographic projection device determines that the touch operation is a stroking operation performed on the holographic projection virtual character.
For example, the case where a palm touches the touchscreen is described with reference to FIG. 5. The touchpad is in the to-be-detected state (S502). At this point, the palm touches the touchpad, and S504 judges whether the number of contact points is greater than 3. If so, S506 records the contact point coordinates; the coordinates of one contact point may be selected, or the coordinates of all contact points may be recorded. When the contact points disappear, S508 judges whether the number of contact points before disappearance was greater than 3. If so, S510 records the coordinates where the contact points disappeared; the contact point whose start coordinates were recorded may be selected and its disappearance coordinates recorded, or the disappearance coordinates of all contact points may be recorded. The start coordinates of each contact point correspond to its disappearance coordinates. S512 computes the distance D between the start coordinates and disappearance coordinates of a contact point, and S514 judges whether D is greater than the second threshold. If so, S516 increments the swipe action count E by 1. S518 judges whether E is greater than the first threshold; if so, the currently performed action is judged to be a stroking operation. In this process, if the number of contact points in S504 or S508 is not greater than 3, or after the current action is determined to be a stroking operation, S522 is performed to reset E to 0. It should be noted that the value 3 in the judgment of whether the number of contact points is greater than 3 can be set flexibly. Also, the above process judges whether a palm is performing a stroking operation. To judge whether a finger (or another body part) is stroking, steps S504 and S508 in FIG. 5 may be, but are not limited to being, removed, so that there is no need to judge whether the number of contact points is greater than 3; the remaining steps in FIG. 5 are sufficient to determine whether the current operation is a stroking operation.
Through this embodiment, the above method determines whether the current operation is a stroking operation, which improves the accuracy of judging the current touch operation and thus improves the control flexibility of the holographic projection device.
As an optional implementation, the holographic projection device determining that the touch operation is a patting operation performed on the holographic projection virtual character includes:
Step 1: The holographic projection device acquires a third position where the contact point stays on the touchpad at a third moment, and a fourth position where the contact point stays on the touchpad at a fourth moment.
Step 2: When it is determined that the distance between the third position and the fourth position is less than the third threshold and the time interval between the third moment and the fourth moment is less than the fourth threshold, the holographic projection device determines that the touch operation is a patting operation performed on the holographic projection virtual character.
For example, a description is given below with reference to FIG. 6, taking the touch operation as a palm operating on the touchpad. The touchpad is in the to-be-detected state S602. After the palm touches the touchpad, S604 detects whether the number of contact points is greater than 3. If so, S606 records the coordinates and creation moments of one contact point or of all contact points. S608 judges whether the number of contact points equals 0; if so, the disappearance coordinates and disappearance moments of the one contact point or of all contact points are recorded. S610 computes the distance L between the contact point coordinates and the time interval T between the contact point's creation moment and disappearance moment. S612 judges whether L is less than the third threshold, and S614 judges whether T is less than the fourth threshold. When L is less than the third threshold and T is less than the fourth threshold, S616 can determine that the current operation is a patting operation. If L is greater than or equal to the third threshold, or T is greater than or equal to the fourth threshold, or after the current action has been judged to be a patting operation, the device returns to the to-be-detected state of S602. If the contact points do not disappear during detection, the positions of the contact points and the current moment are recorded periodically.
Through this embodiment, the above method determines whether the current operation is a patting operation, which improves the accuracy of judging the current touch operation and thus improves the control flexibility of the holographic projection device.
As an optional implementation, the holographic projection device acquiring the fourth position where the contact point stays on the touchpad at the fourth moment includes:
Step 1: After acquiring the third position where the contact point stays on the touchpad, the holographic projection device performs the following steps in each frame period until it detects that the touch operation ends:
Step 2: The holographic projection device acquires the current position where the contact point stays on the touchpad at the target moment in the current frame period, where the time interval between the third moment and the target moment is N frame periods, N ≥ 1, and N is an integer.
Step 3: When it is detected that the touch operation has not ended, the holographic projection device acquires the target position where the contact point stays on the touchpad at the target moment in the next frame period after the current frame period, takes the target moment in the next frame period as the target moment in the current frame period, and takes the target position where the contact point stays in the next frame period as the current position.
Step 4: When it is detected that the touch operation has ended, the holographic projection device takes the target moment in the current frame period as the fourth moment and the current position as the fourth position.
Step 5: It is determined that the touch operation ends when the contact area of the contact point on the touchpad is less than or equal to a fifth threshold.
For example, continuing with the case of a finger touching the touchpad: when the finger touches the touchpad, the current position and current time of the contact point are acquired. If the touch operation has not ended, the touch position and touch time of the contact point are acquired in the next frame period, and so on until the touch operation ends, at which point the touch position and touch time of the moment just before the end are acquired; the touch position just before the end is determined as the fourth position, and the touch time just before the end is determined as the fourth moment. Since the contact area between the finger and the touchpad increases as the finger's pressure increases, a contact area that shrinks below an empirical value indicates that the touch operation has ended and the finger has left the touchpad. A minimal sketch of this loop follows.
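A minimal Python sketch of this frame-by-frame tracking, assuming per-frame samples of (time, position, contact area); the function name, sample format, and area threshold value are illustrative assumptions.

    # Sketch of the per-frame loop that yields the fourth moment and position.
    # A frame sample is (time, (x, y), contact_area). The fifth threshold is an
    # assumed contact-area cutoff below which the touch is considered ended.

    FIFTH_THRESHOLD = 5.0  # assumed minimum contact area (e.g. mm^2)

    def track_until_release(frames):
        """Return (fourth_moment, fourth_position): the last sample before release."""
        fourth_moment, fourth_position = None, None
        for t, pos, area in frames:
            if area <= FIFTH_THRESHOLD:   # touch operation has ended
                break
            # Touch still in progress: this frame becomes the current moment/position.
            fourth_moment, fourth_position = t, pos
        return fourth_moment, fourth_position

    # Example: pressure (and thus area) drops as the finger lifts on the 4th frame.
    samples = [(0.00, (50, 50), 40.0),
               (0.02, (51, 50), 35.0),
               (0.04, (51, 51), 20.0),
               (0.06, (51, 51), 2.0)]   # area below threshold -> released
    print(track_until_release(samples))  # (0.04, (51, 51))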
Through this embodiment, the fourth moment and the fourth position are determined by the above method, which improves the accuracy of acquiring the fourth moment and the fourth position, and further improves the control flexibility of the holographic projection device.
As an optional implementation, after the touch operation is recognized according to the touch information, the method further includes:
Step 1: The holographic projection device acquires the action area where the contact point interacts with the touchpad.
Step 2: The holographic projection device determines, according to the position of the action area on the touchpad, the interaction information requested by the interaction request.
For example, as shown in FIG. 7, the touchpad is divided into six regions A, B, C, D, E, and F, and interactions produced in different regions correspond to different interaction information.
Optionally, after the touchpad is divided into different regions, the regions may be the same or different in size and shape, and the number of regions may be any positive integer greater than or equal to 1.
Through this embodiment, the interaction information is determined according to the position on the touchpad of the action area where the contact point and the touchpad interact, which improves the control flexibility of the holographic projection device.
As an optional implementation, the holographic projection device determining, according to the position of the action area on the touchpad, the interaction information requested by the interaction request includes at least one of the following two steps:
(1) The holographic projection device determines, according to the position of the action area on the touchpad, the character body part to interact with on the holographic projection virtual character, and determines the interaction information requested by the interaction request according to the character body part;
(2) The holographic projection device determines, according to the position of the action area on the touchpad, the interaction type of the interaction with the holographic projection virtual character, and determines the interaction information requested by the interaction request according to the interaction type.
Optionally, the interaction type may include, but is not limited to, an interaction form or an interaction mode. The interaction form may be, but is not limited to, an action performed by the character or a sound played, and the interaction mode may include, but is not limited to, the mode the character is currently in.
Optionally, the modes may be, but are not limited to, a variety of modes imitating human emotions, for example, a shy mode, an angry mode, a fear mode, a coquettish mode, and so on.
For example, as shown in FIG. 8, the touchpad is divided into six regions, each corresponding to the character's head, left arm, right arm, stomach, left leg, and right leg respectively. Touching different regions means interacting with different body parts of the character, for example, patting the head or stroking the stomach. Alternatively, as shown in FIGS. 9 and 10: FIG. 9 shows different positions of the touchpad set to perform different interaction forms (for example, dancing, singing, etc.), while FIG. 10 shows different positions of the touchpad set to different modes imitating human emotions (for example, a shy mode and a coquettish mode).
Through this embodiment, the character body part or interaction type for interacting with the holographic projection virtual character is determined according to the position of the action area on the touchpad, which improves the control flexibility of the holographic projection device.
As an optional implementation, the holographic projection device controlling the holographic projection virtual character to perform an interaction action matching the interaction request includes at least one of the following:
(1) The holographic projection device controls the holographic projection virtual character to play an animation matching the interaction request;
(2) The holographic projection device controls the holographic projection virtual character to play audio matching the interaction request.
Optionally, controlling the holographic projection virtual character to play audio may be, but is not limited to, controlling the mouth of the holographic projection virtual character to move while a sound playback device plays the audio.
For example, after the holographic projection virtual character is projected, the character can be controlled to dance or perform actions, or the character's mouth can be controlled to move while audio is played, simulating the holographic projection virtual character making sounds.
Through this embodiment, controlling the holographic projection virtual character to play animations or audio improves the flexibility of the holographic projection virtual character.
As an optional implementation, the interaction control method further includes:
The holographic projection device adjusts an emotion parameter of the holographic projection virtual character; if it is determined that the emotion parameter reaches a target value, the holographic projection device controls the character response mode of the holographic projection virtual character, and/or the holographic projection device unlocks a hidden skill of the holographic projection virtual character in the virtual scene.
Optionally, the emotion parameter may be represented by a number, or in forms such as a number of red hearts or an energy value. The hidden skill may be, but is not limited to, a new animation, a new sound, or a new stage. For example, taking the hidden skill as a new animation, after the emotion parameter reaches a certain value, the character may be allowed to play the new animation; if the emotion parameter has not reached that value, the character is not allowed to play the new animation.
For example, the emotion parameter is represented by the number of red hearts. After interaction with the holographic projection virtual character, the number of red hearts reaches a certain value; at this point, the holographic projection virtual character can produce new changes, for example, dancing, singing, or changing clothes.
Through this embodiment, whether to unlock the hidden skill of the holographic projection virtual character is decided according to the level of the emotion parameter, which improves the control flexibility of the holographic projection device.
It should be noted that, for simplicity of description, the foregoing method embodiments are all expressed as a series of action combinations. However, a person skilled in the art should know that this application is not limited by the described order of actions, because according to this application, some steps may be performed in another order or simultaneously. Furthermore, a person skilled in the art should also know that the embodiments described in the specification are all preferred embodiments, and the actions and modules involved are not necessarily required by this application.
According to another aspect of the embodiments of this application, a holographic projection device for implementing the above interaction control method is further provided. As an optional implementation, as shown in FIG. 11, the holographic projection device includes:
a touchpad 1102, configured to collect touch information generated by a touch operation;
a processor 1104, connected to the touchpad and the projection light engine, configured to recognize the touch operation according to the touch information, further configured to generate an interaction request according to the recognized touch operation, where the interaction request is used to request interaction with the holographic projection virtual character, and further configured to send the interaction request to the projection light engine;
a projection light engine 1106, configured to control projection of the holographic projection virtual character onto the holographic projection film; and
a holographic projection film 1108, configured to present the holographic projection virtual character;
the projection light engine 1106 being further configured to control, according to the interaction request, the holographic projection virtual character to perform an interaction action matching the interaction request.
Optionally, there may be, but is not limited to being, one or more touchpads. When there are multiple touchpads, each touchpad may be, but is not limited to being, mapped to a different position of the holographic projection virtual character.
Optionally, the processor may include, but is not limited to: a receiving module, configured to receive the touch information collected by the touchpad 1102; a computing module, configured to compute, according to the received touch information, the interaction action corresponding to the touch information; a storage module, configured to store the interaction actions that the holographic projection virtual character may perform; and a transmitter, configured to send the animation information, corresponding to the interaction action that the holographic projection virtual object performs according to the touch information, to the projection light engine. After acquiring the animation information, the projection light engine projects it onto the holographic projection film to display the animation of the holographic projection virtual character.
Optionally, the above holographic projection device may be, but is not limited to being, applied in the process of interacting with the holographic projection virtual character in the holographic projection device, for example, in the process of interacting with a projected holographic virtual cartoon character, or in the process of interacting with a projected holographic virtual building.
The holographic projection device is described below taking its application to interacting with a holographic virtual cartoon character as an example. FIG. 12 shows two photographs of an optional holographic projection device. In FIG. 12, a hand can touch the touchpad area of the holographic projection device on the left, thereby generating touch information. After acquiring the touch information, the holographic projection device judges it. For example, when short tap information is acquired, the action of the cartoon character corresponding to that touch information is looked up from the storage location according to the information, or the acquired short tap information is judged to determine the cartoon character action corresponding to it, so that the action corresponding to the short tap information is projected above the touchpad. As with the cartoon character in the holographic projection device on the right of FIG. 12, when the holographic projection device receives short tap information, it projects an animation of the character being hit and falling down, thereby forming an interaction between the user and the cartoon character.
For other implementations of this embodiment, refer to the implementations described in the above interaction control method; details are not repeated here. It should be noted that, for technical details of the holographic projection device, refer to the method embodiments.
According to yet another aspect of the embodiments of this application, an interaction control apparatus for implementing the above interaction control method is further provided. As shown in FIG. 13, the apparatus is applied to a holographic projection device and includes:
(1) a recognition unit 1302, configured to recognize a touch operation performed on a touchpad, where the holographic projection device is configured to present a holographic projection virtual character, and the touchpad is provided on the holographic projection device;
(2) a generation unit 1304, configured to generate an interaction request according to the recognized touch operation, where the interaction request is used to request interaction with the holographic projection virtual character;
(3) a control unit 1306, configured to control the holographic projection virtual character to perform an interaction action matching the interaction request.
As an optional implementation, to improve the control flexibility of the holographic projection device, the recognition unit may include:
(1) a first acquisition module, configured to acquire the touch information generated by interaction between the contact point corresponding to the touch operation and the touchpad;
(2) a recognition module, configured to recognize the touch operation according to the touch information.
As an optional implementation, to improve the control flexibility of the holographic projection device, the recognition module may include:
(1) a first determination submodule, configured to determine, when it is determined according to the touch information that the number of swipe actions performed by the contact point on the touchpad is greater than a first threshold, that the touch operation is a stroking operation performed on the holographic projection virtual character, where a swipe action is an action in which the distance the contact point moves on the touchpad is greater than a second threshold;
(2) a second determination submodule, configured to determine, when it is determined according to the touch information that the distance the contact point moves on the touchpad is less than a third threshold and the duration of the contact point acting on the touchpad is less than a fourth threshold, that the touch operation is a patting operation performed on the holographic projection virtual character.
As an optional implementation, to improve the control flexibility of the holographic projection device, the first determination submodule is further configured to perform the following steps:
acquiring a first position where the contact point stays on the touchpad at a first moment, and a second position where the contact point stays on the touchpad at a second moment, where the time interval between the first moment and the second moment is one frame period;
when it is determined that the distance between the first position and the second position is greater than the second threshold, recognizing the movement of the contact point from the first position to the second position as a swipe action, and incrementing the count of swipe actions by one;
when it is determined that the number of swipe actions is greater than the first threshold, determining that the touch operation is a stroking operation performed on the holographic projection virtual character.
As an optional implementation, to improve the control flexibility of the holographic projection device, the second determination submodule is further configured to perform the following steps:
acquiring a third position where the contact point stays on the touchpad at a third moment, and a fourth position where the contact point stays on the touchpad at a fourth moment;
when it is determined that the distance between the third position and the fourth position is less than the third threshold and the time interval between the third moment and the fourth moment is less than the fourth threshold, determining that the touch operation is a patting operation performed on the holographic projection virtual character.
As an optional implementation, to improve the control flexibility of the holographic projection device, the second determination submodule is further configured to perform the following steps:
after the third position where the contact point stays on the touchpad is acquired, performing the following steps in each frame period until it is detected that the touch operation ends:
acquiring the current position where the contact point stays on the touchpad at the target moment in the current frame period, where the time interval between the third moment and the target moment is N frame periods, N ≥ 1, and N is an integer;
when it is detected that the touch operation has not ended, acquiring the target position where the contact point stays on the touchpad at the target moment in the next frame period after the current frame period, taking the target moment in the next frame period as the target moment in the current frame period, and taking the target position where the contact point stays in the next frame period as the current position;
when it is detected that the touch operation has ended, taking the target moment in the current frame period as the fourth moment, and taking the current position as the fourth position;
where it is determined that the touch operation ends when the contact area of the contact point on the touchpad is less than or equal to a fifth threshold.
As an optional implementation, to improve the control flexibility of the holographic projection device, the recognition unit further includes:
(1) a second acquisition module, configured to acquire, after the touch operation is recognized according to the touch information, the action area where the contact point interacts with the touchpad;
(2) a determination module, configured to determine, according to the position of the action area on the touchpad, the interaction information requested by the interaction request.
As an optional implementation, to improve the control flexibility of the holographic projection device, the determination module includes at least one of the following:
a third determination submodule, configured to determine, according to the position of the action area on the touchpad, the character body part to interact with on the holographic projection virtual character, and determine the interaction information requested by the interaction request according to the character body part;
a fourth determination submodule, configured to determine, according to the position of the action area on the touchpad, the interaction type of the interaction with the holographic projection virtual character, and determine the interaction information requested by the interaction request according to the interaction type.
As an optional implementation, to improve the control flexibility of the holographic projection device, the control unit includes at least one of the following:
a first control module, configured to control the holographic projection virtual character to play an animation matching the interaction request;
a second control module, configured to control the holographic projection virtual character to play audio matching the interaction request.
As an optional implementation, to improve the control flexibility of the holographic projection device, the apparatus further includes:
(1) an adjustment unit, configured to adjust the emotion parameter of the holographic projection virtual character, where if it is determined that the emotion parameter reaches the target value, the holographic projection device controls the character response mode of the holographic projection virtual character, and/or the holographic projection device unlocks the hidden skill of the holographic projection virtual character in the virtual scene.
It should be noted that, for technical details of the interaction control apparatus, refer to the method embodiments.
According to yet another aspect of the embodiments of this application, a storage medium is further provided, storing a computer program, where the computer program is configured to perform the steps in any one of the above method embodiments when run.
Optionally, in this embodiment, the storage medium may be configured to store a computer program for performing the following steps:
S1: recognizing a touch operation performed on a touchpad, where the holographic projection device is configured to present a holographic projection virtual character, and the touchpad is provided on the holographic projection device;
S2: generating an interaction request according to the recognized touch operation, where the interaction request is used to request interaction with the holographic projection virtual character;
S3: controlling the holographic projection virtual character to perform an interaction action matching the interaction request.
Optionally, in this embodiment, a person of ordinary skill in the art can understand that all or some of the steps in the various methods of the above embodiments may be completed by a program instructing hardware related to a terminal device. The program may be stored in a computer-readable storage medium, and the storage medium may include: a flash drive, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disc, and the like.
In addition, an embodiment of this application further provides an electronic device, including a memory and a processor, where the memory stores a computer program and the processor is configured to perform, through the computer program, the interaction control method provided in the above embodiments.
An embodiment of this application further provides a computer program product including instructions, which, when run on a server, causes the server to perform the interaction control method provided in the above embodiments.
The serial numbers of the above embodiments of this application are merely for description and do not represent the superiority or inferiority of the embodiments.
If the integrated units in the above embodiments are implemented in the form of software functional units and sold or used as independent products, they may be stored in the above computer-readable storage medium. Based on this understanding, the technical solutions of this application essentially, or the part contributing to the existing technology, or all or part of the technical solutions, may be embodied in the form of a software product. The computer software product is stored in a storage medium and includes several instructions for enabling one or more computer devices (which may be personal computers, servers, network devices, or the like) to perform all or some of the steps of the methods described in the embodiments of this application.
The embodiments in this specification are described in a progressive manner. Each embodiment focuses on its differences from the other embodiments, and for identical or similar parts among the embodiments, cross-reference may be made. For the apparatus disclosed in the embodiments, since it corresponds to the method disclosed in the embodiments, its description is relatively brief, and for relevant details, refer to the description of the method part.
A person skilled in the art may further appreciate that the units and algorithm steps of the examples described with reference to the embodiments disclosed herein can be implemented in electronic hardware, computer software, or a combination of the two. To clearly illustrate the interchangeability of hardware and software, the composition and steps of each example have been described generally in terms of functions in the above description. Whether these functions are performed in hardware or software depends on the particular application and design constraints of the technical solution. A person skilled in the art may use different methods to implement the described functions for each particular application, but such implementations should not be considered beyond the scope of this application.
The steps of the methods or algorithms described with reference to the embodiments disclosed herein may be implemented directly in hardware, in a software module executed by a processor, or in a combination of the two. The software module may reside in random access memory (RAM), memory, read-only memory (ROM), electrically programmable ROM, electrically erasable programmable ROM, a register, a hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the art.
The above description of the disclosed embodiments enables a person skilled in the art to implement or use this application. Various modifications to these embodiments will be apparent to a person skilled in the art, and the general principles defined herein may be implemented in other embodiments without departing from the core idea or scope of this application. Therefore, this application will not be limited to the embodiments shown herein, but shall conform to the widest scope consistent with the principles and novel features disclosed herein.

Claims (16)

  1. An interaction control method, comprising:
    recognizing, by a holographic projection device, a touch operation performed on a touchpad, wherein the holographic projection device is configured to present a holographic projection virtual character, and the touchpad is provided on the holographic projection device;
    generating, by the holographic projection device, an interaction request according to the recognized touch operation, wherein the interaction request is used to request interaction with the holographic projection virtual character; and
    controlling, by the holographic projection device, the holographic projection virtual character to perform an interaction action matching the interaction request.
  2. The method according to claim 1, wherein the recognizing, by the holographic projection device, a touch operation performed on a touchpad comprises:
    acquiring, by the holographic projection device, touch information generated by interaction between a contact point corresponding to the touch operation and the touchpad; and
    recognizing, by the holographic projection device, the touch operation according to the touch information.
  3. The method according to claim 2, wherein the recognizing, by the holographic projection device, the touch operation according to the touch information comprises:
    when it is determined according to the touch information that the number of swipe actions performed by the contact point on the touchpad is greater than a first threshold, determining, by the holographic projection device, that the touch operation is a stroking operation performed on the holographic projection virtual character, wherein a swipe action is an action in which the distance the contact point moves on the touchpad is greater than a second threshold; and
    when it is determined according to the touch information that the distance the contact point moves on the touchpad is less than a third threshold and the duration of the contact point acting on the touchpad is less than a fourth threshold, determining, by the holographic projection device, that the touch operation is a patting operation performed on the holographic projection virtual character.
  4. The method according to claim 3, wherein the determining, by the holographic projection device, that the touch operation is a stroking operation performed on the holographic projection virtual character comprises:
    acquiring, by the holographic projection device, a first position where the contact point stays on the touchpad at a first moment, and a second position where the contact point stays on the touchpad at a second moment, wherein the time interval between the first moment and the second moment is one frame period;
    when it is determined that the distance between the first position and the second position is greater than the second threshold, recognizing, by the holographic projection device, the movement of the contact point from the first position to the second position as the swipe action, and incrementing the count of swipe actions by one; and
    when it is determined that the number of swipe actions is greater than the first threshold, determining, by the holographic projection device, that the touch operation is the stroking operation performed on the holographic projection virtual character.
  5. The method according to claim 3, wherein the determining, by the holographic projection device, that the touch operation is a patting operation performed on the holographic projection virtual character comprises:
    acquiring, by the holographic projection device, a third position where the contact point stays on the touchpad at a third moment, and a fourth position where the contact point stays on the touchpad at a fourth moment; and
    when it is determined that the distance between the third position and the fourth position is less than the third threshold and the time interval between the third moment and the fourth moment is less than the fourth threshold, determining, by the holographic projection device, that the touch operation is the patting operation performed on the holographic projection virtual character.
  6. The method according to claim 5, wherein the acquiring, by the holographic projection device, the fourth position where the contact point stays on the touchpad at the fourth moment comprises:
    after the holographic projection device acquires the third position where the contact point stays on the touchpad, performing, by the holographic projection device, the following steps in each frame period until it is detected that the touch operation ends:
    acquiring, by the holographic projection device, a current position where the contact point stays on the touchpad at a target moment in the current frame period, wherein the time interval between the third moment and the target moment is N frame periods, N ≥ 1, and N is an integer;
    when it is detected that the touch operation has not ended, acquiring, by the holographic projection device, a target position where the contact point stays on the touchpad at the target moment in the next frame period after the current frame period, taking the target moment in the next frame period as the target moment in the current frame period, and taking the target position where the contact point stays in the next frame period as the current position; and
    when it is detected that the touch operation has ended, taking, by the holographic projection device, the target moment in the current frame period as the fourth moment, and taking the current position as the fourth position,
    wherein it is determined that the touch operation ends when the contact area of the contact point on the touchpad is less than or equal to a fifth threshold.
  7. The method according to claim 2, further comprising, after the holographic projection device recognizes the touch operation according to the touch information:
    acquiring, by the holographic projection device, the action area where the contact point interacts with the touchpad; and
    determining, by the holographic projection device according to the position of the action area on the touchpad, the interaction information requested by the interaction request.
  8. The method according to claim 7, wherein the determining, by the holographic projection device according to the position of the action area on the touchpad, the interaction information requested by the interaction request comprises:
    determining, by the holographic projection device according to the position of the action area on the touchpad, a character body part to interact with on the holographic projection virtual character, and determining the interaction information requested by the interaction request according to the character body part;
    and/or,
    determining, by the holographic projection device according to the position of the action area on the touchpad, an interaction type of the interaction with the holographic projection virtual character, and determining the interaction information requested by the interaction request according to the interaction type.
  9. The method according to any one of claims 1 to 8, wherein the controlling, by the holographic projection device, the holographic projection virtual character to perform an interaction action matching the interaction request comprises:
    controlling, by the holographic projection device, the holographic projection virtual character to play an animation matching the interaction request;
    and/or,
    controlling, by the holographic projection device, the holographic projection virtual character to play audio matching the interaction request.
  10. The method according to any one of claims 1 to 8, further comprising:
    adjusting, by the holographic projection device, an emotion parameter of the holographic projection virtual character; and
    if it is determined that the emotion parameter reaches a target value, controlling, by the holographic projection device, the character response mode of the holographic projection virtual character, and/or unlocking, by the holographic projection device, a hidden skill of the holographic projection virtual character in the virtual scene.
  11. A holographic projection device, comprising:
    a holographic projection film, configured to present a holographic projection virtual character;
    a projection light engine, configured to control projection of the holographic projection virtual character onto the holographic projection film;
    a touchpad, configured to collect touch information generated by a touch operation; and
    a processor, connected to the touchpad and the projection light engine, configured to recognize the touch operation according to the touch information, further configured to generate an interaction request according to the recognized touch operation, wherein the interaction request is used to request interaction with the holographic projection virtual character, and further configured to send the interaction request to the projection light engine,
    the projection light engine being further configured to control, according to the interaction request, the holographic projection virtual character to perform an interaction action matching the interaction request.
  12. The device according to claim 11, wherein the processor comprises:
    a positioning module, configured to determine, according to the touch information, the action area of the touch operation on the touchpad;
    a processing module, connected to the positioning module, configured to determine, according to the position of the action area, the interaction information requested by the interaction request; and
    a communication module, connected to the processing module, configured to send the interaction information to the projection light engine.
  13. The device according to claim 12, wherein the projection light engine comprises:
    a communication interface, configured to receive the interaction information; and
    a projection component, connected to the communication interface, configured to adjust, according to the interaction information, the action of the projected holographic projection virtual character.
  14. An interaction control apparatus, applied to a holographic projection device, comprising:
    a recognition unit, configured to recognize a touch operation performed on a touchpad, wherein the holographic projection device is configured to present a holographic projection virtual character, and the touchpad is provided on the holographic projection device;
    a generation unit, configured to generate an interaction request according to the recognized touch operation, wherein the interaction request is used to request interaction with the holographic projection virtual character; and
    a control unit, configured to control the holographic projection virtual character to perform an interaction action matching the interaction request.
  15. A storage medium, comprising a stored computer program, wherein the computer program, when run, performs the interaction control method according to any one of claims 1 to 10.
  16. An electronic device, comprising a memory and a processor, wherein the memory stores a computer program, and the processor is configured to perform, through the computer program, the interaction control method according to any one of claims 1 to 10.
PCT/CN2019/114396 2018-12-04 2019-10-30 Interaction control method and apparatus, storage medium and electronic device WO2020114157A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
EP19891843.5A EP3893099A4 (en) 2018-12-04 2019-10-30 INTERACTION CONTROL METHOD AND APPARATUS, MEMORY MEDIA AND ELECTRONIC APPARATUS
US17/075,441 US11947789B2 (en) 2018-12-04 2020-10-20 Interactive control method and apparatus, storage medium, and electronic device

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201811475755.1 2018-12-04
CN201811475755.1A CN110147196A (zh) 2018-12-04 2018-12-04 Interaction control method and apparatus, storage medium and electronic device

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US17/075,441 Continuation US11947789B2 (en) 2018-12-04 2020-10-20 Interactive control method and apparatus, storage medium, and electronic device

Publications (1)

Publication Number Publication Date
WO2020114157A1 true WO2020114157A1 (zh) 2020-06-11

Family

ID=67588381

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/114396 WO2020114157A1 (zh) 2018-12-04 2019-10-30 交互控制方法和装置、存储介质及电子装置

Country Status (4)

Country Link
US (1) US11947789B2 (zh)
EP (1) EP3893099A4 (zh)
CN (1) CN110147196A (zh)
WO (1) WO2020114157A1 (zh)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110147196A (zh) 2018-12-04 2019-08-20 腾讯科技(深圳)有限公司 Interaction control method and apparatus, storage medium and electronic device
CN110543239A * 2019-09-05 2019-12-06 重庆瑞信展览有限公司 Comprehensive application system for digital interactive exhibitions
CN111045587B * 2019-10-29 2024-02-09 咪咕互动娱乐有限公司 Game control method, electronic device, and computer-readable storage medium
CN111862280A * 2020-08-26 2020-10-30 网易(杭州)网络有限公司 Virtual character control method, system, medium, and electronic device
CN113900511A * 2021-09-17 2022-01-07 广州励丰文化科技股份有限公司 Projection interaction system and method
CN114067032A * 2021-11-05 2022-02-18 常熟常春汽车零部件有限公司 In-vehicle holographic cartoon image simulation interaction method

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130139062A1 (en) * 2011-11-28 2013-05-30 Qualcomm Innovation Center, Inc. Audio Indicator of Position Within a User Interface
CN106612423A (zh) * 2015-12-31 2017-05-03 北京数科技有限公司 一种针对投影图像的触控方法及装置
CN110147196A (zh) * 2018-12-04 2019-08-20 腾讯科技(深圳)有限公司 交互控制方法和装置、存储介质及电子装置

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8943541B2 (en) * 2010-10-11 2015-01-27 Eldon Technology Limited Holographic 3D display
CN103902124B (zh) * 2014-03-11 2017-01-04 广州中国科学院先进技术研究所 基于轨迹识别的三维全息互动系统及其控制方法
JP6307627B2 (ja) * 2014-03-14 2018-04-04 株式会社ソニー・インタラクティブエンタテインメント 空間感知を備えるゲーム機
US20160085332A1 (en) * 2014-09-23 2016-03-24 Continental Automotive Systems Inc. Touch sensitive holographic display system and method of using the display system
GB2532234B (en) * 2014-11-12 2019-07-24 De Montfort Univ Image display system
US9911235B2 (en) * 2014-11-14 2018-03-06 Qualcomm Incorporated Spatial interaction in augmented reality
CN106935163A (zh) * 2017-03-22 2017-07-07 青岛中鉴高科信息有限公司 一种基于轨迹识别的三维全息互动人像全息投影展示系统
CN107340859B (zh) * 2017-06-14 2021-04-06 北京光年无限科技有限公司 多模态虚拟机器人的多模态交互方法和系统
CN107831905A (zh) * 2017-11-30 2018-03-23 北京光年无限科技有限公司 一种基于全息投影设备的虚拟形象交互方法及系统
CN107908385B (zh) * 2017-12-01 2022-03-15 北京光年无限科技有限公司 一种基于全息的多模态交互系统及方法
CN108198238B (zh) * 2018-01-30 2021-06-22 北京小米移动软件有限公司 全息投影设备、方法、装置及计算机可读存储介质

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130139062A1 (en) * 2011-11-28 2013-05-30 Qualcomm Innovation Center, Inc. Audio Indicator of Position Within a User Interface
CN106612423A (zh) * 2015-12-31 2017-05-03 北京数科技有限公司 一种针对投影图像的触控方法及装置
CN110147196A (zh) * 2018-12-04 2019-08-20 腾讯科技(深圳)有限公司 交互控制方法和装置、存储介质及电子装置

Also Published As

Publication number Publication date
US11947789B2 (en) 2024-04-02
EP3893099A1 (en) 2021-10-13
CN110147196A (zh) 2019-08-20
US20210034212A1 (en) 2021-02-04
EP3893099A4 (en) 2022-04-27

Similar Documents

Publication Publication Date Title
WO2020114157A1 (zh) Interaction control method and apparatus, storage medium and electronic device
US20210272344A1 (en) Virtual reality presentation of body postures of avatars
JP5632474B2 (ja) ユーザーから学習した入力を介し視覚表示を実写のようにする方法及びシステム
JP5859456B2 (ja) プレゼンテーション用カメラ・ナビゲーション
KR101643020B1 (ko) 애니메이션을 체이닝하는 방법 및 애니메이션 블렌딩 장치
KR101704848B1 (ko) 플레이어의 표현에 기반하는 비주얼 표현의 표현 방법
CN111045511B (zh) Gesture-based control method and terminal device
CN110555507B (zh) Interaction method and apparatus for a virtual robot, electronic device, and storage medium
TW201733345A (zh) Communication techniques using interactive avatars (II)
WO2020024692A1 (zh) Human-computer interaction method and apparatus
US11681372B2 (en) Touch enabling process, haptic accessory, and core haptic engine to enable creation and delivery of tactile-enabled experiences with virtual objects
WO2021196644A1 (zh) Method, apparatus, and device for driving an interactive object, and storage medium
TW202139052A (zh) Method, apparatus, and device for driving an interactive object, and storage medium
CN109243248A (zh) Virtual piano based on a 3D depth camera module and implementation method thereof
CN110794964A (zh) Interaction method and apparatus for a virtual robot, electronic device, and storage medium
US20240231600A1 (en) Interaction with a projected virtual character using touch panel
Mancini et al. Copying behaviour of expressive motion
TWI814318B (zh) 用於使用模擬角色訓練模型以用於將遊戲角色之臉部表情製成動畫之方法以及用於使用三維(3d)影像擷取來產生遊戲角色之臉部表情之標籤值之方法
TWM631301U (zh) Interactive platform system
CN113873162A (zh) Shooting method and apparatus, electronic device, and readable storage medium
CN117339212A (zh) Method for controlling interaction of a virtual game character, storage medium, and electronic apparatus
CN115811623A (zh) Live streaming method and system based on a virtual avatar
TWI583198B (zh) Communication techniques using interactive avatars
Ackovska Gesture recognition solution for presentation control
Wambutt Sonic feedback cues for hand-gesture photo-taking: Designing non-visual feedback for a touch-less hand-gesture based photo-taking experience

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19891843

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: 2019891843

Country of ref document: EP

Effective date: 20210705