WO2021196584A1 - Laser sensing system and method, computer-readable storage medium, and electronic device - Google Patents

Laser sensing system and method, computer-readable storage medium, and electronic device

Info

Publication number
WO2021196584A1
WO2021196584A1 (PCT/CN2020/125538)
Authority
WO
WIPO (PCT)
Prior art keywords
laser
laser irradiation
user
irradiation point
posture
Prior art date
Application number
PCT/CN2020/125538
Other languages
English (en)
French (fr)
Inventor
韩超 (Han Chao)
Original Assignee
深圳创维-RGB电子有限公司 (Shenzhen Skyworth-RGB Electronic Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳创维-RGB电子有限公司 (Shenzhen Skyworth-RGB Electronic Co., Ltd.)
Publication of WO2021196584A1

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20 Input arrangements for video game devices
    • A63F13/21 Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/212 Input arrangements for video game devices characterised by their sensors, purposes or types using sensors worn by the player, e.g. for measuring heart beat or leg activity
    • A63F13/213 Input arrangements for video game devices characterised by their sensors, purposes or types comprising photodetecting means, e.g. cameras, photodiodes or infrared cells
    • A63F13/25 Output arrangements for video game devices
    • A63F13/28 Output arrangements for video game devices responding to control signals received from the game device for affecting ambient conditions, e.g. for vibrating players' seats, activating scent dispensers or affecting temperature or light
    • A63F13/55 Controlling game characters or game objects based on the game progress
    • A63F13/80 Special adaptations for executing a specific game genre or game mode
    • A63F13/837 Shooting of targets
    • A63F2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/10 Features of games characterized by input arrangements for converting player-generated signals into game device control signals
    • A63F2300/1062 Features of games characterized by input arrangements being specially adapted to a type of game, e.g. steering wheel
    • A63F2300/1087 Features of games characterized by input arrangements comprising photodetecting means, e.g. a camera
    • A63F2300/30 Features of games characterized by output arrangements for receiving control signals generated by the game device
    • A63F2300/302 Features of games characterized by output arrangements specially adapted for receiving control signals not targeted to a display device or game input means, e.g. vibrating driver's seat, scent dispenser
    • A63F2300/80 Features of games specially adapted for executing a specific type of game
    • A63F2300/8076 Shooting
    • A63F2300/8082 Virtual reality

Definitions

  • This application relates to the technical field of simulation games, in particular to a laser sensing system and method, a computer-readable storage medium, and an electronic device.
  • the acquisition device simulates a relatively realistic shooting scene for the player; that is, the player can control the shooting direction of the virtual shooting device that he or she uses.
  • the above game method is very simple for some players. Players often find that the shooting position of the virtual character is not the most suitable one, yet these shooting positions cannot be changed. This makes the player's shooting operation more difficult, lowers the hit rate, and easily causes user fatigue, which degrades the user experience.
  • one of the objectives of the present application is to provide a laser sensing system and method, a computer-readable storage medium, and an electronic device that can control the user's corresponding virtual character to move according to the user's target movement posture and movement direction, thereby increasing the hit rate of the user's shooting, reducing the difficulty of the shooting operation, and helping to improve the user experience.
  • a laser sensing system includes a handheld laser transmitter, a plurality of wearable laser transmitters, a camera device, and a terminal device:
  • the handheld laser transmitter is configured to emit a first laser signal to the terminal device according to a trigger instruction sent by a user;
  • the plurality of wearable laser transmitters are respectively worn on different parts of the user's body and configured to emit a plurality of second laser signals to the terminal device according to a preset time interval;
  • the camera device is configured to collect an action image of the user at a preset time interval, and send the collected action image to the terminal device;
  • the terminal device is configured to display a virtual character corresponding to the user and the scene in which the virtual character is located, respectively determine a first laser irradiation point corresponding to the first laser signal and a second laser irradiation point corresponding to each of the plurality of second laser signals, determine the first laser irradiation point as the shooting position of the virtual character, determine a target movement posture and a movement direction corresponding to the virtual character based on each second laser irradiation point and the action image, and control the virtual character to move in the movement direction according to the target movement posture.
  • the terminal device includes a signal receiving module, a temperature identification module, and a signal identification module:
  • the signal receiving module is configured to receive a first laser signal emitted by the handheld laser transmitter and a second laser signal emitted by each of the plurality of wearable laser transmitters;
  • the temperature identification module is configured to determine, according to the temperature difference on the display screen, a plurality of laser irradiation points to be identified corresponding to the first laser signal and the plurality of second laser signals;
  • the signal recognition module is configured to identify, among the plurality of laser irradiation points to be identified and according to their irradiation frequencies, the first laser irradiation point corresponding to the handheld laser transmitter and the second laser irradiation point corresponding to each wearable laser transmitter, and to determine the first laser irradiation point as the shooting position of the virtual character.
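The frequency-based identification described above can be sketched as follows. This is an illustrative assumption, not the application's implementation: the 20 Hz wearable pulse rate, the tolerance, the timestamp format, and all function names are invented for the example. Wearable transmitters pulse at a preset interval, while the handheld transmitter fires only on the user's trigger, so a point's observed pulse frequency separates the two.

```python
# Sketch: classify laser irradiation points by irradiation frequency.
# Points pulsing near the wearables' preset rate are "second" points;
# anything else is attributed to the trigger-driven handheld transmitter.

WEARABLE_HZ = 20.0   # assumed preset pulse rate of wearable transmitters
TOLERANCE_HZ = 2.0   # assumed matching tolerance

def estimate_frequency(timestamps):
    """Mean pulse frequency (Hz) from a sorted list of irradiation times (s)."""
    if len(timestamps) < 2:
        return 0.0
    span = timestamps[-1] - timestamps[0]
    return (len(timestamps) - 1) / span if span > 0 else 0.0

def classify_points(points):
    """Split points into (handheld_points, wearable_points).

    `points` maps a point id to its sorted irradiation timestamps.
    """
    handheld, wearable = [], []
    for pid, ts in points.items():
        if abs(estimate_frequency(ts) - WEARABLE_HZ) <= TOLERANCE_HZ:
            wearable.append(pid)
        else:
            handheld.append(pid)
    return handheld, wearable
```

A point irradiated every 0.05 s (20 Hz) would be classified as a wearable point; a point seen only twice in 0.4 s would be treated as the handheld transmitter's shot.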
  • the terminal device further includes a position recognition module, an action recognition module, and a function control module:
  • the position recognition module is configured to receive the action image sent by the camera device, and determine the movement direction of the user based on at least two received action images;
  • the motion recognition module is configured to determine the target movement posture of the virtual character based on each of the second laser irradiation points and the received motion image;
  • the function control module is configured to control the virtual character to move in the moving direction according to the target moving posture.
  • the position recognition module is specifically configured to receive at least two of the action images sent by the camera device, identify the location area where the user is located in each action image, and determine the user's movement direction according to each location area and the shooting time of each action image, the shooting times of the at least two action images being different.
  • the action recognition module includes a position determination unit, a movement posture determination unit, and a posture conversion unit:
  • the position determining unit is configured to determine the wearable laser emitter corresponding to the second laser irradiation point for each of the second laser irradiation points, and determine the wearing part of the wearable laser emitter;
  • the moving posture determining unit is configured to determine an initial moving posture corresponding to the user from the action image, and adjust the initial moving posture according to each wearing part;
  • the posture conversion unit is configured to perform coordinate conversion on the adjusted initial moving posture, and convert the initial moving posture into a target moving posture corresponding to the virtual character.
  • the terminal device further includes a posture correction module:
  • the posture correction module is configured to determine, for each second laser irradiation point, whether the second laser irradiation point is within its corresponding preset standard range; if the second laser irradiation point is not within the preset standard range, the second laser irradiation point and its corresponding preset standard range are displayed on the display screen of the terminal device.
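A minimal sketch of this range check, under assumed data shapes (rectangular ranges as `(x_min, y_min, x_max, y_max)` and body-part keys such as `"right_wrist"` are invented for the example; the application does not specify a range format):

```python
# Sketch: report second laser irradiation points that fall outside their
# preset standard range, so they can be drawn on the display screen.

def out_of_range_points(points, standard_ranges):
    """Return (part, point, range) triples for points outside their range.

    `points` maps a wearing part to an (x, y) irradiation point;
    `standard_ranges` maps the same part to (x_min, y_min, x_max, y_max).
    """
    result = []
    for part, (x, y) in points.items():
        x0, y0, x1, y1 = standard_ranges[part]
        if not (x0 <= x <= x1 and y0 <= y <= y1):
            result.append((part, (x, y), standard_ranges[part]))
    return result
```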
  • the multiple wearable laser emitters respectively being worn on different parts of the user's body include:
  • the wearing positions of the plurality of wearable laser transmitters are determined respectively according to the position information sent by the plurality of wearable laser transmitters.
  • the display screen of the terminal device is covered with a transparent laser-sensing film, and the transparent laser-sensing film is configured to sense the first laser signal and each of the second laser signals.
  • the camera device is arranged above the terminal device.
  • a laser sensing method is also provided, which is applied to the above-mentioned terminal device, and the laser sensing method includes:
  • after receiving the first laser signal emitted by the handheld laser transmitter and the second laser signal emitted by each wearable laser transmitter, respectively determining the first laser irradiation point corresponding to the first laser signal and the second laser irradiation point corresponding to each second laser signal;
  • wherein respectively determining the first laser irradiation point and the second laser irradiation point corresponding to each second laser signal includes:
  • identifying, according to the irradiation frequencies of the plurality of laser irradiation points to be identified, the first laser irradiation point corresponding to the handheld laser transmitter and the second laser irradiation point corresponding to each wearable laser transmitter.
  • determining, based on each of the second laser irradiation points and the received action images of the user sent by the camera device, the target movement posture and movement direction corresponding to the virtual character displayed by the terminal device, including:
  • the target movement posture of the virtual character is determined.
  • wherein receiving the action images sent by the camera device and determining the user's movement direction based on at least two received action images includes:
  • the determining the target movement posture of the virtual character based on each of the second laser irradiation points and the received motion image includes:
  • the laser sensing method further includes:
  • the second laser irradiation point and the preset standard range corresponding to the second laser irradiation point are displayed on the display screen of the terminal device.
  • the method further includes:
  • the wearing positions of the plurality of wearable laser transmitters are determined respectively according to the position information sent by the plurality of wearable laser transmitters.
  • a computer-readable storage medium on which a computer program is stored, and when the computer program is read and run by a processor, the method described in any one of the foregoing is implemented.
  • an electronic device including a processor and a memory
  • the memory stores a computer program that can run on the processor
  • when the processor executes the computer program, the steps of any one of the foregoing methods are implemented.
  • the laser sensing system, laser sensing method, computer-readable storage medium, and electronic device provided by the embodiments of the application receive the first laser signal emitted by the handheld laser transmitter according to the trigger instruction sent by the user, and the plurality of second laser signals sent by the wearable laser transmitters at the preset time interval; respectively determine, on the display screen of the terminal device, the first laser irradiation point corresponding to the first laser signal and the second laser irradiation point corresponding to each second laser signal; and determine the first laser irradiation point as the shooting position of the virtual character.
  • based on the plurality of second laser irradiation points and the user's action images collected by the camera device, the target movement posture and movement direction of the virtual character are determined, and the virtual character is controlled to move in the movement direction according to the target movement posture. In this way, the user can adjust the shooting position, shooting direction, and shooting posture of the corresponding virtual character according to his or her shooting habits, thereby increasing the hit rate of the user's shooting, reducing the difficulty of the shooting operation, and helping to improve the user experience.
  • FIG. 1 is a schematic structural diagram of a laser sensing system provided by an embodiment of this application.
  • FIG. 2 is one of the structural schematic diagrams of the terminal device shown in FIG. 1;
  • FIG. 3 is a second structural diagram of the terminal device shown in FIG. 1;
  • FIG. 4 is a schematic diagram of the structure of the action recognition module shown in FIG. 3;
  • FIG. 5 is the third structural diagram of the terminal device shown in FIG. 1;
  • FIG. 6 is a flowchart of a laser sensing method provided by an embodiment of the present application.
  • FIG. 7 is a flowchart of a laser sensing method provided by another embodiment of the application.
  • Reference numerals: 100 - laser sensing system; 110 - handheld laser transmitter; 120 - wearable laser transmitter; 130 - camera device; 140 - terminal device; 141 - signal receiving module; 142 - temperature identification module; 143 - signal recognition module; 144 - position recognition module; 145 - action recognition module; 146 - function control module; 147 - posture correction module; 1451 - position determination unit; 1452 - movement posture determination unit; 1453 - posture conversion unit.
  • the current game mode is very monotonous for some players.
  • Players often find that the shooting position of the virtual character is not the most suitable one, yet they cannot change these shooting positions, which makes the player's shooting operation more difficult, lowers the hit rate, and easily causes user fatigue, thereby degrading the user experience.
  • the laser sensing system 100 receives the first laser signal emitted by the user through the handheld laser transmitter 110 and the second laser signals emitted by the wearable laser transmitters 120, identifies the shooting position, target movement posture, and movement direction of the virtual character corresponding to the user, and controls the virtual character to move in the movement direction according to the target movement posture.
  • the user can adjust the shooting position, shooting direction, and shooting posture of the corresponding virtual character according to his shooting habits, thereby increasing the hit rate of the user's shooting, reducing the difficulty of shooting operation, and helping to improve the user's experience.
  • FIG. 1 is a schematic structural diagram of a laser sensing system 100 provided by an embodiment of the application.
  • the laser sensing system 100 includes a handheld laser transmitter 110, a plurality of wearable laser transmitters 120, a camera 130, and a terminal device 140:
  • the handheld laser transmitter 110 transmits a first laser signal to the terminal device 140 after receiving the trigger instruction sent by the user.
  • the user wears the wearable laser transmitters 120 on different parts of the body, for example, wearable laser transmitter A on the left wrist, wearable laser transmitter B on the right wrist, wearable laser transmitter C on the left ankle, wearable laser transmitter D on the right foot, and so on.
  • the above wearing rule of the wearable laser transmitters 120 may be set by a technician in advance, for example, wearable laser transmitter A must be worn on the left wrist; alternatively, after the user wears them at random, the wearing part of each wearable laser transmitter 120 is determined according to the position information sent by that wearable laser transmitter 120.
  • the wearable laser transmitter 120 can send multiple second laser signals to the terminal device 140 at a preset time interval.
  • the camera 130 may be fixedly arranged above the terminal device 140 and configured to collect the user's action image at a preset time interval, and send the collected user's action image to the terminal device 140.
  • the terminal device 140 can display the virtual character corresponding to the user and the scene in which the virtual character is located.
  • the display screen of the terminal device 140 is covered with a transparent laser-sensing film, which can sense the first laser signal and the plurality of second laser signals. When the first laser signal or a second laser signal irradiates the transparent laser-sensing film, it causes a certain temperature-difference change; this temperature difference can therefore be used to determine the positions, on the display screen of the terminal device 140, of the first laser irradiation point corresponding to the first laser signal and of the second laser irradiation point corresponding to each second laser signal.
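The temperature-difference detection can be sketched as follows. This is a hedged simplification: the application does not describe the film's readout, so the grid-of-temperatures representation, the 2.0-degree threshold, and the function name are all assumptions made for illustration.

```python
# Sketch: treat the sensing film as a grid of temperature readings and
# flag cells hotter than ambient by more than a threshold as laser
# irradiation points on the display screen.

def find_irradiation_points(temp_grid, ambient, threshold=2.0):
    """Return (row, col) cells where the film exceeds ambient + threshold."""
    points = []
    for r, row in enumerate(temp_grid):
        for c, temp in enumerate(row):
            if temp - ambient > threshold:
                points.append((r, c))
    return points
```

In practice adjacent hot cells would be clustered into a single point; this sketch only shows the thresholding step.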
  • the terminal device 140 can control the virtual character corresponding to the user to move in the movement direction according to the target movement posture.
  • the terminal device 140 can recognize the first laser irradiation point and the plurality of second laser irradiation points at the same time; that is, the virtual character can shoot while moving, and the shooting posture may be the determined target movement posture.
  • the terminal device 140 can determine the shooting position, target movement posture, and movement direction of the user's corresponding virtual character based on the received first laser signal emitted by the handheld laser transmitter 110, the second laser signal emitted by each wearable laser transmitter 120, and the action images sent by the camera device 130.
  • FIG. 2 is one of the structural schematic diagrams of the terminal device 140 shown in FIG. 1.
  • the terminal device 140 includes a signal receiving module 141, a temperature recognition module 142, and a signal recognition module 143.
  • the signal receiving module 141 is configured to receive the first laser signal emitted by the handheld laser transmitter 110 and the second laser signal emitted by each wearable laser transmitter 120 of the plurality of wearable laser transmitters 120.
  • the temperature identification module 142 can determine, according to the temperature difference between an irradiated point and a non-irradiated area on the display screen of the terminal device 140, the plurality of laser irradiation points to be identified that are formed on the screen by the first laser signal and the plurality of second laser signals.
  • the signal recognition module 143 can identify, among the plurality of laser irradiation points to be identified and according to the irradiation frequency of each point, the first laser irradiation point produced by the first laser signal emitted by the handheld laser transmitter 110 (that is, the first laser irradiation point corresponding to the handheld laser transmitter 110) and the second laser irradiation point produced by the second laser signal emitted by each wearable laser transmitter 120 (that is, the second laser irradiation point corresponding to each wearable laser transmitter 120), and determine the identified first laser irradiation point as the shooting position of the virtual character.
  • in this way, the virtual character will shoot at the shooting position corresponding to the first laser irradiation point.
  • FIG. 3 is a second structural diagram of the terminal device 140 shown in FIG. 1.
  • the terminal device 140 further includes a position recognition module 144, an action recognition module 145 and a function control module 146.
  • the position recognition module 144 is configured to receive the user's action images sent by the camera device 130, identify the location area where the user is located in each of at least two received action images, and then determine the user's movement direction according to the time sequence of the action images.
  • for example, with the camera device 130 fixedly installed in advance and the center of the collected action image taken as the origin: if the user is to the left of the origin in the first frame (that is, at the left end of the action image) and to the right of the origin in the second frame collected at the preset time interval (that is, at the right end of the action image), the user has moved to the right. For another example, if the user's upper body is above the origin in the first frame (that is, in the upper half of the image) and below the origin in the second frame collected at the preset time interval (that is, in the lower half of the image), the user has made a squatting action; that is, the movement direction is downward.
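The direction check described above can be sketched as follows. The bounding-box input format `(x, y, w, h)` and the function name are assumptions for illustration; the application only specifies comparing the user's location area across frames taken at different times.

```python
# Sketch: infer coarse movement direction from the user's location area
# in two action images captured a preset interval apart.
# Image y grows downward, so a lower center indicates a squat/downward move.

def movement_direction(box_a, box_b):
    """Compare two user bounding boxes (x, y, w, h); return (horizontal, vertical)."""
    ax, ay = box_a[0] + box_a[2] / 2, box_a[1] + box_a[3] / 2
    bx, by = box_b[0] + box_b[2] / 2, box_b[1] + box_b[3] / 2
    horizontal = "right" if bx > ax else "left" if bx < ax else "none"
    vertical = "down" if by > ay else "up" if by < ay else "none"
    return horizontal, vertical
```

A user whose box shifts from the left half to the right half yields `("right", "none")`; a box whose center drops yields a "down" vertical component, matching the squat example above.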
  • the motion recognition module 145 is configured to determine the target movement posture of the virtual character corresponding to the user displayed by the terminal device 140 according to each second laser irradiation point and the received motion image of the user sent by the camera 130.
  • the function control module 146 is configured to control the virtual character to move in the determined movement direction according to the determined target movement posture corresponding to the virtual character, for example, to control the virtual character to move downward in the target movement posture.
  • FIG. 4 is a schematic structural diagram of the action recognition module 145 shown in FIG. 3.
  • the position determination unit 1451 is configured to determine, for each second laser irradiation point, the wearable laser transmitter 120 corresponding to that irradiation point, and at the same time determine the part of the user's body on which that wearable laser transmitter 120 is worn.
  • that is, from a second laser irradiation point, the corresponding wearable laser transmitter 120 can be determined, and thereby the part of the user's body on which that wearable laser transmitter 120 is worn.
  • the wearing part of each wearable laser transmitter 120 can be set by a technician in advance, for example, wearable laser transmitter A must be worn on the left wrist; alternatively, after the user wears them at random, the wearing part of each wearable laser transmitter 120 is determined according to the position information sent by that wearable laser transmitter 120.
  • the movement posture determination unit 1452 is configured to determine the user's initial movement posture from the action image. Because part of the user's body may be blocked in the action image, it may happen, for example, that only the user's right forearm, rather than the whole right arm, can be recognized from the collected image. This leaves the initial movement posture incomplete, so the user's movement posture cannot be simulated. In that case, the initial movement posture needs to be adjusted by using the wearing parts determined from the second laser irradiation points, so as to obtain a complete movement posture.
  • for example, if the shoulder position is not determined from the action image, the position of the user's shoulder is determined so as to adjust the initial movement posture and obtain a complete initial movement posture; alternatively, the initial movement posture recognized from the action image can be adjusted accordingly according to the wearable laser transmitters 120 worn by user 1 or user 2.
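A minimal sketch of this occlusion adjustment, under assumed data shapes: joints hidden in the action image are filled in from the second laser irradiation points of the transmitters worn on those parts. The body-part keys, the use of `None` for hidden joints, and the function name are illustrative, not from the application.

```python
# Sketch: complete an initial movement posture whose occluded joints are
# missing, using the irradiation points of the wearable laser transmitters.

def adjust_posture(initial_posture, laser_points):
    """Fill None joints in `initial_posture` from `laser_points`.

    Both map a body part (e.g. "right_wrist") to an (x, y) position;
    `initial_posture` uses None for parts blocked in the action image.
    """
    adjusted = dict(initial_posture)
    for part, pos in adjusted.items():
        if pos is None and part in laser_points:
            adjusted[part] = laser_points[part]
    return adjusted
```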
  • the posture conversion unit 1453: since what the game displays corresponds to the back view of the virtual character corresponding to the user, the adjusted initial movement posture still needs to be converted after it is obtained.
  • the user's front movement posture is determined as the corresponding back movement posture of the virtual character; that is, the adjusted initial movement posture is converted, by coordinate conversion, into the target movement posture corresponding to the virtual character.
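One simple way to realize this front-to-back conversion is a horizontal mirror about the image center, since the camera sees the user from the front while the game shows the character from behind. This 2D mirror is an assumed simplification of the coordinate conversion; the application does not specify the transform, and the data shapes are illustrative.

```python
# Sketch: convert a front-view posture into a back-view posture by
# mirroring each joint horizontally about the image center line.

def to_back_view(posture, image_width):
    """Mirror a front-view posture (part -> (x, y)) into a back-view posture."""
    return {part: (image_width - x, y) for part, (x, y) in posture.items()}
```

With this mirror, a joint seen on the camera's left appears on the character's left in the back view, as expected when the user raises, say, the right hand and the character mirrors it from behind.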
  • FIG. 5 is a third structural diagram of the terminal device 140 shown in FIG. 1.
  • the terminal device 140 further includes a posture correction module 147.
  • the posture correction module 147 is configured to determine, for each second laser irradiation point, whether the irradiation point is within its preset standard range. If it is detected that a second laser irradiation point is not within the preset standard range, the display screen of the terminal device 140 displays that second laser irradiation point together with its corresponding preset standard range, so that the user can adjust his or her movement posture according to the preset standard range.
  • for example, while the user is exercising, it can be detected in real time whether the user's body has reached the preset standard range, for example, whether the right hand has reached a preset height. Specifically, this is done by detecting whether the second laser irradiation point corresponding to the wearable laser transmitter 120 worn on the right wrist is within the preset standard range; if it is detected that this second laser irradiation point is not within the preset standard range, the display screen of the terminal device 140 displays the out-of-range second laser irradiation point corresponding to the wearable laser transmitter 120 on the right wrist, together with the preset standard range corresponding to that second laser irradiation point.
  • The preset standard range is set in advance by a technician and differs across application environments and scenes.
  • After the user selects a scene, the terminal device 140 can determine the preset standard range corresponding to each second laser irradiation point in that scene; as the content of the scene switches, the preset standard range corresponding to each second laser irradiation point changes accordingly.
  • In this way, the terminal device 140 can detect the user's posture in real time through the posture correction module 147 and display, on the display screen, any second laser irradiation point that has not reached its preset standard range together with the corresponding preset standard range, reminding the user to adjust his or her posture when it is not standard.
  • The laser sensing system 100 receives a first laser signal emitted by the handheld laser transmitter 110 according to a trigger instruction from the user, and multiple second laser signals emitted by the wearable laser transmitters 120 at a preset time interval; it determines, on the display screen of the terminal device 140, the first laser irradiation point corresponding to the first laser signal and the second laser irradiation point corresponding to each second laser signal, determines the first laser irradiation point as the shooting position of the virtual character, determines the target movement posture and movement direction of the virtual character based on the multiple second laser irradiation points and the user's action images collected by the camera device 130, and controls the virtual character to move in the movement direction according to the target movement posture.
  • In this way, the user can adjust the shooting position, shooting direction, and shooting posture of the corresponding virtual character according to his or her own shooting habits, which increases the user's hit rate, reduces the difficulty of the shooting operation, and helps improve the user's experience.
  • FIG. 6 is a flowchart of a laser sensing method provided by an embodiment of this application and applied to the terminal device. The laser sensing method includes:
  • S601. After receiving the first laser signal emitted by the handheld laser transmitter and the second laser signal emitted by each wearable laser transmitter, respectively determine the first laser irradiation point corresponding to the first laser signal and the second laser irradiation point corresponding to each second laser signal.
  • In this step, after receiving the first laser signal and each second laser signal, the terminal device determines, on its display screen, the first laser irradiation point corresponding to the first laser signal, i.e., the irradiation point of the handheld laser transmitter, and the second laser irradiation point corresponding to each second laser signal, i.e., the irradiation point of each wearable laser transmitter.
  • S602. Determine the first laser irradiation point as the shooting position of the virtual character.
  • In this step, after the first laser irradiation point corresponding to the handheld laser transmitter has been determined, it is determined as the shooting position of the virtual character corresponding to the user.
  • In this way, the position at which the user wants to shoot can be determined from the received first laser signal, so that the virtual character can be controlled to shoot at that position.
  • S603. Based on each second laser irradiation point and the received action images of the user sent by the camera device, determine the target movement posture and movement direction of the virtual character displayed by the terminal device.
  • In this step, based on the second laser irradiation point determined for each second laser signal and the user's action images received from the camera device, the target movement posture and movement direction of the user's virtual character displayed on the display screen of the terminal device are determined.
  • S604. Control the virtual character to move in the movement direction according to the target movement posture.
  • In this step, after determining the target movement posture and movement direction of the virtual character, the terminal device controls the virtual character to move in the movement direction according to the target movement posture.
  • It should be noted that the terminal device can determine the shooting position of the virtual character and the virtual character's target movement posture and movement direction at the same time; the virtual character can therefore shoot while moving, and shoot in the target movement posture.
  • Optionally, step S601 includes: after receiving the first laser signal emitted by the handheld laser transmitter and the second laser signal emitted by each wearable laser transmitter, determining, according to the temperature difference between irradiated points and non-irradiated areas on the display screen of the terminal device, the multiple to-be-identified laser irradiation points corresponding to the first laser signal and each second laser signal on the display screen; and then, according to the irradiation frequency of the multiple to-be-identified laser irradiation points, identifying among them the first laser irradiation point corresponding to the handheld laser transmitter and the second laser irradiation point corresponding to each wearable laser transmitter.
  • In this step, after the to-be-identified laser irradiation points have been found from the temperature difference, the irradiation frequency of each point is used to determine the first laser irradiation point corresponding to the first laser signal, i.e., the irradiation point of the handheld laser transmitter, and the second laser irradiation point corresponding to each second laser signal, i.e., the irradiation point of each wearable laser transmitter.
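  • The frequency-based identification described above can be sketched as follows. This is a minimal illustration, not the patented implementation: the transmitter-to-frequency table (`EMITTER_FREQUENCIES_HZ`) and the matching tolerance are assumptions introduced for the example.

```python
# Classify to-be-identified irradiation points by their measured pulse
# frequency. The frequency table and tolerance are illustrative assumptions.
EMITTER_FREQUENCIES_HZ = {
    "handheld": 100.0,            # hypothetical pulse rate of the handheld transmitter
    "wearable_left_wrist": 60.0,  # hypothetical rates for two wearable transmitters
    "wearable_right_wrist": 70.0,
}

def classify_points(points, tolerance_hz=2.0):
    """points: list of (x, y, measured_frequency_hz) tuples.
    Returns a dict mapping emitter name -> (x, y) screen position."""
    assignments = {}
    for x, y, freq in points:
        for emitter, nominal in EMITTER_FREQUENCIES_HZ.items():
            if abs(freq - nominal) <= tolerance_hz:
                assignments[emitter] = (x, y)
                break
    return assignments
```

  • For example, `classify_points([(400, 300, 99.5), (120, 500, 70.3)])` would associate the first point with the handheld transmitter (and hence the shooting position) and the second with the right-wrist wearable transmitter.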
  • Optionally, step S603 includes: receiving the action images sent by the camera device and determining the user's movement direction based on at least two received action images; and determining the target movement posture of the virtual character based on each second laser irradiation point and the received action images.
  • In this step, after receiving the action images sent by the camera device, the location area of the user in each action image is identified, and the user's movement direction is then determined from the chronological order of the action images; the target movement posture of the virtual character is then determined from each second laser irradiation point and the received action images.
  • For example, suppose the camera device is fixed in advance and the center of the captured action image is taken as the origin. If the user is to the left of the origin in the first frame, i.e., at the left end of the image, and to the right of the origin in the second frame collected after the preset time interval, i.e., at the right end of the image, the user has clearly moved to the right. As another example, if the user's upper body is above the origin in the first frame, i.e., in the upper half of the image, and below the origin in the second frame collected after the preset time interval, i.e., in the lower half of the image, the user has clearly performed a squat, i.e., the movement direction is downward.
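  • The frame-to-frame comparison above can be sketched as follows. This is a simplification that tracks only the center of the user's region between two frames; the function name and the dead-zone parameter are assumptions for the example.

```python
def movement_direction(center_prev, center_next, image_size, dead_zone=0.05):
    """center_prev/center_next: (x, y) of the user's region in two frames,
    in pixels with (0, 0) at the top-left. image_size: (width, height).
    Returns one of 'left', 'right', 'up', 'down', 'still'."""
    w, h = image_size
    dx = (center_next[0] - center_prev[0]) / w
    dy = (center_next[1] - center_prev[1]) / h
    if abs(dx) < dead_zone and abs(dy) < dead_zone:
        return "still"  # displacement too small to count as movement
    if abs(dx) >= abs(dy):
        return "right" if dx > 0 else "left"
    # y grows downward in image coordinates, so positive dy means e.g. a squat
    return "down" if dy > 0 else "up"
```

  • With a 640x480 image, a user region moving from the left end to the right end yields "right", and an upper body dropping from the upper half to the lower half yields "down".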
  • Optionally, determining the target movement posture of the virtual character based on each second laser irradiation point and the action images includes: for each second laser irradiation point, determining the wearable laser emitter corresponding to that point and the part of the body on which it is worn; determining the user's initial movement posture from the action images and adjusting it based on each wearing part; and performing a coordinate transformation on the adjusted initial movement posture to convert it into the target movement posture of the virtual character.
  • In this step, the wearable laser emitter corresponding to each second laser irradiation point can be determined from the irradiation frequency of that point, and hence the part of the user's body on which that emitter is worn.
  • The wearing part of each wearable laser transmitter can be set by a technician in advance, for example, wearable laser transmitter A must be worn on the left wrist; alternatively, after the user puts the transmitters on at random, the wearing part of each wearable laser transmitter can be determined from the position information it sends. The user's initial movement posture is then determined from the action images.
  • Because parts of the user's body may be occluded in the action images, perhaps only the user's right arm, or only the right forearm, can be recognized. The initial movement posture is then incomplete and the user's movement posture cannot be simulated from it. In that case, the initial movement posture must be adjusted using the wearing parts determined for the second laser irradiation points to obtain a complete movement posture.
  • For example, when only the right forearm can be recognized, the second laser irradiation point of the wearable laser emitter worn on the user's shoulder can fix the shoulder position even though it cannot be determined from the action image, so the initial movement posture can be adjusted into a complete one. Likewise, if the initial movement posture recognized from the action images is wrong, for example the background causes user 1's hand to be misrecognized as user 2's hand, the posture can be corrected according to the wearable laser transmitters worn by user 1 or user 2.
  • Finally, because the action images collected by the camera device show the user from the front while the virtual character in the game is shown from the back, a coordinate conversion is performed on the adjusted initial movement posture: the user's front-facing movement posture is determined as the back-facing movement posture of the virtual character, i.e., the adjusted initial movement posture is converted into the target movement posture of the virtual character.
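  • One way to read this front-to-back coordinate conversion is as a left-right mirror of the front-view keypoints, since a camera facing the user sees the user's left side on the right of the image. The sketch below assumes 2D keypoints with x normalized to [0, 1] and a simple swap of left/right joint labels; a real implementation would map onto the game's own character rig.

```python
def front_to_back_posture(keypoints):
    """keypoints: dict mapping joint name -> (x, y) with x in [0, 1],
    taken from the user's front view. Returns the back-view posture:
    x is mirrored about the image center and left/right labels are swapped."""
    converted = {}
    for name, (x, y) in keypoints.items():
        if name.startswith("left_"):
            name = "right_" + name[len("left_"):]
        elif name.startswith("right_"):
            name = "left_" + name[len("right_"):]
        converted[name] = (1.0 - x, y)  # mirror horizontally, keep height
    return converted
```

  • A front-view left wrist near the image's left edge thus becomes the back-view right wrist near the right edge, while unpaired joints such as the head only have their x mirrored.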
  • In the laser sensing method provided by this embodiment, after the first laser signal emitted by the handheld laser transmitter and the second laser signal emitted by each wearable laser transmitter are received, the first laser irradiation point corresponding to the first laser signal and the second laser irradiation point corresponding to each second laser signal are respectively determined; the first laser irradiation point is determined as the shooting position of the virtual character; the target movement posture and movement direction of the virtual character displayed by the terminal device are determined based on each second laser irradiation point and the user's action images received from the camera device; and the virtual character is controlled to move in the movement direction according to the target movement posture.
  • In this way, through the handheld laser transmitter held by the user and the wearable laser transmitters worn by the user, this application determines the shooting position, movement direction, and target movement posture of the user's virtual character, so that the virtual character can be controlled to move in the movement direction according to the target movement posture, which increases the user's hit rate, reduces the difficulty of the shooting operation, and helps improve the user's experience.
  • FIG. 7 is a flowchart of a laser sensing method according to another embodiment of the present application. As shown in FIG. 7, the laser sensing method provided by this embodiment includes:
  • S701. After receiving the first laser signal emitted by the handheld laser transmitter and the second laser signal emitted by each wearable laser transmitter, respectively determine the first laser irradiation point corresponding to the first laser signal and the second laser irradiation point corresponding to each second laser signal.
  • S702. Determine the first laser irradiation point as the shooting position of the virtual character.
  • S703. Based on each second laser irradiation point and the received action images of the user sent by the camera device, determine the target movement posture and movement direction of the virtual character displayed by the terminal device.
  • S704. Control the virtual character to move in the movement direction according to the target movement posture.
  • S705. For each second laser irradiation point, determine whether that point lies within its corresponding preset standard range.
  • In this step, each determined second laser irradiation point is checked against its preset standard range. The preset standard range is set in advance by a technician and differs across application environments and scenes; after the user selects a scene, the terminal device can determine the preset standard range corresponding to each second laser irradiation point in that scene, and as the content of the scene switches, the corresponding preset standard ranges change accordingly.
  • S706. If the second laser irradiation point does not lie within the preset standard range, display that second laser irradiation point and its corresponding preset standard range on the terminal device.
  • In this step, any second laser irradiation point found outside its preset standard range is displayed, together with its corresponding preset standard range, on the display screen of the terminal device to remind the user to adjust his or her posture.
  • For example, while the user is exercising, the system detects in real time whether the user's body has reached the preset standard range, such as whether the right hand has reached a preset height. Specifically, it detects whether the second laser irradiation point corresponding to the wearable laser transmitter worn on the right wrist lies within the preset standard range; if not, that second laser irradiation point and its corresponding preset standard range are displayed on the display screen of the terminal device.
  • In this way, the terminal device can detect the user's posture in real time through each second laser irradiation point and display, on the display screen, any second laser irradiation point that has not reached its preset standard range together with the corresponding preset standard range, reminding the user to adjust his or her posture when it is not standard.
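  • The check performed in S705/S706 can be sketched as a simple containment test. The representation of a preset standard range as an axis-aligned box per tracked emitter is an assumption introduced for the example.

```python
def check_posture(points, standard_ranges):
    """points: dict emitter name -> (x, y) second laser irradiation point.
    standard_ranges: dict emitter name -> (x_min, y_min, x_max, y_max) for
    the current scene. Returns the list of (emitter, point, range) entries
    that fall outside their preset standard range and should be displayed."""
    out_of_range = []
    for emitter, (x, y) in points.items():
        rng = standard_ranges.get(emitter)
        if rng is None:
            continue  # no requirement for this emitter in the current scene
        x_min, y_min, x_max, y_max = rng
        if not (x_min <= x <= x_max and y_min <= y <= y_max):
            out_of_range.append((emitter, (x, y), rng))
    return out_of_range
```

  • When the scene switches, only `standard_ranges` needs to be replaced; the check itself stays the same.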
  • The description of S701 to S704 may refer to that of S601 to S604; the same technical effects can be achieved and are not repeated here.
  • In the laser sensing method provided by this embodiment, after the first laser signal emitted by the handheld laser transmitter and the second laser signal emitted by each wearable laser transmitter are received, the first laser irradiation point corresponding to the first laser signal and the second laser irradiation point corresponding to each second laser signal are respectively determined; the first laser irradiation point is determined as the shooting position of the virtual character; the target movement posture and movement direction of the virtual character displayed by the terminal device are determined based on each second laser irradiation point and the user's action images received from the camera device; the virtual character is controlled to move in the movement direction according to the target movement posture; for each second laser irradiation point, it is determined whether that point lies within its corresponding preset standard range; and if it does not, the second laser irradiation point and its corresponding preset standard range are displayed on the display screen of the terminal device.
  • In this way, through the handheld laser transmitter held by the user and the wearable laser transmitters worn by the user, this application determines the shooting position, movement direction, and target movement posture of the user's virtual character, so that the virtual character can be controlled to move in the movement direction according to the target movement posture.
  • At the same time, the user's posture can be monitored in real time through each second laser irradiation point, and the user can be reminded to adjust his or her posture when it is not standard, which increases the user's hit rate, reduces the difficulty of the shooting operation, and helps improve the user's experience.
  • Correspondingly, an embodiment of the present application further provides an electronic device, including a processor and a memory in which a computer program runnable on the processor is stored; when the processor executes the computer program, the steps of any one of the foregoing methods are implemented. The specific implementation and technical effects are similar to those described above and are not repeated here.
  • Correspondingly, an embodiment of the present application further provides a program product, such as a computer-readable storage medium, including a program which, when executed by a processor, performs the foregoing method embodiments.
  • In the embodiments provided in this application, it should be understood that the disclosed system, device, and method can be implemented in other ways.
  • The device embodiments described above are merely illustrative.
  • For example, the division of the units is only a logical function division, and there may be other divisions in actual implementation.
  • Multiple units or components may be combined or integrated into another system, or some features may be ignored or not implemented.
  • The displayed or discussed mutual coupling, direct coupling, or communication connection may be implemented through some communication interfaces; the indirect coupling or communication connection between devices or units may be electrical, mechanical, or in other forms.
  • the units described as separate components may or may not be physically separated, and the components displayed as units may or may not be physical units, that is, they may be located in one place, or they may be distributed on multiple network units. Some or all of the units may be selected according to actual needs to achieve the objectives of the solutions of the embodiments.
  • the functional units in the various embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units may be integrated into one unit.
  • If the function is implemented in the form of a software functional unit and sold or used as an independent product, it can be stored in a non-volatile computer-readable storage medium executable by a processor.
  • Based on this understanding, the technical solution of the present application, in essence, or the part that contributes to the prior art, or a part of the technical solution, can be embodied in the form of a software product. The computer software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, etc.) to execute all or part of the steps of the methods described in the embodiments of the present application.
  • The aforementioned storage media include media that can store program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
  • In summary, the laser sensing system, laser sensing method, computer-readable storage medium, and electronic device provided by the embodiments of this application receive the first laser signal emitted by the handheld laser transmitter according to a trigger instruction from the user and the multiple second laser signals emitted by the wearable laser transmitters at a preset time interval; determine, on the display screen of the terminal device, the first laser irradiation point corresponding to the first laser signal and the second laser irradiation point corresponding to each second laser signal; determine the first laser irradiation point as the shooting position of the virtual character; determine the virtual character's target movement posture and movement direction based on the multiple second laser irradiation points and the user's action images collected by the camera; and control the virtual character to move in the movement direction according to the target movement posture. In this way, the user can adjust the shooting position, shooting direction, and shooting posture of the corresponding virtual character according to his or her own shooting habits, which increases the user's hit rate, reduces the difficulty of the shooting operation, and helps improve the user's experience.


Abstract

A laser sensing system, a laser sensing method, a computer-readable storage medium, and an electronic device. The system includes: a handheld laser transmitter (110) configured to emit a first laser signal toward a terminal device according to a trigger instruction from a user; wearable laser transmitters (120) configured to emit multiple second laser signals toward the terminal device; a camera device (130) configured to send collected action images of the user to the terminal device; and a terminal device (140) configured to determine a first laser irradiation point corresponding to the first laser signal and a second laser irradiation point corresponding to each second laser signal, determine the first laser irradiation point as the shooting position of a virtual character, determine, based on each second laser irradiation point and the action images, a target movement posture and a movement direction of the virtual character, and control the virtual character to move in the movement direction according to the target movement posture, which increases the user's hit rate, reduces the difficulty of the shooting operation, and helps improve the user's experience.

Description

Laser sensing system and method, computer-readable storage medium, and electronic device
Cross-Reference to Related Applications
This application claims priority to Chinese patent application No. 2020102566828, entitled "Laser sensing system and laser sensing method", filed with the Chinese Patent Office on April 2, 2020, the entire contents of which are incorporated herein by reference.
Technical Field
This application relates to the technical field of simulation games, and in particular to a laser sensing system and method, a computer-readable storage medium, and an electronic device.
Background
With the continuous development of technology and the improvement of living standards, people often turn to games and sports in their leisure time. To make shooting games easier to watch and more immersive, a relatively realistic shooting scene is usually simulated for the player by means of information collection devices; that is, the player can control the shooting direction of the virtual shooting device he or she uses.
This way of playing is very monotonous for some players: the player often finds that the shooting position of the virtual character is not the most suitable one for shooting, yet cannot change it. This makes the shooting operation difficult, lowers the hit rate, easily causes operation fatigue, and reduces the user's experience.
Summary
In view of this, one of the objects of this application includes providing a laser sensing system and method, a computer-readable storage medium, and an electronic device that can control the user's virtual character to move according to the user's target movement posture and movement direction, which increases the user's hit rate, reduces the difficulty of the shooting operation, and helps improve the user's experience.
To achieve the above object, the technical solutions adopted in the embodiments of this application are as follows:
In one aspect, an embodiment of this application provides a laser sensing system including a handheld laser transmitter, multiple wearable laser transmitters, a camera device, and a terminal device:
the handheld laser transmitter is configured to emit a first laser signal toward the terminal device according to a trigger instruction sent by a user;
the multiple wearable laser transmitters, worn on different parts of the user's body, are configured to emit multiple second laser signals toward the terminal device at a preset time interval;
the camera device is configured to collect action images of the user at a preset time interval and send the collected action images to the terminal device;
the terminal device is configured to display a virtual character corresponding to the user and the scene of the virtual character, respectively determine a first laser irradiation point corresponding to the first laser signal and a second laser irradiation point corresponding to each of the multiple second laser signals, determine the first laser irradiation point as the shooting position of the virtual character, determine, based on each second laser irradiation point and the action images, a target movement posture and a movement direction of the virtual character, and control the virtual character to move in the movement direction according to the target movement posture.
In a possible implementation, the terminal device includes a signal receiving module, a temperature identification module, and a signal identification module:
the signal receiving module is configured to receive the first laser signal emitted by the handheld laser transmitter and the second laser signal emitted by each of the multiple wearable laser transmitters;
the temperature identification module is configured to determine, according to the temperature difference between irradiated points and non-irradiated areas on the display screen of the terminal device, the multiple to-be-identified laser irradiation points corresponding to the first laser signal and the multiple second laser signals on the display screen;
the signal identification module is configured to identify, according to the irradiation frequency of the multiple to-be-identified laser irradiation points, the first laser irradiation point corresponding to the handheld laser transmitter and the second laser irradiation point corresponding to each wearable laser transmitter among the to-be-identified laser irradiation points, and to determine the first laser irradiation point as the shooting position of the virtual character.
In a possible implementation, the terminal device further includes a position identification module, an action identification module, and a function control module:
the position identification module is configured to receive the action images sent by the camera device and determine the user's movement direction based on at least two received action images;
the action identification module is configured to determine the target movement posture of the virtual character based on each second laser irradiation point and the received action images;
the function control module is configured to control the virtual character to move in the movement direction according to the target movement posture.
In a possible implementation, the position identification module is specifically configured to receive at least two action images sent by the camera device and determine the user's movement direction from the location area of the user in each action image and the capture time of each action image, wherein at least two of the action images have different capture times.
In a possible implementation, the action identification module includes a part determination unit, a movement posture determination unit, and a posture conversion unit:
the part determination unit is configured to determine, for each second laser irradiation point, the wearable laser transmitter corresponding to that point and the wearing part of that wearable laser transmitter;
the movement posture determination unit is configured to determine the user's initial movement posture from the action images and adjust the initial movement posture according to each wearing part;
the posture conversion unit is configured to perform a coordinate conversion on the adjusted initial movement posture and convert it into the target movement posture of the virtual character.
In a possible implementation, the terminal device further includes a posture correction module:
the posture correction module is configured to determine, for each second laser irradiation point, whether that point lies within its corresponding preset standard range, and if not, to display that second laser irradiation point and its corresponding preset standard range on the display screen of the terminal device.
In a possible implementation, the multiple wearable laser transmitters being worn on different parts of the user's body includes:
determining the wearing parts of the multiple wearable laser transmitters according to the position information sent by each of them.
In a possible implementation, the display screen of the terminal device is covered with a transparent laser-sensing film configured to sense the first laser signal and each second laser signal.
In a possible implementation, the camera device is arranged above the terminal device.
In another aspect, an embodiment of this application further provides a laser sensing method applied to the above terminal device, the laser sensing method including:
after receiving the first laser signal emitted by the handheld laser transmitter and the second laser signal emitted by each wearable laser transmitter, respectively determining the first laser irradiation point corresponding to the first laser signal and the second laser irradiation point corresponding to each second laser signal;
determining the first laser irradiation point as the shooting position of the virtual character;
determining, based on each second laser irradiation point and the received action images of the user sent by the camera device, the target movement posture and movement direction of the virtual character displayed by the terminal device;
controlling the virtual character to move in the movement direction according to the target movement posture.
In a possible implementation, after receiving the first laser signal emitted by the handheld laser transmitter and the second laser signal emitted by each wearable laser transmitter, respectively determining the first laser irradiation point corresponding to the first laser signal and the second laser irradiation point corresponding to each second laser signal includes:
after receiving the first laser signal emitted by the handheld laser transmitter and the second laser signal emitted by each wearable laser transmitter, determining, according to the temperature difference between irradiated points and non-irradiated areas on the display screen of the terminal device, the multiple to-be-identified laser irradiation points corresponding to the first laser signal and each second laser signal on the display screen;
identifying, according to the irradiation frequency of the multiple to-be-identified laser irradiation points, the first laser irradiation point corresponding to the handheld laser transmitter and the second laser irradiation point corresponding to each wearable laser transmitter among the to-be-identified laser irradiation points.
In a possible implementation, determining, based on each second laser irradiation point and the received action images of the user sent by the camera device, the target movement posture and movement direction of the virtual character displayed by the terminal device includes:
receiving the action images sent by the camera device and determining the user's movement direction based on at least two received action images;
determining the target movement posture of the virtual character based on each second laser irradiation point and the received action images.
In a possible implementation, receiving the action images sent by the camera device and determining the user's movement direction based on at least two received action images includes:
receiving at least two action images sent by the camera device, and determining the user's movement direction from the location area of the user in each action image and the capture time of each action image, wherein at least two of the action images have different capture times.
In a possible implementation, determining the target movement posture of the virtual character based on each second laser irradiation point and the received action images includes:
for each second laser irradiation point, determining the wearable laser transmitter corresponding to that point and the wearing part of that wearable laser transmitter;
determining the user's initial movement posture from the action images and adjusting the initial movement posture based on each wearing part;
performing a coordinate transformation on the adjusted initial movement posture and converting it into the target movement posture of the virtual character.
In a possible implementation, after controlling the virtual character to move in the movement direction according to the target movement posture, the laser sensing method further includes:
for each second laser irradiation point, determining whether that point lies within its corresponding preset standard range;
if the second laser irradiation point does not lie within the preset standard range, displaying that second laser irradiation point and its corresponding preset standard range on the display screen of the terminal device.
In a possible implementation, before receiving the first laser signal emitted by the handheld laser transmitter and the second laser signal emitted by each wearable laser transmitter and respectively determining the first laser irradiation point corresponding to the first laser signal and the second laser irradiation point corresponding to each second laser signal, the method further includes:
determining the wearing parts of the multiple wearable laser transmitters according to the position information sent by each of them.
In yet another aspect, an embodiment of this application further provides a computer-readable storage medium on which a computer program is stored; when the computer program is read and run by a processor, any one of the foregoing methods is implemented.
In a further aspect, an embodiment of this application further provides an electronic device including a processor and a memory in which a computer program runnable on the processor is stored; when the processor executes the computer program, the steps of any one of the foregoing methods are implemented. The laser sensing system, laser sensing method, computer-readable storage medium, and electronic device provided by the embodiments of this application receive the first laser signal emitted by the handheld laser transmitter according to a trigger instruction from the user and the multiple second laser signals emitted by the wearable laser transmitters at a preset time interval; determine, on the display screen of the terminal device, the first laser irradiation point corresponding to the first laser signal and the second laser irradiation point corresponding to each second laser signal; determine the first laser irradiation point as the shooting position of the virtual character; determine the virtual character's target movement posture and movement direction based on the multiple second laser irradiation points and the user's action images collected by the camera device; and control the virtual character to move in the movement direction according to the target movement posture. In this way, the user can adjust the shooting position, shooting direction, and shooting posture of the corresponding virtual character according to his or her own shooting habits, which increases the user's hit rate, reduces the difficulty of the shooting operation, and helps improve the user's experience.
To make the above objects, features, and advantages of this application clearer and easier to understand, optional embodiments are described in detail below with reference to the accompanying drawings.
Brief Description of the Drawings
To explain the technical solutions of the embodiments of this application more clearly, the drawings needed in the embodiments are briefly introduced below. It should be understood that the following drawings show only some embodiments of this application and should therefore not be regarded as limiting the scope; for those of ordinary skill in the art, other related drawings can be obtained from these drawings without creative effort.
FIG. 1 is a schematic structural diagram of a laser sensing system provided by an embodiment of this application;
FIG. 2 is a first schematic structural diagram of the terminal device shown in FIG. 1;
FIG. 3 is a second schematic structural diagram of the terminal device shown in FIG. 1;
FIG. 4 is a schematic structural diagram of the action identification module shown in FIG. 3;
FIG. 5 is a third schematic structural diagram of the terminal device shown in FIG. 1;
FIG. 6 is a flowchart of a laser sensing method provided by an embodiment of this application;
FIG. 7 is a flowchart of a laser sensing method provided by another embodiment of this application.
Reference numerals: 100 - laser sensing system; 110 - handheld laser transmitter; 120 - wearable laser transmitter; 130 - camera device; 140 - terminal device; 141 - signal receiving module; 142 - temperature identification module; 143 - signal identification module; 144 - position identification module; 145 - action identification module; 146 - function control module; 147 - posture correction module; 1451 - part determination unit; 1452 - movement posture determination unit; 1453 - posture conversion unit.
Detailed Description of the Embodiments
To make the objects, technical solutions, and advantages of the embodiments of this application clearer, the technical solutions in the embodiments of this application are described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only some of the embodiments of this application rather than all of them. The components of the embodiments of this application, as generally described and illustrated in the drawings herein, can be arranged and designed in a variety of different configurations. Therefore, the following detailed description of the embodiments of this application provided in the drawings is not intended to limit the scope of the claimed application, but merely represents selected embodiments of this application. Every other embodiment obtained by those skilled in the art based on the embodiments of this application without creative effort falls within the protection scope of this application.
Research has found that current game play is very monotonous for some players: the player often finds that the shooting position of the virtual character is not the most suitable one for shooting, yet cannot change it, which makes the shooting operation difficult, lowers the hit rate, easily causes operation fatigue, and reduces the user's experience.
In view of this, one of the objects of this application is to provide a laser sensing system, a laser sensing method, a computer-readable storage medium, and an electronic device. The laser sensing system 100 receives the first laser signal emitted by the user through the handheld laser transmitter 110 and the second laser signals emitted by the wearable laser transmitters 120, identifies the shooting position, target movement posture, and movement direction of the user's virtual character, and controls the virtual character to move in the movement direction according to the target movement posture. In this way, the user can adjust the shooting position, shooting direction, and shooting posture of the corresponding virtual character according to his or her own shooting habits, which increases the user's hit rate, reduces the difficulty of the shooting operation, and helps improve the user's experience.
First, a laser sensing system 100 disclosed in this application is introduced.
Referring to FIG. 1, FIG. 1 is a schematic structural diagram of a laser sensing system 100 provided by an embodiment of this application. The laser sensing system 100 includes a handheld laser transmitter 110, multiple wearable laser transmitters 120, a camera device 130, and a terminal device 140:
When the user sends a trigger instruction, the handheld laser transmitter 110, after receiving the trigger instruction, emits a first laser signal toward the terminal device 140.
The user wears the wearable laser transmitters 120 on different parts of the body, for example, wearable laser transmitter A on the left wrist, wearable laser transmitter B on the right wrist, wearable laser transmitter C on the left ankle, wearable laser transmitter D on the right foot, and so on.
The wearing rules for the wearable laser transmitters 120 may be set in advance by a technician, for example, wearable laser transmitter A must be worn on the left wrist; alternatively, after the user puts them on at random, the wearing part of each wearable laser transmitter 120 is determined from its position information.
After the user wears the wearable laser transmitters 120 on different parts of the body, the wearable laser transmitters 120 can send multiple second laser signals to the terminal device 140 at a preset time interval.
The camera device 130 may be fixedly arranged above the terminal device 140 and is configured to collect action images of the user at a preset time interval and send the collected action images to the terminal device 140.
The terminal device 140 can display the virtual character corresponding to the user and the scene of the virtual character. The display screen of the terminal device 140 is covered with a transparent laser-sensing film, which can sense the first laser signal and the multiple second laser signals. When the first and second laser signals strike the transparent laser-sensing film, they cause a certain temperature difference; this temperature difference can therefore be used to determine the first laser irradiation point and the second laser irradiation points and their positions on the display screen of the terminal device 140.
After receiving the first laser signal emitted by the handheld laser transmitter 110 and the second laser signal emitted by each wearable laser transmitter 120, the terminal device 140 respectively determines, on its display screen, the first laser irradiation point corresponding to the first laser signal and the second laser irradiation point corresponding to each second laser signal.
The first laser irradiation point corresponding to the first laser signal is determined as the shooting position of the virtual character; based on the second laser irradiation point corresponding to each second laser signal and the received action images of the user sent by the camera device 130, the target movement posture and movement direction of the virtual character are determined, and the terminal device 140 can control the user's virtual character to move in the movement direction according to the target movement posture.
The terminal device 140 can identify the first laser irradiation point and the multiple second laser irradiation points at the same time; in other words, the virtual character can shoot while moving, and the shooting posture can be the determined target movement posture.
In this way, the terminal device 140 can determine the shooting position, target movement posture, and movement direction of the user's virtual character from the received first laser signal emitted by the handheld laser transmitter 110, the second laser signal emitted by each wearable laser transmitter 120, and the action images sent by the camera device 130, and control the virtual character to move in the movement direction according to the target movement posture.
Optionally, referring to FIG. 2, FIG. 2 is a first schematic structural diagram of the terminal device 140 shown in FIG. 1. The terminal device 140 includes a signal receiving module 141, a temperature identification module 142, and a signal identification module 143.
The signal receiving module 141 is configured to receive the first laser signal emitted by the handheld laser transmitter 110 and the second laser signal emitted by each of the multiple wearable laser transmitters 120.
When the signal receiving module 141 receives the first laser signal and each second laser signal, the temperature identification module 142 can determine, according to the temperature difference between irradiated points and non-irradiated areas on the display screen of the terminal device 140, the multiple to-be-identified laser irradiation points on the display screen irradiated by the first laser signal and the multiple second laser signals.
The signal identification module 143 can identify, according to the irradiation frequency of each to-be-identified laser irradiation point, the first laser irradiation point produced by the first laser signal emitted by the handheld laser transmitter 110, i.e., the first laser irradiation point corresponding to the handheld laser transmitter 110, and the second laser irradiation point produced by the second laser signal emitted by each wearable laser transmitter 120, i.e., the second laser irradiation point corresponding to each wearable laser transmitter 120, and determines the first laser irradiation point as the shooting position of the virtual character. When the user sends a shooting instruction through the handheld laser transmitter 110, the virtual character shoots at the shooting position corresponding to the first laser irradiation point.
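A minimal sketch of the temperature-difference detection performed by a module like the temperature identification module 142: given a temperature map of the sensing film, cells warmer than the ambient baseline by more than a threshold are reported as candidate irradiation points. The grid representation, the ambient baseline, and the threshold value are illustrative assumptions.

```python
def find_irradiation_points(temp_map, ambient, threshold=1.5):
    """temp_map: 2D list of temperatures (degrees) sampled over the film.
    Returns the (row, col) cells whose temperature exceeds the ambient
    baseline by more than `threshold`, i.e. candidate irradiation points."""
    points = []
    for r, row in enumerate(temp_map):
        for c, t in enumerate(row):
            if t - ambient > threshold:  # irradiated cells are warmer
                points.append((r, c))
    return points
```

The candidates returned here would then be passed to a frequency-based classifier to separate the handheld transmitter's point from the wearable transmitters' points.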
Optionally, referring to FIG. 3, FIG. 3 is a second schematic structural diagram of the terminal device 140 shown in FIG. 1. The terminal device 140 further includes a position identification module 144, an action identification module 145, and a function control module 146.
The position identification module 144 is configured to receive the action images of the user sent by the camera device 130, identify, based on at least two received action images, the location area of the user in each action image, and then determine the user's movement direction from the chronological order of the action images.
For example, suppose the camera device 130 is fixed in advance and the center of the captured action image is taken as the origin. If the user is to the left of the origin in the first frame, i.e., at the left end of the image, and to the right of the origin in the second frame collected after the preset time interval, i.e., at the right end of the image, the user has clearly moved to the right. As another example, if the user's upper body is above the origin in the first frame, i.e., in the upper half of the image, and below the origin in the second frame collected after the preset time interval, i.e., in the lower half of the image, the user has clearly performed a squat, i.e., the movement direction is downward.
The action identification module 145 is configured to determine the target movement posture of the user's virtual character displayed by the terminal device 140 according to each second laser irradiation point and the received action images of the user sent by the camera device 130.
The function control module 146 is configured to control the virtual character to move in the determined movement direction according to the determined target movement posture, for example, to control the virtual character to move downward in the target movement posture.
Optionally, referring to FIG. 4, FIG. 4 is a schematic structural diagram of the action identification module 145 shown in FIG. 3. The action identification module 145 includes a part determination unit 1451, a movement posture determination unit 1452, and a posture conversion unit 1453.
The part determination unit 1451 is configured to determine, for each second laser irradiation point, the wearable laser transmitter 120 corresponding to that point and the part of the user's body on which that wearable laser transmitter 120 is worn.
Specifically, the wearable laser transmitter 120 corresponding to a second laser irradiation point can be determined from the irradiation frequency of that point, and hence the part of the user's body on which that wearable laser transmitter 120 is worn.
The wearing part of each wearable laser transmitter 120 may be set in advance by a technician, for example, wearable laser transmitter A must be worn on the left wrist; alternatively, after the user puts them on at random, the wearing part of each wearable laser transmitter 120 is determined from its position information.
The movement posture determination unit 1452 is configured to determine the user's initial movement posture from the action images. Because parts of the user's body may be occluded in the action images, perhaps only the user's right arm, or only the right forearm, can be recognized from the collected action images; the initial movement posture is then incomplete, and the user's movement posture cannot be simulated from it. In that case, the initial movement posture must be adjusted using the wearing parts determined for the second laser irradiation points to obtain a complete movement posture. For example, when only the right forearm can be recognized, the second laser irradiation point of the wearable laser transmitter 120 worn on the user's shoulder can fix the shoulder position even though it cannot be determined from the action image, so the initial movement posture can be adjusted into a complete initial movement posture. Likewise, if the initial movement posture recognized from the action images is wrong, for example the background causes user 1's hand to be misrecognized as user 2's hand, the posture can be corrected according to the wearable laser transmitters 120 worn by user 1 or user 2.
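The adjustment step can be sketched as filling in any joints missing from the image-based posture using the wearing parts resolved from the second laser irradiation points. Mapping an on-screen irradiation point back to a body-joint position is simplified here to using the point's coordinates directly, and the joint names are assumptions for the example.

```python
def complete_posture(initial_posture, wearable_points):
    """initial_posture: dict joint -> (x, y) recognized from the action
    image; joints occluded in the image are simply absent.
    wearable_points: dict wearing-part name -> (x, y) derived from the
    second laser irradiation points. Returns a posture where missing
    joints are filled in from the wearable-derived positions; joints
    already recognized in the image are kept as-is."""
    completed = dict(initial_posture)
    for part, position in wearable_points.items():
        if part not in completed:
            completed[part] = position  # e.g. recover an occluded shoulder
    return completed
```

The same dictionary could also be used for the misrecognition case mentioned above, by overwriting an image-based joint whose position disagrees strongly with the wearable-derived one.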
Because the action images collected by the camera device 130 show the user from the front while the user's virtual character in the game is shown from the back, the posture conversion unit 1453, after obtaining the adjusted initial movement posture, performs a coordinate conversion on it: the user's front-facing movement posture is determined as the back-facing movement posture of the virtual character, i.e., the adjusted initial movement posture is converted into the target movement posture of the virtual character.
Optionally, referring to FIG. 5, FIG. 5 is a third schematic structural diagram of the terminal device 140 shown in FIG. 1. The terminal device 140 further includes a posture correction module 147.
The posture correction module 147 is configured to determine, for each second laser irradiation point, whether that point lies within the preset standard range. If a second laser irradiation point is detected outside the preset standard range, the display screen of the terminal device 140 displays that second laser irradiation point together with its corresponding preset standard range, so that the user can adjust his or her movement posture according to the preset standard range.
For example, while the user is exercising, the system can detect in real time whether the user's body has reached the preset standard range, such as whether the right hand has reached a preset height. Specifically, it detects whether the second laser irradiation point corresponding to the wearable laser transmitter 120 worn on the right wrist lies within the preset standard range; if not, that second laser irradiation point and its corresponding preset standard range are displayed on the display screen of the terminal device 140.
The preset standard range is set in advance by a technician and differs across application environments and scenes. After the user selects a scene, the terminal device 140 can determine the preset standard range corresponding to each second laser irradiation point in that scene, and as the content of the scene switches, the preset standard range corresponding to each second laser irradiation point changes accordingly.
In this way, the terminal device 140 can detect the user's posture in real time through the posture correction module 147 and display, on the display screen, any second laser irradiation point that has not reached its preset standard range together with the corresponding preset standard range, reminding the user to adjust his or her posture when it is not standard.
The laser sensing system 100 provided by the embodiments of this application receives the first laser signal emitted by the handheld laser transmitter 110 according to a trigger instruction from the user and the multiple second laser signals emitted by the wearable laser transmitters 120 at a preset time interval; determines, on the display screen of the terminal device 140, the first laser irradiation point corresponding to the first laser signal and the second laser irradiation point corresponding to each second laser signal; determines the first laser irradiation point as the shooting position of the virtual character; determines the virtual character's target movement posture and movement direction based on the multiple second laser irradiation points and the user's action images collected by the camera device 130; and controls the virtual character to move in the movement direction according to the target movement posture.
In this way, the user can adjust the shooting position, shooting direction, and shooting posture of the corresponding virtual character according to his or her own shooting habits, which increases the user's hit rate, reduces the difficulty of the shooting operation, and helps improve the user's experience.
Referring to FIG. 6, FIG. 6 is a flowchart of a laser sensing method provided by an embodiment of this application and applied to the terminal device. The laser sensing method includes:
S601. After receiving the first laser signal emitted by the handheld laser transmitter and the second laser signal emitted by each wearable laser transmitter, respectively determine the first laser irradiation point corresponding to the first laser signal and the second laser irradiation point corresponding to each second laser signal.
In this step, after receiving the first laser signal emitted by the handheld laser transmitter and the second laser signal emitted by each wearable laser transmitter, the terminal device determines, on its display screen, the first laser irradiation point corresponding to the first laser signal, i.e., the irradiation point of the handheld laser transmitter, and the second laser irradiation point corresponding to each second laser signal, i.e., the irradiation point of each wearable laser transmitter.
S602. Determine the first laser irradiation point as the shooting position of the virtual character.
In this step, after the first laser irradiation point corresponding to the handheld laser transmitter has been determined, it is determined as the shooting position of the virtual character corresponding to the user.
In this way, the position at which the user wants to shoot can be determined from the received first laser signal sent by the handheld laser transmitter, so that the virtual character can be controlled to shoot at that position.
S603. Based on each second laser irradiation point and the received action images of the user sent by the camera device, determine the target movement posture and movement direction of the virtual character displayed by the terminal device.
In this step, based on the second laser irradiation point determined for each second laser signal and the user's action images received from the camera device, the target movement posture and movement direction of the user's virtual character displayed on the display screen of the terminal device are determined.
S604. Control the virtual character to move in the movement direction according to the target movement posture.
In this step, after determining the target movement posture and movement direction of the virtual character, the terminal device controls the virtual character to move in the movement direction according to the target movement posture.
It should be noted that the terminal device can determine the shooting position of the virtual character and the virtual character's target movement posture and movement direction at the same time; the virtual character can therefore shoot while moving, and shoot in the target movement posture.
可选地,步骤S601包括:在接收到所述手持激光发射器发射的第一激光信号以及每个可穿戴激光发射器发射的第二激光信号后,根据所述终端设备的显示屏中照射点与未被照射区域之间的温度差,确定出在所述显示屏中所述第一激光信号和每个所述第二激光信号对应的多个待识别激光照射点;根据所述多个待识别激光照射点的照射频率,识别出所述多个待识别激光照射点中所述手持激光发射器对应的第一激光照射点,以及每个所述可穿戴激光发射器对应的第二激光照射点。
该步骤中,在接收到手持激光发射器发射的第一激光信号以及每个可穿戴激光发射器发射的第二激光信号之后,根据终端设备的显示屏中照射点与未被照射区域之间的温度差,确定出在终端设备的显示屏中第一激光信号和每个第二激光信号对应的待识别激光照射点;而后根据多个待识别激光照射点的照射频率,分别确定出第一激光信号对应的第一激光照射点,即手持激光发射器对应的第一激光照射点,以及每个第二激光信号对应第二激光照射点,即可穿戴激光发射器对应的第二激光照射点。
可选地,步骤S603包括:接收所述摄像装置发送的动作图像,并基于接收到的至少两张动作图像确定出所述用户的移动方向;基于每个所述第二激光照射点以及接收到的动作图像,确定出所述虚拟角色的目标移动姿势。
该步骤中,在接收到摄像装置发送的动作图像之后,识别用户在每张动作图像中的位置区域,然后根据上述每张动作图像的采集时间顺序,确定出用户的移动方向;再根据每个第二激光照射点以及接收到的动作图像,确定出虚拟角色的目标移动姿势。
例如,摄像装置事先固定安装好,以采集到的动作图像的中心为原点,用户在第一帧动作图像中处于原点的左端,即用户位于动作图像的左端,而按照预设的时间间隔采集到的第二帧动作图像中用户处于原点的右端,即用户位于动作图像的右端,显然,用户是进行了向右的移动;再例如,用户的上半身在第一帧动作图像中处于原点的上半部分,即用户上半身处于图像的上半部分,而按照预设的时间间隔采集到的第二帧动作图像中用户的上半身处于原点的下半部分,即用户的上半身处于图像的下半部分,显然,用户是做了下蹲的动作,即移动方向为向下移动。
可选地,所述基于每个所述第二激光照射点以及所述动作图像,确定出所述虚拟角色的目标移动姿势,包括:针对于每个所述第二激光照射点,确定出所述第二激光照射点对应的可穿戴激光发射器,并确定出该可穿戴激光发射器的穿戴部位;从所述动作图像中确定出所述用户的初始移动姿势,基于每个所述穿戴部位,调整所述初始移动姿势;对调整后的所述初始移动姿势进行坐标变换,将所述初始移动姿势转换为所述虚拟角色对应的目标移动姿势。
该步骤中,可以根据每个第二激光照射点对应的照射频率,确定出该激光照射点对应 的可穿戴激光发射器,从而确定出可穿戴激光发射器穿戴在用户身上的穿戴部位。其中,可穿戴激光发射器的穿戴部位可以是技术人员提前设定好的,例如,可穿戴激光发射器A一定要穿戴在左手腕上;再或者在用户随机的穿戴之后,根据可穿戴激光发射器的位置信息,确定出可穿戴激光发射器的穿戴部位;再从动作图像中确定出用户在动作图像中的初始移动姿势,由于在动作图像中用户的动作会存在遮挡的情况,可能从采集到的动作图像中仅能够识别出用户的右胳膊或者是仅能够识别出用户的右前臂,因此使得初始移动姿势不完整,没有办法模拟出用户的移动姿势,此时,就需要借助确定出的第二激光照射点对应的穿戴部位,对初始移动姿势进行调整,得到完整的移动姿势,例如,在仅能够识别出右前臂的情况下,根据穿戴在用户肩膀上的可穿戴激光发射器对应的第二激光照射点,便能够在未从动作图像中确定出肩膀位置的情况下,确定出用户肩膀所在位置,从而能够对初始移动姿势进行调整,得到完整的初始移动姿势;或者是从动作图像中识别出的初始移动姿势存在偏差的情况下,例如,受到背景的影响出现错误识别,把用户1的手识别成用户2的手,这时也可以根据用户1或者用户2穿戴的可穿戴激光发射器进行相应的调整。
Finally, because the motion images captured by the camera device show the user from the front, while the user's virtual character should be shown from the back in the game, a coordinate transformation must be applied to the adjusted initial movement posture after it is obtained: the user's front-view movement posture is converted into the corresponding back-view movement posture of the virtual character, i.e. the adjusted initial movement posture is converted into the target movement posture of the virtual character.
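One simple way to realize such a front-to-back conversion is a horizontal mirror with left/right joint labels swapped. This is a minimal sketch under that assumption; the patent does not specify the exact coordinate transformation.

```python
def front_to_back(pose, image_width=1.0):
    """Convert a front-view pose into the avatar's back view by mirroring
    horizontally about the vertical centre line and swapping left/right
    joint labels. pose: {joint_name: (x, y)} with x in [0, image_width]."""
    def mirror_name(name):
        if name.startswith("left_"):
            return "right_" + name[len("left_"):]
        if name.startswith("right_"):
            return "left_" + name[len("right_"):]
        return name  # head, torso, etc. keep their labels

    return {mirror_name(n): (image_width - x, y) for n, (x, y) in pose.items()}
```

Under this convention, the user's left wrist seen near the left edge of the camera frame becomes the avatar's right wrist near the right edge of the back view.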
The laser sensing method provided by this embodiment of the present application: after receiving the first laser signal emitted by the handheld laser emitter and the second laser signal emitted by each wearable laser emitter, determines the first laser irradiation point corresponding to the first laser signal and the second laser irradiation point corresponding to each second laser signal respectively; determines the first laser irradiation point as the shooting position of the virtual character; determines, based on each second laser irradiation point and the received motion images of the user sent by the camera device, the target movement posture and movement direction of the virtual character displayed by the terminal device; and controls the virtual character to move in the movement direction in accordance with the target movement posture.
In this way, the present application determines the shooting position, movement direction, and target movement posture of the user's virtual character from the handheld laser emitter held by the user and the wearable laser emitters worn by the user, so that the virtual character can be controlled to move in the movement direction in the target movement posture. This increases the user's shooting hit rate, lowers the difficulty of the shooting operation, and helps improve the user experience.
Referring to FIG. 7, FIG. 7 is a flowchart of a laser sensing method provided by another embodiment of the present application. As shown in FIG. 7, the laser sensing method provided by this embodiment includes:
S701. After receiving the first laser signal emitted by the handheld laser emitter and the second laser signal emitted by each wearable laser emitter, determine the first laser irradiation point corresponding to the first laser signal and the second laser irradiation point corresponding to each second laser signal respectively.
S702. Determine the first laser irradiation point as the shooting position of the virtual character.
S703. Determine, based on each second laser irradiation point and the received motion images of the user sent by the camera device, the target movement posture and movement direction of the virtual character displayed by the terminal device.
S704. Control the virtual character to move in the movement direction in accordance with the target movement posture.
S705. For each second laser irradiation point, determine whether the second laser irradiation point lies within its corresponding preset standard range.
In this step, for each determined second laser irradiation point, it is determined whether that point lies within a preset standard range. The preset standard ranges are set in advance by a technician and differ across application environments and scenes: once the user has selected a scene, the terminal device can determine the preset standard range corresponding to each second laser irradiation point in that scene, and the ranges change as the content of the scene switches.
S706. If the second laser irradiation point does not lie within the preset standard range, display that second laser irradiation point and its corresponding preset standard range on the terminal device.
In this step, if a second laser irradiation point is determined not to lie within its preset standard range, the out-of-range second laser irradiation point and its corresponding preset standard range are displayed on the screen of the terminal device to remind the user to adjust his or her posture.
Exemplarily, while the user is exercising, whether the user's limbs reach the preset standard ranges is detected in real time, for example whether the right hand has reached a preset height. Specifically, this is done by detecting whether the second laser irradiation point of the wearable laser emitter worn on the right wrist lies within the preset standard range; if that second laser irradiation point is detected to be outside the range, the second laser irradiation point of the right-wrist emitter and its corresponding preset standard range are displayed on the screen of the terminal device.
In this way, the terminal device can monitor the user's posture in real time through the second laser irradiation points, and display any second laser irradiation point that has not reached its preset standard range, together with that range, on the screen, reminding the user to adjust his or her posture when it is not up to standard.
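Steps S705 and S706 can be sketched as a simple range check over the irradiation points. The rectangular (x_min, y_min, x_max, y_max) range format is an assumption; the patent says only that standard ranges are preset per scene.

```python
def check_posture(points, standard_ranges):
    """For each second laser irradiation point, test whether it lies
    inside its preset standard range and collect the out-of-range points
    for on-screen feedback.

    points: {wearing_part: (x, y)} irradiation points on the screen;
    standard_ranges: {wearing_part: (x_min, y_min, x_max, y_max)}."""
    feedback = []
    for part, (x, y) in points.items():
        rng = standard_ranges.get(part)
        if rng is None:
            continue  # no standard defined for this point in this scene
        x_min, y_min, x_max, y_max = rng
        if not (x_min <= x <= x_max and y_min <= y <= y_max):
            feedback.append((part, (x, y), rng))  # point and range to display
    return feedback
```

An empty result means every monitored limb is within its standard range; otherwise each returned triple names the point and the range the terminal device would render as a reminder.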
The description of S701 to S704 may refer to the description of S601 to S604 and achieves the same technical effects, which will not be repeated here.
The laser sensing method provided by this embodiment of the present application: after receiving the first laser signal emitted by the handheld laser emitter and the second laser signal emitted by each wearable laser emitter, determines the first laser irradiation point corresponding to the first laser signal and the second laser irradiation point corresponding to each second laser signal respectively; determines the first laser irradiation point as the shooting position of the virtual character; determines, based on each second laser irradiation point and the received motion images of the user sent by the camera device, the target movement posture and movement direction of the virtual character displayed by the terminal device; controls the virtual character to move in the movement direction in accordance with the target movement posture; determines, for each second laser irradiation point, whether the second laser irradiation point lies within its corresponding preset standard range; and, if the second laser irradiation point does not lie within the preset standard range, displays that second laser irradiation point and its corresponding preset standard range on the screen of the terminal device.
In this way, the present application determines the shooting position, movement direction, and target movement posture of the user's virtual character from the handheld laser emitter held by the user and the wearable laser emitters worn by the user, so that the virtual character can be controlled to move in the movement direction in the target movement posture. At the same time, the user's posture can be monitored in real time through the second laser irradiation points, and the user can be reminded to adjust his or her posture when it is not up to standard. This increases the user's shooting hit rate, lowers the difficulty of the shooting operation, and helps improve the user experience.
Those skilled in the art can clearly understand that, for convenience and brevity of description, the specific working processes of the systems, devices, and units described above may refer to the corresponding processes in the foregoing method embodiments and are not repeated here.
Optionally, an embodiment of the present application further provides an electronic device comprising a processor and a memory, the memory storing a computer program executable on the processor; when the processor executes the computer program, the steps of any of the foregoing methods are implemented. The specific implementations and technical effects are similar and are not repeated here.
Optionally, an embodiment of the present application further provides a program product, for example a computer-readable storage medium, comprising a program which, when executed by a processor, is configured to perform the foregoing method embodiments.
In the several embodiments provided in the present application, it should be understood that the disclosed systems, devices, and methods may be implemented in other ways. The device embodiments described above are merely illustrative; for example, the division into units is merely a division by logical function, and other divisions are possible in actual implementation. As another example, multiple units or components may be combined or integrated into another system, or some features may be omitted or not executed. Furthermore, the mutual couplings, direct couplings, or communication connections shown or discussed may be indirect couplings or communication connections through communication interfaces, devices, or units, and may be electrical, mechanical, or in other forms.
Units described as separate components may or may not be physically separate, and components shown as units may or may not be physical units; that is, they may be located in one place or distributed over multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the embodiment.
In addition, the functional units in the embodiments of the present application may be integrated into one processing unit, each unit may exist physically on its own, or two or more units may be integrated into one unit.
If the functions are implemented in the form of software functional units and sold or used as an independent product, they may be stored in a processor-executable non-volatile computer-readable storage medium. Based on this understanding, the part of the technical solution of the present application that in essence contributes to the prior art, or a part of the technical solution, may be embodied in the form of a software product. The computer software product is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to perform all or some of the steps of the methods described in the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.
Finally, it should be noted that the embodiments described above are merely specific implementations of the present application, intended to illustrate its technical solutions rather than to limit them, and the scope of protection of the present application is not limited thereto. Although the present application has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art should understand that anyone familiar with the technical field may, within the technical scope disclosed in the present application, modify the technical solutions recorded in the foregoing embodiments, readily conceive of variations, or make equivalent substitutions for some of the technical features; such modifications, variations, or substitutions do not cause the essence of the corresponding technical solutions to depart from the spirit and scope of the technical solutions of the embodiments of the present application, and shall all be covered by the scope of protection of the present application. Therefore, the scope of protection of the present application shall be subject to the scope of protection of the claims.
Industrial applicability
The laser sensing system, laser sensing method, computer-readable storage medium, and electronic device provided by the embodiments of the present application receive the first laser signal emitted by the handheld laser emitter according to a trigger instruction sent by the user, and the plurality of second laser signals sent by the wearable laser emitters at preset time intervals; determine, on the display screen of the terminal device, the first laser irradiation point corresponding to the first laser signal and the second laser irradiation point corresponding to each second laser signal respectively; determine the first laser irradiation point as the shooting position of the virtual character; determine, based on the plurality of second laser irradiation points and the motion images of the user captured by the camera device, the target movement posture and movement direction of the virtual character; and control the virtual character to move in the movement direction in accordance with the target movement posture. In this way, the user can adjust the shooting position, shooting direction, and shooting posture of the corresponding virtual character according to his or her own shooting habits, which increases the user's shooting hit rate, lowers the difficulty of the shooting operation, and helps improve the user experience.

Claims (17)

  1. A laser sensing system, wherein the laser sensing system comprises a handheld laser emitter, a plurality of wearable laser emitters, a camera device, and a terminal device:
    the handheld laser emitter is configured to emit a first laser signal to the terminal device according to a trigger instruction sent by a user;
    the plurality of wearable laser emitters are worn on different parts of the user's body respectively and configured to emit a plurality of second laser signals to the terminal device at preset time intervals;
    the camera device is configured to capture motion images of the user at preset time intervals and send the captured motion images to the terminal device;
    the terminal device is configured to display a virtual character corresponding to the user and a scene corresponding to the virtual character, determine a first laser irradiation point corresponding to the first laser signal and a second laser irradiation point corresponding to each of the plurality of second laser signals respectively, determine the first laser irradiation point as a shooting position of the virtual character, determine, based on each second laser irradiation point and the motion images, a target movement posture and a movement direction of the virtual character, and control the virtual character to move in the movement direction in accordance with the target movement posture.
  2. The laser sensing system according to claim 1, wherein the terminal device comprises a signal receiving module, a temperature identification module, and a signal identification module:
    the signal receiving module is configured to receive the first laser signal emitted by the handheld laser emitter and the second laser signal emitted by each of the plurality of wearable laser emitters;
    the temperature identification module is configured to determine, according to the temperature difference between irradiated points and non-irradiated areas on the display screen of the terminal device, a plurality of to-be-identified laser irradiation points on the display screen corresponding to the first laser signal and the plurality of second laser signals;
    the signal identification module is configured to identify, according to the irradiation frequencies of the plurality of to-be-identified laser irradiation points, the first laser irradiation point corresponding to the handheld laser emitter and the second laser irradiation point corresponding to each wearable laser emitter among the plurality of to-be-identified laser irradiation points, and determine the first laser irradiation point as the shooting position of the virtual character.
  3. The laser sensing system according to claim 1, wherein the terminal device further comprises a position identification module, an action identification module, and a function control module:
    the position identification module is configured to receive the motion images sent by the camera device and determine the movement direction of the user based on at least two received motion images;
    the action identification module is configured to determine the target movement posture of the virtual character based on each second laser irradiation point and the received motion images;
    the function control module is configured to control the virtual character to move in the movement direction in accordance with the target movement posture.
  4. The laser sensing system according to claim 3, wherein the position identification module is specifically configured to receive at least two of the motion images sent by the camera device, identify the region in which the user appears in each motion image and the capture time of each motion image, and determine the movement direction of the user, wherein the at least two motion images have different capture times.
  5. The laser sensing system according to claim 3, wherein the action identification module comprises a part determination unit, a movement posture determination unit, and a posture conversion unit:
    the part determination unit is configured to determine, for each second laser irradiation point, the wearable laser emitter corresponding to that second laser irradiation point and the body part on which that wearable laser emitter is worn;
    the movement posture determination unit is configured to determine an initial movement posture of the user from the motion images and adjust the initial movement posture according to each wearing part;
    the posture conversion unit is configured to perform a coordinate transformation on the adjusted initial movement posture and convert the initial movement posture into the target movement posture of the virtual character.
  6. The laser sensing system according to any one of claims 1-5, wherein the terminal device further comprises a posture correction module:
    the posture correction module is configured to determine, for each second laser irradiation point, whether the second laser irradiation point lies within its corresponding preset standard range, and if the second laser irradiation point does not lie within the preset standard range, display that second laser irradiation point and its corresponding preset standard range on the display screen of the terminal device.
  7. The laser sensing system according to any one of claims 1-6, wherein the plurality of wearable laser emitters being worn on different parts of the user's body respectively comprises:
    determining the wearing parts of the plurality of wearable laser emitters respectively according to position information sent by the plurality of wearable laser emitters.
  8. The laser sensing system according to any one of claims 1-7, wherein the display screen of the terminal device is covered with a transparent laser-sensing film, the transparent laser-sensing film being configured to sense the first laser signal and each second laser signal.
  9. A laser sensing method, applied to the terminal device according to any one of claims 1 to 8, the laser sensing method comprising:
    after receiving a first laser signal emitted by a handheld laser emitter and a second laser signal emitted by each wearable laser emitter, determining a first laser irradiation point corresponding to the first laser signal and a second laser irradiation point corresponding to each second laser signal respectively;
    determining the first laser irradiation point as a shooting position of a virtual character;
    determining, based on each second laser irradiation point and received motion images of a user sent by a camera device, a target movement posture and a movement direction of the virtual character displayed by the terminal device; and
    controlling the virtual character to move in the movement direction in accordance with the target movement posture.
  10. The laser sensing method according to claim 9, wherein after receiving the first laser signal emitted by the handheld laser emitter and the second laser signal emitted by each wearable laser emitter, determining the first laser irradiation point corresponding to the first laser signal and the second laser irradiation point corresponding to each second laser signal respectively comprises:
    after receiving the first laser signal emitted by the handheld laser emitter and the second laser signal emitted by each wearable laser emitter, determining, according to the temperature difference between irradiated points and non-irradiated areas on the display screen of the terminal device, a plurality of to-be-identified laser irradiation points on the display screen corresponding to the first laser signal and to each second laser signal; and
    identifying, according to the irradiation frequencies of the plurality of to-be-identified laser irradiation points, the first laser irradiation point corresponding to the handheld laser emitter and the second laser irradiation point corresponding to each wearable laser emitter among the plurality of to-be-identified laser irradiation points.
  11. The laser sensing method according to claim 9, wherein determining, based on each second laser irradiation point and the received motion images of the user sent by the camera device, the target movement posture and movement direction of the virtual character displayed by the terminal device comprises:
    receiving the motion images sent by the camera device and determining the movement direction of the user based on at least two received motion images; and
    determining the target movement posture of the virtual character based on each second laser irradiation point and the received motion images.
  12. The laser sensing method according to claim 11, wherein receiving the motion images sent by the camera device and determining the movement direction of the user based on at least two received motion images comprises:
    receiving at least two of the motion images sent by the camera device, identifying the region in which the user appears in each motion image and the capture time of each motion image, and determining the movement direction of the user, wherein the at least two motion images have different capture times.
  13. The laser sensing method according to claim 11, wherein determining the target movement posture of the virtual character based on each second laser irradiation point and the received motion images comprises:
    for each second laser irradiation point, determining the wearable laser emitter corresponding to that second laser irradiation point and the body part on which that wearable laser emitter is worn;
    determining an initial movement posture of the user from the motion images and adjusting the initial movement posture based on each wearing part; and
    performing a coordinate transformation on the adjusted initial movement posture and converting the initial movement posture into the target movement posture of the virtual character.
  14. The laser sensing method according to claim 9, wherein after controlling the virtual character to move in the movement direction in accordance with the target movement posture, the laser sensing method further comprises:
    for each second laser irradiation point, determining whether the second laser irradiation point lies within its corresponding preset standard range; and
    if the second laser irradiation point does not lie within the preset standard range, displaying that second laser irradiation point and its corresponding preset standard range on the display screen of the terminal device.
  15. The laser sensing method according to claim 9, wherein before, after receiving the first laser signal emitted by the handheld laser emitter and the second laser signal emitted by each wearable laser emitter, determining the first laser irradiation point corresponding to the first laser signal and the second laser irradiation point corresponding to each second laser signal respectively, the method further comprises:
    determining the wearing parts of the plurality of wearable laser emitters respectively according to position information sent by the plurality of wearable laser emitters.
  16. A computer-readable storage medium having a computer program stored thereon, wherein the computer program, when read and run by a processor, implements the method according to any one of claims 9 to 15.
  17. An electronic device, comprising a processor and a memory, the memory storing a computer program executable on the processor, wherein the processor, when executing the computer program, implements the steps of the method according to any one of claims 9 to 15.
PCT/CN2020/125538 2020-04-02 2020-10-30 Laser sensing system and method, computer-readable storage medium, and electronic device WO2021196584A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202010256682.8A CN111589099A (zh) 2020-04-02 2020-04-02 Laser sensing system and laser sensing method
CN202010256682.8 2020-04-02

Publications (1)

Publication Number Publication Date
WO2021196584A1 true WO2021196584A1 (zh) 2021-10-07

Family

ID=72187374

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/125538 WO2021196584A1 (zh) 2020-04-02 2020-10-30 Laser sensing system and method, computer-readable storage medium, and electronic device

Country Status (2)

Country Link
CN (1) CN111589099A (zh)
WO (1) WO2021196584A1 (zh)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111589099A (zh) * 2020-04-02 2020-08-28 深圳创维-Rgb电子有限公司 一种激光感应系统及激光感应方法

Citations (5)

Publication number Priority date Publication date Assignee Title
US20150348330A1 (en) * 2012-10-31 2015-12-03 Sulon Technologies Inc. Dynamic environment and location based augmented reality (ar) systems
CN205598614U (zh) * 2016-03-11 2016-09-28 广州虚域网络科技有限公司 Laser tracking and positioning wearable game equipment system
CN106710348A (zh) * 2016-12-20 2017-05-24 江苏前景信息科技有限公司 Civil air defense interactive experience method and system
CN108398049A (zh) * 2018-04-28 2018-08-14 上海亿湾特训练设备科技有限公司 Networked multiplayer projection confrontation shooting training system
CN111589099A (zh) * 2020-04-02 2020-08-28 深圳创维-Rgb电子有限公司 Laser sensing system and laser sensing method

Family Cites Families (5)

Publication number Priority date Publication date Assignee Title
CN101655739B (zh) * 2008-08-22 2012-07-04 原创奈米科技股份有限公司 Three-dimensional virtual input and simulation device
JP2014169797A (ja) * 2013-03-01 2014-09-18 Hitachi Kokusai Electric Inc Shooting training system
CN108885487B (zh) * 2016-02-29 2021-01-29 华为技术有限公司 Gesture control method for a wearable system and wearable system
KR101938257B1 (ko) * 2017-04-10 2019-04-11 주식회사 제이콥시스템 Video shooting training system
CN110793393B (zh) * 2019-10-10 2022-06-14 成都贝瑞光电科技股份有限公司 Laser multi-shooting-point identification system


Also Published As

Publication number Publication date
CN111589099A (zh) 2020-08-28

Similar Documents

Publication Publication Date Title
US11347311B2 (en) Systems and methods for providing haptic feedback for remote interactions
US11262841B2 (en) Wireless wrist computing and control device and method for 3D imaging, mapping, networking and interfacing
US10636212B2 (en) Method for generating image to be displayed on head tracking type virtual reality head mounted display and image generation device
JP2020091904A (ja) システムおよびコントローラ
US9891435B2 (en) Apparatus, systems and methods for providing motion tracking using a personal viewing device
CN108572728B (zh) 信息处理设备、信息处理方法和程序
CN104364733A (zh) 注视位置检测装置、注视位置检测方法和注视位置检测程序
US20180232051A1 (en) Automatic localized haptics generation system
US11158101B2 (en) Information processing system, information processing device, server device, image providing method and image generation method
JP2014531588A (ja) スポーツアイテムのユーザ依存の状態を検出するシステム及び方法
CN111930223A (zh) 用于观看计算机生成的环境并与其互动的可移动显示器
US20210004132A1 (en) Information processing device, information processing method, and program
US20210004981A1 (en) Methods and Devices for Electronically Altering Captured Images
KR101811809B1 (ko) 3d hmd를 활용한 아케이드 게임시스템
US20160096075A1 (en) Modification of an exercise plan
WO2021196584A1 (zh) 激光感应系统及方法、计算机可读存储介质、电子设备
CN110520822B (zh) 控制装置、信息处理系统、控制方法和程序
JP2018092416A (ja) 情報処理方法、装置、および当該情報処理方法をコンピュータに実行させるプログラム
EP3943167A1 (en) Device provided with plurality of markers
JP2012196286A (ja) ゲーム装置、ゲーム装置の制御方法、及びプログラム
KR101360888B1 (ko) 오프라인 연동형 가상현실을 제공하는 휴대용 통신 단말기 및 이를 이용한 원격 게임방법
WO2019111027A1 (en) Method for creating virtual or augmented reality and system for creating virtual or augmented reality
US20020118163A1 (en) System for interacting of a user with an electronic system image
KR20180106572A (ko) 가상현실 제공장치 및 그 방법
Baqai et al. Kinect as a Generalised Interface for Games and PC Control

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20928569

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 16-02-2023)

122 Ep: pct application non-entry in european phase

Ref document number: 20928569

Country of ref document: EP

Kind code of ref document: A1