WO2022048428A1 - Control method and apparatus for target object, electronic device and storage medium - Google Patents

Control method and apparatus for target object, electronic device and storage medium

Info

Publication number
WO2022048428A1
WO2022048428A1 PCT/CN2021/112693 CN2021112693W WO2022048428A1 WO 2022048428 A1 WO2022048428 A1 WO 2022048428A1 CN 2021112693 W CN2021112693 W CN 2021112693W WO 2022048428 A1 WO2022048428 A1 WO 2022048428A1
Authority
WO
WIPO (PCT)
Prior art keywords
control
target object
virtual
real
computer
Prior art date
Application number
PCT/CN2021/112693
Other languages
English (en)
French (fr)
Inventor
张嘉益
Original Assignee
北京字节跳动网络技术有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 北京字节跳动网络技术有限公司
Priority to EP21863495.4A (EP4086732A4)
Publication of WO2022048428A1
Priority to US17/882,443 (US11869195B2)

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/11Region-based segmentation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/25Output arrangements for video game devices
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/25Output arrangements for video game devices
    • A63F13/28Output arrangements for video game devices responding to control signals received from the game device for affecting ambient conditions, e.g. for vibrating players' seats, activating scent dispensers or affecting temperature or light
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/40Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F13/42Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A63F13/426Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle involving on-screen location information, e.g. screen coordinates of an area at which the player is aiming with a light gun
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/60Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor
    • A63F13/65Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor automatically by game devices or servers from real world data, e.g. measurement in live racing competition
    • A63F13/655Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor automatically by game devices or servers from real world data, e.g. measurement in live racing competition by importing photos, e.g. of the player
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/90Constructional details or arrangements of video game devices not provided for in groups A63F13/20 or A63F13/25, e.g. housing, wiring, connections or cabinets
    • A63F13/92Video game devices specially adapted to be hand-held while playing
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/30Interconnection arrangements between game servers and game devices; Interconnection arrangements between game devices; Interconnection arrangements between game servers
    • A63F13/35Details of game servers
    • A63F13/352Details of game servers involving special game server arrangements, e.g. regional servers connected to a national server or a plurality of servers managing partitions of the game world
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/80Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game specially adapted for executing a specific type of game
    • A63F2300/8082Virtual reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/01Indexing scheme relating to G06F3/01
    • G06F2203/012Walk-in-place systems for allowing a user to walk in a virtual environment while constraining him to a given position in the physical environment
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10016Video; Image sequence

Definitions

  • the embodiments of the present disclosure relate to the field of computers, and in particular, to a control method, an apparatus, an electronic device, and a storage medium for a target object.
  • Augmented Reality (AR) technology is a technology that skillfully integrates virtual information with the real world. With the development of AR technology, it is possible to control the movement of the target object in the AR real-life image.
  • the user can control the target object in the AR real-life image through the AR real-life image provided by the terminal, so that the target object in the real-life image can move under the user's control.
  • embodiments of the present disclosure provide a control method, apparatus, electronic device, and storage medium for a target object.
  • In a first aspect, an embodiment of the present disclosure provides a method for controlling a target object, including: in response to a movement control operation triggered on a target object in a real-scene image, determining a control direction corresponding to the movement control operation and a shooting direction of the real-scene image; and
  • controlling the target object to move according to the control direction and the shooting direction.
  • an embodiment of the present disclosure provides a control device for a target object, comprising:
  • a processing module configured to determine the control direction corresponding to the movement control operation and the shooting direction of the real-life image in response to the movement control operation triggered by the target object in the real-life image;
  • the control module is configured to control the target object to move according to the control direction and the shooting direction.
  • embodiments of the present disclosure provide an electronic device, including: at least one processor and a memory;
  • the memory stores computer-executable instructions
  • the at least one processor executes the computer-executable instructions stored in the memory, so that the at least one processor performs the method for controlling a target object according to the first aspect and its various possible designs as described above.
  • embodiments of the present disclosure provide a computer-readable storage medium, where computer-executable instructions are stored in the computer-readable storage medium, and when a processor executes the computer-executable instructions, the method for controlling a target object according to the first aspect and its various possible designs is implemented.
  • embodiments of the present disclosure further provide a computer program product, including computer program instructions, where the computer program instructions cause a computer to perform the method for controlling a target object according to the first aspect and its various possible designs.
  • embodiments of the present disclosure further provide a computer program, which, when run on a computer, causes the computer to perform the method for controlling a target object according to the first aspect and its various possible designs.
  • According to the control method and apparatus for a target object, the electronic device, and the storage medium provided by the embodiments of the present disclosure, in response to a movement control operation triggered on a target object in a real-scene image, a control direction corresponding to the movement control operation is determined; a shooting direction of the real-scene image is acquired; and the target object is controlled to move in the real-scene image according to the control direction and the shooting direction. The control method provided in this embodiment can effectively solve the problem in the prior art that, when the shooting direction of the real-scene image changes, controlling the target object to move in the real-scene image causes a deviation in the moving direction; it can also effectively improve the operability of the target object in the real-scene image and bring a better control experience to the user.
  • FIG. 1 is a schematic diagram of a control interface of an existing AR game scene
  • FIG. 2 is a schematic diagram of a network architecture on which the disclosure is based;
  • FIG. 3 is a schematic diagram of a scene on which target object control is based in an embodiment of the present disclosure
  • FIG. 4 is a schematic diagram of another network architecture on which the disclosure is based;
  • FIG. 5 is a schematic diagram of a scene on which a remotely controlled target object control scenario is based;
  • FIG. 6 is a schematic flowchart of a method for controlling a target object according to an embodiment of the present disclosure
  • FIG. 7 is a schematic diagram of a first interface for obtaining a control direction according to an embodiment of the present application.
  • FIG. 8 is a schematic diagram of a second interface for obtaining a control direction according to an embodiment of the present application.
  • FIG. 9a is a schematic diagram of a terminal perspective for fusion in a first direction provided by an embodiment of the present application.
  • FIG. 9b is a top view of a scene fused in a first direction provided by an embodiment of the present application.
  • FIG. 9c is a schematic diagram of a terminal perspective of fusion in a second direction provided by an embodiment of the present application.
  • FIG. 9d is a top view of a scene fused in a second direction provided by an embodiment of the present application.
  • FIG. 10 is a structural block diagram of an apparatus for controlling a target object provided by an embodiment of the present disclosure
  • FIG. 11 is a schematic diagram of a hardware structure of an electronic device provided by an embodiment of the present disclosure.
  • Augmented Reality (AR) technology is a technology that skillfully integrates virtual information with the real world. With the development of AR technology, it is possible to control the movement of the target object in the AR real-life image.
  • the user can control the target object in the AR real-life image through the AR real-life image provided by the terminal, so that the target object in the real-life image can move under the user's control.
  • the shooting device can provide the terminal with a real-life image obtained by shooting, and the user can see the target object to be controlled in the real-life image through the terminal, and use the terminal to initiate a control command to control the target object to move in the real-life image.
  • the target object may be a virtual target object or a solid target object, which will not be limited in the present disclosure.
  • Figure 1 is a schematic diagram of the control interface of an existing AR game scene.
  • Through the real-scene image provided by the terminal, the user can see, in the real-scene image, the game target object representing himself or herself as well as game target objects representing others. Through the virtual controls provided by the terminal, the user can control his or her own game target object to move, attack, defend, jump, rotate, and perform other actions.
  • In the above movement control process, the direction control of the target object is implemented based on the real geographic coordinate system; that is, a control instruction issued by the terminal generally causes the game target object to move along the cardinal directions (east, south, west, and north). In other words, the user needs to first determine the moving direction of the target object, such as "eastward", according to the position of the target object in the real-scene image, and then operate the virtual control on the terminal to initiate a control instruction for moving the target object "eastward".
  • the user can also perform movement control on some virtual objects displayed in the real scene.
  • virtual objects such as virtual clouds can be added in the obtained real scene of the streetscape, and when adding these virtual objects, the user can control the virtual objects to move to determine their adding positions.
  • the problem of direction deviation caused by the mismatch between the control direction and the moving direction as described above also occurs during the movement.
  • embodiments of the present disclosure provide the following approach: when responding to a movement control operation triggered on a target object in a real-scene image, the control direction corresponding to the movement control operation is determined on the one hand, and the shooting direction of the real-scene image is determined on the other hand; the moving direction of the target object in the real-scene image is then determined according to the control direction and the shooting direction, and the target object is controlled to move in the real-scene image based on the determined moving direction.
  • Such a control method effectively solves the problem in the prior art that a direction deviation occurs in controlling the target object when the shooting direction changes, and can also effectively improve the operability of the target object in the real-scene image, bringing a better control experience to the user.
  • FIG. 2 is a schematic diagram of a network architecture on which the present disclosure is based.
  • the network architecture shown in FIG. 2 may specifically include a terminal 1 , a control device 2 of a target object, and a server 3 .
  • the terminal 1 may be a hardware device, such as a user's mobile phone, a smart home device, or a tablet computer, that can be used to capture the real scene and display the captured real scene;
  • the control device 2 of the target object may be a client or presentation end integrated in or installed on the terminal 1.
  • the server 3 may be a server or a server cluster including a game platform set in the cloud.
  • the control device 2 of the target object can run on the terminal 1 and provide the terminal 1 with an operation page, and the terminal 1 uses its display or display component to display the operation page provided by the control device 2 of the target object to the user.
  • control device 2 of the target object can also use the network components of the terminal 1 to interact with the server 3 to obtain the shape information, position information, scene information of the target object and even some other information resources of the target object in the server 3.
  • FIG. 3 is a schematic diagram of a scene on which the target object control in an embodiment of the present disclosure is based.
  • user A and user B can each turn on the shooting component or camera of the terminal and shoot the same scene identifier (a two-dimensional code as shown in FIG. 3, a specific item, or the like), so as to establish, in this way, the communication connection between the respective terminals of users A and B and the server 3.
  • the server 3 will synchronously deliver the data including the virtual scene and the virtual game characters of both parties to the control device 2 of the target object of the two parties, so that the terminals 1 of the two parties construct the game scene on the basis of the real scene image, and The virtual game characters of both parties are displayed in the game scene.
  • Through the control method for a target object provided by the present application, user A can control virtual game character A through the terminal, user B can control virtual game character B through the terminal, and the control results of both parties are displayed synchronously on the terminals of both parties.
  • FIG. 3 is only one of the achievable application scenarios of the architecture shown in FIG. 2 .
  • FIG. 3 and the embodiments of the present disclosure can also be applied to other scenarios, such as a game character treasure hunting scenario based on an AR scenario, Another example is the AR-based game character social scene and so on.
  • the architecture shown in FIG. 2 can also be applied to the aforementioned virtual and augmented reality display scenarios, in which the user can perform movement control on some virtual objects displayed in the real scene.
  • For example, in an obtained real scene of a streetscape, the user can, through the method provided by the present disclosure, perform movement control on added virtual objects such as virtual clouds to determine their placement.
  • the specific interaction manner is similar to that of the aforementioned scenario and is not repeated here.
  • FIG. 4 is a schematic diagram of another network architecture on which the present disclosure is based, and the network architecture shown in FIG. 4 may specifically include a terminal 1, a control device 2 of a target object, a server 3, and a photographing system 4.
  • Different from the architecture shown in FIG. 2, on the basis of the terminal 1, the control device 2 of the target object, and the server 3, the architecture shown in FIG. 4 further includes a photographing system 4.
  • the terminal 1 may be a hardware device, such as a user's mobile phone, a smart home device, or a tablet computer, that can be used to display the captured real scene, and the control device 2 of the target object may be a client or presentation end integrated in or installed on the terminal 1;
  • the server 3 may be a server or a server cluster including a game platform provided in the cloud.
  • the control device 2 of the target object can run on the terminal 1 and provide the terminal 1 with an operation page, and the terminal 1 uses its display or display component to display the operation page provided by the control device 2 of the target object to the user.
  • control device 2 of the target object can also use the network components of the terminal 1 to interact with the server 3 to obtain the shape information, location information, scene information of the game character in the server 3, and even some other information resources.
  • the photographing system 4 may be composed of multiple photographing devices arranged in the same photographing area, and the multiple photographing devices will photograph the photographing area at different photographing angles.
  • the real image obtained by the photographing will be transmitted to the server 3 and forwarded to the control device 2 of the target object through the server 3 for presentation on the terminal 1 .
  • In particular, the architecture shown in FIG. 4 is applicable to a remotely controlled robot scenario;
  • FIG. 5 is a schematic diagram of a scenario based on which the remotely controlled target object control scenario is based.
  • the shooting system will shoot the scene from multiple shooting angles to send the live image including the robot to the server.
  • the terminals of user A and user B located at different locations communicate with the server at the same time to obtain multi-angle real-world images captured by the capturing system.
  • user A can remotely control robot A through the terminal
  • user B can remotely control robot B through the terminal.
  • During the control process, the control device of the target object integrated on the terminal uses the control method provided in the present application to determine the moving direction of the robot based on the angle of the real-scene image captured by the shooting system and the control direction in the remote control triggered by the user;
  • robot A and robot B then perform corresponding actions according to the remote control commands they respectively receive.
  • Finally, the execution of the remote control instructions by robot A and robot B is synchronously presented to the terminals via the real-scene images captured by the shooting system, for user A and user B to view.
  • the scenario shown in FIG. 5 is only one of the achievable application scenarios of the architecture shown in FIG. 4; the architecture may also be applied in other scenarios, such as a remotely controlled engineering robot scenario, and so on.
  • FIG. 6 is a schematic flowchart of a method for controlling a target object according to an embodiment of the present disclosure.
  • the control method of the target object provided by the embodiment of the present disclosure includes:
  • Step 101: in response to a movement control operation triggered on the target object in the real-scene image, determine a control direction corresponding to the movement control operation;
  • Step 102: acquire a shooting direction of the real-scene image;
  • Step 103: control the target object to move in the real-scene image according to the control direction and the shooting direction.
  • the execution body of the processing method provided in this example is the aforementioned control device of the target object.
  • it specifically refers to a client or presentation end that can be installed or integrated on a terminal.
  • the control device of the target object can be presented to the user by means of an application interface or a control display interface.
  • the user can interact with the server in the cloud through the terminal to control the target object in the scene.
  • a real image will be presented through the control device of the target object, and virtual controls for controlling the movement of the target object (the components shown in the lower right corner of the terminal interface as shown in Figure 3 or Figure 5) are provided synchronously.
  • the user can control the virtual control through different touch methods such as clicking, sliding, pressing, etc., based on needs, so as to trigger the movement control operation of the target object in the real-life image.
  • the control device of the target object determines the control direction corresponding to the movement control operation and the shooting direction of the real-scene image, and then controls the target object to move according to the control direction and the shooting direction.
  • the above-mentioned shooting direction is used to represent the orientation of the shooting device that shoots the real image in the geographic coordinate system; and the control direction is used to represent the operation direction of the movement control operation in the image orthogonal coordinate system.
  • the specific implementation of controlling the target object to move can be based on direction fusion processing; that is, direction fusion processing is performed on the control direction and the shooting direction to obtain the moving direction of the target object, and the target object is controlled to move along the moving direction.
  • When performing the direction fusion processing, it can be realized based on alpha blending technology: the direction angle of the control direction and the direction angle of the shooting direction are subtracted to obtain the direction angle of the moving direction, and the target object is then controlled to move along the moving direction based on that direction angle.
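  • For illustration only, the following is a minimal sketch of such a direction fusion step, assuming angles measured in degrees, a control direction expressed relative to the "up" axis of the screen, and a compass-style yaw for the shooting direction; the function names and the sign convention are assumptions made for the example rather than part of the disclosure:

```python
import math

def fuse_directions(control_angle_deg: float, shooting_yaw_deg: float) -> float:
    """Combine the control direction (screen coordinates) with the shooting
    direction (geographic coordinates) to obtain the moving direction.

    The disclosure describes obtaining the moving-direction angle from the
    difference of the two direction angles; whether the offset is added or
    subtracted depends on how each angle is measured, so the sign here is
    an assumption.
    """
    return (shooting_yaw_deg - control_angle_deg) % 360.0

def move_step(x: float, y: float, moving_angle_deg: float, speed: float):
    """Advance the target object one step along the fused moving direction."""
    rad = math.radians(moving_angle_deg)
    # Assumes a ground plane where 0 degrees points along +y (north) and
    # angles grow clockwise, matching a compass-style yaw.
    return x + speed * math.sin(rad), y + speed * math.cos(rad)

# Example: joystick pushed straight "up" (0 deg) while the camera faces 90 deg.
angle = fuse_directions(0.0, 90.0)   # -> 90.0, i.e. the object moves due east
print(move_step(0.0, 0.0, angle, 1.0))
```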
  • the control direction is defined with respect to the terminal itself; that is, the direction angle of the control direction is expressed in the image orthogonal coordinate system and represents the operation direction in that coordinate system.
  • the shooting direction refers to the pointing of the shooting device that shoots the real image in the geographic coordinate system.
  • the direction angle of the shooting direction can be the pointing of the shooting device on the horizontal plane based on the geographic coordinate system.
  • the shooting direction can be obtained from sensor information; that is, a pose sensor provided on the photographing device/terminal is used to determine the pose state of the photographing device/terminal, from which the shooting direction is obtained. That is, determining the shooting direction of the real-scene image includes: acquiring a pose state of the shooting device that shoots the real-scene image; and determining the shooting direction according to the pose state.
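  • As an illustration of deriving the shooting direction from a pose state, the sketch below computes a horizontal heading (yaw) from an orientation quaternion; that the pose state is available as a unit quaternion from the terminal's attitude sensor, and the frame convention used, are assumptions for this example:

```python
import math

def yaw_from_quaternion(w: float, x: float, y: float, z: float) -> float:
    """Extract the heading (rotation about the vertical axis) in degrees
    from a unit quaternion describing the device pose.

    Uses the standard yaw-pitch-roll (ZYX) decomposition; the result is the
    pointing of the shooting device projected onto the horizontal plane.
    """
    siny_cosp = 2.0 * (w * z + x * y)
    cosy_cosp = 1.0 - 2.0 * (y * y + z * z)
    return math.degrees(math.atan2(siny_cosp, cosy_cosp)) % 360.0

# Example: a quaternion for a 90-degree rotation about the vertical (z) axis.
print(yaw_from_quaternion(w=math.cos(math.pi / 4), x=0.0, y=0.0,
                          z=math.sin(math.pi / 4)))  # ~90.0
```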
  • the acquisition of the control direction can be realized through a virtual control.
  • a virtual control is also presented on the display interface, and the virtual control is used to receive the movement control operation to control the movement of the target object.
  • the control direction may be determined based on the current form of the virtual control, where the form of the virtual control refers to the shape and state of the virtual control.
  • the current shape of the virtual control may be determined first, and then the control direction may be obtained according to the current shape of the virtual control.
  • two different virtual controls are provided, and the virtual controls can be arranged at the lower right of the interface such as in FIG. 3 or FIG. 5 to facilitate the user to operate to control the target object in the interface.
  • the virtual control is a virtual joystick
  • the virtual joystick includes a joystick disk and a joystick.
  • FIG. 7 is a schematic diagram of a first interface for obtaining a control direction provided by an embodiment of the present application.
  • the virtual control in the figure is a virtual joystick.
  • the relative positional relationship between the joystick and the joystick disk may be determined, and then the control direction is determined according to the relative positional relationship.
  • the relative positional relationship between the joystick and the joystick disk can be obtained in the following ways: for example, the center-point coordinates of the joystick and of the joystick disk are obtained respectively, and their relative positional relationship is determined by the vector difference between the two center-point coordinates (a sketch of this approach is given below);
  • for another example, the overlapping area of the joystick and the joystick disk is determined, the distance between the edge of the overlapping area and the edge of the area occupied by the virtual control is determined, and the relative positional relationship is then determined according to the distance.
  • it can also be obtained in other ways, which is not limited in this embodiment.
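  • As an illustration of the first approach above (the vector difference between center points), the following sketch reads a control direction from a virtual joystick; the screen coordinate convention, the dead-zone handling, and the angle reference ("up" on screen is 0 degrees) are assumptions made for the example:

```python
import math

def joystick_control_direction(disk_center, stick_center, dead_zone=4.0):
    """Return the control direction angle in degrees, measured clockwise
    from the screen's "up" axis, or None while the stick sits inside the
    dead zone around the disk center.

    Screen coordinates are assumed to have x growing rightward and y
    growing downward, as is typical for touch interfaces.
    """
    dx = stick_center[0] - disk_center[0]
    dy = stick_center[1] - disk_center[1]
    if math.hypot(dx, dy) < dead_zone:
        return None  # stick essentially centered: no movement requested
    # atan2(dx, -dy): dy is negated because screen y points down.
    return math.degrees(math.atan2(dx, -dy)) % 360.0

# Example: stick dragged straight to the right of the disk center.
print(joystick_control_direction((100, 100), (140, 100)))  # -> 90.0
```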
  • the virtual controls are virtual keys, and the virtual keys include at least two direction keys.
  • FIG. 8 is a schematic diagram of a second interface for obtaining a control direction provided by an embodiment of the present application.
  • the virtual controls in the figure are virtual buttons.
  • When determining the control direction, the triggered direction key can be determined first, and the control direction is then determined according to the direction corresponding to the triggered direction key.
  • the determination of the triggered direction key can be realized through the existing touch technology, which will not be described in detail in this application.
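  • For the virtual-key variant, a possible mapping from triggered direction keys to a control direction is sketched below; the key names and the handling of simultaneous presses (combining them into diagonals) are assumptions, since the disclosure only requires at least two direction keys:

```python
import math

# Assumed mapping from direction keys to control angles (degrees, clockwise
# from the screen's "up" axis); the actual keys provided may differ.
KEY_ANGLES = {"up": 0.0, "right": 90.0, "down": 180.0, "left": 270.0}

def keys_control_direction(pressed_keys):
    """Return a control direction for the currently triggered direction keys,
    or None if no relevant key is pressed.

    Simultaneous presses are combined as unit vectors, so that, for example,
    "up" together with "right" yields a 45-degree diagonal.
    """
    vx = vy = 0.0
    for key in pressed_keys:
        if key in KEY_ANGLES:
            rad = math.radians(KEY_ANGLES[key])
            vx += math.sin(rad)  # screen-right component
            vy += math.cos(rad)  # screen-up component
    if abs(vx) < 1e-9 and abs(vy) < 1e-9:
        return None
    return math.degrees(math.atan2(vx, vy)) % 360.0

print(keys_control_direction({"right"}))        # ~90.0
print(keys_control_direction({"up", "right"}))  # ~45.0
```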
  • FIG. 9a is a schematic diagram of a terminal perspective view of a first-direction fusion provided by an embodiment of the present application
  • FIG. 9b is a top view of a scene of a first-direction fusion provided by an embodiment of the present application
  • FIG. 9c is a schematic diagram of a terminal perspective of fusion in a second direction provided by an embodiment of the present application.
  • FIG. 9d is a top view of a scene fused in a second direction according to an embodiment of the present application.
  • As shown in FIGS. 9a-9d, the control direction a and the shooting direction b (b') can be determined respectively through the above method, the moving direction c (c') can be obtained through fusion, and the terminal controls the target object to move based on the moving direction c (c').
  • Figures 9a-9d show the process of controlling the target object "motorcycle" to move in the real scene through the terminal.
  • Fig. 9b shows the shooting direction b of the terminal and the moving direction c of the motorcycle
  • Fig. 9a shows the interface of the viewing angle of the terminal in the state of Fig. 9b.
  • FIG. 9c and FIG. 9d show the scene after the shooting direction b has changed while the control direction a remains unchanged.
  • Fig. 9d shows the shooting direction b' of the terminal and the moving direction c' of the motorcycle
  • Fig. 9c shows the interface of the viewing angle of the terminal in the state of Fig. 9d.
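  • As a worked illustration of the fusion in FIGS. 9a-9d (with angle values assumed for the example, not taken from the drawings): if the joystick is pushed straight up, the control direction a is 0 degrees in screen coordinates; with the shooting direction b at 90 degrees (due east), the fused moving direction c is due east in the geographic coordinate system. If the shooting direction then changes to b' at 180 degrees (due south) while a stays at 0 degrees, the fused direction c' becomes due south. The on-screen gesture remains "forward", and the motorcycle keeps moving away from the camera even though its geographic heading has changed.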
  • the moving speed of the target object can also be controlled during the moving process.
  • the control device of the target object can obtain the moving speed of the target object according to the current form of the virtual control, and control the target object to move based on the moving speed.
  • When determining the moving speed based on the current form of the virtual control, it can be realized based on the force with which the virtual control is pressed, or based on how fast the form of the virtual control changes, which will not be described in detail in this application.
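  • For illustration, one possible mapping from the joystick's current form to a moving speed is sketched below, using how far the joystick is displaced from the center of the joystick disk (normalized by the disk radius); press force or the rate of form change could equally be used, and the scaling constants are assumptions:

```python
import math

def joystick_speed(disk_center, stick_center, disk_radius,
                   max_speed=2.0, dead_zone_ratio=0.1):
    """Map the joystick displacement to a moving speed.

    Displacement is normalized against the disk radius and clamped to [0, 1];
    anything inside the dead zone yields zero speed.
    """
    dx = stick_center[0] - disk_center[0]
    dy = stick_center[1] - disk_center[1]
    ratio = min(math.hypot(dx, dy) / disk_radius, 1.0)
    if ratio < dead_zone_ratio:
        return 0.0
    return max_speed * ratio

print(joystick_speed((100, 100), (130, 100), disk_radius=60))  # -> 1.0
```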
  • According to the control method for a target object provided by this embodiment, in response to a movement control operation triggered on the target object in the real-scene image, the control direction corresponding to the movement control operation and the shooting direction of the real-scene image are determined, and the target object is controlled to move according to the control direction and the shooting direction. This embodiment determines the moving direction of the target object in the real-scene image according to the shooting direction of the real-scene image and the control direction corresponding to the movement control operation, thereby effectively solving the problem in the prior art of direction deviation in controlling the target object; it can also effectively improve the operability of the target object in the real-scene image and bring a better control experience to the user.
  • FIG. 10 is a structural block diagram of a control apparatus for a target object provided by an embodiment of the present disclosure. For convenience of explanation, only the parts related to the embodiments of the present disclosure are shown.
  • the control device for the target object includes: a processing module 10 and a control module 20 .
  • the processing module 10 is configured to respond to the movement control operation triggered by the target object in the real scene image, determine the control direction corresponding to the movement control operation, and obtain the shooting direction of the real scene image;
  • the control module 20 is configured to control the target object to move in the real scene image according to the control direction and the shooting direction.
  • the shooting direction is used to indicate the orientation of the shooting device that shoots the real-life image in a geographic coordinate system
  • the control direction is used to represent the operation direction of the movement control operation in the image orthogonal coordinate system.
  • the processing module 10 is specifically configured to: acquire the pose state of the shooting device that shoots the live image; and determine the shooting direction according to the pose state.
  • it also includes providing a virtual control, the virtual control is used to receive the movement control operation to control the target object to move;
  • the processing module 10 is specifically configured to determine the current form of the virtual control, and obtain the control direction according to the current form of the virtual control.
  • the virtual control includes a virtual joystick
  • the virtual joystick includes a joystick disk and a joystick located in the joystick disk.
  • the processing module 10 is specifically configured to: determine the relative positional relationship between the joystick and the joystick disk; and determine the control direction according to the relative positional relationship.
  • the virtual controls include virtual keys, and the virtual keys include at least two direction keys.
  • the processing module 10 is specifically configured to determine the triggered direction key; and determine the control direction according to the direction corresponding to the triggered direction key.
  • control module 20 is specifically configured to: perform direction fusion processing on the control direction and the shooting direction to obtain the moving direction of the target object; and control the target object to move along the moving direction.
  • processing module 10 is further configured to obtain the moving speed of the target object according to the current form of the virtual control
  • the control module 20 is specifically configured to control the target object to move based on the moving speed.
  • According to the apparatus for controlling a target object provided by this embodiment, in response to a movement control operation triggered on the target object in the real-scene image, the control direction corresponding to the movement control operation and the shooting direction of the real-scene image are determined, and the target object is controlled to move according to the control direction and the shooting direction. This embodiment determines the moving direction of the target object in the real-scene image according to the shooting direction of the real-scene image and the control direction corresponding to the movement control operation, thereby effectively solving the problem in the prior art of direction deviation in controlling the target object; it can also effectively improve the operability of the target object in the real-scene image and bring a better control experience to the user.
  • the electronic device provided in this embodiment can be used to implement the technical solutions of the foregoing method embodiments, and the implementation principles and technical effects thereof are similar, and details are not described herein again in this embodiment.
  • the electronic device 900 may be a terminal device or a server.
  • the terminal equipment may include, but is not limited to, such as mobile phones, notebook computers, digital broadcast receivers, personal digital assistants (Personal Digital Assistant, referred to as PDA), tablet computers (Portable Android Device, referred to as PAD), portable multimedia players (Portable Media Player, PMP for short), mobile terminals such as in-vehicle terminals (such as in-vehicle navigation terminals), etc., and fixed terminals such as digital TVs, desktop computers, and the like.
  • the electronic device shown in FIG. 11 is only an example, and should not impose any limitation on the function and scope of use of the embodiments of the present disclosure.
  • the electronic device 900 may include a processing device 901 (such as a central processing unit, a graphics processing unit, etc.) for executing the control method of the target object, which may execute various appropriate actions and processes according to a program stored in a read-only memory (ROM) 902 or a program loaded from a storage device 908 into a random access memory (RAM) 903.
  • the processing device 901, the ROM 902 and the RAM 903 are connected to each other through a bus 904.
  • An input/output (I/O) interface 905 is also connected to the bus 904 .
  • the following devices can be connected to the I/O interface 905: an input device 906 including, for example, a touch screen, a touch pad, a keyboard, a mouse, a camera, a microphone, an accelerometer, a gyroscope, etc.; an output device 907 including, for example, a liquid crystal display (LCD), a speaker, a vibrator, etc.; a storage device 908 including, for example, a magnetic tape, a hard disk, etc.; and a communication device 909.
  • the communication means 909 may allow the electronic device 900 to communicate wirelessly or by wire with other devices to exchange data. While FIG. 11 shows an electronic device 900 having various means, it should be understood that not all of the illustrated means are required to be implemented or provided. More or fewer devices may alternatively be implemented or provided.
  • an embodiment of the present disclosure includes a computer program product, which includes a computer program carried on a computer-readable medium, the computer program containing program code for executing the methods shown in the flowcharts according to the embodiments of the present disclosure.
  • the computer program may be downloaded and installed from the network via the communication device 909, or from the storage device 908, or from the ROM 902.
  • When the computer program is executed by the processing device 901, the above-mentioned functions defined in the methods of the embodiments of the present disclosure are executed.
  • the computer-readable medium mentioned above in the present disclosure may be a computer-readable signal medium or a computer-readable storage medium, or any combination of the above two.
  • the computer-readable storage medium can be, for example, but not limited to, an electrical, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus or device, or a combination of any of the above.
  • Computer-readable storage media may include, but are not limited to, an electrical connection with one or more wires, a portable computer disk, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the above.
  • a computer-readable storage medium may be any tangible medium that contains or stores a program that can be used by or in conjunction with an instruction execution system, apparatus, or device.
  • a computer-readable signal medium may include a data signal propagated in baseband or as part of a carrier wave, carrying computer-readable program code therein. Such propagated data signals may take a variety of forms, including but not limited to electromagnetic signals, optical signals, or any suitable combination of the foregoing.
  • a computer-readable signal medium can also be any computer-readable medium other than a computer-readable storage medium that can transmit, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device .
  • Program code embodied on a computer-readable medium may be transmitted using any suitable medium, including but not limited to: electrical wire, optical fiber cable, RF (radio frequency), etc., or any suitable combination of the foregoing.
  • the above-mentioned computer-readable medium may be included in the above-mentioned electronic device; or may exist alone without being assembled into the electronic device.
  • the aforementioned computer-readable medium carries one or more programs, and when the aforementioned one or more programs are executed by the electronic device, causes the electronic device to execute the methods shown in the foregoing embodiments.
  • the present disclosure also provides a computer program, the computer program enables a computer to execute the methods shown in the above embodiments.
  • Computer program code for carrying out operations of the present disclosure may be written in one or more programming languages, including object-oriented programming languages such as Java, Smalltalk, and C++, as well as conventional procedural programming languages such as the "C" language or similar programming languages.
  • the program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server.
  • the remote computer can be connected to the user's computer through any kind of network, including a local area network (LAN) or a wide area network (WAN), or can be connected to an external computer (for example, via the Internet using an Internet service provider).
  • each block in the flowchart or block diagrams may represent a module, a program segment, or a portion of code that contains one or more executable instructions for implementing the specified logical function(s).
  • the functions noted in the blocks may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved.
  • each block of the block diagrams and/or flowchart illustrations, and combinations of blocks in the block diagrams and/or flowchart illustrations can be implemented in dedicated hardware-based systems that perform the specified functions or operations , or can be implemented in a combination of dedicated hardware and computer instructions.
  • the units involved in the embodiments of the present disclosure may be implemented in a software manner, and may also be implemented in a hardware manner.
  • the name of the unit does not constitute a limitation of the unit itself under certain circumstances, for example, the first obtaining unit may also be described as "a unit that obtains at least two Internet Protocol addresses".
  • exemplary types of hardware logic components that may be used include: field-programmable gate arrays (FPGAs), application-specific integrated circuits (ASICs), application-specific standard products (ASSPs), systems on a chip (SOCs), complex programmable logic devices (CPLDs), and so on.
  • a machine-readable medium may be a tangible medium that may contain or store a program for use by or in connection with the instruction execution system, apparatus or device.
  • the machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium.
  • Machine-readable media may include, but are not limited to, electronic, magnetic, optical, electromagnetic, infrared, or semiconductor systems, devices, or devices, or any suitable combination of the foregoing.
  • machine-readable storage media would include one or more wire-based electrical connections, portable computer disks, hard disks, random access memory (RAM), read only memory (ROM), erasable programmable read only memory (EPROM or flash memory), fiber optics, compact disk read only memory (CD-ROM), optical storage, magnetic storage, or any suitable combination of the foregoing.
  • a method for controlling a target object includes: in response to a movement control operation triggered on a target object in a real-scene image, determining a control direction corresponding to the movement control operation; and acquiring a shooting direction of the real-scene image;
  • the target object is controlled to move in the real-scene image according to the control direction and the shooting direction.
  • the shooting direction is used to indicate the orientation of the shooting device that shoots the real-life image in a geographic coordinate system
  • the control direction is used to represent the operation direction of the movement control operation in the image orthogonal coordinate system.
  • acquiring the shooting direction of the real-scene image includes: acquiring a pose state of a shooting device that shoots the real-scene image;
  • the shooting direction is determined according to the pose state.
  • control method further comprises: providing a virtual control, the virtual control is used to receive the movement control operation to control the target object to move;
  • the determining of the control direction corresponding to the movement control operation includes:
  • the current shape of the virtual control is determined, and the control direction is obtained according to the current shape of the virtual control.
  • the virtual control includes a virtual joystick
  • the virtual joystick includes a joystick disk and a joystick located in the joystick disk.
  • the determining the current form of the virtual control, and obtaining the control direction according to the current form of the virtual control includes: determining the relative positional relationship between the joystick and the joystick disk;
  • the control direction is determined according to the relative positional relationship.
  • the virtual controls include virtual keys, and the virtual keys include at least two direction keys.
  • the determining the current form of the virtual control, and obtaining the control direction according to the current form of the virtual control includes: determining the triggered direction key;
  • the control direction is determined according to the direction corresponding to the triggered direction key.
  • controlling the target object to move according to the control direction and the shooting direction includes: performing direction fusion processing on the control direction and the shooting direction to obtain the moving direction of the target object;
  • the target object is controlled to move along the moving direction.
  • In an embodiment, the method further includes: obtaining the moving speed of the target object according to the current form of the virtual control;
  • controlling the target object to move further includes:
  • controlling the target object to move based on the moving speed.
  • a control device for a target object includes: a processing module and a control module.
  • a processing module configured to determine the control direction corresponding to the movement control operation in response to the movement control operation triggered by the target object in the real-life image, and obtain the shooting direction of the real-life image
  • the control module is configured to control the target object to move in the real scene image according to the control direction and the shooting direction.
  • the shooting direction is used to indicate the orientation of the shooting device that shoots the real-life image in a geographic coordinate system
  • the control direction is used to represent the operation direction of the movement control operation in the image orthogonal coordinate system.
  • the processing module is specifically configured to: acquire a pose state of a shooting device that shoots the live image; and determine the shooting direction according to the pose state.
  • it also includes providing a virtual control, the virtual control is used to receive the movement control operation to control the target object to move;
  • the processing module is specifically configured to determine the current form of the virtual control, and obtain the control direction according to the current form of the virtual control.
  • the virtual control includes a virtual joystick
  • the virtual joystick includes a joystick disk and a joystick located in the joystick disk.
  • the processing module is specifically configured to: determine the relative positional relationship between the joystick and the joystick disk; and determine the control direction according to the relative positional relationship.
  • the virtual controls include virtual keys, and the virtual keys include at least two direction keys.
  • the processing module is specifically configured to determine the triggered direction key; the control direction is determined according to the direction corresponding to the triggered direction key.
  • control module is specifically configured to: perform direction fusion processing on the control direction and the shooting direction to obtain the moving direction of the target object; and control the target object to move along the moving direction.
  • processing module is further configured to obtain the moving speed of the target object according to the current form of the virtual control
  • the control module is specifically configured to control the target object to move based on the moving speed.
  • an electronic device includes: at least one processor and a memory;
  • the memory stores computer-executable instructions
  • the at least one processor executes the computer-executable instructions stored in the memory, so that the at least one processor executes the method for controlling a target object as described in any preceding item.
  • a computer-readable storage medium stores computer-executable instructions, and when a processor executes the computer-executable instructions, the method for controlling a target object as described in any preceding item is implemented.
  • a computer program product comprising computer program instructions, the computer program instructions causing a computer to execute the method for controlling a target object as described in any preceding item.
  • a computer program that, when the computer program runs on a computer, causes the computer to execute the method for controlling a target object as described in any preceding item.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • User Interface Of Digital Computer (AREA)
  • Processing Or Creating Images (AREA)

Abstract

A control method and apparatus (2) for a target object, an electronic device, and a storage medium. In response to a movement control operation triggered on a target object in a real-scene image, a control direction corresponding to the movement control operation is determined (101); a shooting direction of the real-scene image is acquired (102); and the target object is controlled to move in the real-scene image according to the control direction and the shooting direction (103). The control method for a target object can effectively solve the problem in the prior art that, when the shooting direction of the real-scene image changes, controlling the target object to move in the real-scene image causes a deviation in the moving direction; it can also effectively improve the operability of the target object in the real-scene image and bring a better control experience to the user.

Description

Control method and apparatus for target object, electronic device and storage medium
CROSS-REFERENCE TO RELATED APPLICATIONS
This application claims priority to Chinese Patent Application No. 202010931355.8, filed with the Chinese Patent Office on September 7, 2020 and entitled "Control method and apparatus for target object, electronic device and storage medium", the entire contents of which are incorporated herein by reference.
TECHNICAL FIELD
Embodiments of the present disclosure relate to the field of computers, and in particular, to a control method and apparatus for a target object, an electronic device, and a storage medium.
BACKGROUND
Augmented reality (AR) technology is a technology that skillfully integrates virtual information with the real world. With the development of AR technology, it has become possible to control the movement of a target object in an AR real-scene image.
In the prior art, a user can control a target object in an AR real-scene image through the AR real-scene image provided by a terminal, so that the target object in the real-scene image can move under the user's control.
However, when the shooting angle of the AR real-scene image is not fixed, the user's movement control of the target object is prone to direction deviation, which seriously affects the user's control experience of the target object.
SUMMARY
In view of the above problem, embodiments of the present disclosure provide a control method and apparatus for a target object, an electronic device, and a storage medium.
In a first aspect, an embodiment of the present disclosure provides a method for controlling a target object, including:
in response to a movement control operation triggered on a target object in a real-scene image, determining a control direction corresponding to the movement control operation and a shooting direction of the real-scene image; and
controlling the target object to move according to the control direction and the shooting direction.
In a second aspect, an embodiment of the present disclosure provides a control apparatus for a target object, including:
a processing module configured to, in response to a movement control operation triggered on a target object in a real-scene image, determine a control direction corresponding to the movement control operation and a shooting direction of the real-scene image; and
a control module configured to control the target object to move according to the control direction and the shooting direction.
In a third aspect, an embodiment of the present disclosure provides an electronic device, including: at least one processor and a memory;
the memory stores computer-executable instructions; and
the at least one processor executes the computer-executable instructions stored in the memory, so that the at least one processor performs the method for controlling a target object according to the first aspect and its various possible designs.
In a fourth aspect, an embodiment of the present disclosure provides a computer-readable storage medium storing computer-executable instructions which, when executed by a processor, implement the method for controlling a target object according to the first aspect and its various possible designs.
In a fifth aspect, an embodiment of the present disclosure further provides a computer program product, including computer program instructions that cause a computer to perform the method for controlling a target object according to the first aspect and its various possible designs.
In a sixth aspect, an embodiment of the present disclosure further provides a computer program which, when run on a computer, causes the computer to perform the method for controlling a target object according to the first aspect and its various possible designs.
According to the control method and apparatus for a target object, the electronic device, and the storage medium provided by the embodiments of the present disclosure, in response to a movement control operation triggered on a target object in a real-scene image, a control direction corresponding to the movement control operation is determined; a shooting direction of the real-scene image is acquired; and the target object is controlled to move in the real-scene image according to the control direction and the shooting direction. The control method provided by the embodiments can effectively solve the problem in the prior art that, when the shooting direction of the real-scene image changes, controlling the target object to move in the real-scene image leads to a deviation in the moving direction; it can also effectively improve the operability of the target object in the real-scene image and bring a better control experience to the user.
BRIEF DESCRIPTION OF THE DRAWINGS
To describe the technical solutions in the embodiments of the present disclosure or the prior art more clearly, the accompanying drawings required for describing the embodiments or the prior art are briefly introduced below. Apparently, the accompanying drawings in the following description are merely some embodiments of the present disclosure, and a person of ordinary skill in the art may still derive other drawings from these accompanying drawings without creative efforts.
FIG. 1 is a schematic diagram of a control interface of an existing AR game scene;
FIG. 2 is a schematic diagram of a network architecture on which the present disclosure is based;
FIG. 3 is a schematic diagram of a scene on which target object control is based in an embodiment of the present disclosure;
FIG. 4 is a schematic diagram of another network architecture on which the present disclosure is based;
FIG. 5 is a schematic diagram of a scene on which a remotely controlled target object control scenario is based;
FIG. 6 is a schematic flowchart of a method for controlling a target object according to an embodiment of the present disclosure;
FIG. 7 is a schematic diagram of a first interface for obtaining a control direction according to an embodiment of the present application;
FIG. 8 is a schematic diagram of a second interface for obtaining a control direction according to an embodiment of the present application;
FIG. 9a is a schematic diagram of a terminal perspective of a first direction fusion according to an embodiment of the present application;
FIG. 9b is a top view of a scene of a first direction fusion according to an embodiment of the present application;
FIG. 9c is a schematic diagram of a terminal perspective of a second direction fusion according to an embodiment of the present application;
FIG. 9d is a top view of a scene of a second direction fusion according to an embodiment of the present application;
FIG. 10 is a structural block diagram of a control apparatus for a target object according to an embodiment of the present disclosure;
FIG. 11 is a schematic diagram of a hardware structure of an electronic device according to an embodiment of the present disclosure.
DETAILED DESCRIPTION
To make the objectives, technical solutions, and advantages of the embodiments of the present disclosure clearer, the technical solutions in the embodiments of the present disclosure are clearly and completely described below with reference to the accompanying drawings in the embodiments of the present disclosure. Apparently, the described embodiments are some rather than all of the embodiments of the present disclosure. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present disclosure without creative efforts shall fall within the protection scope of the present disclosure.
Augmented reality (AR) technology is a technology that skillfully integrates virtual information with the real world. With the development of AR technology, it has become possible to control the movement of a target object in an AR real-scene image.
In the prior art, a user can control a target object in an AR real-scene image through the AR real-scene image provided by a terminal, so that the target object in the real-scene image can move under the user's control.
Specifically, a shooting device can provide the terminal with a real-scene image obtained by shooting; through the terminal, the user can see the target object to be controlled in the real-scene image and use the terminal to initiate a control instruction to control the target object to move in the real-scene image. It should be noted that the target object may be a virtual target object or a physical target object, which is not limited in the present disclosure.
Taking a target object control scenario in an AR game as an example, FIG. 1 is a schematic diagram of a control interface of an existing AR game scene. Through the real-scene image provided by the terminal, the user can see, in the real-scene image, the game target object representing himself or herself as well as game target objects representing others. Through virtual controls provided by the terminal, the user can control his or her own game target object to move, attack, defend, jump, rotate, and perform other actions.
In the above movement control process, the direction control of the target object is implemented based on the real geographic coordinate system; that is, a control instruction issued by the terminal generally causes the game target object to move along the cardinal directions (east, south, west, and north). In other words, the user needs to first determine the moving direction of the target object, such as "eastward", according to the position of the target object in the real-scene image, and then operate the virtual control on the terminal to initiate a control instruction for controlling the target object to move "eastward".
However, as shown in FIG. 1, since the user observes and controls the movement of the game target object through the real-scene image, when the shooting angle of the real-scene image changes, the user cannot quickly distinguish the position of the game target object in the real-scene image and the corresponding moving direction. That is, what the user believes should be an "eastward" movement of the game target object in the real-scene image should in fact be a "northward" movement, so the initiated "eastward" movement instruction will clearly not match the actually required moving direction.
In other words, when the user controls the movement of the game target object through the terminal, a direction deviation occurs, which seriously affects the user's control experience of the game target object.
The above problem occurs not only in the scenario where a terminal controls a game target object to move, but also in some other application scenarios based on virtual and augmented reality display technologies:
For example, in some virtual and augmented reality display scenarios, the user can also perform movement control on some virtual objects displayed in the real scene. For instance, virtual objects such as virtual clouds can be added to an obtained real scene of a streetscape, and when adding these virtual objects, the user can control the virtual objects to move in order to determine their placement. However, during the movement, the aforementioned direction deviation caused by the mismatch between the control direction and the moving direction also occurs.
In view of such a problem, embodiments of the present disclosure provide the following approach: when responding to a movement control operation triggered on a target object in a real-scene image, the control direction corresponding to the movement control operation is determined on the one hand, and the shooting direction of the real-scene image is determined on the other hand; the moving direction of the target object in the real-scene image is then determined according to the control direction and the shooting direction, and the target object is controlled to move in the real-scene image based on the determined moving direction. Such a control method effectively solves the problem in the prior art that a direction deviation occurs in controlling the target object when the shooting direction changes, and can also effectively improve the operability of the target object in the real-scene image, bringing a better control experience to the user.
Referring to FIG. 2, FIG. 2 is a schematic diagram of a network architecture on which the present disclosure is based. The network architecture shown in FIG. 2 may specifically include a terminal 1, a control device 2 of a target object, and a server 3.
The terminal 1 may specifically be a hardware device, such as a user's mobile phone, a smart home device, or a tablet computer, that can be used to capture a real scene and display the captured real scene; the control device 2 of the target object may be a client or presentation end integrated in or installed on the terminal 1; and the server 3 may be a server or a server cluster, including a game platform, provided in the cloud.
The control device 2 of the target object may run on the terminal 1 and provide the terminal 1 with an operation page, and the terminal 1 uses its display or display component to show the user the operation page provided by the control device 2 of the target object.
Meanwhile, the control device 2 of the target object may also use the network components of the terminal 1 to interact with the server 3, so as to obtain, from the server 3, the shape information and position information of the target object, the information of the scene in which the target object is located, and even some other information resources.
In particular, the architecture shown in FIG. 2 is applicable to AR-based game scenarios. FIG. 3 is a schematic diagram of a scene on which target object control is based in an embodiment of the present disclosure. As shown in FIG. 3, in this scenario, user A and user B can each turn on the shooting component or camera of their terminal and shoot the same scene identifier (a two-dimensional code as shown in FIG. 3, a specific item, or the like), so as to establish, in this way, the communication connection between the respective terminals of users A and B and the server 3. Then, the server 3 synchronously delivers the data, including the virtual scene and the virtual game characters of both parties, to the control devices 2 of the target object of both parties, so that the terminals 1 of both parties construct the game scene on the basis of the real-scene image and display the virtual game characters of both parties in the game scene.
Through the control method for a target object provided by the present application, user A can control virtual game character A through the terminal, user B can control virtual game character B through the terminal, and the control results of both parties are displayed synchronously on the terminals of both parties.
Of course, the scenario shown in FIG. 3 is only one of the achievable application scenarios of the architecture shown in FIG. 2. FIG. 3 and the embodiments of the present disclosure may also be applied to other scenarios, such as an AR-based game character treasure-hunting scenario, or an AR-based game character social scenario, and so on.
In addition, the architecture shown in FIG. 2 may also be applied to the aforementioned virtual and augmented reality display scenarios, in which the user can perform movement control on some virtual objects displayed in the real scene. For example, in an obtained real scene of a streetscape, the user can, through the method provided by the present disclosure, perform movement control on added virtual objects such as virtual clouds to determine their placement; the specific interaction manner is similar to that of the aforementioned scenario and is not repeated here.
Furthermore, in the target object control method provided by the present application, the target object may also be a physically existing object. Referring to FIG. 4, FIG. 4 is a schematic diagram of another network architecture on which the present disclosure is based. The network architecture shown in FIG. 4 may specifically include a terminal 1, a target object control apparatus 2, a server 3, and a shooting system 4.
Different from the architecture shown in FIG. 2, the architecture shown in FIG. 4 further includes a shooting system 4 in addition to the terminal 1, the target object control apparatus 2, and the server 3.
The terminal 1 may specifically be a hardware device capable of presenting the captured real scene, such as a user's mobile phone, a smart home device, or a tablet computer; the target object control apparatus 2 may be a client or presentation end integrated in or installed on the terminal 1; and the server 3 may be a server or server cluster deployed in the cloud, including a game platform.
The target object control apparatus 2 may run on the terminal 1 and provide the terminal 1 with an operation page, and the terminal 1 displays the operation page provided by the target object control apparatus 2 to the user through its display or display component.
Meanwhile, the target object control apparatus 2 may also interact with the server 3 through the network component of the terminal 1 to acquire, from the server 3, the appearance information and position information of the game character, information about the scene in which the game character is located, and even other information resources.
Different from FIG. 2, in the structure shown in FIG. 4 the shooting system 4 may be composed of a plurality of shooting devices arranged in the same shooting area, and these shooting devices shoot the shooting area from different shooting angles. When the shooting system 4 shoots the shooting area, the captured real-scene images are transmitted to the server 3 and forwarded by the server 3 to the target object control apparatus 2 to be presented on the terminal 1.
In particular, the architecture shown in FIG. 4 is applicable to a remotely controlled robot scene. FIG. 5 is a schematic diagram of a scene on which a remotely controlled target object control scene is based. As shown in FIG. 5, this scene may include a robot A and a robot B that physically exist in the scene.
The shooting system shoots the scene from a plurality of shooting angles and sends the real-scene images containing the robots to the server. The terminals of user A and user B, located at different places, communicate with the server at the same time to obtain the multi-angle real-scene images captured by the shooting system.
Then, with the target object control method provided by the present application, user A can remotely control robot A through his or her terminal, and user B can remotely control robot B through his or her terminal.
During control, the target object control apparatus integrated on the terminal uses the control method provided by the present application to determine the movement direction of the robot based on the angle of the real-scene image captured by the shooting system and the control direction of the remote control triggered by the user, and robot A and robot B then execute the corresponding actions according to the remote control instructions they respectively receive.
Finally, the execution of the remote control instructions by robot A and robot B is presented to the terminals synchronously through the real-scene images captured by the shooting system, for user A and user B to view.
Of course, the scene shown in FIG. 5 is only one possible application scene of the architecture shown in FIG. 4; other scenes are possible as well, such as a remotely controlled engineering-robot scene.
The target object control method provided by the present application is further described below.
In a first aspect, referring to FIG. 6, FIG. 6 is a schematic flowchart of a target object control method according to an embodiment of the present disclosure. The target object control method provided by the embodiment of the present disclosure includes:
Step 101: in response to a movement control operation triggered for a target object in a real-scene image, determining a control direction corresponding to the movement control operation;
Step 102: acquiring a shooting direction of the real-scene image;
Step 103: controlling the target object to move in the real-scene image according to the control direction and the shooting direction.
It should be noted that the execution subject of the processing method provided in this example is the aforementioned target object control apparatus, which in some embodiments of the present disclosure specifically refers to a client or presentation end that can be installed on or integrated in a terminal. The target object control apparatus may be presented to the user in the form of an application interface or a control display interface, and the user interacts with the cloud server through the terminal to control the target object in the scene.
On the terminal, a real-scene image is presented through the target object control apparatus, and a virtual control for controlling the movement of the target object is provided at the same time (such as the widget presented at the lower right corner of the terminal interface in FIG. 3 or FIG. 5). The user can operate the virtual control by tapping, sliding, pressing, or other touch methods as needed, so as to trigger a movement control operation for the target object in the real-scene image.
Different from the prior art, in the solution on which the present disclosure is based, after receiving the movement control operation, the target object control apparatus determines the control direction corresponding to the movement control operation and the shooting direction of the real-scene image, and then controls the target object to move according to the control direction and the shooting direction.
The shooting direction represents the orientation, in the geographic coordinate system, of the shooting device that captures the real-scene image, while the control direction represents the operation direction of the movement control operation in the image orthogonal coordinate system.
Specifically, controlling the target object to move according to the control direction and the shooting direction may be implemented through direction fusion processing: the control direction and the shooting direction are fused to obtain the movement direction of the target object, and the target object is controlled to move along the movement direction.
Further, the direction fusion processing may be implemented on the basis of an alpha-blending technique, in which the direction angle of the control direction and the direction angle of the shooting direction are combined by subtraction to obtain the direction angle of the movement direction, and the target object is then controlled to move along the movement direction on the basis of this direction angle.
The control direction is defined relative to the terminal itself; that is, the direction angle of the control direction is expressed in the image orthogonal coordinate system and represents the operation orientation in that coordinate system.
The shooting direction refers to the orientation, in the geographic coordinate system, of the shooting device that captures the real-scene image; accordingly, the direction angle of the shooting direction may be the orientation of the shooting device in the horizontal plane of the geographic coordinate system. A minimal sketch of such a direction fusion follows.
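It should be noted that the following is merely an illustrative Python sketch and does not form part of the disclosed embodiments. It assumes that both angles are expressed in degrees, that compass bearings are measured clockwise from geographic north, and that the control angle is measured clockwise from the "up" axis of the screen, so that pushing "up" moves the object along the camera's current heading; with these sign conventions, the subtraction of direction angles described above appears as an addition modulo 360°. The function and parameter names are assumptions for illustration only.

```python
def fuse_directions(control_angle_deg: float, shooting_bearing_deg: float) -> float:
    """Fuse the on-screen control direction with the camera's shooting direction
    to obtain the target object's movement bearing in the geographic frame."""
    return (shooting_bearing_deg + control_angle_deg) % 360.0

# Example: pushing the control straight "up" (0 deg) while the camera faces
# geographic east (90 deg) moves the object eastwards in the real world.
assert fuse_directions(0.0, 90.0) == 90.0
```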
The shooting direction may be acquired from sensor information, i.e., an attitude sensor provided on the shooting device/terminal determines the pose state of the shooting device/terminal, from which the shooting direction is obtained. That is, determining the shooting direction of the real-scene image includes: acquiring the pose state of the shooting device that captures the real-scene image, and determining the shooting direction according to the pose state. A sketch of deriving such a bearing from the device pose is given below.
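The following sketch is illustrative only and is not taken from the disclosure. It assumes that the pose is available as the camera's forward (optical-axis) unit vector in an East-North-Up world frame, for example as reported by an attitude sensor or an AR framework; the function name and the ENU convention are assumptions.

```python
import math

def shooting_bearing(forward_enu):
    """Compass bearing (degrees, clockwise from north) of the camera's optical
    axis, given its forward unit vector (east, north, up) in the world frame."""
    east, north, _up = forward_enu
    return math.degrees(math.atan2(east, north)) % 360.0

# A camera looking due east has forward vector (1, 0, 0) -> bearing of ~90 deg.
assert abs(shooting_bearing((1.0, 0.0, 0.0)) - 90.0) < 1e-9
```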
The control direction may be acquired through a virtual control. Specifically, in the control method provided by the present disclosure, a virtual control is also presented on the display interface, and the virtual control is configured to receive the movement control operation so as to control the target object to move.
The control direction may be determined from the current form of the virtual control, where the form of the virtual control refers to its shape and state. In implementation, the current form of the virtual control is determined first, and the control direction is then obtained according to the current form of the virtual control.
This embodiment provides two different virtual controls, which may be arranged at the lower right of an interface such as that in FIG. 3 or FIG. 5, so that the user can conveniently operate them to control the target object in the interface.
Acquiring the control direction on the basis of the different virtual controls is described separately below.
First, the virtual control is a virtual joystick, and the virtual joystick includes a joystick pad and a joystick.
FIG. 7 is a schematic diagram of a first interface for acquiring the control direction according to an embodiment of the present application. As shown in FIG. 7, the virtual control in the figure is a virtual joystick. When determining the control direction, the relative position between the joystick and the joystick pad is determined, and the control direction is then determined from this relative position. Specifically, the relative position between the joystick and the joystick pad may be obtained in ways such as the following: the center-point coordinates of the joystick and of the joystick pad are acquired respectively, and the relative position is determined from the vector difference between the two center points; or the overlapping region between the joystick and the joystick pad is determined, the distance between the edge of this overlapping region and the edge of the region occupied by the virtual control is determined, and the relative position is determined from this distance. Of course, other approaches may also be used, and this implementation places no limitation on them. A sketch based on the center-point vector difference follows.
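By way of illustration only, the sketch below computes a control angle from the vector difference between the joystick's and the pad's center points; the screen coordinate convention (x growing to the right, y growing downwards) and all names are assumptions rather than details fixed by the disclosure.

```python
import math

def joystick_control_angle(pad_center, knob_center):
    """Control direction, as a clockwise angle in degrees from the screen's
    "up" axis, from the relative position of the joystick and its pad."""
    dx = knob_center[0] - pad_center[0]
    dy = knob_center[1] - pad_center[1]
    # atan2(dx, -dy) is 0 when the joystick sits straight above the pad centre
    # and 90 when it sits to the right of it.
    return math.degrees(math.atan2(dx, -dy)) % 360.0

# Joystick dragged straight up from the pad centre -> control angle of 0 deg.
assert joystick_control_angle((100.0, 100.0), (100.0, 60.0)) == 0.0
```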
Second, the virtual control is a virtual button, and the virtual button includes at least two direction keys.
FIG. 8 is a schematic diagram of a second interface for acquiring the control direction according to an embodiment of the present application. As shown in FIG. 8, the virtual control in the figure is a virtual button. When determining the control direction, the triggered direction key is determined, and the control direction is determined according to the direction corresponding to the triggered direction key.
The triggered direction key can be determined by means of existing touch technology, which is not described in detail in the present application. A sketch of mapping triggered keys to a control direction is given below.
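The sketch below is illustrative only; the key names are hypothetical, and combining two adjacent keys to obtain a diagonal direction is an assumption made for illustration rather than a requirement of the disclosure.

```python
import math

def control_angle_from_keys(pressed_keys):
    """Control angle (degrees, clockwise from the screen's "up" axis) from the
    set of triggered direction keys; returns None when no key is pressed."""
    vx = ("right" in pressed_keys) - ("left" in pressed_keys)
    vy = ("up" in pressed_keys) - ("down" in pressed_keys)
    if vx == 0 and vy == 0:
        return None
    return math.degrees(math.atan2(vx, vy)) % 360.0

assert control_angle_from_keys({"up"}) == 0.0
assert abs(control_angle_from_keys({"up", "right"}) - 45.0) < 1e-9
```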
FIG. 9a is a schematic terminal-view diagram of a first direction fusion according to an embodiment of the present application; FIG. 9b is a top view of the scene of the first direction fusion; FIG. 9c is a schematic terminal-view diagram of a second direction fusion; and FIG. 9d is a top view of the scene of the second direction fusion. As shown in FIGS. 9a-9d, the control direction a and the shooting direction b (b') can be determined in the manner described above, the movement direction c (c') is obtained by fusion, and the terminal controls the target object to move along the movement direction c (c').
Specifically, FIGS. 9a-9d show the process of controlling a target object, a "motorcycle", to move in a real scene through the terminal.
FIG. 9b shows the shooting direction b of the terminal and the movement direction c of the motorcycle, and FIG. 9a shows the terminal-view interface in the state of FIG. 9b.
In FIGS. 9a and 9b, when the user triggers the virtual control to move the motorcycle towards "north (N)", the control direction is a; with the control approach provided by the present disclosure, the movement direction c of the motorcycle in the real geographic coordinate system is determined in combination with the shooting direction b of the terminal in the real geographic coordinate system.
FIGS. 9c and 9d then show the scene after the shooting direction has changed while the control direction a remains unchanged. FIG. 9d shows the shooting direction b' of the terminal and the movement direction c' of the motorcycle, and FIG. 9c shows the terminal-view interface in the state of FIG. 9d.
In FIGS. 9c and 9d, the shooting direction of the terminal in the real geographic coordinate system changes from b to b', while the control direction a does not change.
That is, when the user triggers the virtual control to keep moving the motorcycle towards "north (N)", the control direction remains a, but because the shooting direction has changed, the movement direction of the motorcycle in the real geographic coordinate system changes to c'.
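As a purely illustrative numeric reading of this behaviour (the actual angles in FIGS. 9a-9d are not given in the disclosure): if the control direction a corresponds to pushing the joystick straight up on the screen (a control angle of 0°) and the shooting direction b is a bearing of 0°, the fused movement direction c is a bearing of 0°; if the terminal is then rotated so that the shooting direction b' becomes a bearing of 90° while the control direction a stays unchanged, the fused movement direction c' becomes a bearing of 90°, i.e., the same on-screen input now moves the motorcycle along a different direction in the real geographic coordinate system.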
Of course, in other embodiments the movement speed of the target object during movement may also be controlled. Specifically, the target object control apparatus may obtain the movement speed of the target object according to the current form of the virtual control, and control the target object to move at that movement speed.
Specifically, when the movement speed is determined from the current form of the virtual control, the determination may be based on the force with which the virtual control is pressed, or on how fast the form of the virtual control changes; the present application does not elaborate further on this. A sketch of the pressing-force option is given below.
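As an illustration of the pressure-based option only, the sketch below maps a normalised touch-pressure reading to a movement speed; the parameter names, default values, and the linear mapping are assumptions and are not values or formulas taken from the disclosure.

```python
def movement_speed(press_force: float, max_force: float = 1.0, max_speed: float = 2.0) -> float:
    """Movement speed (scene units per second) from how hard the virtual
    control is pressed; press_force is clamped to the range [0, max_force]."""
    clamped = min(max(press_force, 0.0), max_force)
    return max_speed * clamped / max_force

assert movement_speed(0.5) == 1.0  # half pressure -> half of max_speed
```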
According to the target object control method provided by the embodiments of the present disclosure, in response to a movement control operation triggered for a target object in a real-scene image, the control direction corresponding to the movement control operation and the shooting direction of the real-scene image are determined, and the target object is controlled to move according to the control direction and the shooting direction. By determining the movement direction of the target object in the real-scene image from the shooting direction of the real-scene image and the control direction corresponding to the movement control operation, this embodiment effectively solves the prior-art problem of direction deviation when the shooting direction changes, improves the operability of the target object in the real-scene image, and provides the user with a better control experience.
Corresponding to the target object control method of the above embodiments, FIG. 10 is a structural block diagram of a target object control apparatus provided by an embodiment of the present disclosure. For ease of description, only the parts related to the embodiment of the present disclosure are shown. Referring to FIG. 10, the target object control apparatus includes a processing module 10 and a control module 20.
The processing module 10 is configured to, in response to a movement control operation triggered for a target object in a real-scene image, determine a control direction corresponding to the movement control operation and acquire a shooting direction of the real-scene image.
The control module 20 is configured to control the target object to move in the real-scene image according to the control direction and the shooting direction.
In an optional implementation, the shooting direction represents the orientation, in the geographic coordinate system, of the shooting device that captures the real-scene image;
the control direction represents the operation direction of the movement control operation in the image orthogonal coordinate system.
In an optional implementation, the processing module 10 is specifically configured to: acquire the pose state of the shooting device that captures the real-scene image; and determine the shooting direction according to the pose state.
In an optional implementation, a virtual control is further provided, and the virtual control is configured to receive the movement control operation so as to control the target object to move;
the processing module 10 is specifically configured to determine the current form of the virtual control and obtain the control direction according to the current form of the virtual control.
In an optional implementation, the virtual control includes a virtual joystick, and the virtual joystick includes a joystick pad and a joystick located within the joystick pad.
In an optional implementation, the processing module 10 is specifically configured to: determine the relative position between the joystick and the joystick pad; and determine the control direction according to the relative position.
In an optional implementation, the virtual control includes a virtual button, and the virtual button includes at least two direction keys.
In an optional implementation, the processing module 10 is specifically configured to determine the triggered direction key, and determine the control direction according to the direction corresponding to the triggered direction key.
In an optional implementation, the control module 20 is specifically configured to: perform direction fusion processing on the control direction and the shooting direction to obtain the movement direction of the target object; and control the target object to move along the movement direction.
In an optional implementation, the processing module 10 is further configured to obtain the movement speed of the target object according to the current form of the virtual control;
the control module 20 is specifically configured to control the target object to move at the movement speed.
According to the target object control apparatus provided by the embodiments of the present disclosure, in response to a movement control operation triggered for a target object in a real-scene image, the control direction corresponding to the movement control operation and the shooting direction of the real-scene image are determined, and the target object is controlled to move according to the control direction and the shooting direction. By determining the movement direction of the target object in the real-scene image from the shooting direction of the real-scene image and the control direction corresponding to the movement control operation, this embodiment effectively solves the prior-art problem of direction deviation when the shooting direction changes, improves the operability of the target object in the real-scene image, and provides the user with a better control experience.
The electronic device provided by this embodiment may be used to execute the technical solutions of the above method embodiments; its implementation principle and technical effects are similar and are not repeated here.
Referring to FIG. 11, it shows a schematic structural diagram of an electronic device 900 suitable for implementing the embodiments of the present disclosure. The electronic device 900 may be a terminal device or a media library. The terminal device may include, but is not limited to, mobile terminals such as a mobile phone, a notebook computer, a digital broadcast receiver, a personal digital assistant (PDA), a portable Android device (PAD), a portable media player (PMP), and a vehicle-mounted terminal (for example, a vehicle-mounted navigation terminal), as well as fixed terminals such as a digital TV and a desktop computer. The electronic device shown in FIG. 11 is only an example and should not impose any limitation on the functions and scope of use of the embodiments of the present disclosure.
As shown in FIG. 11, the electronic device 900 may include a processor 901 (for example, a central processing unit or a graphics processor) for executing the target object control method, which can perform various appropriate actions and processing according to a program stored in a read-only memory (ROM) 902 or a program loaded from a storage device 908 into a random access memory (RAM) 903. The RAM 903 also stores various programs and data required for the operation of the electronic device 900. The processor 901, the ROM 902, and the RAM 903 are connected to one another through a bus 904. An input/output (I/O) interface 905 is also connected to the bus 904.
Generally, the following devices may be connected to the I/O interface 905: an input device 906 including, for example, a touch screen, a touchpad, a keyboard, a mouse, a camera, a microphone, an accelerometer, and a gyroscope; an output device 907 including, for example, a liquid crystal display (LCD), a speaker, and a vibrator; a storage device 908 including, for example, a magnetic tape and a hard disk; and a communication device 909. The communication device 909 may allow the electronic device 900 to communicate wirelessly or by wire with other devices to exchange data. Although FIG. 11 shows the electronic device 900 with various devices, it should be understood that not all of the devices shown are required to be implemented or provided; more or fewer devices may alternatively be implemented or provided.
In particular, according to the embodiments of the present disclosure, the process described above with reference to the flowcharts may be implemented as a computer software program. For example, an embodiment of the present disclosure includes a computer program product that includes a computer program carried on a computer-readable medium, and the computer program contains program code for executing the methods shown in the flowcharts according to the embodiments of the present disclosure. In such an embodiment, the computer program may be downloaded and installed from a network through the communication device 909, installed from the storage device 908, or installed from the ROM 902. When the computer program is executed by the processor 901, the above functions defined in the methods of the embodiments of the present disclosure are performed.
It should be noted that the computer-readable medium described above in the present disclosure may be a computer-readable signal medium, a computer-readable storage medium, or any combination of the two. The computer-readable storage medium may be, for example, but is not limited to, an electrical, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination thereof. More specific examples of the computer-readable storage medium may include, but are not limited to: an electrical connection with one or more wires, a portable computer disk, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the above. In the present disclosure, the computer-readable storage medium may be any tangible medium that contains or stores a program that can be used by, or in combination with, an instruction execution system, apparatus, or device. In the present disclosure, the computer-readable signal medium may include a data signal propagated in baseband or as part of a carrier wave, in which computer-readable program code is carried. Such a propagated data signal may take many forms, including but not limited to an electromagnetic signal, an optical signal, or any suitable combination of the above. The computer-readable signal medium may also be any computer-readable medium other than the computer-readable storage medium, and it can send, propagate, or transmit a program for use by, or in combination with, an instruction execution system, apparatus, or device. The program code contained on the computer-readable medium may be transmitted by any appropriate medium, including but not limited to a wire, an optical cable, radio frequency (RF), or any suitable combination of the above.
The computer-readable medium may be included in the above electronic device, or may exist alone without being assembled into the electronic device.
The computer-readable medium carries one or more programs which, when executed by the electronic device, cause the electronic device to perform the methods shown in the above embodiments.
The present disclosure also provides a computer program that causes a computer to perform the methods shown in the above embodiments.
The computer program code for performing the operations of the present disclosure may be written in one or more programming languages or a combination thereof. The programming languages include object-oriented programming languages such as Java, Smalltalk, and C++, as well as conventional procedural programming languages such as the "C" language or similar programming languages. The program code may be executed entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or media library. In the case involving a remote computer, the remote computer may be connected to the user's computer through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computer (for example, through the Internet using an Internet service provider).
The flowcharts and block diagrams in the accompanying drawings illustrate the possible architectures, functions, and operations of systems, methods, and computer program products according to various embodiments of the present disclosure. In this regard, each block in a flowchart or block diagram may represent a module, a program segment, or a part of code, and the module, program segment, or part of code contains one or more executable instructions for implementing the specified logical functions. It should also be noted that, in some alternative implementations, the functions marked in the blocks may occur in an order different from that marked in the drawings. For example, two blocks shown in succession may actually be executed substantially in parallel, or sometimes in the reverse order, depending on the functions involved. It should also be noted that each block in the block diagrams and/or flowcharts, and combinations of blocks in the block diagrams and/or flowcharts, can be implemented by a dedicated hardware-based system that performs the specified functions or operations, or by a combination of dedicated hardware and computer instructions.
The units described in the embodiments of the present disclosure may be implemented in software or in hardware. In some cases, the name of a unit does not constitute a limitation of the unit itself; for example, a first acquisition unit may also be described as "a unit that acquires at least two Internet Protocol addresses".
The functions described above herein may be executed, at least in part, by one or more hardware logic components. For example, and without limitation, exemplary types of hardware logic components that may be used include: a field-programmable gate array (FPGA), an application-specific integrated circuit (ASIC), an application-specific standard product (ASSP), a system on a chip (SOC), a complex programmable logic device (CPLD), and so on.
In the context of the present disclosure, a machine-readable medium may be a tangible medium that may contain or store a program for use by, or in combination with, an instruction execution system, apparatus, or device. The machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium. The machine-readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the above. More specific examples of the machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer disk, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the above.
The following are some embodiments of the present disclosure.
In a first aspect, according to one or more embodiments of the present disclosure, a target object control method includes:
in response to a movement control operation triggered for a target object in a real-scene image, determining a control direction corresponding to the movement control operation;
acquiring a shooting direction of the real-scene image; and
controlling the target object to move in the real-scene image according to the control direction and the shooting direction.
In an optional implementation, the shooting direction represents the orientation, in the geographic coordinate system, of the shooting device that captures the real-scene image;
the control direction represents the operation direction of the movement control operation in the image orthogonal coordinate system.
In an optional implementation, acquiring the shooting direction of the real-scene image includes:
acquiring the pose state of the shooting device that captures the real-scene image; and
determining the shooting direction according to the pose state.
In an optional implementation, the control method further includes: providing a virtual control configured to receive the movement control operation so as to control the target object to move;
determining the control direction corresponding to the movement control operation includes:
determining the current form of the virtual control, and obtaining the control direction according to the current form of the virtual control.
In an optional implementation, the virtual control includes a virtual joystick, and the virtual joystick includes a joystick pad and a joystick located within the joystick pad.
In an optional implementation, determining the current form of the virtual control and obtaining the control direction according to the current form of the virtual control includes:
determining the relative position between the joystick and the joystick pad; and
determining the control direction according to the relative position.
In an optional implementation, the virtual control includes a virtual button, and the virtual button includes at least two direction keys.
In an optional implementation, determining the current form of the virtual control and obtaining the control direction according to the current form of the virtual control includes:
determining the triggered direction key; and
determining the control direction according to the direction corresponding to the triggered direction key.
In an optional implementation, controlling the target object to move according to the control direction and the shooting direction includes:
performing direction fusion processing on the control direction and the shooting direction to obtain the movement direction of the target object; and
controlling the target object to move along the movement direction.
In an optional implementation, the method further includes:
obtaining the movement speed of the target object according to the current form of the virtual control;
correspondingly, controlling the target object to move further includes:
controlling the target object to move at the movement speed.
In a second aspect, according to one or more embodiments of the present disclosure, a target object control apparatus includes a processing module and a control module.
The processing module is configured to, in response to a movement control operation triggered for a target object in a real-scene image, determine a control direction corresponding to the movement control operation and acquire a shooting direction of the real-scene image.
The control module is configured to control the target object to move in the real-scene image according to the control direction and the shooting direction.
In an optional implementation, the shooting direction represents the orientation, in the geographic coordinate system, of the shooting device that captures the real-scene image;
the control direction represents the operation direction of the movement control operation in the image orthogonal coordinate system.
In an optional implementation, the processing module is specifically configured to: acquire the pose state of the shooting device that captures the real-scene image; and determine the shooting direction according to the pose state.
In an optional implementation, a virtual control is further provided, and the virtual control is configured to receive the movement control operation so as to control the target object to move;
the processing module is specifically configured to determine the current form of the virtual control and obtain the control direction according to the current form of the virtual control.
In an optional implementation, the virtual control includes a virtual joystick, and the virtual joystick includes a joystick pad and a joystick located within the joystick pad.
In an optional implementation, the processing module is specifically configured to: determine the relative position between the joystick and the joystick pad; and determine the control direction according to the relative position.
In an optional implementation, the virtual control includes a virtual button, and the virtual button includes at least two direction keys.
In an optional implementation, the processing module is specifically configured to determine the triggered direction key, and determine the control direction according to the direction corresponding to the triggered direction key.
In an optional implementation, the control module is specifically configured to: perform direction fusion processing on the control direction and the shooting direction to obtain the movement direction of the target object; and control the target object to move along the movement direction.
In an optional implementation, the processing module is further configured to obtain the movement speed of the target object according to the current form of the virtual control;
the control module is specifically configured to control the target object to move at the movement speed.
In a third aspect, according to one or more embodiments of the present disclosure, an electronic device includes: at least one processor and a memory;
the memory stores computer-executable instructions; and
the at least one processor executes the computer-executable instructions stored in the memory, causing the at least one processor to perform the target object control method according to any one of the preceding items.
In a fourth aspect, according to one or more embodiments of the present disclosure, a computer-readable storage medium stores computer-executable instructions which, when executed by a processor, implement the target object control method according to any one of the preceding items.
In a fifth aspect, according to one or more embodiments of the present disclosure, a computer program product is provided, including computer program instructions which cause a computer to perform the target object control method according to any one of the preceding items.
In a sixth aspect, according to one or more embodiments of the present disclosure, a computer program is provided which, when run on a computer, causes the computer to perform the target object control method according to any one of the preceding items.
The above description is only of preferred embodiments of the present disclosure and of the technical principles applied. Those skilled in the art should understand that the disclosure scope involved in the present disclosure is not limited to technical solutions formed by particular combinations of the above technical features, and should also cover other technical solutions formed by any combination of the above technical features or their equivalents without departing from the above disclosed concept, for example, technical solutions formed by replacing the above features with technical features having similar functions disclosed in (but not limited to) the present disclosure.
Furthermore, although the operations are described in a particular order, this should not be understood as requiring that these operations be performed in the particular order shown or in sequential order. Under certain circumstances, multitasking and parallel processing may be advantageous. Similarly, although several specific implementation details are included in the above discussion, these should not be construed as limiting the scope of the present disclosure. Certain features described in the context of separate embodiments can also be implemented in combination in a single embodiment. Conversely, various features described in the context of a single embodiment can also be implemented in multiple embodiments separately or in any suitable sub-combination.
Although the subject matter has been described in language specific to structural features and/or method logical actions, it should be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or actions described above. On the contrary, the specific features and actions described above are only example forms of implementing the claims.

Claims (15)

  1. A target object control method, characterized by comprising:
    in response to a movement control operation triggered for a target object in a real-scene image, determining a control direction corresponding to the movement control operation;
    acquiring a shooting direction of the real-scene image; and
    controlling the target object to move in the real-scene image according to the control direction and the shooting direction.
  2. The target object control method according to claim 1, characterized in that the shooting direction represents the orientation, in a geographic coordinate system, of a shooting device that captures the real-scene image;
    the control direction represents the operation direction of the movement control operation in an image orthogonal coordinate system.
  3. The target object control method according to claim 1, characterized in that acquiring the shooting direction of the real-scene image comprises:
    acquiring a pose state of the shooting device that captures the real-scene image; and
    determining the shooting direction according to the pose state.
  4. The target object control method according to any one of claims 1-3, characterized in that the control method further comprises: providing a virtual control configured to receive the movement control operation so as to control the target object to move;
    determining the control direction corresponding to the movement control operation comprises:
    determining a current form of the virtual control, and obtaining the control direction according to the current form of the virtual control.
  5. The target object control method according to claim 4, characterized in that the virtual control comprises a virtual joystick, and the virtual joystick comprises a joystick pad and a joystick located within the joystick pad.
  6. The target object control method according to claim 5, characterized in that determining the current form of the virtual control and obtaining the control direction according to the current form of the virtual control comprises:
    determining a relative position between the joystick and the joystick pad; and
    determining the control direction according to the relative position.
  7. The target object control method according to claim 4, characterized in that the virtual control comprises a virtual button, and the virtual button comprises at least two direction keys.
  8. The target object control method according to claim 7, characterized in that determining the current form of the virtual control and obtaining the control direction according to the current form of the virtual control comprises:
    determining a triggered direction key; and
    determining the control direction according to a direction corresponding to the triggered direction key.
  9. The target object control method according to any one of claims 1-8, characterized in that controlling the target object to move according to the control direction and the shooting direction comprises:
    performing direction fusion processing on the control direction and the shooting direction to obtain a movement direction of the target object; and
    controlling the target object to move along the movement direction.
  10. The target object control method according to any one of claims 1-9, characterized by further comprising:
    obtaining a movement speed of the target object according to the current form of the virtual control;
    correspondingly, controlling the target object to move further comprises:
    controlling the target object to move at the movement speed.
  11. A target object control apparatus, characterized by comprising:
    a processing module configured to, in response to a movement control operation triggered for a target object in a real-scene image, determine a control direction corresponding to the movement control operation and acquire a shooting direction of the real-scene image; and
    a control module configured to control the target object to move in the real-scene image according to the control direction and the shooting direction.
  12. An electronic device, comprising:
    at least one processor; and
    a memory;
    the memory stores computer-executable instructions; and
    the at least one processor executes the computer-executable instructions stored in the memory, causing the at least one processor to perform the target object control method according to any one of claims 1-10.
  13. A computer-readable storage medium, wherein the computer-readable storage medium stores computer-executable instructions which, when executed by a processor, implement the target object control method according to any one of claims 1-10.
  14. A computer program product, characterized by comprising computer program instructions which, when executed by a computer, cause the computer to implement the target object control method according to any one of claims 1-10.
  15. A computer program, characterized in that the computer program, when executed by a computer, causes the computer to implement the target object control method according to any one of claims 1-10.
PCT/CN2021/112693 2020-09-07 2021-08-16 目标物体的控制方法、装置、电子设备及存储介质 WO2022048428A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
EP21863495.4A EP4086732A4 (en) 2020-09-07 2021-08-16 METHOD AND DEVICE FOR CONTROLLING A TARGET OBJECT, ELECTRONIC DEVICE AND STORAGE MEDIUM
US17/882,443 US11869195B2 (en) 2020-09-07 2022-08-05 Target object controlling method, apparatus, electronic device, and storage medium

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202010931355.8A CN112068703B (zh) 2020-09-07 2020-09-07 目标物体的控制方法、装置、电子设备及存储介质
CN202010931355.8 2020-09-07

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US17/882,443 Continuation US11869195B2 (en) 2020-09-07 2022-08-05 Target object controlling method, apparatus, electronic device, and storage medium

Publications (1)

Publication Number Publication Date
WO2022048428A1 true WO2022048428A1 (zh) 2022-03-10

Family

ID=73664066

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/112693 WO2022048428A1 (zh) 2020-09-07 2021-08-16 目标物体的控制方法、装置、电子设备及存储介质

Country Status (4)

Country Link
US (1) US11869195B2 (zh)
EP (1) EP4086732A4 (zh)
CN (1) CN112068703B (zh)
WO (1) WO2022048428A1 (zh)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112068703B (zh) * 2020-09-07 2021-11-16 北京字节跳动网络技术有限公司 目标物体的控制方法、装置、电子设备及存储介质
CN114332224A (zh) * 2021-12-29 2022-04-12 北京字节跳动网络技术有限公司 3d目标检测样本的生成方法、装置、设备及存储介质

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108415570A (zh) * 2018-03-07 2018-08-17 网易(杭州)网络有限公司 基于增强现实的控件选择方法和装置
US20190091561A1 (en) * 2017-09-26 2019-03-28 Netease (Hangzhou) Network Co.,Ltd. Method and apparatus for controlling virtual character, electronic device, and storage medium
US20190126148A1 (en) * 2017-10-24 2019-05-02 Netease (Hangzhou) Network Co.,Ltd. Virtual Character Controlling Method and Apparatus, Electronic Device, and Storage Medium
CN109754471A (zh) * 2019-01-10 2019-05-14 网易(杭州)网络有限公司 增强现实中的图像处理方法及装置、存储介质、电子设备
CN109908583A (zh) * 2019-02-25 2019-06-21 成都秘灵互动科技有限公司 基于vr的角色控制方法及装置
CN110764614A (zh) * 2019-10-15 2020-02-07 北京市商汤科技开发有限公司 增强现实数据呈现方法、装置、设备及存储介质
CN111190485A (zh) * 2019-12-27 2020-05-22 北京市商汤科技开发有限公司 信息显示方法、装置、电子设备、及计算机可读存储介质
CN111359200A (zh) * 2020-02-26 2020-07-03 网易(杭州)网络有限公司 基于增强现实的游戏交互方法和装置
WO2020143146A1 (zh) * 2019-01-10 2020-07-16 网易(杭州)网络有限公司 游戏中的显示控制方法、装置、存储介质、处理器及终端
CN112068703A (zh) * 2020-09-07 2020-12-11 北京字节跳动网络技术有限公司 目标物体的控制方法、装置、电子设备及存储介质
CN112148125A (zh) * 2020-09-23 2020-12-29 北京市商汤科技开发有限公司 一种ar交互状态控制的方法、装置、设备及存储介质

Family Cites Families (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101260576B1 (ko) * 2010-10-13 2013-05-06 주식회사 팬택 Ar 서비스를 제공하기 위한 사용자 단말기 및 그 방법
CN102123194B (zh) * 2010-10-15 2013-12-18 张哲颖 利用增强实景技术优化移动导航和人机交互功能的方法
US9041622B2 (en) * 2012-06-12 2015-05-26 Microsoft Technology Licensing, Llc Controlling a virtual object with a real controller device
JP2014191718A (ja) * 2013-03-28 2014-10-06 Sony Corp 表示制御装置、表示制御方法および記録媒体
CN104748746B (zh) * 2013-12-29 2017-11-03 刘进 智能机姿态测定及虚拟现实漫游方法
US9227141B2 (en) * 2013-12-31 2016-01-05 Microsoft Technology Licensing, Llc Touch screen game controller
CN106575299A (zh) * 2014-08-01 2017-04-19 索尼公司 信息处理装置、信息处理方法和程序
US9911235B2 (en) * 2014-11-14 2018-03-06 Qualcomm Incorporated Spatial interaction in augmented reality
US10373392B2 (en) * 2015-08-26 2019-08-06 Microsoft Technology Licensing, Llc Transitioning views of a virtual model
CN105597311B (zh) * 2015-12-25 2019-07-12 网易(杭州)网络有限公司 3d游戏中的相机控制方法和装置
CN105955483A (zh) * 2016-05-06 2016-09-21 乐视控股(北京)有限公司 虚拟现实终端及其视觉虚拟方法和装置
CN106843681A (zh) * 2016-12-30 2017-06-13 深圳超多维科技有限公司 触控应用的运行控制方法、装置及电子设备
CN106843861A (zh) 2017-01-05 2017-06-13 深圳市爱立峰科技有限公司 应用界面布局更新方法及装置
CN109934931B (zh) * 2017-12-19 2023-03-28 阿里巴巴集团控股有限公司 采集图像、建立目标物体识别模型的方法及装置
CN108379837A (zh) * 2018-02-01 2018-08-10 网易(杭州)网络有限公司 信息处理方法及装置、存储介质、电子设备
CN108543309B (zh) * 2018-04-03 2020-03-10 网易(杭州)网络有限公司 在增强现实中控制虚拟控制对象移动的方法、装置及终端
JP2022505999A (ja) * 2019-10-15 2022-01-17 ベイジン センスタイム テクノロジー デベロップメント カンパニー, リミテッド 拡張現実データの提示方法、装置、機器および記憶媒体
CN111408137A (zh) * 2020-02-28 2020-07-14 苏州叠纸网络科技股份有限公司 一种基于增强现实的场景交互方法、装置、设备和介质
CN111651051B (zh) * 2020-06-10 2023-08-22 浙江商汤科技开发有限公司 一种虚拟沙盘展示方法及装置

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP4086732A4 *

Also Published As

Publication number Publication date
CN112068703B (zh) 2021-11-16
US20220375092A1 (en) 2022-11-24
CN112068703A (zh) 2020-12-11
EP4086732A1 (en) 2022-11-09
US11869195B2 (en) 2024-01-09
EP4086732A4 (en) 2023-09-06

Similar Documents

Publication Publication Date Title
WO2022088918A1 (zh) 虚拟图像的显示方法、装置、电子设备及存储介质
WO2021082801A1 (zh) 增强现实处理方法及装置、系统、存储介质和电子设备
US9268410B2 (en) Image processing device, image processing method, and program
US11869195B2 (en) Target object controlling method, apparatus, electronic device, and storage medium
WO2023051185A1 (zh) 图像处理方法、装置、电子设备及存储介质
WO2020228682A1 (zh) 对象交互方法及装置、系统、计算机可读介质和电子设备
US20230328197A1 (en) Display method and apparatus based on augmented reality, device, and storage medium
US11561651B2 (en) Virtual paintbrush implementing method and apparatus, and computer readable storage medium
CN113407084B (zh) 展示内容更新方法、头戴式显示设备和计算机可读介质
WO2020211422A1 (zh) 视频处理方法、装置及设备
CN109992111B (zh) 增强现实扩展方法和电子设备
CN112907652B (zh) 相机姿态获取方法、视频处理方法、显示设备和存储介质
WO2023138559A1 (zh) 虚拟现实交互方法、装置、设备和存储介质
WO2024016924A1 (zh) 视频处理方法、装置、电子设备及存储介质
CN114445500A (zh) 增强现实场景构建方法、装置、终端设备和存储介质
CN109636917B (zh) 三维模型的生成方法、装置、硬件装置
US12041374B2 (en) Segmentation-based video capturing method, apparatus, device and storage medium
WO2022227918A1 (zh) 视频处理方法、设备及电子设备
WO2022055419A2 (zh) 文字的显示方法、装置、电子设备及存储介质
WO2022135018A1 (zh) 动态流体显示方法、装置、电子设备和可读介质
US20240153211A1 (en) Methods, apparatuses, terminals and storage media for display control based on extended reality
CN110047520B (zh) 音频播放的控制方法、装置、电子设备和计算机可读存储介质
US20240284038A1 (en) Photographing guiding method and apparatus, and electronic device and storage medium
US20240346732A1 (en) Method and apparatus for adding video effect, and device and storage medium
US20240269553A1 (en) Method, apparatus, electronic device and storage medium for extending reality display

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 21863495; Country of ref document: EP; Kind code of ref document: A1)
ENP Entry into the national phase (Ref document number: 2021863495; Country of ref document: EP; Effective date: 20220805)
NENP Non-entry into the national phase (Ref country code: DE)