WO2022089075A1 - Image acquisition method, handle device, head-mounted device and head-mounted system - Google Patents

Image acquisition method, handle device, head-mounted device and head-mounted system

Info

Publication number
WO2022089075A1
WO2022089075A1 (PCT/CN2021/118545, CN2021118545W)
Authority
WO
WIPO (PCT)
Prior art keywords
handle device
visible light
head
handle
camera
Prior art date
Application number
PCT/CN2021/118545
Other languages
English (en)
French (fr)
Inventor
张秀志
周宏伟
柳光辉
郭衡江
Original Assignee
青岛小鸟看看科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 青岛小鸟看看科技有限公司
Priority to EP21884805.9A (EP4142277A4)
Publication of WO2022089075A1
Priority to US17/817,634 (US11754835B2)

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/0093Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00 with means for monitoring data relating to the user, e.g. head-tracking, eye-tracking
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/50Constructional details
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20Input arrangements for video game devices
    • A63F13/21Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/211Input arrangements for video game devices characterised by their sensors, purposes or types using inertial sensors, e.g. accelerometers or gyroscopes
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20Input arrangements for video game devices
    • A63F13/21Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/212Input arrangements for video game devices characterised by their sensors, purposes or types using sensors worn by the player, e.g. for measuring heart beat or leg activity
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20Input arrangements for video game devices
    • A63F13/21Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/213Input arrangements for video game devices characterised by their sensors, purposes or types comprising photodetecting means, e.g. cameras, photodiodes or infrared cells
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20Input arrangements for video game devices
    • A63F13/24Constructional details thereof, e.g. game controllers with detachable joystick handles
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/25Output arrangements for video game devices
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/40Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F13/42Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A63F13/428Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle involving motion or position input signals, e.g. signals representing the rotation of an input controller or a player's arm motions sensed by accelerometers or gyroscopes
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50Controlling the output signals based on the game progress
    • A63F13/52Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • A63F13/525Changing parameters of virtual cameras
    • A63F13/5255Changing parameters of virtual cameras according to dedicated instructions from a player, e.g. using a secondary joystick to rotate the camera around a player's character
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S5/00Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations
    • G01S5/16Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations using electromagnetic waves other than radio waves
    • G01S5/163Determination of attitude
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • G02B27/0172Head mounted characterised by optical features
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/014Hand-worn input/output arrangements, e.g. data gloves
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/0304Detection arrangements using opto-electronic means
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/0304Detection arrangements using opto-electronic means
    • G06F3/0308Detection arrangements using opto-electronic means comprising a plurality of distinctive and separately oriented light emitters or reflectors associated to the pointing device, e.g. remote cursor controller with distinct and separately oriented LEDs at the tip whose radiations are captured by a photo-detector associated to the screen
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0346Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/20Analysis of motion
    • G06T7/246Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70Circuitry for compensating brightness variation in the scene
    • H04N23/741Circuitry for compensating brightness variation in the scene by increasing the dynamic range of the image compared to the dynamic range of the electronic image sensors
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/10Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals
    • A63F2300/1025Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals details of the interface with the game device, e.g. USB version detection
    • A63F2300/1031Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals details of the interface with the game device, e.g. USB version detection using a wireless connection, e.g. Bluetooth, infrared connections
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/80Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game specially adapted for executing a specific type of game
    • A63F2300/8082Virtual reality
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0101Head-up displays characterised by optical features
    • G02B2027/0138Head-up displays characterised by optical features comprising image capture systems, e.g. camera
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/038Indexing scheme relating to G06F3/038
    • G06F2203/0384Wireless input, i.e. hardware and software details of wireless interface arrangements for pointing devices
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10024Color image
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10048Infrared image
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30204Marker

Definitions

  • the embodiments of the present disclosure relate to the technical field of smart head-mounted devices, and more particularly, to an image acquisition method, a handle device, a head-mounted device, and a head-mounted system.
  • Existing headset designs support six-degree-of-freedom (6DOF) tracking of the head.
  • The relative position between the headset and the handle device can be determined through optical, ultrasonic, electromagnetic and other solutions.
  • In the electromagnetic tracking solution, due to the characteristics of the magnetic field, the device is subject to interference from external magnetic fields, which degrades judgment accuracy and tracking quality; moreover, the cost is extremely high.
  • In the ultrasonic tracking solution, an ultrasonic transmitter and receiver are used to determine the relative distance of the handle device; since ultrasonic waves are easily affected by factors such as occlusion and reflection, measurement accuracy is poor.
  • In the optical tracking solution, current tracking is mainly based on infrared light. This solution is greatly affected by the external environment: for example, under strong ambient light the light points on the handle are weakened, making tracking unstable. It is therefore necessary to provide a solution that can accurately determine the relative position of the handle device, so as to improve both its tracking accuracy and its stability across environments.
  • The embodiments of the present disclosure provide a new technical solution capable of accurately determining the relative position of the handle device.
  • a handle device comprising a housing
  • control module is arranged in the housing
  • the switch control end of the infrared light circuit is connected with the control module, and the infrared lamp beads of the infrared light circuit are exposed to the outside through the housing;
  • the switch control end of the visible light circuit is connected with the control module, and the visible light strip of the visible light circuit is exposed to the outside through the casing.
  • control module includes a processor chip and a wireless communication chip connected to the processor chip, and both the switch control terminal of the infrared light circuit and the switch control terminal of the visible light circuit are connected to the wireless communication chip;
  • the handle device further includes an input device for user operation, and the input device is connected with the processor chip.
  • the wireless communication chip is a Bluetooth low energy chip.
  • the handle device further includes an inertial measurement unit, and the inertial measurement unit is connected with the wireless communication chip.
  • An image acquisition method based on the handle device described in the above embodiments; the method is implemented by a head-mounted device, and the head-mounted device is provided with at least one camera device.
  • the method includes:
  • the environment image is an image of the environment where the head-mounted device is located
  • the handle image is an image of the handle device
  • the corresponding camera device is controlled to collect the environment image and the handle image in time-division during the image collection period.
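The time-division collection described above can be sketched in a few lines. This is a minimal illustration only, not the patent's implementation; the function name, the microsecond units, and the guard gap between the two exposures are assumptions.

```python
def schedule_exposures(cycle_start_us, first_exposure_us, second_exposure_us,
                       gap_us=100):
    """Place the environment exposure and the handle exposure in
    non-overlapping windows of one image collection cycle.

    Returns ((env_start, env_end), (handle_start, handle_end)) in microseconds.
    """
    env = (cycle_start_us, cycle_start_us + first_exposure_us)
    # Guard gap so the two exposures never overlap within the cycle.
    handle_start = env[1] + gap_us
    handle = (handle_start, handle_start + second_exposure_us)
    return env, handle
```

For example, with an 8 ms environment exposure and a 0.5 ms handle exposure, the handle window would open 100 µs after the environment window closes.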
  • the head-mounted device includes a plurality of cameras, and the center points of the first exposure times of the plurality of cameras are aligned; and/or the second exposure times of the plurality of cameras are the same.
  • the method further includes:
  • According to the first exposure time and the second exposure time, obtain the first lighting time for the handle device to light up the infrared lamp beads and the second lighting time for the visible light strip within the image acquisition cycle;
  • the control information about the first lighting time and the second lighting time is sent to the handle device, so that the handle device can control the infrared light circuit and the visible light circuit according to the control information.
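One plausible way to derive a lighting time from an exposure time is to open the lighting window slightly before the exposure starts and close it slightly after the exposure ends, so the lamp is at stable brightness for the whole exposure. The margin value and function name below are assumptions for illustration, not taken from the patent.

```python
def lighting_window(exposure_start_us, exposure_us, margin_us=200):
    """Return (on_time, off_time) bracketing an exposure with a margin,
    so the infrared beads or visible strip are stable while the shutter is open."""
    return (exposure_start_us - margin_us,
            exposure_start_us + exposure_us + margin_us)
```

The head-mounted device would send such on/off pairs to the handle device as the control information for the first and second lighting times.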
  • the method further includes:
  • beacon frames for time synchronization are sent to the handle device.
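Beacon-frame time synchronization can be illustrated with a simple offset estimate: the handle records when each beacon (carrying the head-mounted device's send timestamp) arrives, and averages the differences. One-way propagation delay is assumed negligible at BLE range; this sketch is illustrative and not the patent's actual protocol.

```python
def estimate_clock_offset(beacons):
    """beacons: list of (t_sent_us, t_received_us) pairs, taken from the
    sender's and receiver's clocks respectively.

    Returns the mean receiver-minus-sender clock offset in microseconds."""
    offsets = [rx - tx for tx, rx in beacons]
    return sum(offsets) / len(offsets)
```

Averaging over several beacons smooths out jitter in the per-beacon delivery times.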
  • the method further includes:
  • the control information about the lighting brightness is sent to the handle device, so that the handle device can control the visible light circuit according to the control information.
  • A head-mounted device including a memory and a processor, wherein the memory stores a computer program and the processor is configured to execute the image acquisition method of the above embodiments under the control of the computer program.
  • a head-mounted system including the above-mentioned head-mounted device and the above-mentioned handle device, wherein the head-mounted device and the handle device are wirelessly connected.
  • The handle device of this embodiment is provided with an infrared light circuit and a visible light circuit, and its control module can switch both circuits on and off.
  • FIG. 1 is a schematic structural diagram of a handle device to which an embodiment of the present disclosure can be applied;
  • FIG. 2 is a schematic diagram of an internal device of a handle device according to an embodiment of the present disclosure
  • FIG. 3 is a schematic flowchart of an image acquisition method for a handle device according to an embodiment of the present disclosure
  • FIG. 4 is a schematic diagram of an exposure process of a camera device of a head mounted device according to an embodiment of the present disclosure
  • FIG. 5 is a schematic diagram of controlling visible light points and infrared light patterns of a handle device according to an embodiment of the present disclosure
  • FIG. 6 is a schematic diagram of a hardware structure of a head mounted device according to an embodiment of the present disclosure
  • FIG. 7 is a schematic structural diagram of a head-mounted system according to an embodiment of the present disclosure.
  • Fig. 1 shows a schematic diagram of a mechanical structure of a handle device according to an embodiment
  • Fig. 2 shows a schematic diagram of an internal device of a handle device.
  • the handle device may include a housing 103 , a control module 207 , an infrared light circuit 201 and a visible light circuit 202 .
  • the color of the casing 103 can be selected according to the user's preference, such as black, white, black and white, and the like.
  • the casing 103 may be made of any material, such as plastic material, which is not limited herein.
  • the control module 207 , the infrared light circuit 201 and the visible light circuit 202 may be arranged on a circuit board of the handle device, and the circuit board is arranged in the housing 103 .
  • the switch control end of the infrared light circuit 201 is connected to the control module 207 , and the infrared lamp beads 101 of the infrared light circuit 201 are exposed through the housing 103 , and each infrared lamp bead 101 corresponds to an infrared light spot.
  • the part of the casing 103 corresponding to the infrared lamp bead 101 may be a hollow part or a transparent part, so that the infrared lamp bead 101 is exposed to the outside through the casing.
  • the infrared lamp beads 101 and the infrared light circuits 201 can be arranged in a one-to-one correspondence, so that each infrared lamp bead 101 can be independently controlled. All infrared lamp beads 101 may also correspond to one infrared light circuit 201 , or groups may correspond to different infrared light circuits 201 , etc., which are not limited herein.
  • The infrared light circuit 201 may include an infrared lamp bead 101 and a first controllable switch; the infrared lamp bead 101 and the first controllable switch are connected in series between the power supply terminal and the ground terminal, and the control terminal of the first controllable switch serves as the switch control terminal of the infrared light circuit 201.
  • the first controllable switch may be a transistor or a MOS transistor, etc., which is not limited herein.
  • The control module 207 can output, to the switch control terminal of the infrared light circuit 201, a control signal to turn on the infrared lamp bead 101, so that the infrared lamp bead 101 is lit; it can likewise output a control signal to turn off the infrared lamp bead 101, so that the infrared lamp bead 101 is extinguished.
  • the switch control end of the visible light circuit 202 is connected to the control module 207 , and the visible light strip 102 of the visible light circuit 202 protrudes from the casing.
  • Each visible light strip 102 corresponds to a visible light pattern.
  • the part of the casing corresponding to the visible light strip 102 can be a hollow part or a transparent part, so that the visible light strip 102 can be seen through the casing.
  • the handle device may be provided with one visible light strip 102 or at least two visible light strips 102.
  • the visible light strips 102 and the visible light circuits 202 can be arranged in a one-to-one correspondence, so that each visible light strip 102 can be independently controlled. All the visible light strips 102 may also correspond to one visible light circuit 202 , or may be grouped to correspond to different visible light circuits 202 , etc., which are not limited herein.
  • The visible light circuit 202 may include a visible light strip 102 and a second controllable switch; the visible light strip 102 and the second controllable switch are connected in series between the power supply terminal and the ground terminal, and the control terminal of the second controllable switch serves as the switch control terminal of the visible light circuit 202.
  • the second controllable switch may be a transistor or a MOS transistor, etc., which is not limited herein.
  • The control module 207 can output, to the switch control terminal of the visible light circuit 202, a control signal to turn on the visible light strip 102, so that the visible light strip 102 is lit; it can likewise output a control signal to turn off the visible light strip 102, so that the visible light strip 102 is turned off.
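The series lamp-plus-switch arrangement of both circuits can be modeled behaviorally: driving the switch control terminal high closes the controllable switch, completing the supply-to-ground path and lighting the lamp. The class and method names below are invented for illustration; this is not the patent's circuit design.

```python
class LightCircuit:
    """A lamp and a controllable switch in series between supply and ground."""

    def __init__(self):
        self.lit = False

    def set_control(self, level_high):
        # A high level on the control terminal closes the switch, so the lamp lights;
        # a low level opens the switch, so the lamp goes out.
        self.lit = bool(level_high)


infrared = LightCircuit()
visible = LightCircuit()
infrared.set_control(True)   # control module lights the IR beads
visible.set_control(False)   # visible strip stays off
```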
  • The visible light strip 102 of the visible light circuit 202 is exposed from the casing 103 to form a visible light pattern outside the casing 103; the pattern can be designed as needed.
  • The control module 207 may include a processor chip 203 and a wireless communication chip 204 connected to the processor chip 203; the switch control terminal of the infrared light circuit 201 and the switch control terminal of the visible light circuit 202 are both connected with the wireless communication chip 204.
  • The wireless communication chip 204 can send a corresponding control signal to the switch control terminal of the infrared light circuit 201 according to the control information received from the head-mounted device for lighting the infrared lamp beads 101, so as to control the infrared light circuit 201 to light up the infrared lamp beads 101.
  • the wireless communication chip 204 can send a corresponding control signal to the switch control terminal of the infrared light circuit 201 according to the control information received from the head-mounted device for extinguishing the infrared lamp beads 101 to control the infrared light circuit 201 to extinguish the infrared lamp beads 101.
  • the wireless communication chip 204 can send a corresponding control signal to the switch control terminal of the visible light circuit 202 according to the control information for lighting the visible light bar 102 received from the headset, so as to control the visible light circuit 202 to light the visible light bar 102 .
  • the wireless communication chip 204 may send a corresponding control signal to the switch control terminal of the visible light circuit 202 according to the control information received from the head-mounted device for turning off the visible light bar 102 to control the visible light circuit 202 to turn off the visible light bar 102 .
  • the handle device further includes an input device 205 for user operation, and the input device 205 is connected to the processor chip 203 .
  • the user can operate the button module 104 on the handle device and input corresponding instructions to the processor chip 203 in the handle device through the input device 205, thereby realizing the control of the handle device;
  • The user can also input corresponding instructions to the processor chip 203 through a touch screen module of the input device 205, so that the handle device can complete a series of operations.
  • the touch screen module may be arranged where the button module 104 is located.
  • the housing 103 may include a grip portion 106 for the user to hold and a positioning portion 105 for tracking and positioning, and the positioning portion is connected with the grip portion.
  • the above input device 205 may be disposed on the holding portion 106
  • the above infrared lamp beads 101 and visible light strips 102 may be disposed on the positioning portion 105 .
  • The positioning portion may have any shape; for example, it may be a curved band whose two ends are connected to the grip portion, or it may be ring-shaped.
  • The wireless communication chip 204 in the handle device may use a LoRa low-power module, a 2G IoT module, a Bluetooth low-energy module, etc., as long as it can transmit data signals wirelessly; a low-power chip is preferable, as it can greatly save energy.
  • The wireless communication chip 204 adopts a Bluetooth low energy chip, and its main function is to wirelessly transmit the data information measured by the inertial measurement unit 206 in the handle device to the head-mounted device;
  • Data input by the user through the button module or the touch screen module of the input device 205 is likewise transmitted to the head-mounted device.
  • A wireless communication chip also exists in the head-mounted device, and the above Bluetooth low energy chip can likewise be used; through its wireless transmission function, time synchronization with the wireless communication chip 204 in the handle device can be realized, as well as the wireless connection with the handle device.
  • the Bluetooth low energy chip in the handle device can receive the control information sent by the Bluetooth low energy chip in the head-mounted device, and is used to send the corresponding control signal to the switch control end of the visible light circuit 202 , or send a corresponding control signal to the switch control terminal of the infrared light circuit 201 .
  • the handle device further includes an inertial measurement unit 206, the main function of which is to collect the attitude data information of the handle device.
  • According to the attitude data information of the handle device, prediction and tracking of position and attitude can be realized; to reduce the error of the handle attitude data, operations such as static and dynamic calibration and temperature compensation are applied.
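The static calibration mentioned above can be illustrated with the standard gyroscope bias estimate: average the readings while the handle is known to be at rest, then subtract that mean from later samples. This is the generic technique, not the patent's specific calibration procedure; function names are invented.

```python
def static_gyro_bias(rest_samples):
    """Per-axis mean of gyro readings taken while the handle is stationary.
    An ideal stationary gyro reads zero, so the mean is the per-axis bias."""
    n = len(rest_samples)
    return tuple(sum(s[i] for s in rest_samples) / n for i in range(3))


def correct(sample, bias):
    """Subtract the calibrated bias from a raw gyro sample."""
    return tuple(s - b for s, b in zip(sample, bias))
```

Temperature compensation would extend this by modeling the bias as a function of the measured temperature rather than a constant.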
  • A motor drive circuit is provided in the handle device to control a linear motor, mainly to give better haptic feedback in actual use of the handle device.
  • The motor drive circuit is mainly used in games, supplementing the button module.
  • The touch function can provide richer input information and improve the user's virtual reality experience.
  • Before capturing an image, the camera of the head-mounted device needs to acquire the first exposure time for the environment image and the second exposure time for the handle image within an image collection cycle; then, according to the first exposure time and the second exposure time, the exposure parameters of the corresponding camera are set (the exposure parameters are written directly into the memory of the head-mounted device); finally, the corresponding camera is controlled according to its exposure parameters.
  • the camera device time-divisionally collects the environment image and the handle image in the image collection period.
  • FIG. 3 shows a schematic flowchart of an image acquisition method of a handle device according to an embodiment. As shown in FIG. 3, the method may include the following steps S301 to S303:
  • Step S301: Acquire the first exposure time of the camera for the environment image and the second exposure time for the handle image in an image acquisition cycle, where the environment image is an image of the environment in which the head-mounted device is located and the handle image is an image of the handle device.
  • The head-mounted device includes a plurality of cameras. The center points of the first exposure times acquired before the cameras capture images are aligned; and/or the second exposure times acquired before the cameras capture images are identical.
  • FIG. 4 shows a schematic diagram of an exposure process of a camera device of a head mounted device according to an embodiment.
  • Suppose the head-mounted device includes four cameras. The centers of the first exposure times of the four cameras (camera 1, camera 2, camera 3 and camera 4) are aligned, as shown in FIG. 4: the first exposure time 401 of camera 1, the first exposure time 402 of camera 2, the first exposure time 403 of camera 3 and the first exposure time 404 of camera 4 have aligned centers but different durations. The second exposure time 405 of camera 1, the second exposure time 406 of camera 2, the second exposure time 407 of camera 3 and the second exposure time 408 of camera 4 are identical in duration, and their centers are also aligned.
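As a minimal illustration of this center-alignment rule (a hypothetical sketch, not part of the patent), the start time of each exposure can be computed from its duration and a common center point:

```python
def aligned_starts(durations_us, center_us):
    """Return per-camera exposure start times such that exposures of
    different durations all share the same center point."""
    return [center_us - d / 2 for d in durations_us]

# Four cameras with different first exposure times (microseconds)
durations = [8000, 6000, 4000, 2000]
starts = aligned_starts(durations, center_us=10000)
# Every exposure midpoint coincides with the common center
assert all(s + d / 2 == 10000 for s, d in zip(starts, durations))
```

The shorter the exposure, the later it starts, so all midpoints coincide even though the durations differ.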
  • Step S302: Set the exposure parameters of the corresponding camera according to its first exposure time and second exposure time.
  • While capturing images, the cameras are affected by ambient light, and their exposure parameters also differ, so after each camera captures an environment image the timestamp data of the captured images differ. To ensure that every frame captured by the cameras carries the same timestamp data, the VTS of each camera is set so that the exposure center points are aligned, as shown in FIG. 4: with four cameras (camera 1, camera 2, camera 3, camera 4), in VTS 11 the first exposure times of the four cameras differ, but their center points must be aligned; in VTS 21 the second exposure times of the four cameras must be identical, and their center points are likewise aligned.
  • Step S303: According to the exposure parameters of the camera, control the corresponding camera to capture the environment image and the handle image in a time-division manner within the image acquisition cycle.
  • When capturing the handle image, because the visible light pattern and infrared light spots of the handle device are bright compared with the ambient light, a low exposure parameter (tens to hundreds of microseconds) is usually used. This reduces the power consumption of the handle system and also reduces the influence of the external environment on the visible light pattern and infrared light spots of the handle device; in addition, higher exposure parameters can be adopted for different environments.
  • In one embodiment, images are captured at 60 Hz and 30 Hz: the cameras capture images at 60 Hz, while some of them track the environment image at 30 Hz to complete the tracking function of the head-mounted device, and some track the visible light pattern and infrared light spots on the handle device at 30 Hz to complete the tracking function of the handle device. Image capture can also be performed at 60 Hz and 90 Hz; the specific arrangement can be set freely by the user.
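One way to picture such a time-division schedule (an illustrative sketch; the alternating frame ordering is an assumption, not mandated by the patent) is to interleave environment and handle frames so that one 60 Hz camera serves two 30 Hz tracking streams:

```python
def frame_schedule(fps=60, n_frames=8):
    """Alternate environment and handle frames; each stream then runs at fps/2.
    Returns a list of (frame_kind, capture_time_ms) tuples."""
    period_ms = 1000.0 / fps
    return [("environment" if i % 2 == 0 else "handle", i * period_ms)
            for i in range(n_frames)]

sched = frame_schedule()
env_times = [t for kind, t in sched if kind == "environment"]
# Environment frames arrive every two periods, i.e. at 30 Hz
assert abs((env_times[1] - env_times[0]) - 1000.0 / 30) < 1e-9
```

The handle frames occupy the remaining slots, so handle tracking also runs at 30 Hz without adding cameras.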
  • The cameras need to operate synchronously, and the infrared and visible light circuits are turned on only while the handle image is being exposed, which helps reduce the power consumption of the handle device. When the cameras operate synchronously, their exposure center points must be aligned, so that every frame captured by the cameras carries the same timestamp data.
  • According to the second exposure time, a first lighting time for lighting the infrared lamp beads and a second lighting time for lighting the visible light strip of the handle device within the image acquisition cycle are obtained; control information about the first lighting time and the second lighting time is sent to the handle device, so that the handle device can control the infrared light circuit and the visible light circuit according to the control information.
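The relationship between the headset's handle-image exposure and the handle's lighting window can be sketched as follows (the guard value is hypothetical; the patent only states that the beads stay lit longer than the exposure to absorb synchronization error):

```python
def lighting_window(exposure_start_us, exposure_us, guard_us=200):
    """Light the IR beads slightly before the handle-image exposure starts and
    keep them on slightly after it ends, to tolerate wireless-sync deviation
    and crystal-oscillator drift between headset and handle."""
    return (exposure_start_us - guard_us,
            exposure_start_us + exposure_us + guard_us)

on, off = lighting_window(5000, 300)
assert on < 5000 and off > 5300      # window fully encloses the exposure
assert (off - on) == 300 + 2 * 200   # lit longer than the exposure itself
```

A wider guard tolerates more timing error at the cost of slightly higher handle power consumption.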
  • FIG. 5 shows a schematic diagram of the control of the visible light and infrared light of a handle device according to an embodiment. As shown in FIG. 5, during the handle device exposure 501, the infrared light circuit 201 and the visible light circuit 202 are turned on, that is, the infrared light circuit 201 lights the infrared lamp beads and the visible light circuit 202 lights the visible light LED strip; during the head-mounted device exposure 502, the infrared light circuit 201 and the visible light circuit 202 are turned off.
  • Because the head-mounted device and the handle device are enabled synchronously over a wireless link through the wireless communication chips 204, there are deviations in the control information as well as deviations caused by the system crystal oscillators. The infrared lamp beads are therefore lit for longer than the head-mounted device exposure 502, and during the head-mounted device exposure 502 an anti-flicker pulse 503 is added to the visible light pattern, keeping the visible light LED strip refreshing at 90 Hz; the refresh frequency of the strip can be set by the user.
  • The anti-flicker pulse 503 added to the visible light pattern is controlled as a narrow pulse with a high current output, which further ensures that it does not light up while the head-mounted device captures its tracking image. The anti-flicker pulse not only reduces the power consumption of the visible light, but also does not affect the tracking function of the head-mounted device.
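The power-saving effect of driving the strip with narrow, high-current pulses can be estimated with a simple duty-cycle calculation (the pulse width and current below are illustrative numbers, not values from the patent):

```python
def avg_current_ma(pulse_ma, pulse_ms, refresh_hz):
    """Average LED current when a narrow pulse repeats at refresh_hz."""
    period_ms = 1000.0 / refresh_hz
    return pulse_ma * (pulse_ms / period_ms)

# A 0.1 ms, 300 mA pulse refreshed at 90 Hz
avg = avg_current_ma(300, 0.1, 90)
assert avg < 3  # under 3 mA on average, versus 300 mA for continuous drive
```

Because the pulse occupies under 1% of each 90 Hz period, the strip appears steady to the user while drawing only a small fraction of the continuous-drive current.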
  • A time gap for Bluetooth transmission between the head-mounted device and the handle device is monitored; during this time gap, a beacon frame for time synchronization is sent to the handle device.
  • As described above, the head-mounted device and the handle device are connected wirelessly through the wireless communication chips 204; BLE and a 2.4 GHz proprietary protocol can both be used. BLE mainly carries the handle attitude data and the input data of the input apparatus, taking advantage of its strong anti-interference capability to ensure stable transmission. The 2.4 GHz proprietary protocol uses the time gap in BLE transmission between the head-mounted device and the handle device, after the data transfer of one BLE interval completes, to send a time beacon packet. Because the 2.4 GHz proprietary protocol has no complex protocol-stack operations, it can directly read and write the low-level RF transmit and receive path, and there is no data retransmission, so the interval from sending a packet to receiving it is essentially fixed. Using this property, the time systems of the head-mounted device and the handle device can be synchronized well. Since the local clocks of the handle device and the head-mounted device are accurate, the time systems do not need to be synchronized frequently: one synchronization at a basic interval (e.g. 1 s) meets the requirement and also avoids the power consumption caused by frequent wireless communication.
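Because the send-to-receive interval of the proprietary 2.4 GHz link is essentially fixed, a single beacon suffices to estimate the clock offset between the two devices. A minimal sketch (function names and timing values are hypothetical):

```python
def clock_offset(beacon_sent_head, beacon_received_handle, fixed_latency):
    """Offset of the handle clock relative to the headset clock, assuming a
    constant over-the-air latency from send to receive."""
    return beacon_received_handle - (beacon_sent_head + fixed_latency)

# Handle clock runs 5 ms ahead of the headset clock; link latency is 1.2 ms
offset = clock_offset(1000.0, 1006.2, 1.2)
assert abs(offset - 5.0) < 1e-9
# The handle then corrects its timestamps by subtracting this offset
```

With accurate local oscillators, repeating this estimate about once per second keeps the two time systems aligned without the power cost of frequent radio traffic.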
  • The distance between the head-mounted device and the handle device is obtained; according to the distance, the lighting brightness of the visible light strip is determined; and control information about the lighting brightness is sent to the handle device, so that the handle device can control the visible light circuit according to the control information.
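A simple way to realize the distance-to-brightness mapping is sketched below (the linear mapping and the level range are assumptions for illustration; the patent does not prescribe a specific curve):

```python
def brightness_for_distance(distance_cm, min_cm=10, max_cm=120,
                            min_level=10, max_level=100):
    """Map headset-to-handle distance onto a strip brightness level:
    dimmer when near (to avoid pixel saturation), brighter when far
    (to avoid an overly dark handle image)."""
    d = min(max(distance_cm, min_cm), max_cm)  # clamp to the working range
    frac = (d - min_cm) / (max_cm - min_cm)
    return min_level + frac * (max_level - min_level)

assert brightness_for_distance(10) == 10    # nearest: minimum brightness
assert brightness_for_distance(120) == 100  # farthest: maximum brightness
```

Out-of-range distances are clamped to the working range before mapping, so the brightness command stays within the strip's valid levels.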
  • The environments in which the handle device is used are complex and diverse, and the visible light pattern and infrared light spots are strongly affected by external ambient light. Therefore, when the handle device is used for the first time (at power-on), the infrared lamp beads and the visible light strip must be scanned from the lowest brightness to the highest, until the head-mounted device can successfully compute and estimate the position and attitude of the handle device from the infrared light spots and the visible light pattern.
  • Furthermore, when the handle device comes very close to a camera of the head-mounted device, the pixels of the captured image become oversaturated; when the handle device is far from the camera, the pixels become too dark, which degrades tracking and tracking accuracy. The distance between the handle device and the head-mounted device is managed as follows:
  • After the head-mounted device locks the position and brightness of the handle device (the brightness of the infrared light spots and the visible light pattern), the processor chip predicts, from the data of the inertial measurement unit 206, the position of the handle image in the next frame, as well as its exposure time, exposure gain and the brightness of the next frame. The exposure time and exposure gain are written directly into the memory of the head-mounted device, which ensures real-time validity.
  • In practice, the control of the handle device's brightness is subject to wireless transmission delay and possible packet loss, so the overall control logic can focus on adjusting the exposure gain. Provided the visible light pattern and infrared light spots remain usable, a good tracking effect can be achieved over the user's working distance (e.g. 10 cm to 120 cm) by adjusting the exposure parameters of the cameras. Thus, by also adjusting the brightness of the handle device body, better accuracy can be achieved whether the handle device is close to or far from the head-mounted device.
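The power-on brightness scan described above can be sketched as a simple loop (the `pose_solver` callback is hypothetical, standing in for the headset's spot-and-pattern pose estimation):

```python
def startup_brightness_scan(pose_solver, levels=range(1, 11)):
    """Sweep the IR beads and visible strip from lowest to highest brightness
    until the headset can compute the handle pose from the light it sees."""
    for level in levels:
        pose = pose_solver(level)   # None until the lights are bright enough
        if pose is not None:
            return level, pose
    return None, None               # pose never locked at any brightness

# Hypothetical solver that first succeeds at brightness level 4
level, pose = startup_brightness_scan(lambda b: ("pose", b) if b >= 4 else None)
assert level == 4
```

Starting from the lowest level keeps the handle at the dimmest setting that still allows a pose lock, which saves power and avoids saturating the cameras.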
  • The present disclosure also provides a head-mounted device, including a memory and a processor, where the memory stores a computer program and the processor is configured to execute, under the control of the computer program, the image acquisition method shown in FIG. 3.
  • FIG. 6 is a schematic diagram of a hardware structure of a head mounted device according to an embodiment.
  • The memory 602 stores a computer program, and the computer program controls the processor 601 to operate so as to execute an image acquisition method according to an embodiment of the present disclosure.
  • A skilled person can design the computer program according to the solution disclosed herein. How a computer program controls a processor to operate is well known in the art and is not described in detail here.
  • The computer program may be written using the instruction set of an architecture such as x86, Arm, RISC, MIPS or SSE.
  • the memory 602 includes, for example, a ROM (Read Only Memory), a RAM (Random Access Memory), a nonvolatile memory such as a hard disk, and the like.
  • The present disclosure also provides a head-mounted system. FIG. 7 is a schematic structural diagram of the head-mounted system of an embodiment; as shown in FIG. 7, it includes a head-mounted device 701 and a handle device 702, where the head-mounted device 701 is wirelessly connected to the handle device 702.
  • The head-mounted device, the handle device and the wireless connection have been discussed above and are not repeated here.
  • Embodiments of the present disclosure may be systems, methods and/or computer program products.
  • The computer program product may include a computer-readable storage medium carrying computer-readable program instructions for causing a processor to implement various aspects of the present invention.
  • a computer-readable storage medium may be a tangible device that can hold and store instructions for use by the instruction execution device.
  • the computer-readable storage medium may be, for example, but not limited to, an electrical storage device, a magnetic storage device, an optical storage device, an electromagnetic storage device, a semiconductor storage device, or any suitable combination of the foregoing.
  • A more specific, non-exhaustive list of examples of the computer-readable storage medium includes: a portable computer disk, a hard disk, random access memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM or flash memory), static random access memory (SRAM), portable compact disc read-only memory (CD-ROM), digital versatile disc (DVD), a memory stick, a floppy disk, a mechanically encoded device such as a punch card or raised structures in a groove with instructions stored thereon, and any suitable combination of the foregoing.
  • Computer-readable storage media, as used herein, are not to be construed as transitory signals per se, such as radio waves or other freely propagating electromagnetic waves, electromagnetic waves propagating through waveguides or other transmission media (for example, light pulses through fiber-optic cables), or electrical signals transmitted through wires.
  • the computer readable program instructions described herein may be downloaded to various computing/processing devices from a computer readable storage medium, or to an external computer or external storage device over a network such as the Internet, a local area network, a wide area network, and/or a wireless network.
  • the network may include copper transmission cables, fiber optic transmission, wireless transmission, routers, firewalls, switches, gateway computers, and/or edge servers.
  • A network adapter card or network interface in each computing/processing device receives the computer-readable program instructions from the network and forwards them for storage in a computer-readable storage medium within each computing/processing device.
  • The computer program instructions for carrying out the operations of the present invention may be assembly instructions, instruction set architecture (ISA) instructions, machine instructions, machine-dependent instructions, microcode, firmware instructions, state-setting data, or source code or object code written in any combination of one or more programming languages, including object-oriented programming languages such as Smalltalk or C++, and conventional procedural programming languages such as the C language or similar languages.
  • The computer-readable program instructions may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server.
  • Where a remote computer is involved, it may be connected to the user's computer through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computer (for example, through the Internet using an Internet service provider).
  • In some embodiments, electronic circuits, such as programmable logic circuits, field programmable gate arrays (FPGAs) or programmable logic arrays (PLAs), can be personalized by utilizing the state information of the computer-readable program instructions, and these electronic circuits can execute the computer-readable program instructions to implement various aspects of the present invention.
  • These computer-readable program instructions may be provided to a processor of a general-purpose computer, a special-purpose computer or other programmable data processing apparatus to produce a machine, such that the instructions, when executed by the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in one or more blocks of the flowcharts and/or block diagrams.
  • These computer-readable program instructions may also be stored in a computer-readable storage medium and cause a computer, programmable data processing apparatus and/or other equipment to operate in a specific manner, so that the computer-readable medium on which the instructions are stored comprises an article of manufacture including instructions that implement various aspects of the functions/acts specified in one or more blocks of the flowcharts and/or block diagrams.
  • The computer-readable program instructions may also be loaded onto a computer, other programmable data processing apparatus or other equipment, so that a series of operational steps is performed on the computer, other programmable data processing apparatus or other equipment to produce a computer-implemented process, whereby the instructions executing there implement the functions/acts specified in one or more blocks of the flowcharts and/or block diagrams.
  • Each block in the flowcharts or block diagrams may represent a module, program segment or portion of instructions that comprises one or more executable instructions for implementing the specified logical function(s).
  • The functions noted in the blocks may occur out of the order noted in the figures; for example, two blocks shown in succession may in fact be executed substantially concurrently, or sometimes in the reverse order, depending on the functionality involved.
  • Each block of the block diagrams and/or flowcharts, and combinations of blocks therein, can be implemented by dedicated hardware-based systems that perform the specified functions or acts, or by a combination of dedicated hardware and computer instructions. It is well known to those skilled in the art that implementation in hardware, implementation in software, and implementation in a combination of software and hardware are all equivalent.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Signal Processing (AREA)
  • Remote Sensing (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Electromagnetism (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Biophysics (AREA)
  • Cardiology (AREA)
  • General Health & Medical Sciences (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Studio Devices (AREA)

Abstract

Embodiments of the present disclosure relate to an image acquisition method, a handle device, a head-mounted device and a head-mounted system. The handle device includes a housing and a control module disposed in the housing; the switch control terminal of an infrared light circuit is connected to the control module, and the infrared lamp beads of the infrared light circuit show through the housing; the switch control terminal of a visible light circuit is connected to the control module, and the visible light strip of the visible light circuit shows through the housing. By means of the visible light and infrared light provided on the handle, the position of the handle device can be determined with extremely high tracking accuracy.

Description

Image acquisition method, handle device, head-mounted device and head-mounted system
CROSS-REFERENCE TO RELATED APPLICATION
This application is based on and claims priority to Chinese patent application CN202011176796.8, filed on October 28, 2020 and entitled "Image acquisition method, handle device, head-mounted device and head-mounted system", the disclosure of which is incorporated herein by reference in its entirety.
TECHNICAL FIELD
Embodiments of the present disclosure relate to the technical field of smart head-mounted devices, and more particularly to an image acquisition method, a handle device, a head-mounted device and a head-mounted system.
BACKGROUND
At present, most existing all-in-one head-mounted designs support 6-degrees-of-freedom (6DOF) head tracking; during tracking, the relative position of the head-mounted device and the handle device can be determined by optical, ultrasonic, electromagnetic and other schemes.
For electromagnetic tracking, owing to the characteristics of magnetic fields, the device is subject to interference from external magnetic fields, which affects accuracy and tracking quality, and the cost is very high. Ultrasonic tracking determines the relative distance of the handle device by means of ultrasonic transmitters and receivers, but ultrasound is easily affected by occlusion, reflection and other factors, so its measurement accuracy is poor. Optical tracking is currently based mainly on infrared light and is strongly affected by the external environment; for example, when the ambient light is strong, the light spots of the handle are weakened, making tracking unstable. It is therefore highly desirable to provide a scheme that can accurately determine the relative position of the handle device, so as to improve the tracking accuracy of the handle device and its stability across environments.
发明内容
本公开实施例提供了能够精确判断手柄设备的相对位置的新的技术方案。
根据本公开实施例的第一方面,提供了一种手柄设备,该手柄设备包 括壳体;
控制模块,控制模块设置在壳体中;
红外光电路,红外光电路的开关控制端与控制模块连接,红外光电路的红外灯珠经由壳体向外透出;以及,
可见光电路,可见光电路的开关控制端与控制模块连接,可见光电路的可见光灯条经由壳体向外透出。
在一示例性实施例中,控制模块包括处理器芯片和与处理器芯片连接的无线通信芯片,红外光电路的开关控制端和可见光电路的开关控制端均与无线通信芯片连接;
手柄设备还包括供用户操作的输入装置,输入装置与处理器芯片连接。
在一示例性实施例中,无线通信芯片为蓝牙低能耗芯片。
在一示例性实施例中,手柄设备还包括惯性测量单元,惯性测量单元与无线通信芯片连接。
根据本公开实施例的第二方面,还提供了一种基于上述实施例中所述的手柄设备的图像采集方法,该方法由头戴设备实施,所述头戴设备设置有至少一个摄像装置,该方法包括:
获取摄像装置在一图像采集周期中对于环境图像的第一曝光时间和对于手柄图像的第二曝光时间,其中,环境图像为头戴设备所在环境的图像,手柄图像为手柄设备的图像;
根据摄像装置的第一曝光时间和第二曝光时间,设置对应摄像装置的曝光参数;
根据摄像装置的曝光参数,控制对应摄像装置在图像采集周期分时采集所述环境图像和手柄图像。
在一示例性实施例中,头戴设备包括多个摄像装置,多个摄像装置的第一曝光时间的中心点对齐;和/或,多个摄像装置的第二曝光时间相同。
在一示例性实施例中,方法还包括:
根据第二曝光时间,获得手柄设备在图像采集周期点亮红外灯珠的第 一点亮时间和点亮可见光灯条的第二点亮时间;
将关于第一点亮时间和第二点亮时间的控制信息发送至手柄设备,以供手柄设备根据控制信息进行红外光电路和可见光电路的控制。
在一示例性实施例中,方法还包括:
监测头戴设备与手柄设备之间进行BLE传输的时间间隙;
在时间间隙,向手柄设备发送用于进行时间同步的信标帧。
在一示例性实施例中,方法还包括:
获取头戴设备与所述手柄设备之间的距离;
根据距离,确定可见光灯条的点亮亮度;
将关于点亮亮度的控制信息发送至手柄设备,以供手柄设备根据控制信息进行可见光电路的控制。
根据本公开实施例的第三方面,还提供了一种头戴设备,包括存储器和处理器,存储器存储有计算机程序,处理器用于在计算机程序的控制下,执行上述实施例中的图像采集方法。
根据本公开实施例的第四方面,还提供了一种头戴系统,包括上述的头戴设备和上述的手柄设备,其中,头戴设备与所述手柄设备通过无线连接。
根据本实施例的手柄设备,其设置有红外光电路和可见光电路,并能够通过其控制模块进行可见光电路和红外光电路的开关控制,对于该种手柄设备,头戴设备通过采集该手柄设备的手柄图像,并根据手柄图像中的可见光图案和红外光点,便能够明确地计算出手柄设备所在的位置,实现手柄设备的追踪与定位,因此,这可以实现手柄设备位置的判断,实现高精度追踪,解决在任意环境下难于准确判断手柄设备位置的问题。
通过以下参照附图对本发明的示例性实施例的详细描述,本发明的其它特征及其优点将会变得清楚。
BRIEF DESCRIPTION OF THE DRAWINGS
The accompanying drawings, which are incorporated in and constitute a part of the specification, illustrate embodiments of the present invention and, together with the description, serve to explain the principles of the invention.
FIG. 1 is a schematic structural diagram of a handle device to which embodiments of the present disclosure can be applied;
FIG. 2 is a schematic diagram of the internal apparatus of a handle device according to an embodiment of the present disclosure;
FIG. 3 is a schematic flowchart of an image acquisition method of a handle device according to an embodiment of the present disclosure;
FIG. 4 is a schematic diagram of the exposure process of the cameras of a head-mounted device according to an embodiment of the present disclosure;
FIG. 5 is a schematic diagram of the control of the visible light and infrared light of a handle device according to an embodiment of the present disclosure;
FIG. 6 is a schematic diagram of the hardware structure of a head-mounted device according to an embodiment of the present disclosure;
FIG. 7 is a schematic structural diagram of a head-mounted system according to an embodiment of the present disclosure.
具体实施方式
现在将参照附图来详细描述本发明的各种示例性实施例。应注意到:除非另外具体说明,否则在这些实施例中阐述的部件和步骤的相对布置、数字表达式和数值不限制本发明的范围。
以下对至少一个示例性实施例的描述实际上仅仅是说明性的,决不作为对本发明及其应用或使用的任何限制。
对于相关领域普通技术人物已知的技术、方法和设备可能不作详细讨论,但在适当情况下,技术、方法和设备应当被视为说明书的一部分。
在这里示出和讨论的所有例子中,任何具体值应被解释为仅仅是示例性的,而不是作为限制。因此,示例性实施例的其它例子可以具有不同的值。
应注意到:相似的标号和字母在下面的附图中表示类似项,因此,一旦某一项在一个附图中被定义,则在随后的附图中不需要对其进行进一步讨论。
图1示出了根据一个实施例的一种手柄设备的机械结构的示意图以及图2示出了一种手柄设备的内部装置示意图。
结合图1与图2所示,该手柄设备可以包括壳体103、控制模块207、红外光电路201和可见光电路202。
壳体103的颜色可根据用户的喜好进行选择,例如黑色、白色、黑白相间等。壳体103可以采用任意材质制成,例如塑料材质等,在此不做限定。
控制模块207、红外光电路201和可见光电路202可以设置在手柄设备的电路板上,该电路板设置在壳体103中。
红外光电路201的开关控制端与所述控制模块207连接,所述红外光电路201的红外灯珠101经由壳体103向外透出,每一红外灯珠101对应一个红外光点。该壳体103的对应红外灯珠101的部位可以为镂空部位,也可以为透明部位,以使得红外灯珠101经由壳体向外透出。
红外灯珠101与红外光电路201可以一一对应设置,以使得每一红外灯珠101可以被独立控制。所有红外灯珠101也可以对应一个红外光电路201,或者分组对应不同的红外光电路201等,在此不做限定。
红外光电路201可以包括红外灯珠101和第一可控开关,红外灯珠101与第一可控开关串联连接在电源端与接地端之间,第一可控开关的控制端即为红外光电路201的控制端。该第一可控开关可以为晶体管或者MOS管等,在此不做限定。
控制模块207可以向红外光电路201的开关控制端输出点亮红外灯珠101的控制信号,以使得红外灯珠101被点亮;控制模块207也可以向红外光电路201的开关控制端输出熄灭红外灯珠101的控制信号,以使得红外灯珠101被熄灭。
可见光电路202的开关控制端与所述控制模块207连接,所述可见光电路202的可见光灯条102由所述壳体向外透出。每一可见光灯条102对应一种可见光图案。该壳体的对应可见光灯条102的部位可以为镂空部位,也可以为透明部位,以使得可见光灯条102经由壳体向外透出。
手柄设备可以设置有一个可见光灯条102或者至少两个可见光灯条 102。可见光灯条102与可见光电路202可以一一对应设置,以使得每一可见光灯条102可以被独立控制。所有可见光灯条102也可以对应一个可见光电路202,或者分组对应不同的可见光电路202等,在此不做限定。
可见光电路202可以包括可见光灯条102和第二可控开关,可见光灯条102与该第二可控开关串联连接在电源端与接地端之间,该第二可控开关的控制端即为可见光电路202的控制端。该第二可控开关可以为晶体管或者MOS管等,在此不做限定。
控制模块207可以向可见光电路202的开关控制端输出点亮可见光灯条102的控制信号,以使得可见光灯条102被点亮;控制模块207也可以向可见光电路202的开关控制端输出熄灭可见光灯条102的控制信号,以使得可见光灯条102被熄灭。
在一些实施例中,所述可见光电路202的可见光灯条经102由所述壳体103向外透出,在壳体103外形成可见光图案,所述可见光的图案形状不定,可由用户根据自己的需求进行设计。
在一些实施例中,所述控制模块207可以包括处理器芯片203和与所述处理器芯片203连接的无线通信芯片204,所述红外光电路201的开关控制端和所述可见光电路202的开关控制端均与所述无线通信芯片204连接;
无线通信芯片204可以根据从头戴设备接收到的点亮红外灯珠101的控制控制信息,向红外光电路201的开关控制端发送对应的控制信号,以控制红外光电路201点亮红外灯珠101。或者,无线通信芯片204可以根据从头戴设备接收到的熄灭红外灯珠101的控制控制信息,向红外光电路201的开关控制端发送对应的控制信号,以控制红外光电路201熄灭红外灯珠101。
无线通信芯片204可以根据从头戴设备接收到的点亮可见光灯条102的控制信息,向可见光电路202的开关控制端发送对应的控制信号,以控制可见光电路202点亮可见光灯条102。或者,无线通信芯片204可以根据从头戴设备接收到的熄灭可见光灯条102的控制信息,向可见光电路202的开关控制端发送对应的控制信号,以控制可见光电路202熄灭可见光灯条102。
在一些实施例中,所述手柄设备还包括供用户操作的输入装置205,所述输入装置205与所述处理器芯片203连接。
如图1所示,用户在操作过程中,可通过操作手柄设备上的按键模块104,通过输入装置205对手柄设备内的处理器芯片203输入相应的指令,进而实现手柄设备的操控;在一部分种类的手柄设备上,还附有可操作的显示屏,用户可通过触屏模块,通过输入装置205对手柄设备内的处理器芯片203输入相应的指令,进而使手柄设备完成一系列的操作,其中,触屏模块可设置在按键模块104所在。
在一些实施例中,如图1所示,该壳体103可以包括供用户握持的握持部106和用于追踪定位的定位部105,所述定位部与握持部连接。该实施例中,以上输入装置205可以设置在握持部106上,以上红外灯珠101和可见光灯条102可以在定位部105上设置。
该定位部可以具有任意形状,例如呈弯曲的带状,该定位部的两端与握持部连接,其也可具有任意形状,例如呈环形状。
在一些实施例中,手柄设备中的无线通信芯片204可采用LoRa低功耗模块、2G物联模块、蓝牙低功耗模块等,只要能够起到无线传输数据信号便可,其中,采用蓝牙低能耗芯片是为最佳,能够大幅度的节省能源成本。
在一个实施例中,所述无线无线通信芯片204采用蓝牙低能耗芯片,其主要功能是将手柄设备中的惯性测量单元206所测量的数据信息无线传输到头戴设备中去;以及还可以将手柄设备中操作按键模块或触屏模块通过输入装置205所输入的数据信息传输到头戴设备中去。
在一些实施例中,头戴设备之中也存在无线通信芯片204,可采用如上的蓝牙低能耗芯片,进而通过无线通信芯片204的无线传输功能实现与手柄设备中的无线通信芯片204的时间同步,以及实现与手柄设备的无线连接,例如,手柄设备中的蓝牙低能耗芯片可接收头戴设备中蓝牙低能耗芯片发出的控制信息,用于向可见光电路202的开关控制端发送对应的控制信号,或向红外光电路201的开关控制端发送对应的控制信号。
在一些实施例中,所述手柄设备还包括惯性测量单元206,其主要功能是采集手柄设备的姿态数据信息,根据手柄设备的姿态数据信息可以实现头戴设备位置和姿态的预测追踪,其中,为了降低手柄姿态数据信息的误差,手柄设备姿态数据信息会有静态和动态校准以及温度补偿等操作。
在一些实施例中,手柄设备中设置了马达驱动电路,用于控制线性马达,主要是为了手柄设备在实际体验中有更好的使用反馈,例如,马达驱动电路用于游戏部分,提升按键模块的触摸功能,可以提供更加丰富的输入信息,提升用户虚拟现实体验。
本实施例中,头戴设备的摄像装置在采集图像之前,需要获取所述摄像装置在一图像采集周期中对于环境图像的第一曝光时间和对于手柄图像的第二曝光时间,然后根据所述摄像装置的第一曝光时间和第二曝光时间,设置对应摄像装置的曝光参数,将曝光参数直接设置在所述头戴设备中的存储器中,最后才根据所述摄像装置的曝光参数,控制对应摄像装置在所述图像采集周期分时采集所述环境图像和手柄图像。
图3示出了根据一个实施例的一种手柄设备的图像采集方法流程示意图。如图3所示,该方法可以包括如下步骤S301~S303:
步骤S301:获取所述摄像装置在一图像采集周期中对于环境图像的第一曝光时间和对于手柄图像的第二曝光时间,其中,所述环境图像为所述头戴设备所在环境的图像,所述手柄图像为所述手柄设备的图像;
在一些实施例中,所述头戴设备包括多个摄像装置,在多个摄像装置采集图像之前所获取的对于环境图像的第一曝光时间,其中心点对齐;和/或,在多个摄像装置采集图像之前所获取的对于手柄图像的第二曝光时间,其各个相同。
如图4示出了根据一个实施例的一种头戴设备的摄像装置曝光过程示意图。如图4所示,设所述头戴设备包括四个摄像装置,四个摄像装置(camera 1、camera 2、camera 3、camera 4)的第一曝光时间的中心对齐,如图4中的第一个摄像装置(camera 1)的第一曝光时间401、第二个摄像装置(camera 2)的第一曝光时间402、第三个摄像装置(camera 3)的第一曝光时间403和第四个摄像装置(camera 4)的第一曝光时间404,它们的曝光时间中心对齐,但它们的第一曝光时间各不相同;而如图4所示的第一个摄像装置(camera 1)的第二曝光时间405、第二个摄像装置(camera2)的第二曝光时间406、第三个摄像装置(camera 3)的第二曝光时间407、第四个摄像装置(camera 4)的第二曝光时间408,它们的曝光时间相同以及它们的曝光时间中心也对齐。
步骤S302:根据所述摄像装置的第一曝光时间和第二曝光时间,设置对应摄像装置的曝光参数;
在一些实施例中,多个摄像装置在采集图像的过程中,会受到周围环境光线的影响,以及多个摄像装置的曝光参数也不同,所以在每个摄像装置采集环境图像后,所采集到的图像的时间戳数据不同。
为了保证各个摄像装置采集环境图像时,每帧图像具有相同的时间戳数据,需通过设置每个摄像装置的VTS来保证此刻曝光中心点的对齐,具体如何设置如图4所示:
若摄像装置为四个(camera 1、camera 2、camera 3、camera 4),在VTS 11中,四个摄像装置的第一曝光时间各不相同,但四个摄像装置的第一曝光时间需要各个中心点对齐;在VTS 21中,四个摄像装置的第二曝光时间需要相同,并且四个摄像装置的第二曝光时间各个中心点对齐。
步骤S303:根据所述摄像装置的曝光参数,控制对应摄像装置在所述图像采集周期分时采集所述环境图像和手柄图像。
在一些实施例中,在手柄图像采集时,由于手柄设备的可见光图案及红外光光点亮度和外界环境亮度比起来亮度高,所以通常使用较低的曝光参数(几十到几百微妙),这样做可以降低手柄系统的功耗,也可以降低外界环境对手柄设备的可见光图案和红外光点的影响;除此之外,也可对于不同的环境采取较高的曝光参数。
在一个实施例中,按照60Hz和30Hz进行图像采集,其中多个摄像装置可按照60Hz进行采集图像的工作,其中也有一部分摄像装置按照30Hz来追踪环境图像完成头戴设备追踪功能,也有一部分摄像装置按照30Hz追踪手柄设备上到的可见光图案和红外光点,完成手柄设备的追踪功能,也可按照60Hz和90Hz进行图像采集,具体如何安排可随用户任意设定。
在一些实施例中,多个摄像装置需同步进行操作,进而在手柄图像曝光时打开红外光电路和可见光电路,有利于降低手柄设备的功耗;以及多个摄像装置同步进行操作时,需要实现其曝光中心点对齐,进而保持多个摄像装置采集的每帧图像有相同的时间戳数据。
在一些实施例中,根据所述第二曝光时间,获得所述手柄设备在所述图像采集周期点亮红外灯珠的第一点亮时间和点亮可见光灯条的第二点亮时间;
将关于所述第一点亮时间和所述第二点亮时间的控制信息发送至手柄设备,以供所述手柄设备根据所述控制信息进行红外光电路和可见光电路的控制。
在一个实施例中,如图5示出的一个实施例的一种手柄设备的可见光点和红外光图案的控制示意图,如图5所示的,手柄设备曝光501时,打开红外光电路201和可见光电路202,即红外光电路201控制红外光灯珠亮起和可见光电路202控制可见光的LED灯条亮起;头戴设备曝光502时,关闭红外光电路201和可见光电路202。
其中,由于头戴设备和手柄设备是通过无线通信芯片204实现的无线同步开启的,所以带来的控制信息偏差以及系统晶振带来的偏差,所以所述红外灯珠点亮的时间长于头戴设备曝光502时间,并在头戴设备曝光502期间,所述可见光图案中加入抗闪烁脉冲503,并保持所述可见光的LED灯条按90Hz刷新,其中所述可见光的LED灯条刷新的频率可随用户自行设置。
在一些实施例中,所述可见光图案中加入的抗闪烁脉冲503是使用高电流输出的窄脉冲控制,进一步保证抗闪烁脉冲不会在头戴追踪图像时点亮,通过可见光图案中加入的抗闪烁脉冲不仅能降低可见光功耗,也不会对追踪头戴功能产生影响。
在一些实施例中,监测所述头戴设备与所述手柄设备之间进行蓝牙传输的时间间隙;在所述时间间隙,向所述手柄设备发送用于进行时间同步的信标帧。
如上所说,头戴设备和手柄设备之间是通过无线通信芯片204实现的无线连接,可采用BLE无线和2.4G私有协议无线,BLE无线主要实现手柄姿态数据信息、输入装置的输入数据信息的传输,利用BLE无线抗干扰能力强优势保证稳定传输;2.4G私有协议无线是在BLE无线一个interval数据传输完成以后,利用所述头戴设备与所述手柄设备之间进行BLE传输的时间间隙发送一个时间的beacon数据包,由于2.4G私有协议没有复杂的协议栈操作,可以直接读写RF发射和接收的底层部分,并且没有数据重传等操作,所以一个数据包从发送到接收的时间间隔基本固定,利用这一个特点,可以很好的将头戴设备和手柄设备的时间系统同步起来,其中,手柄设备和头戴设备本地的时间精度较高,所以不需要频繁地做时间系统的同步操作,基本间隔一定时间(如1s)做一次数据同步即可满足同步需求,还可以解决无线频繁通讯引起的功耗问题。
在一些实施例中,获取所述头戴设备与所述手柄设备之间的距离;根据所述距离,确定所述可见光灯条的点亮亮度;将关于所述点亮亮度的控制信息发送至手柄设备,以供所述手柄设备根据所述控制信息进行所述可见光电路的控制。
在一些实施例中,在使用手柄设备的过程中,所处的环境复杂多样,所以手柄设备所处的环境位置不同,可见光图案和红外光点受外部环境光线影响较大,所以手柄设备在第一次使用(开机时),需要红外光灯珠和可见光的可见光灯条从最低亮度到最高亮度扫描,直到头戴设备可以通过红外光点及可见光图案成功计算和估算手柄设备的位置和姿态。
在一些实施例中,用户在使用手柄设备时,当手柄设备近距离接近头戴设备的摄像装置时,会造成摄像装置采集的图像像素过饱和;而在手柄设备远离头戴设备的摄像装置时,又会造成摄像装置采集的图像像素过暗,影响追踪及追踪精度,其中,手柄设备和头戴设备之间的距离的如何把控,具体措施如下:
在头戴设备锁定手柄设备位置和亮度(红外光点和可见光图案的亮度)以后,处理器芯片会通过惯性测量单元206的数据信息预测下帧手柄图像的位置,同时也会预测到其曝光时间、曝光增益以及下帧图像的亮度。其中,曝光时间和曝光增益直接设置在头戴设备的存储器中,可以保证其实时性及有效性。
在实际应用中,对于手柄设备亮度的控制,会存在无线传输延时及无线丢包的可能性,所以整体的控制逻辑可以以调整曝光增益为主,保证可见光图案和红外光点的情况下,在用户的使用距离范围内(如10cm-120cm),通过调整摄像装置的曝光参数,可以实现较好的追踪效果。所以通过调整手柄设备本体的亮度,在手柄设备近距离和远距离靠近头戴设备的时,能够实现更好的精度。
本公开还提供了一种头戴设备,包括存储器和处理器,所述存储器存储有计算机程序,所述处理器用于在所述计算机程序的控制下,执行如图3所示的图像采集方法。
图6示出的一实施例的一种头戴设备硬件结构示意图,如图6所示,本实施例中,存储器602用于存储计算机程序,该计算机程序用于控制处理器601进行操作以执行根据本公开实施例的图像采集方法。技术人员可以根据本发明所公开方案设计该计算机程序。该计算机程序如何控制处理器进行操作,这是本领域公知,故在此不再详细描述。
其中,计算机程序可以采用比如x86、Arm、RISC、MIPS、SSE等架构的指令集编写。存储器602例如包括ROM(只读存储器)、RAM(随机存取存储器)、诸如硬盘的非易失性存储器等。
本公开还提供了一种头戴系统,如图7所示的实施例的一种头戴系统结构示意图,如图7所示,包括头戴设备701和手柄设备702,其中,所 述头戴设备701与所述手柄设备702通过无线连接。
本实施例中,头戴设备和手柄设备以及所述的无线连接皆已经论述过,在此就不再赘述。
本公开实施例可以是系统、方法和/或计算机程序产品。计算机程序产品可以包括计算机可读存储介质,其上载有用于使处理器实现本发明的各个方面的计算机可读程序指令。
计算机可读存储介质可以是可以保持和存储由指令执行设备使用的指令的有形设备。计算机可读存储介质例如可以是――但不限于――电存储设备、磁存储设备、光存储设备、电磁存储设备、半导体存储设备或者上述的任意合适的组合。计算机可读存储介质的更具体的例子(非穷举的列表)包括:便携式计算机盘、硬盘、随机存取存储器(RAM)、只读存储器(ROM)、可擦式可编程只读存储器(EPROM或闪存)、静态随机存取存储器(SRAM)、便携式压缩盘只读存储器(CD-ROM)、数字多功能盘(DVD)、记忆棒、软盘、机械编码设备、例如其上存储有指令的打孔卡或凹槽内凸起结构、以及上述的任意合适的组合。这里所使用的计算机可读存储介质不被解释为瞬时信号本身,诸如无线电波或者其他自由传播的电磁波、通过波导或其他传输媒介传播的电磁波(例如,通过光纤电缆的光脉冲)、或者通过电线传输的电信号。
这里所描述的计算机可读程序指令可以从计算机可读存储介质下载到各个计算/处理设备,或者通过网络、例如因特网、局域网、广域网和/或无线网下载到外部计算机或外部存储设备。网络可以包括铜传输电缆、光纤传输、无线传输、路由器、防火墙、交换机、网关计算机和/或边缘服务器。每个计算/处理设备中的网络适配卡或者网络接口从网络接收计算机可读程序指令,并转发该计算机可读程序指令,以供存储在各个计算/处理设备中的计算机可读存储介质中。
用于执行本发明操作的计算机程序指令可以是汇编指令、指令集架构(ISA)指令、机器指令、机器相关指令、微代码、固件指令、状态设置数 据、或者以一种或多种编程语言的任意组合编写的源代码或目标代码,编程语言包括面向对象的编程语言—诸如Smalltalk、C++等,以及常规的过程式编程语言—诸如“C”语言或类似的编程语言。计算机可读程序指令可以完全地在用户计算机上执行、部分地在用户计算机上执行、作为一个独立的软件包执行、部分在用户计算机上部分在远程计算机上执行、或者完全在远程计算机或服务器上执行。在涉及远程计算机的情形中,远程计算机可以通过任意种类的网络—包括局域网(LAN)或广域网(WAN)—连接到用户计算机,或者,可以连接到外部计算机(例如利用因特网服务提供商来通过因特网连接)。在一些实施例中,通过利用计算机可读程序指令的状态信息来个性化定制电子电路,例如可编程逻辑电路、现场可编程门阵列(FPGA)或可编程逻辑阵列(PLA),该电子电路可以执行计算机可读程序指令,从而实现本发明的各个方面。
这里参照根据本发明实施例的方法、装置(系统)和计算机程序产品的流程图和/或框图描述了本发明的各个方面。应当理解,流程图和/或框图的每个方框以及流程图和/或框图中各方框的组合,都可以由计算机可读程序指令实现。
这些计算机可读程序指令可以提供给通用计算机、专用计算机或其它可编程数据处理装置的处理器,从而生产出一种机器,使得这些指令在通过计算机或其它可编程数据处理装置的处理器执行时,产生了实现流程图和/或框图中的一个或多个方框中规定的功能/动作的装置。也可以把这些计算机可读程序指令存储在计算机可读存储介质中,这些指令使得计算机、可编程数据处理装置和/或其他设备以特定方式工作,从而,存储有指令的计算机可读介质则包括一个制造品,其包括实现流程图和/或框图中的一个或多个方框中规定的功能/动作的各个方面的指令。
也可以把计算机可读程序指令加载到计算机、其它可编程数据处理装置、或其它设备上,使得在计算机、其它可编程数据处理装置或其它设备上执行一系列操作步骤,以产生计算机实现的过程,从而使得在计算机、其它可编程数据处理装置、或其它设备上执行的指令实现流程图和/或框图中的一个或多个方框中规定的功能/动作。
附图中的流程图和框图显示了根据本发明的多个实施例的系统、方法和计算机程序产品的可能实现的体系架构、功能和操作。在这点上,流程图或框图中的每个方框可以代表一个模块、程序段或指令的一部分,模块、程序段或指令的一部分包含一个或多个用于实现规定的逻辑功能的可执行指令。在有些作为替换的实现中,方框中所标注的功能也可以以不同于附图中所标注的顺序发生。例如,两个连续的方框实际上可以基本并行地执行,它们有时也可以按相反的顺序执行,这依所涉及的功能而定。也要注意的是,框图和/或流程图中的每个方框、以及框图和/或流程图中的方框的组合,可以用执行规定的功能或动作的专用的基于硬件的系统来实现,或者可以用专用硬件与计算机指令的组合来实现。对于本领域技术人物来说公知的是,通过硬件方式实现、通过软件方式实现以及通过软件和硬件结合的方式实现都是等价的。
以上已经描述了本发明的各实施例,上述说明是示例性的,并非穷尽性的,并且也不限于所披露的各实施例。在不偏离所说明的各实施例的范围和精神的情况下,对于本技术领域的普通技术人物来说许多修改和变更都是显而易见的。本文中所用术语的选择,旨在最好地解释各实施例的原理、实际应用或对市场中的技术改进,或者使本技术领域的其它普通技术人物能理解本文披露的各实施例。本发明的范围由所附权利要求来限定。

Claims (12)

  1. A handle device, comprising:
    a housing;
    a control module disposed in the housing;
    an infrared light circuit, wherein a switch control terminal of the infrared light circuit is connected to the control module, and infrared lamp beads of the infrared light circuit show through the housing to the outside; and
    a visible light circuit, wherein a switch control terminal of the visible light circuit is connected to the control module, and a visible light strip of the visible light circuit shows through the housing to the outside.
  2. The handle device according to claim 1, wherein the control module comprises a processor chip and a wireless communication chip connected to the processor chip, and the switch control terminal of the infrared light circuit and the switch control terminal of the visible light circuit are both connected to the wireless communication chip;
    the handle device further comprises an input apparatus for user operation, the input apparatus being connected to the processor chip.
  3. The handle device according to claim 2, wherein the wireless communication chip is a Bluetooth low energy chip.
  4. The handle device according to claim 2, wherein the handle device further comprises an inertial measurement unit, the inertial measurement unit being connected to the wireless communication chip.
  5. An image acquisition method based on the handle device according to any one of claims 1 to 4, implemented by a head-mounted device provided with at least one camera, the method comprising:
    acquiring a first exposure time of the camera for an environment image and a second exposure time for a handle image in an image acquisition cycle, wherein the environment image is an image of the environment in which the head-mounted device is located, and the handle image is an image of the handle device;
    setting exposure parameters of the corresponding camera according to the first exposure time and the second exposure time of the camera; and
    controlling, according to the exposure parameters of the camera, the corresponding camera to capture the environment image and the handle image in a time-division manner within the image acquisition cycle.
  6. The image acquisition method according to claim 5, wherein the head-mounted device comprises a plurality of cameras, the center points of the first exposure times of the plurality of cameras are aligned; and/or the second exposure times of the plurality of cameras are identical.
  7. The image acquisition method according to claim 5, further comprising:
    obtaining, according to the second exposure time, a first lighting time for lighting the infrared lamp beads and a second lighting time for lighting the visible light strip of the handle device within the image acquisition cycle; and
    sending control information about the first lighting time and the second lighting time to the handle device, for the handle device to control the infrared light circuit and the visible light circuit according to the control information.
  8. The image acquisition method according to claim 7, further comprising:
    monitoring a time gap of Bluetooth transmission between the head-mounted device and the handle device; and
    sending, in the time gap, a beacon frame for time synchronization to the handle device.
  9. The image acquisition method according to claim 7, further comprising:
    obtaining a distance between the head-mounted device and the handle device;
    determining a lighting brightness of the visible light strip according to the distance; and
    sending control information about the lighting brightness to the handle device, for the handle device to control the visible light circuit according to the control information.
  10. A head-mounted device, comprising a memory and a processor, wherein the memory stores a computer program, and the processor is configured to execute, under the control of the computer program, the image acquisition method according to any one of claims 5 to 9.
  11. A head-mounted system, comprising the head-mounted device according to claim 10 and the handle device according to any one of claims 1 to 4, wherein the head-mounted device and the handle device are wirelessly connected.
  12. A computer-readable storage medium storing a computer program, wherein the computer program, when executed by a processor, implements the steps of the method according to any one of claims 5 to 9.
PCT/CN2021/118545 2020-10-28 2021-09-15 图像采集方法、手柄设备、头戴设备及头戴系统 WO2022089075A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
EP21884805.9A EP4142277A4 (en) 2020-10-28 2021-09-15 IMAGE ACQUISITION METHOD, HANDLE DEVICE, HEAD-MOUNTED DEVICE AND HEAD-MOUNTED SYSTEM
US17/817,634 US11754835B2 (en) 2020-10-28 2022-08-04 Image acquisition method, handle device, head-mounted device and head-mounted system

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202011176796.8 2020-10-28
CN202011176796.8A CN112437213A (zh) 2020-10-28 2020-10-28 Image acquisition method, handle device, head-mounted device and head-mounted system

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US17/817,634 Continuation US11754835B2 (en) 2020-10-28 2022-08-04 Image acquisition method, handle device, head-mounted device and head-mounted system

Publications (1)

Publication Number Publication Date
WO2022089075A1 true WO2022089075A1 (zh) 2022-05-05

Family

ID=74696372

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/118545 WO2022089075A1 (zh) 2020-10-28 2021-09-15 图像采集方法、手柄设备、头戴设备及头戴系统

Country Status (4)

Country Link
US (1) US11754835B2 (zh)
EP (1) EP4142277A4 (zh)
CN (1) CN112437213A (zh)
WO (1) WO2022089075A1 (zh)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112437213A (zh) 2020-10-28 2021-03-02 青岛小鸟看看科技有限公司 Image acquisition method, handle device, head-mounted device and head-mounted system

Citations (4)

Publication number Priority date Publication date Assignee Title
CN110622107A (zh) * 2017-05-09 2019-12-27 微软技术许可有限责任公司 Tracking wearable device and handheld object poses
JP2020052793A (ja) * 2018-09-27 2020-04-02 株式会社Nttドコモ System
CN111614915A (zh) * 2020-05-13 2020-09-01 深圳市欢创科技有限公司 Spatial positioning method, apparatus, system and head-mounted device
CN112437213A (zh) * 2020-10-28 2021-03-02 青岛小鸟看看科技有限公司 Image acquisition method, handle device, head-mounted device and head-mounted system

Family Cites Families (52)

Publication number Priority date Publication date Assignee Title
US7627139B2 (en) * 2002-07-27 2009-12-01 Sony Computer Entertainment Inc. Computer image and audio processing of intensity and input devices for interfacing with a computer program
US8019121B2 (en) * 2002-07-27 2011-09-13 Sony Computer Entertainment Inc. Method and system for processing intensity from input devices for interfacing with a computer program
US20040256561A1 (en) * 2003-06-17 2004-12-23 Allyson Beuhler Wide band light sensing pixel array
CN101502013A (zh) * 2006-10-23 2009-08-05 松下电器产业株式会社 Optical space transmission system using visible light and infrared light
US9147327B2 (en) * 2007-06-19 2015-09-29 Jeffrey A. Breier Multi-functional emergency device
US20150371083A1 (en) * 2008-04-24 2015-12-24 Ambrus Csaszar Adaptive tracking system for spatial input devices
DE102008042145A1 (de) * 2008-09-17 2010-03-18 Robert Bosch Gmbh Method and measuring arrangement for determining the wheel or axle geometry of a vehicle
JP2012242948A (ja) * 2011-05-17 2012-12-10 Sony Corp Display control apparatus and method, and program
WO2012166135A1 (en) * 2011-06-01 2012-12-06 Empire Technology Development,Llc Structured light projection for motion detection in augmented reality
US9608727B2 (en) * 2012-12-27 2017-03-28 Panasonic Intellectual Property Corporation Of America Switched pixel visible light transmitting method, apparatus and program
GB201314984D0 (en) * 2013-08-21 2013-10-02 Sony Comp Entertainment Europe Head-mountable apparatus and systems
US9360935B2 (en) * 2013-12-20 2016-06-07 Hong Kong Applied Science And Technology Research Institute Co. Ltd. Integrated bi-sensing optical structure for head mounted display
US9649558B2 (en) * 2014-03-14 2017-05-16 Sony Interactive Entertainment Inc. Gaming device with rotatably placed cameras
US10204406B2 (en) * 2014-11-05 2019-02-12 Illinois Tool Works Inc. System and method of controlling welding system camera exposure and marker illumination
WO2016120392A1 (en) * 2015-01-30 2016-08-04 Trinamix Gmbh Detector for an optical detection of at least one object
US10455887B2 (en) * 2015-05-21 2019-10-29 Justin London Fitness apparatus
US10368007B2 (en) * 2015-10-08 2019-07-30 Sony Interactive Entertainment Inc. Control apparatus, head-mounted display, control system, control method, and program
US10708573B2 (en) * 2016-01-04 2020-07-07 Occipital, Inc. Apparatus and methods for three-dimensional sensing
AU2017220404B2 (en) * 2016-02-18 2019-06-06 Apple Inc. Head-mounted display for virtual and mixed reality with inside-out positional, user body and environment tracking
US20170273665A1 (en) * 2016-03-28 2017-09-28 Siemens Medical Solutions Usa, Inc. Pose Recovery of an Ultrasound Transducer
JP2019514016A (ja) * 2016-04-19 2019-05-30 トリナミクス ゲゼルシャフト ミット ベシュレンクテル ハフツング Detector for optically detecting at least one object
CN109496144B (zh) * 2016-08-01 2022-08-16 索尼公司 Control apparatus, control system, and control method
US10888773B2 (en) * 2016-10-11 2021-01-12 Valve Corporation Force sensing resistor (FSR) with polyimide substrate, systems, and methods thereof
CN107960989B (zh) * 2016-10-20 2022-02-08 松下知识产权经营株式会社 Pulse wave measurement apparatus and pulse wave measurement method
CN106768361B (zh) * 2016-12-19 2019-10-22 北京小鸟看看科技有限公司 Position tracking method and system for a handle matched with a VR head-mounted device
US10447265B1 (en) * 2017-06-07 2019-10-15 Facebook Technologies, Llc Hand-held controllers including electrically conductive springs for head-mounted-display systems
WO2019000430A1 (en) * 2017-06-30 2019-01-03 Guangdong Virtual Reality Technology Co., Ltd. ELECTRONIC SYSTEMS AND TEXT INPUT METHODS IN A VIRTUAL ENVIRONMENT
US10175489B1 (en) * 2017-07-05 2019-01-08 Microsoft Technology Licensing, Llc Compact optical system with MEMS scanners for image generation and object tracking
CN107233729A (zh) * 2017-07-14 2017-10-10 歌尔科技有限公司 Virtual reality device control handle and virtual reality device
US10379628B2 (en) * 2017-09-27 2019-08-13 Htc Corporation Tracking system, virtual reality system and attachable device
US11662433B2 (en) * 2017-12-22 2023-05-30 Denso Corporation Distance measuring apparatus, recognizing apparatus, and distance measuring method
CN207704395U (zh) * 2018-01-10 2018-08-07 深圳创维新世界科技有限公司 Handle, augmented reality display system and head-mounted display system
US10475196B2 (en) * 2018-03-22 2019-11-12 Microsoft Technology Licensing, Llc Hybrid depth detection and movement detection
US10565678B2 (en) * 2018-03-23 2020-02-18 Microsoft Technology Licensing, Llc Asynchronous camera frame allocation
JP7014009B2 (ja) * 2018-03-29 2022-02-01 セイコーエプソン株式会社 Operation device, position detection system, and control method of operation device
US10819926B2 (en) * 2018-04-09 2020-10-27 Facebook Technologies, Llc Systems and methods for synchronizing image sensors
US10831286B1 (en) * 2019-01-09 2020-11-10 Facebook, Inc. Object tracking based on detected tracking patterns in a non-visible spectrum range
US11416074B1 (en) * 2019-01-11 2022-08-16 Apple Inc. Electronic devices having flexible light guides
JP7271245B2 (ja) * 2019-03-18 2023-05-11 株式会社ソニー・インタラクティブエンタテインメント Input device
CN110059621A (zh) * 2019-04-17 2019-07-26 青岛海信电器股份有限公司 Method, apparatus and device for controlling the color of a handle light ball
US11671720B2 (en) * 2019-08-08 2023-06-06 Microsoft Technology Licensing, Llc HDR visible light imaging using TOF pixel
CN110572635A (zh) * 2019-08-28 2019-12-13 重庆爱奇艺智能科技有限公司 Method, device and system for tracking and positioning a handheld control device
US11127215B1 (en) * 2019-09-20 2021-09-21 Facebook Technologies, Llc LED emitter timing alignment for image-based peripheral device tracking in artificial reality systems
CN110865706A (zh) * 2019-11-07 2020-03-06 深圳市虚拟现实科技有限公司 Virtual reality handle, system and positioning method
US20210208673A1 (en) * 2020-01-03 2021-07-08 Facebook Technologies, Llc Joint infrared and visible light visual-inertial object tracking
CN111174683B (zh) * 2020-01-07 2021-11-30 青岛小鸟看看科技有限公司 Handle positioning method, head-mounted display device and storage medium
EP3893239B1 (en) * 2020-04-07 2022-06-22 Stryker European Operations Limited Surgical system control based on voice commands
US11128636B1 (en) * 2020-05-13 2021-09-21 Science House LLC Systems, methods, and apparatus for enhanced headsets
US11567528B2 (en) * 2020-05-19 2023-01-31 Microsoft Technology Licensing, Llc LED synchronization for virtual and augmented reality devices
CN111752386A (zh) * 2020-06-05 2020-10-09 深圳市欢创科技有限公司 Spatial positioning method, system and head-mounted device
EP4222951A1 (en) * 2020-09-30 2023-08-09 Snap Inc. Multi-purpose cameras for augmented reality and computer vision applications
WO2022217603A1 (zh) * 2021-04-16 2022-10-20 深圳意科莱照明技术有限公司 LED pearl light, smart pearl light string, smart pearl curtain screen and smart pearl mesh screen


Non-Patent Citations (1)

Title
See also references of EP4142277A4 *

Also Published As

Publication number Publication date
CN112437213A (zh) 2021-03-02
US20220373793A1 (en) 2022-11-24
EP4142277A1 (en) 2023-03-01
US11754835B2 (en) 2023-09-12
EP4142277A4 (en) 2023-12-06


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 21884805; Country of ref document: EP; Kind code of ref document: A1)
ENP Entry into the national phase (Ref document number: 2021884805; Country of ref document: EP; Effective date: 20221124)
NENP Non-entry into the national phase (Ref country code: DE)