CN114610165A - Handheld control device and augmented reality equipment


Info

Publication number
CN114610165A
Authority
CN
China
Prior art keywords
module
control device
control
hand
data
Prior art date
Legal status
Pending
Application number
CN202210295628.3A
Other languages
Chinese (zh)
Inventor
王凯莉
黄乔昆
李紫薇
牟晓倩
任超
金朱荣
王晓安
王国辉
戈云飞
Current Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN202210295628.3A
Publication of CN114610165A
Legal status: Pending

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03: Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/033: Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F 3/038: Control and interface arrangements therefor, e.g. drivers or device-embedded control circuitry

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application provides a handheld control device and an augmented reality device. The handheld control device comprises a handle body and a binding member; the handle body is provided with a grip portion, and the binding member is connected with the handle body. The binding member includes an inflation portion and a deflation portion disposed opposite each other, and an air bag portion located between the inflation portion and the deflation portion; the inflation portion and the deflation portion are disposed on two opposite sides of the grip portion, and the air bag portion is spaced apart from the grip portion. In the handheld control device and augmented reality device provided by the embodiments of the application, disposing the inflation portion and the deflation portion on the two opposite sides of the grip portion connects the binding member with the handle body; spacing the air bag portion apart from the grip portion allows the handle body to be bound or unbound by inflating or deflating the air bag portion, so that functions such as drop prevention and hand-detachment prevention are achieved with good wearing comfort.

Description

Handheld control device and augmented reality equipment
Technical Field
The application relates to the technical field of augmented reality equipment, in particular to a handheld control device and augmented reality equipment.
Background
Extended Reality (XR) devices may be used to provide the user with an altered reality, and generally include Virtual Reality (VR) devices, Mixed Reality (MR) devices, and/or Augmented Reality (AR) devices.
An XR device may include a handheld control device to supplement the user's extended reality experience. A handle may serve as the handheld control device for human-computer interaction with the XR device. How to provide a handle that is convenient to wear and comfortable to use has therefore become a pressing technical problem for XR devices.
Disclosure of Invention
An embodiment of the application provides a handheld control device that can be used with an augmented reality device and comprises a handle body and a binding member; the handle body is provided with a grip portion, and the binding member is connected with the handle body. The binding member includes an inflation portion and a deflation portion disposed opposite each other, and an air bag portion located between the inflation portion and the deflation portion; the deflation portion and the inflation portion are disposed on two opposite sides of the grip portion, and the air bag portion is spaced apart from the grip portion.
In another aspect, an embodiment of the application further provides an augmented reality device comprising a device body and a handheld control device in signal connection with the device body. The handheld control device comprises a handle body and a binding member; the handle body is provided with a grip portion, and the binding member is connected with the handle body. The binding member includes an inflation portion and a deflation portion disposed opposite each other, and an air bag portion located between them; the deflation portion and the inflation portion are disposed on two opposite sides of the grip portion, and the air bag portion is spaced apart from the grip portion.
In the handheld control device and augmented reality device provided by the embodiments of the application, the inflation portion and the deflation portion of the binding member are disposed on two opposite sides of the grip portion of the handle body, connecting the binding member with the handle body. Because the air bag portion of the binding member is spaced apart from the grip portion, the handle body can be bound or unbound by inflating or deflating the air bag portion, achieving functions such as drop prevention and hand-detachment prevention while offering good wearing comfort. In addition, in actual use of the handheld control device, the handle body can be bound or unbound simply by one-handed control of inflating or deflating the air bag portion, improving the user experience.
Drawings
To illustrate the technical solutions in the embodiments of the present application more clearly, the drawings needed in the description of the embodiments are briefly introduced below. The drawings described below show only some embodiments of the present application; those skilled in the art may obtain other drawings from them without creative effort.
FIG. 1 is a block diagram illustrating an example of an augmented reality device according to some embodiments of the present disclosure;
FIG. 2 is a schematic diagram of a hand-held control device according to some embodiments of the present application;
FIG. 3 is a schematic view of the configuration of the binding as it is inflated in some embodiments of the present application;
FIG. 4 is a schematic view of the configuration of the binding of some embodiments of the present application when deflated;
FIG. 5 is a schematic diagram of a hand-held control device according to further embodiments of the present application;
FIG. 6 is a schematic representation of an XR device in accordance with further embodiments of the present application;
FIG. 7 is a schematic diagram of an alternate embodiment of the XR device of the embodiment of FIG. 6;
FIG. 8 is a schematic diagram of an alternative embodiment of the XR device of the embodiment of FIG. 6;
FIG. 9 is a schematic diagram of the structure of the host unit in the embodiment of FIG. 8;
FIG. 10 is a block diagram illustrating an electronic device in some embodiments of the present application.
Detailed Description
The present application will be described in further detail with reference to the following drawings and embodiments. It should be noted that the following embodiments only illustrate the present application and do not limit its scope. Likewise, the following embodiments are only some, not all, of the embodiments of the present application, and all other embodiments obtained by a person of ordinary skill in the art without inventive work fall within the scope of the present application.
Reference herein to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the application. The appearances of the phrase in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. It is explicitly and implicitly understood by one skilled in the art that the embodiments described herein can be combined with other embodiments.
As used herein, an "XR device" may be used to provide a user with an altered reality, and may include VR devices, MR devices, and/or AR devices. The XR device may include a display to provide video, images, and/or other visual stimuli to the user, giving the user a "virtual and/or augmented" reality experience. The XR device may include an audio output device to provide auditory stimuli that further enhance the user's virtual reality experience. The XR device may include a handheld control device to supplement the user's extended reality experience; for example, the handheld control device may be used to virtually animate the user's hand movements, such as moving, grasping, and releasing. It should be understood that the term "Extended Reality (XR)" in this application refers to a scene generated by a computing device that simulates an experience through sensation and perception.
Based on this, the XR device may provide video, audio, images, and/or other stimuli to the user through the display to present an altered reality. As mentioned above, the term "XR device" in this application refers to a device that provides a virtual, mixed, and/or augmented reality experience for a user. The user may experience the XR device by using a wearable device and/or a handheld control device.
The wearable device may be, for example, a smart bracelet, a smart watch, VR glasses, AR glasses, a smart anklet, or a smart belt; no limitation is imposed here, and any device that can be worn on the human body can be understood as a wearable device of this application. For convenience of explanation, the embodiments of the present application take a head-mounted device as an example of the wearable device.
In some embodiments, the XR device may cover the user's eyes and provide visual stimuli through the display, substituting an altered reality (e.g., "virtual reality," "mixed reality," and/or "augmented reality") for actual reality. In some embodiments, the XR device may present an image on a transparent or translucent screen in front of the user's eyes to "augment" reality with additional information such as graphical representations and/or supplemental data. For example, the XR device may overlay weather information, directions, and/or other information on the XR display for the user to view. The handheld control device may be used in conjunction with the XR device and can be a useful way to animate the user's hand movements. It will be understood that the term "animation" in this application refers to dynamic visual media generated from sequential images manipulated to appear in motion.
A handle can serve as the handheld control device for human-computer interaction with the XR device. The applicant found through research that users are generally unable to see the surrounding environment when using an XR device. The basic functions required of the handle are therefore that it does not drop if the hand slips, that it can be retrieved by feel after leaving the hand, or that it does not leave the hand even when the user loosens their grip. Based on this, after further research the applicant provides a novel handheld control device and augmented reality device. The handheld control device can achieve functions such as drop prevention, blind retrieval, and hand-detachment prevention, and offers good wearing comfort.
Referring to fig. 1, fig. 1 is a schematic block diagram of an Extended Reality (XR) apparatus 100 according to some embodiments of the present application, where the XR apparatus 100 generally includes an apparatus body 10 and a handheld control device 20 in signal connection with the apparatus body 10. Wherein the signal connection may be a wired connection, a wireless connection, or a combination thereof. The signal connection may be configured to carry any kind of data, such as image data (e.g., still images and/or full motion video, including 2D and 3D images), audio, multimedia, voice, and/or any other type of data. The signal connection may be, for example, a Universal Serial Bus (USB) connection, a Wi-Fi connection, a bluetooth or Bluetooth Low Energy (BLE) connection, an ethernet connection, a cable connection, a DSL connection, a cellular connection (e.g., 3G, LTE/4G or 5G), etc., or a combination thereof.
The device body 10 may be the aforementioned wearable device; for example, the head-mounted device may be VR glasses, AR glasses, or the like. The device body may be configured to use the signal connection to transfer data to and receive data from an external processing device. In other cases, however, the device body may operate as a stand-alone device, i.e., data processing is performed in the device body itself. The external processing device may be, for example, a game console, a personal computer, a tablet computer, a smartphone, or another type of processing device. Additionally, the external processing device may communicate with one or more other external processing devices via a network, which may be or include, for example, a Local Area Network (LAN), a Wide Area Network (WAN), an intranet, a Metropolitan Area Network (MAN), the global Internet, or a combination thereof.
In one embodiment, the device body 10 may be provided with a main housing, which may include an optical engine assembly, a camera assembly, a mainboard, a speaker assembly, a microphone assembly, and the like. The device body 10 may be fitted with a display assembly, optical devices, sensors, a processor, and so on. The display assembly is designed to overlay an image on the user's view of the real-world environment, for example by projecting light into the user's eyes. The device body 10 may also include an ambient light sensor, as well as electronic circuitry to control at least some of the above components and perform associated data-processing functions. The electronic circuitry may include, for example, one or more processors and one or more memories. Of course, the device body may also be provided with an electrical structure such as a circuit board or mainboard for signal transmission. The circuit board and/or mainboard carries the circuitry of the device body 10 and can provide a series of connectors for the processor, memory, external devices, and so on. The most important component on the circuit board and/or mainboard of the device body 10 is the chipset, which provides a general platform for connecting different devices and controlling communication between them. The chipset may also provide additional functionality, such as an integrated display core, infrared communication, or Bluetooth. The circuit board and/or mainboard is electrically connected to a battery provided in the device body 10 for power.
It should be noted that, unless otherwise expressly specified or limited, the terms "mounted," "connected," and the like are to be construed broadly and can include, for example, fixed, removable, or integral connections; mechanical or electrical connections; and direct connections, connections through intervening media, or internal communication between two elements. The specific meaning of these terms herein can be understood by those skilled in the art according to the specific context.
Referring to fig. 2, fig. 2 is a schematic diagram of a handheld control device 20 according to some embodiments of the present disclosure; the handheld control device 20 may be used in the XR apparatus 100. The handheld control device 20 can be connected with the device body 10 so as to cooperate with the device body 10 in realizing the user's virtual reality experience. In particular, when the user uses the device body 10 for a virtual reality experience such as a game, the user can control a virtual game character by operating a key or the like on the handheld control device 20. The handheld control device 20 can be used while fixed to the device body 10; of course, the handheld control device 20 and the device body 10 may also be two independent components connected by wired or wireless signals. In some embodiments, the handheld control device 20 may be a handle; it may also be another handheld device capable of performing corresponding control or operation functions when held.
It should be noted that all directional indicators (such as upper, lower, left, right, front, rear, horizontal and vertical … …) in the embodiments of the present application are only used to explain the relative position relationship between the components, the movement situation, etc. in a specific posture (as shown in the drawings), and if the specific posture is changed, the directional indicator is changed accordingly.
Specifically, the handheld control device 20 roughly includes: a handle body 210 and a binding 220. As shown in fig. 2, the term "handle body" is understood in the present application as "grip", i.e. a member that can be gripped or held by a user's hand. For example, a user may interact with the handheld control device 20 by grasping and/or holding the handle body (or grip) with a hand. The binding 220 is connected with the handle main body 210 and is configured to bind the handle main body 210 to a human body.
The handle main body 210 generally includes a holding portion 211 and a key portion 212 disposed at one end of the holding portion 211, and the holding portion 211 is configured to be held by a human body, that is, the human body can hold the handle main body 210 through the holding portion 211. The key portion 212 is provided with keys, and a user can perform corresponding operations on the keys to realize interaction with the handheld control device 20.
Of course, in some embodiments, only the grip portion 211 may be disposed on the handle main body 210, and the key buttons may be disposed on the grip portion 211, so that the user can interact with the handheld control device 20 through the key buttons on the grip portion 211. For example, when the user grasps and/or holds the grip portion 211 by hand, the palm of the user's hand may contact the grip portion 211, and the user's fingers may perform corresponding operations on the keys on the grip portion 211.
The binding 220 is connected with the handle main body 210 and serves to bind the handle main body 210 to the human body. The binding 220 is spaced apart from the grip portion 211, cooperating with the grip portion 211 to form a gap through which the palm can pass; as the palm passes through this gap it contacts the grip portion 211, and the palm and fingers together hold the grip portion 211. On the one hand, the binding 220 can fix the handle main body 210 to the human body, so that the handle main body 210 does not drop or slip off the hand after the user releases the grip portion 211; the binding 220 thus keeps the handheld control device 20 from dropping or leaving the hand. On the other hand, the binding 220 allows separation between the handle main body 210 and the human body: by loosening the binding 220, the handle main body 210 can be easily taken off and put back on.
Specifically, when a human body part (e.g., a palm) holds the handle main body 210 by the grip portion 211, the binding 220 is located on the side of that body part facing away from the grip portion 211 and cooperates with the grip portion 211 to bind the handle main body 210. When the body part releases the grip portion 211, the binding 220, by virtue of its connection with the handle main body 210, still cooperates with the grip portion 211 to bind the handle main body 210, preventing it from slipping off the hand and causing unnecessary damage. Moreover, when the grip portion 211 is released, the handle main body 210 can easily be separated from the body part by controlling the tightness of the binding 220. It can be appreciated that a user is generally unable to see the external environment when using the XR device; with the binding 220 and the grip portion 211 cooperating, the relative position between the handle main body 210 and the hand remains essentially unchanged, so that whether or not the body part releases the grip portion 211, the handle main body 210 will not slip off the hand, and the drawback of the user being unable to find the handle main body 210 when it needs to be held again is also avoided.
In one embodiment, the binding 220 generally comprises an inflation portion 221 and a deflation portion 222 disposed opposite each other, and an air bag portion 223 disposed between the inflation portion 221 and the deflation portion 222. The inflation portion 221 and the deflation portion 222 are configured to connect the binding 220 with the handle body 210. Preferably, the inflation portion 221 and the deflation portion 222 are disposed on opposite sides of the grip portion 211, and the air bag portion 223 is spaced apart from the grip portion 211. On the one hand, the air bag portion 223 may be configured to bind the handle main body 210 to the human body; on the other hand, it may be configured to release the handle main body 210.
The air bag portion 223 generally refers to an inflatable container or bladder. For example, the air bag portion 223 may be filled with a fluid so that the fluid is contained within it. The fluid may include, for example, a gas or a liquid (e.g., water or another liquid). In some embodiments, the air bag portion 223 may be filled with air; however, the embodiments of the present application are not limited thereto, and the air bag portion 223 may be inflated with any other gas. In other words, the interior of the air bag portion 223 is hollow.
It will be appreciated that the air bag portion 223 is generally made of a flexible material having a certain elasticity, such as rubber, silicone, resin, etc., and the interior thereof may be filled with a fluid to achieve the elastic function. The air bag portion 223 is stretched or contracted during inflation or deflation, and thus, the binding or unbinding of the handle main body 210 can be achieved. For example, when the air bag portion 223 is inflated, the air bag portion 223 can be more comfortably contacted with the human body based on the characteristics of the flexible material, thereby binding the handle main body 210 to the human body; deflation of the bladder portion 223 may allow for de-tethering of the handle body 210.
In an embodiment, the inflation portion 221 and the deflation portion 222 may also be made of a flexible material having a certain elasticity, such as rubber, silicone, resin, etc., and the inflation portion 221, the deflation portion 222, and the balloon portion 223 may be made by an integral molding process. The inflating portion 221 and the deflating portion 222 are respectively communicated with the bag portion 223 so that the bag portion 223 can be inflated by the inflating portion 221 and deflated by the deflating portion 222. Of course, in other embodiments, the inflation portion 221 and the deflation portion 222 can be made of a hollow tubular structure made of a relatively hard material such as plastic, metal, etc. to achieve communication with the air bag portion 223.
It can be understood that in the handheld control device 20 provided in the embodiments of the present application, the inflation portion 221 and the deflation portion 222 of the binding 220 are provided on opposite sides of the grip portion 211 of the handle body 210, connecting the binding 220 with the handle body 210. Because the air bag portion 223 of the binding 220 is spaced apart from the grip portion 211, the handle body 210 can be bound or unbound by inflating or deflating the air bag portion 223, achieving functions such as drop prevention and hand-detachment prevention with good wearing comfort. In addition, in actual use of the handheld control device 20, the handle body 210 can be bound or unbound simply by one-handed control of inflating or deflating the air bag portion 223, improving the user experience.
Binding the handle main body 210 tightly or loosely by inflating or deflating the air bag portion 223 offers, on the one hand, good comfort when bound; on the other hand, it prevents the handle main body 210 from slipping and dropping when the user's grip loosens. That is, the handle main body 210 does not leave the hand even after the user lets go, achieving both anti-detachment and drop prevention. In addition, the handle main body 210 can be unbound by deflating the air bag, so that the handheld control device 20 can be removed from the body more conveniently.
When the grip portion 211 is held, the air bag portion 223 can be inflated, with inflation stopping once the air bag portion 223 reaches a suitable degree of inflation; at this point the handle main body 210 is bound to the human body, and even when the grip portion 211 is not being held, the handle main body 210 will not fall off or slip away. When the handle main body 210 needs to be unbound, the air bag portion 223 can be deflated to release the binding, allowing the handle main body 210 to be freely removed.
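The "inflate until snug, deflate to release" sequence described above can be sketched as follows. This is a purely illustrative model: the `Bladder` class, the pressure units, and the target threshold are assumptions made for demonstration and are not part of the patent disclosure.

```python
# Illustrative model of binding/unbinding via the air bag portion (223).

class Bladder:
    """Toy model of the air bag portion: pressure rises as air is added."""

    def __init__(self):
        self.pressure = 0.0  # arbitrary units; 0.0 means fully deflated

    def add_air(self, amount):
        self.pressure += amount

    def vent(self):
        # Opening the deflation path empties the bladder completely.
        self.pressure = 0.0


def bind_handle(bladder, target_pressure=5.0, step=1.0):
    """Inflate step by step and stop once a comfortable target is reached."""
    while bladder.pressure < target_pressure:
        bladder.add_air(step)
    return bladder.pressure


def release_handle(bladder):
    """Deflate fully so the handle body can be taken off freely."""
    bladder.vent()
    return bladder.pressure
```

Under these assumptions, `bind_handle(Bladder())` inflates to the target pressure of 5.0 and stops, and `release_handle` then returns the pressure to 0.0, mirroring the bind/unbind cycle described above.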
In summary, in the handheld control device and augmented reality device provided by the embodiments of the application, the inflation portion and the deflation portion of the binding are disposed on two opposite sides of the grip portion of the handle body, connecting the binding with the handle body. Because the air bag portion of the binding is spaced apart from the grip portion, the handle body can be bound or unbound by inflating or deflating the air bag portion, achieving functions such as drop prevention and hand-detachment prevention while offering good wearing comfort. Moreover, in actual use the handle body can be bound or unbound simply by one-handed control of inflating or deflating the air bag portion, improving the user experience.
Referring to fig. 3 and 4, fig. 3 is a schematic illustration of the configuration of the binding during inflation according to some embodiments of the present application, and fig. 4 is a schematic illustration of the configuration of the binding during deflation according to some embodiments of the present application.
During inflation of the harness 220, the spacing between the balloon portion 223 and the grip portion 211 of the harness 220 gradually decreases; during deflation of the tether 220, the spacing between the balloon portion 223 and the gripping portion 211 of the tether 220 gradually increases. The hollow space inside the airbag section 223 forms the air accommodating space of the airbag. As shown in fig. 3, when the binding 220 is inflated, the air bag portion 223 is expanded, so that the distance between the air bag portion 223 and the grip portion 211 is reduced, thereby achieving the binding effect on the human body. As shown in fig. 4, when the binding 220 is deflated, the air bag portion 223 contracts, so that the distance between the air bag portion 223 and the grip portion 211 increases, whereby the hand-held control device 20 can be separated from the human body, i.e., the handle main body 210 can be unbound.
The balloon portion 223 generally includes an inflation end 223a and a deflation end 223b disposed opposite to each other, and a balloon body 223c disposed between the inflation end 223a and the deflation end 223 b. Wherein the airbag body 223c is configured to enclose the air containing space, and the air containing space is communicated with the inflating end 223a and the deflating end 223b, respectively. Further, the inflation end 223a communicates with the inflation portion 221 to effect inflation of the bag portion 223 by the inflation portion 221. The deflation end 223b communicates with the deflation part 222 to allow deflation of the balloon part 223 by means of the deflation part 222.
It is understood that the air containing space defined by the airbag body 223c can be understood as an air passage, and the inflation end 223a and the deflation end 223b can be understood as opposite ends of the air passage.
In one embodiment, the handle body 210 may have the air pump 30 disposed thereon, and the air pump 30 may be disposed inside or on the surface of the handle body 210. Preferably, the air pump 30 is provided inside the handle main body 210 and is connected to the inflation portion 221 so as to be configured to inflate the air bag portion 223. The handle body 210 may also have a deflation valve 40 disposed thereon, and the deflation valve 40 may be disposed within or on the surface of the handle body 210. Preferably, the deflation valve 40 is provided inside the handle body 210 and is connected to the deflation portion 222 so as to be configured to deflate the air bag portion 223.
Further, the handle body 210 may further include a control board 50, and the control board 50 may be disposed inside or on the surface of the handle body 210. Preferably, the control board 50 is provided inside the handle main body 210 and is connected to the air pump 30 and the deflation valve 40, respectively. That is, the control board 50 is configured to control the air pump 30 to inflate the air bag portion 223 in response to a relevant operation or action of the user, and to control the deflation valve 40 to deflate the air bag portion 223. It can be understood that when a human body part contacts or holds the holding portion 211, the control board 50 can control the air pump 30 to inflate the air bag portion 223 in response to that action, so that the handle main body 210 is bound to the human body and prevented from slipping or falling off. When the binding of the handle main body 210 needs to be released so that the handle main body 210 can be separated from the human body, the control board 50 can control the deflation valve 40 to deflate the air bag portion 223 in response to the relevant operation instruction, so that the handle main body 210 can be separated from the human body more conveniently.
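The inflate/deflate control flow described above can be sketched as a small state model. This is purely illustrative; the class and method names (ControlBoard, on_grip_detected, on_release_request) are assumptions made for the sketch, not part of this application:

```python
class ControlBoard:
    """Illustrative model of control board 50: it drives the air pump 30
    (inflation) or the deflation valve 40 (deflation) in response to
    user actions."""

    def __init__(self):
        self.pump_on = False     # state of air pump 30
        self.valve_open = False  # state of deflation valve 40

    def on_grip_detected(self):
        # A body part contacts/holds grip portion 211: close the valve
        # and start the pump so the air bag portion 223 inflates and
        # binds the handle main body 210 to the hand.
        self.valve_open = False
        self.pump_on = True

    def on_release_request(self):
        # The user asks to unbind: stop the pump and open the valve so
        # the air bag portion 223 deflates.
        self.pump_on = False
        self.valve_open = True


board = ControlBoard()
board.on_grip_detected()   # grip event: pump runs, valve stays closed
```

A grip event closes the valve and starts the pump; a release request does the reverse, which mirrors the two responses the paragraph assigns to the control board.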
The handle body 210 may further include a battery 60 disposed therein, and the battery 60 is electrically connected to the control board 50 to power the corresponding functional operations of the control board 50. Of course, the handle body 210 may also be provided with peripheral interfaces (not shown in the figure), which may be interfaces for connecting with an external power source or signal interfaces for connecting with external electronic devices; this is not particularly limited. For example, a peripheral interface may be a Type-C interface, a B-5Pin interface, a B-4Pin interface, a B-8Pin-2 × 4 interface, a Micro USB interface, or the like. The peripheral interface may be connected to an external power source or an external electronic device by wire.
In some embodiments, the handheld control device 20 may also be provided with a sensor. As used herein, a "sensor" may refer to a device for detecting events and/or changes in its environment and transmitting the detected events and/or changes for processing and/or analysis. For example, the sensors may detect events/changes associated with a user of the handheld control device 20. As described further herein, the sensors may be touch sensors, pressure sensors, and/or proximity sensors, among others.
Wherein the sensors may include a first sensor 201 provided on the binding 220, the first sensor 201 being connected with the control board 50. The first sensor 201 is used for detecting the pressure of the air bag and sending the acquired pressure signal to the control board 50. In other words, the first sensor 201 may be provided on the air bag portion 223 of the binding 220 for detecting a pressure signal of the air bag portion 223, and the control board 50 may control the air pump 30 to inflate the air bag portion 223 or control the deflation valve 40 to deflate the air bag portion 223 based on the received pressure signal. For example, the first sensor 201 may be a pressure sensor. It will be understood that "pressure sensor" as used herein refers to a device that detects human-related events and/or changes by detecting a force applied to the air bag. For example, when the air bag is inflated or deflated, the force exerted by the gas on the air bag changes, and the pressure sensor can detect this force. As another example, when the binding 220 is in contact with a human body, the force between the binding 220 and the human body changes as the air bag is inflated or deflated, and the pressure sensor can detect the force between the binding 220 and the human body.
The sensors may also include a second sensor 202 disposed on the handle body 210, the second sensor 202 being connected to the control board 50. Preferably, the second sensor 202 is disposed on the grip portion 211 for detecting the gripping state of the grip portion 211 and transmitting the acquired gripping state information to the control board 50. The control board 50 may control the air pump 30 to inflate the air bag portion 223 or control the deflation valve 40 to deflate the air bag portion 223 according to the acquired gripping state information. For example, the second sensor 202 may be a touch sensor and/or a proximity sensor.
It will be understood that "touch sensor" as used herein refers to a device that detects human-related events and/or changes based on capacitive sensing, by detecting the interaction of an electrical conductor and/or an object having a dielectric different from air. In some examples, the touch sensor may be a metal electrode; in other examples, a metal electrode may be included as part of (e.g., on) the touch sensor, or the touch sensor (e.g., the metal electrode) may be a stand-alone sensor. The touch sensor may detect when the user is holding the handheld control device 20 based on the user's skin contacting the touch sensor.
As the "proximity sensor" used herein, a sensor for detecting without touching a detection object can convert movement information and presence information of the detection object into an electrical signal. The proximity sensor is a device capable of sensing the proximity of an object, that is, the proximity sensor can determine whether a human body is about to perform a gripping action based on whether a human body part is close to the handle body 210 or the gripping part 211, and output a corresponding detection signal to the control board 50, and the control board 50 can determine whether to inflate or deflate the airbag based on the received detection signal.
It is to be understood that the terms "first", "second" and "third" in the embodiments of the present application are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implying any number of technical features indicated. Thus, a feature defined as "first," "second," or "third" may explicitly or implicitly include at least one of the feature.
In one embodiment, when the handle body 210 is held by a human body part, a pressure sensor may be disposed on the side of the holding portion 211 close to the human body part, that is, on the side of the holding portion 211 close to the air bag portion 223, so as to obtain the force exerted by the human body part on the holding portion 211; the control board 50 can then use that force to determine whether to inflate or deflate the air bag. Of course, the pressure sensor may instead be disposed on the side of the air bag portion 223 close to the grip portion 211, where it can detect not only the force applied by the binding 220 to the human body but also the force applied by the internal gas to the air bag.
In yet another embodiment, a proximity sensor may be disposed on the side of the holding portion 211 close to the air bag portion 223 to obtain a detection signal when a human body part approaches, and to send the detection signal to the control board 50; the control board 50 may then determine whether to inflate or deflate the air bag accordingly.
It will be appreciated that one or more sensors may be provided on the hand-held control device 20, and they may be of one or more types. Each may be connected to the control board 50, so that the control board 50 can receive the corresponding sensor signals and control whether to inflate or deflate the air bag based on the acquired sensor signals.
Of course, in other embodiments, the button part 212 of the handle body 210 may be provided with a control button connected to the control board 50, and the control board 50 may control the air pump 30 to inflate the air bag portion 223 or control the deflation valve 40 to deflate the air bag portion 223 according to a button signal of the control button. In other words, unlike the control manner in which a sensor automatically senses a signal to make the control board 50 perform the inflation or deflation operation, this embodiment achieves inflation or deflation by manually operating the control button, thereby improving the operational flexibility of the handheld control device 20.
The augmented reality device provided by the embodiments of the application needs the handheld control device to be used together with the device main body to improve the user's virtual reality experience; the use process of the handheld control device is approximately as follows. The user's hand passes between the handle main body and the binding and holds the handle main body by the holding portion. At this time, the control board may control the air pump to inflate the air bag based on a sensing signal of a sensor (e.g., a proximity sensor, a pressure sensor, and/or a touch sensor). When the pressure sensor senses that the pressure between the hand and the binding reaches a preset value, it sends a sensing signal to the control board, and the control board controls the air pump to stop inflating the air bag. It can be understood that the preset pressure value is one at which the binding binds the handle main body to the hand comfortably: neither so tight as to be uncomfortable to wear nor so loose that the handle main body slips.
It can be understood that inflation can be stopped as described above: the control board stops the air pump based on the sensing signal of the sensor, i.e., the sensor monitors the inflation pressure of the air bag in real time and the air pump stops automatically once a pressure comfortable for the human body is reached. Of course, in other embodiments, a maximum pressure of the air pump may be preset; when the pressure of the air pump reaches the preset maximum pressure, the air bag is inflated for a preset time and inflation then stops automatically. In addition, in other embodiments, inflation can be stopped manually, that is, the air pump or the deflation valve can be controlled by operating the control button, so as to start inflation, stop inflation, or deflate. Notably, all of the inflation, stop-inflation, and deflation operations in the above embodiments can be completed with one hand; that is, in actual use the user can bind or unbind the handle main body simply by inflating or deflating the air bag portion under one-handed control, which improves the user experience.
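The stop-inflation strategies above (stop once the sensed pressure reaches the preset comfort value, with a limit standing in for the preset-time fallback) can be sketched as follows. The function name, pressure units, and the step limit are assumptions made for illustration:

```python
def inflate(read_pressure, target_pressure, max_steps=1000):
    """Illustrative stop-inflation logic for the control board: keep
    the pump running until the sensed strap pressure reaches the preset
    comfort value, then stop.  The step limit stands in for the
    fallback of stopping after a preset inflation time."""
    for step in range(max_steps):
        if read_pressure(step) >= target_pressure:
            return ("stopped_at_target", step)
    return ("stopped_on_timeout", max_steps)


# Simulated sensor: pressure rises 0.5 units per step toward target 10.
result = inflate(lambda step: 0.5 * step, target_pressure=10)
```

With the simulated ramp above, the loop stops as soon as the sensed value reaches the target, mirroring the sensor-monitored stop condition; a sensor that never reports the target triggers the timeout path instead.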
When the handheld control device does not need to be used with the equipment main body, that is, when the handheld control device needs to be separated from the hand, the deflation valve can be opened through a key or a switch arranged on the handheld control device (e.g., on the handle main body) to deflate the air bag, so that the handheld control device can be separated from the hand. Of course, the control board can also control the deflation valve to deflate the air bag in response to the use state of the handheld control device. For example, when use of the XR device is finished, an instruction to deactivate the handheld control device may be given on the device body; the control board then controls the deflation valve to operate and deflate the air bag, so that the handheld control device can be detached from the hand.
Referring to fig. 5, fig. 5 is a schematic diagram of a handheld control device 20 according to another embodiment of the present application, where the handheld control device 20 may be used in an XR apparatus 100. Wherein, the handheld control device 20 can be used to connect with the device body 10 of the XR device 100, so as to cooperate with the device body 10 to realize the virtual reality experience of the user. The hand-held control device 20 generally comprises: a handle body 210 and a binding 220. The handle main body 210 is configured to be in signal connection with the apparatus main body 10. The binding 220 is connected with the handle main body 210 and is configured to bind the handle main body 210 to a human body.
The embodiment of the present application differs from the foregoing embodiments in the construction of the binding 220. Specifically, the binding 220 generally includes an inflation portion 221 and a deflation portion 222 disposed opposite each other, and an air bag portion 223 disposed between the inflation portion 221 and the deflation portion 222; the air bag portion 223 in turn generally includes an inflation end 223a and a deflation end 223b disposed opposite each other, and an air bag body 223c disposed between the inflation end 223a and the deflation end 223b. That is, the two ends of the air bag body 223c are respectively connected with the inflation end 223a and the deflation end 223b.
The air bag body 223c may be an arc-shaped hollow structure having a uniform cross section. As shown in fig. 3, when the binding 220 is inflated, the air bag body 223c expands and the spacing between the air bag body 223c and the grip portion gradually decreases; as shown in fig. 4, when the binding 220 is deflated, the air bag body 223c contracts and the spacing between the air bag body 223c and the grip portion gradually increases.
In one embodiment, the air bag body 223c includes a plurality of air bag units 223d distributed in an array, with two adjacent air bag units 223d communicating with each other. Specifically, the air bag units 223d may be distributed at intervals in one or more rows, and two adjacent air bag units 223d may communicate through small pipes or channels, so that the air bag body 223c conforms more comfortably to the surface of the human body, improving the comfort with which the binding 220 binds the handle main body 210 to the human body.
It is understood that, in the present embodiment, reference may be made to the detailed description in the foregoing embodiments for technical features of the binding 220 that are not described in detail, and thus, no further description is provided in the present embodiment.
Referring now to fig. 6, fig. 6 schematically illustrates an exemplary XR device 100 according to further embodiments of the present disclosure. The XR device 100 may be, for example, VR glasses, AR glasses, MR glasses, or the like. The XR device 100 may include: a data acquisition module 71, a data output module 72, a serial interface 73, and an integrated circuit module 74.
The serial interface 73 may be, for example, a USB interface satisfying the USB 2.0, USB 3.0, or USB 3.1 specification, and may include a Micro USB interface or a USB Type-C interface. Further, the serial interface 73 may also be the signal interface in the handle body. Alternatively, the serial interface 73 may be any other type of serial interface capable of serial data transmission.
The integrated circuit module 74 may include: the data conversion module 741 and the interface module 742, the data conversion module 741 is connected to the data acquisition module 71 and the data output module 72 through the interface module 742, respectively. The integrated circuit module 74 may be housed within the device body in the previously described embodiments.
The data conversion module 741 is configured to perform serialization conversion on the data acquired from the data acquisition module 71 through the interface module 742, and output the converted serial data through the serial interface 73, so as to process the converted serial data, for example, transmit the converted serial data to an external electronic device for processing.
The data conversion module 741 is further configured to convert serial data received through the serial interface 73 into interface data matching the interface protocol of the interface module 742, and to transmit the converted interface data to the data output module 72 through the interface module 742, so that the converted interface data is output to the user through the data output module 72.
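One minimal way to picture the serialization performed by data conversion module 741 is a tagged-frame scheme: each interface's payload is prefixed with a channel id and a length, the frames are concatenated into one serial stream, and the receiving side splits the stream back into frames. The frame layout here is an assumption made for illustration; the application does not specify the serial framing:

```python
import struct

def serialize(frames):
    """Pack (channel_id, payload) pairs into one serial byte stream.
    Frame layout (assumed): 1-byte channel id, 2-byte big-endian
    payload length, then the payload bytes."""
    out = bytearray()
    for channel_id, payload in frames:
        out += struct.pack(">BH", channel_id, len(payload)) + payload
    return bytes(out)

def deserialize(stream):
    """Inverse of serialize(): split a serial stream back into
    (channel_id, payload) frames for the interface modules."""
    frames, offset = [], 0
    while offset < len(stream):
        channel_id, length = struct.unpack_from(">BH", stream, offset)
        offset += 3
        frames.append((channel_id, stream[offset:offset + length]))
        offset += length
    return frames
```

A round trip through both functions recovers the original per-interface payloads, which is the property the conversion module needs in both transfer directions.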
The Integrated Circuit module 74 may be implemented as an ASIC (Application Specific Integrated Circuit) data integration processing chip, for example, or may also be implemented as an FPGA (Field Programmable Gate Array).
According to the XR equipment provided by the embodiment of the application, the integrated circuit chip is used in the XR equipment, the data is acquired through the interface module in the integrated circuit chip, and the acquired data and the data received from the handheld control device are converted in a centralized manner through the data conversion module, so that the space and the volume of the XR equipment can be greatly reduced on one hand, and the XR equipment is light and thin; on the other hand, the power consumption of the chip can be reduced, the heating of XR equipment is reduced, and the user experience is improved; in addition, centralized conversion may also reduce overall data processing delays for the XR device.
Referring to fig. 7, fig. 7 is a schematic diagram of an XR apparatus 100 of fig. 6 in another embodiment. The integrated circuit module 74 in the XR device 100 may include a plurality of Interface modules 742, for example, the plurality of Interface modules 742 may be an I2C Interface module, an SPI Interface module, an I2S Interface module, a SLIMBus Interface module, and an MIPI (Mobile Industry Processor Interface) Interface module, respectively.
The I2C interface module communicates with connected modules using an I2C bus. The I2C bus is a simple, bi-directional, two-wire synchronous serial bus; it requires only two wires to transfer information between the devices connected to it. The master device initiates data transfer on the bus and generates the clock, and any device it addresses is at that moment considered a slave device. If the master device is to send data to a slave device, the master first addresses the slave, then actively sends the data, and finally terminates the transfer; if the master is to receive data from a slave, the slave is likewise first addressed by the master. The master device is responsible for generating the timing clock and terminating the data transfer. Generally, I2C is a control interface for transmitting control signaling.
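The master-addresses-slave pattern described above can be modeled in a few lines. This toy model only captures the addressing and read/write roles; it omits start/stop conditions, ACK bits, and clock generation, and all names are assumptions for the sketch:

```python
class I2CBus:
    """Toy model of I2C addressing: the master addresses a slave by its
    bus address, then writes to or reads from one of its registers."""

    def __init__(self):
        self.slaves = {}  # bus address -> register file (dict)

    def attach(self, address, registers):
        self.slaves[address] = registers

    def master_write(self, address, register, value):
        # The master addresses the slave first, then actively sends data.
        self.slaves[address][register] = value

    def master_read(self, address, register):
        # For a read, the slave is likewise addressed by the master first.
        return self.slaves[address][register]


bus = I2CBus()
bus.attach(0x3C, {0x00: 0})           # one slave at address 0x3C
bus.master_write(0x3C, 0x00, 0xAF)    # master-initiated write
```

Both transfer directions go through the master, matching the paragraph's point that the master initiates, clocks, and terminates every transaction.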
The SPI interface module communicates with connected modules using an SPI bus. The SPI bus is a high-speed, full-duplex synchronous communication bus. The SPI communication principle is simple: it works in a master-slave mode, usually with one master and one or more slaves, and requires four lines, used respectively for master data input (MISO), master data output (MOSI), the clock signal (SCLK), and an enable (chip-select) signal output by the master. Usually, the SPI interface is also a control interface for transmitting control signaling.
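The full-duplex property of SPI (data moves in both directions on the same clock) can be illustrated with a toy exchange function. Real SPI shifts bit by bit on MOSI/MISO; this sketch abstracts the shift register to whole buffers:

```python
def spi_transfer(master_out, slave_out):
    """Toy model of an SPI transaction: with every clock pulse one unit
    of data is shifted in each direction simultaneously, so master and
    slave exchange equal-length buffers in a single transaction."""
    assert len(master_out) == len(slave_out), "SPI exchange is symmetric"
    master_in = list(slave_out)  # what the master clocks in on MISO
    slave_in = list(master_out)  # what the slave clocks in on MOSI
    return master_in, slave_in
```

The symmetry (one byte out implies one byte in) is the defining difference from half-duplex buses such as I2C.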
The I2S interface module communicates with connected modules using an I2S bus. The I2S bus is a bus standard established for audio data transmission between digital audio devices (e.g., CD players, digital sound processors, digital television sound systems). It transmits the clock and data signals on independent wires; separating data from the clock avoids distortion caused by timing differences and saves users the cost of professional anti-jitter equipment, so it is widely used in various multimedia systems. A standard I2S bus consists of three serial conductors: one is a time-division-multiplexed (TDM) data line, one is a word select line, and one is a clock line.
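The role of the word select line can be illustrated with a toy framing function: word select (WS) alternates between low (left channel) and high (right channel), and the data line carries the sample of the currently selected channel. Real I2S shifts individual bits per clock; this sketch abstracts each sample word to one element:

```python
def i2s_frames(left, right):
    """Toy model of I2S framing: emit (word_select, sample) pairs, with
    WS = 0 selecting the left channel and WS = 1 the right channel, so
    the two channels are time-division multiplexed on one data line."""
    frames = []
    for l_sample, r_sample in zip(left, right):
        frames.append((0, l_sample))  # WS low  -> left channel sample
        frames.append((1, r_sample))  # WS high -> right channel sample
    return frames
```

The alternating WS value is what lets a receiver demultiplex the single TDM data line back into stereo channels.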
The SLIMBus interface module communicates with connected modules using a SLIMBus bus. The SLIMBus bus is an audio interface specified by the MIPI alliance for connecting a baseband/application processor and an audio chip, and is typically used for transferring audio data. Each end of a SLIMBus link consists of an interface device and one or more functional devices, connected through one or more ports; a port may be input-only, output-only, or bidirectional. The SLIMBus bus supports dynamic stop and restart and supports all sampling frequencies.
The MIPI interface module communicates with connected modules using the MIPI interface specifications. MIPI is an open standard and specification established by the MIPI alliance for mobile application processors. Its purpose is to standardize interfaces inside the mobile phone, such as the camera, display screen, and radio frequency/baseband interfaces, thereby reducing the complexity of mobile phone design and increasing design flexibility. The MIPI multimedia specification is broadly divided into three layers: the application layer, the protocol layer, and the physical layer. It is mainly applied to interfaces of devices such as cameras and displays, and can include the camera interface CSI (Camera Serial Interface), the display interface DSI (Display Serial Interface), and the like.
As shown in fig. 7, XR device 100 may include a plurality of data acquisition modules 71, for example, the plurality of data acquisition modules 71 may be: an audio data acquisition module, a video data acquisition module (the camera assembly in the foregoing embodiment), an eye tracking module, and a sensing data acquisition module.
The audio data acquisition module may include, for example, a microphone and an audio Codec (Codec). The audio codec audio-encodes the data collected by the microphone.
The video data acquisition module may include, for example, a camera lens such as the lens of an ordinary camera or the IR (Infrared) lens of an IR camera.
Eye tracking is a scientific application technology. When a person's eyes look in different directions, the eyes change slightly, and these changes produce extractable features that a computer can capture by imaging or scanning. The changes of the eyes can thus be tracked in real time, so that the user's state and needs can be predicted and responded to, achieving control of a device with the eyes; for example, a user can turn pages without touching the screen. In principle, eye tracking mainly studies the acquisition, modeling, and simulation of eyeball movement information, and has wide application. Besides a dedicated eye tracker, the equipment for acquiring eye movement information can also be an image acquisition device, or even the camera on an ordinary computer or mobile phone, which can likewise realize eye tracking with suitable software support.
The eye tracking module may include an eye tracker, an image capture device, etc., as described above.
The sensing data acquisition module may include, for example: proximity sensors (Proximity Sensor), IMUs (Inertial Measurement Units), visible light sensors (Ambient Light Sensor), and the like.
Among them, a proximity sensor (for example, the distance sensor provided on the first FPC 523) is the generic name for sensors intended to detect a detection object without touching it, in contrast to contact detection methods such as limit switches; it converts the movement and presence information of the detection object into an electrical signal. An inductive proximity sensor works by detecting the magnetic loss caused by eddy currents induced on the surface of a conductor by an external magnetic field: an alternating magnetic field is generated in the detection coil, and the change in impedance caused by the eddy currents generated in the metal body of the detected object is detected. Variants include aluminum detection sensors that detect a frequency-phase component, all-metal sensors that detect only the impedance-change component with a working coil, and the like.
An IMU is a device for measuring the three-axis attitude angles (or angular rates) and acceleration of an object. Generally, an IMU contains three single-axis accelerometers and three single-axis gyroscopes: the accelerometers detect the acceleration of the object along the three independent axes of the carrier coordinate system, while the gyroscopes detect the angular velocity of the carrier relative to the navigation coordinate system. By measuring the angular velocity and acceleration of the object in three-dimensional space, the attitude of the object can then be solved.
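As one illustrative way to "solve the attitude" from gyroscope and accelerometer readings, a complementary filter blends the integrated gyro rate (smooth but prone to drift) with the accelerometer-derived angle (noisy but drift-free). The filter and its coefficient are a common technique chosen here for illustration, not something specified by this application:

```python
def complementary_filter(pitch, gyro_rate, accel_pitch, dt, alpha=0.98):
    """One illustrative attitude update for a single axis:
    - pitch:       previous estimate of the angle (degrees)
    - gyro_rate:   gyroscope angular rate (degrees/second)
    - accel_pitch: angle derived from the accelerometer (degrees)
    - dt:          sampling interval (seconds)
    The gyro term tracks fast motion; the small accelerometer weight
    (1 - alpha) pulls the estimate back and cancels gyro drift."""
    return alpha * (pitch + gyro_rate * dt) + (1 - alpha) * accel_pitch
```

Running this update once per IMU sample, per axis, yields a usable attitude angle; a full solution would apply the same idea to all three axes (often with a quaternion-based filter instead).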
A visible light sensor is a device that takes visible light as its detection object and converts the sensed measurand into a usable output signal.
Referring to fig. 7, the audio data collection module 71 can be connected to the data conversion module 741 through the SLIMBus interface module 742 and the SPI interface module 742, for example. Control signals can be transmitted between the audio data acquisition module 71 and the SPI interface module 742, and audio data can be transmitted between the audio data acquisition module 71 and the SLIMBus interface module 742.
The video data collection module 71 may be connected to the data conversion module 741 through the MIPI interface module 742 and the I2C interface module 742, for example. The video data collection module 71 and the MIPI interface module 742 can transmit video data therebetween, and the video data collection module 71 and the I2C interface module 742 can transmit control signals therebetween.
The eye tracking module 71 may be connected to the data conversion module 741 through the MIPI interface module 742 and the I2C interface module 742, for example. The eye tracking module 71 and the MIPI interface module 742 may transmit eye tracking data therebetween, and the eye tracking module 71 and the I2C interface module 742 may transmit control signals therebetween.
The sensing data collection module 71 may be connected to the data conversion module 741 via the I2C interface module 742, for example. The sensing data acquisition module 71 and the I2C interface module 742 may transmit sensing data and control signals.
With continued reference to fig. 7, XR device 100 may also include a plurality of data output modules 72, for example. The plurality of data output modules 72 may include, for example, a display module 72 and an audio data output module 72. The display module 72 may be, for example, an optical mechanical component in the foregoing embodiments.
The audio data output module 72 may include, for example, a speaker and/or a headphone interface, and outputs audio data through an external headphone.
The display module 72 may be connected to the data conversion module 741 through the MIPI interface module 742 and the I2C interface module 742, for example. The display module 72 and the MIPI interface module 742 can transmit the video data to be displayed therebetween, and the display module 72 and the I2C interface module 742 can transmit control signals therebetween.
The audio data output module 72 can be connected to the data conversion module 741 via the I2S interface module 742 and the I2C interface module 742, for example. The audio data output module 72 and the I2S interface module 742 may transmit audio data to be output therebetween, and the audio data output module 72 and the I2C interface module 742 may transmit control signals therebetween.
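The data/control interface assignments described in the preceding paragraphs can be summarized as a lookup table. The module keys are shorthand names introduced for this sketch, not identifiers from the application:

```python
# Interface routing of integrated circuit module 74, as described above.
# Values are (data interface, control interface) pairs; for the sensing
# module both roles run over I2C.
ROUTING = {
    "audio_in":  ("SLIMBus", "SPI"),  # audio data acquisition module
    "video_in":  ("MIPI",    "I2C"),  # video data acquisition module
    "eye_track": ("MIPI",    "I2C"),  # eye tracking module
    "sensing":   ("I2C",     "I2C"),  # sensing data acquisition module
    "display":   ("MIPI",    "I2C"),  # display module
    "audio_out": ("I2S",     "I2C"),  # audio data output module
}

def interfaces_for(module):
    """Return the (data_interface, control_interface) pair a module
    uses to reach data conversion module 741."""
    return ROUTING[module]
```

Laid out this way, the pattern is visible at a glance: bulk data rides the high-bandwidth buses (MIPI, SLIMBus, I2S) while control signaling is concentrated on I2C and SPI.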
The integrated circuit module 74 may further include a clock module 743 connected to the data conversion module 741 and each interface module 742, respectively, for outputting a clock signal to each module.
In some embodiments, the integrated circuit module 74 may further include: a data compression module 744 and a data decompression module 745.
The data compression module 744 and the data decompression module 745 are respectively connected between the data conversion module 741 and the serial interface 73.
The data compression module 744 is configured to compress serial data to be output before the data conversion module 741 outputs the converted serial data through the serial interface 73, and output the compressed serial data through the serial interface 73.
The data decompression module 745 is configured to decompress the serial data received through the serial interface 73 before it reaches the data conversion module 741, and to transmit the decompressed serial data to the data conversion module 741 for conversion.
Compressing the data to be transmitted saves transmission bandwidth and raises the effective transmission rate, which further ensures the real-time performance of the data and improves the user experience. It should be noted, however, that the present application does not limit the data compression/decompression algorithm; a specific algorithm may be selected as required in practical applications.
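Since the application leaves the algorithm open, a sketch of data compression module 744 and data decompression module 745 using zlib (one possible choice, not one mandated by the application) looks like this:

```python
import zlib

def compress_for_serial(data):
    """Sketch of data compression module 744: compress serial data
    before it leaves through serial interface 73 (zlib chosen purely
    for illustration)."""
    return zlib.compress(data)

def decompress_from_serial(data):
    """Sketch of data decompression module 745: restore serial data
    received through serial interface 73 before it is handed to data
    conversion module 741."""
    return zlib.decompress(data)
```

For repetitive streams such as periodic sensor frames, the compressed payload is substantially smaller than the original, which is the bandwidth saving the paragraph describes; incompressible data (e.g., already-encoded video) would see little benefit.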
In some embodiments, the XR device 100 may further include a power management module 75 connected to the serial interface 73 for receiving, through the serial interface 73, power supplied by a power supply device connected to the serial interface 73, so as to power the XR apparatus 100.
Referring to fig. 8, fig. 8 is a schematic diagram of another embodiment of the XR apparatus 100 of the embodiment of fig. 6. The XR device 100 may further include a host unit 76. The host unit 76 may include: a processing module 761, a serial interface 762, and an integrated circuit module 763.
The processing module 761 is connected to the integrated circuit module 763. The processing module 761 may be, for example, an Application Processor (AP) for processing the received data and returning the processed data (video data and/or audio data) to the integrated circuit module 74 through the integrated circuit module 763 for output.
Corresponding to the serial interface 73, the serial interface 762 may also be a USB interface satisfying the USB 2.0, USB 3.0, or USB 3.1 specification, and may include a Micro USB interface or a USB Type-C interface. The serial interface 762 may also be any other type of serial interface capable of serial data transmission. A cable may be connected between the serial interface 762 and the serial interface 73.
The integrated circuit module 763 may include: a data conversion module 7631 and an interface module 7632. The data conversion module 7631 is connected to the processing module 761 via the interface module 7632. The data conversion module 7631 is configured to convert serial data received through the serial interface 762, to convert the received serial data into interface data matching with an interface protocol of the interface module 7632, and to transmit the converted interface data to the processing module 761 through the interface module 7632.
The data conversion module 7631 is also configured to serialize the processed data (audio data and/or video data) received from the processing module 761 through the interface module 7632, and to output the resulting serial data to the serial interface 73 through the serial interface 762.
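The description does not disclose a concrete wire format for the serialization performed by the data conversion module 7631. Purely as an illustrative sketch, the conversion between typed interface data and a serial byte stream could look as follows; the channel tags and the length-prefixed framing are hypothetical, not part of the disclosure.

```python
import struct

# Hypothetical channel tags for the data types carried over the serial link.
CH_AUDIO, CH_VIDEO, CH_SENSOR = 0x01, 0x02, 0x03

def to_serial(channel: int, payload: bytes) -> bytes:
    """Serialize one interface packet: 1-byte channel tag + 4-byte big-endian
    length + payload (stand-in for the data conversion module's output path)."""
    return struct.pack(">BI", channel, len(payload)) + payload

def from_serial(stream: bytes) -> list:
    """Parse a concatenated serial stream back into (channel, payload) packets
    (stand-in for the data conversion module's input path)."""
    packets, offset = [], 0
    while offset < len(stream):
        channel, length = struct.unpack_from(">BI", stream, offset)
        offset += 5  # header size: 1 + 4 bytes
        packets.append((channel, stream[offset:offset + length]))
        offset += length
    return packets
```

A round trip (`from_serial(to_serial(...))`) recovers the original packets; a real implementation would additionally need error detection and flow control on the serial link.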
It will be appreciated by those skilled in the art that the host unit 76 may be, for example, a dedicated device associated with the XR device 100, or it may be an electronic device (e.g., a smartphone, a tablet computer, etc.) configured with the integrated circuit module 763 described above. The processor (e.g., a CPU or AP) in the electronic device may serve as the processing module 761, and by installing a corresponding application program in the electronic device, the processor can perform corresponding processing on the data received by the integrated circuit module 763.
Referring to fig. 9, fig. 9 is a schematic structural diagram of the host unit 76 in the embodiment of fig. 8. The integrated circuit module 763 in the host unit 76 may include a plurality of interface modules 7632, for example an I2C interface module, an SPI interface module, an I2S interface module, a SLIMBus interface module, and a MIPI interface module.
The data conversion module 7631 may transmit the converted audio data to the processing module 761 through the SLIMBus interface module 7632 and the SPI interface module 7632; the data conversion module 7631 may transmit the converted video data to the processing module 761 through the MIPI interface module 7632 and the I2C interface module 7632; the data conversion module 7631 may transmit the converted eye tracking data to the processing module 761 through the MIPI interface module 7632 and the I2C interface module 7632; the data conversion module 7631 may transmit the converted sensing data to the processing module 761 through the I2C interface module 7632.
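The routing just described, in which each data type reaches the processing module 761 over a particular combination of interface modules, can be summarized as a lookup table. This is only a sketch of the mapping stated in the text; the string keys and tuple values are illustrative conventions, not part of the disclosure.

```python
# Data-type -> interface-module routing, as described for fig. 9.
ROUTES = {
    "audio":        ("SLIMBus", "SPI"),   # converted audio data
    "video":        ("MIPI", "I2C"),      # converted video data
    "eye_tracking": ("MIPI", "I2C"),      # converted eye-tracking data
    "sensing":      ("I2C",),             # converted sensing data
}

def interfaces_for(data_type: str) -> tuple:
    """Return the interface modules used to carry a given data type from the
    data conversion module to the processing module; raise for an unrouted type."""
    if data_type not in ROUTES:
        raise ValueError(f"no interface route defined for: {data_type}")
    return ROUTES[data_type]
```

Keeping the mapping in one table mirrors the hardware description: adding a new data path means adding one route entry rather than touching the dispatch logic.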
The integrated circuit module 763 may further include a clock module 7633 for sending clock signals to the data conversion module 7631 and the interface modules 7632.
In some embodiments, the integrated circuit module 763 may further include: a data compression module 7634 and a data decompression module 7635.
The data compression module 7634 and the data decompression module 7635 are each connected between the data conversion module 7631 and the serial interface 762.
Before the data conversion module 7631 converts serial data received from the serial interface 73 through the serial interface 762, the data decompression module 7635 decompresses the received serial data and transmits the decompressed serial data to the data conversion module 7631 for conversion.
Before the data conversion module 7631 outputs converted serial data through the serial interface 762, the data compression module 7634 compresses the serial data to be output, and the compressed serial data is then output to the serial interface 73 through the serial interface 762.
It will be appreciated by those skilled in the art that the compression algorithm used by the data compression module 744 in the XR device 100 should match the decompression algorithm used by the data decompression module 7635 in the host unit 76, and that the compression algorithm used by the data compression module 7634 in the host unit 76 should match the decompression algorithm used by the data decompression module 745 in the XR device 100.
Compressing the data to be transmitted saves transmission bandwidth and increases the effective transmission rate, which further ensures the real-time performance of the data and improves the user experience. It should be noted, however, that the present application does not limit the data compression/decompression algorithm; a specific algorithm may be selected as needed in practical applications.
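To make the matching requirement concrete: whichever codec is chosen, the compressor on one side of the serial link must pair with the same scheme's decompressor on the other side. The sketch below uses zlib only as a stand-in; as the text notes, the patent deliberately leaves the actual algorithm open.

```python
import zlib

def compress_for_link(serial_data: bytes) -> bytes:
    # Compression-module side (e.g. 7634 or 744) before output on the serial interface.
    return zlib.compress(serial_data, level=6)

def decompress_from_link(compressed: bytes) -> bytes:
    # Peer decompression-module side (e.g. 745 or 7635) after receipt on the
    # serial interface; must use the same scheme as the compressor.
    return zlib.decompress(compressed)
```

A round trip through both functions is lossless; if the two sides of the link were configured with different codecs, the stream would be unreadable, which is exactly why the patent requires the algorithms to be paired.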
In some embodiments, the host unit 76 may further include a power management module 764 and a battery 765. The power management module 764 is connected to the battery 765 and to the serial interface 762, and is configured to supply the power provided by the battery 765 through the serial interface 762, so as to power the integrated circuit module 74, the data acquisition module 71, and the data output module 72.
As described above, the host unit 76 may also be implemented as an electronic device.
An electronic device 900 according to this embodiment of the present application is described below with reference to fig. 10. The electronic device 900 shown in fig. 10 is only an example and should not bring any limitation to the functions and the scope of use of the embodiments of the present application.
As shown in fig. 10, which is a block diagram illustrating the architecture of an electronic device 900 according to some embodiments of the present application, the electronic device 900 is in the form of a general-purpose computing device. The components of the electronic device 900 may include, but are not limited to: at least one processing unit 910, at least one storage unit 920, and a bus 930 that couples various system components including the storage unit 920 and the processing unit 910.
The storage unit 920 may include a readable medium in the form of a volatile storage unit, such as a random access memory (RAM) unit 9201 and/or a cache memory unit 9202, and may further include a read-only memory (ROM) unit 9203, among others.
The storage unit 920 may also include a program/utility 9204 having a set of (at least one) program modules 9205, such program modules 9205 including, but not limited to: an operating system, one or more application programs, other program modules, and program data; each of these, or some combination thereof, may include an implementation of a network environment.
The bus 930 may be any of several types of bus structures, including a storage-unit bus or storage-unit controller, a peripheral bus, an accelerated graphics port, and a processor or local bus using any of a variety of bus architectures.
The electronic device 900 may also communicate with one or more external devices 700 (e.g., a keyboard, a pointing device, a Bluetooth device, etc.), with one or more devices that enable a user to interact with the electronic device 900, and/or with any device (e.g., a router, a modem, etc.) that enables the electronic device 900 to communicate with one or more other computing devices. Such communication may occur via the input/output (I/O) interface 950. In addition, the electronic device 900 may communicate with one or more networks (e.g., a local area network (LAN), a wide area network (WAN), and/or a public network such as the Internet) via the network adapter 960. As shown, the network adapter 960 communicates with the other modules of the electronic device 900 via the bus 930. It should be appreciated that, although not shown in the figures, other hardware and/or software modules may be used in conjunction with the electronic device 900, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, and data backup storage systems.
The processing unit 910 may be the processing module in the host unit as described above, connected to the integrated circuit module 970 in the electronic device 900. The specific structure of the integrated circuit module 970 can be seen in fig. 8 or fig. 9 and is not described again here. Further, the input/output interface 950 may be used to implement the serial interface described above.
It should be noted that although several modules or units of the device for action execution are mentioned in the above detailed description, such a division is not mandatory. Indeed, according to embodiments of the present application, the features and functionality of two or more modules or units described above may be embodied in one module or unit. Conversely, the features and functions of one module or unit described above may be further divided so as to be embodied by a plurality of modules or units.
It should be noted that the terms "comprises" and "comprising," and any variations thereof, in the embodiments of the present application are intended to cover non-exclusive inclusion. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to only those steps or elements listed, but may optionally include other steps or elements not listed, or steps or elements inherent to such a process, method, system, article, or apparatus.
The above description presents only some of the embodiments of the present application and is not intended to limit its scope; all equivalent devices or equivalent processes derived from the content of the specification and drawings of the present application, whether applied directly or indirectly in other related technical fields, are likewise included within the scope of protection of the present application.

Claims (10)

1. A hand-held control device, usable with augmented reality equipment, comprising:
a handle body provided with a grip portion, an
A binding connected with the handle body;
wherein the binding comprises an inflation portion and a deflation portion disposed opposite each other, and an airbag portion disposed between the inflation portion and the deflation portion; the deflation portion and the inflation portion are located on opposite sides of the grip portion, and the airbag portion is spaced apart from the grip portion.
2. The hand-held control device of claim 1, wherein the airbag portion is made of a flexible material, and the interior of the airbag portion is hollow.
3. The hand-held control device of claim 2, wherein the airbag portion comprises an inflation end and a deflation end that are oppositely disposed, and an airbag body disposed between the inflation end and the deflation end; the airbag body encloses an air-containing space, the inflation end communicates with the inflation portion, and the deflation end communicates with the deflation portion.
4. The hand-held control device of claim 3, wherein the airbag body comprises a plurality of airbag units distributed in an array, and each two adjacent airbag units communicate with each other.
5. The hand-held control device of claim 1, wherein the handle body is further provided with an air pump and a deflation valve, the air pump being connected to the inflation portion and the deflation valve being connected to the deflation portion.
6. The hand-held control device of claim 5, wherein a control board is further disposed in the handle body, the control board being connected to the air pump and to the deflation valve; the control board can control the air pump to inflate the airbag portion, and can control the deflation valve to deflate the airbag portion.
7. The hand-held control device of claim 6, wherein the binding is provided with a first sensor connected to the control board; the first sensor is configured to detect a pressure signal of the binding, and the control board is configured to, in response to the pressure signal, control the air pump to inflate the airbag portion or control the deflation valve to deflate the airbag portion.
8. The hand-held control device of claim 6, wherein the handle body is further provided with a second sensor connected to the control board; the second sensor is configured to detect the grip state of the grip portion and to send the acquired grip state information of the grip portion to the control board, and the control board controls the air pump to inflate the airbag portion or controls the deflation valve to deflate the airbag portion according to the grip state information.
9. The hand-held control device of claim 6, wherein the handle body is provided with a control button connected to the control board, and the control board controls the air pump to inflate the airbag portion or controls the deflation valve to deflate the airbag portion according to a button signal of the control button.
10. Augmented reality equipment, comprising an equipment main body and a hand-held control device in signal connection with the equipment main body, wherein the hand-held control device comprises:
a handle body provided with a grip portion, an
A binding connected with the handle body;
wherein the binding comprises an inflation portion and a deflation portion disposed opposite each other, and an airbag portion disposed between the inflation portion and the deflation portion; the deflation portion and the inflation portion are located on opposite sides of the grip portion, and the airbag portion is spaced apart from the grip portion.
CN202210295628.3A 2022-03-23 2022-03-23 Handheld control device and augmented reality equipment Pending CN114610165A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210295628.3A CN114610165A (en) 2022-03-23 2022-03-23 Handheld control device and augmented reality equipment

Publications (1)

Publication Number Publication Date
CN114610165A true CN114610165A (en) 2022-06-10

Family

ID=81865412


Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN204709109U (en) * 2015-06-25 2015-10-21 王庆章 A kind of intelligent Chinese medicine othopedics special fixing device
KR20150121938A (en) * 2014-04-22 2015-10-30 엘지전자 주식회사 Electronic device and control method thereof
CN206258821U (en) * 2016-11-10 2017-06-16 天津波塞冬文化传播有限公司 One kind immerses interactive banquet design experience apparatus
CN111781729A (en) * 2020-08-05 2020-10-16 苏州素彰光电科技有限公司 Waveguide augmented reality display device based on microstructure
US20220066500A1 (en) * 2019-04-05 2022-03-03 Hewlett-Packard Development Company, L.P. Apparatus having inflation bladders


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination