CN112083798A - Temperature regulation feedback system responsive to user input - Google Patents

Temperature regulation feedback system responsive to user input

Info

Publication number
CN112083798A
Authority
CN
China
Prior art keywords
user input
mid-air
virtual object
temperature adjustment
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010451233.9A
Other languages
Chinese (zh)
Inventor
金镇勇
赵晨
傅利民
乔纳斯·孔
斯蒂芬妮·陈
迈克尔·李
肯尼思·吴
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Alibaba Group Holding Ltd
Original Assignee
Alibaba Group Holding Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US16/681,629 external-priority patent/US20200393156A1/en
Application filed by Alibaba Group Holding Ltd
Publication of CN112083798A

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 — Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 — Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/016 — Input arrangements with force or tactile feedback as computer generated output to the user
    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06Q — INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00 — Commerce
    • G06Q30/06 — Buying, selling or leasing transactions
    • G06Q30/0601 — Electronic shopping [e-shopping]
    • G06Q30/0641 — Shopping interfaces
    • G06Q30/0643 — Graphical representation of items or shoppers

Abstract

Disclosed is a temperature regulation feedback system responsive to user input, comprising: receiving user input regarding a virtual object; updating the virtual object based at least in part on the user input; determining a temperature adjustment based at least in part on the user input; and outputting the temperature adjustment.

Description

Temperature regulation feedback system responsive to user input
Cross Reference to Related Applications
The present application claims priority from U.S. Provisional Patent Application No. 62/860,687 (attorney docket number ALIBP411+), entitled "Mid-Air Thermal Feedback System for 3D Interaction," filed at 2019, 12, which is incorporated herein by reference for all purposes.
Background
Typically, to experience a product in person, one must go to a brick and mortar store to test the item. However, it is not always convenient or practical to go to a brick and mortar store to try out a product. For example, a person may interact with a product remotely by browsing photos on a website, but this experience is not immersive and not close to actually using the product.
Drawings
Various embodiments of the invention are disclosed in the following detailed description and drawings.
FIG. 1A is a diagram illustrating an embodiment of a system for providing a mid-air temperature adjustment output in response to a user input.
FIG. 1B is a functional diagram illustrating an embodiment of a host computer for providing temperature adjustment in response to user input.
FIG. 2 is a diagram showing an example of a mid-air temperature adjustment device.
FIG. 3 is a diagram illustrating an example of a system for providing a mid-air temperature adjustment output responsive to user input.
FIG. 4 is a flow diagram illustrating an embodiment of a process for providing a mid-air temperature adjustment output responsive to user input.
FIG. 5 is a flow diagram illustrating an embodiment of a process for providing a mid-air temperature adjustment output responsive to user input.
FIG. 6 is a flow chart illustrating an example of a process for providing a mid-air temperature adjustment output responsive to user input.
FIGS. 7A and 7B depict examples of 3D virtual objects displayed by a system for providing a mid-air temperature adjustment output responsive to user input.
Detailed Description
The invention can be implemented in numerous ways, including as a process; an apparatus; a system; a composition of matter; a computer program product embodied on a computer-readable storage medium; and/or a processor, such as a processor configured to execute instructions stored on and/or provided by a memory coupled to the processor. In this specification, these implementations, or any other form that the invention may take, may be referred to as techniques. In general, the order of the steps of disclosed processes may be altered within the scope of the invention. Unless stated otherwise, a component (e.g., a processor or a memory) described as being configured to perform a task may be implemented as a general-purpose component that is temporarily configured to perform the task at a given time, or as a specific component that is manufactured to perform the task. As used herein, the term "processor" refers to one or more devices, circuits, and/or processing cores configured to process data (e.g., computer program instructions).
A detailed description of one or more embodiments of the invention is provided below with reference to the accompanying drawings that illustrate the principles of the invention. The invention is described in connection with such embodiments, but the invention is not limited to any embodiment. The scope of the invention is limited only by the claims and the invention encompasses numerous alternatives, modifications and equivalents. In the following description, numerous specific details are set forth in order to provide a thorough understanding of the invention. These details are provided for the purpose of example and the invention may be practiced according to the claims without some or all of these specific details. For the purpose of clarity, technical material that is known in the technical fields related to the invention has not been described in detail so that the invention is not unnecessarily obscured.
Embodiments are described herein that provide a mid-air temperature adjustment output in response to a user input. A user input is received with respect to a virtual object. In some embodiments, the virtual object is a three-dimensional (3D) object that is rendered and presented by an autostereoscopic display such that the virtual object appears as a 3D hologram. In some embodiments, the user input does not require the user to be in physical contact with a physical object and is detected using a sensor device. For example, the user input may be a hand motion, a foot motion, a head motion, and/or an eye motion. In some embodiments, the user input is determined to affect the virtual object when a collision is detected between the user input and the virtual object in 3D space. A mid-air temperature adjustment is determined based at least in part on the user input. In various embodiments, a "mid-air temperature adjustment" comprises temperature adjustment (e.g., heat, cold, and/or wind) that can be perceived (e.g., by a user) without direct/physical contact with a physical object (e.g., the source of the temperature adjustment). In some embodiments, the temperature adjustment is determined based at least in part on a measurement derived from the user input relative to at least a portion of the virtual object in 3D space. The mid-air temperature adjustment is then output. For example, the mid-air temperature adjustment feedback includes heat blown from at least one mid-air temperature adjustment device in the direction of the user's hand. In some embodiments, the appearance of the virtual object is also updated in response to the user input. In some embodiments, feedback other than mid-air temperature adjustment, e.g., tactile feedback, is also provided in response to the user input.
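The four steps just described (receive input, update the object, determine an adjustment, output it) can be sketched as one host-side routine. This is an illustrative Python sketch; the class and function names are assumptions, not from the patent.

```python
from dataclasses import dataclass

@dataclass
class VirtualObject:
    surface_temp_c: float   # simulated temperature of the rendered object
    touched: bool = False   # appearance flag updated on interaction

def feedback_step(obj, contact_detected):
    """Run one pass of the loop described above; returns the temperature
    adjustment to output, or None when the input misses the object."""
    if not contact_detected:          # step 1: receive/classify the input
        return None
    obj.touched = True                # step 2: update the virtual object
    adjustment = obj.surface_temp_c   # step 3: derive a temperature adjustment
    return adjustment                 # step 4: hand off to the output device

shower = VirtualObject(surface_temp_c=38.0)
assert feedback_step(shower, contact_detected=True) == 38.0
assert shower.touched
```

In a full system the returned adjustment would be translated into device commands rather than returned as a bare number; this sketch only fixes the control flow.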
FIG. 1A is a diagram illustrating an embodiment of a system for providing a mid-air temperature adjustment output responsive to user input. In this example, the system 100 includes a host computer 112, a display device 102, a user input detection device 104, a mid-air temperature adjustment device 106, a mid-air haptic feedback device 108, and an additional feedback device 110. As shown in FIG. 1A, each of the display device 102, the mid-air haptic feedback device 108, the mid-air temperature adjustment device 106, and the additional feedback device 110 is connected to the host computer 112. Also, as shown in FIG. 1A, the user input detection device 104 is connected to the display device 102. In other embodiments, the user input detection device 104 may be directly connected to the host computer 112 instead of to the display device 102. In some embodiments, one or more of the display device 102, the user input detection device 104, the mid-air temperature adjustment device 106, the mid-air haptic feedback device 108, and the additional feedback device 110 includes a corresponding driver that interfaces between it and the other device with which it communicates.
The host computer 112 is configured to execute a software environment (e.g., a game engine) configured to render 3D virtual objects in a 3D space. For example, the software environment is a Unity game engine. The files/information used to render the 3D virtual objects may be stored locally on the host computer 112 or retrieved from a remote server (not shown) via a network (not shown). In some embodiments, a 3D virtual object rendered by the host computer 112 represents a product. In some embodiments, the 3D virtual objects rendered by the host computer 112 are at least partially animated. For example, a 3D virtual object may include a showerhead with an animation of an emitted water flow. The host computer 112 is configured to output at least one 3D virtual object to the display device 102.
The display device 102 is configured to render 3D virtual objects. In some embodiments, the display device 102 is configured to present the 3D virtual object as a 3D hologram. In various embodiments, the display device 102 is an autostereoscopic display device (e.g., a device manufactured by SeeFront™ or Dimenco™), a virtual reality display device, or another device that displays images giving 3D depth perception without the viewer using special headwear or glasses.
The user input detection device 104 includes one or more sensors that track user input/motion. For example, the user input detection device 104 is configured to track eye movements, hand movements, leg/foot movements, and/or head movements. In response to each detected motion, the user input detection device 104 is configured to send a corresponding message to the display device 102, which the display device 102 may in turn send to the host computer 112. Based on the message describing the detected user input, the software environment executing on the host computer 112 is configured to determine whether the user input collides with at least a portion of the 3D virtual object presented on the display device 102 and, if so, to update the appearance of the 3D virtual object accordingly and/or to cause the mid-air temperature adjustment device 106, the mid-air haptic feedback device 108, and/or the additional feedback device 110 to output mid-air feedback. In some embodiments, the user input is determined to "collide" with at least part of the 3D virtual object when a point in 3D space detected for the user input overlaps a volume in 3D space defined for the 3D virtual object. In other words, user gesture interactions sensed by the user input detection device 104 may cause the host computer 112 to generate an updated appearance of the 3D virtual object presented by the display device 102 and/or cause the output of mid-air feedback by at least one peripheral feedback device (e.g., the mid-air temperature adjustment device 106, the mid-air haptic feedback device 108, and/or the additional feedback device 110).
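The point-against-volume overlap test described above is commonly implemented as a point-in-bounding-box check. A minimal sketch, assuming an axis-aligned box (the patent does not fix the geometry):

```python
def collides(point, box_min, box_max):
    """True when the tracked input point lies inside the axis-aligned
    bounding volume defined for a 3D virtual object."""
    return all(lo <= p <= hi for p, lo, hi in zip(point, box_min, box_max))

# A fingertip at (0.2, 0.5, 0.3) inside a unit box around the object:
assert collides((0.2, 0.5, 0.3), (0, 0, 0), (1, 1, 1))
assert not collides((1.5, 0.5, 0.3), (0, 0, 0), (1, 1, 1))
```

A production engine (e.g., Unity's collider system, which the text mentions) would use mesh or capsule colliders instead; the box form only illustrates the overlap idea.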
For example, the host computer 112 may render a plurality of 3D virtual objects, including products downloaded from a remote server associated with an online shopping platform. The host computer 112 is configured to send the rendered 3D virtual objects to the display device 102, where the 3D virtual products are presented in the form of 3D holograms. The user input detection device 104 tracks user gestures with respect to the 3D virtual products. A particular user gesture detected by the user input detection device 104 is determined by the host computer 112 to match a predetermined user gesture associated with selecting a particular 3D virtual product. In response to the detected selection gesture, the software environment executing on the host computer 112 is configured to cause the selected 3D virtual product to be presented on the display device 102 and interacted with via user input. In a particular example, the initially rendered 3D virtual products include 3D showerheads and/or faucets, and the particular item selected is a particular model of showerhead. Once a showerhead is selected, the user may continue to interact with the selected showerhead via user input, thereby obtaining an immersive, mid-air sensory experience, as described below.
The mid-air temperature adjustment device 106 is configured to receive instructions from the host computer 112 to output temperature adjustment feedback. In various embodiments, the mid-air temperature adjustment device 106 includes a heating component and a fan. In some embodiments, the mid-air temperature adjustment device 106 includes a heating component, a cooling component, and a fan. In some embodiments, the mid-air temperature adjustment device 106 is instructed by the host computer 112 to activate the fan to produce a cool breeze, thereby reducing the temperature in the vicinity of the system 100 and/or causing the user to perceive the wind (e.g., on the user's hand). In some embodiments, the mid-air temperature adjustment device 106 is configured to activate a heating component (e.g., a positive temperature coefficient (PTC) heater) to generate heat, thereby raising the temperature near the system 100. In some embodiments, the mid-air temperature adjustment device 106 is configured to activate a cooling component (e.g., a thermoelectric cooler) to produce cold, thereby reducing the temperature in the vicinity of the system 100. In some embodiments, the mid-air temperature adjustment device 106 is configured to activate the fan together with at least one of the heating component and the cooling component to distribute heat and/or cold in a particular direction away from the mid-air temperature adjustment device 106 and toward the user. In some embodiments, the degree of temperature adjustment (e.g., the temperature of the heat or cold) produced by the mid-air temperature adjustment device 106 is determined and indicated by the host computer 112. In some embodiments, the intensity and/or velocity of the wind produced by the mid-air temperature adjustment device 106 is determined and indicated by the host computer 112.
When the user perceives the mid-air temperature adjustment feedback, the user may feel warmth or coolness as if they were near, or even directly contacting, the source of the temperature adjustment. In some embodiments, the mid-air temperature adjustment device 106 is configured to output temperature adjustment feedback in response to user interaction/input with respect to a 3D virtual object presented by the display device 102. For example, if a user interacts with a presented 3D virtual object in a manner that allows the user to experience heat emanating from the 3D virtual object, the mid-air temperature adjustment device 106 is configured to activate its heating component and its fan to transfer heat to the user as if the heat were emanating from the 3D virtual object.
Although only one instance of the mid-air temperature adjustment device 106 is shown in the system 100, multiple instances of the mid-air temperature adjustment device 106 may be connected to the host computer 112. For example, each instance of the mid-air temperature adjustment device 106 can be disposed in a different location (e.g., relative to another peripheral device of the host computer 112, such as the display device 102 or the mid-air haptic feedback device 108) to provide temperature adjustment feedback from a different orientation/location. For example, multiple instances of the mid-air temperature adjustment device 106 may each provide different temperature feedback from their different orientations/locations. In some embodiments, the host computer 112 may select all instances of the mid-air temperature adjustment device 106, or only a subset of instances, to output temperature adjustment feedback based on factors such as the user's position (e.g., the position of the user's palm, which is more sensitive to wind and/or temperature changes than the back of the hand) and/or the orientation of the selected 3D virtual object with which the user is currently interacting.
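The subset selection just described might, for instance, rank device instances by distance to the tracked palm position. The distance threshold and the dictionary layout below are illustrative assumptions; the patent only says a subset may be chosen based on the user's position and the object's orientation.

```python
import math

def select_devices(devices, palm_pos, max_range=0.6):
    """Return the ids of the temperature adjustment units close enough to
    the user's palm (in meters) to be perceived. Illustrative policy only."""
    return [d["id"] for d in devices
            if math.dist(d["pos"], palm_pos) <= max_range]

units = [{"id": "left",  "pos": (-0.5, 0.0, 0.0)},
         {"id": "right", "pos": (0.5, 0.0, 0.0)},
         {"id": "rear",  "pos": (0.0, 0.0, -2.0)}]
# With the palm centered, the two nearby units are chosen; the far one is not.
assert select_devices(units, palm_pos=(0.0, 0.0, 0.0)) == ["left", "right"]
```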
The mid-air haptic feedback device 108 is configured to receive instructions from the host computer 112 to output haptic feedback. Haptics is the science and engineering of applying the sense of touch in computer systems. In some embodiments, mid-air haptics comprise non-contact haptic feedback that the user perceives in mid-air. There are several types of mid-air haptic feedback devices, including, for example, devices that produce ultrasonic vibration using a two-dimensional (2D) array of ultrasonic transducers, laser-based haptic feedback devices, and devices that generate air vortices using subwoofers driving flexible nozzles. In various embodiments, the mid-air haptic feedback device 108 is configured to emit ultrasound waves to generate haptic sensations at one or more focal points. For example, a focal point of the ultrasound waves in 3D space corresponds to the detected position of the user's hand, causing the user to perceive tactile feedback. In some embodiments, the degree (e.g., pressure) and/or location of the haptic feedback generated by the mid-air haptic feedback device 108 is determined and indicated by the host computer 112.
When a user perceives the mid-air haptic feedback, they may experience pressure as if they were directly contacting a physical object. In some embodiments, the mid-air haptic feedback device 108 is configured to output haptic feedback in response to user interaction/input with respect to a 3D virtual object presented by the display device 102. In one example, as a user browses through multiple 3D virtual objects or 3D environments/scenes (e.g., menus), the mid-air haptic feedback device 108 is configured to output haptic feedback that allows the user to perceive that they are physically moving items or making a selection (e.g., of a particular 3D virtual object). In another example, if the user interacts with a presented 3D virtual object in a manner that allows the user to touch the 3D virtual object, the mid-air haptic feedback device 108 is configured to output haptic feedback that causes the user to feel that his or her hand is contacting the object.
Although only one instance of the mid-air haptic feedback device 108 is shown in the system 100, multiple instances of the mid-air haptic feedback device 108 may be connected to the host computer 112. For example, each instance of the mid-air haptic feedback device 108 may be disposed in a different location (e.g., relative to another peripheral device of the host computer 112, such as the display device 102 or the mid-air temperature adjustment device 106) to provide haptic feedback from a different orientation/location. In some embodiments, the host computer 112 may select all instances of the mid-air haptic feedback device 108, or only a subset of instances, to output haptic feedback based on factors such as the user's position (e.g., the position of the user's palm, which is more sensitive to wind and/or temperature changes than the back of the hand) and/or the orientation of the selected 3D virtual object with which the user is currently interacting.
In some embodiments, the host computer 112 is configured to cause the mid-air temperature adjustment device 106 and the mid-air haptic feedback device 108 to output feedback simultaneously, causing the user to perceive that he or she is touching or contacting a tangible item that provides cooling or warming, without ever contacting a physical item or a heat/cold source. Returning to the example above, in which the user selected a 3D virtual object comprising a showerhead for further interaction, the display device 102 is configured to magnify or otherwise make more prominent the presentation of the 3D virtual showerhead so that the user may try it more easily. For example, a user may use his or her hands to cause the 3D hologram of the showerhead to spray water, and the 3D hologram will be updated to show water spraying from the 3D virtual showerhead in a manner that approximates a physical version of the same showerhead. For example, the user's hand motion will be tracked by the user input detection device 104 and ultimately input to the host computer 112. If the software environment executing on the host computer 112 determines that the user input matches a predetermined user gesture (e.g., turning the palm in a certain direction) associated with causing the 3D virtual showerhead to spray water, the software environment is configured to update the appearance of the 3D virtual showerhead by causing the display device 102 to additionally present an animation of water being sprayed from the showerhead. For example, the user may interact with the 3D virtual showerhead and the animated water stream to change the shape of the water stream and trigger the temperature adjustment and tactile feedback outputs, so that the user's hands feel pressure and warmth from the water spray of the 3D virtual showerhead.
In another example, if the 3D virtual objects include both a 3D virtual showerhead and a 3D virtual bathtub faucet, both may spray water in response to user interaction. For example, the user's hand motion will be tracked by the user input detection device 104 and ultimately input to the host computer 112. If the software environment executing on the host computer 112 determines that the user input collides in 3D space with the water spray animation of the 3D virtual showerhead and/or the 3D virtual bathtub faucet, the software environment is configured to update the appearance of the 3D virtual showerhead and/or the 3D virtual bathtub faucet by causing the display device 102 to display a changed water spray animation in which the water flow is distorted by the collision with the user's hand. Further, if such a collision is determined, the host computer 112 is configured to send instructions to the mid-air temperature adjustment device 106 to cause it to generate heat and activate its fan to distribute the heat, and to send instructions to the mid-air haptic feedback device 108 to cause it to generate a tactile sensation at a focal point corresponding to the position of the user's hand in 3D space. As a result of the temperature adjustment and tactile feedback, the user may experience an immersive, simulated trial of water spraying from the selected showerhead without directly contacting or even touching a physical object.
If multiple instances of the mid-air temperature adjustment device 106 and the mid-air haptic feedback device 108 are used, the host computer 112 may send different signals to each instance to cause each instance to emit different feedback. For example, if the 3D virtual object can produce hot and cold water from different jets simultaneously, the host computer 112 can send one set of signals to cause one instance of the mid-air temperature adjustment device 106 to emit hot air, while the host computer 112 can send another set of signals to cause another instance of the mid-air temperature adjustment device 106 to emit cold air.
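Sending different signals to different instances, as in the hot-and-cold example above, could be organized as a simple mapping from simulated jets to per-device commands. The device names and command strings here are hypothetical, not part of the patent:

```python
def signals_for(jets):
    """Map each simulated water jet to the command pair for its assigned
    mid-air temperature adjustment device: hot jets get heat plus fan,
    cold jets get fan only (illustrative encoding)."""
    return {jet["device"]: ("HEAT_ON", "FAN_ON") if jet["hot"]
            else ("HEAT_OFF", "FAN_ON")
            for jet in jets}

jets = [{"device": "unit_a", "hot": True}, {"device": "unit_b", "hot": False}]
assert signals_for(jets) == {"unit_a": ("HEAT_ON", "FAN_ON"),
                             "unit_b": ("HEAT_OFF", "FAN_ON")}
```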
The additional feedback device 110 is configured to provide feedback beyond tactile or temperature-based feedback. In one example, the additional feedback device 110 includes a speaker configured to provide audio-based feedback. In another example, the additional feedback device 110 is configured to provide scent-based feedback. In some embodiments, in response to user input tracked by the user input detection device 104 and forwarded to the host computer 112, the host computer 112 is configured to determine whether the user input and/or the 3D virtual object currently presented by the display device 102 should trigger audio- and/or scent-based feedback.
FIG. 1B is a functional diagram illustrating an embodiment of a host computer for providing temperature adjustment in response to user input. As will be apparent, other computer system architectures and configurations may be used to provide temperature adjustment in response to user input. In some embodiments, the host computer 112 of the system 100 of FIG. 1A is implemented using a computer system 150. The computer system 150, which includes various subsystems as described below, includes at least one microprocessor subsystem (also referred to as a processor or a central processing unit (CPU)) 152. For example, the processor 152 can be implemented by a single-chip processor or by multiple processors. In some embodiments, the processor 152 is a general-purpose digital processor that controls the operation of the computer system 150. Using instructions retrieved from the memory 160, the processor 152 controls the reception and manipulation of input data, and the output and display of data on output devices (e.g., the display 168).
The processor 152 is bidirectionally coupled to the memory 160, which may include a first main memory area, typically Random Access Memory (RAM), and a second main memory area, typically read-only memory (ROM). As is well known in the art, main storage may be used as a general-purpose memory area and a temporary memory (scratch-pad memory), as well as for storing input data and processing data. Main storage may also store programming instructions and data in the form of data objects and text objects, as well as other data and instructions for processes running on processor 152. As is also well known in the art, main storage typically includes the basic operating instructions, program code, data, and objects used by the processor 152 to perform its functions (e.g., programming instructions). For example, memory 160 may include any suitable computer-readable storage medium, as described below, depending on, for example, whether data access requires bi-directional or uni-directional access. For example, the processor 152 may also directly and very quickly retrieve and store frequently needed data in a cache (not shown).
The removable mass storage device 162 provides additional data storage capacity for the computer system 150 and is either bi-directionally coupled (read/write) or uni-directionally coupled (read only) to the processor 152. For example, storage 162 may also include computer-readable media such as magnetic tape, flash memory, PC cards, portable mass storage devices, holographic storage devices, and other storage devices. For example, the fixed mass storage 170 may also provide additional data storage capacity. The most common example of fixed mass storage 170 is a hard disk drive. Mass storage 162, 170 typically stores additional programming instructions, data, etc., which are not typically actively used by processor 152. It will be appreciated that the information retained within mass storage 162 and 170 may, in a standard manner, be implemented as part of memory 160 (e.g., RAM) as virtual memory, if desired.
In addition to providing the processor 152 with access to the memory subsystem, the bus 164 may also be used to provide access to other subsystems and devices. As shown, these may include a display 168, a network interface 166, a keyboard 154, and a pointing device 158, as well as auxiliary Input/Output (I/O) device interfaces, sound cards, speakers, and other desired subsystems. For example, pointing device 158 may be a mouse, a stylus, a trackball, or a tablet computer, and may be useful for interacting with a graphical user interface.
The network interface 166 allows the processor 152 to be coupled to another computer, computer network, or telecommunications network using a network connection as shown. For example, through the network interface 166, the processor 152 may receive information (e.g., data objects or program instructions) from another network or output information to another network in the course of performing method/process steps. Information, often represented as a sequence of instructions to be executed on a processor, may be received from and output to another network. An interface card or similar device and appropriate software implemented (e.g., executed) by the processor 152 may be used to connect the computer system 150 to an external network and transfer data according to standard protocols. For example, various process embodiments disclosed herein may execute on the processor 152 or may execute over a network such as the Internet, an intranet, or a local area network, in conjunction with a remote processor that shares a portion of the processing. Additional mass storage devices (not shown) may also be connected to the processor 152 through the network interface 166.
An auxiliary I/O device interface (not shown) may be used in conjunction with computer system 150. The auxiliary I/O device interfaces may include general purpose and custom interfaces that allow the processor 152 to send and typically receive data from other devices, such as a microphone, touch sensitive display, sensor card reader, tape card reader, voice or handwriting recognizer, biometric card reader, camera, portable mass storage device, and other computers.
FIG. 2 is a diagram showing an example of a mid-air temperature adjustment device. In some embodiments, the mid-air temperature adjustment device 106 of the system 100 of FIG. 1A is implemented using the mid-air temperature adjustment device 206 of FIG. 2.
As shown in FIG. 2, the mid-air temperature adjustment device 206 includes a fan 202 and a heating component 204, such as a positive temperature coefficient (PTC) heater, each connected to a drive circuit 208 with a microcontroller. The drive circuit 208 with the microcontroller is in turn connected to a host computer (e.g., the host computer 112 of the system 100 of FIG. 1A) and is configured to interface with the host computer. For example, instructions from the host computer are received at the drive circuit 208 with the microcontroller, which is configured to send corresponding instructions to one or both of the fan 202 and the heating component 204. In some embodiments, the fan 202 and the heating component 204 may be instructed to activate at different times. For example, the heating component 204 may be instructed to activate (e.g., turn on) upon a particular selection made by a user at a user input detection device connected to the host computer. For example, the heating component 204 may be activated in anticipation of a subsequent output of temperature adjustment feedback, because the heating component 204 cannot generate heat instantaneously. Then, at some time after the heating component 204 is activated, the drive circuit 208 with the microcontroller receives instructions from the host computer to activate the fan 202. Because the heating component 204 was activated in advance and heat has already been generated, the subsequent activation of the fan 202 causes heat to be distributed/transferred away from the mid-air temperature adjustment device 206.
The following is an example process by which the mid-air temperature adjustment device 206 is configured to generate a temperature adjustment: when the drive circuit 208 with the microcontroller receives an activation signal from the host computer, it may activate the heating component 204 and/or the fan 202 immediately or at a later time, depending on the type of activation signal. When the drive circuit 208 with the microcontroller receives a deactivation signal from the host computer, it may immediately deactivate the heating component 204 and/or the fan 202, depending on the type of deactivation signal. The host computer may send an activation signal to only the fan 202 in the event that unheated (cool) air is desired. In some embodiments, the host computer includes a timer to track how long the heating component 204 has been on and sends a deactivation signal after the timer expires (e.g., to prevent the heating component 204 from overheating). For example, the timer is configured (e.g., by the user) to start at a certain value (e.g., 10) and to count down once the heating component 204 starts to generate heat. Once the timer reaches zero, the host computer sends a deactivation signal to the drive circuit 208 with the microcontroller to deactivate the heating component 204 and the fan 202.
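The sequencing described above (pre-activate the heater, activate the fan later, deactivate everything on a countdown) can be sketched as follows. This is a minimal host-side sketch; the class and method names, and the 10-second default, are illustrative assumptions rather than the patent's implementation:

```python
class TemperatureAdjustmentController:
    """Hypothetical host-side sketch: the heating component is
    activated ahead of time, the fan is activated later, and a
    countdown timer deactivates both to prevent overheating."""

    def __init__(self, heater_timeout=10):
        self.heater_timeout = heater_timeout  # seconds (e.g., starts at 10)
        self.heater_on = False
        self.fan_on = False
        self.heater_started_at = None

    def activate_heater(self, now):
        # Sent early because the heater cannot generate heat instantly.
        self.heater_on = True
        self.heater_started_at = now

    def activate_fan(self, now):
        # The fan distributes heat already generated by the pre-activated
        # heater (or blows unheated air if the heater is off).
        self.fan_on = True

    def tick(self, now):
        # Once the heater timer expires, deactivate both devices.
        if self.heater_on and now - self.heater_started_at >= self.heater_timeout:
            self.heater_on = False
            self.fan_on = False
```

The `tick` method stands in for the host computer's timer check; in a real system it would run periodically alongside instruction dispatch to the drive circuit.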
In some embodiments, the drive circuit 208 with the microcontroller is configured to receive instructions specifying a particular degree and/or intensity at which one or both of the fan 202 and the heating component 204 should operate. The drive circuit 208 with the microcontroller is configured to convert that degree and/or intensity into corresponding computer instructions that cause the fan 202 and/or the heating component 204 to operate accordingly, and to send those instructions to the corresponding device(s).
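The degree/intensity conversion could look like the following sketch. The 0-100 input scale, the PWM-style fan duty, and the stepped heater power level are all illustrative assumptions about what "corresponding computer instructions" might contain:

```python
def intensity_to_instructions(level, max_duty=255):
    """Convert an abstract intensity level (0-100) into hypothetical
    low-level values the microcontroller could forward to the fan
    (a PWM duty value) and the heating component (a power step).
    Scale and field names are assumptions, not from the patent."""
    if not 0 <= level <= 100:
        raise ValueError("intensity level must be in [0, 100]")
    return {
        "fan_duty": round(level * max_duty / 100),  # 0..255 PWM duty
        "heater_step": level // 25,                 # coarse power step 0..4
    }
```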
FIG. 3 is a diagram illustrating an example of a system for providing a mid-air temperature adjustment output responsive to user input. In the example of fig. 3, system 300 includes a display device 310, a motion detection device 308, a mid-air haptic feedback device 306a, a mid-air haptic feedback device 306b, a mid-air temperature adjustment device 304, and a host computer (not shown). Although not shown in fig. 3, the host computer is connected to, receives data from, and sends instructions to each of the display device 310, the motion detection device 308, the mid-air haptic feedback device 306a, the mid-air haptic feedback device 306b, and the mid-air temperature adjustment device 304.
In this example, the display device 310 includes an autostereoscopic display that presents a 3D hologram of an object to the user 302 without the user needing to wear special glasses to experience the hologram's 3D depth. In other embodiments, a display that requires special glasses, goggles, or other equipment to view the virtual objects may be used. The virtual objects presented by the display device 310 are rendered by a software environment (e.g., a game engine) executing on the host computer. For example, a virtual object comprises two stereoscopic images that can be displayed on an autostereoscopic 3D display. The motion detection device 308 is built into the display device 310 and is configured to track the user's motion, including, for example, eye and/or hand motion. The motion detection device 308 is configured to track the orientation and posture of the hand of the user 302. A mid-air haptic feedback device 306a is placed in front of and above the display device 310, and a mid-air haptic feedback device 306b is placed in front of and below the display device 310, to provide haptic feedback to the user 302 from the top and bottom, respectively, matching the stereoscopic illusion (e.g., 3D virtual object) presented in mid-air. In fig. 3, each of the mid-air haptic feedback devices 306a and 306b comprises an ultrasonic haptic device that includes a 2D array of transducers producing focused ultrasonic vibrations, so that the user 302 can feel a sense of touch when a hand is placed in their vicinity. As shown in FIG. 3, the mid-air temperature adjustment device 304 is placed next to the mid-air haptic feedback device 306a to provide heated air in conjunction with (or independent of) the haptic feedback. For example, the angle of the mid-air temperature adjustment device 304 is between 30 and 45 degrees, facing the focal point of the mid-air haptic feedback device 306a. Although not shown in fig. 
3, a corresponding mid-air temperature adjustment device may also be placed next to the mid-air haptic feedback device 306b. In some embodiments, when either of the mid-air haptic feedback devices 306a or 306b is activated to generate focused haptic feedback, heated air is also generated by the respective mid-air temperature adjustment device to provide temperature adjustment feedback that can be perceived by the palm of the hand of, e.g., user 302 interacting with a 3D object presented by display device 310. For example, haptic and/or temperature adjustment feedback may be output from either the pair of mid-air haptic feedback device 306a and mid-air temperature adjustment device 304, or the pair of mid-air haptic feedback device 306b and its corresponding mid-air temperature adjustment device, depending on the predicted current position of the palm of user 302 and/or the orientation of the 3D virtual object presented by display device 310.
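The choice between the upper and lower device pairs can be sketched as a simple rule on the virtual object's spray direction: a downward-pointing showerhead implies an upward-facing palm, so the devices above the display fire downward, and vice versa. The function, the direction encoding, and the device labels are illustrative assumptions:

```python
def select_feedback_devices(spray_direction):
    """Hypothetical sketch of choosing which mid-air haptic /
    temperature adjustment device pair to activate, based on the
    expected palm orientation implied by the virtual object's
    spray direction."""
    if spray_direction == "down":
        # Downward showerhead -> palm faces up -> fire from above.
        return {"haptic": "306a", "thermal": "top"}
    if spray_direction == "up":
        # Upward jet (e.g., hot tub) -> palm faces down -> fire from below.
        return {"haptic": "306b", "thermal": "bottom"}
    raise ValueError("unsupported spray direction")
```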
FIG. 4 is a flow diagram illustrating an embodiment of a process for providing a mid-air temperature adjustment output responsive to user input. In some embodiments, process 400 may be implemented on system 100 of FIG. 1A.
At 402, information related to a plurality of virtual objects is obtained from an online platform server. In some embodiments, information corresponding to a virtual object corresponding to a new product is obtained from a server associated with an online shopping platform. For example, the information associated with the virtual object includes images that can be used to render 3D holograms of corresponding new physical products sold on the online shopping platform. The information related to the virtual object also includes a description of the characteristics and/or specifications of the new product to which the virtual object corresponds.
At 404, a plurality of virtual objects are caused to be presented. The virtual object is presented in the form of a 3D hologram on a display device, which a user can browse via user input (e.g. hand movements). For example, user interaction with a presented 3D virtual object may allow a cursor to highlight any one of the 3D virtual objects.
At 406, a selection related to a rendered virtual object is received. In some embodiments, a user selection gesture (e.g., the user performing a pushing motion with his or her hand) for selecting a particular rendered 3D virtual object is determined. Only if the predetermined user selection gesture is detected for the 3D virtual object currently highlighted by the cursor is that 3D virtual object selected for further interaction. In some embodiments, if the predetermined user selection gesture is not detected within a predetermined time after the cursor highlights a 3D virtual object, the 3D virtual object is automatically selected (e.g., in anticipation of the user's potential interest in the 3D virtual object and/or because the system may be unable to detect a predetermined user selection gesture the user is attempting to perform).
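The selection rule above (gesture-based selection with a dwell-timeout fallback) can be sketched as follows; the 3-second timeout value and the function shape are illustrative assumptions:

```python
def select_object(highlighted, gesture_detected, dwell_seconds, timeout=3.0):
    """Hypothetical sketch of step 406: the highlighted 3D virtual
    object is selected when the predetermined gesture is detected,
    or automatically once the cursor has dwelled on it longer than
    a timeout (anticipating interest, or a gesture the system
    failed to recognize). Returns the selected object, or None."""
    if highlighted is None:
        return None  # nothing highlighted by the cursor
    if gesture_detected or dwell_seconds >= timeout:
        return highlighted
    return None
```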
One example application of a mid-air temperature adjustment output in response to user input is a holographic signage kiosk setup for presenting virtual products. Users can interact with a virtual product using mid-air gestures, motions, and eye movements, and their hands can feel simulated warmth or cold when they collide with the 3D virtual object in different ways.
FIG. 5 is a flow diagram illustrating an embodiment of a process for providing a mid-air temperature adjustment output responsive to user input. In some embodiments, process 500 may be implemented on system 100 of FIG. 1A.
At 502, user input regarding a virtual object is received. The user input includes user motion, such as eye motion, hand motion, leg motion, foot motion, and/or head motion. In some embodiments, the virtual object is presented on a display. For example, the virtual object is an object selected by a user as in process 400 of FIG. 4. For example, the display comprises an autostereoscopic display. In various embodiments, if it is determined that the location of the user input collides with (e.g., is located within) a virtual object in 3D space, the user input is determined to be associated with that virtual object.
At 504, the virtual object is updated based at least in part on the user input. In some embodiments, the appearance of the virtual object is updated based on the user input. For example, the shape and/or related animation of the virtual object is updated, and the updated virtual object is presented according to the location of the user input (e.g., the collision region) relative to the location of the virtual object in 3D space. For example, the user input may have deformed at least a portion of the virtual object because the virtual object is configured to be compressible (e.g., the virtual object is a sofa cushion). In another example, the user input may have changed the associated animation of the virtual object (e.g., the virtual object includes a showerhead with an animation of water spraying from it).
At 506, mid-air temperature adjustment feedback is determined based at least in part on the user input. Corresponding mid-air temperature adjustment feedback, comprising an airflow and at least one of warmth or cold, is determined based on the user input. For example, the corresponding mid-air temperature adjustment feedback is generated based on the position of the user input relative to the position of at least a portion of the virtual object in 3D space.
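One way to determine feedback from the relative positions is a distance-based intensity falloff: the closer the hand is to the virtual heat source, the stronger the warm airflow. The linear falloff, the 0.5 m range, and the output shape are assumptions for illustration only:

```python
def temperature_feedback(hand_pos, source_pos, max_range=0.5):
    """Illustrative sketch of step 506: map the position of the user
    input (hand) relative to a virtual heat source (e.g., a warm
    water stream) to a feedback kind and an intensity in [0, 1]."""
    # Euclidean distance between the hand and the source in 3D space.
    dist = sum((h - s) ** 2 for h, s in zip(hand_pos, source_pos)) ** 0.5
    if dist >= max_range:
        return {"kind": "warm", "intensity": 0.0}
    return {"kind": "warm", "intensity": 1.0 - dist / max_range}
```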
At 508, mid-air temperature adjustment feedback is output. The corresponding mid-air temperature adjustment feedback is output in the direction of the user input and/or toward the current location of the user. In some embodiments, the corresponding mid-air temperature adjustment feedback is output along with other mid-air feedback (e.g., haptic feedback) that is also determined based on the user input.
An example application for providing temperature adjustment feedback in response to user input is in a shopping area where a user may wish to interact with a virtual product before purchasing the physical version of the product. The user can not only view the product details but also experience the functions the product provides. For example, a user may wish to try out the new water-spray feature of a newly released luxury spa bathtub and feel its pressure. In another example, the user may want to try out different warm-water shower settings of a new showerhead product. By applying a temperature regulation feedback system according to some embodiments described herein, a user may interact with the features of a 3D virtual product and receive multi-sensory feedback for a more realistic, intuitive simulation of product interaction.
FIG. 6 is a flow chart illustrating an example of a process for providing a mid-air temperature adjustment output responsive to user input. In some embodiments, process 600 may be implemented on system 100 of FIG. 1A. In some embodiments, process 400 of fig. 4 may be implemented at least in part by process 600. In some embodiments, process 500 of fig. 5 may be implemented at least in part by process 600.
At 602, the generated 3D virtual environment is presented. The 3D virtual environment includes a 3D virtual scene. In some embodiments, one or more 3D virtual objects are presented in the 3D virtual environment. The 3D virtual environment and/or the 3D virtual objects are rendered by a display device (e.g., an autostereoscopic display). In some embodiments, the 3D virtual environment and/or 3D virtual objects are created using a game engine (e.g., Unity). For example, the 3D virtual scene includes a bathroom, and the 3D virtual objects in the scene are different bathroom fixtures (e.g., a bathtub, shower head, and/or toilet).
At 604, a user selection of a 3D virtual object is detected in the 3D virtual environment. The user selects a particular 3D virtual object based on, for example, detected user input matching a predetermined selection gesture. The selected 3D virtual object may be enlarged in the presentation through the display device, thereby making its features more visible to the user.
In some embodiments, the 3D virtual object includes an animation. For example, the 3D virtual object is a combination of a bathroom fixture and a liquid (e.g., water) released from the fixture; in particular, the 3D virtual object includes a stream of water flowing from a faucet, shower head, or nozzle, and the animation shows water continuously being ejected from that source.
At 606, user input is detected. The motion detection device detects motion/input of a user (e.g., hands, eyes, head, feet, and/or legs). In some embodiments, one or more points in 3D space that are relevant to the user input are determined. For example, coordinates relating to the position of the user's hand in 3D space are determined.
At 608, it is determined whether the user input collided with the 3D virtual object. If the user input collides with the 3D virtual object, control transfers to 610. Otherwise, if the user input does not collide with the 3D virtual object, control will return to 606 to wait for the user input.
In some embodiments, to determine whether the user input collides with the 3D virtual object, the point(s) in 3D space associated with the user input are compared against a collision region of the 3D virtual object. For example, the collision region of the 3D virtual object may be defined as a portion of the 3D virtual object or as the entire volume of the 3D virtual object. If the coordinates associated with the position of the user's hand in 3D space overlap any defined collision region of the 3D virtual object, it is determined that there is a collision between the user input and the 3D virtual object. However, if the coordinates do not overlap any defined collision region of the 3D virtual object, it is determined that there is no collision, and control returns to step 606 to await the next user input.
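The overlap test above can be sketched with a simple geometric check. The patent leaves the collision region's exact shape open, so this sketch assumes each region is an axis-aligned box in 3D space:

```python
def collides(point, region_min, region_max):
    """Illustrative sketch of step 608: decide whether a 3D point
    associated with the user input (e.g., the hand position) lies
    inside a collision region, modeled here as an axis-aligned box
    given by its minimum and maximum corner coordinates."""
    return all(lo <= p <= hi
               for p, lo, hi in zip(point, region_min, region_max))
```

A real system would test the hand position against every defined collision region of the 3D virtual object and report a collision if any test succeeds.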
At 610, a measurement is determined based at least in part on the user input and the reference point. In some embodiments, the reference point is defined as part of a 3D virtual object. In some embodiments, the reference point is defined as the position of the motion detection device. For example, the distance between the user-input location and a reference point in 3D space may be calculated.
At 612, the appearance of the 3D virtual object is updated based at least in part on the measurements. In some embodiments, the 3D virtual object may be updated by changing the shape and/or animation of the 3D virtual object. For example, if the detected user's hand is near a reference point of a 3D virtual object that includes a water flow source, the distance between the water flow source and the user's hand will decrease, and the 3D virtual object may be displayed as a shorter water flow that appears to be compressed. In another example, depending on the position of the user's hand, the water flow may be shown to both collide with the object (user's hand) and flow around the object.
At 614, temperature adjustment feedback is generated based at least in part on the measurements. In some embodiments, the temperature adjustment feedback may be updated by turning on the heating component, changing the temperature of the heating component, turning on the fan, and/or combinations thereof. Because the heating component of the temperature regulation feedback device cannot generate heat instantaneously (e.g., generating heat may take two to three seconds), it may be turned on in anticipation of a collision between the user input and the 3D virtual object being detected. For example, the heating component of the temperature regulation feedback device may be turned on when a user selects to interact with a 3D virtual object that provides thermal feedback, or even when the 3D virtual environment is first rendered (e.g., at step 604 or 602, respectively). Later, when a collision between the user's hand and the 3D virtual object is detected, the fan portion of the temperature regulation feedback device may be turned on to direct the generated heat toward the location of the user's hand (e.g., palm) to simulate the warming (or other temperature adjustment) feedback associated with the user's engagement with the 3D virtual object.
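The anticipation logic above (pre-heat on selection, fan only on collision) can be sketched host-side as event handlers; the class and method names are illustrative assumptions:

```python
class AnticipatoryHeater:
    """Hypothetical host-side sketch of step 614's anticipation
    logic: pre-heat as soon as a heat-capable 3D virtual object is
    selected (heat takes ~2-3 s to build), then spin up the fan
    only once a collision with the user's hand is detected."""

    def __init__(self):
        self.heater_on = False
        self.fan_on = False

    def on_object_selected(self, provides_heat):
        # Turn the heater on early, in anticipation of interaction.
        if provides_heat:
            self.heater_on = True

    def on_collision(self):
        # The fan directs already-generated heat toward the palm;
        # without a pre-heated heater there is no warmth to deliver.
        if self.heater_on:
            self.fan_on = True
```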
At 616, temperature adjustment feedback is output. If multiple temperature regulation feedback devices are used, temperature regulation feedback may be provided by a combination of temperature regulation feedback devices depending on the position of the user's hand (e.g., palm).
At 618, non-temperature-related feedback is generated based at least in part on the measurements.
At 620, non-temperature-related feedback is output.
In some embodiments, the mid-air haptic feedback may be updated by changing the intensity/pressure and/or area of the mid-air focal position. For example, if the user's hand is closer to a 3D virtual object that includes a water flow source, the haptic feedback may apply more pressure to the user's hand (e.g., palm) than if the user's hand were farther from the water flow source. Further, if the user's hand is closer to a reference point of the 3D virtual object that includes the water flow source, the mid-air haptic feedback will apply pressure over a smaller contact area of the user's hand (e.g., palm) than if the user's hand were farther from the water flow source. If multiple mid-air haptic feedback systems are used, haptic feedback may be provided by a combination of them depending on the position of the user's hand (e.g., palm).
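The inverse relationships above (closer hand: higher pressure, smaller focal area) can be sketched as a mapping from the measured distance to haptic parameters. The units (meters), the 0.5 m range, and the specific linear mappings are assumptions for illustration:

```python
def haptic_parameters(distance, max_distance=0.5):
    """Illustrative mapping from the hand-to-source distance to
    mid-air haptic parameters: pressure grows and the focal contact
    area shrinks as the hand approaches the water flow source."""
    # Clamp the distance to the working range of the device.
    d = min(max(distance, 0.0), max_distance)
    closeness = 1.0 - d / max_distance  # 0 (far) .. 1 (at source)
    return {
        "pressure": closeness,                # stronger when closer
        "focal_area": 1.0 - 0.8 * closeness,  # smaller when closer
    }
```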
In some embodiments, the audio feedback may be updated by playing different recorded audio data and/or playing at different volumes depending on the position of the user's hand.
In some embodiments, the scent-based feedback may be updated by emitting different scents based on detected user input and/or the product type associated with the selected 3D virtual object.
In some embodiments, the non-temperature-related feedback is coordinated with the temperature adjustment feedback such that the user perceives the multiple kinds of feedback together. For example, the mid-air haptic feedback may be output from the mid-air haptic feedback device in the same direction as, and at least partially synchronized with, the mid-air temperature adjustment feedback output from the mid-air temperature adjustment device. In a particular example, if a user performs user input on a 3D showerhead object that sprays water, mid-air heat and mid-air haptic feedback (in addition to audio feedback) may be output toward the location of the user's palm so that the user experiences a simulated sensation of warm water contacting his or her palm.
In some embodiments, the type of update to the 3D virtual object and/or the type of feedback generated by the one or more feedback devices is determined based on a predetermined mapping between the current position of the user's hand and corresponding update and/or feedback rules. In some embodiments, the type of update to the 3D virtual object and/or the type of feedback generated by the one or more feedback devices is dynamically determined based on a physics-based simulation. For example, the type of update to the 3D virtual object includes how the shape and animation of the 3D virtual object change under different user inputs. For example, the types of feedback generated by the one or more feedback devices include the duration, temperature, intensity, volume, and/or output direction of each feedback device (e.g., a mid-air temperature adjustment feedback device, a mid-air haptic feedback device).
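A predetermined mapping of the kind described above can be sketched as a lookup table keyed on hand-position zones. The zone names, device labels, and parameter values below are illustrative assumptions, not values from the patent:

```python
# Hypothetical predetermined mapping from the zone containing the
# user's hand to object-update and feedback rules.
FEEDBACK_RULES = {
    "under_showerhead": {
        "object_update": "deform_water_stream",
        "thermal": {"device": "top", "intensity": 0.8},
        "haptic": {"device": "top", "pressure": 0.6},
    },
    "outside_stream": {
        "object_update": None,
        "thermal": None,
        "haptic": None,
    },
}

def rules_for_zone(zone):
    """Look up the update/feedback rules for the zone containing the
    user's hand; unknown zones produce no feedback."""
    return FEEDBACK_RULES.get(zone, FEEDBACK_RULES["outside_stream"])
```

A physics-based alternative would replace this table with a simulation that derives the same parameters from the collision geometry at runtime.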
At 622, it is determined whether to continue providing a mid-air temperature adjustment output. If so, control returns to 606. Otherwise, if the mid-air temperature adjustment output is no longer to be provided, process 600 ends.
For example, the mid-air temperature adjustment output is no longer provided if the system implementing process 600 is shut down and/or powered off.
Fig. 7A and 7B depict examples of 3D virtual objects displayed by a system for providing a mid-air temperature adjustment output responsive to user input. In the example of fig. 7A and 7B, system 700 includes a display device 710, an eye movement detection device 708, a hand detection device 706, a mid-air haptic feedback device 702a, a mid-air haptic feedback device 702b, a mid-air temperature adjustment device 704a, a mid-air temperature adjustment device 704b, and a host computer (not shown). In the example of system 700, the eye movement detection device 708 is embedded in the display device 710, while the hand detection device 706 is not; it is instead positioned adjacent to the mid-air haptic feedback device 702b. As shown in fig. 7A and 7B, the mid-air haptic feedback device 702a and the mid-air temperature adjustment device 704a are disposed above the display device 710, and the mid-air haptic feedback device 702b and the mid-air temperature adjustment device 704b are disposed below the display device 710. Depending on the location of the detected user input (e.g., as detected by the eye movement detection device 708 and the hand detection device 706) and/or the orientation of the 3D virtual showerhead 712, feedback may be provided by different combinations of the mid-air haptic feedback device 702a, the mid-air haptic feedback device 702b, the mid-air temperature adjustment device 704a, and the mid-air temperature adjustment device 704b. Although not shown in fig. 7A and 7B, the host computer is connected to, receives data from, and transmits instructions to the display device 710, the eye movement detection device 708, the hand detection device 706, the mid-air haptic feedback device 702a, the mid-air haptic feedback device 702b, the mid-air temperature adjustment device 704a, and the mid-air temperature adjustment device 704b.
In the example of fig. 7A, the display device 710, which is an autostereoscopic display, is displaying a 3D virtual showerhead 712. For example, the 3D virtual showerhead 712 is displayed in response to the user's selection from a menu previously presented on the display device 710. In a particular example, a virtual bathroom scene was previously presented on the display device 710, and the user selected the 3D virtual showerhead 712 to interact with the features of that showerhead model. The 3D virtual showerhead 712 includes an animation of a water stream 716 flowing from the showerhead. For example, the animation of the water stream 716 of the 3D virtual showerhead 712 may be modeled on how water is actually ejected from the physical showerhead on which the 3D virtual showerhead 712 is based. As shown in fig. 7A, without any user interaction with the 3D virtual showerhead 712, the animated water stream 716 flows uninterrupted because it does not collide with any object. As described below with reference to fig. 7B, user inputs (e.g., gestures and/or eye movements) detected by the system 700 may cause the 3D virtual showerhead 712 to be presented in different ways and/or cause the system 700 to output feedback via at least one feedback device (e.g., the mid-air haptic feedback device 702a, the mid-air haptic feedback device 702b, the mid-air temperature adjustment device 704a, and the mid-air temperature adjustment device 704b).
In the example of fig. 7B, a user hand 714 is placed "under" the 3D virtual showerhead 712 to "touch" the animated water flowing from it. The pose and position of the user hand 714 (in the 3D space in which the 3D virtual showerhead 712 exists) are detected by the hand detection device 706 (and, in some embodiments, the user's eye movement is detected by the eye movement detection device 708). For example, a collision is detected between the current position of the user hand 714 and a defined collision region of the 3D virtual showerhead 712 and its accompanying water animation. A measurement of the length/distance between the current position of the user hand 714 and the position of the hand detection device 706 (or a reference point on the 3D virtual showerhead 712, e.g., the center of its face) is then made by the host computer. The measured length/distance is used by the host computer to cause the water ejected from the 3D virtual showerhead 712, shown as water stream animation 720, to appear as if the stream has collided with an object (user hand 714): the stream now appears deformed and flows around the detected object (user hand 714) rather than flowing as the uninterrupted stream shown in fig. 7A.
Further, the measured length/distance is used by the host computer to cause the mid-air haptic feedback device 702a to provide haptic feedback 722 downward to the user hand 714 and the mid-air temperature adjustment device 704a to provide heated air feedback 718 downward to the user hand 714. The focal position of the (e.g., ultrasonic) haptic feedback 722 is determined by the host computer to be at or near the current position of the user hand 714 (e.g., based on the measured length/distance) and/or based on a known water pressure associated with the physical version of the 3D virtual showerhead 712. The heat/airflow intensity of the heated air feedback 718 is likewise determined by the host computer based on the current position of the user hand 714 (e.g., based on the measured length/distance). The combination of the haptic feedback 722 and the heated air feedback 718 on the user hand 714 simulates the sensation of a warm water spray from the 3D virtual showerhead 712 contacting the palm of the user hand 714. In this example, the feedback is output by the mid-air haptic feedback device 702a and the mid-air temperature adjustment device 704a, both pointing downward, because the 3D virtual showerhead 712 points downward and the user's palm is therefore expected to face upward to contact the warm water flowing down from the 3D virtual showerhead 712. In a different example, in which water is ejected upward from a 3D virtual object (e.g., a water jet in a hot tub), the feedback is output by the mid-air haptic feedback device 702b and the mid-air temperature adjustment device 704b, both pointing upward, because the user's palm is expected to face downward to contact the warm water flowing up from the water jet.
In addition to the haptic feedback 722 and the heated air feedback 718, additional feedback, such as water sounds and/or fragrances, may be output by the system 700 from corresponding feedback devices (not shown) for an immersive simulated experience.
As the user hand 714 moves, the focal position and/or pressure of the haptic feedback 722 and the heat and/or airflow intensity of the heated air feedback 718 may be updated accordingly by the host computer based on the current position of the user hand 714, to better simulate the varying degrees of water pressure and warmth the user would experience when using the physical version of the 3D virtual showerhead 712.
Embodiments are disclosed that provide a mid-air temperature adjustment output responsive to user input. User input is detected with respect to a virtual object. At least mid-air temperature adjustment feedback (sometimes in addition to mid-air haptic feedback) is provided in response to the detected user input, giving the user a temperature adjustment (e.g., including cold, heat, and/or airflow) that simulates being emitted by/from the virtual object, so that the user can have an immersive virtual experience without direct contact with a physical object.
Although the foregoing embodiments have been described in some detail for purposes of clarity of understanding, the invention is not limited to the details provided. There are many alternative ways of implementing the invention. The disclosed embodiments are illustrative and not restrictive.

Claims (20)

1. A system, comprising:
a motion detection device configured to detect a user input with respect to a virtual object;
a temperature adjustment device; and
a processor coupled to the motion detection device and the temperature adjustment device, the processor configured to:
determine a temperature adjustment based at least in part on the user input with respect to the virtual object; and
cause the temperature adjustment device to output the temperature adjustment.
2. The system of claim 1, wherein the determining the temperature adjustment based at least in part on the user input regarding the virtual object comprises:
determining a measurement based at least in part on the user input and a reference point associated with the motion detection device;
using the measurement to determine instructions related to generating the temperature adjustment; and
sending the instructions related to generating the temperature adjustment to the temperature adjustment device.
3. The system of claim 1, wherein the temperature adjustment device comprises one or more of: a fan, a heating component, and a cooling component.
4. The system of claim 3, wherein the causing the temperature adjustment device to output the temperature adjustment comprises:
sending a first instruction to the temperature adjustment device to activate the heating component at a first time; and
sending a second instruction to the temperature adjustment device to activate the fan at a second time, wherein the second time lags the first time.
5. The system of claim 1, further comprising:
a mid-air haptic feedback device configured to generate mid-air haptic feedback;
wherein the processor is coupled to the mid-air haptic feedback device and is further configured to control the mid-air haptic feedback device in response to the user input.
6. The system of claim 5, wherein the mid-air haptic feedback device comprises an ultrasonic haptic display configured to emit ultrasonic waves to generate haptic sensations at one or more focal points using a plurality of transducers.
7. The system of claim 5, wherein the controlling the mid-air haptic feedback device in response to the user input comprises:
determining a measurement based at least in part on the user input and a reference point associated with the motion detection device;
using the measurement to determine instructions related to generating the mid-air haptic feedback; and
sending the instructions related to generating the mid-air haptic feedback to the mid-air haptic feedback device.
8. The system of claim 5, wherein the processor is configured to control the temperature adjustment device to output the temperature adjustment at least partially in synchronization with the mid-air haptic feedback device outputting the mid-air haptic feedback.
9. The system of claim 1, further comprising:
a display device configured to present the virtual object;
wherein the processor is coupled to the display device and is further configured to determine whether the user input collides with the virtual object; and
wherein the processor is configured to determine the temperature adjustment based at least in part on whether the user input collides with the virtual object.
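One simple realization of the collision-gated adjustment in claim 9 is a spherical collision test: the adjustment reflects the virtual object's temperature on contact and ambient otherwise. The sphere shape and the specific temperature values below are illustrative assumptions, not taken from the patent:

```python
import math

def collides(user_point, obj_center, obj_radius):
    """True when the tracked input point enters the virtual object's
    spherical collision zone (one simple choice of collision test)."""
    return math.dist(user_point, obj_center) <= obj_radius

def determine_adjustment(user_point, obj_center, obj_radius,
                         object_temp=40.0, ambient_temp=25.0):
    # The temperature adjustment depends on whether the user input
    # collided with the virtual object: the object's warmth on contact,
    # ambient temperature otherwise.
    if collides(user_point, obj_center, obj_radius):
        return object_temp
    return ambient_temp

print(determine_adjustment((0.0, 0.0, 0.05), (0.0, 0.0, 0.0), 0.1))  # inside
print(determine_adjustment((0.0, 0.0, 0.5), (0.0, 0.0, 0.0), 0.1))   # outside
```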
10. The system of claim 9, wherein the display device comprises an autostereoscopic display device.
11. The system of claim 1, wherein the user input is obtained without requiring physical contact by the user with a physical object.
12. A method, comprising:
receiving user input regarding a virtual object;
updating the virtual object based at least in part on the user input;
determining a temperature adjustment based at least in part on the user input; and
outputting the temperature adjustment.
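The four method steps of claim 12 can be strung together as a single pass, sketched below. The gesture names, the shape update, and the fixed 5-degree delta are all hypothetical choices made for illustration; the patent leaves these details unspecified:

```python
def handle_user_input(virtual_object: dict, user_input: dict) -> dict:
    """One pass through the claimed method: receive user input, update
    the virtual object, determine a temperature adjustment, output it."""
    # Update the virtual object based on the input (here: deform its shape).
    if user_input.get("gesture") == "press":
        virtual_object["shape"] = "deformed"
    # Determine the temperature adjustment from the input (here: warmer on touch).
    adjustment = 5.0 if user_input.get("touching") else 0.0
    # "Output" the adjustment; a real system would drive the hardware here.
    return {"object": virtual_object, "temperature_delta": adjustment}

result = handle_user_input({"shape": "sphere"},
                           {"gesture": "press", "touching": True})
print(result)
```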
13. The method of claim 12, wherein the updating the virtual object comprises updating at least one of a shape and an animation associated with the virtual object.
14. The method of claim 12, wherein the determining the temperature adjustment based at least in part on the user input comprises:
determining a measurement based at least in part on the user input and a reference point; and
using the measurement to determine the temperature adjustment.
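For claims 14 and 15, where the reference point may be a portion of the virtual object (say, its hot region), one plausible mapping from measurement to adjustment is a linear falloff: the closer the hand, the stronger the heat adjustment. The `max_delta` and `falloff_range` parameters are invented for this sketch:

```python
import math

def temperature_adjustment(user_point, reference_point,
                           max_delta=15.0, falloff_range=0.5):
    """Map the measurement (distance from a reference point, e.g. the hot
    portion of a virtual object) to a temperature delta that grows as the
    user's hand approaches the reference point."""
    distance = math.dist(user_point, reference_point)
    return max_delta * max(0.0, 1.0 - distance / falloff_range)

print(temperature_adjustment((0.0, 0.0, 0.0), (0.0, 0.0, 0.0)))  # at the source
print(temperature_adjustment((0.0, 0.0, 1.0), (0.0, 0.0, 0.0)))  # out of range
```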
15. The method of claim 14, wherein the reference point is associated with a portion of the virtual object.
16. The method of claim 14, wherein the reference point is associated with a location of the motion detection device that detected the user input.
17. The method of claim 12, further comprising:
obtaining information related to a plurality of virtual objects from an online platform server;
causing presentation of the plurality of virtual objects; and
receiving a selection of a presented virtual object.
18. The method of claim 12, further comprising determining that the user input collided with a collision zone associated with the virtual object.
19. The method of claim 12, wherein the temperature adjustment comprises an airflow and at least one of a heat adjustment and a cold adjustment.
20. The method of claim 12, further comprising:
determining a mid-air haptic feedback based at least in part on the user input; and
outputting the mid-air haptic feedback.
CN202010451233.9A 2019-06-12 2020-05-25 Temperature regulation feedback system responsive to user input Pending CN112083798A (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US201962860687P 2019-06-12 2019-06-12
US62860687 2019-06-12
US16681629 2019-11-12
US16/681,629 US20200393156A1 (en) 2019-06-12 2019-11-12 Temperature adjustment feedback system in response to user input

Publications (1)

Publication Number Publication Date
CN112083798A true CN112083798A (en) 2020-12-15

Family ID=73735864

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010451233.9A Pending CN112083798A (en) 2019-06-12 2020-05-25 Temperature regulation feedback system responsive to user input

Country Status (1)

Country Link
CN (1) CN112083798A (en)

Similar Documents

Publication Publication Date Title
JP6906580B6 (en) Viewport-based augmented reality tactile effects systems, methods and non-transitory computer-readable media
Rakkolainen et al. A survey of mid-air ultrasound haptics and its applications
US11366512B2 (en) Systems and methods for operating an input device in an augmented/virtual reality environment
US10921890B2 (en) Method and apparatus for providing tactile sensations
US10509468B2 (en) Providing fingertip tactile feedback from virtual objects
Araujo et al. Snake charmer: Physically enabling virtual objects
JP6453448B2 (en) 3D Contextual Feedback
US20200393156A1 (en) Temperature adjustment feedback system in response to user input
EP3093734B1 (en) Systems and methods for distributing haptic effects to users interacting with user interfaces
US20230186578A1 (en) Devices, Methods, and Graphical User Interfaces for Interacting with Three-Dimensional Environments
TW201816554A (en) Interaction method and device based on virtual reality
KR20170081225A (en) Sensory feedback systems and methods for guiding users in virtual reality environments
KR20160043503A (en) Haptically-enabled deformable device with rigid component
EP3539087A1 (en) A system for importing user interface devices into virtual/augmented reality
EP3418863A1 (en) Haptic dimensions in a variable gaze orientation virtual environment
CN107943273A (en) Context pressure-sensing haptic response
JP2019519856A (en) Multimodal haptic effect
US20200201437A1 (en) Haptically-enabled media
WO2021259341A1 (en) Interaction system, interaction method and machine readable storage medium
Arafsha et al. Contactless haptic feedback: State of the art
US20210398402A1 (en) Haptic feedback generation
CN112083798A (en) Temperature regulation feedback system responsive to user input
EP2187287A2 (en) 3D interface apparatus and interfacing method using the same
Hosoi et al. Vwind: Virtual wind sensation to the ear by cross-modal effects of audio-visual, thermal, and vibrotactile stimuli
US11430170B1 (en) Controlling joints using learned torques

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination