WO2014045683A1 - Control device and storage medium - Google Patents
Control device and storage medium
- Publication number
- WO2014045683A1 (PCT/JP2013/068755)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- unit
- user
- function
- control
- display
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/0304—Detection arrangements using opto-electronic means
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/04815—Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04842—Selection of displayed objects or displayed text elements
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
- G02B2027/0178—Eyeglass type
Definitions
- the present disclosure relates to a control device and a storage medium.
- a keyboard and a mouse placed on a desk are used as operation input devices.
- Patent Document 1 discloses an input / output device capable of realizing input / output without occupying a desk space.
- the input / output device described in Patent Document 1 includes a projector that projects a display screen and operation keys onto a desk, and an imaging video camera that captures input operations performed by an operator.
- the operator performs an input operation on the display screen projected on the desk.
- mobile terminals such as mobile phone terminals and notebook PCs that have been widely used in recent years have a structure in which a liquid crystal display screen is connected to a keyboard so that the screen can be folded.
- tablet terminals and smartphones display an operation screen on a liquid crystal display screen to detect a user's touch operation.
- each of these is a mobile terminal held by the user, and each is provided with an operation input device such as a keyboard.
- in an HMD (head mounted display), a display unit is arranged in front of the user's eyes using a spectacle-type or head-mounted type mounting unit.
- the greatest feature of such an apparatus is that it is hands-free, which is incompatible with operation by an operation input device such as a keyboard or a mouse.
- the present disclosure proposes a new and improved control device and storage medium capable of virtually associating an operation function with a predetermined area of an entity object.
- according to the present disclosure, there is provided a control device including: a detection unit that detects, as an operation region, at least a part of an entity object existing at a position estimated to be operable by a user; a function setting unit that sets a predetermined function in association with the operation region detected by the detection unit; and a control unit that executes the function associated with the operation region based on a positional relationship between the operation region and an operating body.
- according to the present disclosure, there is also provided a storage medium storing a program that causes a computer to function as: a detection unit that detects, as an operation region, at least a part of an entity object existing at a position estimated to be operable by a user; a function setting unit that sets a predetermined function in association with the operation region detected by the detection unit; and a control unit that executes the function associated with the operation region based on a positional relationship between the operation region and an operating body.
- an operation function can be virtually associated with a predetermined area of an entity object.
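- as a purely illustrative aid (not part of the original disclosure), the following Python sketch shows how the three roles named above — detection unit, function setting unit, and control unit — might be organized in software; all class, function, and parameter names are hypothetical.

```python
from dataclasses import dataclass
from typing import Callable, Dict, Optional, Tuple

Rect = Tuple[int, int, int, int]  # (x, y, width, height) in image coordinates


@dataclass
class OperationRegion:
    """A flat part of a real object, virtually used as an operation area."""
    bounds: Rect
    function: Optional[str] = None  # e.g. "mouse_pad", "keyboard", "numeric_keypad"


class ControlDevice:
    """Mirrors the three roles: detect, associate a function, execute it."""

    def __init__(self, handlers: Dict[str, Callable[[Tuple[int, int]], None]]):
        self.handlers = handlers                  # function name -> handler
        self.region: Optional[OperationRegion] = None

    def detect_operation_region(self, flat_area: Rect) -> OperationRegion:
        # Detection unit: part of a nearby entity object becomes the region.
        self.region = OperationRegion(bounds=flat_area)
        return self.region

    def set_function(self, name: str) -> None:
        # Function setting unit: associate a predetermined function.
        if self.region is not None:
            self.region.function = name

    def on_operating_body(self, position: Tuple[int, int]) -> None:
        # Control unit: execute the associated function when the operating
        # body (e.g. a fingertip) lies inside the operation region.
        if self.region is None or self.region.function is None:
            return
        x, y, w, h = self.region.bounds
        px, py = position
        if x <= px < x + w and y <= py < y + h:
            self.handlers[self.region.function](position)
```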
- FIG. 1 is a diagram for describing an overview of a control system according to an embodiment of the present disclosure. As shown in FIG. 1, the control system according to the present embodiment is applied to an HMD (Head Mounted Display) 1 having a glasses-type display.
- the HMD 1 (control device) according to the present embodiment has a mounting unit with a frame structure that, for example, extends halfway around the head from both temporal regions to the back of the head, and is worn by the user by being hung on both auricles as shown in FIG. 1.
- as shown in FIG. 1, the HMD 1 is configured such that a pair of display units 2a and 2b for the left eye and the right eye are arranged in front of the user's eyes in the worn state.
- a liquid crystal panel is used for the display unit 2, and by controlling the transmittance, a through state as shown in FIG. 1, that is, a transparent or translucent state can be obtained.
- the display unit 2 can superimpose AR (augmented reality) information on the scenery in the real space by displaying an image such as a text or a figure in a transparent or translucent state.
- the display unit 2 may also display a captured image of the real space captured by the imaging lens 3a, and may superimpose and display AR (augmented reality) information on that captured image.
- the display unit 2 may also reproduce and display content received by the HMD 1 from external devices (digital cameras, video cameras, and information processing devices such as mobile phone terminals, smartphones, and personal computers) and content stored in the storage medium of the HMD 1.
- the content displayed on the display unit 2 may be, for example, moving image content such as a movie or a video clip, still image content captured by a digital still camera, or data such as an electronic book.
- Such content is assumed to be any data to be displayed, such as image data created by a user with a personal computer, text data, computer use data such as spreadsheet data, and game images based on game programs.
- the imaging lens 3a is arranged facing forward so that, while the HMD 1 is worn, the direction the user is looking in is taken as the subject direction.
- a light emitting unit 4a that illuminates the imaging direction of the imaging lens 3a is also provided.
- the light emitting part 4a is formed by, for example, an LED (Light Emitting Diode).
- the projector unit 7 is arranged facing forward so that an image is projected with the direction viewed by the user as the projection direction in a state worn by the user.
- a pair of earphone speakers 5a that can be inserted into the right ear hole and the left ear hole of the user in a worn state are provided.
- microphones 6a and 6b for collecting external sound are arranged on the right side of the display unit 2a for the right eye and on the left side of the display unit 2b for the left eye.
- the HMD 1 may be formed with a spectacle-type or head-mounted type mounting unit; at least in the present embodiment, it suffices that the display unit 2 is provided close to and in front of the user's eyes. In addition, a pair of display units 2 may be provided corresponding to both eyes, or a single display unit 2 may be provided corresponding to one eye.
- the earphone speakers 5a need not be left and right stereo speakers; a single earphone speaker may be provided to be worn in only one ear.
- the microphone may be one of the microphones 6a and 6b.
- FIG. 1 shows an example including the projector unit 7, but an example in which the projector unit 7 is not provided is also conceivable. Further, a configuration in which the microphones 6a and 6b and the earphone speaker 5a are not provided as the HMD 1 is also conceivable. A configuration in which the light emitting unit 4a is not provided is also conceivable.
- here, an ordinary HMD has hands-free operation as its greatest feature and is not provided with an operation input device corresponding to a keyboard, a mouse, or the like; user operations are performed by, for example, buttons or switches provided on the HMD, voice input, gesture input, and the like.
- however, operation input by voice or gesture differs from operations the user is accustomed to, and may impose an operational burden and stress on the user.
- in addition, operation input by voice or gesture does not provide the operational feeling obtained when operating a mouse or keyboard.
- control system according to each embodiment of the present disclosure has been created with the above circumstances in mind.
- the control system according to each embodiment of the present disclosure can virtually associate an operation function with a predetermined region of the entity object.
- FIG. 2 is a block diagram illustrating an internal configuration example of the HMD 1 according to the present embodiment.
- the HMD 1 includes a display unit 2, an imaging unit 3, an illumination unit 4, an audio output unit 5, an audio input unit 6, a projector unit 7, a system controller 10, an imaging control unit 11, and a display image processing unit 12.
- the system controller 10 includes, for example, a microcomputer including a CPU (Central Processing Unit), a ROM (Read Only Memory), a RAM (Random Access Memory), a nonvolatile memory, and an interface unit, and controls each component of the HMD 1.
- the system controller 10 functions as an external environment determination unit 10a that determines the external environment and an operation control unit 10b that gives a control instruction to each unit according to the determination result of the external environment determination unit 10a.
- the external environment situation determination unit 10a acquires information on the outside world by means of the ambient environment sensor 19, the imaging target sensor 20, the GPS reception unit 21, the date and time counting unit 22, the image analysis unit 17, and the communication unit 26.
- the external environment situation determination unit 10a may also control the illuminance, luminance, and sensitivity of the projector unit 7, the imaging unit 3, or the display unit 2 so that they are adjusted according to the acquired outside world information (for example, light intensity or time zone).
- the external world situation determination unit 10a functions as a detection unit that detects at least a part of an entity object existing at a position operable by the user as an operation region based on information on the external world.
- for example, the external environment situation determination unit 10a (detection unit) extracts an entity object within a predetermined distance from the imaging unit 3 based on a captured image obtained by imaging the user's surroundings with the imaging unit 3 (more precisely, based on the analysis result obtained by the image analysis unit 17 analyzing the captured image). The predetermined distance is a distance at which the user is estimated to be able to operate the object directly or indirectly (including contact operation and proximity operation), for example, a distance that the user's hand can reach, or a distance that an operating body held by the user, such as a pen or a pointer, can reach.
- the entity object is an object existing in a real space that can be touched by the user.
- the external environment situation determination unit 10a extracts a desk near the user, a notebook PC placed on the desk, a music playback device, or the like as an entity object. Then, the external environment situation determination unit 10a (detection unit) detects at least a part of the flat region of the extracted entity object as the operation region. For example, when a notebook PC placed on a desk in a closed state is extracted as a real object, the external environment determination unit 10a (detection unit) may detect the top panel portion of the notebook PC as an operation area.
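- the detection step described above can be pictured with the following hypothetical sketch, which filters extracted entity objects by estimated distance and returns the flat region of the nearest reachable one; the reach threshold and the ExtractedObject structure are assumptions for illustration, not values from the disclosure.

```python
from dataclasses import dataclass
from typing import List, Optional, Tuple

Rect = Tuple[int, int, int, int]  # (x, y, width, height)


@dataclass
class ExtractedObject:
    label: str            # e.g. "desk", "notebook_pc", "music_player"
    distance_m: float     # estimated distance from the imaging unit 3
    flat_region: Rect     # largest flat surface found on the object


ARM_REACH_M = 0.8         # assumed "directly or indirectly operable" distance


def detect_operation_region(objects: List[ExtractedObject]) -> Optional[Rect]:
    """Return the flat region of an entity object the user could touch."""
    reachable = [o for o in objects if o.distance_m <= ARM_REACH_M]
    if not reachable:
        return None
    # e.g. prefer the closest reachable object's flat surface, such as the
    # top panel of a closed notebook PC or part of a desk
    nearest = min(reachable, key=lambda o: o.distance_m)
    return nearest.flat_region
```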
- Operation control unit 10b performs control related to the imaging operation and the display operation according to the acquired information on the outside world and the determination result by the outside world situation determination unit 10a.
- the operation control unit 10b according to the present embodiment functions as a function setting unit that sets a predetermined function in association with the operation region detected by the external environment situation determination unit 10a (detection unit).
- the operation control unit 10b (function setting unit) associates an operation area with various operation functions such as a mouse pad function, a keyboard function, a numeric keypad function, and a content operation function.
- the motion control unit 10b (function setting unit) may also associate an operation function similar to the function of the entity object from which the operation area was detected. For example, when a calculator is extracted as an entity object and the numeric keypad portion of the calculator is detected as the operation area, the operation control unit 10b (function setting unit) associates a numeric keypad function with the operation area.
- the motion control unit 10b also functions as a control unit that performs control so that an operation unit image (operation screen) corresponding to the associated operation function is superimposed or projected on the operation region of the detected entity object. For example, when a part of a desk is detected as an operation area and a keyboard function is associated with it, the operation control unit 10b (control unit) controls the display unit 2 or the projector unit 7 so that a keyboard image is superimposed or projected on the operation area. Further, the operation control unit 10b can also control the size of the operation unit image to be displayed or projected according to the size of the operation region.
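- as a rough illustration of sizing the operation unit image to the detected operation region, the following sketch scales a keyboard (or other operation-unit) image to fit the region while preserving its aspect ratio; it is an assumption-laden example, not the disclosed implementation.

```python
from typing import Tuple

Rect = Tuple[int, int, int, int]  # (x, y, width, height)


def fit_operation_image(image_size: Tuple[int, int], region: Rect) -> Rect:
    """Return a placement rectangle that fits the operation-unit image
    (e.g. a keyboard image) inside the detected operation region."""
    img_w, img_h = image_size
    rx, ry, rw, rh = region
    scale = min(rw / img_w, rh / img_h)      # keep the image's aspect ratio
    w, h = int(img_w * scale), int(img_h * scale)
    # centre the scaled image inside the operation region
    return (rx + (rw - w) // 2, ry + (rh - h) // 2, w, h)


# usage sketch with hypothetical sizes:
# print(fit_operation_image((400, 150), (120, 300, 200, 100)))
```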
- the operation control unit 10b functions as a control unit that executes a function associated with (associated with) the operation region based on the positional relationship between the operation region and the operation body.
- the operating body may be a part of a body such as a user's finger or a pen or a pointing stick held by the user.
- the motion control unit 10b (control unit) recognizes the positional relationship between the operation area and the operating body based on outside world information (for example, a captured image), and determines whether or not the operating body overlaps the operation area for a predetermined time. When they overlap for the predetermined time, the operation control unit 10b (control unit) executes the operation function associated with the operation area.
- for example, when a mouse pad function is associated, the operation control unit 10b controls the display position of the mouse pointer displayed on the display unit 2 in accordance with the movement of the user's finger on the operation area.
- further, when a keyboard function is associated, the operation control unit 10b executes character input based on the relationship between the position of each key in the operation area and the position of the user's finger part.
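- the two cases above (mouse pad function and keyboard function) can be sketched as simple coordinate mappings; the layouts and resolutions below are hypothetical and serve only to make the positional-relationship idea concrete.

```python
from typing import Dict, Optional, Tuple

Rect = Tuple[int, int, int, int]  # (x, y, width, height)


def mouse_pad(region: Rect, finger: Tuple[int, int],
              display: Tuple[int, int]) -> Tuple[int, int]:
    """Map a finger position inside the operation region to a pointer
    position on the display unit (mouse pad function)."""
    rx, ry, rw, rh = region
    u = min(max((finger[0] - rx) / rw, 0.0), 1.0)
    v = min(max((finger[1] - ry) / rh, 0.0), 1.0)
    return int(u * display[0]), int(v * display[1])


def keyboard(key_rects: Dict[str, Rect], finger: Tuple[int, int]) -> Optional[str]:
    """Return the character whose key rectangle contains the finger
    position (keyboard function), or None if no key is touched."""
    fx, fy = finger
    for char, (x, y, w, h) in key_rects.items():
        if x <= fx < x + w and y <= fy < y + h:
            return char
    return None
```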
- in this way, when operating the hands-free HMD 1, the user can perform operation input while touching an entity object existing in the real space. Further, the operation input can be performed with movements the user is familiar with, such as mouse operation and keyboard input. Furthermore, when a predetermined flat area of the entity object is utilized as an operation unit, the entity object and the HMD 1 do not need to be electrically or physically connected, so no communication connection or wiring between them is necessary. In addition, since input data does not need to be transmitted from the entity object to the HMD 1, the entity object does not need to be powered on even if it is, for example, a wireless keyboard or a mouse. Even if the entity object is a broken keyboard, mouse, or the like, it can be used as an operation input device as long as the HMD 1 detects it as an operation area and associates an operation function with it, so that ultimate recycling can be realized.
- the imaging unit 3 includes a lens system including the imaging lens 3a, a diaphragm, a zoom lens, and a focus lens, a drive system that causes the lens system to perform focus and zoom operations, and a solid-state image sensor array that generates an imaging signal by photoelectrically converting the imaging light obtained by the lens system.
- the solid-state imaging device array may be realized by, for example, a CCD (Charge Coupled Device) sensor array or a CMOS (Complementary Metal Oxide Semiconductor) sensor array.
- the imaging lens 3 a is arranged facing forward so that the direction viewed by the user is taken as the subject direction in a state where the user wears the HMD 1.
- a range including a field of view viewed by the user can be imaged via the display unit 2.
- the imaging signal processing unit 15 includes a sample hold / AGC (Automatic Gain Control) circuit that performs gain adjustment and waveform shaping on the signal obtained by the solid-state imaging device of the imaging unit 3, and a video A / D (analog / digital) converter. Thereby, the imaging signal processing unit 15 obtains an imaging signal as digital data.
- the imaging signal processing unit 15 also performs white balance processing, luminance processing, color signal processing, blur correction processing, and the like on the imaging signal.
- the imaging control unit 11 controls the operations of the imaging unit 3 and the imaging signal processing unit 15 based on instructions from the system controller 10.
- the imaging control unit 11 controls on / off of operations of the imaging unit 3 and the imaging signal processing unit 15.
- the imaging control unit 11 performs control (motor control) for causing the imaging unit 3 to perform operations such as auto focus, automatic exposure adjustment, aperture adjustment, and zoom.
- the imaging control unit 11 also includes a timing generator, and controls the signal processing operations of the solid-state imaging device, and of the sample hold / AGC circuit and the video A / D converter of the imaging signal processing unit 15, with timing signals generated by the timing generator.
- the imaging frame rate can be variably controlled by this timing control.
- the imaging control unit 11 also controls imaging sensitivity and signal processing in the solid-state imaging device and the imaging signal processing unit 15. For example, as imaging sensitivity control, gain control of the signal read from the solid-state image sensor can be performed, and black level setting control, various coefficient controls of imaging signal processing at the digital data stage, correction amount control in blur correction processing, and the like can also be performed.
- regarding imaging sensitivity, overall sensitivity adjustment that does not particularly consider the wavelength band is possible, as well as sensitivity adjustment that adjusts the imaging sensitivity in a specific wavelength band such as the infrared region or the ultraviolet region (for example, imaging that cuts a specific wavelength band).
- Sensitivity adjustment according to the wavelength can be performed by inserting a wavelength filter in the imaging lens system or wavelength filter calculation processing on the imaging signal.
- the imaging control unit 11 can perform sensitivity control by insertion control of a wavelength filter, designation of a filter calculation coefficient, or the like.
- the image input / output control 27 controls the transfer of image data in accordance with the control of the system controller 10. That is, it controls the transfer of image data among the imaging system (imaging signal processing unit 15), the display system (display image processing unit 12), the storage unit 25, and the communication unit 26.
- for example, the image input / output control 27 performs an operation of supplying image data as an imaging signal processed by the imaging signal processing unit 15 to the display image processing unit 12, the storage unit 25, or the communication unit 26.
- the image input / output control 27 performs an operation of supplying, for example, image data reproduced from the storage unit 25 to the display image processing unit 12 or to the communication unit 26.
- the image input / output control 27 performs an operation of supplying image data received by the communication unit 26 to the display image processing unit 12 or to the storage unit 25.
- the display image processing unit 12 is a so-called video processor, for example, and is a part capable of executing various display processes on the supplied image data. For example, brightness level adjustment, color correction, contrast adjustment, sharpness (contour emphasis) adjustment of image data can be performed.
- the display driving unit 13 includes a pixel driving circuit for displaying the image data supplied from the display image processing unit 12 on the display unit 2 that is a liquid crystal display, for example. That is, a drive signal based on the video signal is applied to each pixel arranged in a matrix in the display unit 2 at a predetermined horizontal / vertical drive timing, and display is executed. Further, the display driving unit 13 can control the transmittance of each pixel of the display unit 2 to enter the through state. Further, the display drive unit 13 may place a part of the display unit 2 in a through state and display AR information on a part thereof.
- the display control unit 14 controls the processing operation of the display image processing unit 12 and the operation of the display driving unit 13 according to the control of the system controller 10. Specifically, the display control unit 14 controls the display image processing unit 12 to perform the above-described brightness level adjustment of the image data. Further, the display control unit 14 controls the display driving unit 13 to switch between the through state and the image display state of the display unit 2.
- the audio input unit 6 includes the microphones 6a and 6b shown in FIG. 1, a microphone amplifier unit that amplifies the audio signals obtained by the microphones 6a and 6b, and an A / D converter, and outputs the resulting audio data to the audio input / output control 28.
- the voice input / output control 28 controls the transfer of voice data in accordance with the control of the system controller 10. Specifically, the audio input / output control 28 controls transfer of audio signals among the audio input unit 6, the audio signal processing unit 16, the storage unit 25, and the communication unit 26. For example, the audio input / output control 28 performs an operation of supplying the audio data obtained by the audio input unit 6 to the audio signal processing unit 16, to the storage unit 25, or to the communication unit 26.
- the audio input / output control 28 performs an operation of supplying the audio data reproduced by the storage unit 25 to the audio signal processing unit 16 or the communication unit 26, for example.
- the audio input / output control 28 performs an operation of supplying audio data received by the communication unit 26 to the audio signal processing unit 16 or supplying to the storage unit 25, for example.
- the audio signal processing unit 16 includes, for example, a digital signal processor, a D / A converter, and the like.
- the audio signal processing unit 16 is supplied with audio data obtained by the audio input unit 6 and audio data from the storage unit 25 or the communication unit 26 via the audio input / output control 28.
- the audio signal processing unit 16 performs processing such as volume adjustment, sound quality adjustment, and acoustic effect on the supplied audio data according to the control of the system controller 10.
- the processed audio data is converted into an analog signal and supplied to the audio output unit 5.
- the audio signal processing unit 16 is not limited to a configuration that performs digital signal processing, and may perform signal processing using an analog amplifier or an analog filter.
- the audio output unit 5 includes the pair of earphone speakers 5a shown in FIG. 1 and an amplifier circuit for the earphone speakers 5a.
- the storage unit 25 is a part that records and reproduces data on a predetermined recording medium.
- the storage unit 25 is realized as an HDD (Hard Disc Drive), for example.
- various recording media such as a solid-state memory such as a flash memory, a memory card incorporating a fixed memory, an optical disk, a magneto-optical disk, and a hologram memory are conceivable.
- the storage unit 25 may have any configuration capable of performing recording and reproduction according to the recording medium employed.
- Image data as an imaging signal captured by the imaging unit 3 and processed by the imaging signal processing unit 15 and image data received by the communication unit 26 are supplied to the storage unit 25 via the image input / output control 27.
- audio data obtained by the audio input unit 6 and audio data received by the communication unit 26 are supplied to the storage unit 25 via the audio input / output control 28.
- the storage unit 25 performs encoding processing for recording on the recording medium on the supplied image data and audio data according to the control of the system controller 10 and records the encoded data on the recording medium.
- the storage unit 25 reproduces image data and audio data from the recording medium according to the control of the system controller 10.
- the reproduced image data is output to the image input / output control 27, and the reproduced audio data is output to the audio input / output control 28.
- the communication unit 26 transmits / receives data to / from an external device.
- the communication unit 26 is an example of a configuration for acquiring outside world information.
- the communication unit 26 may be configured to perform network communication via short-range wireless communication with a network access point using a method such as wireless LAN or Bluetooth, or may perform direct wireless communication with an external device having a corresponding communication function.
- any device having information processing and communication functions such as a computer device, a PDA, a mobile phone terminal, a smartphone, a video device, an audio device, and a tuner device is assumed.
- terminal devices and server devices connected to a network such as the Internet are assumed as external devices to be communicated.
- a non-contact communication IC card incorporating an IC chip, a two-dimensional barcode such as a QR code, a hologram memory, or the like may be used as an external device, and the communication unit 26 may be configured to read information from these external devices.
- another HMD 1 is also assumed as an external device.
- the communication unit 26 is supplied with image data as an imaging signal captured by the imaging unit 3 and processed by the imaging signal processing unit 15 and image data reproduced by the storage unit 25 via an image input / output control 27.
- audio data obtained by the audio input unit 6 and audio data reproduced by the storage unit 25 are supplied to the communication unit 26 via the audio input / output control 28.
- the communication unit 26 performs encoding processing, modulation processing, and the like for transmission on the supplied image data and audio data according to the control of the system controller 10 and transmits them to an external device.
- the communication unit 26 performs data reception operation from an external device.
- the received and demodulated image data is output to the image input / output control 27, and the received and demodulated audio data is output to the audio input / output control 28.
- the voice synthesizer 27 performs voice synthesis under the control of the system controller 10 and outputs a voice signal.
- the audio signal output from the audio synthesizing unit 27 is supplied to the audio signal processing unit 16 via the audio input / output control 28 and processed, and then supplied to the audio output unit 5 and output to the user as audio. .
- the illumination unit 4 includes a light emitting unit 4a illustrated in FIG. 1 and a light emitting circuit that emits light from the light emitting unit 4a (for example, an LED).
- the illumination control unit 18 causes the illumination unit 4 to perform a light emission operation according to the control of the system controller 10. Since the light emitting unit 4a in the illuminating unit 4 is attached to illuminate the front as shown in FIG. 1, the illuminating unit 4 performs an illuminating operation with respect to the user's visual field direction.
- the ambient environment sensor 19 is an example of a configuration for acquiring outside world information. Specifically, for example, an illuminance sensor, a temperature sensor, a humidity sensor, and an atmospheric pressure sensor are assumed as the ambient environment sensor 19, which obtains information for detecting the ambient brightness, temperature, humidity, weather, and the like around the HMD 1.
- the imaging target sensor 20 is an example of a configuration for acquiring outside world information.
- the imaging target sensor 20 is a sensor that detects information about an imaging target that is a subject of an imaging operation in the imaging unit 3.
- for example, a distance measuring sensor that detects information on the distance from the HMD 1 to the imaging target, or an infrared sensor such as a pyroelectric sensor that detects information and energy such as a specific wavelength of infrared rays emitted from the imaging target, is assumed.
- with a pyroelectric sensor, for example, it is possible to detect whether the imaging target is a living body such as a person or an animal.
- sensors such as various UV (Ultra Violet) sensors that detect information and energy such as a specific wavelength of ultraviolet rays emitted from an imaging target are also assumed.
- this makes it possible, for example, to detect whether the imaging target is a fluorescent material or a phosphor, or to detect the amount of external ultraviolet rays necessary for sunburn countermeasures.
- the GPS receiving unit 21 is an example of a configuration for acquiring outside world information. Specifically, the GPS receiving unit 21 receives radio waves from a GPS (Global Positioning System) satellite and outputs information on latitude and longitude as the current position.
- the date and time counting unit 22 is an example of a configuration for acquiring outside world information.
- the date / time counting unit 22 counts date / time (year / month / day / hour / minute / second) and outputs current date / time information.
- the image analysis unit 17 is an example of a configuration for acquiring outside world information. Specifically, the image analysis unit 17 analyzes the image data and obtains image information included in the image data. Image data is supplied to the image analysis unit 17 via the image input / output control 27.
- the image data subjected to image analysis in the image analysis unit 17 is image data of a captured image obtained by the imaging unit 3 and the imaging signal processing unit 15, image data received by the communication unit 26, or image data reproduced from the recording medium by the storage unit 25.
- although the surrounding environment sensor 19, the imaging target sensor 20, the GPS receiving unit 21, the date and time counting unit 22, the image analysis unit 17, and the communication unit 26 have been shown as configurations for acquiring outside world information, it is not necessary to provide all of them.
- other sensors such as a voice analysis unit that detects and analyzes surrounding voice may be provided.
- the HMD 1 according to the present embodiment detects at least a part of an entity object existing at a position operable by the user as an operation area, and associates a predetermined function with the detected operation area. Then, the HMD 1 recognizes the positional relationship between the detected operation area and the operating tool based on the captured image and the like, and controls to execute a function associated with the operation area, thereby accepting an operation input by the user. As a result, the HMD 1 according to the present embodiment can use an actual object that is not electrically and physically connected and exists in real space as an operation input device.
- the operation example of the HMD 1 according to the present embodiment will be described in detail with reference to a plurality of embodiments.
- FIG. 3 is a flowchart showing a control operation process according to the first embodiment.
- in step S103, the outside world situation determination unit 10a (detection unit) extracts a real object (hereinafter referred to as an object) within a predetermined distance from the imaging unit 3 from the captured image captured by the imaging unit 3.
- the object may be extracted by applying a technique called SLAM (Simultaneous Localization and Mapping), which can simultaneously estimate the position and orientation of the camera (imaging unit 3) and the positions of the feature points appearing in the captured image, and which may be used to dynamically generate an environment map representing their three-dimensional positions.
- alternatively, the external environment determination unit 10a may extract each object by referring to object patterns (such as three-dimensional data) stored in advance in the storage unit 25 and performing pattern matching with the captured image.
- the external environment determination unit 10a may also extract objects or grasp the distance to an object based on stereo vision.
- the external environment situation determination unit 10a may grasp the distance from the imaging unit 3 to the object based on information acquired by the imaging target sensor 20 in addition to analyzing the captured image.
- the external environment determination unit 10a may extract a plurality of objects within a predetermined distance from the imaging unit 3.
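- as one possible concrete form of the pattern-matching extraction mentioned above, the following sketch uses OpenCV template matching as a stand-in; the real device could equally rely on SLAM or stereo vision, and the threshold and file names are assumptions.

```python
import cv2
import numpy as np


def match_object_pattern(captured: np.ndarray, pattern: np.ndarray,
                         threshold: float = 0.8):
    """Return the bounding box of the pattern in the captured frame,
    or None if the best match score is below the threshold."""
    result = cv2.matchTemplate(captured, pattern, cv2.TM_CCOEFF_NORMED)
    _, max_val, _, max_loc = cv2.minMaxLoc(result)
    if max_val < threshold:
        return None
    h, w = pattern.shape[:2]
    x, y = max_loc
    return (x, y, w, h)


# usage sketch with hypothetical file names:
# frame = cv2.imread("captured_frame.png")       # image from the imaging unit 3
# keyboard_pattern = cv2.imread("keyboard.png")  # pattern stored in storage unit 25
# box = match_object_pattern(frame, keyboard_pattern)
```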
- FIG. 4 shows a diagram for explaining the control operation according to the first embodiment.
- the display unit 2 is in the through state, and the user is viewing the scenery in the real space through the transparent display unit 2.
- the operation control unit 10b may display a part of the captured image of the real space imaged by the imaging unit 3 on the display unit 2 in real time, and may control the image so that it looks almost the same as in the through state.
- the imaging range of the imaging unit 3 is wider than the range of the field of view of the user viewed through the display unit 2.
- the imaging unit 3 may use a wide-angle lens as the imaging lens or may include a plurality of imaging lenses in order to capture a range wider than the range of the user's field of view.
- the imaging unit 3 can also capture an object at hand of the user who does not enter the user's field of view seen through the display unit 2.
- the outside world situation determination unit 10a extracts, for example, the objects 50A, 50B, and 50C that are within the reach of the user's hand.
- step S106 the external environment determination unit 10a (detection unit) detects at least a part of the extracted object having the largest flat area as an operation area.
- the operation area is used as a user operation space.
- specifically, the external environment situation determination unit 10a (detection unit) normalizes the areas of the flat regions of the extracted objects using the distances to the respective objects, compares them, and determines the object having the largest flat region area.
- the object 50A has the largest flat area (top plate portion).
- the external environment determination unit 10a detects the top plate portion of the object 50A as the operation region 33.
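- the comparison of flat areas in S106 can be illustrated as follows: apparent (pixel) area shrinks roughly with the square of the distance, so pixel areas are normalized by distance before being compared; the numbers in the usage comment are hypothetical.

```python
from typing import List, Tuple


def pick_largest_flat_region(candidates: List[Tuple[str, float, float]]) -> str:
    """candidates: (object_id, flat_area_in_pixels, distance_in_metres).
    Returns the id of the object with the largest estimated real area."""
    def estimated_real_area(c: Tuple[str, float, float]) -> float:
        _, area_px, distance_m = c
        return area_px * distance_m ** 2   # undo perspective shrinkage
    return max(candidates, key=estimated_real_area)[0]


# e.g. a notebook PC top panel beats a nearer but smaller object:
# print(pick_largest_flat_region([("50A", 12000, 0.5), ("50C", 9000, 0.4)]))
```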
- step S109 the operation control unit 10b associates a predetermined operation function, for example, a mouse pad function, with the operation area detected by the external environment determination unit 10a. More specifically, the operation control unit 10b assumes a rectangular mouse pad, and sets the coordinates of the rectangular mouse pad in the operation area 33 shown in FIG.
- the operation control unit 10b displays the mouse pointer 31 on the display unit 2 when the mouse pad function is associated. In the example shown in FIG. 4, most of the display unit 2 is in the through state, and the mouse pointer 31 is displayed on a part of the display unit 2. As shown in FIG. 4, the mouse pointer 31 is used when selecting an object (for example, an object 50D) existing in the real space.
- the motion control unit 10 b displays the information related to the object selected with the mouse pointer 31 as the AR information 52 in a superimposed manner on the display unit 2.
- in step S112, the motion control unit 10b recognizes the user's finger part from the image feature amount (the image feature amount of a finger part) based on the captured image, and determines whether or not the finger part and the operation area overlap for a predetermined time (for example, approximately 0.5 seconds).
- when it is determined that the finger part and the operation area overlap for the predetermined time (S112 / YES), in step S115 the motion control unit 10b controls the display position of the mouse pointer 31 based on the coordinate position set in the operation area corresponding to the position of the finger part.
- the user can control the display position of the mouse pointer 31 by moving the finger 40 in the operation area 33 of the object 50A associated with the function of the mouse pad.
- in this way, since the user can control the display position of the mouse pointer 31 while touching the operation area 33 with the finger 40, the user can operate with a feeling closer to that of a normal mouse operation than when moving the finger in the air.
- moreover, even though the operation area 33 is outside the user's field of view (display unit 2), the user can blindly control the display position of the mouse pointer 31 while viewing the display unit 2.
- step S118 the operation control unit 10b determines whether or not the function association is released. If the function association is released (S118 / YES), the operation process ends. For example, the operation control unit 10b may determine that the application is released when an application that uses the associated operation function is terminated. Further, the operation control unit 10b may determine that the operation has been canceled when the user moves. In this case, after the user moves to another place, the operation control unit 10b may perform the processes shown in S103 to S109 again to associate the operation function with the newly detected operation area. Note that the movement of the user can be determined based on current position information acquired by the GPS receiver 21 provided in the HMD 1 or a value acquired by an acceleration sensor (not shown).
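- the first-embodiment flow (S112 to S118) can be summarized by the following hypothetical loop, in which the mouse-pad function is applied only while the finger dwells on the operation area and the loop ends when the association is released; the dwell time, frame interval, and callback names are assumptions.

```python
import time
from typing import Callable, Optional, Tuple

Rect = Tuple[int, int, int, int]          # (x, y, width, height)
DWELL_SEC = 0.5                           # assumed "predetermined time"


def _inside(region: Rect, point: Tuple[int, int]) -> bool:
    x, y, w, h = region
    return x <= point[0] < x + w and y <= point[1] < y + h


def run_mouse_pad_loop(region: Rect,
                       get_finger: Callable[[], Optional[Tuple[int, int]]],
                       move_pointer: Callable[[Tuple[int, int]], None],
                       association_released: Callable[[], bool]) -> None:
    """Update the mouse pointer while the finger dwells on the region."""
    dwell_start = None
    while not association_released():     # e.g. app closed or user moved away
        finger = get_finger()             # fingertip found by image analysis
        if finger is not None and _inside(region, finger):
            if dwell_start is None:
                dwell_start = time.monotonic()
            elif time.monotonic() - dwell_start >= DWELL_SEC:
                move_pointer(finger)      # S115: control the pointer position
        else:
            dwell_start = None
        time.sleep(1 / 30)                # assumed camera frame interval
```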
- in the first embodiment described above, the operation region 33 is outside the user's field of view (display unit 2).
- however, the scope of the present disclosure also includes the case where the operation region 33 is within the user's field of view (display unit 2).
- the second embodiment in which the operation area 33 is included in the user's field of view (display unit 2) will be specifically described below with reference to FIGS.
- FIG. 5 is a flowchart showing a control operation process according to the second embodiment.
- first, the external environment determination unit 10a (detection unit) performs the same processing as that described with reference to FIG. 3.
- FIG. 6 shows a diagram for explaining the control operation according to the second embodiment.
- here, the object 50A within the range of the user's field of view seen through the transparent display unit 2 is determined to be the object having the largest flat area, and a part of the object 50A (the top plate portion) is detected as the operation area 33.
- the imaging unit 3 captures a range including the user's field of view seen through the display unit 2 and outputs the captured image.
- next, in step S110, the operation control unit 10b may associate a predetermined operation function, for example a mouse pad function, with the operation area detected by the external environment determination unit 10a, as in the case described in S109 of FIG. 3.
- the operation control unit 10b assumes a rectangular mouse pad, for example, and sets the coordinates of the rectangular mouse pad in the operation area 33 shown in FIG.
- the operation control unit 10b displays the mouse pointer 31 on the display unit 2 when the mouse pad function is associated. In the example shown in FIG. 6, most of the display unit 2 is in the through state, and the mouse pointer 31 is displayed on a part of the display unit 2. Similar to the first embodiment, the mouse pointer 31 is used when selecting an object (for example, the object 50D) existing in the real space.
- the motion control unit 10 b displays the information related to the object selected with the mouse pointer 31 as the AR information 52 in a superimposed manner on the display unit 2.
- the operation control unit 10b controls the display control unit 14 so that the operation unit image (operation screen) corresponding to the operation function associated with the operation region is displayed on the operation region in a superimposed manner on the display unit 2. Specifically, for example, the operation control unit 10b performs control so that an image of a rectangular mouse pad is superimposed and displayed on the operation area 33 as illustrated in FIG. Thereby, the user can visually recognize the operation area 33.
- as described above, in the second embodiment, the operation region 33 is included in the user's field of view (display unit 2), and the operation unit image corresponding to the associated operation function is superimposed on the operation region 33 and displayed on the display unit 2. Thereby, the user can perform an operation while visually recognizing the operation area 33.
- further, by displaying an operation unit image (a mouse pad image, a keyboard image, a numeric keypad image, or the like), the user can intuitively understand what operation function is associated.
- FIG. 7 is a flowchart showing a control operation process according to the third embodiment.
- first, the external environment determination unit 10a (detection unit) performs the same processing as that described with reference to FIG. 3.
- next, in step S111, the operation control unit 10b may associate a predetermined operation function, for example a mouse pad function, with the operation area detected by the external environment determination unit 10a, as described in step S110 of FIG. 5.
- the operation control unit 10b controls the projector unit 7 to project an operation unit image (operation screen) corresponding to the operation function associated with the operation region onto the operation region. Specifically, for example, the operation control unit 10 b controls the projector unit 7 so that an image of a rectangular mouse pad is projected on the operation area 33. Thereby, the user can visually recognize the operation area 33.
- the user can intuitively grasp the operation area 33 by projecting the operation unit image onto the operation area 33.
- in the embodiments described above, the operation area is detected from the object having the largest flat area among the extracted objects.
- however, the scope of the present disclosure also includes the case where the operation area is detected preferentially from an object having physical operators for operation (such as switches and buttons).
- a fourth embodiment for detecting an operation region from an object having a physical operation element for operation will be specifically described with reference to FIGS.
- FIG. 8 is a flowchart showing a control operation process according to the fourth embodiment.
- first, the external environment situation determination unit 10a (detection unit) extracts entity objects in the same manner as in the above-described steps.
- next, the external environment determination unit 10a detects, as operation area candidates, the parts of the extracted objects whose flat areas are a predetermined size or larger. For example, in the example illustrated in FIG. 9, among the objects extracted based on the captured image, the objects 50A (notebook PC), 50E (desk), and 50F (calculator) are determined to have flat areas larger than the predetermined size, and parts of their flat areas are detected as operation area candidates 35A (top plate portion), 35E (part of the desk), and 35F (numeric keypad portion).
- next, in step S120, the motion control unit 10b compares the image feature amount of each operation region candidate with the image feature amounts of devices having an operation feeling stored in advance in the storage unit 25, and selects the operation region with the highest degree of matching. For example, when image feature amounts of a keyboard, a mouse, a numeric keypad, and the like are stored in the storage unit 25 as image feature amounts of devices having an operation feeling, the operation control unit 10b selects the operation region candidate 35F of the calculator (object 50F) illustrated in FIG. 9 as the operation region 33F.
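- the selection in S120 can be pictured with the following sketch, which compares candidate features against stored features of devices having physical operators and returns the best match; representing the image feature amounts as plain vectors compared by cosine similarity is an assumption made purely for illustration.

```python
import math
from typing import Dict, List, Tuple


def cosine(a: List[float], b: List[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0


def select_operation_region(
        candidates: Dict[str, List[float]],   # candidate id -> image features
        stored: Dict[str, List[float]]        # device name -> stored features
) -> Tuple[str, str]:
    """Return (candidate_id, matched_device) with the highest match degree."""
    best = max((cosine(feat, dev_feat), cid, dev)
               for cid, feat in candidates.items()
               for dev, dev_feat in stored.items())
    return best[1], best[2]


# usage sketch with made-up feature vectors:
# print(select_operation_region({"35F": [0.9, 0.1]},
#                               {"numeric_keypad": [1.0, 0.0], "mouse": [0.0, 1.0]}))
```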
- next, the motion control unit 10b associates a predetermined operation function, for example a numeric keypad function, with each button (a physical operator such as a switch or key) in the selected operation area (for example, the operation area 33F shown in the lower part of FIG. 9).
- specifically, the operation function to be associated is a function corresponding to each button (operator) in the operation area 33F (in the case of a character key, an input function for the corresponding character data; in the case of a number key, an input function for the corresponding numerical data).
- the operation control unit 10b can determine the function corresponding to each button based on the image analysis result of each button (operator) in the selected operation region.
- next, in step S126, the motion control unit 10b recognizes the user's finger part 40 from the image feature amount based on the captured image including the range of the user's field of view, and determines whether or not the finger part 40 and a button in the operation area 33F overlap for a predetermined time (for example, approximately 0.5 seconds).
- when it is determined that the finger part 40 and a button in the operation area 33F overlap for the predetermined time (S126 / YES), in step S129 the motion control unit 10b executes the key function associated with the button in the operation area 33F corresponding to the position of the finger part 40. That is, data corresponding to the numeric key associated with the button on which the finger part 40 overlaps is received as an input value.
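- for the numeric keypad case just described, a minimal sketch of the button-to-key mapping and input acceptance might look as follows; the button rectangles are hypothetical and would in practice come from the image analysis of the operation area 33F.

```python
from typing import Dict, List, Optional, Tuple

Rect = Tuple[int, int, int, int]  # (x, y, width, height)


class NumericKeypad:
    def __init__(self, button_map: Dict[str, Rect]):
        self.button_map = button_map      # e.g. {"7": (0, 0, 30, 30), ...}
        self.buffer: List[str] = []       # values accepted so far (image 37)

    def key_at(self, finger: Tuple[int, int]) -> Optional[str]:
        fx, fy = finger
        for value, (x, y, w, h) in self.button_map.items():
            if x <= fx < x + w and y <= fy < y + h:
                return value
        return None

    def accept(self, finger: Tuple[int, int]) -> None:
        """Called once the dwell-time check for this finger position passed."""
        value = self.key_at(finger)
        if value is not None:
            self.buffer.append(value)     # received as an input value
```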
- further, the operation control unit 10b may control the display control unit 14 to display the data received as an input value on the display unit 2. For example, as illustrated in the lower part of FIG. 9, the operation control unit 10b displays an image 37 indicating the data received as an input value on a part of the display unit 2. In the lower part of FIG. 9, a dotted line indicating the operation area 33F is drawn for the sake of explanation; however, such a dotted line is not displayed on the display unit 2, and the user sees only the scenery of the real space and the image 37 through the transparent display unit 2.
- as described above, according to the present embodiment, the user can perform data input (operation) while actually touching a calculator, keyboard, mouse, mobile phone terminal, or the like that exists in the real space.
- moreover, since the object (operation input device) used for the operation does not need to be electrically or physically connected to the HMD 1, no communication connection or wiring between them is necessary.
- furthermore, the operation input device (for example, the calculator (object 50F) shown in FIG. 9) does not need to be turned on; operation input can be realized even if it has no remaining battery power or its display function or input function is broken.
- step S118 the operation control unit 10b determines whether or not the function association is released. If the function association is released (S118 / YES), the operation process ends.
- FIG. 10 is a flowchart showing a control operation process according to the fifth embodiment.
- first, the external environment determination unit 10a (detection unit) performs the same processing as in the above-described steps (extraction of entity objects and detection of operation area candidates).
- the operation control unit 10b controls the display control unit 14 so that the operation region candidates detected in S107 are AR-displayed on the display unit 2.
- for example, the motion control unit 10b displays AR images on the display unit 2 so that they appear to overlap the operation region candidates 35A, 35E, and 35F of the objects 50A (notebook PC), 50E (desk), and 50F (calculator) that can be seen through the transparent display unit 2.
- the AR image displayed on the display unit 2 may be a color image or a blinking image superimposed on the operation region candidate 35 or a frame image overlapping so as to surround the operation region candidate 35.
- the HMD 1 can present the detected operation area candidate to the user.
- next, in step S138, the motion control unit 10b recognizes the user's finger part 40 from the image feature amount based on the captured image including the range of the user's field of view, and determines whether or not the finger part 40 and an operation region candidate 35 overlap for a predetermined time (for example, approximately 0.5 seconds).
- when it is determined that they overlap for the predetermined time (S138 / YES), in step S141 the operation control unit 10b selects the operation region candidate 35 corresponding to the position of the finger part 40 as the operation area 33.
- for example, the motion control unit 10b can select the operation region candidate 35E as the operation region 33E. In this way, an operation region can be selected by the user pointing at it with a finger.
- the operation control unit 10b may control the display control unit 14 so that the operation unit image (operation screen) is superimposed and displayed as AR information on the display unit 2 in the operation region selected according to the user's instruction.
- for example, the operation control unit 10b displays the keyboard image 38 on a part of the display unit 2 so that it appears to overlap the operation area 33E when the user views the real space scenery through the transparent display unit 2.
- note that the dotted line indicating the operation area 33E is drawn for the sake of explanation; such a dotted line is not displayed on the display unit 2, and the user sees only the scenery of the real space and the keyboard image 38 through the transparent display unit 2.
- step S144 the operation control unit 10b associates each region in the operation unit image superimposed and displayed as AR information with a predetermined operation function.
- the operation control unit 10b associates each area in the keyboard image 38 with the function of each key on the keyboard.
- next, in step S147, the motion control unit 10b recognizes the user's finger part 40 from the image feature amount based on the captured image including the range of the user's field of view, and determines whether or not the finger part 40 and a region of the keyboard image 38 overlap for a predetermined time (for example, approximately 0.5 seconds).
- when it is determined that they overlap for the predetermined time, in step S150 the operation control unit 10b executes the key function associated with the region of the keyboard image 38 on which the finger part 40 overlaps. That is, data corresponding to the key associated with the region (key image) of the keyboard image 38 on which the finger part 40 overlaps is received as an input value.
- the operation control unit 10b may control the display control unit 14 to display data received as an input value on the display unit 2. For example, as illustrated in FIG. 11, the operation control unit 10 b displays an image 39 indicating data received as an input value on a part of the display unit 2.
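- for the keyboard image 38 of this embodiment, the per-key regions belong to the superimposed virtual image rather than to physical buttons; the following sketch lays out key rectangles over the selected operation area with a simple grid, which is an illustrative assumption rather than the actual displayed key arrangement.

```python
from typing import Dict, List, Tuple

Rect = Tuple[int, int, int, int]  # (x, y, width, height)


def layout_virtual_keys(region: Rect, rows: List[str]) -> Dict[str, Rect]:
    """Divide the area covered by the superimposed keyboard image into
    per-key rectangles, one for each character in the given rows."""
    rx, ry, rw, rh = region
    key_rects: Dict[str, Rect] = {}
    row_h = rh // len(rows)
    for r, row in enumerate(rows):
        key_w = rw // len(row)
        for c, char in enumerate(row):
            key_rects[char] = (rx + c * key_w, ry + r * row_h, key_w, row_h)
    return key_rects


# usage sketch: a simplified three-row layout over a hypothetical region
# keys = layout_virtual_keys((100, 200, 300, 90),
#                            ["qwertyuiop", "asdfghjkl", "zxcvbnm"])
```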
- the user can select an operation area from a plurality of operation area candidates, and can perform data input (operation) while actually touching the selected operation area.
- in the above description, the display indicating the operation region candidates and the operation unit image are AR-displayed on the display unit 2; however, the present embodiment is not limited to AR display on the display unit 2, and, similarly to the third embodiment, the projector unit 7 may project the display indicating the operation region candidates or the operation unit image into the real space.
- In step S118, the operation control unit 10b determines whether or not the function association has been released. When the function association is released (S118/YES), the operation process ends.
- In the embodiment described above, the function corresponding to each button is associated and executed based on the result of image analysis of the selected operation region.
- For example, the function of a numeric key is associated with the numeric key area, and numeric data is received as an input value.
- However, the function association according to the present disclosure is not limited to the fourth embodiment; the scope of the present disclosure also includes the case of associating a similar function of the HMD 1 according to the type of the object (operation input device) for which the operation area is selected.
- Hereinafter, a sixth embodiment in which a similar function of the HMD 1 is associated according to the type of object will be described in detail with reference to FIGS. 12 and 13.
- FIG. 12 is a flowchart showing the control operation process according to the sixth embodiment.
- First, the external environment situation determination unit 10a (detection unit) performs the same process as described above. Specifically, as shown in FIG. 13, the external environment situation determination unit 10a (detection unit) extracts the entity objects 50A, 50E, and 50G based on a captured image obtained by the imaging unit 3 capturing the range including the user's field of view seen through the transparent display unit 2 (S103). Furthermore, the external environment situation determination unit 10a (detection unit) detects operation area candidates 35A, 35E, and 35G from the objects 50A (notebook PC), 50E (desk), and 50G (music player), respectively (S107).
- In step S153, the operation control unit 10b compares the image feature amount of each operation region candidate with the image feature amounts of devices corresponding to functions of the HMD 1 stored in advance in the storage unit 25, and selects the operation region of the object with the highest degree of matching. For example, assume that image feature amounts of a music player, a TV remote controller, a mobile phone terminal, a telephone, and the like are stored in the storage unit 25 as the image feature amounts of devices corresponding to functions of the HMD 1. In this case, the operation control unit 10b selects, as the operation area 33G, the operation region candidate 35G of the object 50G, which most closely matches the music player, from among the objects 50A, 50E, and 50G shown in FIG. 13.
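- The patent does not specify which image feature amount is stored and compared in step S153; the following sketch uses ORB feature matching from OpenCV as one illustrative stand-in for selecting the candidate region that best matches a stored device template.

```python
# Minimal sketch, assuming stored grayscale template images per supported
# device (music player, TV remote, mobile phone, ...). ORB matching stands
# in for the "image feature amount" comparison; the patent does not specify
# the feature type, so this is only an illustrative choice.
import cv2

def best_matching_region(candidate_images, device_templates):
    """candidate_images -- {region_id: grayscale image of the candidate}
    device_templates    -- {device_name: grayscale template image}
    Returns (region_id, device_name) with the highest good-match count."""
    orb = cv2.ORB_create()
    bf = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    best = (None, None, 0)
    for region_id, region_img in candidate_images.items():
        _, des_r = orb.detectAndCompute(region_img, None)
        if des_r is None:
            continue
        for device, template in device_templates.items():
            _, des_t = orb.detectAndCompute(template, None)
            if des_t is None:
                continue
            matches = bf.match(des_r, des_t)
            good = [m for m in matches if m.distance < 40]  # crude threshold
            if len(good) > best[2]:
                best = (region_id, device, len(good))
    return best[0], best[1]
```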
- In step S156, the operation control unit 10b associates a function of the HMD 1 similar to the function of the object in which the operation area was detected with each button (physical operation element) in the selected operation area. For example, in the example shown in the lower part of FIG. 13, since the object 50G in which the operation area 33G was detected is a music player, the operation control unit 10b associates the music playback functions of the HMD 1 with the buttons in the operation area 33G.
- In step S159, the operation control unit 10b recognizes the user's finger 40 from the image feature amount of the captured image covering the range of the user's field of view, and determines whether the finger 40 and a button in the operation area 33G overlap for a predetermined time (for example, approximately 0.5 seconds).
- When it is determined that the finger 40 and a button in the operation area 33G overlap for the predetermined time (S159/YES), in step S162 the operation control unit 10b executes the key function associated with the button in the operation area 33G corresponding to the position of the finger 40. That is, a playback instruction corresponding to the playback button in the operation area 33G on which the finger 40 overlaps is received as an operation input, and the operation control unit 10b controls the voice input/output control 28 so that music is played back from the audio output unit 5.
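- The binding performed in steps S156 to S162, where the buttons of the music player object are associated with the HMD's own music playback functions and the function bound to the touched button is executed, can be sketched as follows; the class and button names are hypothetical.

```python
# Minimal sketch with hypothetical names: because the detected object is a
# music player, its physical buttons are bound to the HMD's own music
# playback functions, and the function bound to the touched button runs.
class HmdAudio:
    def play(self): print("audio output unit 5: start playback")
    def stop(self): print("audio output unit 5: stop playback")
    def next_track(self): print("audio output unit 5: next track")

def bind_music_player_buttons(hmd_audio):
    """Map each button region of operation area 33G to an HMD function."""
    return {
        "play": hmd_audio.play,
        "stop": hmd_audio.stop,
        "next": hmd_audio.next_track,
    }

def on_button_dwell(button_name, bindings):
    """Called when the fingertip has overlapped a button for ~0.5 s."""
    action = bindings.get(button_name)
    if action is not None:
        action()

bindings = bind_music_player_buttons(HmdAudio())
on_button_dwell("play", bindings)   # -> starts playback on the HMD side
```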
- At this time, the operation control unit 10b may control the display control unit 14 to display feedback on the received operation input on the display unit 2.
- For example, the operation control unit 10b may display, on a part of the transparent display unit 2, an image 44 showing the name and playback time of the music being played in response to the playback button operation.
- In the drawing, a dotted line indicating the operation area 33G is shown for the sake of explanation; however, no such dotted line is displayed on the display unit 2, and through the transparent display unit 2 the user sees only the real-space scenery and the image 44.
- In step S118, the operation control unit 10b determines whether or not the function association has been released. When the function association is released (S118/YES), the operation process ends.
- The sixth embodiment has been described above. According to this embodiment, for example, when a button in the operation area is a numeric key provided on a TV (television) remote controller, a channel operation function for the TV program to be reproduced on the display unit 2 of the HMD 1 is associated with it. On the other hand, when the numeric key is provided on a mobile phone terminal, the telephone number input function of the HMD 1 is associated with it.
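- The device-dependent association summarized above (the same numeric keys driving either a channel operation or telephone number input) can be sketched as a simple dispatch on the detected device type; the type strings and handler shapes are assumptions for illustration only.

```python
# Minimal sketch: the same numeric keys are bound to different HMD functions
# depending on the type of device the operation area was detected on. All
# names here are illustrative, not taken from the patent.
def numeric_key_handler(device_type):
    if device_type == "tv_remote":
        def handler(digit, state):
            state["channel"] = digit          # select a channel to reproduce
            return state
    elif device_type == "mobile_phone":
        def handler(digit, state):
            state["number"] = state.get("number", "") + str(digit)
            return state                      # build up a telephone number
    else:
        def handler(digit, state):
            return state                      # no binding for this device
    return handler

handle = numeric_key_handler("mobile_phone")
print(handle(0, handle(9, handle(0, {}))))    # -> {'number': '090'}
```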
- As described above, according to each of the embodiments, an operation function can be virtually associated with a predetermined area of an entity object.
- Thereby, an entity object existing in the real space can be used as an operation input device.
- Specifically, the user can operate the HMD 1 by touching a real object existing in the real space.
- In addition, the user can perform operation input with a feeling similar to mouse operation or keyboard input, which is familiar to the user.
- Further, since the object used as the operation input device and the HMD 1 do not need to be electrically or physically connected, no communication connection or wiring between them is required. In addition, since input data does not have to be transmitted from the entity object to the HMD 1, the entity object does not even need to be powered on, even if it is, for example, a wireless keyboard or a mouse. Furthermore, even a broken keyboard, mouse, or the like can be used as an operation input device as long as the HMD 1 detects it as an operation area and associates operation functions with it, so that the ultimate form of recycling can be realized.
- In the embodiments described above with reference to the drawings, the AR information is displayed on a part of the display unit 2 so as to overlap the real-space scenery seen through the transparent display unit 2.
- Alternatively, a captured image obtained by imaging the user's line-of-sight direction as the subject direction may be displayed on the display unit 2 with the AR information superimposed on it, which appears to the user the same as in the illustrated case.
- Moreover, the present embodiment is not limited to this. Even when content (photographs, videos, and the like) is being reproduced on the display unit 2 and most of the display unit 2 is in an opaque state, an operation area may be detected from an entity object and operation functions may be associated with it. In this case, the user can touch the entity object and perform a blind operation.
- The entity object from which an operation area is detected may be an object having a flat area, such as a desk or the top panel of a notebook PC, an object provided with physical operation elements, such as a keyboard or a calculator, or a paper medium on which a picture of operation elements such as a keyboard or a numeric keypad is printed. Further, the entity object from which the operation area is detected may be a part of the body, such as a palm.
- In the embodiments described above, the user arbitrarily selects an operation area from a plurality of operation area candidates, but the scope of the present disclosure is not limited to this.
- For example, the external environment situation determination unit 10a (detection unit) according to the present embodiment may extract an object designated by the user (pointed at with a finger, tapped, touched with a hand, or the like) as an entity object existing at a position estimated to be operable by the user (within a predetermined distance), and detect a part of it as an operation area.
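- One way to realize the "designated object" extraction described above is to treat an object as designated when the recognized fingertip lies on or within a predetermined distance of its bounding box; the sketch below makes that assumption, and the threshold value is illustrative.

```python
# Minimal sketch, assuming the fingertip position and per-object bounding
# boxes have already been estimated from the captured image; an object is
# treated as "designated" when the fingertip lies within a predetermined
# distance of it.
import math

MAX_DESIGNATION_DISTANCE = 30.0   # pixels; illustrative threshold

def distance_to_rect(point, rect):
    """Distance from a point to the nearest edge of a rectangle (0 if inside)."""
    px, py = point
    x, y, w, h = rect
    dx = max(x - px, 0, px - (x + w))
    dy = max(y - py, 0, py - (y + h))
    return math.hypot(dx, dy)

def designated_object(fingertip, objects):
    """objects -- {name: (x, y, w, h)}; returns the closest object the
    fingertip touches or nearly touches, else None."""
    best_name, best_d = None, MAX_DESIGNATION_DISTANCE
    for name, rect in objects.items():
        d = distance_to_rect(fingertip, rect)
        if d <= best_d:
            best_name, best_d = name, d
    return best_name

print(designated_object((105, 60), {"keyboard": (100, 50, 200, 80),
                                    "phone": (400, 50, 60, 120)}))
```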
- For example, when a keyboard (entity object 50H) and a mobile phone terminal (entity object 50I) exist in the real space as shown in the center of FIG. 14, the keyboard may be selected by touching (pointing at) it with the finger 40.
- In this case, the external environment situation determination unit 10a detects the keyboard (entity object 50H) touched by the finger 40 as the operation area 33H based on the captured image captured by the imaging unit 3.
- In the drawing, a dotted line indicating the operation area 33H is shown for the sake of explanation; however, no such dotted line is displayed on the display unit 2, and the user sees the real-space scenery through the transparent display unit 2.
- Then, the operation control unit 10b associates the respective key functions of a keyboard with the keys of the detected operation area 33H. Thereby, when the finger 40 and a key of the operation area 33H overlap for a predetermined time, the operation control unit 10b executes the key function associated with that key. That is, the data corresponding to the key of the operation area 33H on which the finger 40 overlaps is received as an input value.
- At this time, the operation control unit 10b may control the display control unit 14 to display the data received as an input value on the display unit 2. For example, as illustrated in the lower part of FIG. 14, the operation control unit 10b displays an image 39 indicating the received data on a part of the display unit 2.
- Although the keyboard (entity object 50H) is selected in the above example, the user may instead select the mobile phone terminal (entity object 50I).
- In this case, the external environment situation determination unit 10a (detection unit) detects the mobile phone terminal (entity object 50I) touched by the finger 40 as the operation area 33I based on the captured image captured by the imaging unit 3.
- Then, the operation control unit 10b associates the input functions of a mobile phone terminal with the keys of the detected operation area 33I. Thereby, when the finger 40 and a key of the operation area 33I overlap for a predetermined time, the operation control unit 10b executes the input function associated with that key; that is, the data corresponding to the input function associated with the key of the operation area 33I on which the finger 40 overlaps is received as an input value. At this time, the operation control unit 10b may control the display control unit 14 to display the data received as an input value on the display unit 2. For example, as illustrated in FIG. 15, the operation control unit 10b displays an image 39 indicating the received data on a part of the display unit 2.
- In the example shown in FIG. 15, a dotted line indicating the operation area 33I is illustrated for the sake of explanation, but no such dotted line is displayed on the display unit 2; the user sees the displayed image 39 and, through the display unit 2, the real-space scenery.
- In this way, a keyboard or a mobile phone terminal used on a daily basis can be employed as an operation input device of the HMD 1.
- Such keyboards and mobile phone terminals have key layouts (key assignments) that users are familiar with or find easy to use, and users may wish to keep using them even after they become electrically broken.
- According to the present embodiment, a familiar but broken keyboard or mobile phone terminal can also be used as an operation input device of the HMD 1, which satisfies such user needs.
- Additionally, the present technology may also be configured as below.
- (1) A control device comprising:
- a detection unit that detects, as an operation region, at least a part of an entity object existing at a position estimated to be operable by the user;
- a function setting unit configured to associate a predetermined function with the operation region detected by the detection unit;
- a control unit that executes a function associated with the operation area based on a positional relationship between the operation area and the operation body;
- (2) The control device according to (1), wherein the detection unit detects, as an operation region, at least a part of an entity object existing at a position estimated to be directly or indirectly operable by the user.
- (4) The control device according to any one of (1) to (3), wherein the detection unit preferentially detects, as the operation region, at least a part of an entity object provided with an operation unit among entity objects existing at a position estimated to be operable by the user.
- (6) The control device according to (5), wherein the control device is a head-mounted display device, and the control unit controls the operation screen to be superimposed and displayed on the operation area of the entity object displayed on the display screen of the head-mounted display device.
- (7) The control device according to any one of (1) to (6), wherein the control device further includes an image projection unit, and the control unit controls the image projection unit to project an operation screen onto the operation region detected by the detection unit.
- (8) The control device according to any one of (1) to (7), wherein the function setting unit sets a predetermined function to be associated with one operation area selected by the user when a plurality of operation areas are detected by the detection unit.
- (9) The control device according to any one of (1) to (8), wherein the function setting unit sets, as the function of the operation area, a function similar to the function of the entity object in which the operation area is detected.
- (10) The control device according to any one of (1) to (9), wherein the detection unit detects at least a part of an entity object designated by the user as an operation area.
- (11) A storage medium storing a program for causing a computer to function as:
- a detection unit that detects, as an operation region, at least a part of an entity object existing at a position estimated to be operable by the user;
- a function setting unit configured to associate a predetermined function with the operation region detected by the detection unit; and
- a control unit that executes a function associated with the operation area based on a positional relationship between the operation area and the operation body.
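- As a purely illustrative summary of configuration (1) above, the three-unit structure (detection unit, function setting unit, control unit) can be sketched as follows; all class and method names are hypothetical and the detection step is stubbed.

```python
# Minimal sketch of the three-unit structure in configuration (1); the class
# and method names are illustrative, not taken from the patent.
class DetectionUnit:
    def detect_operation_region(self, captured_image):
        """Return at least a part of an entity object judged operable
        by the user (stubbed here)."""
        return {"region_id": "33E", "rect": (0, 0, 100, 50)}

class FunctionSettingUnit:
    def associate(self, region, function):
        # Bind a predetermined function to the detected operation region.
        region["function"] = function
        return region

class ControlUnit:
    def execute_if_overlapping(self, region, finger_overlaps_region):
        # The positional relationship between the operation area and the
        # operation body decides whether the associated function runs.
        if finger_overlaps_region:
            region["function"]()

def demo():
    detection, setting, control = DetectionUnit(), FunctionSettingUnit(), ControlUnit()
    region = detection.detect_operation_region(captured_image=None)
    region = setting.associate(region, lambda: print("function executed"))
    control.execute_if_overlapping(region, finger_overlaps_region=True)

demo()
```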
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- Optics & Photonics (AREA)
- User Interface Of Digital Computer (AREA)
- Controls And Circuits For Display Device (AREA)
- Position Input By Displaying (AREA)
- Accessory Devices And Overall Control Thereof (AREA)
Priority Applications (8)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201380047236.4A CN104620212B (zh) | 2012-09-21 | 2013-07-09 | 控制装置和记录介质 |
US14/428,111 US9791948B2 (en) | 2012-09-21 | 2013-07-09 | Control device and storage medium |
JP2014536639A JP6256339B2 (ja) | 2012-09-21 | 2013-07-09 | 制御装置および記憶媒体 |
EP13839207.1A EP2899618B1 (en) | 2012-09-21 | 2013-07-09 | Control device and computer-readable storage medium |
BR112015005692A BR112015005692A2 (pt) | 2012-09-21 | 2013-07-09 | dispositivo de controle, e, meio de armazenamento. |
IN2046DEN2015 IN2015DN02046A | 2012-09-21 | 2013-07-09 | |
RU2015108948A RU2015108948A (ru) | 2012-09-21 | 2013-07-09 | Устройство управления и носитель информации |
US15/698,841 US10318028B2 (en) | 2012-09-21 | 2017-09-08 | Control device and storage medium |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2012-208029 | 2012-09-21 | ||
JP2012208029 | 2012-09-21 |
Related Child Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/428,111 A-371-Of-International US9791948B2 (en) | 2012-09-21 | 2013-07-09 | Control device and storage medium |
US15/698,841 Continuation US10318028B2 (en) | 2012-09-21 | 2017-09-08 | Control device and storage medium |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2014045683A1 true WO2014045683A1 (ja) | 2014-03-27 |
Family
ID=50341009
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2013/068755 WO2014045683A1 (ja) | 2012-09-21 | 2013-07-09 | 制御装置および記憶媒体 |
Country Status (8)
Cited By (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2015212849A (ja) * | 2014-05-01 | 2015-11-26 | 富士通株式会社 | 画像処理装置、画像処理方法および画像処理プログラム |
JP2015213212A (ja) * | 2014-05-01 | 2015-11-26 | セイコーエプソン株式会社 | 頭部装着型表示装置、制御システム、および、頭部装着型表示装置の制御方法 |
JP2016053907A (ja) * | 2014-09-04 | 2016-04-14 | 株式会社ニコン | プログラム及び電子機器 |
JP2016148968A (ja) * | 2015-02-12 | 2016-08-18 | セイコーエプソン株式会社 | 頭部装着型表示装置、制御システム、頭部装着型表示装置の制御方法、および、コンピュータープログラム |
JP2017102768A (ja) * | 2015-12-03 | 2017-06-08 | セイコーエプソン株式会社 | 情報処理装置、表示装置、情報処理方法、及び、プログラム |
WO2019021566A1 (ja) * | 2017-07-26 | 2019-01-31 | ソニー株式会社 | 情報処理装置、情報処理方法、及びプログラム |
JP2019075175A (ja) * | 2019-02-07 | 2019-05-16 | 京セラ株式会社 | ウェアラブル装置、制御方法及び制御プログラム |
JP2020520522A (ja) * | 2017-05-16 | 2020-07-09 | コーニンクレッカ フィリップス エヌ ヴェKoninklijke Philips N.V. | 拡張現実におけるユーザインタラクションのための仮想カバー |
WO2021235316A1 (ja) * | 2020-05-21 | 2021-11-25 | ソニーグループ株式会社 | 情報処理装置、情報処理方法および情報処理プログラム |
JP2022539313A (ja) * | 2019-06-24 | 2022-09-08 | マジック リープ, インコーポレイテッド | 仮想コンテンツのための仮想場所の選択 |
US11544968B2 (en) | 2018-05-09 | 2023-01-03 | Sony Corporation | Information processing system, information processingmethod, and recording medium |
WO2023157057A1 (ja) * | 2022-02-15 | 2023-08-24 | マクセル株式会社 | ヘッドマウントディスプレイ、ウェアラブル端末、及び紫外線監視方法 |
Families Citing this family (21)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20150041453A (ko) * | 2013-10-08 | 2015-04-16 | 엘지전자 주식회사 | 안경형 영상표시장치 및 그것의 제어방법 |
WO2015102658A1 (en) * | 2014-01-03 | 2015-07-09 | Intel Corporation | Systems and techniques for user interface control |
CN105874528B (zh) * | 2014-01-15 | 2018-07-20 | 麦克赛尔株式会社 | 信息显示终端、信息显示系统以及信息显示方法 |
JP6177998B2 (ja) * | 2014-04-08 | 2017-08-09 | 日立マクセル株式会社 | 情報表示方法および情報表示端末 |
EP3138284A4 (en) * | 2014-04-30 | 2017-11-29 | Aurasma Limited | Augmented reality without a physical trigger |
US9959591B2 (en) * | 2014-07-31 | 2018-05-01 | Seiko Epson Corporation | Display apparatus, method for controlling display apparatus, and program |
JP6304145B2 (ja) * | 2015-06-30 | 2018-04-04 | 京セラドキュメントソリューションズ株式会社 | 情報処理装置、画像形成装置の設定条件指定方法 |
FI20155599A7 (fi) * | 2015-08-21 | 2017-02-22 | Konecranes Global Oy | Nostolaitteen ohjaaminen |
JP6947177B2 (ja) * | 2016-07-05 | 2021-10-13 | ソニーグループ株式会社 | 情報処理装置、情報処理方法およびプログラム |
DE102016116774A1 (de) * | 2016-09-07 | 2018-03-08 | Bundesdruckerei Gmbh | Datenbrille zum Interagieren mit einem Nutzer |
CN110168615B (zh) * | 2017-01-16 | 2024-06-21 | 索尼公司 | 信息处理设备、信息处理方法和存储介质 |
JP2019164420A (ja) * | 2018-03-19 | 2019-09-26 | セイコーエプソン株式会社 | 透過型頭部装着型表示装置および透過型頭部装着型表示装置の制御方法、透過型頭部装着型表示装置の制御のためのコンピュータープログラム |
US10839603B2 (en) * | 2018-04-30 | 2020-11-17 | Microsoft Technology Licensing, Llc | Creating interactive zones in virtual environments |
US11500452B2 (en) | 2018-06-05 | 2022-11-15 | Apple Inc. | Displaying physical input devices as virtual objects |
US10809910B2 (en) * | 2018-09-28 | 2020-10-20 | Apple Inc. | Remote touch detection enabled by peripheral device |
US11137908B2 (en) * | 2019-04-15 | 2021-10-05 | Apple Inc. | Keyboard operation with head-mounted device |
US11275945B2 (en) * | 2020-03-26 | 2022-03-15 | Varjo Technologies Oy | Imaging system and method for producing images with virtually-superimposed functional elements |
JP2023047853A (ja) * | 2021-09-27 | 2023-04-06 | 富士フイルムビジネスイノベーション株式会社 | 情報処理装置、情報処理システム、およびプログラム |
US12141364B2 (en) | 2021-11-15 | 2024-11-12 | Samsung Electronics Co., Ltd. | Wearable device for communicating with at least one counterpart device according to trigger event and control method therefor |
CN116795263A (zh) * | 2022-03-15 | 2023-09-22 | 北京字跳网络技术有限公司 | 交互方法、装置、设备及计算机可读存储介质 |
US12361476B2 (en) * | 2022-07-29 | 2025-07-15 | Ncr Voyix Corporation | Augmented reality order assistance |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2000298544A (ja) | 1999-04-12 | 2000-10-24 | Matsushita Electric Ind Co Ltd | 入出力装置と入出力方法 |
JP2008304268A (ja) | 2007-06-06 | 2008-12-18 | Sony Corp | 情報処理装置、および情報処理方法、並びにコンピュータ・プログラム |
JP2011159163A (ja) | 2010-02-02 | 2011-08-18 | Sony Corp | 画像処理装置、画像処理方法及びプログラム |
JP2011209965A (ja) * | 2010-03-29 | 2011-10-20 | Sony Corp | 情報処理装置、情報処理方法、およびプログラム |
US8228315B1 (en) * | 2011-07-12 | 2012-07-24 | Google Inc. | Methods and systems for a virtual input device |
Family Cites Families (20)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20090153468A1 (en) * | 2005-10-31 | 2009-06-18 | National University Of Singapore | Virtual Interface System |
US8007110B2 (en) * | 2007-12-28 | 2011-08-30 | Motorola Mobility, Inc. | Projector system employing depth perception to detect speaker position and gestures |
JP2009245392A (ja) * | 2008-03-31 | 2009-10-22 | Brother Ind Ltd | ヘッドマウントディスプレイ及びヘッドマウントディスプレイシステム |
US20100177035A1 (en) * | 2008-10-10 | 2010-07-15 | Schowengerdt Brian T | Mobile Computing Device With A Virtual Keyboard |
US20100199228A1 (en) * | 2009-01-30 | 2010-08-05 | Microsoft Corporation | Gesture Keyboarding |
US9569001B2 (en) * | 2009-02-03 | 2017-02-14 | Massachusetts Institute Of Technology | Wearable gestural interface |
JP2010205031A (ja) * | 2009-03-04 | 2010-09-16 | Kddi Corp | 入力位置特定方法、入力位置特定システムおよび入力位置特定用プログラム |
US8860693B2 (en) * | 2009-07-08 | 2014-10-14 | Apple Inc. | Image processing for camera based motion tracking |
JP5728159B2 (ja) * | 2010-02-02 | 2015-06-03 | ソニー株式会社 | 画像処理装置、画像処理方法及びプログラム |
JP2011170585A (ja) * | 2010-02-18 | 2011-09-01 | Sony Corp | ポインティングデバイス及び情報処理システム |
WO2011106798A1 (en) * | 2010-02-28 | 2011-09-01 | Osterhout Group, Inc. | Local advertising content on an interactive head-mounted eyepiece |
JP5564300B2 (ja) * | 2010-03-19 | 2014-07-30 | 富士フイルム株式会社 | ヘッドマウント型拡張現実映像提示装置及びその仮想表示物操作方法 |
CN102200881B (zh) * | 2010-03-24 | 2016-01-13 | 索尼公司 | 图像处理装置以及图像处理方法 |
JP5418386B2 (ja) * | 2010-04-19 | 2014-02-19 | ソニー株式会社 | 画像処理装置、画像処理方法及びプログラム |
JP5521727B2 (ja) * | 2010-04-19 | 2014-06-18 | ソニー株式会社 | 画像処理システム、画像処理装置、画像処理方法及びプログラム |
JP5574854B2 (ja) * | 2010-06-30 | 2014-08-20 | キヤノン株式会社 | 情報処理システム、情報処理装置、情報処理方法及びプログラム |
JP2012108577A (ja) * | 2010-11-15 | 2012-06-07 | Panasonic Corp | 3d−ui操作空間協調装置 |
US20120195461A1 (en) * | 2011-01-31 | 2012-08-02 | Qualcomm Incorporated | Correlating areas on the physical object to areas on the phone screen |
US9069164B2 (en) | 2011-07-12 | 2015-06-30 | Google Inc. | Methods and systems for a virtual input device |
JP5780142B2 (ja) * | 2011-12-07 | 2015-09-16 | 富士通株式会社 | 画像処理装置、画像処理方法 |
-
2013
- 2013-07-09 IN IN2046DEN2015 patent/IN2015DN02046A/en unknown
- 2013-07-09 EP EP13839207.1A patent/EP2899618B1/en not_active Not-in-force
- 2013-07-09 RU RU2015108948A patent/RU2015108948A/ru not_active Application Discontinuation
- 2013-07-09 JP JP2014536639A patent/JP6256339B2/ja not_active Expired - Fee Related
- 2013-07-09 WO PCT/JP2013/068755 patent/WO2014045683A1/ja active Application Filing
- 2013-07-09 BR BR112015005692A patent/BR112015005692A2/pt not_active IP Right Cessation
- 2013-07-09 US US14/428,111 patent/US9791948B2/en active Active
- 2013-07-09 CN CN201380047236.4A patent/CN104620212B/zh not_active Expired - Fee Related
-
2017
- 2017-09-08 US US15/698,841 patent/US10318028B2/en active Active
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2000298544A (ja) | 1999-04-12 | 2000-10-24 | Matsushita Electric Ind Co Ltd | 入出力装置と入出力方法 |
JP2008304268A (ja) | 2007-06-06 | 2008-12-18 | Sony Corp | 情報処理装置、および情報処理方法、並びにコンピュータ・プログラム |
JP2011159163A (ja) | 2010-02-02 | 2011-08-18 | Sony Corp | 画像処理装置、画像処理方法及びプログラム |
JP2011209965A (ja) * | 2010-03-29 | 2011-10-20 | Sony Corp | 情報処理装置、情報処理方法、およびプログラム |
US8228315B1 (en) * | 2011-07-12 | 2012-07-24 | Google Inc. | Methods and systems for a virtual input device |
Non-Patent Citations (1)
Title |
---|
See also references of EP2899618A4 |
Cited By (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2015212849A (ja) * | 2014-05-01 | 2015-11-26 | 富士通株式会社 | 画像処理装置、画像処理方法および画像処理プログラム |
JP2015213212A (ja) * | 2014-05-01 | 2015-11-26 | セイコーエプソン株式会社 | 頭部装着型表示装置、制御システム、および、頭部装着型表示装置の制御方法 |
JP2016053907A (ja) * | 2014-09-04 | 2016-04-14 | 株式会社ニコン | プログラム及び電子機器 |
JP2016148968A (ja) * | 2015-02-12 | 2016-08-18 | セイコーエプソン株式会社 | 頭部装着型表示装置、制御システム、頭部装着型表示装置の制御方法、および、コンピュータープログラム |
JP2017102768A (ja) * | 2015-12-03 | 2017-06-08 | セイコーエプソン株式会社 | 情報処理装置、表示装置、情報処理方法、及び、プログラム |
JP7353982B2 (ja) | 2017-05-16 | 2023-10-02 | コーニンクレッカ フィリップス エヌ ヴェ | 拡張現実におけるユーザインタラクションのための仮想カバー |
JP2020520522A (ja) * | 2017-05-16 | 2020-07-09 | コーニンクレッカ フィリップス エヌ ヴェKoninklijke Philips N.V. | 拡張現実におけるユーザインタラクションのための仮想カバー |
WO2019021566A1 (ja) * | 2017-07-26 | 2019-01-31 | ソニー株式会社 | 情報処理装置、情報処理方法、及びプログラム |
US11544968B2 (en) | 2018-05-09 | 2023-01-03 | Sony Corporation | Information processing system, information processingmethod, and recording medium |
JP2019075175A (ja) * | 2019-02-07 | 2019-05-16 | 京セラ株式会社 | ウェアラブル装置、制御方法及び制御プログラム |
JP2022539313A (ja) * | 2019-06-24 | 2022-09-08 | マジック リープ, インコーポレイテッド | 仮想コンテンツのための仮想場所の選択 |
JP7525524B2 (ja) | 2019-06-24 | 2024-07-30 | マジック リープ, インコーポレイテッド | 仮想コンテンツのための仮想場所の選択 |
JP2024138030A (ja) * | 2019-06-24 | 2024-10-07 | マジック リープ, インコーポレイテッド | 仮想コンテンツのための仮想場所の選択 |
US12118687B2 (en) | 2019-06-24 | 2024-10-15 | Magic Leap, Inc. | Virtual location selection for virtual content |
JP7717237B2 (ja) | 2019-06-24 | 2025-08-01 | マジック リープ, インコーポレイテッド | 仮想コンテンツのための仮想場所の選択 |
WO2021235316A1 (ja) * | 2020-05-21 | 2021-11-25 | ソニーグループ株式会社 | 情報処理装置、情報処理方法および情報処理プログラム |
WO2023157057A1 (ja) * | 2022-02-15 | 2023-08-24 | マクセル株式会社 | ヘッドマウントディスプレイ、ウェアラブル端末、及び紫外線監視方法 |
Also Published As
Publication number | Publication date |
---|---|
BR112015005692A2 (pt) | 2017-07-04 |
EP2899618A1 (en) | 2015-07-29 |
CN104620212A (zh) | 2015-05-13 |
US9791948B2 (en) | 2017-10-17 |
IN2015DN02046A | 2015-08-14 |
RU2015108948A (ru) | 2016-10-10 |
US20150227222A1 (en) | 2015-08-13 |
JP6256339B2 (ja) | 2018-01-10 |
JPWO2014045683A1 (ja) | 2016-08-18 |
US20170371439A1 (en) | 2017-12-28 |
CN104620212B (zh) | 2018-09-18 |
US10318028B2 (en) | 2019-06-11 |
EP2899618A4 (en) | 2016-04-13 |
EP2899618B1 (en) | 2019-05-15 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6256339B2 (ja) | 制御装置および記憶媒体 | |
US10635182B2 (en) | Head mounted display device and control method for head mounted display device | |
CN106210184B (zh) | 移动终端及其控制方法 | |
US20140123015A1 (en) | Information processing system, information processing apparatus, and storage medium | |
KR102212030B1 (ko) | 글래스 타입 단말기 및 이의 제어방법 | |
KR20170006559A (ko) | 이동단말기 및 그 제어방법 | |
US10884498B2 (en) | Display device and method for controlling display device | |
KR20150142516A (ko) | 글래스 타입 단말기 및 그것의 제어 방법 | |
KR20160125674A (ko) | 이동 단말기 및 그 제어 방법 | |
KR20160127606A (ko) | 이동 단말기 및 그 제어 방법 | |
KR20170020069A (ko) | 이동 단말기 및 그의 영상촬영 방법 | |
US9215003B2 (en) | Communication apparatus, communication method, and computer readable recording medium | |
KR20180023785A (ko) | 복수의 발광소자를 포함하는 전자 장치 및 전자 장치의 동작 방법 | |
KR20170136759A (ko) | 홈 오토메이션 시스템 및 그 제어방법 | |
JP2024123273A (ja) | 遠隔制御方法 | |
KR20160017463A (ko) | 이동 단말기 및 그 제어 방법 | |
KR20150091608A (ko) | 이동 단말기 및 그 제어 방법 | |
JP2018206080A (ja) | 頭部装着型表示装置、プログラム、及び頭部装着型表示装置の制御方法 | |
KR20150111199A (ko) | 이동 단말기 및 그것의 제어방법 | |
KR20170082228A (ko) | 이동 단말기 및 그 제어방법 | |
KR20170046947A (ko) | 이동 단말기 및 제어 방법 | |
KR20160029348A (ko) | 글래스형 이동 단말기 | |
KR20170085358A (ko) | 이동 단말기 및 그 제어방법 | |
CN117130472B (zh) | 虚拟空间操作指引显示方法、移动设备及系统 | |
KR102135377B1 (ko) | 이동 단말기 및 그 제어방법 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 13839207 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2014536639 Country of ref document: JP Kind code of ref document: A |
|
ENP | Entry into the national phase |
Ref document number: 2015108948 Country of ref document: RU Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 14428111 Country of ref document: US |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
REG | Reference to national code |
Ref country code: BR Ref legal event code: B01A Ref document number: 112015005692 Country of ref document: BR |
|
ENP | Entry into the national phase |
Ref document number: 112015005692 Country of ref document: BR Kind code of ref document: A2 Effective date: 20150313 |