CN106020620B - Display device, control method and control program - Google Patents
- Publication number
- CN106020620B (application No. CN201610531692.1A)
- Authority
- CN
- China
- Prior art keywords
- dimensional object
- display device
- image
- display
- control unit
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/30—Image reproducers
- H04N13/332—Displays for viewing with the aid of special glasses or head-mounted displays [HMD]
- H04N13/344—Displays for viewing with the aid of special glasses or head-mounted displays [HMD] with head-mounted left-right displays
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/012—Head tracking input arrangements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0346—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/04815—Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/006—Mixed reality
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/10—Processing, recording or transmission of stereoscopic or multi-view image signals
- H04N13/106—Processing image signals
- H04N13/111—Transformation of image signals corresponding to virtual viewpoints, e.g. spatial image interpolation
- H04N13/117—Transformation of image signals corresponding to virtual viewpoints, e.g. spatial image interpolation the virtual viewpoint locations being selected by the viewers or determined by viewer tracking
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/10—Processing, recording or transmission of stereoscopic or multi-view image signals
- H04N13/106—Processing image signals
- H04N13/156—Mixing image signals
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/204—Image signal generators using stereoscopic image cameras
- H04N13/207—Image signal generators using stereoscopic image cameras using a single 2D image sensor
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/204—Image signal generators using stereoscopic image cameras
- H04N13/239—Image signal generators using stereoscopic image cameras using two 2D image sensors having a relative position equal to or related to the interocular distance
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/275—Image signal generators from 3D object models, e.g. computer-generated stereoscopic image signals
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/30—Image reproducers
- H04N13/302—Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays
- H04N13/317—Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays using slanted parallax optics
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/30—Image reproducers
- H04N13/349—Multi-view displays for displaying three or more geometrical viewpoints without viewer tracking
- H04N13/351—Multi-view displays for displaying three or more geometrical viewpoints without viewer tracking for displaying simultaneously
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
- G02B2027/0132—Head-up displays characterised by optical features comprising binocular systems
- G02B2027/0134—Head-up displays characterised by optical features comprising binocular systems of stereoscopic type
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
- G02B2027/0178—Eyeglass type
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0179—Display position adjusting means not related to the information to be displayed
- G02B2027/0187—Display position adjusting means not related to the information to be displayed slaved to motion of at least a part of the body of the user, e.g. head, eye
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2210/00—Indexing scheme for image generation or computer graphics
- G06T2210/21—Collision detection, intersection
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N2213/00—Details of stereoscopic systems
- H04N2213/001—Constructional or mechanical details
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Signal Processing (AREA)
- Multimedia (AREA)
- Human Computer Interaction (AREA)
- Optics & Photonics (AREA)
- Computer Graphics (AREA)
- Computer Hardware Design (AREA)
- Software Systems (AREA)
- User Interface Of Digital Computer (AREA)
- Controls And Circuits For Display Device (AREA)
- Processing Or Creating Images (AREA)
Abstract
A display device (1) includes: display units (32a and 32b) that, when the display device is worn, stereoscopically display a predetermined object in three dimensions by displaying two images corresponding respectively to the user's two eyes; a detection unit (44) that detects a real object in the display space in which the object is stereoscopically displayed; and a control unit (22) that, when movement of the real object is detected in the display space, changes the displayed object in the display space according to that movement. Changes of the object include, for example, movement, rotation, deformation, and disappearance.
Description
The present application is a divisional application of the Chinese patent application entitled "Display device, control method and control program", application No. 201380050460.9, filed on March 26, 2015.
Technical field
The present invention relates to a display device, a control method, and a control program.
Background technique
Among display devices that include a display unit, there are devices capable of stereoscopically displaying images and the like (see, for example, Patent Document 1). Stereoscopic display is realized by using the parallax between the two eyes.
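The binocular parallax mentioned above can be illustrated with a small sketch that is not part of the patent: a point in front of the viewer is perspective-projected once for each eye, and the horizontal offset between the two projections is the disparity that produces the stereoscopic effect. The eye separation and screen distance values below are assumptions chosen for illustration only.

```python
# Illustrative sketch of binocular parallax (not taken from the patent).
# A point is projected onto a screen plane once per eye; the horizontal
# difference between the two projections is the stereo disparity.

def project_for_eye(point, eye_x, screen_z):
    """Pinhole-project a point (x, y, z) onto the plane z = screen_z,
    as seen from an eye at (eye_x, 0, 0)."""
    x, y, z = point
    scale = screen_z / z                      # simple perspective scaling
    return ((x - eye_x) * scale + eye_x, y * scale)

EYE_SEPARATION = 0.065   # meters; typical interocular distance (assumption)
SCREEN_Z = 0.5           # meters from the eyes to the virtual screen plane

point = (0.0, 0.0, 1.0)  # a point 1 m in front of the viewer
left = project_for_eye(point, -EYE_SEPARATION / 2, SCREEN_Z)
right = project_for_eye(point, +EYE_SEPARATION / 2, SCREEN_Z)
parallax = right[0] - left[0]  # horizontal disparity between the two images
print(parallax)
```

A point at the same depth as the screen plane projects to the same place for both eyes (zero disparity); points nearer or farther than the screen separate in opposite directions, which is what the visual system interprets as depth.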
Prior art literature
Patent literature
Patent Document 1: Japanese Unexamined Patent Application Publication No. 2011-95547
Summary of the invention
Technical problem to be solved by the invention
Although stereoscopic display is a display format that users find easy to accept, in conventional display devices stereoscopic display has been used only for viewing purposes, not for improving the convenience of operation. An object of the present invention is to provide a display device, a control method, and a control program capable of offering the user a highly convenient operating method.
Means for solving the problem
A display device according to one aspect includes: a display unit that, when the display device is worn, displays a predetermined object in three dimensions by displaying two images corresponding respectively to the user's two eyes; a detection unit that detects displacement of a predetermined real object in the display space of the displayed object; and a control unit that performs an action associated with the displayed object according to the displacement detected by the detection unit.
A display device according to another aspect includes: a display unit that, when the display device is worn, displays a predetermined object in three dimensions by displaying two images corresponding respectively to the user's two eyes; a detection unit that detects a first object and a second object in the display space in which the displayed object is shown; and a control unit that changes the displayed object when detecting that it is positioned between the first object and the second object in the display space.
A display device according to another aspect includes: a display unit that, when the display device is worn, displays a predetermined object in three dimensions by displaying two images corresponding respectively to the user's two eyes; a detection unit that detects a first object and a second object in the display space in which the displayed object is shown; and a control unit that changes the displayed object when detecting that at least one of the first object and the second object is at a position in contact with the displayed object in the display space.
A display device according to another aspect includes: a display unit that, when the display device is worn, displays a predetermined object in three dimensions by displaying two images corresponding respectively to the user's two eyes; and a control unit that changes the displayed object when a first object and a second object are at positions clamping the displayed object in the display space in which it is shown.
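The clamping condition described above can be sketched with a simple geometric test that is not the patent's implementation: treat the displayed object as a bounding sphere and consider it clamped when two detected objects (e.g. two fingertips) both touch the sphere and the object lies between them. The sphere model and the tolerance values are illustrative assumptions.

```python
# Sketch: decide whether two detected real objects (e.g. fingertips) clamp
# a displayed 3D object. Positions are (x, y, z) tuples in meters.
# The bounding-sphere model and margin are illustrative assumptions.

def distance(a, b):
    return sum((p - q) ** 2 for p, q in zip(a, b)) ** 0.5

def is_clamped(obj_center, obj_radius, finger1, finger2, margin=0.01):
    """True if both fingers touch the object's bounding sphere and the
    object's center lies between them."""
    near1 = distance(finger1, obj_center) <= obj_radius + margin
    near2 = distance(finger2, obj_center) <= obj_radius + margin
    # "Between" here means the object's center is close to the midpoint
    # of the segment joining the two fingers.
    mid = tuple((p + q) / 2 for p, q in zip(finger1, finger2))
    between = distance(mid, obj_center) <= obj_radius
    return near1 and near2 and between

center = (0.0, 0.0, 0.5)
print(is_clamped(center, 0.05, (-0.05, 0.0, 0.5), (0.05, 0.0, 0.5)))  # True
```

A control unit as described above would run a test like this every frame and trigger the object's change (selection, grabbing, deformation) once the condition becomes true.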
A display device according to another aspect includes: a display unit that, when the display device is worn, displays a predetermined object in three dimensions by displaying two images corresponding respectively to the user's two eyes; a detection unit that detects a first object and a second object on the display surface; and a control unit that changes the displayed object when detecting that it is positioned between the first object and the second object on the display surface.
A display device according to another aspect includes: a display unit that, when the display device is worn, displays a predetermined object in three dimensions by displaying two images corresponding respectively to the user's two eyes; a detection unit that detects a first object and a second object on the display surface; and a control unit that changes the displayed object when detecting that at least one of the first object and the second object is at a position in contact with the displayed object on the display surface.
A display device according to another aspect includes: a display unit that, when the display device is worn, displays a predetermined object in three dimensions by displaying two images corresponding respectively to the user's two eyes; a detection unit that detects a real object in the display space in which the object is stereoscopically displayed; and a control unit that, when movement of the real object is detected in the display space, changes the displayed object in the display space according to that movement.
A display device according to another aspect includes: a display unit that, when the display device is worn, displays a predetermined object in three dimensions by displaying two images corresponding respectively to the user's two eyes; and a control unit that, when a real object is detected moving in the display space of the stereoscopically displayed object, changes the displayed object in the display space according to the movement of that real object.
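One natural way to realize the "change the object according to the movement" behavior above is to make the displayed object follow the detected real object. The simple follow-the-movement rule below is an illustrative assumption, not the control unit's actual algorithm.

```python
# Sketch: translate the displayed 3D object by the same vector the
# detected real object moved between two frames. The direct 1:1
# mapping is an illustrative assumption.

def apply_movement(obj_position, old_pos, new_pos):
    """Return the object's new position after following the real
    object's movement from old_pos to new_pos."""
    delta = tuple(n - o for n, o in zip(new_pos, old_pos))
    return tuple(p + d for p, d in zip(obj_position, delta))

obj = (0.0, 0.1, 0.5)
moved = apply_movement(obj, old_pos=(0.0, 0.0, 0.4), new_pos=(0.1, 0.0, 0.4))
print(moved)  # (0.1, 0.1, 0.5)
```

The same delta-based scheme extends to rotation or deformation by mapping the detected motion to an angle or a scale factor instead of a translation.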
A control method according to one aspect is a control method for a display device that, when worn, displays a predetermined object in three dimensions by displaying images corresponding to the user's two eyes, the method comprising: displaying the predetermined object three-dimensionally by the display device; detecting displacement of a predetermined real object in the display space of the displayed object; and performing an action associated with the displayed object according to the detected displacement.
A control program according to one aspect causes a display device to execute the following steps when the display device is worn: displaying a predetermined object in three dimensions on a display unit by displaying two images corresponding respectively to the user's two eyes; detecting displacement of a predetermined real object in the display space of the displayed object; and performing an action associated with the displayed object according to the detected displacement.
A display device according to one aspect includes: a display unit that, when the display device is worn, displays an object corresponding to a product in three dimensions by displaying two images corresponding respectively to the user's two eyes; a detection unit that detects a real object operating the displayed object; and a control unit that changes the position of the displayed object according to the operation performed by the real object and, when the real object no longer operates the displayed object, leaves the displayed object where it is.
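The "leave the object where it is" behavior above amounts to a small state machine: the product object tracks the real object while an operation is detected, and simply keeps its last position once the operation ends. The class below is an illustrative sketch under that assumption, not the patent's implementation.

```python
# Sketch: a product object that follows a detected real object while
# being operated and stays put when the operation ends.

class ProductObject:
    def __init__(self, position):
        self.position = position
        self.operating = False

    def update(self, real_object_pos):
        """Called each frame with the detected real object's position,
        or None when no real object is operating the object."""
        if real_object_pos is not None:
            self.operating = True
            self.position = real_object_pos  # follow the operation
        else:
            self.operating = False           # released: keep last position

tv = ProductObject(position=(0.0, 0.0, 1.0))
tv.update((0.2, 0.0, 1.0))   # user drags the object
tv.update(None)              # user lets go
print(tv.position)           # (0.2, 0.0, 1.0) -- the object stays put
```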
A control method according to one aspect is a control method for a display device, comprising: displaying an object in three dimensions by displaying two images corresponding respectively to the user's two eyes; detecting a real object operating the displayed object; changing the position of the displayed object according to the operation performed by the real object; and, when the real object no longer operates the displayed object, leaving the displayed object where it is.
A display device according to one aspect includes: a display unit that, when the display device is worn, displays an object arranged in a virtual space in three dimensions by displaying two images corresponding respectively to the user's two eyes; a sensor that detects a change in the direction of the display device in real space; and a control unit that changes the displayed object according to the direction change detected by the sensor.
A control method according to one aspect is a control method for a display device, comprising: displaying an object arranged in a virtual space in three dimensions by displaying images corresponding to the user's two eyes; detecting a change in the direction of the display device in real space; and changing the displayed object according to the direction change.
A display device according to one aspect includes: a display unit that, when the display device is worn, displays an object arranged in a virtual space in three dimensions by displaying two images corresponding respectively to the user's two eyes; a sensor that detects a change in the position of the display device in real space; and a control unit that changes the displayed object according to the position change detected by the sensor.
A control method according to one aspect is a control method for a display device, comprising: displaying an object arranged in a virtual space in three dimensions by displaying images corresponding to the user's two eyes; detecting a change in the position of the display device in real space; and changing the displayed object according to the position change.
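The direction-change and position-change aspects above can be sketched together: to make an object appear fixed in the virtual space, its coordinates are re-expressed in the device's frame whenever the device's position or heading changes. The flat top-down model and the yaw-only rotation below are illustrative assumptions, not the patent's method.

```python
# Sketch: re-express an object's virtual-space position in the display
# device's frame, given the device's position and yaw. A 2D top-down
# model (x, y coordinates, yaw in radians) is assumed for illustration.

import math

def object_relative_to_device(obj_xy, device_xy, device_yaw):
    """Return the object's coordinates in the device's frame: translate
    by the device position, then rotate by the inverse of the yaw."""
    dx = obj_xy[0] - device_xy[0]
    dy = obj_xy[1] - device_xy[1]
    cos_y, sin_y = math.cos(-device_yaw), math.sin(-device_yaw)
    return (dx * cos_y - dy * sin_y, dx * sin_y + dy * cos_y)

# Device turns 90 degrees left: an object straight ahead at (0, 1)
# should now appear to the device's right.
rel = object_relative_to_device((0.0, 1.0), (0.0, 0.0), math.pi / 2)
print(round(rel[0], 6), round(rel[1], 6))
```

Feeding the sensor's latest pose into a transform like this each frame is what lets the displayed object change consistently as the wearer moves or turns.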
Effect of the invention
The present invention has the effect of being able to provide the user with a highly convenient operating method.
Brief Description Of Drawings
Fig. 1 is the perspective view of the display device of first embodiment.
Fig. 2 is the figure for being viewed from the front the display device worn by user.
Fig. 3 is the figure for showing the variation of display device.
Fig. 4 is the figure for showing other variations of display device.
Fig. 5 is the figure for showing other variations of display device.
Fig. 6 is the block diagram of the display device of first embodiment.
Fig. 7 is the figure for showing the embodiment controlled according to the function that control program provides.
Fig. 8 is the figure for showing the one embodiment for the information being stored in object data.
Fig. 9 is the figure for showing the one embodiment for the information being stored in worked upon data.
Figure 10 is the figure for showing the one embodiment for the information being stored in worked upon data.
Figure 11 is the figure for showing the one embodiment for the information being stored in worked upon data.
Figure 12 is the figure for showing the one embodiment for the information being stored in worked upon data.
Figure 13 is the figure for showing the one embodiment for the information being stored in worked upon data.
Figure 14 is the figure for showing the one embodiment for the information being stored in worked upon data.
Figure 15 is for illustrating that detection presses the operation of three dimensional object and according to detected operation change three dimensional object
First embodiment figure.
Figure 16 is for illustrating that detection presses the operation of three dimensional object and according to detected operation change three dimensional object
First embodiment figure.
Figure 17 is the flow chart for showing the processing sequence of the contact detection processing in first embodiment.
Figure 18 is the flow chart for showing the processing sequence of the operation detection processing in first embodiment.
Figure 19 is for illustrating that detection presses the operation of three dimensional object and according to detected operation change three dimensional object
Second embodiment figure.
Figure 20 is the flow chart for showing the processing sequence of the operation detection processing in second embodiment.
Figure 21 is for illustrating that detection presses the operation of three dimensional object and according to detected operation change three dimensional object
3rd embodiment figure.
Figure 22 is for illustrating that detection presses the operation of three dimensional object and according to detected operation change three dimensional object
3rd embodiment figure.
Figure 23 is the flow chart for showing the processing sequence of the operation detection processing in 3rd embodiment.
Figure 24 is for illustrating that the figure of the first embodiment of the operation of three dimensional object progress is caught in detection.
Figure 25 is the flow chart for showing the processing sequence of the selection detection processing in first embodiment.
Figure 26 is the flow chart for showing the processing sequence of the operation detection processing in first embodiment.
Figure 27 is for illustrating that the figure of the variation of the first embodiment of the operation of three dimensional object progress is caught in detection.
Figure 28 is for illustrating that the figure of the second embodiment of the operation of three dimensional object progress is caught in detection.
Figure 29 is the flow chart for showing the processing sequence of the selection detection processing in second embodiment.
Figure 30 is for illustrating that the figure of the variation of the second embodiment of the operation of three dimensional object progress is caught in detection.
Figure 31 is for illustrating that the figure of the 3rd embodiment of the operation of three dimensional object progress is caught in detection.
Figure 32 is for illustrating that the figure of the 3rd embodiment of the operation of three dimensional object progress is caught in detection.
Figure 33 is the flow chart for showing the processing sequence of the selection detection processing in 3rd embodiment.
Figure 34 is the flow chart for showing the processing sequence of the operation detection processing in 3rd embodiment.
Figure 35 is for illustrating that the figure of the variation of the 3rd embodiment of the operation of three dimensional object progress is caught in detection.
Figure 36 is the perspective view of the display device of second embodiment.
Figure 37 is the block diagram of the display device of first embodiment.
Figure 38 is a diagram showing an example of display control linked to a change of a three-dimensional object.
Figure 39 is a diagram showing an example of an operation trace in which a finger momentarily contacts a three-dimensional object.
Figure 40 is a diagram showing an example of an operation trace in which a finger is moved along a three-dimensional object.
Figure 41 is a diagram showing an example of an operation trace in which a finger crushes a three-dimensional object.
Figure 42 is a flowchart showing the processing sequence of display control linked to a change of a three-dimensional object.
Figure 43 is a block diagram of the display device of the third embodiment.
Figure 44 is a diagram showing an example of changing a three-dimensional object in conjunction with a change in position.
Figure 45 is a diagram conceptually illustrating operation screens arranged around the user.
Figure 46 is a diagram showing an example of changing a three-dimensional object in conjunction with a change in direction.
Figure 47 is a flowchart showing the processing sequence of control for changing a three-dimensional object in conjunction with changes in position and direction.
Figure 48 is a diagram showing display of an electronic catalog in the room where a purchased product will be placed.
Figure 49 is a diagram for explaining a scene of selecting a product from the catalog.
Figure 50 is a diagram for explaining a scene of considering the size and installation place of a television set.
Figure 51 is a diagram for explaining a scene of selecting a cabinet for the television set.
Figure 52 is a diagram for explaining a scene of moving a real object.
Figure 53 is a flowchart showing the processing sequence of the ordering process.
Figure 54 is a block diagram showing the display device of the fifth embodiment.
Figure 55 is a diagram for explaining the start of the process of ordering a pizza.
Figure 56 is a diagram for explaining the process of determining the size and thickness of the dough.
Figure 57 is a diagram for explaining the process of placing toppings.
Figure 58 is a diagram for explaining the process of ordering the pizza.
Figure 59 is a diagram showing an example of delivery of the pizza.
Figure 60 is a flowchart showing the processing sequence of the ordering process.
Specific Embodiments
Hereinafter, the present invention is described in detail with reference to the accompanying drawings. The present invention is not limited to the following description. The constituent elements in the following description include those that a person skilled in the art could easily conceive of, those that are substantially equivalent, and those within the so-called range of equivalency.
Embodiment 1
First, the overall structure of the display device 1 of the first embodiment is described with reference to Fig. 1 and Fig. 2. Fig. 1 is a perspective view of the display device 1. Fig. 2 is a front view of the display device 1 as worn by a user. As shown in Fig. 1 and Fig. 2, the display device 1 is a head-mounted device worn on the user's head.
The display device 1 has a front portion 1a, a side portion 1b, and a side portion 1c. When worn, the front portion 1a is placed in front of the user so as to cover both of the user's eyes. The side portion 1b is connected to one end of the front portion 1a, and the side portion 1c is connected to the other end of the front portion 1a. When worn, the side portions 1b and 1c are supported by the user's ears like the temples of eyeglasses and stabilize the display device 1. The side portions 1b and 1c may also be connected to each other at the back of the user's head when worn.
The front portion 1a has a display unit 32a and a display unit 32b on the face that faces the user's eyes when worn. The display unit 32a is placed at the position facing the user's right eye when worn, and the display unit 32b is placed at the position facing the user's left eye when worn. The display unit 32a displays an image for the right eye, and the display unit 32b displays an image for the left eye. By thus having the display units 32a and 32b that display images corresponding to each of the user's eyes when worn, the display device 1 can realize three-dimensional display using binocular parallax.
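The binocular-parallax principle used above can be sketched as follows. This is an illustrative model only: the eye separation, focal length, and pixel pitch below are assumed example values, not parameters taken from this patent.

```python
# Sketch of binocular parallax: the same scene point projects to slightly
# different horizontal positions in the left- and right-eye images, and the
# visual system fuses that offset into a sense of depth. All numbers are
# illustrative assumptions (pinhole-camera model).

def horizontal_disparity_px(depth_m, baseline_m=0.065, focal_m=0.017,
                            pixel_pitch_m=1e-5):
    """Pixel disparity between the two eye images for a point at depth_m:
    disparity = focal * baseline / depth, converted to pixels."""
    return focal_m * baseline_m / depth_m / pixel_pitch_m

def eye_image_x(point_x_m, depth_m, eye_offset_m, focal_m=0.017):
    """Horizontal image-plane coordinate of a scene point as seen from an
    eye shifted horizontally by eye_offset_m from the head centre."""
    return focal_m * (point_x_m - eye_offset_m) / depth_m

# A nearer object produces a larger left/right offset than a farther one,
# which is why per-eye images convey depth:
near = horizontal_disparity_px(0.5)   # object 0.5 m away
far = horizontal_disparity_px(5.0)    # object 5 m away
print(near > far)  # True: closer objects shift more between the eyes
```

A renderer for the display units 32a and 32b would apply `eye_image_x` once per eye with opposite offsets to produce the two images.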
As long as different images can be separately provided to the user's right and left eyes, the display units 32a and 32b may be composed of a single display device. For example, a single display device may separately provide different images to the right and left eyes by rapidly switching a shutter so that only one eye can see the displayed image at a time. The front portion 1a may cover the user's eyes so that outside light does not reach them when worn.
The front portion 1a has an imaging unit 40 and an imaging unit 42 on the face opposite to the face on which the display units 32a and 32b are provided. The imaging unit 40 is placed near one end of the front portion 1a (the right-eye side when worn), and the imaging unit 42 is placed near the other end of the front portion 1a (the left-eye side when worn). The imaging unit 40 acquires an image corresponding to the field of view of the user's right eye. The imaging unit 42 acquires an image corresponding to the field of view of the user's left eye. Here, the field of view refers, for example, to what the user sees when facing forward.
The display device 1 displays the image captured by the imaging unit 40 on the display unit 32a as the right-eye image, and displays the image captured by the imaging unit 42 on the display unit 32b as the left-eye image. Therefore, even though the front portion 1a blocks the user's view, the display device 1 can provide the wearing user with the same scene as when the display device 1 is not worn.
In addition to the function of providing the real scene to the user as described above, the display device 1 has a function of displaying virtual information in three dimensions and letting the user operate that virtual information. Through the display device 1, the virtual information is displayed superimposed on the real scene as if it actually existed. The user can then operate the virtual information as if actually touching it with a hand, applying changes such as moving, rotating, or deforming it. In this way, the display device 1 provides an intuitive and highly convenient operating method for virtual information. In the following description, virtual information displayed in three dimensions by the display device 1 is sometimes called a "three-dimensional object".
The display device 1 provides the user with a field of view as wide as when the display device 1 is not worn. The display device 1 can place a three-dimensional object of an arbitrary size at an arbitrary position within this wide field of view. In this way, the display device 1 is not limited by the size of a display device, and can display three-dimensional objects of various sizes at various positions in a broad space. Furthermore, the people able to see a three-dimensional object can be limited to the user of the display device 1, ensuring a high level of security.
Fig. 1 and Fig. 2 show an example in which the display device 1 has a shape similar to eyeglasses (goggles), but the shape of the display device 1 is not limited to this. Fig. 3 is a diagram showing a variation of the display device. Fig. 4 and Fig. 5 are diagrams showing other variations of the display device. For example, the display device 1 may have a helmet-like shape that covers roughly the upper half of the user's head, like the display device 2 shown in Fig. 3. Alternatively, the display device 1 may have a mask-like shape that covers substantially the entire face of the user, like the display device 3 shown in Fig. 4. The display device 1 may also be configured to connect by wire or wirelessly to an external device 4d such as an information processing apparatus or a mobile phone, like the display device 4 shown in Fig. 5.
Next, the functional structure of the display device 1 is described with reference to Fig. 6. Fig. 6 is a block diagram of the display device 1. As shown in Fig. 6, the display device 1 has an operation unit 13, a control unit 22, a storage unit 24, the display units 32a and 32b, the imaging units 40 and 42, a detection unit 44, and a ranging unit 46. The operation unit 13 receives basic operations such as starting and stopping the display device 1 and changing its operation mode.
The display units 32a and 32b have display devices such as liquid crystal displays (Liquid Crystal Display) or organic EL (Organic Electro-Luminescence) panels, and display various information according to control signals input from the control unit 22. The display units 32a and 32b may also be projection devices that project images onto the user's retinas using light sources such as laser beams.
The imaging units 40 and 42 electronically capture images using image sensors such as CCD (Charge Coupled Device Image Sensor) or CMOS (Complementary Metal Oxide Semiconductor) sensors. The imaging units 40 and 42 convert the captured images into signals and output them to the control unit 22.
The detection unit 44 detects real objects present in the imaging range of the imaging units 40 and 42. The detection unit 44 detects, for example, an object matching a pre-registered shape (for example, the shape of a human hand) among the real objects present in the imaging range. The detection unit 44 may also be configured to detect the extent (shape and size) of a real object in the image from edges in the brightness, saturation, and hue of pixels, even for objects that are not pre-registered.
The ranging unit 46 measures the distance to real objects present in the imaging range of the imaging units 40 and 42. The distance to a real object is measured for each eye, with the position of each eye of the user wearing the display device 1 as the reference. Therefore, when the reference position at which the ranging unit 46 measures distance deviates from the position of either eye, the measured value of the ranging unit 46 is corrected according to the deviation so as to represent the distance from the eye position.
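The correction described above can be sketched as a simple geometric offset: knowing where the sensor sits relative to the eye and the direction to the object, the eye-referenced distance follows from vector arithmetic. The sensor offset and direction below are assumed example values, not taken from this patent.

```python
import math

# Sketch of correcting a ranged distance to an eye-referenced distance.
# The sensor sits at a known offset from the eye; given the measured
# distance and the direction toward the object, the eye-to-object distance
# is the length of (sensor offset + measured distance * direction).

def eye_distance(measured_m, sensor_offset_m, direction):
    """measured_m: distance from sensor to object.
    sensor_offset_m: (x, y, z) of the sensor relative to the eye.
    direction: unit vector from the sensor toward the object."""
    obj = [sensor_offset_m[i] + measured_m * direction[i] for i in range(3)]
    return math.sqrt(sum(c * c for c in obj))

# Sensor 3 cm to the side of the eye, object 1 m straight ahead of the
# sensor: the distance as seen from the eye is slightly more than 1 m.
d = eye_distance(1.0, (0.03, 0.0, 0.0), (0.0, 0.0, 1.0))
print(round(d, 4))  # 1.0004
```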
In the present embodiment, the imaging units 40 and 42 double as the detection unit 44 and the ranging unit 46. That is, in the present embodiment, objects in the imaging range are detected by analyzing the images captured by the imaging units 40 and 42. Furthermore, the distance to an object is measured (calculated) by comparing the object as it appears in the image captured by the imaging unit 40 with the same object as it appears in the image captured by the imaging unit 42.
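The comparison between the two camera images amounts to stereo triangulation, which can be sketched as below. The baseline and focal length are illustrative assumptions; the patent only states that distance is calculated by comparing the two images.

```python
# Sketch of stereo ranging: the same object appears at slightly different
# horizontal pixel positions in the two camera images, and depth follows
# from triangulation. Camera parameters here are illustrative assumptions.

def depth_from_disparity(x_left_px, x_right_px, baseline_m=0.09,
                         focal_px=700.0):
    """Depth (metres) from the horizontal pixel positions of one object in
    the left and right images: Z = focal * baseline / disparity."""
    disparity = x_left_px - x_right_px
    if disparity <= 0:
        raise ValueError("object must appear further left in the left image")
    return focal_px * baseline_m / disparity

# A 70-pixel disparity with these parameters corresponds to 0.9 m:
print(depth_from_disparity(420, 350))  # 0.9
```

Objects closer to the cameras produce larger disparities, so the calculation is most precise exactly where hand-operation detection needs it.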
The display device 1 may also have a detection unit 44 separate from the imaging units 40 and 42. The detection unit 44 may be, for example, a sensor that detects real objects present in the imaging range using at least one of visible light, infrared light, ultraviolet light, radio waves, sound waves, magnetism, and electrostatic capacitance. The display device 1 may also have a ranging unit 46 separate from the imaging units 40 and 42. The ranging unit 46 may be a sensor that detects the distance to real objects present in the imaging range using at least one of, for example, visible light, infrared light, ultraviolet light, radio waves, sound waves, magnetism, and electrostatic capacitance. The display device 1 may also have a sensor that can serve as both the detection unit 44 and the ranging unit 46, such as a sensor using the TOF (Time-of-Flight) method.
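The TOF principle mentioned above converts a measured round-trip time of an emitted pulse into a distance; a minimal sketch follows (the pulse timing is an illustrative number, not a device parameter).

```python
# Sketch of the Time-of-Flight principle: distance is half the round-trip
# time of an emitted light pulse multiplied by the speed of light.

SPEED_OF_LIGHT = 299_792_458.0  # m/s

def tof_distance(round_trip_s):
    """Distance to the reflecting object for a measured round-trip time."""
    return SPEED_OF_LIGHT * round_trip_s / 2.0

# A pulse returning after about 6.67 nanoseconds indicates roughly 1 m:
print(round(tof_distance(6.671e-9), 3))  # 1.0
```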
The control unit 22 has a CPU (Central Processing Unit) as a computing device and a memory as a storage device, and realizes various functions by executing programs using these hardware resources. Specifically, the control unit 22 reads programs and data stored in the storage unit 24, loads them into the memory, and causes the CPU to execute the instructions contained in the program loaded into the memory. According to the execution results of the instructions executed by the CPU, the control unit 22 reads and writes data to the memory and the storage unit 24, and controls the operation of the display unit 32a and other components. When the CPU executes instructions, the data loaded in the memory and the operations detected by the detection unit 44 are used as part of the parameters and judgment conditions.
The storage unit 24 is composed of a nonvolatile storage device such as a flash memory, and stores various programs and data. The programs stored in the storage unit 24 include a control program 24a. The data stored in the storage unit 24 include object data 24b, action data 24c, and virtual space data 24d. The storage unit 24 may also be composed of a combination of a portable storage medium such as a memory card and a reading/writing device that reads from and writes to the storage medium. In that case, the control program 24a, the object data 24b, the action data 24c, and the virtual space data 24d may be stored in the storage medium. The control program 24a, the object data 24b, the action data 24c, and the virtual space data 24d may also be acquired from another device such as a server by wireless or wired communication.
The control program 24a provides functions related to various controls for operating the display device 1. The functions provided by the control program 24a include a function of superimposing a three-dimensional object on the images acquired by the imaging units 40 and 42 and displaying the result on the display units 32a and 32b, a function of detecting operations on the three-dimensional object, and a function of changing the three-dimensional object according to the detected operation.
The control program 24a includes a detection processing unit 25, a display object control unit 26, and an image compositing unit 27. The detection processing unit 25 provides a function of detecting real objects present in the imaging range of the imaging units 40 and 42. The functions provided by the detection processing unit 25 include a function of measuring the distance to each detected object.
The display object control unit 26 provides a function of managing which three-dimensional objects are placed in the virtual space and what state each three-dimensional object is in. The functions provided by the display object control unit 26 include a function of detecting operations on a three-dimensional object from the movement of real objects detected by the function of the detection processing unit 25, and a function of changing the three-dimensional object according to the detected operation.
The image compositing unit 27 provides a function of generating the image to be displayed on the display unit 32a and the image to be displayed on the display unit 32b by compositing an image of the real space with an image of the virtual space. The functions provided by the image compositing unit 27 include a function of judging the front-to-back relationship between real objects and the three-dimensional object, from the distance to real objects measured by the function of the detection processing unit 25 and the distance from the viewpoint in the virtual space to the three-dimensional object, and adjusting the overlap accordingly.
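The front-to-back judgment made by the image compositing unit 27 can be sketched as a per-pixel depth comparison; the tiny arrays below are illustrative stand-ins, not device data.

```python
# Sketch of per-pixel front/back compositing: for each pixel, whichever of
# the real object and the three-dimensional object is nearer to the eye is
# shown. Pixel values and depths here are tiny illustrative examples.

def composite(real_px, real_depth, virt_px, virt_depth):
    """real_px / virt_px: rows of pixel values; real_depth / virt_depth:
    matching rows of per-pixel distances from the eye. None in virt_px
    means no virtual object covers that pixel."""
    out = []
    for rp, rd, vp, vd in zip(real_px, real_depth, virt_px, virt_depth):
        if vp is not None and vd < rd:
            out.append(vp)   # virtual object nearer: it occludes the real one
        else:
            out.append(rp)   # real object nearer (e.g. a thumb in front)
    return out

# A hand pixel at 0.4 m stays in front of a virtual block at 0.5 m, while
# desk and hand pixels behind the block are covered by it:
row = composite(["desk", "hand", "hand"], [1.2, 0.6, 0.4],
                ["block", "block", "block"], [0.5, 0.5, 0.5])
print(row)  # ['block', 'block', 'hand']
```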
The object data 24b contains information about the shapes and properties of three-dimensional objects. The object data 24b is used to display three-dimensional objects. The action data 24c contains information about how operations on a displayed three-dimensional object act on that object. When an operation on a displayed three-dimensional object is detected, the action data 24c is used to judge how to change the three-dimensional object. The changes referred to here include movement, rotation, deformation, disappearance, and so on. The virtual space data 24d holds information about the states of the three-dimensional objects placed in the virtual space. The state of a three-dimensional object includes, for example, its position, posture, and deformation status.
Next, an example of control based on the functions provided by the control program 24a is described with reference to Fig. 7. Fig. 7 is a diagram showing an example of control based on the functions provided by the control program. Image P1a is an image acquired by the imaging unit 40, that is, an image corresponding to the scene of the real space seen by the right eye. A desk T1 and the user's hand H1 appear in image P1a. The display device 1 also acquires an image of the same scene captured by the imaging unit 42, that is, an image corresponding to the scene of the real space seen by the left eye.
Image P2a is a right-eye image generated from the virtual space data 24d and the object data 24b. In this example, the virtual space data 24d holds information about the state of a block-shaped three-dimensional object BL1 present in the virtual space, and the object data 24b holds information about the shape and properties of the three-dimensional object BL1. The display device 1 reproduces the virtual space from this information and generates the image P2a of the reproduced virtual space as seen from the viewpoint of the right eye. The position of the right eye (viewpoint) in the virtual space is determined according to a predetermined rule. Similarly, the display device 1 also generates an image of the reproduced virtual space as seen from the viewpoint of the left eye. That is, by generating an image paired with image P2a, the display device 1 displays the three-dimensional object BL1 in three dimensions.
In step S1 shown in Fig. 7, the display device 1 composites image P1a and image P2a to generate image P3a. Image P3a is the image displayed on the display unit 32a as the right-eye image. At this time, the display device 1 judges, with the position of the user's right eye as the reference, the front-to-back relationship between the real objects in the imaging range of the imaging unit 40 and the three-dimensional object BL1 present in the virtual space. When a real object overlaps the three-dimensional object BL1, the overlap is adjusted so that whichever is closer to the user's right eye is seen in front.
This overlap adjustment is performed for each range of predetermined size (for example, each pixel) within the region of the image where a real object overlaps the three-dimensional object BL1. For this purpose, the distance from the viewpoint in the real space to the real object is measured for each range of predetermined size on the image. Furthermore, taking into account the position, shape, posture, and so on of the three-dimensional object BL1, the distance from the viewpoint in the virtual space to the three-dimensional object BL1 is calculated for each range of predetermined size on the image.
In the scene of step S1 shown in Fig. 7, the three-dimensional object BL1 is placed in the virtual space at the position corresponding to just above the position where the desk T1 exists in the real space. Furthermore, in the scene of step S1 shown in Fig. 7, the user's hand H1 and the three-dimensional object BL1 are present in substantially the same direction and at substantially the same distance, with the position of the user's right eye as the reference. Therefore, by adjusting the overlap for each range of predetermined size, in the composited image P3a the hand H1 appears in front in the part of the overlapping region of the hand H1 and the three-dimensional object BL1 corresponding to the thumb, while the three-dimensional object BL1 appears in front in the other parts. Furthermore, in the region where the desk T1 and the three-dimensional object BL1 overlap, the three-dimensional object BL1 appears in front.
Through this overlap adjustment, in step S1 shown in Fig. 7 an image P3a is obtained that looks as if the three-dimensional object BL1 were placed on the desk T1 and the user were catching the three-dimensional object BL1 with the hand H1. By the same processing, the display device 1 composites the image captured by the imaging unit 42 with an image of the virtual space seen from the viewpoint of the left eye, and generates the image to be displayed on the display unit 32b as the left-eye image. When generating the left-eye image, the overlap of real objects and the three-dimensional object BL1 is adjusted with the position of the user's left eye as the reference.
The display device 1 displays the composite images generated as described above on the display units 32a and 32b. As a result, the user can see a scene that looks as if the three-dimensional object BL1 were placed on the desk T1 and the user were catching the three-dimensional object BL1 with his or her own hand H1.
In the scene of step S1 shown in Fig. 7, the user moves the hand H1 in the direction of arrow A1. In this case, in the scene of step S2 shown in Fig. 7, the image acquired by the imaging unit 40 changes to image P1b, in which the position of the hand H1 has moved to the right. Furthermore, the display device 1 judges the movement of the hand H1 to be an operation of moving it to the right while holding the three-dimensional object, and moves the position of the three-dimensional object in the virtual space to the right according to the operation. The movement of the three-dimensional object in the virtual space is reflected in the virtual space data 24d. As a result, the right-eye image generated from the virtual space data 24d and the object data 24b changes to image P2b, in which the position of the three-dimensional object BL1 has moved to the right. The detection of operations by the display device 1 is described in detail later.
The display device 1 composites image P1b and image P2b to generate the right-eye image P3b. Compared with image P3a, image P3b is an image in which the user is catching the three-dimensional object BL1 with the hand H1 at a position further to the right on the desk T1. The display device 1 likewise generates the left-eye composite image. The display device 1 then displays the composite images generated as described above on the display units 32a and 32b. As a result, the user can see a scene that looks as if he or she had caught the three-dimensional object BL1 with his or her own hand H1 and moved it to the right.
The composite images displayed as described above are updated at a frequency equal to a common video frame rate (for example, 30 times per second). As a result, the images displayed by the display device 1 reflect the changes of the three-dimensional object BL1 according to the user's operations in substantially real time, and the user can operate the three-dimensional object BL1 without discomfort, as if it actually existed. Furthermore, in the configuration of the present embodiment, the hand H1 of the user operating the three-dimensional object BL1 does not pass between the user's eyes and the display units 32a and 32b, so the user can operate without worrying about the display of the three-dimensional object BL1 being blocked by the hand.
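The fixed-rate update cycle described above can be sketched as the loop below. The 30 Hz rate matches the example in the text; the stage functions are hypothetical placeholders standing in for the real capture, detection, update, and compositing processing.

```python
import time

# Sketch of a 30 Hz update loop: each frame captures camera images, detects
# the hand, updates the virtual space, and composites the two eye images,
# then sleeps off the remainder of the frame interval.

FRAME_INTERVAL = 1.0 / 30.0  # update 30 times per second

def run_frames(n_frames, capture, detect, update, render):
    deadlines_met = 0
    for _ in range(n_frames):
        start = time.monotonic()
        frames = capture()            # images from imaging units 40 and 42
        operation = detect(frames)    # hand movement -> operation on object
        update(operation)             # reflect the change in the virtual space
        render(frames)                # composite right- and left-eye images
        elapsed = time.monotonic() - start
        if elapsed < FRAME_INTERVAL:  # finished early: wait out the frame
            time.sleep(FRAME_INTERVAL - elapsed)
            deadlines_met += 1
    return deadlines_met

# With trivial placeholder stages, every frame meets its 1/30 s deadline:
met = run_frames(3, lambda: None, lambda f: None,
                 lambda op: None, lambda f: None)
print(met)  # 3
```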
Next, the object data 24b and the action data 24c shown in Fig. 6 are described in more detail with reference to Fig. 8 to Fig. 14. Fig. 8 is a diagram showing an example of the information stored in the object data 24b. Fig. 9 to Fig. 14 are diagrams showing examples of the information stored in the action data 24c.
As shown in Fig. 8, the object data 24b stores, for each three-dimensional object, information including its type, shape information, color, and transparency. The type represents the physical property of the three-dimensional object, and takes values such as "rigid body" and "elastic body". The shape information is information representing the shape of the three-dimensional object, for example a set of vertex coordinates of the faces composing the three-dimensional object. The color is the color of the surface of the three-dimensional object. The transparency is the degree to which the three-dimensional object allows light to pass through. The object data 24b can hold information about multiple three-dimensional objects.
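A per-object record of the kind just described can be sketched as a small data structure. The field names and the sample cube are illustrative assumptions, not entries from the patent's tables.

```python
from dataclasses import dataclass

# Sketch of one object data 24b record: type, shape information (face vertex
# coordinates), surface color, and transparency.

@dataclass
class ObjectData:
    obj_type: str        # e.g. "rigid body", "elastic body"
    vertices: list       # set of vertex coordinates of the object's faces
    color: str           # surface color
    transparency: float  # 0.0 = opaque, 1.0 = fully transparent

# An opaque gray unit cube as a sample record:
cube = ObjectData(
    obj_type="rigid body",
    vertices=[(x, y, z) for x in (0, 1) for y in (0, 1) for z in (0, 1)],
    color="gray",
    transparency=0.0,
)
print(cube.obj_type, len(cube.vertices))  # rigid body 8
```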
In the examples shown in Fig. 9 to Fig. 14, the action data 24c stores, for each type of three-dimensional object, information about the change to apply when a pressing operation is detected. As shown in Fig. 9, when the type of the three-dimensional object is "rigid body", the change on detecting a pressing operation differs depending on whether there is a fulcrum, whether there is an obstacle in the pressing direction, and the pressing speed. The obstacle referred to here may be another three-dimensional object or a real object. Whether the pressing speed is fast or slow is judged against a threshold.
When the three-dimensional object has no fulcrum and there is no obstacle in the pressing direction, the three-dimensional object is displayed as moving in the pressing direction according to the amount of pressing. Three-dimensional objects displayed this way are, for example, building blocks, pens, and books. As for the manner of movement, whether the object slides or rolls may be determined from the shape of the three-dimensional object. Whether the three-dimensional object moves together with the pressing object, or moves away from the pressing object as if flicked by it, may be determined from the pressing speed, or from a calculated or set value of the frictional resistance between the three-dimensional object and the bottom surface.
When the three-dimensional object has no fulcrum and there is a fixed obstacle in the pressing direction, the three-dimensional object is displayed as moving in the pressing direction according to the amount of pressing and stopping when it contacts the obstacle. Three-dimensional objects displayed this way are, for example, building blocks, pens, and books. When the pressing speed is fast, the three-dimensional object may instead be made to destroy the obstacle and continue moving. When the three-dimensional object contacts the obstacle while moving away from the pressing object as if flicked by it, the three-dimensional object may be made to move in the opposite direction as if rebounding.
When the three-dimensional object has no fulcrum, another unfixed rigid body is in the pressing direction, and the pressing speed is slow, the three-dimensional object is displayed as moving in the pressing direction according to the amount of pressing and, after contacting the other rigid body, moving together with it. When the three-dimensional object has no fulcrum, another unfixed rigid body is in the pressing direction, and the pressing speed is fast, the three-dimensional object is displayed as moving in the pressing direction according to the amount of pressing. Then, after the three-dimensional object contacts the other rigid body, the other rigid body is displayed as being flicked away. After contacting the other rigid body, the three-dimensional object may stop at that point, or may continue moving at a reduced speed. Combinations of the three-dimensional object and the other rigid body displayed this way are, for example, a bowling ball and a pin, or two marbles.
When the three-dimensional object has no fulcrum, another unfixed rigid body is in the pressing direction, but the other rigid body can be passed through, the three-dimensional object is displayed as moving in the pressing direction according to the amount of pressing and, after contacting the other rigid body, passing through it and continuing to move. In reality a rigid body does not pass through another rigid body, but by making such passing-through possible, a brand-new experience can be provided to the user. Combinations of the three-dimensional object and the other rigid body displayed this way are, for example, a bowling ball and a pin, or two marbles. A threshold may be set for the pressing speed so that the three-dimensional object does not pass through the other rigid body when the pressing speed is below the threshold.
When the three-dimensional object has a fulcrum, the three-dimensional object is displayed as rotating around the fulcrum according to the pressing direction and the amount of pressing. The rotation referred to here may be a full 360-degree rotation, or a back-and-forth swing within a predetermined rotation range. Three-dimensional objects displayed this way are, for example, a pendulum, a punching bag, and a windmill.
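The rigid-body rules above are, in effect, a lookup keyed on the detected conditions. A minimal sketch of such a dispatch follows; the condition names and returned labels are illustrative, since the patent expresses these rules as table entries in the action data 24c.

```python
# Sketch of the rigid-body rules as a dispatch on (fulcrum, obstacle in the
# pressing direction, pressing speed). Labels are illustrative summaries of
# the changes described in the text.

def rigid_body_change(has_fulcrum, obstacle, fast_press):
    """obstacle: None, 'fixed', 'rigid' (another unfixed rigid body),
    or 'passable'."""
    if has_fulcrum:
        return "rotate around fulcrum"
    if obstacle is None:
        return "move in pressing direction by pressing amount"
    if obstacle == "fixed":
        return ("destroy obstacle and keep moving" if fast_press
                else "move until contact, then stop")
    if obstacle == "rigid":
        return ("flick the other rigid body away" if fast_press
                else "move together after contact")
    if obstacle == "passable":
        return "pass through the other rigid body"
    raise ValueError(obstacle)

# A fast press into another unfixed rigid body flicks that body away:
print(rigid_body_change(False, "rigid", True))
# flick the other rigid body away
```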
As shown in Fig. 10, when the type of the three-dimensional object is "elastic body", the change of the three-dimensional object on detecting a pressing operation differs depending on its material, whether there is a limit on the amount of change, and the pressing speed. The material referred to here is the assumed material of the three-dimensional object, defined in the object data 24b.
When the material of the three-dimensional object is rubber, there is no limit on the amount of change, and the pressing speed is slow, the three-dimensional object is displayed as deforming in the pressing direction according to the amount of pressing and returning to its original shape when released from the pressed state. When the material of the three-dimensional object is rubber, there is no limit on the amount of change, and the pressing speed is fast, the three-dimensional object is displayed as deforming in the pressing direction according to the amount of pressing and then being flicked away, moving in the pressing direction while returning to its original shape. Three-dimensional objects displayed this way are, for example, a rubber ball or an eraser.
When the material of the three-dimensional object is rubber and the amount of change is limited, the three-dimensional object is displayed as deforming in the pressing direction according to the amount of pressing up to the deformable range, and if the pressing operation continues after that, moving in the pressing direction while returning to its original shape. Three-dimensional objects displayed this way are, for example, a rubber ball or an eraser.
When the material of the three-dimensional object is metal, the three-dimensional object is displayed as deforming in the pressing direction according to the amount of pressing up to the deformable range, and when released from the pressed state, repeatedly alternating between its original shape and the deformed shape (vibrating). When pressed in a direction other than a deformable direction, the three-dimensional object moves in the same way as a rigid body. Three-dimensional objects displayed this way are, for example, a leaf spring or a coil spring.
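The vibration on release described for the metal case can be modeled, for display purposes, as a damped oscillation. The damping coefficient and frequency below are illustrative assumptions; the patent only specifies that the object alternates between the deformed and original shapes after release.

```python
import math

# Sketch of the released-metal-spring display: after release, the shown
# deformation oscillates between the deformed and original shapes and dies
# away over time (damped cosine). Parameters are illustrative assumptions.

def deformation_at(t_s, initial_m, damping=3.0, freq_hz=8.0):
    """Displayed deformation at t_s seconds after release."""
    return initial_m * math.exp(-damping * t_s) * math.cos(2 * math.pi * freq_hz * t_s)

d0 = deformation_at(0.0, 0.02)   # fully deformed at the moment of release
d1 = deformation_at(1.0, 0.02)   # nearly settled one second later
print(d0, abs(d1) < abs(d0))  # 0.02 True
```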
As shown in Fig. 11, when the type of the three-dimensional object is "plastic body", the three-dimensional object is displayed as denting at the pressed position, with its overall shape changing. A three-dimensional object displayed this way is, for example, clay.
As shown in Fig. 12, when the type of the three-dimensional object is "liquid", the change on detecting a pressing operation differs according to the pressing speed. When the pressing speed is slow, the pressing object is displayed as being immersed in the three-dimensional object, that is, soaking in the liquid. When the pressing speed is medium, the pressing object is displayed as soaking in the liquid with ripples spreading across it. When the pressing speed is fast, the pressing object is displayed as soaking in the liquid with spray splashing up. A three-dimensional object displayed this way is, for example, water in a cup.
In addition, as shown in Figure 13, when the type of the three-dimensional object is "gas", the change shown when a pressing operation is detected depends on the pressing speed. When the pressing speed is slow, the three-dimensional object (that is, the gas) is shown being blocked by the pressing object and drifting around it. When the pressing speed is medium, the gas is shown being scattered by the pressing object. When the pressing speed is fast, eddies are shown forming in the gas on the rear side of the pressing object's direction of travel due to turbulence. Such a three-dimensional object is, for example, smoke.
In addition, as shown in Figure 14, when the type of the three-dimensional object is "aggregate", the change shown when a pressing operation is detected depends on how the aggregate's elements are bound. When the elements of the aggregate are not bound, the three-dimensional object is shown denting at the pressed position, with the overall shape of the aggregate changing. Such a three-dimensional object is, for example, sand or granulated sugar.
When the elements of the aggregate are bound, the three-dimensional object is likewise shown denting at the pressed position, with the overall shape of the aggregate changing; in addition, elements other than those at the pressed position are shown being pulled along by the elements at the pressed position. Such a three-dimensional object is, for example, a chain.
When the elements of the aggregate are not bound but attraction or repulsion acts between them and the pressing object, the three-dimensional object is shown moving without touching the pressing object. When attraction acts between it and the pressing object, the three-dimensional object is drawn toward the pressing object once it comes within a predetermined distance, even without contact. When repulsion acts between it and the pressing object, the three-dimensional object moves away from the pressing object once it comes within a predetermined distance, even without contact. The combination of such a three-dimensional object and pressing object is, for example, iron powder and a magnet.
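The attraction/repulsion rule above can be sketched as a simple distance-gated force, in the style of the iron-powder-and-magnet example. This is a one-dimensional illustration with invented names and an arbitrary step size, not the patent's actual implementation:

```python
def apply_field_force(obj_pos, presser_pos, max_range, attract=True, step=0.1):
    """Move obj_pos toward (attraction) or away from (repulsion) the
    pressing object while it is within max_range, without any contact."""
    dx = presser_pos - obj_pos
    dist = abs(dx)
    if dist == 0 or dist > max_range:
        return obj_pos                 # out of range: no effect
    direction = 1 if dx > 0 else -1    # toward the presser
    if not attract:
        direction = -direction         # repulsion: away from the presser
    return obj_pos + direction * step

# Attraction draws the object toward the presser while within range;
# reversing the force repels it instead.
attracted = apply_field_force(0.0, 1.0, max_range=2.0, attract=True)
repelled = apply_field_force(0.0, 1.0, max_range=2.0, attract=False)
```

Outside the predetermined distance the object is left unchanged, matching the text's "within a predetermined distance" condition.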
In this way, the three-dimensional object is changed based on the information stored in the object data 24b and the information stored in the action data 24c, so that it changes in a variety of ways according to the pressing operation. The information stored in the object data 24b and the action data 24c is not limited to the above examples and may be modified as appropriate for the application. For example, the way the three-dimensional object changes may be set according to the type and size of the pressing object, or the size of the contact area between the pressing object and the three-dimensional object.
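The lookup described above — the object's type from the object data (24b) selecting a change rule from the action data (24c) — can be sketched as a small dispatch table. The entries and their dict format are illustrative assumptions, not the patent's stored format:

```python
# Illustrative per-type rules: each maps a push amount to a dent depth
# and a translation, standing in for the stored change definitions.
ACTION_RULES = {
    "elastic": lambda push: {"dent": push, "move": 0},  # springs back later
    "rigid":   lambda push: {"dent": 0, "move": push},  # translates instead
    "plastic": lambda push: {"dent": push, "move": 0},  # dent is permanent
}

def change_for(object_type, push_amount):
    """Select the change to apply from the object's type, as the object
    data / action data pairing in the text suggests."""
    rule = ACTION_RULES.get(object_type)
    if rule is None:
        return {"dent": 0, "move": 0}  # unknown type: leave unchanged
    return rule(push_amount)
```

Extending the table (for example, keying also on the pressing object's type or contact area, as the text allows) only requires adding entries, not changing the dispatch.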
Next, the detection of an operation of pressing a three-dimensional object, and the changing of the three-dimensional object according to the detected operation, are described with reference to Figures 15 and 16. In the following description, the space seen by the user wearing the display device 1 is called the display space. The display device 1 can display real objects and three-dimensional objects stereoscopically (in three dimensions) in the display space by providing images corresponding to the user's right and left eyes. The display device 1 associates, according to a predetermined rule, the virtual space reproduced from the virtual space data 24d with the real space photographed by the photography units 40 and 42, and displays the space in which these two spaces overlap as the display space.
Figures 15 and 16 illustrate the detection of an operation of pressing a three-dimensional object and the change of the three-dimensional object according to the detected operation. In step S11 shown in Figure 15, the display device 1 stereoscopically displays a three-dimensional object OB1 in the display space 50. The three-dimensional object OB1 is, for example, an object imitating a ball. In step S11, a bottom surface B1 supporting the three-dimensional object OB1 is also displayed.
In step S12, the user places a finger F1 at a position in contact with the three-dimensional object OB1 and holds it still. When the display device 1 detects a real object in the display space and the object remains in contact with the three-dimensional object OB1 for a predetermined time or longer, it judges that the three-dimensional object OB1 has been selected as the operation target. The display device 1 then changes the display mode of the three-dimensional object OB1 or the like to notify the user that it has been selected as the operation target.
Whether the object is in contact with the three-dimensional object OB1 is judged based on the position of the object in the real space and on the shape, posture, position in the virtual space, and so on of the three-dimensional object OB1. The comparison between the position in the real space and the position in the virtual space may be performed by converting one side's spatial position into the other's according to the aforementioned predetermined rule, or by converting both sides' spatial positions into positions in a common comparison space. When a finger is detected as the real object, the position of the fingertip may be treated as the position of the object. Since people use their fingertips when manipulating something, treating the fingertip position as the object position provides the user with a more natural feel of operation.
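The contact judgment above can be sketched as a coordinate conversion followed by a bounds test. The uniform scale factor standing in for the predefined real-to-virtual rule, and the spherical object bound, are illustrative assumptions:

```python
def to_virtual(real_pos, scale=1.0):
    # The predefined real-to-virtual conversion rule; a uniform scale
    # factor stands in for whatever mapping the device actually applies.
    return tuple(c * scale for c in real_pos)

def touches(fingertip_real, obj_center, obj_radius, scale=1.0):
    """Contact test against a spherical object, using only the fingertip
    position as the object position, as the text suggests."""
    x, y, z = to_virtual(fingertip_real, scale)
    cx, cy, cz = obj_center
    d2 = (x - cx) ** 2 + (y - cy) ** 2 + (z - cz) ** 2
    return d2 <= obj_radius ** 2
```

Converting both positions into one comparison space, as the text also permits, would simply apply `to_virtual` (or its inverse) to both operands before the distance test.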
The notification that the object has been selected as the operation target is realized, for example, by changing the overall color of the three-dimensional object OB1 or by changing the color of its surface near the position in contact with the object. The display device 1 may notify by sound or vibration instead of, or in addition to, this visual notification.
In this way, when the display device 1 detects that a real object such as a finger has remained in contact with the three-dimensional object OB1 for a predetermined time or longer, it judges that the three-dimensional object OB1 has been selected as the operation target. By adding the condition that the contact state continue for a predetermined time or longer, the possibility of unintentionally selecting a three-dimensional object while moving a finger to operate another three-dimensional object can be reduced, for example.
After the three-dimensional object OB1 has been selected as the operation target, the user pushes the finger F1 into the inside of the three-dimensional object OB1 to press it, as shown in step S13. When the display device 1 detects an operation in which an object enters a three-dimensional object selected as the operation target, it changes the three-dimensional object according to the operation. How the three-dimensional object changes is determined by the type of the three-dimensional object defined in the object data 24b and by the change rule defined for that type in the action data 24c.
For example, suppose the object data 24b defines the three-dimensional object OB1 as an elastic body, and the action data 24c defines that an elastic body, when pressed, deforms in the pressing direction according to the pressing amount. In this case, as shown in step S14, the display device 1 changes the three-dimensional object OB1 so that the part entered by the finger F1 is pressed in and dented.
Alternatively, suppose the object data 24b defines the three-dimensional object OB1 as a rigid body, and the action data 24c defines that a rigid body, when pressed, moves in the pressing direction according to the pressing amount. In this case, as shown in step S15 of Figure 16, the display device 1 moves the three-dimensional object OB1 in the direction of travel of the finger F1, as if it were being pushed by the finger F1. In step S15 of Figure 16, since the three-dimensional object OB1 is supported by the bottom surface B1, it moves according to the component of the force applied to the rigid body that is horizontal to the bottom surface B1.
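The floor-supported movement in step S15 amounts to keeping only the component of the finger's displacement parallel to the bottom surface B1. A minimal sketch, assuming y is the vertical axis (an assumption; the patent does not name axes):

```python
def floor_constrained_motion(finger_delta):
    """Project the finger's displacement onto the plane of the supporting
    surface B1; the vertical component (y, by assumption) is absorbed by
    the floor, so only horizontal motion remains."""
    dx, dy, dz = finger_delta
    return (dx, 0.0, dz)

# A finger pushing diagonally downward slides the ball horizontally.
```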
In this way, when an operation of pressing a three-dimensional object is detected, the three-dimensional object OB1 is changed according to the object data 24b and the action data 24c, so that the three-dimensional object can change in various ways according to the operation. A pressing operation is one used in a wide range of real-world situations, so by detecting the operation of pressing the three-dimensional object OB1 and performing the corresponding processing, intuitive and highly convenient operability can be realized.
The object used to operate a three-dimensional object is not limited to a finger; it may be a hand, a foot, a stick, a tool, or the like. The way the three-dimensional object changes in response to a pressing operation may follow real physical laws, or it may be one that is impossible in reality.
The display device 1 may also limit the space in which it detects operations on three-dimensional objects to the operable range 51. The operable range 51 is, for example, the range that the hands of the user wearing the display device 1 can reach. By limiting the space in which operations on three-dimensional objects are detected in this way, the computational load the display device 1 incurs for detecting operations can be reduced.
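The operable-range limitation can be sketched as a simple reach test performed before any further operation detection. Modeling the range as a sphere and the radius value are assumptions for illustration:

```python
def in_operable_range(point, origin=(0.0, 0.0, 0.0), reach=0.8):
    """True if point lies within the operable range 51, modeled here as a
    sphere of radius `reach` (an assumed value) around the device."""
    d2 = sum((p - o) ** 2 for p, o in zip(point, origin))
    return d2 <= reach ** 2

# Detected objects outside the sphere are simply not processed, which is
# what reduces the operation-detection load.
```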
Next, a first embodiment of the processing procedure executed by the display device 1 for the operation of pressing a three-dimensional object is described with reference to Figures 17 and 18. Figure 17 is a flowchart showing the processing procedure of the contact detection processing for a three-dimensional object. The processing procedure shown in Figure 17 is realized by the control unit 22 executing the control program 24a.
As shown in Figure 17, first, in step SA01, the control unit 22 composites an image of the virtual space including the three-dimensional object with an image of the real space.
Then, in step SA02, the control unit 22 judges whether a predetermined object has been detected by the detection unit 44 (that is, the photography units 40 and 42). The predetermined object is, for example, the user's finger. When the predetermined object is not detected (No in step SA02), in step SA08 the control unit 22 judges whether the end of operation has been detected.
The end of operation is detected, for example, when a predetermined operation is performed on the operation unit 13. When the end of operation is detected (Yes in step SA08), the control unit 22 ends the contact detection processing. When the end of operation is not detected (No in step SA08), the control unit 22 executes again from step SA02.
When the predetermined object is detected (Yes in step SA02), in step SA03 the control unit 22 determines the type of the predetermined object. The type is determined, for example, from the size, shape, color, and so on of the object in the images photographed by the photography units 40 and 42. Then, in step SA04, the control unit 22 looks for a three-dimensional object in contact with the predetermined object. When there is no three-dimensional object in contact with the predetermined object (No in step SA05), the control unit 22 proceeds to step SA08.
When a three-dimensional object in contact with the predetermined object is found (Yes in step SA05), in step SA06 the control unit 22 determines the type of that three-dimensional object based on the object data 24b. Then, in step SA07, the control unit 22 executes the operation detection processing described later. After that, the control unit 22 proceeds to step SA08.
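The SA02–SA08 loop just described can be condensed into a frame-driven sketch. The callback names and the per-frame tuple format are illustrative stand-ins for the detection unit 44 and the operation detection processing:

```python
def contact_detection(frames, find_touched, on_contact):
    """frames: iterable of (detected_object, end_requested) pairs,
    one per pass through the SA02-SA08 loop."""
    for obj, end_requested in frames:
        if obj is not None:               # SA02: predetermined object seen?
            touched = find_touched(obj)   # SA03-SA04: type + contact search
            if touched is not None:       # SA05: a contacted object exists
                on_contact(obj, touched)  # SA06-SA07: run operation detection
        if end_requested:                 # SA08: end of operation detected
            break

hits = []
contact_detection(
    frames=[("finger", False), (None, False), ("finger", True)],
    find_touched=lambda obj: "OB1",       # pretend OB1 is always contacted
    on_contact=lambda obj, t: hits.append((obj, t)),
)
```

Frames with no detected object fall straight through to the end-of-operation check, exactly as the No branch of SA02 does.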
Figure 18 is a flowchart showing the processing procedure of the operation detection processing. The processing procedure shown in Figure 18 is realized by the control unit 22 executing the control program 24a.
As shown in Figure 18, first, in step SB01, the control unit 22 obtains the contact time of the predetermined object with the three-dimensional object. Then, in step SB02, the control unit 22 judges whether the predetermined object has moved to the inside of the three-dimensional object. When the predetermined object has not moved to the inside of the three-dimensional object (No in step SB02), the control unit 22 executes again from step SB01.
When the predetermined object has moved to the inside of the three-dimensional object (Yes in step SB02), in step SB03 the control unit 22 judges whether the contact time is equal to or longer than the predetermined time. When the contact time is shorter than the predetermined time (No in step SB03), the control unit 22 judges that the three-dimensional object is not the operation target and ends the operation detection processing.
When the contact time is equal to or longer than the predetermined time (Yes in step SB03), in step SB04 the control unit 22 calculates the speed of the predetermined object. Then, in step SB05, the control unit 22 changes the three-dimensional object according to the type, position, and speed of the predetermined object, the type of the three-dimensional object, and so on. The specific way of changing is determined according to the action data 24c.
Then, in step SB06, the control unit 22 judges whether the predetermined object has moved to the outside of the three-dimensional object. When the predetermined object has not moved to the outside of the three-dimensional object, that is, when the pressing operation continues (No in step SB06), the control unit 22 executes again from step SB04.
When the predetermined object has moved to the outside of the three-dimensional object, that is, when the three-dimensional object has been released (Yes in step SB06), in step SB07 the control unit 22 judges whether the change of the three-dimensional object continues. For example, if the action data 24c defines that vibration continues for a predetermined time after release, the control unit 22 judges that the change of the three-dimensional object continues.
When the change of the three-dimensional object continues (Yes in step SB07), in step SB08 the control unit 22 changes the three-dimensional object, and then executes again from step SB07. When the change of the three-dimensional object does not continue (No in step SB07), the control unit 22 ends the operation detection processing.
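The core of the SB01–SB06 flow — reject a press whose contact was too brief, otherwise apply a change step on every frame the object stays inside — can be sketched as follows. The per-frame sample format and the dwell constant are illustrative assumptions:

```python
def operation_detection(samples, min_contact, apply_change):
    """samples: per-frame (contact_time, inside_object) readings.
    Returns the number of change steps applied, or 0 if rejected."""
    changed = 0
    for contact_time, inside in samples:
        if not inside:                  # SB02: not inside yet, keep waiting
            continue
        if contact_time < min_contact:  # SB03: contact too brief
            return 0                    # -> not the operation target
        apply_change()                  # SB04-SB05: change per frame
        changed += 1                    # SB06 "No": loop again
    return changed                      # object left -> press released
```

The end of the sample stream stands in for the Yes branch of SB06; the post-release continuation of SB07–SB08 (for example, residual vibration) is omitted here for brevity.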
As described above, in the first embodiment, the three-dimensional object changes in various ways according to the pressing operation, so a highly convenient operating method can be provided to the user.
A second embodiment of the processing procedure for the operation of pressing a three-dimensional object is described next. The contact detection processing in the second embodiment is the same as that in the first embodiment. Therefore, for the second embodiment, description overlapping with the first embodiment is omitted, and mainly the operation detection processing is described.
First, the detection of an operation of pressing a three-dimensional object and the changing of the three-dimensional object according to the detected operation are described with reference to Figure 19. Figure 19 illustrates the detection of an operation of pressing a three-dimensional object and the change of the three-dimensional object according to the detected operation. In step S21 shown in Figure 19, the user brings a finger F1 into contact with the three-dimensional object OB1, and in step S22 the user pushes the finger F1 into the inside of the three-dimensional object OB1.
When the display device 1 detects a real object in the display space, and the state in which the object, after contacting the three-dimensional object OB1, moves toward its inside continues for a predetermined time or longer, it judges that the three-dimensional object OB1 has been selected as the operation target. The display device 1 then changes the display mode of the three-dimensional object OB1 or the like to notify the user that it has been selected as the operation target. Furthermore, as shown in step S23, the display device 1 changes the three-dimensional object OB1 according to the operation performed by the finger F1 from the contact detection onward, as if it had already been selected as the target of the pressing operation at the stage of step S21.
In this way, by making the pressing operation detectable once contact between the object and the three-dimensional object has been detected, without requiring the object to then remain in place, the user can quickly begin the operation of pressing the three-dimensional object. In addition, by adding the condition that the state in which the object moves toward the inside of the three-dimensional object OB1 after contact continue for a predetermined time or longer, the possibility of unintentionally selecting a three-dimensional object while moving a finger to operate another three-dimensional object can be reduced, for example.
Next, the processing procedure of the operation detection processing in the second embodiment is described with reference to Figure 20. Figure 20 is a flowchart showing the processing procedure of the operation detection processing. The processing procedure shown in Figure 20 is realized by the control unit 22 executing the control program 24a. The processing procedure of the contact detection processing is the same as the procedure shown in Figure 17.
As shown in Figure 20, first, in step SC01, the control unit 22 judges whether the predetermined object has moved to the inside of the three-dimensional object. When the predetermined object has not moved to the inside of the three-dimensional object (No in step SC01), the control unit 22 judges that the three-dimensional object is not the operation target and ends the operation detection processing.
When the predetermined object has moved to the inside of the three-dimensional object (Yes in step SC01), in step SC02 the control unit 22 judges whether the time elapsed since the contact was detected is equal to or longer than the predetermined time. When the elapsed time is shorter than the predetermined time (No in step SC02), the control unit 22 executes again from step SC01.
When the elapsed time is equal to or longer than the predetermined time (Yes in step SC02), in step SC03 the control unit 22 calculates the speed of the predetermined object. Then, in step SC04, the control unit 22 changes the three-dimensional object according to the type, position, and speed of the predetermined object, the type of the three-dimensional object, and so on. The specific way of changing is determined according to the action data 24c.
Then, in step SC05, the control unit 22 judges whether the predetermined object has moved to the outside of the three-dimensional object. When the predetermined object has not moved to the outside of the three-dimensional object, that is, when the pressing operation continues (No in step SC05), the control unit 22 executes again from step SC03.
When the predetermined object has moved to the outside of the three-dimensional object, that is, when the three-dimensional object has been released (Yes in step SC05), in step SC06 the control unit 22 judges whether the change of the three-dimensional object continues. For example, if the action data 24c defines that vibration continues for a predetermined time after release, the control unit 22 judges that the change of the three-dimensional object continues.
When the change of the three-dimensional object continues (Yes in step SC06), in step SC07 the control unit 22 changes the three-dimensional object, and then executes again from step SC06. When the change of the three-dimensional object does not continue (No in step SC06), the control unit 22 ends the operation detection processing.
As described above, in the second embodiment, the pressing operation is recognized even if the state in which an object such as a finger is in contact with the three-dimensional object does not continue for the predetermined time or longer, so the user can quickly begin the operation of pressing the three-dimensional object.
A third embodiment of the processing procedure for the operation of pressing a three-dimensional object is described next. The contact detection processing in the third embodiment is the same as that in the first embodiment. Therefore, for the third embodiment, description overlapping with the first embodiment is omitted, and mainly the operation detection processing is described.
First, the detection of an operation of pressing a three-dimensional object and the changing of the three-dimensional object according to the detected operation are described with reference to Figures 21 and 22. Figures 21 and 22 illustrate the detection of an operation of pressing a three-dimensional object and the change of the three-dimensional object according to the detected operation. In step S31 shown in Figure 21, the three-dimensional object OB1 is stereoscopically displayed in the display space, and the user has brought a finger F1 into contact with the three-dimensional object OB1.
Here, the user pushes the finger F1 into the inside of the three-dimensional object OB1. When the display device 1 detects that the object in contact with the three-dimensional object OB1 has moved to its inside, it changes the three-dimensional object OB1 from that point in time according to the operation performed by the finger F1, as shown in step S32. In the example shown in Figure 21, in step S32 the three-dimensional object OB1 starts moving in concert with the movement of the finger F1.
Then, as shown in step S33, at the stage where the movement of the finger F1 toward the inside of the three-dimensional object OB1 has continued for a predetermined time or longer, the display device 1 determines the three-dimensional object OB1 as the operation target. The display device 1 then changes the display mode of the three-dimensional object OB1 or the like to notify the user that it has been determined as the operation target. After that, while the finger F1 is detected moving toward the inside of the three-dimensional object OB1, the display device 1 continues to change the three-dimensional object OB1.
As shown in step S34 of Figure 22, when the movement of the finger F1 toward the inside of the three-dimensional object OB1 ceases to be detected before the predetermined time has elapsed, the display device 1 applies to the three-dimensional object OB1 a change opposite to the change applied up to that point. As a result, the three-dimensional object OB1 is displayed at the same position and in the same state as at the stage of step S31. The speed at which the reverse change is applied to the three-dimensional object OB1 may be faster than the speed at which the change was applied up to that point; that is, the three-dimensional object OB1 may be made to change in reverse as if being rewound at high speed.
In this way, by applying a change to the three-dimensional object from the stage at which the object is detected entering its inside, the user can recognize that the three-dimensional object is being selected before the selection is determined. As a result, the user can know in advance whether the intended three-dimensional object has been selected. When an unintended three-dimensional object has been selected, the user can return it to its original state by ending the operation before the predetermined time elapses.
Until the finger F1 has moved toward the inside of the three-dimensional object OB1 for the predetermined time or longer, the three-dimensional object being changed may be displayed in a mode different from both its usual state and its state after being determined as the operation target (for example, translucently). By changing the display mode in this way, the user can easily judge the state of the three-dimensional object.
Next, the processing procedure of the operation detection processing in the third embodiment is described with reference to Figure 23. Figure 23 is a flowchart showing the processing procedure of the operation detection processing. The processing procedure shown in Figure 23 is realized by the control unit 22 executing the control program 24a. The processing procedure of the contact detection processing is the same as the procedure shown in Figure 17.
As shown in Figure 23, first, in step SD01, the control unit 22 judges whether the predetermined object has moved to the inside of the three-dimensional object. When the predetermined object has not moved to the inside of the three-dimensional object (No in step SD01), the control unit 22 judges that the three-dimensional object is not the operation target and ends the operation detection processing.
When the predetermined object has moved to the inside of the three-dimensional object (Yes in step SD01), in step SD02 the control unit 22 calculates the speed of the predetermined object. Then, in step SD03, the control unit 22 changes the three-dimensional object according to the type, position, and speed of the predetermined object, the type of the three-dimensional object, and so on. The specific way of changing is determined according to the action data 24c.
Then, in step SD04, the control unit 22 judges whether the time elapsed since the contact was detected is equal to or longer than the predetermined time. When the elapsed time is shorter than the predetermined time, that is, when the three-dimensional object has not yet been determined as the target of the pressing operation (No in step SD04), in step SD05 the control unit 22 judges whether the predetermined object continues to move toward the inside of the three-dimensional object.
When the predetermined object continues to move toward the inside of the three-dimensional object (Yes in step SD05), the control unit 22 executes again from step SD02. When it does not continue to move toward the inside of the three-dimensional object (No in step SD05), in step SD06 the control unit 22 applies the reverse change to the three-dimensional object to return it to its original state. The control unit 22 then ends the operation detection processing.
When the time elapsed since the contact was detected is equal to or longer than the predetermined time (Yes in step SD04), in step SD07 the control unit 22 judges whether the predetermined object has moved to the outside of the three-dimensional object. When the predetermined object has not moved to the outside of the three-dimensional object, that is, when the pressing operation continues (No in step SD07), the control unit 22 executes again from step SD02.
When the predetermined object has moved to the outside of the three-dimensional object, that is, when the three-dimensional object has been released (Yes in step SD07), in step SD08 the control unit 22 judges whether the change of the three-dimensional object continues. For example, if the action data 24c defines that vibration continues for a predetermined time after release, the control unit 22 judges that the change of the three-dimensional object continues.
When the change of the three-dimensional object continues (Yes in step SD08), in step SD09 the control unit 22 changes the three-dimensional object, and then executes again from step SD08. When the change of the three-dimensional object does not continue (No in step SD08), the control unit 22 ends the operation detection processing.
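The distinctive part of the third embodiment's flow — change immediately on entry, but roll the changes back in reverse if the finger stops advancing before the dwell time (SD04–SD06) — can be sketched as follows. The per-frame event format, the step indices used as change records, and the return tuple are illustrative assumptions:

```python
def third_embodiment_press(events, dwell):
    """events: per-frame (elapsed_time, still_advancing) readings.
    Returns (confirmed, applied_steps, reverted_steps)."""
    applied, reverted = [], []
    for i, (elapsed, advancing) in enumerate(events):
        if elapsed < dwell and not advancing:
            # SD05 "No": the finger stopped before the dwell time, so the
            # changes are undone in reverse order (possibly faster than
            # they were applied, per the text's "rewound" description).
            reverted = list(reversed(applied))
            return False, applied, reverted
        applied.append(i)   # SD02-SD03 / SD07: apply one change step
    return True, applied, reverted
```

Once the dwell time has passed, steps keep accumulating and are never rolled back, matching the Yes branch of SD04.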
As described above, in the third embodiment, the three-dimensional object changes according to the operation from the moment the pressing operation is detected, so the user can easily identify which three-dimensional object is the target of the pressing operation.
Although the operation of pressing a three-dimensional object has been described as an operation related to three-dimensional objects, the operations on three-dimensional objects that the display device 1 detects are not limited to pressing operations. The display device 1 can also detect operations performed by the user grasping a three-dimensional object. Operations performed by grasping a three-dimensional object are described below.
The detection of an operation performed by grasping a three-dimensional object is described with reference to Figure 24. Figure 24 illustrates the detection of an operation performed by grasping a three-dimensional object. In step S41 shown in Figure 24, the three-dimensional object OB1 is stereoscopically displayed in the display space 50.
Suppose here that the user wants to grasp the three-dimensional object OB1 and perform some operation with it. To grasp the three-dimensional object OB1 and perform some operation, it first has to be selected as the operation target. As shown in step S42, to select the three-dimensional object OB1, the user moves fingers F1 and F2 so that the three-dimensional object OB1 is positioned between them, and maintains that state for a predetermined time or longer.
When the display device 1 detects two real objects in the display space, and the state in which the three-dimensional object OB1 is located between the two objects continues for a predetermined time or longer, it judges that the three-dimensional object OB1 has been selected as the operation target and puts the three-dimensional object OB1 into the selected state. The display device 1 then changes the display mode of the three-dimensional object OB1 or the like to notify the user that it has been selected as the operation target.
Whether the three-dimensional object OB1 is located between the two objects is judged based on the positions of the two objects in the real space and on the shape, posture, position in the virtual space, and so on of the three-dimensional object OB1. The comparison between the positions in the real space and the position in the virtual space may be performed by converting one side's spatial position into the other's according to the aforementioned predetermined rule, or by converting both sides' spatial positions into positions in a common comparison space. When fingers are detected as the real objects, the positions of the fingertips may be treated as the positions of the objects.
In this way, the display device 1 judges that the three-dimensional object OB1 has been selected when it detects that the state in which the three-dimensional object OB1 is located between real objects such as fingers has continued for a predetermined time or longer. The operation of placing fingers so that the three-dimensional object OB1 is held between them resembles the motion a person makes when grasping something to select a real-world object, so this operation is intuitive and easy to understand as an operation for selecting a three-dimensional object. In addition, by adding the condition that the state continue for a predetermined time or longer, the possibility of unintentionally selecting a three-dimensional object while moving fingers to operate another three-dimensional object can be reduced, for example.
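The pinch-selection condition just described — the object staying between the two real objects for a predetermined time — can be sketched as a dwell counter over per-frame finger positions. The one-dimensional positions and frame-counted dwell are illustrative simplifications:

```python
def between(f1, f2, obj_pos):
    """True if the object position lies between the two finger positions."""
    lo, hi = min(f1, f2), max(f1, f2)
    return lo <= obj_pos <= hi

def pinch_selected(frames, obj_pos, dwell_frames):
    """frames: per-frame (finger1_pos, finger2_pos) samples.
    The object becomes selected once it has stayed between the fingers
    for dwell_frames consecutive frames."""
    run = 0
    for f1, f2 in frames:
        run = run + 1 if between(f1, f2, obj_pos) else 0  # reset on a miss
        if run >= dwell_frames:
            return True
    return False
```

Resetting the counter whenever the object slips outside the fingers is what implements the "continues for a predetermined time" condition, filtering out fingers merely passing by.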
After judging that the three-dimensional object OB1 has entered the selected state, the display device 1 applies changes such as movement, deformation, and disappearance to the three-dimensional object OB1 according to the movement of the fingers F1 and F2.
Next, a first embodiment of the processing procedure executed by the display device 1 for operations performed by grasping a three-dimensional object is described with reference to Figures 25 and 26. Figure 25 is a flowchart showing the processing procedure of the selection detection processing for a three-dimensional object. The processing procedure shown in Figure 25 is realized by the control unit 22 executing the control program 24a.
As shown in Figure 25, first, in step SE01, the control unit 22 composites an image of the virtual space including the three-dimensional object with an image of the real space and displays it.
Then, in step SE02, the control unit 22 judges whether a first object and a second object have been detected by the detection unit 44 (that is, the photography units 40 and 42). The first object and the second object are real objects, for example, the user's fingers. When the first object and the second object are not detected (No in step SE02), in step SE10 the control unit 22 judges whether the end of operation has been detected.
The end of operation is detected, for example, when a predetermined operation is performed on the operation unit 13. When the end of operation is detected (Yes in step SE10), the control unit 22 ends the selection detection processing. When the end of operation is not detected (No in step SE10), the control unit 22 executes again from step SE02.
When the first object and the second object are detected (Yes in step SE02), in step SE03 the control unit 22 searches the displayed three-dimensional objects for one displayed between the first object and the second object. When no matching three-dimensional object is found (No in step SE04), the control unit 22 proceeds to step SE10.

When a three-dimensional object displayed between the first object and the second object is found (Yes in step SE04), in step SE05 the control unit 22 obtains the time for which the three-dimensional object has been located between the first object and the second object. When the obtained time has not reached the predetermined time (No in step SE06), the control unit 22 proceeds to step SE10.

When the obtained time is equal to or longer than the predetermined time (Yes in step SE06), in step SE07 the control unit 22 calculates the distance between the first object and the second object. Further, in step SE08, the control unit 22 sets the three-dimensional object displayed between the first object and the second object to the selected state. Then, in step SE09, the control unit 22 performs the operation detection processing described later, in which the three-dimensional object in the selected state is changed in accordance with the detected operation. After the operation detection processing ends, the control unit 22 proceeds to step SE10.
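The selection test of steps SE02 through SE08 reduces to a decision on two measurements: whether a three-dimensional object lies between the two detected objects, and how long that condition has persisted. A minimal sketch in Python, assuming a one-second predetermined time; the names and states here are illustrative stand-ins, not identifiers from the patent:

```python
# Sketch of Figure 25's selection test (steps SE02-SE08): a three-dimensional
# object becomes "selected" only after it has stayed between the two detected
# real objects for at least the predetermined time. HOLD_TIME is assumed.

HOLD_TIME = 1.0  # predetermined time in seconds (assumed value)

def check_selection(object_between, held_seconds):
    """Return the state to assign to the candidate three-dimensional object.

    object_between -- True if a 3D object is displayed between the
                      first and second real objects (step SE04)
    held_seconds   -- how long that condition has persisted (step SE05)
    """
    if not object_between:          # SE04 "No": nothing to select
        return "none"
    if held_seconds < HOLD_TIME:    # SE06 "No": dwell too short
        return "candidate"
    return "selected"               # SE08: set to the selected state

# Fingers passing an object briefly do not select it:
print(check_selection(True, 0.3))   # candidate
print(check_selection(True, 1.5))   # selected
```

The dwell-time check is what prevents accidental selection while the fingers are merely moving past the object toward another one.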
Figure 26 is a flowchart showing the processing sequence of the operation detection processing. The processing sequence shown in Figure 26 is realized by the control unit 22 executing the control program 24a.
As shown in Figure 26, first, in step SF01, the control unit 22 calculates the distance between the first object and the second object. Then, in step SF02, the control unit 22 judges whether the distance between the first object and the second object has remained approximately fixed since the operation detection processing started. "Approximately fixed" means, for example, that the change in the distance between the first object and the second object, compared with the distance at the start of the operation detection processing, stays within a preset range (for example, within ±10% of the maximum change in distance that occurs when the first object and the second object move at normal speed). When the distance between the first object and the second object keeps decreasing after the operation detection processing starts (that is, when the first object and the second object move in a direction that squeezes the three-dimensional object), the distance may also be judged to be approximately fixed. The distance may also be judged to be approximately fixed when it changes only within the range of hand shake.
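The "approximately fixed" judgement above can be sketched directly. A minimal illustration in Python, using the ±10% tolerance from the example in the text and also treating a steadily shrinking distance (squeezing) as fixed; the function name and sampling scheme are assumptions:

```python
# Sketch of the step SF02 judgement: the pinch distance counts as
# "approximately fixed" when every sample stays within a tolerance of the
# distance at the start of operation detection, or when it only shrinks.

def approximately_fixed(start_distance, distances, tolerance=0.10):
    """True if all sampled distances stay within +/- tolerance of the
    starting distance, or if the distance only ever decreases."""
    within = all(abs(d - start_distance) <= start_distance * tolerance
                 for d in distances)
    shrinking = all(b <= a for a, b in zip([start_distance] + distances,
                                           distances))
    return within or shrinking

print(approximately_fixed(10.0, [10.2, 9.9, 10.1]))  # True: hand shake only
print(approximately_fixed(10.0, [9.0, 8.0, 7.0]))    # True: squeezing
print(approximately_fixed(10.0, [12.0, 14.0]))       # False: fingers opening
```

Treating a monotonic squeeze as "fixed" is what lets the same branch handle both carrying the object and deforming it.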
When the distance between the first object and the second object is approximately fixed (Yes in step SF02), in step SF03 the control unit 22 calculates the movement speed of the first object and the second object. Then, in step SF04, the control unit 22 judges whether the calculated movement speed is equal to or lower than a threshold value. The threshold value used here is, for example, the movement speed of a fingertip when a person throws an object. The movement speed compared with the threshold value may be the average of the movement speeds of the first object and the second object, may be the faster of the two, or may be the slower of the two.

When the movement speed is equal to or lower than the threshold value (Yes in step SF04), in step SF05 the control unit 22 changes the three-dimensional object in accordance with the detected movement of the first object and the second object. For example, when the first object and the second object are detected moving to the right, the control unit 22 moves the three-dimensional object to the right in step with the movement of the first object and the second object. When the first object and the second object are detected rotating counterclockwise, the control unit 22 rotates the three-dimensional object counterclockwise in step with the rotation of the first object and the second object. When movement and rotation are detected at the same time, movement and rotation are performed simultaneously. When there is an obstacle in the path of the movement or rotation of the three-dimensional object, the movement and rotation of the three-dimensional object may be stopped at the point where the three-dimensional object contacts the obstacle. The obstacle may be a real object or another three-dimensional object. Then the control unit 22 executes the processing again from step SF01.
When the movement speed is faster than the threshold value (No in step SF04), in step SF06 the control unit 22 erases the three-dimensional object. When erasing the three-dimensional object, an animation in which the three-dimensional object flies off rapidly in the moving direction of the first object and the second object may be displayed. The control unit 22 then ends the operation detection processing. In this way, by erasing the three-dimensional object when the first object and the second object are moved at high speed as if throwing it, erasure of a three-dimensional object can be realized by an intuitive operation. Instead of the operation of moving the first object and the second object at high speed, an operation such as crushing the three-dimensional object may be used as the operation for erasing it. The three-dimensional object may also be returned to its initial placement position instead of being erased. The display device 1 may omit the processing of steps SF03, SF04, and SF06. That is, when the display device 1 judges in step SF02 that the distance between the first object and the second object is approximately fixed, it may perform step SF05 regardless of the movement speed of the two objects.
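The branch of steps SF03 through SF06 can be sketched as a single decision on speed. A minimal Python illustration, with an assumed throwing-speed threshold; the text itself only says the threshold is, for example, a fingertip's speed when throwing, and leaves open which of the two speeds is compared:

```python
# Sketch of steps SF03-SF06: while the pinch distance stays fixed, the
# object follows the fingers at ordinary speeds, but a fast "throwing"
# motion erases it. THROW_SPEED and the helper name are assumed.

THROW_SPEED = 2.0  # m/s, assumed fingertip speed for a throwing motion

def apply_operation(speed_first, speed_second, mode="average"):
    speeds = {"average": (speed_first + speed_second) / 2,
              "faster": max(speed_first, speed_second),
              "slower": min(speed_first, speed_second)}
    speed = speeds[mode]
    if speed <= THROW_SPEED:     # SF04 "Yes": follow the fingers (SF05)
        return "move_and_rotate"
    return "erase"               # SF06: object is thrown away

print(apply_operation(0.5, 0.7))            # move_and_rotate
print(apply_operation(3.5, 2.8))            # erase
print(apply_operation(3.5, 1.0, "slower"))  # move_and_rotate
```

Choosing "slower" makes erasure harder to trigger accidentally, since both fingers must move fast; "faster" makes the throw gesture easier to perform.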
When the distance between the first object and the second object is not approximately fixed (No in step SF02), in step SF07 the control unit 22 judges whether the distance has expanded compared with when the three-dimensional object was selected (that is, when the operation detection processing started). When the distance has expanded (Yes in step SF07), in step SF08 the control unit 22 releases the selected state of the three-dimensional object. The operation of widening the distance between the first object and the second object resembles the action of releasing a real object that has been grasped. As an operation for deselecting a three-dimensional object, the above operation is therefore intuitive and easy to understand.
Then, in step SF09, the control unit 22 makes the three-dimensional object whose selected state has been released move under gravity or the like. The control unit 22 then ends the operation detection processing. The movement here is displayed, for example, as the three-dimensional object falling under gravity and coming to rest on a floor or a table. Before stopping the movement of the three-dimensional object, the three-dimensional object may be made to bounce according to the elasticity of the three-dimensional object and the hardness of the floor or table. The magnitude of the impact when the three-dimensional object collides with the floor or table may be calculated, and when the impact is larger than a predetermined value, the three-dimensional object may be displayed as damaged. The three-dimensional object may also be made to move more slowly than it would under actual gravity.
When the distance between the first object and the second object has decreased compared with when the three-dimensional object was selected (No in step SF07), in step SF10 the control unit 22 deforms the three-dimensional object according to the distance. Then the control unit 22 executes the processing again from step SF01. The degree to which the three-dimensional object is deformed may be changed, for example, according to the elasticity set for the three-dimensional object as an attribute. For an object whose attribute is set to be soft, such as a three-dimensional object imitating a rubber ball, the control unit 22 may increase the degree of deformation as the distance between the first object and the second object decreases. For an object whose attribute is set to high hardness, such as a three-dimensional object imitating a building block, the control unit 22 may keep the degree of deformation small even when the distance between the first object and the second object decreases.
When the distance between the first object and the second object has decreased compared with when the three-dimensional object was selected, the display device 1 may shrink the three-dimensional object instead of deforming it. When the distance between the first object and the second object becomes equal to or smaller than a predetermined value, the display device 1 may display the three-dimensional object as broken.
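The attribute-dependent deformation of step SF10 can be sketched with one scalar. A minimal Python illustration in which an elasticity of 1.0 fully follows the pinch and 0.0 never deforms; the linear model and the 0-to-1 scale are assumptions made for illustration, not a model specified by the patent:

```python
# Sketch of step SF10: a soft object (imitating a rubber ball) compresses
# in step with the pinch, while a hard object (imitating a building block)
# barely yields. The elasticity scale and linear model are assumed.

def deformation(original_width, pinch_distance, elasticity):
    """Return the displayed width of the pinched three-dimensional object.

    elasticity -- 1.0 fully follows the pinch, 0.0 never deforms
    """
    if pinch_distance >= original_width:
        return original_width                       # not yet squeezed
    squeeze = original_width - pinch_distance
    return original_width - squeeze * elasticity    # partial compression

print(deformation(10.0, 6.0, 1.0))  # 6.0 -- soft ball follows the fingers
print(deformation(10.0, 6.0, 0.1))  # 9.6 -- hard block barely yields
```

The same per-object attribute could also drive the alternatives mentioned above, such as shrinking instead of deforming, or breaking below a predetermined pinch distance.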
As described above, in the first embodiment, a three-dimensional object is selected when the state in which the three-dimensional object is located between objects such as fingers has continued for a predetermined time or longer, so selection of a three-dimensional object can be realized by an intuitive and easy-to-understand operation.
As shown in Figure 27, the display device 1 may make it a condition for selecting the three-dimensional object that the state in which at least one of the first object and the second object is in contact with the three-dimensional object continues for a predetermined time or longer. By making contact with the three-dimensional object a condition for selection, the user can easily select the desired three-dimensional object when multiple three-dimensional objects are displayed close to each other.
A second embodiment of the processing sequence for an operation of grasping a three-dimensional object will now be described. In the second embodiment, the operation detection processing is the same as in the first embodiment. For the second embodiment, therefore, descriptions that duplicate the first embodiment are omitted, and the selection detection processing is mainly described.
First, detection of an operation performed by grasping a three-dimensional object will be described with reference to Figure 28. Figure 28 is a diagram for explaining detection of an operation performed by grasping a three-dimensional object. In step S51 shown in Figure 28, the three-dimensional object OB1 is displayed stereoscopically in the display space. To select the three-dimensional object OB1, the user moves the fingers F1 and F2 so that the three-dimensional object OB1 is located between them.
When the display device 1 detects that two real objects are present in the display space and that the three-dimensional object OB1 is located between the two objects, it monitors the change in the distance between the two objects. If the distance remains approximately fixed for a predetermined time or longer, the display device 1 judges that the three-dimensional object OB1 has been selected and sets the three-dimensional object OB1 to the selected state. The display device 1 then notifies the user that the three-dimensional object OB1 has entered the selected state, for example by changing the display mode of the three-dimensional object OB1.
While the display device 1 monitors the change in the distance between the two objects, the two objects do not need to remain at the position where they pinch the three-dimensional object OB1. That is, after moving the fingers F1 and F2 so that the three-dimensional object OB1 is located between them as shown in step S51, the user may move the fingers F1 and F2 to another position without maintaining that state.
As shown in step S52, the user moves the fingers F1 and F2 from the state of step S51 while keeping the distance D1 between the fingers F1 and F2 approximately fixed. In this case, as shown in step S53, at the stage where the state in which the distance D1 between the fingers F1 and F2 is kept approximately fixed has continued for the predetermined time or longer, the display device 1 sets the three-dimensional object OB1 to the selected state. The display device 1 then moves the three-dimensional object OB1 to between the fingers F1 and F2 as if it had already been selected at the stage of step S51. The movement of the fingers F1 and F2 from step S51 to step S53 may be stored in the display device 1 in advance, and the three-dimensional object OB1 may be rotated in accordance with the stored movement. Thereafter, the display device 1 applies changes such as movement, deformation, and disappearance to the three-dimensional object OB1 in accordance with the movement of the fingers F1 and F2.
In this way, once the two objects have been moved to a position where they pinch the three-dimensional object, the three-dimensional object can be selected even if the objects do not stay at that position, so the user can quickly start the operation that follows selection of the three-dimensional object.
Next, the processing sequence of the selection detection processing in the second embodiment will be described with reference to Figure 29. Figure 29 is a flowchart showing the processing sequence of the selection detection processing for a three-dimensional object. The processing sequence shown in Figure 29 is realized by the control unit 22 executing the control program 24a.
As shown in Figure 29, first, in step SG01, the control unit 22 combines an image of the imaginary space including the three-dimensional object with an image of real space and displays the result. Then, in step SG02, the control unit 22 judges whether the detection unit 44 (that is, the photographing units 40 and 42) has detected the first object and the second object. When the first object and the second object are not detected (No in step SG02), in step SG14, if there is a three-dimensional object in the tentatively selected state, the control unit 22 releases the tentatively selected state of that three-dimensional object. The tentatively selected state is the state in which, after a three-dimensional object has been detected as displayed between the two detected objects, it is being monitored whether the distance between the two objects remains approximately fixed.
Then, in step SG15, the control unit 22 judges whether the end of operation has been detected. When the end of operation is detected (Yes in step SG15), the control unit 22 ends the selection detection processing. When the end of operation is not detected (No in step SG15), the control unit 22 executes the processing again from step SG02.
When the first object and the second object are detected (Yes in step SG02), in step SG03 the control unit 22 judges whether there is a three-dimensional object in the tentatively selected state. When there is no three-dimensional object in the tentatively selected state (No in step SG03), in step SG04 the control unit 22 searches the displayed three-dimensional objects for one displayed between the first object and the second object. When no matching three-dimensional object is found (No in step SG05), the control unit 22 proceeds to step SG15.
When a three-dimensional object displayed between the first object and the second object is found (Yes in step SG05), in step SG06 the control unit 22 sets the three-dimensional object displayed between the first object and the second object to the tentatively selected state. Then, in step SG07, the control unit 22 calculates the distance between the first object and the second object. The control unit 22 then proceeds to step SG15.
When the first object and the second object are detected and there is a three-dimensional object in the tentatively selected state (Yes in step SG03), in step SG08 the control unit 22 calculates the distance between the first object and the second object. Then, in step SG09, the control unit 22 judges whether the distance is approximately fixed. When the distance is not approximately fixed (No in step SG09), in step SG14 the control unit 22 releases the tentatively selected state of the three-dimensional object. The control unit 22 then proceeds to step SG15.
When the distance between the first object and the second object is approximately fixed (Yes in step SG09), in step SG10 the control unit 22 judges whether the period during which the distance has been kept approximately fixed has reached the predetermined time. When that period has not reached the predetermined time (No in step SG10), the control unit 22 proceeds to step SG15.
When the period during which the distance has been kept approximately fixed is equal to or longer than the predetermined time (Yes in step SG10), in step SG11 the control unit 22 sets the three-dimensional object displayed between the first object and the second object to the selected state. Further, in step SG12, the control unit 22 moves the three-dimensional object to between the first object and the second object. Then, in step SG13, the control unit 22 performs the operation detection processing shown in Figure 26 described above, in which the three-dimensional object in the selected state is changed in accordance with the detected operation. After the operation detection processing ends, the control unit 22 proceeds to step SG15.
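The steps above (SG03 through SG11) amount to a small state machine driven by the pinch distance. A minimal sketch in Python, with assumed constants and names, since the patent specifies neither a concrete tolerance nor a sampling interval:

```python
# Sketch of the Figure 29 state machine: an object found between the two
# real objects enters the tentatively selected state (SG06), is dropped if
# the pinch distance changes too much (SG09 "No" -> SG14), and is promoted
# to selected once the distance has stayed approximately fixed for the
# predetermined time (SG10 "Yes" -> SG11). Constants are assumed.

HOLD_TIME = 1.0    # predetermined time in seconds (assumed)
TOLERANCE = 0.10   # allowed relative change in pinch distance (assumed)

def advance(state, held, base_dist, dist, dt):
    """One pass of steps SG03-SG11; returns the new (state, held, base)."""
    if state == "none":
        return "tentative", 0.0, dist          # SG06/SG07
    if abs(dist - base_dist) > base_dist * TOLERANCE:
        return "none", 0.0, 0.0                # SG09 "No" -> SG14
    held += dt                                 # distance kept fixed
    if held >= HOLD_TIME:
        return "selected", held, base_dist     # SG10 "Yes" -> SG11
    return "tentative", held, base_dist

state, held, base = "none", 0.0, 0.0
for d in [10.0, 10.1, 9.9, 10.0, 10.05]:       # fingers drift slightly
    state, held, base = advance(state, held, base, d, dt=0.3)
print(state)  # selected
```

Note that the fingers may translate freely through space while this runs; only their mutual distance matters, which is what lets the user carry the object away immediately.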
As described above, in the second embodiment, a three-dimensional object is selected if, after the three-dimensional object is located between objects such as fingers, the distance between the objects is kept approximately fixed for a predetermined time or longer, so the user can quickly start the operation that follows selection of the three-dimensional object.
Figure 30 is a diagram for explaining a variation of the second embodiment of detecting an operation performed by grasping a three-dimensional object. As shown in steps S61 to S63 of Figure 30, the display device 1 may set the condition for selecting the three-dimensional object to be that, after at least one of the first object and the second object contacts the three-dimensional object, the distance between the first object and the second object is kept approximately fixed for a predetermined time or longer. By making contact with the three-dimensional object a condition for selection, the user can easily select the desired three-dimensional object when multiple three-dimensional objects are displayed close to each other.
A third embodiment of the processing sequence for an operation of grasping a three-dimensional object will now be described. For the third embodiment, descriptions that duplicate the first embodiment are omitted, and the selection detection processing and the operation detection processing are mainly described.
First, detection of an operation performed by grasping a three-dimensional object will be described with reference to Figure 31 and Figure 32. Figure 31 and Figure 32 are diagrams for explaining the third embodiment of detecting an operation performed by grasping a three-dimensional object. In step S71 shown in Figure 31, the three-dimensional object OB1 is displayed stereoscopically in the display space. To select the three-dimensional object OB1, the user moves the fingers F1 and F2 so that the three-dimensional object OB1 is located between them.
When the display device 1 detects that two real objects are present in the display space and that the three-dimensional object OB1 is located between the two objects, it monitors the change in the distance between the two objects. If the distance remains approximately fixed for a predetermined time or longer, the display device 1 judges that the three-dimensional object OB1 has been selected and sets the three-dimensional object OB1 to the selected state. The display device 1 then notifies the user that the three-dimensional object OB1 has entered the selected state, for example by changing the display mode of the three-dimensional object OB1.
While the display device 1 monitors the change in the distance between the two objects, the two objects do not need to remain at the position where they pinch the three-dimensional object OB1. That is, after moving the fingers F1 and F2 so that the three-dimensional object OB1 is located between them as shown in step S71, the user may move the fingers F1 and F2 to another position without maintaining that state.
As shown in step S72, the user moves the fingers F1 and F2 from the state of step S71 while keeping the distance D1 between the fingers F1 and F2 approximately fixed. In this case, from the stage at which the display device 1 detects that the three-dimensional object OB1 is displayed between the fingers F1 and F2 (that is, from the stage of step S71), the display device 1 applies changes such as movement, deformation, and disappearance to the three-dimensional object OB1 in accordance with the movement of the fingers F1 and F2. Then, as shown in step S73, at the stage where the state in which the distance D1 between the fingers F1 and F2 is kept approximately fixed has continued for the predetermined time or longer, the display device 1 sets the three-dimensional object OB1 to the selected state.
As shown in step S74 of Figure 32, when the distance D1 between the fingers F1 and F2 becomes larger before the predetermined time elapses, that is, when selection was not completed, the display device 1 applies to the three-dimensional object OB1 a change opposite to the change applied up to that point. As a result, the three-dimensional object OB1 is displayed at the same position and in the same state as at the stage of step S71. The speed at which the reverse change is applied to the three-dimensional object OB1 may be faster than the speed at which the original change was applied. That is, the three-dimensional object OB1 may be reversely changed as if played back in fast reverse.
In this way, by applying changes to a three-dimensional object from the stage at which it is detected that the three-dimensional object is displayed between the two objects, the user can recognize that the three-dimensional object is in the process of being selected before the selection is finalized. As a result, the user can know in advance whether the intended three-dimensional object has been selected. Until the state in which the distance between the two objects is kept approximately fixed has continued for the predetermined time or longer, the display device 1 may display the three-dimensional object to which the change is applied in a mode different from both the normal state and the selected state (for example, translucently), so that the user can easily judge the state of the three-dimensional object.
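The record-and-rewind behaviour of step S74 can be illustrated with a simple change history. In the sketch below, each change is represented as a numeric position delta; that representation, and the function names, are assumptions made purely for illustration:

```python
# Sketch of step S74: changes applied during the tentative phase are
# recorded, and if the fingers open before the predetermined time elapses,
# the recorded changes are undone in reverse order, restoring the object
# to its step S71 position and state.

def apply_changes(position, deltas):
    """Apply each change and remember it so it can be undone later."""
    history = []
    for d in deltas:                      # changes during the tentative phase
        position += d
        history.append(d)
    return position, history

def rewind(position, history):
    """Apply the opposite of each recorded change, newest first."""
    for d in reversed(history):
        position -= d
    return position

pos, hist = apply_changes(0.0, [1.0, 2.0, 0.5])
print(pos)                # 3.5 -- object has followed the fingers
print(rewind(pos, hist))  # 0.0 -- back to the step S71 state
```

Playing the rewind at a higher animation speed than the original changes, as the text suggests, would only affect how the frames are displayed, not the restored end state.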
Next, the processing sequence that the display device 1 performs for an operation of grasping a three-dimensional object will be described with reference to Figure 33 and Figure 34. Figure 33 is a flowchart showing the processing sequence of the selection detection processing for a three-dimensional object. The processing sequence shown in Figure 33 is realized by the control unit 22 executing the control program 24a.
As shown in Figure 33, first, in step SH01, the control unit 22 combines an image of the imaginary space including the three-dimensional object with an image of real space and displays the result. Then, in step SH02, the control unit 22 judges whether the detection unit 44 (that is, the photographing units 40 and 42) has detected the first object and the second object. When the first object and the second object are not detected (No in step SH02), in step SH10, if there is a three-dimensional object in the tentatively selected state, the control unit 22 releases the tentatively selected state of that three-dimensional object.
Then, in step SH11, the control unit 22 judges whether the end of operation has been detected. When the end of operation is detected (Yes in step SH11), the control unit 22 ends the selection detection processing. When the end of operation is not detected (No in step SH11), the control unit 22 executes the processing again from step SH02.
When the first object and the second object are detected (Yes in step SH02), in step SH03 the control unit 22 judges whether there is a three-dimensional object in the tentatively selected state. When there is no three-dimensional object in the tentatively selected state (No in step SH03), in step SH04 the control unit 22 searches the displayed three-dimensional objects for one displayed between the first object and the second object. When no matching three-dimensional object is found (No in step SH05), the control unit 22 proceeds to step SH11.
When a three-dimensional object displayed between the first object and the second object is found (Yes in step SH05), in step SH06 the control unit 22 sets the three-dimensional object displayed between the first object and the second object to the tentatively selected state. Further, in step SH07, the control unit 22 calculates the distance between the first object and the second object. The control unit 22 then proceeds to step SH11.
When the first object and the second object are detected and there is a three-dimensional object in the tentatively selected state (Yes in step SH03), in step SH08 the control unit 22 judges whether at least one of the first object and the second object has moved. When neither the first object nor the second object has moved (No in step SH08), the control unit 22 proceeds to step SH11.
When at least one of the first object and the second object has moved (Yes in step SH08), in step SH09 the control unit 22 performs the operation detection processing shown in Figure 34, in which the three-dimensional object in the selected state is changed in accordance with the detected operation. After the operation detection processing ends, the control unit 22 proceeds to step SH11.
Figure 34 is a flowchart showing the processing sequence of the operation detection processing. The processing sequence shown in Figure 34 is realized by the control unit 22 executing the control program 24a. As shown in Figure 34, first, in step SI01, the control unit 22 calculates the distance between the first object and the second object. Then, in step SI02, the control unit 22 judges whether the distance between the first object and the second object has remained approximately fixed since the operation detection processing started.
When the distance between the first object and the second object is approximately fixed (Yes in step SI02), in step SI03 the control unit 22 judges whether the predetermined time has elapsed since the operation detection processing started. When the predetermined time has elapsed (Yes in step SI03), in step SI04, if there is a three-dimensional object in the tentatively selected state, the control unit 22 sets that three-dimensional object to the selected state. When the predetermined time has not elapsed (No in step SI03), step SI04 is not executed.
Then, in step SI05, the control unit 22 calculates the movement speed of the first object and the second object. Then, in step SI06, the control unit 22 judges whether the calculated movement speed is equal to or lower than a threshold value. When the movement speed is equal to or lower than the threshold value (Yes in step SI06), in step SI07 the control unit 22 moves or rotates the three-dimensional object in accordance with the detected movement of the first object and the second object. Then the control unit 22 executes the processing again from step SI01.
When the movement speed is faster than the threshold value (No in step SI06), in step SI08 the control unit 22 erases the three-dimensional object. When erasing the three-dimensional object, an animation in which the three-dimensional object flies off rapidly in the moving direction of the first object and the second object may be displayed. The control unit 22 then ends the operation detection processing. Instead of the operation of moving the first object and the second object at high speed, an operation such as crushing the three-dimensional object may be used as the operation for erasing it. The three-dimensional object may also be returned to its initial placement position instead of being erased. The display device 1 may omit the processing of steps SI05, SI06, and SI08. That is, when the display device 1 judges No in step SI03, or after executing step SI04, it may perform step SI07 regardless of the movement speed of the two objects.
When the distance between the first object and the second object is not approximately fixed (No in step SI02), in step SI09 the control unit 22 judges whether the distance has expanded compared with when the three-dimensional object was selected (that is, when the operation detection processing started). When the distance has expanded (Yes in step SI09), in step SI10 the control unit 22 judges whether the three-dimensional object displayed between the first object and the second object is in the tentatively selected state.
When the three-dimensional object is in the tentatively selected state (Yes in step SI10), in step SI11 the control unit 22 releases the tentatively selected state of the three-dimensional object. Further, in step SI12, the control unit 22 applies the reverse change to the three-dimensional object and returns it to its original state. The control unit 22 then ends the operation detection processing.
When the three-dimensional object is not in the tentatively selected state, that is, when it is in the selected state (No in step SI10), in step SI13 the control unit 22 releases the selected state of the three-dimensional object. Further, in step SI14, the control unit 22 makes the three-dimensional object whose selected state has been released move under gravity or the like. The control unit 22 then ends the operation detection processing. The movement here is displayed, for example, as the three-dimensional object falling under gravity and coming to rest on a floor or a table. Before stopping the movement of the three-dimensional object, the three-dimensional object may be made to bounce according to the elasticity of the three-dimensional object and the hardness of the floor or table. The magnitude of the impact when the three-dimensional object collides with the floor or table may be calculated, and when the impact is larger than a predetermined value, the three-dimensional object may be displayed as damaged. The three-dimensional object may also be made to move more slowly than it would under actual gravity.
When the distance between the first object and the second object has decreased compared with when the three-dimensional object was selected (No in step SI09), in step SI15 the control unit 22 deforms the three-dimensional object according to the distance. Then the control unit 22 executes the processing again from step SI01. The degree to which the three-dimensional object is deformed may be changed, for example, according to the hardness set for the three-dimensional object as an attribute.
As described above, in the third embodiment, the three-dimensional object is changed in accordance with the operation from the time it is detected that the three-dimensional object is located between objects such as fingers, so the user can easily recognize that the three-dimensional object is being selected.
Figure 35 is a diagram for explaining a modification of the third embodiment of the processing for detecting an operation of grasping a three-dimensional object. As shown in steps S81 to S83 of Figure 35, the condition for selecting the three-dimensional object may be that, after at least one of the first object and the second object comes into contact with the three-dimensional object, the distance between the first object and the second object is kept approximately fixed for a predetermined time or longer. By adding contact with the three-dimensional object as a selection condition, the user can easily select a desired three-dimensional object when a plurality of three-dimensional objects are displayed close to each other.
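The modified selection condition can be expressed as a predicate over detection samples; the sample format, tolerance, and hold time below are assumptions of this sketch.

```python
def grab_selected(frames, gap_tolerance=0.01, hold_time=0.5):
    """frames: chronological (timestamp_s, touching, gap_m) samples.

    touching: True while at least one of the two objects (e.g. two
    fingers) contacts the three-dimensional object; gap_m: distance
    between the first and second object. Selection requires the gap to
    stay approximately fixed for hold_time after contact.
    """
    start = ref_gap = None
    for t, touching, gap in frames:
        if not touching or (ref_gap is not None
                            and abs(gap - ref_gap) > gap_tolerance):
            start = ref_gap = None      # contact lost or gap changed: reset
            continue
        if start is None:
            start, ref_gap = t, gap     # begin timing the hold
        elif t - start >= hold_time:
            return True                 # held steady for the predetermined time
    return False
```

Because the timer resets whenever the gap drifts, a finger pair merely passing near the object does not trigger a selection.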
The display device 1 described in the above embodiments can be applied to various uses. The three-dimensional object (display object) to be operated may be an object imitating a really existing object, such as a book, building blocks, a spoon, chopsticks, playing cards, clay, or a musical instrument, or may be an object that does not actually exist, such as a virtual avatar, a game character, or an augmented-reality AR tag (AR タグ). The change applied to the three-dimensional object according to the detected operation is not limited to the movement, deformation, disappearance, and so on described above. For example, the change may include replacing one object with another object. The change applied to the three-dimensional object according to a pressing operation is not limited to the above embodiments and may vary according to the type of the three-dimensional object.
For example, when a three-dimensional object imitating clay (hereinafter simply referred to as "the clay") is the operation object, the clay may be deformed according to pressing operations so that the user can shape the clay into an arbitrary form. The viscosity of the clay may be reduced as time passes, as if the clay were drying. When an operation of pressing the clay with a finger or hand that has been dipped in a three-dimensional object of water is detected, the viscosity of the clay may be increased.
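The drying and wetting behavior can be modelled as a single viscosity value per clay object; the rates and the [0, 1] clamping range are illustrative constants of this sketch, not values from the embodiment.

```python
def clay_viscosity(initial, elapsed_s, water_presses,
                   dry_rate=0.01, wet_gain=0.2):
    """Viscosity of the clay object in [0, 1]: decays with time as if
    the clay were drying, and rises each time a press with a
    water-dipped finger or hand is detected."""
    v = initial - dry_rate * elapsed_s + wet_gain * water_presses
    return max(0.0, min(1.0, v))    # keep within a displayable range
```

The renderer would then map this value to how readily the clay deforms under a pressing operation.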
For example, when a three-dimensional object imitating a record (hereinafter simply referred to as "the record") is the operation object, the record may be rotated about its pivot according to pressing operations while a sound is played. By linking the rotation with the playback of the sound, techniques such as the scratching performed by disc jockeys may be virtually reproduced.
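Linking rotation with playback can be reduced to mapping the per-frame rotation angle to a playback rate; the frame format and the 33.3 rpm constant are assumptions of this sketch.

```python
def playback_rate(angle_delta_deg, frame_dt_s, normal_rpm=33.3):
    """Playback rate implied by the rotation a pressing operation
    imparts to the record.

    angle_delta_deg is negative when the record is dragged backwards,
    giving a negative rate, i.e. reversed sound as in scratching.
    """
    degrees_per_s = angle_delta_deg / frame_dt_s
    return degrees_per_s / (normal_rpm * 6.0)   # rpm -> degrees per second
```

A rate of 1.0 corresponds to unforced rotation at normal speed; values above, below, or past zero reproduce the pitch changes heard when scratching.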
Embodiment 2
In Embodiment 1, an example was described in which the display device detects an operation on a three-dimensional object and applies a change to the three-dimensional object according to the detected operation. However, the display device may instead perform an action associated with the three-dimensional object according to the displacement of a predetermined object detected by the detection unit. An example of a display device that performs an action associated with a three-dimensional object according to the displacement of a predetermined object detected by the detection unit will now be described.
First, the structure of a display device 5 according to the second embodiment will be described with reference to Figures 36 and 37. Figure 36 is a perspective view of the display device 5. Figure 37 is a block diagram of the display device 5. In the following description, parts identical to parts already described are denoted by the same reference symbols as those parts, and duplicate description is omitted. As shown in Figures 36 and 37, the display device 5 has the same structure as the display device 1 except that it further includes a projector 34 and that the storage unit 24 stores a control program 24e in place of the control program 24a.
The projector 34 projects an image from the projection unit 34a according to a signal transmitted from the control unit 22. The projected image is displayed on a screen, a wall, or the like, and can also be seen by people other than the user wearing the display device 5. The manner in which the projector 34 projects the image is not particularly limited. For example, the projector 34 may draw the image by reflecting laser light emitted from a light source with a MEMS (Micro Electro Mechanical Systems) mirror. The projector 34 may also be configured by combining a light source such as a halogen lamp, an LED, or an LD with an optical instrument such as an LCD or a DMD (Digital Micro-mirror Device). The display device 5 may include an external display instead of the projector 34.
The control program 24e has the same structure as the control program 24a except that it further includes a linkage display control unit 28. The linkage display control unit 28 provides a function of causing the projector 34 to project information related to the information displayed on the display units 32a and 32b. The functions provided by the linkage display control unit 28 include a function of changing the information projected by the projector 34 in linkage with a three-dimensional object that changes according to operations on the three-dimensional object in the display space.
Next, with reference to Figures 38 to 41, examples of display control linked with changes of a three-dimensional object will be described. Figure 38 is a diagram showing one example of display control linked with changes of a three-dimensional object. Figure 39 is a diagram showing one example of an operation trajectory in which the finger F1 momentarily touches the three-dimensional object. Figure 40 is a diagram showing one example of an operation trajectory in which the finger F1 is moved along the three-dimensional object. Figure 41 is a diagram showing one example of an operation trajectory in which the three-dimensional object is crushed by pressing with the finger F1.
In step S91 shown in Figure 38, the control unit 22 displays a spherical globe in the display space as a three-dimensional object OB2. Furthermore, the control unit 22 causes the projector 34 to project, as a projected image P4, an enlarged map of Japan, Japan being located at the center of the three-dimensional object OB2 as viewed by the user wearing the display device 5. In this way, the control unit 22 causes the projector 34 to project an image related to the three-dimensional object OB2 displayed in the display space. When the projected image P4 is within the photographing range of the photographing units 40 and 42, the user can confirm the state of the projected image P4 through the images captured by the photographing units 40 and 42.
In the present embodiment, a map of Japan is projected as an image related to Japan, which is located at the center of the three-dimensional object OB2; however, other images related to Japan, such as the national flag or the national flower of Japan, may be projected instead of the map. The images may be stored in the storage unit 24 in advance, or may be acquired from another device by wired or wireless communication.
In the state of step S91, assume that an operation is detected in which the finger F1 moves obliquely downward, momentarily contacts the three-dimensional object OB2, and immediately moves away (as shown in Figure 39). This operation is similar to a tap operation in which a finger momentarily touches a touch screen and immediately leaves it; however, whereas a tap operation can only select a point on a plane, this operation has the advantage that an arbitrary location on the three-dimensional object OB2 can be selected.
When the operation shown in Figure 39 is detected, the control unit 22 judges that the position contacted by the finger F1 has been selected, and performs processing corresponding to the selected position. The processing corresponding to the selected position is, for example, processing of projecting detailed information about the selected position from the projector. In the present embodiment, as the processing corresponding to the selected position, processing of rotating the three-dimensional object OB2 so that the selected position is located at the center is executed.
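Rotating the globe so that the contacted position moves to the center can be computed directly from the surface coordinates of the contact; treating the contact as latitude/longitude and the user-facing center as (0, 0) is an assumption of this sketch, not a formulation from the embodiment.

```python
def rotation_to_center(contact_lat, contact_lon):
    """Yaw/pitch (degrees) to apply to the globe so the contacted
    surface point faces the user; the user-facing center is taken as
    latitude 0, longitude 0."""
    yaw = (-contact_lon) % 360.0    # spin about the globe's vertical axis
    pitch = -contact_lat            # tilt toward or away from the viewer
    return yaw, pitch
```

With approximate coordinates for the Florida peninsula, `rotation_to_center(28.0, -81.0)` yields the spin and tilt that bring that region to the center.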
In step S92 shown in Figure 38, as a result of detecting the operation shown in Figure 39, the control unit 22 rotates the three-dimensional object OB2 so that the vicinity of the contacted Florida peninsula is located at the center. In linkage with the rotation of the three-dimensional object OB2, the control unit 22 causes the projector 34 to project, as the projected image P4, a map of the vicinity of the Florida peninsula, which has moved to the center of the three-dimensional object OB2. The direction in which the finger F1 is moved to momentarily contact the three-dimensional object OB2 is not limited to obliquely downward and may be any direction.
In the state of step S91, assume that an operation of moving the finger F1 along the three-dimensional object OB2 is detected (as shown in Figure 40). When the operation shown in Figure 40 is detected, the control unit 22 rotates the three-dimensional object OB2 according to the moving direction and the amount of movement of the finger F1. Rotating the three-dimensional object OB2 as the processing performed in response to the finger F1 moving along it is intuitive and easy for the user to understand.
In step S93 shown in Figure 38, as a result of detecting the operation shown in Figure 40, the control unit 22 rotates the three-dimensional object OB2 to the left by an angle corresponding to the amount of movement of the finger F1. In linkage with the rotation of the three-dimensional object, the control unit 22 causes the projector 34 to project, as the projected image P4, a map of the vicinity of the Florida peninsula, which has moved to the center of the three-dimensional object OB2.
In the state of step S91, assume that an operation resembling crushing the three-dimensional object OB2 from above with the finger F1 is detected (as shown in Figure 41). When the operation shown in Figure 41 is detected, the control unit 22 reduces the three-dimensional object OB2 according to the amount by which the user presses the finger F1 into the three-dimensional object OB2. Reducing the three-dimensional object OB2 as the processing performed in response to the operation of crushing it is intuitive and easy for the user to understand.
In step S94 shown in Figure 38, as a result of detecting the operation shown in Figure 41, the control unit 22 reduces the three-dimensional object OB2. Furthermore, in linkage with the reduction of the three-dimensional object OB2, the control unit 22 also reduces the map projected from the projector 34 as the projected image P4.
Next, the processing sequence of the display control performed in linkage with changes of the three-dimensional object will be described with reference to Figure 42. Figure 42 is a flowchart showing the processing sequence of the display control performed in linkage with changes of the three-dimensional object. The processing sequence shown in Figure 42 is realized by the control unit 22 executing the control program 24e.
As shown in Figure 42, first, in step SJ01, the control unit 22 synthesizes an image of the imaginary space including the three-dimensional object with an image of the real space, and displays the result on the display units 32a and 32b. In addition, in step SJ02, the control unit 22 causes the projector to project an image corresponding to the three-dimensional object.
Then, in step SJ03, the control unit 22 judges whether a finger has been detected by the detection unit 44. When no finger has been detected by the detection unit 44 (No in step SJ03), in step SJ08 the control unit 22 judges whether an ending operation by the user has been detected. When no ending operation has been detected (No in step SJ08), the control unit 22 re-executes the processing from step SJ03. On the other hand, when an ending operation has been detected (Yes in step SJ08), the control unit 22 ends the processing sequence.
When a finger is detected in step SJ03 (Yes in step SJ03), in step SJ04 the control unit 22 judges the movement of the finger in the three-dimensional space according to the detection result of the detection unit 44. Then, when a movement in which the finger momentarily contacts the three-dimensional object is detected (Yes in step SJ05), in step SJ06 the control unit 22 executes the processing corresponding to the contacted position. Then, in step SJ07, the control unit 22 updates the projected image in correspondence with the three-dimensional object and proceeds to step SJ08.
When a displacement corresponding to a movement of the finger along the three-dimensional object is detected (No in step SJ05, Yes in step SJ09), in step SJ10 the control unit 22 rotates the three-dimensional object according to the moving direction and the amount of movement of the finger. Then, in step SJ07, the control unit 22 updates the projected image in correspondence with the three-dimensional object and proceeds to step SJ08.
When a displacement corresponding to a movement of the finger crushing the three-dimensional object is detected (No in step SJ09, Yes in step SJ11), in step SJ12 the control unit 22 reduces the three-dimensional object according to the amount by which the finger is pressed into it. Then, in step SJ07, the control unit 22 updates the projected image in correspondence with the three-dimensional object and proceeds to step SJ08.
When none of the above movements is detected (No in step SJ11), the control unit 22 keeps the projected image unchanged and proceeds to step SJ08.
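The SJ04 to SJ12 branch of the flowchart amounts to classifying the finger's movement and dispatching; the gesture and object dictionaries below are assumptions of this sketch, not structures from the embodiment.

```python
def operation_step(gesture, obj):
    """One pass of the SJ04-SJ12 branch: classify the finger's movement,
    update the object, and return whether the projection must be updated."""
    if gesture["kind"] == "momentary_touch":
        obj["center"] = gesture["position"]        # SJ06: position-specific processing
        return True                                # SJ07: update projection
    if gesture["kind"] == "move_along":
        obj["rotation"] += gesture["amount"]       # SJ10: rotate by movement amount
        return True
    if gesture["kind"] == "press_in":
        obj["scale"] *= max(0.1, 1.0 - gesture["amount"])  # SJ12: shrink by push depth
        return True
    return False                                   # no recognised movement: unchanged
```

Each recognised branch ends at the same place, step SJ07, which is why the sketch returns a single "update the projection" flag.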
As described above, in the present embodiment, operations by the user are detected from the movement of a finger in the three-dimensional space, so a variety of operation methods can be provided to the user. Furthermore, the externally displayed or projected information is updated in linkage with the three-dimensional object changed according to the detected operation, so a variety of operation methods can be provided to the user even for operations that update information provided to third parties.
In the above embodiment, three kinds of operations are detected from the movement of a finger in the three-dimensional space, but the operations detected from the movement of a finger or hand in the three-dimensional space are not limited to these. For example, the detection may be associated with an authoring program so that the user can manipulate blocks or clay displayed in three dimensions with the fingers to create a statue or a building. The detection may be associated with a racing game program so that the user can operate a steering wheel displayed in three dimensions with the hands and enjoy a racing game. The detection may be associated with an instrument-playing program so that the user can play a piano or keyboard displayed in three dimensions with the hands. The detection may be associated with a data display program so that, when an arbitrary portion of a shape displayed in three dimensions is cut off with a hand, an image of the cross section is projected from the projector 34.
Embodiment 3
The structure of a display device 6 according to a third embodiment will be described with reference to Figure 43. Figure 43 is a block diagram of the display device 6. As shown in Figure 43, the display device 6 has the same structure as the display device 1 except that it further includes a communication unit 16 and a motion sensor 48 and that the storage unit 24 stores a control program 24f in place of the control program 24a. The storage unit 24 also stores catalog data 24g. In addition, the display device 6 may have the structure of any of the above-described display devices 2 to 4.
The communication unit 16 communicates with other devices. The communication unit 16 may support a communication method for performing wireless communication within a relatively narrow range, such as wireless LAN or Bluetooth (registered trademark), or may support a communication method for performing wireless communication within a relatively wide range, such as the 3G or 4G communication methods of telecommunications carriers. The communication unit 16 may also support a wired communication method such as Ethernet (registered trademark). The communication unit 16 may support a plurality of communication methods.
The motion sensor 48 detects changes in the position and changes in the direction (posture) of the display device 6. The changes in position and direction are detected three-dimensionally. That is, the motion sensor 48 detects changes in position and direction not only in the horizontal direction but also in the vertical direction. In order to detect changes in the position and direction of the display device 6, the motion sensor 48 may include, for example, a triaxial acceleration sensor. In order to detect changes in the position of the display device 6, the motion sensor 48 may include a GPS (Global Positioning System) receiver or a barometric pressure sensor. In order to detect changes in the position of the display device 6, the motion sensor 48 may also use the distance measurement results of the distance measuring unit 46. The motion sensor 48 may detect changes in the position of the display device 6 by combining several of these methods. In order to detect changes in the direction of the display device 6, the motion sensor 48 may include a gyro sensor or an azimuth sensor. The motion sensor 48 may detect changes in the direction of the display device 6 by combining several of these methods.
The storage unit 24 is constituted by a non-volatile storage device such as a flash memory, and stores various programs and data. The programs stored in the storage unit 24 include the control program 24f. The data stored in the storage unit 24 include object data 24b, action data 24c, imaginary space data 24d, and catalog data 24g. The storage unit 24 may also be constituted by a combination of a portable storage medium such as a memory card and a reading/writing device that reads from and writes to the storage medium. In this case, the control program 24f, the object data 24b, the action data 24c, the imaginary space data 24d, and the catalog data 24g may be stored in the storage medium. The control program 24f, the object data 24b, the action data 24c, the imaginary space data 24d, and the catalog data 24g may also be acquired from another device such as a server device through communication by the communication unit 16.
The control program 24f provides functions related to various controls for operating the display device 6. The functions provided by the control program 24f include a function of superimposing a three-dimensional object on the images acquired by the photographing units 40 and 42 and displaying the result on the display units 32a and 32b, a function of detecting operations on the three-dimensional object, and a function of changing the three-dimensional object according to the detected operations.
The control program 24f includes a detection processing unit 25, a display object control unit 26, a viewpoint control unit 30, an image synthesis unit 27, and an order processing unit 29. The detection processing unit 25 provides a function of detecting real objects present in the photographing ranges of the photographing units 40 and 42. The functions provided by the detection processing unit 25 include a function of measuring the distance to each detected object.
The viewpoint control unit 30 provides a function of managing the position and direction of the user's viewpoint in the imaginary space. The functions provided by the viewpoint control unit 30 include a function of changing the position and direction of the user's viewpoint in the imaginary space according to the changes in position and direction of the display device 6 detected by the motion sensor 48. For example, when the motion sensor 48 detects that the display device 6 has moved forward, the viewpoint control unit 30 moves the user's viewpoint in the imaginary space forward. For example, when the motion sensor 48 detects that the display device 6 has rotated rightward, the viewpoint control unit 30 rotates the user's viewpoint in the imaginary space rightward. By thus changing the position and direction of the user's viewpoint in the imaginary space in accordance with the changes in position and direction of the display device 6, changes in the image of the imaginary space displayed superimposed on the image of the real space can be made to match the changes in the image of the real space.
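The viewpoint management just described can be sketched as a small state object that mirrors the device motion reported by the motion sensor; the coordinate and angle conventions are assumptions of this sketch.

```python
class Viewpoint:
    """User's viewpoint in the imaginary space, kept in step with the
    motion of the display device."""

    def __init__(self):
        self.x = self.y = self.z = 0.0   # metres from the starting position
        self.yaw = 0.0                   # degrees; 0 = initial facing direction

    def on_device_motion(self, dx, dy, dz, dyaw):
        # A 1 m forward move of the device moves the viewpoint 1 m
        # forward; a rightward head rotation rotates the viewpoint by
        # the same angle, so the imaginary image tracks the real one.
        self.x += dx
        self.y += dy
        self.z += dz
        self.yaw = (self.yaw + dyaw) % 360.0
```

Rendering the imaginary space from this viewpoint each frame keeps its changes consistent with the changes of the real scene.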
The order processing unit 29 provides a function of ordering goods using a three-dimensional object. The function of the order processing unit 29 will be described in detail later.
The object data 24b includes information about the shapes and properties of three-dimensional objects. The object data 24b is used for displaying three-dimensional objects. The action data 24c includes information about how operations on a displayed three-dimensional object act on that three-dimensional object. When an operation on a displayed three-dimensional object is detected, the action data 24c is used for judging how the three-dimensional object should change. The changes mentioned here include movement, rotation, deformation, disappearance, displacement, and so on. The imaginary space data 24d holds information about the states of the three-dimensional objects arranged in the imaginary space. The state of a three-dimensional object includes, for example, its position, posture, and deformation status. The catalog data 24g includes information, such as the specifications and prices of goods, used for selling the goods.
Next, with reference to Figures 44 to 47, other examples of control based on the functions provided by the control program 24f will be described. Figure 44 is a diagram showing an example of changing a three-dimensional object in linkage with changes in position. In step S3 shown in Figure 44, the image P1c is the image acquired by the photographing unit 40, that is, the image corresponding to the scene of the real space seen by the right eye. In the image P1c, a jogging path ahead of the user appears. The display device 6 also acquires an image of the same scene photographed by the photographing unit 42, that is, the image corresponding to the scene of the real space seen by the left eye.
The image P2c is a right-eye image generated according to the imaginary space data 24d and the object data 24b. In this example, the imaginary space data 24d holds information about the states of sign three-dimensional objects arranged at positions corresponding to the side of the jogging path, and the object data 24b holds information about the shapes and properties of the sign three-dimensional objects. Each sign three-dimensional object bears a number indicating the distance from the starting point, and is arranged at the position that distance along the jogging path from the starting point. The display device 6 similarly generates an image of the imaginary space seen from the left-eye viewpoint.
In step S3 shown in Figure 44, the display device 6 synthesizes the image P1c and the image P2c to generate the image P3c. The image P3c is the image displayed on the display unit 32a as the right-eye image. In the image P3c, the sign three-dimensional objects are added to the scene of the jogging path as if they actually existed.
When the user advances along the jogging path, the display device 6 detects the change in position and moves the viewpoint position in the imaginary space forward according to the detected change. For example, if the position of the display device 6 moves forward by 1 m, the display device 6 moves the user's viewpoint in the imaginary space forward by a distance equivalent to 1 m. By repeating such changes, the sign three-dimensional objects ahead of the user, like the trees beside the path, gradually approach the user as the user advances, and disappear from view once the user passes them. In step S4 shown in Figure 44, the display device 6 synthesizes the image P1d and the image P2d to generate the image P3d. In step S4 shown in Figure 44, the "600m" sign three-dimensional object, which was seen in the distance at the stage of step S3, is displayed in front of the user.
In this way, by changing the position of the display device 6, the user can change the three-dimensional objects displayed by the display device 6 without operating them by hand. That is, the display device 6 accepts a change in the position of the display device 6 caused by the user as an operation for changing the three-dimensional objects. An operation of changing three-dimensional objects through such changes in position follows a phenomenon the user experiences in real space, and is therefore intuitive and easy for the user to understand. Furthermore, operations based on changes in position can be combined with operations by hand and the like, so that a variety of changes of three-dimensional objects can be realized with high convenience.
In the example shown in Figure 44, signs that do not actually exist are displayed beside the jogging path. The display device 6 can therefore display information useful for the user in association with positions on the jogging path. Moreover, since the information displayed by the display device 6 can differ for each user, unlike a real sign, information convenient for each individual user can be written on the signs.
The three-dimensional objects that change in linkage with changes in position are not limited to signs. For example, a three-dimensional imaginary shopping street may be constructed, and the shops the user can see may be switched in linkage with the user walking in a room or the like. The display device 6 may also change the three-dimensional objects in linkage with the number of steps taken.
The display device 6 may make the degree of change of the three-dimensional objects larger than the detected change in position, or may make it smaller. For example, a change 10 times the magnitude of the change in position in the real space may be applied to the three-dimensional objects in the imaginary space. For example, in the example shown in Figure 44, the interval at which the sign three-dimensional objects are arranged may be made larger than the interval written on the signs, or may be made smaller. By adjusting the intervals of the signs in this way, the distance run can be adjusted according to physical condition. For a movement of 10 m in the real space, the display device 6 may associate a movement of 100 m in the imaginary space, or a movement of 1 m. When a plurality of attractions are displayed as three-dimensional objects in a place such as a room with nothing but white walls, making the degree of change of the three-dimensional objects larger than the detected change in position makes it easy to move between widely separated facilities. Conversely, when the user wants to control a three-dimensional object precisely, the degree of change of the three-dimensional objects may be made smaller than the detected change in position.
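The scaling between real and imaginary movement described above can be sketched as a per-mode factor applied to the detected displacement; the mode names and factor values are assumptions of this sketch.

```python
SCALE_BY_MODE = {
    "tour": 10.0,      # widely separated attractions in a small room
    "normal": 1.0,     # imaginary movement matches real movement
    "precise": 0.1,    # fine control of a single three-dimensional object
}

def imaginary_displacement(real_metres, mode="normal"):
    """Scale a detected real-space movement before applying it to the
    viewpoint in the imaginary space."""
    return real_metres * SCALE_BY_MODE[mode]
```

With the "tour" factor, a 10 m walk in the real room corresponds to 100 m in the imaginary space; with "precise", the same walk corresponds to 1 m.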
The display device 6 may also change the three-dimensional objects in linkage with changes in position in the vertical direction. For example, when the user jumps, the displayed scene may be changed to a scene of three-dimensional objects, such as buildings at the user's current position, viewed from a height corresponding to the height of the jump. In this case, when the user jumps with full force, a scene looking down from above on the three-dimensional objects, such as the buildings at the user's current position, may be displayed.
The display device 6 may also accumulate the amount of change in position, that is, the amount of movement, and change the three-dimensional objects according to the accumulated value. For example, the accumulated amount of movement may be converted into a point of arrival on a walk from Tokyo to Kyoto, and a characteristic building or landscape of the point of arrival may be displayed as a three-dimensional object superimposed on the scene of the real space. For example, when the point of arrival corresponds to Yokohama, a three-dimensional object of the gate of Chinatown may be displayed superimposed on part of the real scene. For example, when the point of arrival corresponds to Shizuoka, a three-dimensional object of Mount Fuji as seen from the point of arrival may be displayed superimposed behind the real scene.
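Mapping the accumulated movement to a landmark can be done with a simple milestone table; the distances below are placeholder values for illustration, not real route data from the embodiment.

```python
MILESTONES = [            # cumulative km from Tokyo (placeholder values)
    (0.0, "Tokyo"),
    (30.0, "Yokohama gate of Chinatown"),
    (150.0, "Mount Fuji seen from Shizuoka"),
    (450.0, "Kyoto"),
]

def landmark_for(total_km):
    """Pick the landmark three-dimensional object for the accumulated
    distance walked: the furthest milestone already reached."""
    name = MILESTONES[0][1]
    for km, n in MILESTONES:
        if total_km >= km:
            name = n
    return name
```

Each time the accumulated movement crosses a milestone, the superimposed landmark three-dimensional object is swapped for the next one.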
Figure 45 is a diagram conceptually showing operation screens arranged around the user. Figure 46 is a diagram showing an example of changing three-dimensional objects in linkage with changes in direction. In the examples shown in Figures 45 and 46, the imaginary space data 24d holds information about the states of the three-dimensional objects of the operation screens OS1 to OS3, and the object data 24b holds information about the shapes and properties of the three-dimensional objects of the operation screens OS1 to OS3.
The operation screen OS1 is arranged at a position in the imaginary space corresponding to the front of the user. The operation screen OS2 is arranged at a position in the imaginary space corresponding to the right side of the user. The operation screen OS3 is arranged at a position in the imaginary space corresponding to the left side of the user. A plurality of icons are arranged on the operation screens OS1 to OS3.
When the user wearing the display device 6 faces forward, the display device 6 displays the operation screen OS1 as shown in the image P3e. In this state, if the user performs a motion such as pressing an icon on the operation screen OS1 with a finger of the hand H1, the display device 6 performs the processing corresponding to the icon.
When the user turns to the right, the display device 6 changes the viewpoint direction in the imaginary space to the right according to the detected change in direction. For example, when the head of the user wearing the display device 6 rotates 90 degrees to the right, the display device 6 changes the direction of the user's viewpoint in the imaginary space by 90 degrees to the right. As a result, the display device 6 displays the operation screen OS2 arranged on the right side of the user in the imaginary space, as shown in the image P3f. In this state, if the user performs a motion such as pressing an icon on the operation screen OS2 with a finger of the hand H1, the display device 6 performs the processing corresponding to the icon.
When the user turns to the left, the display device 6 changes the viewpoint direction in the imaginary space to the left according to the detected change in direction. For example, when the head of the user wearing the display device 6 rotates 90 degrees to the left, the display device 6 changes the direction of the user's viewpoint in the imaginary space by 90 degrees to the left. As a result, the display device 6 displays the operation screen OS3 arranged on the left side of the user in the imaginary space, as shown in the image P3g. In this state, if the user performs a motion such as pressing an icon on the operation screen OS3 with a finger of the hand H1, the display device 6 performs the processing corresponding to the icon.
In this way, by changing the direction of the display device 6, the user can change the three-dimensional objects displayed by the display device 6 without operating them by hand. That is, the display device 6 accepts a change in the direction of the display device 6 caused by the user as an operation for changing the three-dimensional objects. An operation of changing three-dimensional objects through such changes in direction follows a phenomenon the user experiences in real space, and is therefore intuitive and easy for the user to understand. Furthermore, operations based on changes in direction can be combined with operations by hand and the like, so that a variety of changes of three-dimensional objects can be realized with high convenience.
In addition, in the embodiment shown in Figure 45 and Figure 46, a plurality of operation screens are displayed around the user. Therefore, the user can easily switch between operation screens merely by changing the direction of the face.
Figure 45 and Figure 46 show an embodiment in which operation screens are displayed on three surfaces around the user, but the display device 6 may display operation screens on four surfaces around the user, including behind the user. Alternatively, a continuous surface, such as a cylindrical inner surface surrounding the user, may be used as the operation screen. Alternatively, an operation screen may be provided on the surface above the user's head, that is, on the surface that is seen when the viewpoint direction is turned upward.
When operation screens are provided on a plurality of flat surfaces surrounding the user, the display device 6 may adjust the viewpoint or the direction of the three-dimensional objects in the virtual space so that, among the surfaces provided with operation screens, the surface that lies in the direction the user is facing and whose angle with the user's line of sight is closest to 90 degrees becomes orthogonal to the line of sight. Adjusting the viewpoint or the direction of the three-dimensional objects in this way improves the visibility of the operation screens.
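As a minimal sketch of the viewpoint adjustment described above, assuming each operation screen can be characterized by the single yaw angle at which it directly faces the user (a simplification of the multi-surface arrangement, not the patent's actual representation), a hypothetical helper can pick the screen the user most nearly faces and return the viewpoint direction that makes that screen orthogonal to the line of sight:

```python
def snap_view_to_screen(view_yaw_deg, screen_yaws_deg):
    """Pick the operation screen whose facing angle is closest to the
    current line of sight, and return the yaw that makes that screen
    exactly orthogonal to the line of sight (hypothetical helper)."""
    def diff(a, b):
        # smallest signed angular difference between two yaws, in degrees
        return (a - b + 180.0) % 360.0 - 180.0
    # choose the screen the user is most nearly facing
    best = min(screen_yaws_deg, key=lambda s: abs(diff(s, view_yaw_deg)))
    return best  # adjusted viewpoint direction: face the screen head-on

# three screens at 90-degree intervals around the user, as in Figures 45/46
screens = [0.0, 90.0, -90.0]
print(snap_view_to_screen(70.0, screens))   # 90.0 -> snap to the right screen
print(snap_view_to_screen(-30.0, screens))  # 0.0  -> stay on the front screen
```

The same selection rule extends to four surrounding surfaces by adding a fourth yaw (180 degrees) to the list.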
The three-dimensional objects that change in conjunction with a direction change are not limited to operation screens. For example, three-dimensional objects of goods may be displayed on shelves around the user, and the goods the user can see and pick up may be changed in conjunction with the direction in which the user wearing the display device 6 turns the head. A three-dimensional map including three-dimensional objects of buildings may be arranged around the user, and the displayed map may be changed, in conjunction with the direction in which the user wearing the display device 6 turns the head, to the map of the direction the user faces. The display device 6 may also display, as three-dimensional objects, the weather, events, major facilities and the like of the area in the direction of the user's line of sight, in conjunction with the direction in which the user wearing the display device 6 turns the head.
The display device 6 may also associate the direction in which the user swings the head with some processing. For example, the display device 6 turns a page of a displayed three-dimensional object of a book forward when the user swings the head to the right, and turns a page back when the user swings the head to the left. In this case, no linked processing is performed for the direction change when the swung head returns to its original position.
Figure 47 is a flowchart showing the processing sequence of control for changing three-dimensional objects in conjunction with changes in position and direction. The processing sequence shown in Figure 47 is realized by the control unit 22 executing the control program 24f. First, in step S101, the control unit 22 generates a virtual space based on the virtual space data 24d and the object data 24b. Then, in step S102, the control unit 22 initially sets the viewpoint position and direction in the virtual space. The initial setting of the viewpoint position and direction in the virtual space is performed, for example, according to a predetermined rule for associating the real space with the virtual space.
Then, in step S103, the control unit 22 acquires the detection result of the motion sensor 48. In step S104, the control unit 22 changes the viewpoint position in the virtual space in accordance with the change in position of the display device 6, and in step S105, changes the viewpoint direction in the virtual space in accordance with the change in direction of the display device 6.
Then, in step S106, the control unit 22 determines whether the display of the three-dimensional objects has ended. When the display of the three-dimensional objects has not ended (No in step S106), the control unit 22 returns to step S103. When the display of the three-dimensional objects has ended (Yes in step S106), the control unit 22 ends the processing sequence shown in Figure 47.
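The sensor-driven part of the Figure 47 loop (steps S103 to S105) can be sketched as folding a stream of motion-sensor readings into the viewpoint state; the tuple format of a reading (position delta, yaw delta in degrees) is an assumption made for illustration, not the patent's data format:

```python
def apply_motion(viewpoint, events):
    """Fold motion-sensor readings into the virtual-space viewpoint,
    mirroring steps S103-S105 of Figure 47."""
    pos, yaw = list(viewpoint["pos"]), viewpoint["yaw"]
    for dp, dyaw in events:                      # S103: each sensor result
        pos = [p + d for p, d in zip(pos, dp)]   # S104: move viewpoint with device
        yaw = (yaw + dyaw) % 360.0               # S105: turn viewpoint with device
    return {"pos": pos, "yaw": yaw}

vp = apply_motion({"pos": [0.0, 0.0, 0.0], "yaw": 0.0},
                  [((0.0, 0.0, 1.0),  90.0),    # step forward, look right
                   ((1.0, 0.0, 0.0), -90.0)])   # step right, look ahead again
print(vp)  # {'pos': [1.0, 0.0, 1.0], 'yaw': 0.0}
```

In the actual device this fold would run once per frame until the display-end condition of step S106 becomes true.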
Embodiment 4
An embodiment in which the display device 6 is applied to the sale of goods using an electronic catalog will be described. Figures 48 to 52 are diagrams for explaining an embodiment in which the display device 6 is applied to the sale of home electric appliances and furniture using an electronic catalog.
Figure 48 is a diagram showing the electronic catalog being viewed in the room where the purchased goods will be placed. Image P1h is an image obtained by the photographing unit 40 when the user holds the electronic catalog in the room and stretches the hand H2 forward, that is, an image corresponding to the scene of the real space seen with the right eye. Image P1h shows a sofa 61 placed against a wall of the room, a table 62 placed in the opposite corner, and the user's hand H2. The display device 6 also obtains an image of the same scene photographed by the photographing unit 42, that is, an image corresponding to the scene of the real space seen with the left eye.
Image P2h is a right-eye image generated based on the virtual space data 24d and the object data 24b. In this embodiment, the virtual space data 24d holds information about the states of three-dimensional objects corresponding to the walls, floor and ceiling of the room, and about the state of a three-dimensional object of a catalog 63 arranged at the position of the user's hand H2, while the object data 24b holds information about the shape and the like of each three-dimensional object. The display device 6 likewise generates an image of the virtual space seen from the left-eye viewpoint.
The display device 6 synthesizes image P1h and image P2h to generate image P3h. In image P3h, the three-dimensional object of the catalog 63 is displayed as if opened on the hand H2. The three-dimensional objects corresponding to the walls, floor and ceiling are each arranged in the virtual space so that their surfaces coincide with the surfaces of the real walls, floor and ceiling. Moreover, the three-dimensional objects corresponding to the walls, floor and ceiling are each configured so that their surfaces have the same appearance as the surfaces of the real walls, floor and ceiling. Therefore, in image P3h, the user cannot tell whether the displayed walls, floor and ceiling of the room are the real surfaces or the surfaces of the three-dimensional objects.
The display device 6 likewise synthesizes the image photographed by the photographing unit 42 and the image of the virtual space seen from the left-eye viewpoint, and generates the image to be displayed on the display unit 32b as the left-eye image. The display device 6 displays the composite images generated as described above on the display units 32a and 32b. As a result, the user can see a scene as if the three-dimensional object of the catalog 63 were opened on the hand H2 in the room.
Figure 49 is a diagram for explaining a scene of selecting goods from the catalog. In step S111 of Figure 49, the user holds the three-dimensional object of the catalog 63 on the hand, opened to the page on which the desired goods appear. On each page of the catalog 63, three-dimensional objects of goods are arranged in a reduced, flattened state. In step S112, the user picks up the three-dimensional object of the desired goods on the page with two fingers of the hand H1. By being picked up with two fingers, the three-dimensional object is selected and enters a state of moving according to the movement of the hand H1.
In step S113, the user moves the hand H1 away from the catalog 63 while holding the three-dimensional object. As a result, the reduced three-dimensional object is torn off the page of the catalog 63. The torn-off three-dimensional object expands to the same size as the actual goods and becomes a three-dimensional object of a television set 64. The three-dimensional objects of the goods appearing in the catalog are defined as having a weight of 0. Therefore, the user can handle the three-dimensional object of the television set 64 without considering the influence of weight.
Figure 50 is a diagram for explaining a scene of examining the size and installation place of the television set. The real television sets corresponding to the television set 64 include a plurality of models of the same design in different sizes. At the stage when the three-dimensional object of the television set 64 is torn from the page of the catalog 63, it has the same size as the smallest model. In step S121 of Figure 50, the user holds one end of the three-dimensional object of the television set 64 with the fingers of the hand H1 and the other end with the fingers of the hand H2.
In step S122, the user widens the distance between the hand H1 and the hand H2 while keeping hold of the three-dimensional object of the television set 64. The size of the three-dimensional object of the television set 64 changes according to the change in distance between the hand H1 and the hand H2. When the size of the three-dimensional object of the television set 64 reaches the desired size, the user releases the three-dimensional object of the television set 64 by widening the finger intervals. The display device 6 then readjusts the size of the three-dimensional object of the television set 64 to the size of the model closest to the current size of the three-dimensional object. In this way, the user can easily select the size of goods by a simple operation.
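The readjustment to the nearest real model can be sketched as a nearest-neighbor lookup over the available sizes; the centimeter figures and three-model lineup below are illustrative assumptions, not sizes from the source:

```python
def snap_to_model(stretched_cm, model_sizes_cm):
    """After the user releases the stretched TV object (step S122),
    readjust its size to the closest real model of the same design."""
    return min(model_sizes_cm, key=lambda size: abs(size - stretched_cm))

models = [80.0, 100.0, 120.0]          # hypothetical size lineup of one design
print(snap_to_model(107.0, models))    # 100.0
print(snap_to_model(115.0, models))    # 120.0
```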
Thereafter, the user moves the three-dimensional object of the television set 64 to the desired position at the desired height. Since the weight of the three-dimensional objects of the goods is defined as 0, as shown in step S123, the moved three-dimensional object stays in place even when the user releases the hands. In step S124, the user sits on the sofa 61 and checks the position and height of the three-dimensional object of the television set 64 floating against the wall. The three-dimensional object of the television set 64 has the same size as the actual television set, and can be made to float at an arbitrary position superimposed on the user's room. Therefore, before buying the television set, the user can consider how to install it in the environment where it will actually be used.
Figure 51 is a diagram for explaining a scene of selecting a television stand. In step S131 of Figure 51, the user sits on the sofa and holds the three-dimensional object of the catalog 63 with the hand H2, opened to the page of television stands.
In the catalog data 24g, the television set and the television stand are defined as associated goods. Furthermore, the catalog data 24g records the actual size of each television stand. Based on this information, as shown in step S131, the display device 6 selects, from the television stands recorded in the catalog data 24g and according to the situation of the three-dimensional object of the television set 64, those television stands whose height matches the distance from the floor to the three-dimensional object of the television set 64, and displays them on the television stand page. Therefore, the user can easily select a preferred television stand from among the television stands on which the television set can be placed at the height of the current three-dimensional object.
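The stand selection can be sketched as filtering the catalog data by recorded height; the stand names, heights and matching tolerance below are assumptions for illustration:

```python
def matching_stands(tv_bottom_height_cm, stands, tolerance_cm=2.0):
    """From catalog entries (name, real height), keep only the stands
    whose height matches the floor-to-TV distance of the 3D object."""
    return [name for name, height in stands
            if abs(height - tv_bottom_height_cm) <= tolerance_cm]

catalog = [("stand-A", 40.0), ("stand-B", 55.0), ("stand-C", 56.5)]
print(matching_stands(56.0, catalog))  # ['stand-B', 'stand-C']
```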
In step S132, the user selects the desired television stand by touching the three-dimensional object of the catalog with a finger of the hand H1. In step S133, the three-dimensional object of the selected television stand 65 is displayed under the three-dimensional object of the television set 64. Since the television set and the television stand are defined as associated goods, the display device 6 can display the three-dimensional object of the selected television stand 65 in association with the already displayed three-dimensional object of the television set 64.
Figure 52 is a diagram for explaining a scene of moving a real object. For example, suppose the user tries a layout in which the table 62 is moved elsewhere. When the display device 6 detects an operation of selecting a real object, it displays a three-dimensional object imitating the real object in place of the real object in the display space. Then, when the display device 6 detects an operation of moving the three-dimensional object, it moves the three-dimensional object according to the operation.
The user can move the three-dimensional object easily without worrying about the influence of gravity, and can also leave the three-dimensional object suspended in the air. Moreover, although the real object is no longer visible in the display space, it actually stays in place, so after trying the changed layout there is no need to return the real object to its original place.
Image P1i shown in Figure 52 is an image obtained by the photographing unit 40 when the user is about to grasp the table 62, that is, the image of the scene of the real space seen with the right eye. Image P1i shows the table 62 placed in the corner of the room and the user's hand H1. The display device 6 also obtains an image of the same scene photographed by the photographing unit 42, that is, the image of the scene of the real space seen with the left eye.
Image P2i is a right-eye image generated based on the virtual space data 24d and the object data 24b. At this stage, only the three-dimensional objects of the wall and the floor are arranged at the place in the virtual space corresponding to the place where the table 62 is located in the real space. The display device 6 synthesizes image P1i and image P2i to generate image P3i. The display device 6 likewise synthesizes the image photographed by the photographing unit 42 and the image of the virtual space seen from the left-eye viewpoint, and generates the image to be displayed on the display unit 32b as the left-eye image. The display device 6 displays the composite images generated as described above on the display units 32a and 32b. As a result, the user can see a scene as if grasping the table 62 in the room.
Image P1j shown in Figure 52 is an image obtained by the photographing unit 40 when the user grasps the table 62 to select it and then lifts it, that is, the image of the scene of the real space seen with the right eye. At this point, the table 62 is not actually lifted but stays in place.
Image P2j is a right-eye image generated based on the virtual space data 24d and the object data 24b. At this stage, since the user has performed the selecting operation of grasping the real object, i.e. the table 62, the display device 6 generates a three-dimensional object 66 imitating the table 62. The object data 24b for generating the three-dimensional object 66 may be stored in the storage unit 24 in advance, or may be dynamically generated from the images photographed by the photographing units 40 and 42. In image P2j, the three-dimensional object 66 is arranged at a position away from the floor, according to the user's lifting operation.
The display device 6 synthesizes image P1j and image P2j to generate image P3j. At this time, the display device 6 performs hiding processing so that the table 62, in place of which the three-dimensional object 66 was generated, is not displayed. For example, the display device 6 performs the processing after setting the distance to the table 62 to infinity. As a result, the three-dimensional objects of the walls, floor, ceiling and so on are displayed in front of the table 62, and the table 62 is hidden behind these three-dimensional objects.
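The hiding processing can be sketched as a per-pixel depth override at compositing time: pixels belonging to the replaced real object report an infinite distance, so every virtual surface wins the depth test against them. How the real object's pixels are identified (the boolean flag below) is an assumption about segmentation, not a detail given in the source:

```python
def depth_for_compositing(pixel_is_real_object, measured_depth_m):
    """Report an infinite distance for pixels of the real object being
    replaced (e.g. table 62), so walls, floor and the stand-in object 66
    are always drawn in front of it."""
    return float("inf") if pixel_is_real_object else measured_depth_m

print(depth_for_compositing(True, 1.8))   # inf  -> real table always hidden
print(depth_for_compositing(False, 1.8))  # 1.8  -> normal depth test
```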
Next, the processing sequence of the order processing will be described with reference to Figure 53. Figure 53 is a flowchart showing the processing sequence of the order processing. The order processing shown in Figure 53 is realized by the control unit 22 executing the control program 24f.
As shown in Figure 53, first, in step S201, the control unit 22 initially sets the catalog data 24g. Then, in step S202, the control unit 22 synthesizes and displays the virtual space image, which includes the three-dimensional object of the catalog, and the real space image.
In step S203, the control unit 22 detects an operation on a three-dimensional object. Then, in step S204, the control unit 22 determines whether the detected operation is an operation for placing an order. When the detected operation is not an operation for placing an order (No in step S204), the control unit 22 proceeds to step S205.
In step S205, the control unit 22 determines whether the target of the detected operation is a three-dimensional object. When the target of the detected operation is a three-dimensional object (Yes in step S205), the control unit 22 proceeds to step S206. When the target of the detected operation is not a three-dimensional object (No in step S205), the control unit 22 proceeds to step S208.
In step S206, the control unit 22 changes the goods displayed in the catalog according to the operation. Then, in step S207, the control unit 22 updates the display according to the operation. Thereafter, the control unit 22 returns to step S203.
In step S208, the control unit 22 performs the hiding processing so that the real object that is the target of the operation is not displayed. Then, in step S209, the control unit 22 adds to the virtual space a three-dimensional object that takes the place of the real object that is the target of the operation. Thereafter, the control unit 22 returns to step S203.
When the operation detected in step S203 is an operation for placing an order (Yes in step S204), the control unit 22 proceeds to step S210. In step S210, the control unit 22 performs the order processing for ordering the goods.
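The branching of Figure 53 (steps S203 to S210) can be sketched as a loop over detected operations; the operation records below are illustrative assumptions, not the patent's data format:

```python
def order_loop(operations):
    """Process detected operations until an ordering operation arrives
    (S204), editing catalog objects (S206/S207) or replacing real
    objects with stand-in 3D objects (S208/S209)."""
    replaced, edits = [], 0
    for op in operations:                 # S203: detect an operation
        if op["kind"] == "order":         # S204: operation for ordering?
            return {"ordered": True, "edits": edits, "replaced": replaced}  # S210
        if op["target"] == "3d-object":   # S205: target is a 3D object?
            edits += 1                    # S206/S207: change goods, update display
        else:
            replaced.append(op["name"])   # S208/S209: hide real object, add stand-in
    return {"ordered": False, "edits": edits, "replaced": replaced}

ops = [{"kind": "touch", "target": "3d-object",   "name": "tv"},
       {"kind": "grab",  "target": "real-object", "name": "table"},
       {"kind": "order", "target": "3d-object",   "name": "tv"}]
print(order_loop(ops))  # {'ordered': True, 'edits': 1, 'replaced': ['table']}
```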
The goods purchased using the catalog are not limited to television sets and television stands. For example, a wall calendar or a painting may be purchased; the three-dimensional object of the wall calendar or painting is configured so that it can be hung at any position on the walls of the room. Curtains may also be purchased, for example. When the three-dimensional object of a curtain is hung over a window, the display device 6 may lower the luminance of the display units 32a and 32b so that the light-blocking property of the actual curtain can be reproduced.
When a room layout is being tried out, the display device 6 may change the scenery outside the window of the room using a three-dimensional object, so that the influence of seasonal changes in the outside scenery can be checked. The display device 6 may also be configured so that, when a room layout is being tried out, the user can arbitrarily set the color balance and luminance of the display units 32a and 32b so that the influence of changes in the altitude of the sun due to season and time of day can be checked.
Embodiment 5
Next, the structure of a display device 7 according to a fifth embodiment will be described with reference to Figure 54. Figure 54 is a block diagram of the display device 7. As shown in Figure 54, the display device 7 has an operation unit 13, a control unit 22, a storage unit 24, a communication unit 16, display units 32a and 32b, photographing units 40 and 42, a detection unit 44, a distance measuring unit 46 and a motion sensor 48. The operation unit 13 accepts basic operations such as starting and stopping the display device 7 and changing the operation mode.
The storage unit 24 is constituted by a nonvolatile storage device such as flash memory, and stores various programs and data. The programs stored in the storage unit 24 include a control program 24h. The data stored in the storage unit 24 include object data 24b, action data 24c and virtual space data 24d. The storage unit 24 may also be constituted by a combination of a portable storage medium such as a memory card and a reader/writer that reads from and writes to the storage medium. In this case, the control program 24h, the object data 24b, the action data 24c and the virtual space data 24d may be stored in the storage medium. The control program 24h, the object data 24b, the action data 24c and the virtual space data 24d may also be acquired from another device such as a server device by communication via the communication unit 16.
The control program 24h provides functions related to various kinds of control for operating the display device 7. The functions provided by the control program 24h include the function of superimposing three-dimensional objects on the images obtained by the photographing units 40 and 42 and displaying the result on the display units 32a and 32b, the function of detecting operations on the three-dimensional objects, and the function of changing the three-dimensional objects according to the detected operations.
The control program 24h includes a detection processing unit 25, a display object control unit 26, a viewpoint control unit 30 and an image synthesis unit 27.
The viewpoint control unit 30 provides a function for managing the viewpoint position and direction of the user in the virtual space. The functions provided by the viewpoint control unit 30 include the function of changing the viewpoint position and direction of the user in the virtual space according to the changes in position and direction of the display device 7 detected by the motion sensor 48. For example, when the motion sensor 48 detects that the display device 7 has moved forward, the viewpoint control unit 30 moves the viewpoint of the user in the virtual space forward. For example, when the motion sensor 48 detects that the display device 7 has rotated to the right, the viewpoint control unit 30 rotates the viewpoint of the user in the virtual space to the right. In this way, by changing the viewpoint position and direction of the user in the virtual space in step with the changes in position and direction of the display device 7, changes in the image of the virtual space displayed superimposed on the image of the real space can be made to match changes in the image of the real space.
Control based on the functions provided by the control program 24h is the same as control based on the functions provided by the above-described control program 24f, except for control based on the functions provided by an order processing unit 29.
An embodiment in which the display device 7 is used to sell goods over the Internet will be described. Figures 55 to 59 are diagrams for explaining an embodiment in which the display device 7 is applied to selling pizza over the Internet.
Figure 55 is a diagram for explaining the start of the pizza order processing. When starting to order a pizza, the user wears the display device 7 and looks at a place having a flat surface of a certain breadth. For example, when the user looks at a table T2, the display device 7 displays an image P3k showing the real table T2. In accordance with an instruction from the user, the display device 7 acquires object data 24b, action data 24c and virtual space data 24d from a pizza-selling website by communication via the communication unit 16, and generates a virtual space based on the acquired data. The display device 7 displays an image P3m by superimposing the generated virtual space image on the real space image.
In image P3m, a plurality of three-dimensional objects are arranged on the table T2. The arranged three-dimensional objects include a large dough 161L, a medium dough 161M, a small dough 161S, boxes 162a to 162f containing toppings such as sesame, tomato and cheese, a rolling pin 163, a ketchup squeeze bottle 164 and an oven 165. The dough 161L is the dough for an L-size pizza, the dough 161M is the dough for an M-size pizza, and the dough 161S is the dough for an S-size pizza. In this way, by starting the pizza order processing, the ingredients for a pizza and the oven 165 are arranged on the flat surface as three-dimensional objects.
The display device 7 may also display the pizza ingredients and the oven 165 using object data 24b, action data 24c and virtual space data 24d stored in the storage unit 24 in advance.
Figure 56 is a diagram for explaining the process of determining the size and thickness of the dough. In step S311 of Figure 56, the user grasps the dough 161M with the hand H1. By being grasped, the dough 161M is selected and enters a state of moving according to the movement of the hand H1. In step S312, the user places the dough 161M at roughly the center of the flat surface of the table T2 and grasps the rolling pin 163 with the hand H1. By being grasped, the rolling pin 163 is selected and enters a state of moving according to the movement of the hand H1.
In step S313, the user places the rolling pin 163 on the dough 161M and rolls it with the hands H1 and H2. In the object data 24b, the rolling pin 163 is defined as a rigid body and the dough 161M is defined as a plastic body. Moreover, in the action data 24c, a plastic body is defined to be dented at the pressed portion when pressed by a rigid body. Therefore, when the user rolls the rolling pin 163 over the dough 161M, the dough 161M is stretched into a circular shape and becomes thinner. The user rolls the rolling pin 163 over the dough 161M until the dough 161M reaches the desired size and thickness, as shown in step S314.
The operation for determining the size and thickness of the dough is not limited to the embodiment shown in Figure 56. For example, when the user clamps the dough 161M between both hands and then widens the interval between the hands, the dough 161M may be stretched into a circular shape whose diameter is the interval between the hands. Alternatively, when the user pinches a part of the dough 161M with two fingers, the whole dough 161M may be deformed into a thin circular shape whose thickness is the interval between the two fingers. In these operations, the size and thickness of the dough can be adjusted easily by adjusting the interval between the hands or between the fingers.
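If the dough is approximated as a flat cylinder whose volume is conserved while being stretched (an assumption made here for illustration; the source only says the dough spreads and thins), the thinning that accompanies a given widening can be computed directly:

```python
import math

def stretch_dough(diameter_cm, thickness_cm, new_diameter_cm):
    """Return the new thickness of a plastic dough disc after it is
    stretched to new_diameter_cm, assuming its volume is conserved."""
    volume = math.pi * (diameter_cm / 2) ** 2 * thickness_cm
    return volume / (math.pi * (new_diameter_cm / 2) ** 2)

# rolling a 20 cm, 2 cm thick disc out to 40 cm quarters its thickness
print(stretch_dough(20.0, 2.0, 40.0))  # 0.5
```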
Figure 57 is a diagram for explaining the process of putting on toppings. In step S315 of Figure 57, the user pinches the sesame contained in the box 162d with the fingers of the hand H1. By being pinched, the sesame is selected and enters a state of moving according to the movement of the hand H1. In step S316, after moving the sesame to the desired position on the dough 161M, the user widens the interval between the fingers of the hand H1. In this way, the sesame is placed at the desired position on the dough 161M.
By repeating the same operation, the user places desired amounts of desired toppings at desired positions.
In step S317, the user grasps the ketchup squeeze bottle 164 with the hand H1. By being grasped, the ketchup squeeze bottle 164 is selected and enters a state of moving according to the movement of the hand H1. In step S318, the user holds the ketchup squeeze bottle 164 with its outlet facing downward and moves it over the dough 161M while pressing the body of the bottle. In the action data 24c, the squeeze bottle is defined so that its contents are pushed out from the outlet when the body of the bottle is pressed. Using this action, in step S318, the user draws a picture on the dough 161M with ketchup.
Figure 58 is a diagram for explaining the process of ordering the pizza. In step S319 of Figure 58, the user opens the front door of the oven 165 with the hand H1. In step S320, the user holds the dough 161M with the hands H1 and H2 and puts it into the oven 165, and in step S321, presses a switch of the oven 165 with a finger of the hand H1.
In this way, when the motion of baking the pizza is performed, the order of the pizza is confirmed and order data is sent to the pizza-selling website. By associating the final step of making the goods with placing the order, the user can place an order intuitively, in the course of making the goods, without any wasted operation. The operation for ordering the pizza may be another operation. For example, a three-dimensional object of an order button may be displayed together with the pizza ingredients, and the operation of pressing this button may be used as the operation for ordering the pizza.
The order data is used to determine the price of the pizza and to reproduce the ordered pizza. The order data includes information about the size of the selected dough, the size and thickness of the stretched dough, and the kinds, amounts and positions of the toppings. The order data may also include an image of the pizza three-dimensional object made by the user, or the history of the operations the user performed when making the pizza three-dimensional object. This information is obtained in the course of the user operating the three-dimensional objects of the pizza ingredients to reproduce the same steps as actually making a pizza. Therefore, the user can order a pizza in a way that makes the finished pizza easy to imagine, without troublesome operations such as entering quantities.
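The order data described above could be represented as a simple record of the crust and the toppings; every field name below is an assumption for illustration, not a format given in the source:

```python
def build_order(crust_size, crust_diameter_cm, crust_thickness_cm, toppings):
    """Assemble order data: selected crust size, its rolled-out dimensions,
    and the kind, amount and position of each topping."""
    return {
        "crust": {"size": crust_size,
                  "diameter_cm": crust_diameter_cm,
                  "thickness_cm": crust_thickness_cm},
        # each topping: (kind, amount, (x_cm, y_cm) position on the crust)
        "toppings": [{"kind": k, "amount": a, "pos": p} for k, a, p in toppings],
    }

order = build_order("M", 30.0, 0.4,
                    [("sesame", 5, (10.0, 12.0)), ("ketchup", 1, (15.0, 15.0))])
print(order["crust"]["size"], len(order["toppings"]))  # M 2
```

A record like this is what step S335 of Figure 60 would update after each detected operation, and what step S337 would transmit to the supplier.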
Figure 59 is a diagram showing an embodiment of delivering the pizza. Afterwards, as shown in Figure 59, a pizza 172 placed in a pizza box 171 is delivered according to the order data. The pizza 172 is made so as to reproduce, as exactly as possible, the pizza three-dimensional object made in Figures 55 to 58. A cook may reproduce the pizza 172 while referring to the order data. The cook may also make the pizza 172 while looking at the image of the pizza the user made with three-dimensional objects, or while looking at video reproducing the user's operations. Alternatively, a machine (robot) may make the pizza 172 according to the order data.
In this way, by creating order data according to operations on three-dimensional objects, goods can easily be ordered according to one's own preferences.
Next, the processing sequence of the order processing will be described with reference to Figure 60. Figure 60 is a flowchart showing the processing sequence of the order processing. The processing sequence shown in Figure 60 is realized by the control unit 22 executing the control program 24h.
As shown in Figure 60, first, in step S331, the control unit 22 synthesizes and displays the virtual space image, which includes the three-dimensional objects related to the goods, and the real space image. Then, in step S332, the control unit 22 initially sets the order data. Specifically, the control unit 22 matches the state of the goods indicated by the order data with the state of the goods indicated by the three-dimensional objects.
In step S333, the control unit 22 detects an operation on a three-dimensional object. Then, in step S334, the control unit 22 determines whether the detected operation corresponds to placing an order. When the detected operation does not correspond to placing an order (No in step S334), the control unit 22 proceeds to step S335. In step S335, the control unit 22 updates the order data according to the detected operation. Then, in step S336, the control unit 22 updates the display of the display units 32a and 32b according to the detected operation. Thereafter, the control unit 22 returns to step S333.
When the detected operation corresponds to placing an order (Yes in step S334), in step S337 the control unit 22 performs the order processing. Specifically, the control unit 22 sends the order data to the supplier by communication via the communication unit 16. Thereafter, the control unit 22 ends the order processing.
The order method described above can also be used when ordering other foods over the Internet. For example, when ordering noodles, by using three-dimensional objects to reproduce the steps of boiling the noodles, making the soup and putting on toppings, it is possible to specify the amount of noodles, how firmly they are boiled, the strength of the flavor, and the kinds, amounts and arrangement of the toppings. For example, when ordering a boxed lunch, by using three-dimensional objects to reproduce the steps of packing dishes into the lunch box and packing rice into the lunch box, it is possible to specify the kinds and amounts of the dishes and the amount of rice. For example, when ordering sushi, by using three-dimensional objects to reproduce the step of forming sushi, it is possible to specify the kinds of sushi toppings and the way the sushi is arranged in the sushi box.
In the above order mode, multiple users may also share the imaginary space. In this case, the imaginary space is managed by one of the display devices 7 held by the multiple users, or by another device such as a server device, and each display device 7 sends information about the operations it detects to the device managing the imaginary space via communication. The device managing the imaginary space updates the three-dimensional objects and the subscription data in the imaginary space according to the transmitted operation information. By sharing the imaginary space among multiple users in this way, work such as making a pizza can be performed jointly in the imaginary space.
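The sharing scheme described above — each display device forwards the operations it detects, and a single managing device applies them to the shared space — can be sketched as follows. This is a hypothetical illustration; the class and field names are assumptions, and the communication layer is omitted:

```python
# Hypothetical sketch of the device managing a shared imaginary space.
# Each display device 7 sends its detected operations here; the manager
# applies them so that every participant sees the same 3D objects.

class SharedSpaceManager:
    def __init__(self):
        self.objects = {}       # object id -> state of the 3D object
        self.subscription = {}  # shared subscription data

    def apply_operation(self, device_id, op):
        """Update the shared space from one display device's operation."""
        obj = self.objects.setdefault(op["object"], {})
        obj.update(op.get("changes", {}))
        self.subscription.update(op.get("order_changes", {}))
        # The returned state would be broadcast back to all display devices.
        return {"object": op["object"], "state": dict(obj), "by": device_id}

# Two users jointly making a pizza in the shared space.
mgr = SharedSpaceManager()
mgr.apply_operation("device_a", {"object": "pizza", "changes": {"base": "thin"}})
update = mgr.apply_operation("device_b", {"object": "pizza", "changes": {"topping": "salami"}})
```

Centralizing the updates in one managing device, as the text describes, gives all participants a single authoritative object state rather than requiring the devices to reconcile conflicting edits among themselves.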
The above order mode can also be applied to ordering commodities other than food via the internet. For example, when ordering a bouquet or a flower arrangement, the flowers in stock at a flower shop may be displayed as three-dimensional objects. In this case, by reproducing the process of making a bouquet or flower arrangement while combining the three-dimensional objects of favorite flowers, the user can purchase a bouquet or arrangement in which favorite flowers are combined in a favorite configuration. In this case, the interior of the flower shop may also be reproduced with three-dimensional objects, and the order may be placed by taking the finished bouquet or arrangement to the register. The bouquet or arrangement may be delivered to the user or to a recipient, or the user may pick it up at the flower shop after indicating a time for the commodity to be ready or after receiving a notification from the flower shop.
The above order mode can also be applied to ordering clothes and accessories via the internet. In this case, the user can purchase commodities after confirming how they look together by combining three-dimensional objects of the clothes and accessories. Three-dimensional objects of commodities from different sales floors may also be combined. Moreover, the user may combine a commodity's three-dimensional object with real clothes or accessories that have already been purchased. In this way, by displaying clothes and accessories as three-dimensional objects on which operations such as moving can be performed, commodities can be purchased while various combinations are confirmed.
Moreover, by displaying clothes and accessories as three-dimensional objects superimposed on the real space, the user can accurately grasp the size of the commodities.
When ordering clothes and accessories via the internet, the commodities may also be displayed in an imaginary shopping center that imitates a real store. In this case, displays that would be impossible in reality, such as making commodities float in the air, can be performed. Moreover, unlike a paper catalog, the display can be linked to inventory; for example, a commodity that is out of stock is simply not displayed. In addition, since the commodities are merely imaginary, the user can complete payment for all commodities at a single store regardless of which store each commodity belongs to. In this case, the distribution of sales to each store is performed in background processing.
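The inventory-linked display and the background distribution of sales described above can be sketched in a few lines. This is a hypothetical illustration with assumed field names; the patent describes the behavior, not a data model:

```python
# Hypothetical sketch: show only in-stock commodities, accept one payment,
# and attribute the proceeds to each store in background processing.
from collections import defaultdict

def visible_commodities(catalog):
    """Unlike a paper catalog, out-of-stock commodities are simply not shown."""
    return [c for c in catalog if c["stock"] > 0]

def settle(purchases):
    """The user pays once; sales are split per store behind the scenes."""
    per_store = defaultdict(int)
    for item in purchases:
        per_store[item["store"]] += item["price"]
    return sum(p["price"] for p in purchases), dict(per_store)

catalog = [
    {"name": "shirt", "store": "A", "price": 30, "stock": 2},
    {"name": "hat",   "store": "B", "price": 20, "stock": 0},
    {"name": "belt",  "store": "B", "price": 15, "stock": 5},
]
shown = visible_commodities(catalog)          # the hat is hidden: no stock
total, per_store = settle(shown)              # one payment, split per store
```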
For example, an order for a commodity is placed by taking the three-dimensional object of the commodity to the register. Alternatively, the order is placed by an operation that imitates taking a three-dimensional object of a credit card out of a wallet and presenting it. The actual payment is made with a real credit card registered in advance.
Multiple users may also share the imaginary space including the shopping center. In this case, a three-dimensional object representing each user may be displayed at the position corresponding to that user's viewpoint in the imaginary space. Displaying three-dimensional objects representing the users makes it easy to grasp the popularity of stores and commodities.
The modes of the present invention shown in the above embodiments can be arbitrarily changed without departing from the spirit and scope of the present invention. The above embodiments may also be combined as appropriate. For example, the control program shown in the above embodiments may be divided into multiple modules, or may be integrated with other programs.
In the above embodiments, examples in which the user himself or herself operates the three-dimensional object were described; however, the display device may also detect the movement of another person, an animal, a machine, or the like within the shooting range of the photographing unit as a real object. The imaginary space may also be shared with other devices. That is, the display device may be configured so that a person other than the user of the display device can view and operate the three-dimensional object in the imaginary space through another device.
In the above embodiments, the display device detects the operation on the three-dimensional object by itself; however, the display device and a server device may detect the operation cooperatively. In this case, the display device successively sends the images taken by the photographing unit or the information detected by the detection unit to the server device, and the server device detects the operation and notifies the display device of the detection result. With this configuration, the load borne by the display device can be reduced.
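The cooperative detection described above — the display device streams frames to a server, which detects the operation and returns the result — might be sketched as follows. The detection rule and all names are assumptions for illustration; the real system would use camera images and a network transport:

```python
# Hypothetical sketch of display-device / server cooperative detection.
# The device only forwards raw frames; the server detects the operation
# and notifies the device of the result, reducing the device's load.

def server_detect(frame):
    """Stand-in for the server's operation detection on one frame."""
    if frame.get("finger_moved", 0) > 10:   # toy threshold, purely illustrative
        return {"operation": "push", "target": frame["object"]}
    return None

def device_loop(frames, send_to_server=server_detect):
    """The device successively transmits frames and collects notified results."""
    results = []
    for frame in frames:
        result = send_to_server(frame)      # successive transmission to the server
        if result is not None:
            results.append(result)          # server notifies the detection result
    return results

detected = device_loop([
    {"object": "cube", "finger_moved": 3},
    {"object": "cube", "finger_moved": 15},
])
```

The design choice here mirrors the text: the device-side loop contains no detection logic at all, so the computationally heavy work can live entirely on the server.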
Description of symbols
1 to 7 display device
1a front surface portion
1b side surface portion
1c side surface portion
4d external device
13 operation unit
16 communication unit
22 control unit
24 storage unit
24a, 24e, 24f, 24h control program
24b object data
24c action data
24d imaginary space data
24g catalog data
25 detection processing unit
26 display object control unit
27 image synthesis unit
28 linked display control unit
29 subscription processing unit
30 image synthesis unit
32a, 32b display unit
34 projector
34a projection unit
40, 42 photographing unit
44 detection unit
46 distance measuring unit
48 motion sensor
Claims (7)
1. A display device, comprising:
a display unit that, when the display device is worn, displays an object arranged in an imaginary space in three dimensions by displaying images corresponding to the two eyes of a user;
a sensor that detects a change in the direction of the display device in real space;
a control unit that changes the object according to the direction change detected by the sensor;
a photographing unit that takes an image of the real space; and
an image synthesis unit that synthesizes the image of the real space taken by the photographing unit and an image of the object as the image to be shown on the display unit, wherein the image of the object is an image of the imaginary space, and
the control unit makes the amount of change of the object in the image shown on the display unit larger than the amount of change of the display device in the real space detected by the sensor.
2. The display device according to claim 1, wherein
the control unit changes the direction of the viewpoint according to the direction change, thereby changing the object.
3. The display device according to claim 1, wherein the photographing unit takes images corresponding to the two eyes of the user, and
the control unit produces, for each eye, an image in which the image taken by the photographing unit and an image of the object seen from the viewpoint overlap, and displays it on the display unit.
4. The display device according to claim 3, further comprising a detection unit that detects a real object within the shooting range of the photographing unit, wherein
the control unit changes the object according to the movement of the real object.
5. The display device according to claim 1, wherein
the sensor also detects a change in the position of the display device in real space, and
the control unit changes the position of the viewpoint according to the position change detected by the sensor.
6. The display device according to claim 1, wherein
the sensor detects a movement of the user shaking his or her head, and
the control unit changes the object according to the direction of the head-shaking movement.
7. A control method performed by a display device, comprising:
displaying, on a display unit, an object arranged in an imaginary space in three dimensions by displaying images corresponding to the two eyes of a user;
detecting, with a sensor, a change in the direction of the display device in real space;
changing the object according to the direction change;
taking an image of the real space with a photographing unit; and
synthesizing the image of the real space taken by the photographing unit and an image of the object as an image to be shown, wherein the image of the object is an image of the imaginary space, and
in the step of changing the object, the amount of change of the object in the image shown on the display unit is made larger than the amount of change of the display device in the real space detected by the sensor.
Applications Claiming Priority (9)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2012215079A JP2014072575A (en) | 2012-09-27 | 2012-09-27 | Display device and control method |
JP2012-215081 | 2012-09-27 | ||
JP2012215081A JP6159069B2 (en) | 2012-09-27 | 2012-09-27 | Display device |
JP2012-214954 | 2012-09-27 | ||
JP2012215080A JP2014072576A (en) | 2012-09-27 | 2012-09-27 | Display device and control method |
JP2012214954A JP6125785B2 (en) | 2012-09-27 | 2012-09-27 | Display device and control method |
JP2012-215080 | 2012-09-27 | ||
JP2012-215079 | 2012-09-27 | ||
CN201380050460.9A CN104685869B (en) | 2012-09-27 | 2013-09-26 | Display device, control method |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201380050460.9A Division CN104685869B (en) | 2012-09-27 | 2013-09-26 | Display device, control method |
Publications (2)
Publication Number | Publication Date |
---|---|
CN106020620A CN106020620A (en) | 2016-10-12 |
CN106020620B true CN106020620B (en) | 2019-07-26 |
Family
ID=50388352
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201380050460.9A Active CN104685869B (en) | 2012-09-27 | 2013-09-26 | Display device, control method |
CN201610531692.1A Active CN106020620B (en) | 2012-09-27 | 2013-09-26 | Display device, control method and control program |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201380050460.9A Active CN104685869B (en) | 2012-09-27 | 2013-09-26 | Display device, control method |
Country Status (4)
Country | Link |
---|---|
US (1) | US10341642B2 (en) |
EP (1) | EP2903265A4 (en) |
CN (2) | CN104685869B (en) |
WO (1) | WO2014050957A1 (en) |
Families Citing this family (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP6143469B2 (en) * | 2013-01-17 | 2017-06-07 | キヤノン株式会社 | Information processing apparatus, information processing method, and program |
KR101839441B1 (en) * | 2014-09-17 | 2018-03-16 | (주)에프엑스기어 | Head-mounted display controlled by tapping, method for controlling the same and computer program for controlling the same |
US10726625B2 (en) | 2015-01-28 | 2020-07-28 | CCP hf. | Method and system for improving the transmission and processing of data regarding a multi-user virtual environment |
US10725297B2 (en) | 2015-01-28 | 2020-07-28 | CCP hf. | Method and system for implementing a virtual representation of a physical environment using a virtual reality environment |
US20180035056A1 (en) * | 2015-04-24 | 2018-02-01 | Hewlett-Packard Development Company, L.P. | Tracking a target with an imaging system |
US10165199B2 (en) * | 2015-09-01 | 2018-12-25 | Samsung Electronics Co., Ltd. | Image capturing apparatus for photographing object according to 3D virtual object |
JP6367166B2 (en) * | 2015-09-01 | 2018-08-01 | 株式会社東芝 | Electronic apparatus and method |
JP6597235B2 (en) | 2015-11-30 | 2019-10-30 | 富士通株式会社 | Image processing apparatus, image processing method, and image processing program |
JP6560974B2 (en) * | 2015-12-17 | 2019-08-14 | 株式会社ソニー・インタラクティブエンタテインメント | Information processing apparatus and operation reception method |
JP2017182532A (en) * | 2016-03-31 | 2017-10-05 | ソニー株式会社 | Information processing apparatus, display control method, and program |
US11070724B2 (en) | 2017-03-22 | 2021-07-20 | Sony Corporation | Image processing apparatus and method |
CN107679942A (en) * | 2017-09-26 | 2018-02-09 | 北京小米移动软件有限公司 | Product introduction method, apparatus and storage medium based on virtual reality |
CN107831920B (en) * | 2017-10-20 | 2022-01-28 | 广州视睿电子科技有限公司 | Cursor movement display method and device, mobile terminal and storage medium |
CN107835403B (en) * | 2017-10-20 | 2020-06-26 | 华为技术有限公司 | Method and device for displaying with 3D parallax effect |
CN110119194A (en) * | 2018-02-06 | 2019-08-13 | 广东虚拟现实科技有限公司 | Virtual scene processing method, device, interactive system, head-wearing display device, visual interactive device and computer-readable medium |
CN108379780B (en) * | 2018-03-13 | 2020-06-02 | 北京小米移动软件有限公司 | Virtual running scene control method and device and running machine |
US11285368B2 (en) * | 2018-03-13 | 2022-03-29 | Vc Inc. | Address direction guiding apparatus and method |
CN111125273A (en) * | 2018-11-01 | 2020-05-08 | 百度在线网络技术(北京)有限公司 | Store site selection method and device |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN1746822A (en) * | 2004-09-07 | 2006-03-15 | 佳能株式会社 | Information processing apparatus and method for presenting image combined with virtual image |
CN102289073A (en) * | 2007-11-21 | 2011-12-21 | 松下电器产业株式会社 | Display Apparatus |
CN102540464A (en) * | 2010-11-18 | 2012-07-04 | 微软公司 | Head-mounted display device which provides surround video |
Family Cites Families (47)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH07200162A (en) | 1993-12-29 | 1995-08-04 | Namco Ltd | Virtual reality experience device and game machine using the same |
JPH086708A (en) | 1994-04-22 | 1996-01-12 | Canon Inc | Display device |
US8479122B2 (en) * | 2004-07-30 | 2013-07-02 | Apple Inc. | Gestures for touch sensitive input devices |
JPH11237867A (en) | 1998-02-20 | 1999-08-31 | Shimadzu Corp | Virtual space display device |
JPH11312033A (en) * | 1998-04-28 | 1999-11-09 | Sony Corp | Spectacle type image display device |
JP3786154B2 (en) | 1998-04-30 | 2006-06-14 | 積水ハウス株式会社 | Display method of building by CG image |
JP2001154781A (en) | 1999-11-29 | 2001-06-08 | Nec Corp | Desktop information device |
JP2002042172A (en) | 2000-07-25 | 2002-02-08 | Matsushita Electric Works Ltd | Virtual object selection method and recording medium and service receiving application of this method |
JP2002092496A (en) | 2000-09-14 | 2002-03-29 | Nec Corp | Network transaction method, data processing method and device, data communication system and information storage medium |
JP2003030469A (en) | 2001-07-16 | 2003-01-31 | Ricoh Co Ltd | Commodity sales system by virtual department store using virtual reality space, virtual sales system, program and recording medium |
JP3847634B2 (en) | 2002-02-15 | 2006-11-22 | シャープ株式会社 | Virtual space simulation device |
JP4032776B2 (en) | 2002-03-04 | 2008-01-16 | ソニー株式会社 | Mixed reality display apparatus and method, storage medium, and computer program |
US20030184525A1 (en) * | 2002-03-29 | 2003-10-02 | Mitac International Corp. | Method and apparatus for image processing |
JP2003296379A (en) | 2002-04-05 | 2003-10-17 | Fuontaajiyu:Kk | Simulation method |
JP2005157610A (en) | 2003-11-25 | 2005-06-16 | Canon Inc | Image processor and image processing method |
JP2005165776A (en) | 2003-12-03 | 2005-06-23 | Canon Inc | Image processing method and image processor |
JP2005174021A (en) | 2003-12-11 | 2005-06-30 | Canon Inc | Method and device for presenting information |
JP4763695B2 (en) | 2004-07-30 | 2011-08-31 | アップル インコーポレイテッド | Mode-based graphical user interface for touch-sensitive input devices |
JP4738870B2 (en) | 2005-04-08 | 2011-08-03 | キヤノン株式会社 | Information processing method, information processing apparatus, and remote mixed reality sharing apparatus |
JP4739002B2 (en) * | 2005-06-30 | 2011-08-03 | キヤノン株式会社 | Image processing method and image processing apparatus |
JP4810295B2 (en) | 2006-05-02 | 2011-11-09 | キヤノン株式会社 | Information processing apparatus and control method therefor, image processing apparatus, program, and storage medium |
SE0601216L (en) | 2006-05-31 | 2007-12-01 | Abb Technology Ltd | Virtual workplace |
JP5228305B2 (en) | 2006-09-08 | 2013-07-03 | ソニー株式会社 | Display device and display method |
CN101743567A (en) * | 2007-05-18 | 2010-06-16 | Uab研究基金会 | Virtual interactive presence systems and methods |
US11325029B2 (en) * | 2007-09-14 | 2022-05-10 | National Institute Of Advanced Industrial Science And Technology | Virtual reality environment generating apparatus and controller apparatus |
JP5125779B2 (en) | 2008-06-04 | 2013-01-23 | 株式会社ニコン | Head mounted display device |
US20100053151A1 (en) * | 2008-09-02 | 2010-03-04 | Samsung Electronics Co., Ltd | In-line mediation for manipulating three-dimensional content on a display device |
JP5407498B2 (en) * | 2009-04-08 | 2014-02-05 | ソニー株式会社 | Information processing apparatus, information recording medium, information processing method, and program |
US20110057862A1 (en) * | 2009-09-07 | 2011-03-10 | Hsin-Liang Chen | Image display device |
JP2011086049A (en) | 2009-10-14 | 2011-04-28 | Mitsubishi Electric Corp | Mobile terminal device for maintenance |
JP2011095547A (en) | 2009-10-30 | 2011-05-12 | Sharp Corp | Display device |
JP4999910B2 (en) | 2009-12-02 | 2012-08-15 | 株式会社スクウェア・エニックス | User interface processing device, user interface processing method, and user interface processing program |
JP4679661B1 (en) * | 2009-12-15 | 2011-04-27 | 株式会社東芝 | Information presenting apparatus, information presenting method, and program |
JP5499762B2 (en) | 2010-02-24 | 2014-05-21 | ソニー株式会社 | Image processing apparatus, image processing method, program, and image processing system |
KR20130000401A (en) * | 2010-02-28 | 2013-01-02 | 오스터하우트 그룹 인코포레이티드 | Local advertising content on an interactive head-mounted eyepiece |
JP5564300B2 (en) | 2010-03-19 | 2014-07-30 | 富士フイルム株式会社 | Head mounted augmented reality video presentation device and virtual display object operating method thereof |
JP5759110B2 (en) | 2010-04-27 | 2015-08-05 | 泉陽興業株式会社 | Ferris wheel |
JP5640486B2 (en) | 2010-06-15 | 2014-12-17 | 日産自動車株式会社 | Information display device |
JP2012048656A (en) | 2010-08-30 | 2012-03-08 | Canon Inc | Image processing apparatus, and image processing method |
US20120069055A1 (en) * | 2010-09-22 | 2012-03-22 | Nikon Corporation | Image display apparatus |
JP5597087B2 (en) | 2010-10-04 | 2014-10-01 | パナソニック株式会社 | Virtual object manipulation device |
US9122053B2 (en) * | 2010-10-15 | 2015-09-01 | Microsoft Technology Licensing, Llc | Realistic occlusion for a head mounted augmented reality display |
JP5472056B2 (en) | 2010-11-19 | 2014-04-16 | コニカミノルタ株式会社 | Display system, display processing apparatus, display method, and display program |
JP5429198B2 (en) | 2011-01-12 | 2014-02-26 | コニカミノルタ株式会社 | Image processing apparatus, image forming system, and control program |
US8965049B2 (en) | 2011-02-01 | 2015-02-24 | Panasonic Intellectual Property Corporation Of America | Function extension device, function extension method, computer-readable recording medium, and integrated circuit |
AU2011205223C1 (en) | 2011-08-09 | 2013-03-28 | Microsoft Technology Licensing, Llc | Physical interaction with virtual objects for DRM |
US20130342572A1 (en) * | 2012-06-26 | 2013-12-26 | Adam G. Poulos | Control of displayed content in virtual environments |
- 2013-09-26 CN CN201380050460.9A patent/CN104685869B/en active Active
- 2013-09-26 CN CN201610531692.1A patent/CN106020620B/en active Active
- 2013-09-26 WO PCT/JP2013/076046 patent/WO2014050957A1/en active Application Filing
- 2013-09-26 EP EP13842569.9A patent/EP2903265A4/en not_active Ceased
- 2013-09-26 US US14/431,194 patent/US10341642B2/en active Active
Also Published As
Publication number | Publication date |
---|---|
EP2903265A1 (en) | 2015-08-05 |
EP2903265A4 (en) | 2016-05-18 |
US10341642B2 (en) | 2019-07-02 |
CN104685869B (en) | 2018-12-28 |
CN104685869A (en) | 2015-06-03 |
WO2014050957A1 (en) | 2014-04-03 |
CN106020620A (en) | 2016-10-12 |
US20150312559A1 (en) | 2015-10-29 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN106020620B (en) | Display device, control method and control program | |
CN104662494B (en) | Display device and control method | |
US10573091B2 (en) | Systems and methods to create a virtual object or avatar | |
US9811854B2 (en) | 3-D immersion technology in a virtual store | |
GB2564745B (en) | Methods for generating a 3D garment image, and related devices, systems and computer program products | |
JP2014072576A (en) | Display device and control method | |
KR20180070681A (en) | Select virtual objects in 3D space | |
US20170132845A1 (en) | System and Method for Reducing Virtual Reality Simulation Sickness | |
US20170352188A1 (en) | Support Based 3D Navigation | |
CN107251025A (en) | System and method for generating virtual content from threedimensional model | |
CN107656615A (en) | The world is presented in a large amount of digital remotes simultaneously | |
CN104662588B (en) | Display device, control system and control method | |
CN107251026A (en) | System and method for generating fictitious situation | |
JP2017208820A (en) | Order system | |
JP2014072575A (en) | Display device and control method | |
JP2014071520A (en) | Display device, control method and control program | |
JP6125785B2 (en) | Display device and control method | |
CN108205823A (en) | MR holographies vacuum experiences shop and experiential method | |
Dvořák et al. | Presentation of historical clothing digital replicas in motion | |
CN115049803A (en) | Augmented reality picture display method and device, computer equipment and storage medium | |
CN117136371A (en) | Method for displaying products in virtual environment | |
JP2017163560A (en) | Display device, system, and display method | |
JP2016224959A (en) | Display device and control method | |
US20210008461A1 (en) | Virtual Puppeteering Using a Portable Device | |
JP2016189194A (en) | Display device, control method, and control program |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
GR01 | Patent grant |