CN111052218B - Optical device control apparatus, optical device control method, and storage medium - Google Patents


Info

Publication number
CN111052218B
CN111052218B (application CN201780093840.9A)
Authority
CN
China
Prior art keywords
image
projection
shadow
optical device
projected
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201780093840.9A
Other languages
Chinese (zh)
Other versions
CN111052218A (en)
Inventor
山崎贤人
冈原浩平
有井刊
上野真理子
佐佐木干郎
古木一朗
加藤义幸
峯慎吾
阿倍博信
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Mitsubishi Electric Corp
Original Assignee
Mitsubishi Electric Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Mitsubishi Electric Corp
Publication of CN111052218A
Application granted
Publication of CN111052218B
Legal status: Active
Anticipated expiration

Classifications

    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G3/00Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
    • G09G3/001Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes using specific devices not provided for in groups G09G3/02 - G09G3/36, e.g. using an intermediate record carrier such as a film slide; Projection systems; Display of non-alphanumerical information, solely or in combination with alphanumerical information, e.g. digital display on projected diapositive as background
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/36Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the display of a graphic pattern, e.g. using an all-points-addressable [APA] memory
    • G09G5/37Details of the operation on graphic patterns
    • G09G5/377Details of the operation on graphic patterns for mixing or overlaying two or more graphic patterns
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/36Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the display of a graphic pattern, e.g. using an all-points-addressable [APA] memory
    • G09G5/38Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the display of a graphic pattern, e.g. using an all-points-addressable [APA] memory with means for controlling the display position
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/74Projection arrangements for image reproduction, e.g. using eidophor
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2354/00Aspects of interface with display user
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2360/00Aspects of the architecture of display systems
    • G09G2360/08Power processing, i.e. workload management for processors involved in display operations, such as CPUs or GPUs

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Graphics (AREA)
  • Software Systems (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Controls And Circuits For Display Device (AREA)
  • User Interface Of Digital Computer (AREA)
  • Projection Apparatus (AREA)
  • Processing Or Creating Images (AREA)
  • Transforming Electric Information Into Light Information (AREA)

Abstract

An optical device control apparatus (1) is characterized by comprising: an object detection unit (11) that detects an object present between an optical device and a target object; an overlap determination unit (12) that determines whether or not the shadow of the detected object and the CG to be projected are in an overlapped state; an attribute management unit (13) that supplies projection image data to a projection image generation unit (14); a layout calculation unit (15) that, when the overlap determination unit (12) determines that the shadow of the object and the CG are in an overlapped state, changes the projection image data so that the CG is projected onto an area other than the shadow of the object; the projection image generation unit (14), which generates a projection image including the CG whose projection position has been changed by the layout calculation unit (15); and a storage unit (16).

Description

Optical device control apparatus, optical device control method, and storage medium
Technical Field
The present invention relates to an optical device control apparatus, an optical device control method, and a storage medium that control an optical device that projects or captures an image.
Background
Conventionally, there is an augmented reality technology that superimposes and draws electronic information (CG: Computer Graphics) as additional information onto real space, based on the position and orientation of a user acquired from a camera or a sensor. The augmented reality technology is, for example, the following: by obtaining the position and orientation of the camera in an arbitrary three-dimensional coordinate system from a two-dimensional image acquired by an imaging device such as a camera, CG in a three-dimensional coordinate system generated on a computer can be drawn superimposed on real space.
A projector is known as a device that displays an image by irradiating an object with light emitted from a projection lens. Except that the light travels in the opposite direction compared with a camera, a projector can be described by a substantially similar model. Therefore, by determining the position and orientation of the projector in an arbitrary three-dimensional coordinate system, CG in a three-dimensional coordinate system generated on a computer can be projected to an arbitrary position in real space.
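To illustrate this camera/projector duality, the sketch below projects a 3D point into projector pixel coordinates with a standard pinhole model. The intrinsics (fx, fy, cx, cy) and pose (R, t) are illustrative values, not parameters taken from the patent.

```python
import numpy as np

def project_point(point_world, R, t, fx, fy, cx, cy):
    """Project a 3D world point into projector pixel coordinates.
    The projector is treated as an inverse camera: a pinhole model
    with intrinsics (fx, fy, cx, cy) and pose (R, t)."""
    x, y, z = R @ np.asarray(point_world, dtype=float) + t
    if z <= 0:
        raise ValueError("point is behind the projector")
    u = fx * x / z + cx  # perspective divide, then intrinsics
    v = fy * y / z + cy
    return u, v
```

With an identity pose, a point straight ahead lands on the principal point (cx, cy); calibrating R, t and the intrinsics is what allows CG to be steered to an arbitrary position on the projection target.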
In general, augmented reality technology focuses on the problem of "geometric matching" when drawing CG superimposed on real space. A technique that instead focuses on how to arrange the superimposed CGs so that information reaches the user is called view management (for example, refer to non-patent document 1). In addition, the following has been proposed: when displaying CG-based annotation information using augmented reality technology, three elements should be considered, namely "the amount of overlap of annotations with each other and with the explained target object", "the distance between the annotation and the annotated information", and "the amount of movement of the annotation over time" (for example, see non-patent document 2).
Patent document 1 proposes a display device that includes a display-mask detecting unit that detects an object (light shield) present between an observer (user) and a display device such as a monitor, and a display layout unit that determines the layout of information based on the detection result of the display-mask detecting unit.
Patent document 2 proposes a remote instruction system as follows: the remote instruction system includes a photographing unit that photographs a projection object, and an annotation adjustment unit that adjusts a display form of annotation information according to a photographing region of the photographing unit.
Patent document 3 proposes an image projection apparatus including: the image projection apparatus includes a projection unit that projects an annotation image input from an external apparatus onto a subject, and a control unit that controls movement of the annotation image projected onto the subject based on information obtained by capturing, by an imaging unit, the subject onto which the annotation image is projected by the projection unit.
Documents of the prior art
Patent document
Patent document 1: japanese patent publication No. 3874737 (for example, paragraphs 0020 to 0029)
Patent document 2: japanese patent laid-open No. 5092459 (for example, paragraphs 0037 to 0048 and 0053 to 0061)
Patent document 3: japanese patent laid-open No. 2008-158419 (for example, paragraphs 0031 to 0041)
Non-patent document
Non-patent document 1: tibetan wuzhi, wine tianxin chang, herding xianxiao chang, "Augmented Reality (AR): 11. and 3, prospect: interface of AR ", information processing Vol.51, No.4, Apr.2010, pp.425-430
Non-patent document 2: ronald Azuma, Chris Furmanski, "evaluation Label platform for Augmented Reality View Management," IEEE and ACM International Symposium on Mixed and Augmented Reality (ISMAR 2003), (Tokyo7-10October 2003), pp.66-75
Disclosure of Invention
Problems to be solved by the invention
However, when an object (light shield) is present between an optical device (for example, a projector) and a projection target, the shadow of the object (light shield) falls on the projection target. As a result, the intended information cannot be provided to the observer; for example, CG cannot be projected to a desired position.
For example, the apparatus described in patent document 1 considers only an object between the observer and a device such as a monitor; it does not consider an object located between a projection display device, such as a projector, and the projection target. Patent document 1 also does not describe how to control the relationship between CGs when a plurality of CGs are present.
The device described in patent document 2 adjusts the projection conditions of CG, but it focuses only on the CG to be added and does not address the case where a plurality of CGs already exist. The device described in patent document 3 shifts a plurality of CGs, but depending on how the CGs are shifted, a CG may end up farther from the object it annotates.
The present invention has been made to solve the above-described problems, and an object thereof is to provide an optical device control apparatus, an optical device control method, and a storage medium that can display a graphic image in an appropriate area even when an object (light shield) is present between the optical device and the target object.
Means for solving the problems
An optical device control apparatus according to the present invention is characterized by comprising: a projection image generation unit that generates a projection image which is projected from the optical device onto a target object, based on projection image data; an object detection unit that detects a 1st object present between the optical device and the target object, based on physical space information acquired by a physical space information acquisition device; an overlap determination unit that determines, based on the projection image data and a detection result of the object detection unit, whether or not a graphic image included in the projection image and a shadow, which is an area on the target object that the projection image does not reach because of the 1st object, are in an overlapped state; a changing unit that, when the overlap determination unit determines that the graphic image and the shadow are in the overlapped state, changes the projection image data so that the graphic image is projected onto an area other than the shadow; and a management unit that, when the overlap determination unit determines that the graphic image and the shadow are in the overlapped state, supplies the projection image data changed by the changing unit, as the projection image data, to the projection image generation unit.
An optical device control method according to the present invention is characterized by comprising: a projection image generation step of generating a projection image which is projected from an optical device onto a target object, based on projection image data; an object detection step of detecting a 1st object present between the optical device and the target object, based on physical space information acquired by a physical space information acquisition device; an overlap determination step of determining, based on the projection image data and a detection result of the object detection step, whether or not a graphic image included in the projection image and a shadow, which is an area on the target object that the projection image does not reach because of the 1st object, are in an overlapped state; a changing step of changing the projection image data so that the graphic image is projected onto an area other than the shadow when it is determined in the overlap determination step that the graphic image and the shadow are in the overlapped state; and a management step of supplying the projection image data changed in the changing step, as the projection image data, to the projection image generation step when it is determined in the overlap determination step that the graphic image and the shadow are in the overlapped state.
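The sequence of steps above can be sketched as one iteration of a control loop. This is a purely illustrative skeleton; the callables (detect, shadow_of, overlaps, relocate) are hypothetical stand-ins for the corresponding units, not names used by the patent.

```python
def control_step(physical_space_info, cg_list, detect, shadow_of, overlaps, relocate):
    """One iteration of the control method. `detect` plays the role of the
    object detection step, `overlaps` the overlap determination step, and
    `relocate` the changing step; the returned list is what the management
    step would supply to the projection image generation step."""
    obstacle = detect(physical_space_info)   # object detection step
    if obstacle is None:
        return cg_list                       # nothing blocks the projection
    shadow = shadow_of(obstacle)             # non-projection (shadow) region
    out = []
    for cg in cg_list:
        if overlaps(cg, shadow):             # overlap determination step
            cg = relocate(cg, shadow, out)   # changing step
        out.append(cg)
    return out
```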
Effects of the invention
According to the present invention, even when an object (light shield) is present between the optical device and the target object, the graphic image can be displayed in an appropriate region, and therefore the information intended for the observer can be provided.
Drawings
Fig. 1 is a functional block diagram schematically showing the configuration of an optical device control apparatus according to embodiment 1 of the present invention.
Fig. 2 is a diagram showing an example of the hardware configuration of the optical device control apparatus according to embodiment 1.
Fig. 3 is a diagram for explaining the operation of the optical device control apparatus according to embodiment 1.
Fig. 4 (a) and (b) are diagrams for explaining the operation of the optical device control apparatus according to embodiment 1.
Fig. 5 (a) and (b) are diagrams for explaining the operation of the optical device control apparatus according to embodiment 1.
Fig. 6 (a) to (c) are diagrams for explaining the operation of the optical device control apparatus according to embodiment 1.
Fig. 7 (a) to (d) are diagrams for explaining the operation of the optical device control apparatus according to embodiment 1.
Fig. 8 is a flowchart showing the operation of the optical device control apparatus according to embodiment 1.
Fig. 9 is a flowchart illustrating an example of the object detection process in embodiment 1.
Fig. 10 is a flowchart showing an example of the overlap determination process in embodiment 1.
Fig. 11 is a flowchart showing an example of CG movement processing in embodiment 1.
Fig. 12 is a functional block diagram schematically showing the configuration of an optical device control apparatus according to embodiment 2 of the present invention.
Fig. 13 is a diagram for explaining an operation of the optical device control apparatus according to embodiment 2.
Fig. 14 is a flowchart illustrating an example of the object detection process in embodiment 2.
Detailed Description
An optical device control apparatus, an optical device control method, and an optical device control program according to embodiments of the present invention will be described below with reference to the drawings. The following embodiments are merely examples, and various modifications can be made within the scope of the present invention.
Embodiment 1
(1-1) Configuration
Fig. 1 is a functional block diagram schematically showing the configuration of an optical device control apparatus 1 according to embodiment 1 of the present invention. The optical device control apparatus 1 is an apparatus capable of implementing the optical device control method of embodiment 1. The optical device control apparatus 1 is an apparatus capable of executing the optical device control program according to embodiment 1.
As shown in fig. 1, the optical device control apparatus 1 according to embodiment 1 is connected to a physical space information acquisition device 201, which is a device that acquires information about real space, such as a camera 201a or a sensor 201b, and to an image projection device 202, which is a device that projects an image, such as a projector 202a. The physical space information acquisition device 201 and the image projection device 202 may be separate devices or an integrated device.
As shown in fig. 1, the optical device control apparatus 1 according to embodiment 1 includes an object detection unit 11, an overlap determination unit 12, an attribute management unit 13 as a management unit, a projection image generation unit 14, a layout calculation unit 15 as a change unit, and a storage unit 16.
The physical space information acquisition device 201 acquires physical space information G1 using the camera 201a, the sensor 201b, and the like. The object detection unit 11 receives the physical space information G1 from the physical space information acquisition device 201, and detects an object (e.g., a hand as the object 100 shown in fig. 3) present between the optical device (e.g., the projector 202a) and the projection target (object) from the physical space information G1. The object detection unit 11 transmits detection object information G2 regarding the detected object 100 to the superimposition determination unit 12.
The overlap determination unit 12 determines whether or not the CG104 included in the projection image overlaps with a shadow (i.e., a shadow area) that is an area where the light emitted from the projector 202a does not reach the projection target due to the object 100, based on the detected object information G2 received from the object detection unit 11 and information G3 of the CG (CG104 as a graphic image described later) read from the attribute management unit 13 (i.e., performs overlap determination).
During the overlap determination, the overlap determination unit 12 determines, for each CG104, whether or not the object 100 blocks it. The overlap determination result G4 produced by the overlap determination unit 12 is sent to the attribute management unit 13. The overlap determination unit 12 may regard any portion where the CG104 and the shadow included in the projection image overlap each other as the overlapped state, or it may regard the overlapped state as occurring only when the area of the overlapping portion is equal to or larger than a certain ratio of the area of the CG104.
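Treating the CG and the shadow as axis-aligned rectangles, such a ratio-based test can be sketched as follows; the rectangle representation and the threshold parameter are illustrative assumptions, not details specified by the patent.

```python
def overlap_ratio(cg, shadow):
    """Fraction of the CG's area covered by the shadow; both regions
    are axis-aligned rectangles given as (x, y, w, h)."""
    ax, ay, aw, ah = cg
    bx, by, bw, bh = shadow
    ix = max(0, min(ax + aw, bx + bw) - max(ax, bx))  # intersection width
    iy = max(0, min(ay + ah, by + bh) - max(ay, by))  # intersection height
    return (ix * iy) / (aw * ah)

def is_overlapped(cg, shadow, threshold=0.0):
    """With the default threshold any strictly positive overlap counts;
    a higher threshold requires that ratio of the CG's area to be covered."""
    return overlap_ratio(cg, shadow) > threshold
```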
The attribute management unit 13 is a management unit that reads information (recording content) stored in the storage unit 16, writes information into the storage unit 16, and rewrites the recording content. The layout calculation unit 15 is a changing unit that changes the placement position of the CG104 so that the CG104 is projected to an area other than the shadow of the object 100. The layout calculation unit 15 receives the information G3 of the CG104 and the overlap determination result G4 from the attribute management unit 13, and calculates the rearrangement coordinates 16a of the CG104 from the received information G3 of the CG104 and the overlap determination result G4.
The information G5 including the rearrangement coordinates 16a of the CG104 calculated by the layout calculation unit 15 is sent to the attribute management unit 13. The attribute management section 13 supplies the information G3 of the CG104 and the information G5 containing the rearrangement coordinates 16a of the CG104 calculated by the layout calculation section 15 to the projection image generation section 14.
The projection image generation unit 14 generates projection image data G6 from the information G3 of the CG104 received from the attribute management unit 13 and the information G5 including the rearrangement coordinates 16a of the CG104, and transmits the projection image data G6 to the image projection apparatus 202. The image projection device 202 receives the projection image data G6 from the projection image generation unit 14, and projects a projection image (projection image 102 described later) including the CG 104.
The storage unit 16 is a recording medium or a storage device that records information related to the CG 104. As shown in fig. 1, the storage unit 16 stores, for example, the rearrangement coordinates 16a, the hierarchy level 16b, the area 16c, the arrangement coordinates 16d, and the movement flag 16e of the CG 104.
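The stored items could be modeled, for illustration, as one record per CG; the field names and types below are assumptions, and only the five stored items themselves (16a to 16e) come from the patent.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class CGRecord:
    """One record per CG, mirroring the items held in the storage unit 16.
    Field names and types are assumptions; only the five stored items
    (16a-16e) come from the patent."""
    placement: Tuple[float, float]                   # arrangement coordinates 16d
    area: float                                      # area 16c
    hierarchy: int = 0                               # hierarchy level 16b
    moved: bool = False                              # movement flag 16e
    relocated: Optional[Tuple[float, float]] = None  # rearrangement coordinates 16a
```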
Fig. 2 is a diagram showing an example of the hardware configuration of the optical device control apparatus 1 according to embodiment 1. The optical device control apparatus 1 may be a computer. The optical device control apparatus 1 includes, as processors, a CPU (Central Processing Unit) 21 and a GPU (Graphics Processing Unit) 22, as well as a main memory 23, a memory 24, and a bus 25.
As shown in fig. 2, the optical device control apparatus 1 is connected to the camera 201a and the sensor 201b as the physical space information acquisition device 201, and to the projector 202a as the image projection device 202. The physical space information acquisition device 201 may be only one of the camera 201a and the sensor 201b. The physical space information acquisition device 201 may also be a plurality of cameras 201a that capture images of the object from different positions, or a plurality of sensors 201b that detect the object from different positions.
The CPU21 is an arithmetic unit for executing the entire operation of the optical device control apparatus 1. The GPU22 is mainly an arithmetic device that performs image data processing. The main memory 23 is a storage device in which data can be deleted and rewritten by hardware of the optical device control apparatus 1. The main memory 23 is volatile, but operates at a higher speed than the memory 24, and is used for storing data in use or data that is required to be used recently.
The optical apparatus control program is executed by the CPU 21. The CPU21 executes the optical device control program, thereby realizing the whole or a part of the object detection unit 11, the superimposition determination unit 12, the attribute management unit 13, the projection image generation unit 14, and the layout calculation unit 15 shown in fig. 1.
The memory 24 is a storage device in which the hardware of the optical device control apparatus 1 can delete and rewrite data, and corresponds to the storage unit 16 in fig. 1. The information necessary for the optical device control program stored in the memory 24 is expanded into the main memory 23 and executed by the CPU 21. The bus 25 is a data transfer path used for exchanging data.
The camera 201a is a physical space information acquisition device 201 that captures an image necessary for acquiring physical space information G1.
The sensor 201b is a physical space information acquisition device 201 that acquires values necessary for obtaining the physical space information G1. The sensor 201b is, for example, a GPS (Global Positioning System) receiver that measures position, an acceleration sensor that detects acceleration, a geomagnetic sensor that measures direction, or a depth sensor that measures the distance to an object.
The projector 202a is an image projection device 202 that projects the projection image 102 into real space.
(1-2) Operation
Fig. 3 is a diagram for explaining the operation of the optical device control apparatus 1 according to embodiment 1. As shown in fig. 3, in embodiment 1, for example, the camera 201a is used as the physical space information acquisition device 201, and the projector 202a is used as the image projection device 202. As shown in fig. 3, the camera 201a has a camera coordinate system (device coordinate system) 2011a, and the projector 202a has a projector coordinate system 2021a.
Fig. 3 also shows the view frustum 2020a of the projector 202a. The view frustum 2020a indicates the spatial extent of the image projected by the projector 202a. As shown in fig. 3, an object 100 (for example, a human hand) as the 1st object is present inside the view frustum 2020a of the projector 202a. Therefore, a part of the image projected by the projector 202a is blocked by the object 100.
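Whether a detected point lies inside the projector's view frustum can be tested, for illustration, with a symmetric-frustum model; the field-of-view angles and near/far planes below are assumed parameters, not values from the patent.

```python
import math

def in_view_frustum(point, h_fov_deg, v_fov_deg, near, far):
    """Test whether a point given in projector coordinates (+z forward)
    lies inside a symmetric view frustum with the given full horizontal
    and vertical field-of-view angles."""
    x, y, z = point
    if not (near <= z <= far):
        return False
    half_w = z * math.tan(math.radians(h_fov_deg) / 2)  # frustum half-width at depth z
    half_h = z * math.tan(math.radians(v_fov_deg) / 2)  # frustum half-height at depth z
    return abs(x) <= half_w and abs(y) <= half_h
```

A hand that passes this test for some depth between the projector and the projection surface is an occluder, and the region behind it becomes the non-projection (shadow) area.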
Fig. 4 (a) and (b) are diagrams for explaining the operation of the optical device control apparatus 1 according to embodiment 1. Fig. 4 (a) shows a state in which there is no object that blocks the projected image 102 of the projector 202a, and fig. 4 (b) shows a state in which there is an object 100 that blocks the projected image 102 of the projector 202 a.
As shown in fig. 4 (a), the projection image 102 is projected by the projector 202a onto the projection surface 101, which is the projection target (object). The CG104 is included in the projection image 102. In the center portion of the projection image 102 there is a component (part A) 103, which is the object annotated by the projector 202a.
Further, in the upper right portion of the projected image 102, the projector 202a displays a CG104 relating to the part 103 (information indicating that the part 103 is "part A"). The CG104 is additional information for annotation projected by the projector 202a. In fig. 4 (a), since there is no object that blocks the projection image 102 of the projector 202a, the observer can visually recognize the CG (additional information) 104.
On the other hand, as shown in fig. 4 (b), in a state where an object 100 that blocks the projected image 102 of the projector 202a is present, a part of the projected image 102 is blocked by the object 100, so that a non-projection area (shadow area) 105, which is the shadow of the object 100, appears in a part of the projected image 102. Further, as shown in fig. 4 (b), since the non-projection area 105 overlaps the placement position (projection area) of the CG104, an observer (for example, an operator who works on the component 103 with the hand 100) cannot visually recognize the CG104.
Fig. 5 (a) and (b) are diagrams for explaining the operation of the optical device control apparatus 1 according to embodiment 1. Fig. 5 (a) and (b) show a state in which the object 100 that blocks the projection image 102 of the projector 202a is present. In fig. 5 (a) and (b), because the CG104 would overlap the non-projection area 105 and not be visually recognized by the observer, the placement position of the CG104 has been moved (rearranged) from the state shown in fig. 4 (b).
For example, in fig. 5 (a), the CG104 is displayed while moving to the lower left of the projection image 102, and in fig. 5 (b), the CG104 is displayed while moving to the upper left of the projection image 102. As described above, the optical device control apparatus 1 according to embodiment 1 is characterized in that the placement position of the CG104 is moved so that the placement position of the CG104 does not overlap the non-projection area 105. Thus, even when the object 100 is present and a part or all of the projection image 102 of the projector 202a is blocked, the observer can visually recognize the CG 104.
Fig. 6 (a) to (c) are diagrams for explaining the operation of the optical device control apparatus 1 according to embodiment 1. Fig. 6 (a) to (c) show plan views of the projection image 102 projected onto the projection surface 101 by the projector 202a. Fig. 6 (a) to (c) show states in which two CGs (CG104a and CG104b) are included in the projection image 102.
Fig. 6 (a) shows a state in which there is no object obstructing the projected image 102 of the projector 202a, and fig. 6 (b) and (c) show a state in which there is an object 100 obstructing the projected image 102 of the projector 202 a. Fig. 6 (b) shows a state before the CG104b is moved, and fig. 6 (c) shows a state after the CG104b is moved.
As shown in fig. 6 (a), the projection image 102 includes, as the CG104, a CG104a (1st image) explaining "part A" (i.e., relating to "part A") and a CG104b (2nd image) explaining "part B". The CG104a and the CG104b are shown separated by a distance 106ab.
As shown in fig. 6 (b), because the object 100 is present, the CG104b explaining "part B" overlaps the shadow of the object 100 (i.e., the non-projection area 105 shown in fig. 5) and the observer cannot visually recognize it; the optical device control apparatus 1 therefore moves (rearranges) the placement position of the CG104b.
As shown in fig. 6 (c), the CG104b explaining "part B" is moved from the state of fig. 6 (b) to the upper right in the figure. This prevents the shadow of the object 100 from overlapping the CG104b, so the observer can visually recognize the CG104b. The position of the CG104b after the movement can be set freely, as long as the CG104b does not overlap the shadow of the object 100 and the distance 106ab' between the CG104b and the CG104a can be secured. The distance 106ab' can be, for example, a predetermined distance, a distance chosen within the range in which the CG104b overlaps neither the CG104a nor the shadow of the object 100 and does not extend outside the projection image 102, or a user-specified distance.
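A candidate-based relocation satisfying these constraints (stay inside the projection image, clear the shadow, keep a minimum distance from other CGs) could be sketched as follows; the rectangle representation, the candidate list, and the single min_dist parameter standing in for the distances such as 106ab' are all illustrative assumptions.

```python
def relocate_cg(cg, obstacles, frame, min_dist, candidates):
    """Pick the first candidate position where the CG rectangle
    (x, y, w, h) stays inside the frame rectangle, clears every
    obstacle rectangle (shadow or another CG), and keeps at least
    min_dist of separation along one axis. Returns None if no
    candidate works."""
    _, _, w, h = cg
    fx, fy, fw, fh = frame

    def clear(r, s):
        rx, ry, rw, rh = r
        sx, sy, sw, sh = s
        # gap between the rectangles along each axis (negative if overlapping)
        gx = max(sx - (rx + rw), rx - (sx + sw))
        gy = max(sy - (ry + rh), ry - (sy + sh))
        return max(gx, gy) >= min_dist

    for x, y in candidates:
        r = (x, y, w, h)
        inside = fx <= x and fy <= y and x + w <= fx + fw and y + h <= fy + fh
        if inside and all(clear(r, s) for s in obstacles):
            return r
    return None
```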
Fig. 7 (a) to (d) are diagrams for explaining the operation of the optical device control apparatus 1 according to embodiment 1. Fig. 7 (a) to (d) show plan views of the projection image 102 projected onto the projection surface 101 by the projector 202a. Fig. 7 (a) to (d) show states in which three CGs (CG104a, CG104b, and CG104c) are included in the projection image 102.
Fig. 7 (a) shows a state before the CGs 104a, 104b, and 104c are moved, and fig. 7 (b) shows a state after the CGs 104a, 104b, and 104c shown in fig. 7 (a) are moved. Fig. 7 (c) shows a state before the CGs 104a, 104b, and 104c are moved, and fig. 7 (d) shows a state after the CGs 104a, 104b, and 104c shown in fig. 7 (c) are moved. Fig. 7 (a) and (b) show an example of movement when the CGs 104a, 104b, and 104c do not have the attribute of the parent-child hierarchy (common attribute), and fig. 7 (c) and (d) show an example of movement when the CGs 104a, 104b, and 104c have the attribute of the parent-child hierarchy.
As shown in fig. 7(a), the projection image 102 includes, as the CGs 104, a CG104a explaining "component A", a CG104b explaining "component B", and a CG104c explaining "component C". There is a separation shown by the distance 106ab between the CG104a and the CG104b, and a separation shown by the distance 106ac between the CG104a and the CG104c.
As shown in fig. 7(a), because the object 100 is present and the CG104b explaining "component B" and the CG104c explaining "component C" overlap the shadow of the object 100, the observer cannot visually recognize the CG104b and the CG104c; therefore, as shown in fig. 7(b), the arrangement positions of the CG104b and the CG104c are moved (rearranged).
As shown in fig. 7(b), the CG104b explaining "component B" moves to the upper right in the figure. This prevents the shadow of the object 100 from overlapping the CG104b and allows the observer to visually recognize the CG104b. The position of the moved CG104b may be set freely as long as the CG104b overlaps neither the shadow of the object 100 nor the CG104c and the distance 106ab' between the CG104b and the CG104a can be secured. The distance 106ab' can be, for example, a predetermined distance, a distance chosen within the range where the CG104b overlaps neither the CG104a, the CG104c, nor the shadow of the object 100 so that the CG104b does not protrude from the projection image 102, or a user-specified distance.
As shown in fig. 7(b), the CG104c explaining "component C" moves to the lower left in the figure. This prevents the shadow of the object 100 from overlapping the CG104c and allows the observer to visually recognize the CG104c. The position of the moved CG104c may be set freely as long as the CG104c overlaps neither the shadow of the object 100 nor the CG104b and the distance 106ac' between the CG104c and the CG104a can be secured. The distance 106ac' can be, for example, a predetermined distance, a distance chosen within the range where the CG104c overlaps neither the CG104a, the CG104b, nor the shadow of the object 100 so that the CG104c does not protrude from the projection image 102, or a user-specified distance.
Fig. 7(c) and (d) show a case where the CG104b and the CG104c have a parent-child hierarchy attribute (for example, the CG104b is the parent and the CG104c is the child). As shown in fig. 7(c), the projection image 102 includes the CG104a explaining "component A", the CG104b explaining "component B", and the CG104c explaining "component C". There is a separation shown by the distance 106ab between the CG104a and the CG104b.
Further, there is a separation shown by the distance 106bc between the CG104b and the CG104c. In this case, the arrangement positions of the CG104b and the CG104c are moved so that the distance 106bc' is maintained between the CG104b and the CG104c. Further, similarly to fig. 7(a) and (b), the arrangement position of the CG104b is moved so that the distance 106ab' is maintained between the CG104a and the CG104b.
As shown in fig. 7(d), the CG104b explaining "component B" and the CG104c explaining "component C" are moved to the lower left in the figure. As a result, the shadow of the object 100 overlaps neither the CG104b nor the CG104c, and the observer can visually recognize them. The position of the moved CG104c may be set freely as long as the CG104c overlaps neither the shadow of the object 100 nor the CG104b and the distance 106ac' between the CG104c and the CG104a can be secured. The distance 106ac' can be, for example, a predetermined distance, a distance chosen within the range where the CG104c overlaps neither the CG104a, the CG104b, nor the shadow of the object 100 so that the CG104c does not protrude from the projection image 102, or a user-specified distance.
The position of the moved CG104c may also be set freely as long as the CG104c overlaps neither the shadow of the object 100 nor the CG104a and the distance 106bc' between the CG104c and the CG104b can be secured. Because the CG104b and the CG104c have a common attribute such as a parent-child hierarchy attribute, the distance 106bc' is preferably as short as possible within the range where the CGs overlap neither the shadow of the object 100 nor the CG104a. Specifically, when the CG104b and the CG104c have a common attribute, the projection image data is changed such that the distance 106bc' between the projection position of the CG104b and the projection position of the CG104c after the change is shorter than the distance 106bc between them before the change, or shorter than a predetermined distance.
The destination of each CG104 (104a, 104b, 104c) is preferably close to its original position. Each CG104 is preferably also located close to the component (object) 103 to which it provides an annotation, and its moving distance is preferably short. The optical device control apparatus 1 moves each CG104 under these conditions.
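The distance rule for CGs sharing a common attribute (the changed distance 106bc' must be shorter than the original distance 106bc, or below a predetermined distance) can be written as a small check. This is a sketch only: the function name and the optional `max_dist` parameter are assumptions of this example, not part of the apparatus.

```python
import math

def common_attribute_ok(before_b, before_c, after_b, after_c, max_dist=None):
    """Check the rule of the text: after the projection image data is changed,
    the distance 106bc' between two CGs sharing a common attribute must be
    shorter than the original distance 106bc, or not longer than a
    predetermined distance (`max_dist`, an assumed optional parameter)."""
    d_before = math.dist(before_b, before_c)   # distance 106bc
    d_after = math.dist(after_b, after_c)      # distance 106bc'
    if d_after < d_before:
        return True
    return max_dist is not None and d_after <= max_dist
```

For example, moving a child CG from 5 units away to 3 units away from its parent satisfies the rule; moving it from 3 to 4 units away satisfies it only if a predetermined distance of at least 4 is allowed.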
Fig. 8 is a flowchart showing the operation of the optical device control apparatus 1 according to embodiment 1. The operation shown in fig. 8 is the following CG control process: when a light shielding object (the object 100) is present between the projector 202a and the projection surface 101, it is determined for each CG104 to be projected whether the CG104 can be projected at its position, each CG104 determined to be unprojectable is moved in consideration of its attributes, and the projection image 102 to be projected from the projector 202a is generated and actually projected.
Next, with reference to fig. 8, a CG control process of the optical device control apparatus 1 according to embodiment 1 will be described. However, the CG control processing performed by the optical device control apparatus 1 may be processing different from the processing shown in fig. 8. The CG control process shown in fig. 8 is executed each time the camera 201a or the sensor 201b used in the physical space information acquisition device 201 acquires the physical space information G1 (for example, at a constant cycle).
As shown in fig. 8, in step S11, the optical device control apparatus 1 causes the physical space information acquisition device 201 to acquire the physical space information G1 about the physical space including the object. When the physical space information acquisition device 201 is the camera 201a, the camera 201a acquires image data as the physical space information G1. When the physical space information acquisition device 201 is the sensor 201b, the sensor 201b acquires a detection value (for example, distance information in the case of a depth sensor) as the physical space information G1.
In the next step S12, the optical device control apparatus 1 detects whether or not the object 100 is present between the projector 202a and the projection surface 101 by the object detection unit 11 based on the physical space information G1 acquired by the physical space information acquisition apparatus 201. In the case where the object 100 is detected (step S12: YES), the process proceeds to the next step S13.
In the next step S13, the optical device control apparatus 1 causes the overlap determination section 12 to determine whether the object 100 detected in step S12 obscures the projection of any CG104 (i.e., whether the shadow of the object 100 and the CG104 are in an overlapped state). If it is determined that a CG104 is occluded by the object 100 (step S13: YES), the process proceeds to the next step S14. In the next step S14, the optical device control apparatus 1 causes the attribute management unit 13 to set the movement flag 16e in the storage unit 16 for each CG104 determined to be occluded in step S13.
In the next step S15, for each CG104 whose movement flag 16e is set, the optical device control apparatus 1 causes the layout calculation section 15 to move the CG104 to a target position in accordance with its own position and attribute and the positions and attributes of the other CGs 104, and generates the projection image 102. At this time, the projection image 102 is generated using the post-movement coordinates for each CG104 whose movement flag 16e is set and the original coordinates for each CG104 whose movement flag 16e is not set.
In the next step S16, the optical device control apparatus 1 causes the image projection apparatus 202 to project the projection image 102 generated by the projection image generation unit 14.
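The flow of steps S11 to S16 can be sketched as one control cycle. The callables and the dictionary layout below are illustrative assumptions standing in for the acquisition device, detection unit, determination unit, layout calculation section, and projector; they are not interfaces defined by the text.

```python
def cg_control_cycle(acquire, detect_object, overlaps, relocate, project, cgs):
    """One cycle of the CG control process of fig. 8, as a sketch.

    The callables are stand-ins (assumptions of this example) for the units
    of the apparatus:
      acquire()          -- S11: return the physical space information G1
      detect_object(g1)  -- S12: return the detected object, or None
      overlaps(obj, cg)  -- S13: True if the object's shadow hides `cg`
      relocate(cg)       -- S15: return new coordinates for a flagged CG
      project(image)     -- S16: hand the generated image to the projector
    Each CG is a dict; a "flag" key plays the role of the movement flag 16e.
    """
    g1 = acquire()                                     # S11
    obj = detect_object(g1)                            # S12
    image = []
    for cg in cgs:
        if obj is not None and overlaps(obj, cg):      # S13
            cg["flag"] = True                          # S14: set movement flag
        if cg.get("flag"):
            image.append({**cg, "pos": relocate(cg)})  # S15: moved coordinates
        else:
            image.append(dict(cg))                     # S15: original coordinates
    project(image)                                     # S16
    return image
```

Running the cycle repeatedly, once per acquisition of G1, matches the constant-cycle execution described above.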
Fig. 9 is a flowchart illustrating an example of the object detection process in embodiment 1. Fig. 9 shows an example of the object detection processing in step S12 in fig. 8. In step S21, the optical device control apparatus 1 estimates the position and shape of the object 100 in the coordinate system 2011a of the device (for example, the camera 201a) used in the physical space information acquisition device 201, based on the physical space information G1 acquired by the physical space information acquisition device 201.
In the next step S22, the optical device control apparatus 1 converts the result (information of the position and shape of the object 100) estimated in step S21 into information of the position and shape in the projector coordinate system 2021 a. In the next step S23, the optical device control apparatus 1 determines whether or not the object 100 is present in the view frustum 2020a of the projector. If it is determined that the object 100 is present in the frustum 2020a, the determination at step S12 in fig. 8 is yes, and if it is determined that the object 100 is not present in the frustum 2020a, the determination at step S12 in fig. 8 is no.
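Steps S22 and S23 amount to a rigid coordinate transform followed by a frustum containment test. The sketch below assumes a symmetric frustum described by near/far clip distances and horizontal/vertical fields of view; the actual projector parameters are not given in the text and are placeholders here.

```python
import math

def camera_to_projector(p, rotation, translation):
    """S22: rigid transform (3x3 rotation given as rows, plus translation)
    taking a point from camera coordinates to projector coordinates."""
    return tuple(sum(rotation[i][j] * p[j] for j in range(3)) + translation[i]
                 for i in range(3))

def in_view_frustum(p, near=0.1, far=10.0,
                    fov_x=math.radians(60), fov_y=math.radians(45)):
    """S23: test whether point `p` (projector coordinates, +z along the
    optical axis) lies inside a symmetric view frustum. The clip distances
    and fields of view are illustrative assumptions."""
    x, y, z = p
    if not (near <= z <= far):
        return False
    return abs(x) <= z * math.tan(fov_x / 2) and abs(y) <= z * math.tan(fov_y / 2)
```

In practice the transform would be estimated by calibrating the camera and projector; here an identity rotation suffices to illustrate the containment test.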
Fig. 10 is a flowchart showing an example of the overlap determination process in embodiment 1. Fig. 10 shows an example of the superimposition determination processing of the CG104 in step S13 in fig. 8. In step S31, the optical device control apparatus 1 projects the object 100 detected by the object detection section 11 onto the projector image coordinate system.
In the next step S32, the optical device control apparatus 1 generates a convex hull (convex envelope) of the shadow of the object 100 from the outline of the projected shadow. Here, a convex hull is the smallest convex set that contains a given set, and a convex set is a set in which the line segment connecting any two points of the set is itself contained in the set.
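A convex hull of a shadow outline can be computed, for example, with Andrew's monotone chain algorithm. This is one common choice; the text does not prescribe a specific algorithm.

```python
def convex_hull(points):
    """Andrew's monotone chain: smallest convex polygon containing the points.

    `points` is a list of (x, y) tuples; the hull is returned in
    counter-clockwise order without repeating the first vertex.
    """
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts

    def cross(o, a, b):
        # z-component of (a - o) x (b - o); > 0 means a left turn
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

    lower, upper = [], []
    for p in pts:                      # build lower hull left to right
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):            # build upper hull right to left
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    return lower[:-1] + upper[:-1]     # drop duplicated endpoints
```

For a square outline with an interior point, the hull keeps only the four corners, which is exactly the property used here: interior detail of the shadow outline is discarded.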
In the next step S33, the optical device control apparatus 1 projects all the CGs 104 onto the projector image coordinate system. In the next step S34, the optical device control apparatus 1 generates a convex hull for each CG104 from the outline of the projected CG104. In the next step S35, the optical device control apparatus 1 extracts pairs consisting of the convex hull of the shadow of a detected object 100 and the convex hull of a CG104, one pair at a time without omission (i.e., for all combinations).
In the next step S36, the optical device control apparatus 1 calculates the area of the common portion from the extracted 2 convex hulls, and if the area is equal to or larger than a predetermined threshold (area threshold), it determines that the CG104 overlaps with the shadow of the object 100 (i.e., is in an overlapping state). If there are a plurality of detected objects 100, this process is repeated for each object 100.
The method of determining overlap in step S36 is not limited to the above method. For example, when the ratio of the area of the common portion of the extracted 2 convex hulls to the area of the convex hull of the CG104 is equal to or greater than a predetermined ratio (ratio threshold), it may be determined that the CG104 and the shadow of the object 100 overlap (i.e., are in an overlapping state).
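The area and ratio tests of steps S35 and S36 can be sketched with a convex polygon intersection (Sutherland-Hodgman clipping) followed by the shoelace area formula. The threshold values below are placeholders, not values from the text.

```python
def polygon_area(poly):
    """Shoelace formula for the area of a simple polygon given in order."""
    n = len(poly)
    s = sum(poly[i][0] * poly[(i + 1) % n][1]
            - poly[(i + 1) % n][0] * poly[i][1] for i in range(n))
    return abs(s) / 2.0

def clip_convex(subject, clip):
    """Sutherland-Hodgman clipping: intersection of two convex CCW polygons."""
    out = list(subject)
    for i in range(len(clip)):
        if not out:
            break
        a, b = clip[i], clip[(i + 1) % len(clip)]
        # >= 0 means the point is on the inner (left) side of edge a->b
        side = lambda r: (b[0] - a[0]) * (r[1] - a[1]) - (b[1] - a[1]) * (r[0] - a[0])
        inp, out = out, []
        for j in range(len(inp)):
            p, q = inp[j], inp[(j + 1) % len(inp)]
            sp, sq = side(p), side(q)
            if sp >= 0:
                out.append(p)               # p is inside: keep it
            if (sp >= 0) != (sq >= 0):      # edge p->q crosses the clip line
                t = sp / (sp - sq)
                out.append((p[0] + t * (q[0] - p[0]), p[1] + t * (q[1] - p[1])))
    return out

def is_overlapped(cg_hull, shadow_hull, area_threshold=None, ratio_threshold=0.1):
    """Overlap decision of step S36: absolute-area test if `area_threshold`
    is given, otherwise the ratio-to-CG-area test. Thresholds are assumed."""
    common = clip_convex(cg_hull, shadow_hull)
    area = polygon_area(common) if len(common) >= 3 else 0.0
    if area_threshold is not None:
        return area >= area_threshold
    return area / polygon_area(cg_hull) >= ratio_threshold
```

Two unit-offset squares share one square unit of area, so with a CG area of four units the ratio test at 20% reports an overlap while an absolute threshold of two units does not.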
Fig. 11 is a flowchart showing an example of the CG104 movement process in embodiment 1. Fig. 11 shows an example of the CG104 movement process in step S15 in fig. 8. In step S41, the optical device control apparatus 1 sets the process number n to 1.
In the next step S42, the optical device control apparatus 1 compares the process number n with the number (total number) of CGs 104 to be projected included in the projection image 102, and continues the processing while the process number n is equal to or less than that number. If the process number n exceeds the number of CGs 104 included in the projection image 102 (i.e., n > the number of CGs 104) (step S42: YES), the CG104 movement process ends.
If the processing number n is equal to or less than the number of CGs 104 included in the projection image 102 (no in step S42), the process proceeds to step S43, and the optical device control apparatus 1 confirms whether the movement flag 16e of the CG104 is set. When the movement flag 16e is set (yes in step S43), calculation is performed in the following steps S44 to S47, for example, according to a rendering algorithm based on a mechanical model.
This rendering algorithm uses a dynamic model in which a plurality of points (here, shadows and CGs) are drawn together or pushed apart by attractive and repulsive forces acting between them; the positions of the points are determined so that the points do not overlap one another, and all the points settle into a stable arrangement.
The optical device control apparatus 1 calculates the attractive force and the repulsive force to which the CG104 is subjected. The CG104 receives, for example, a repulsive force according to the position of the shadow of the object 100 (step S44), a repulsive force from the other CGs 104 (step S45), an attractive force from a CG104 having a common attribute (for example, a parent-child hierarchy attribute) (step S46), and an attractive force toward its original position (step S47).
In the next step S48, the optical device control apparatus 1 causes the layout calculation unit 15 to calculate the destination arrangement position of the CG104 by integrating the attractive and repulsive forces received by the CG104. In the next step S49, the optical device control apparatus 1 updates the rearrangement coordinates 16a of the CG104. In the next step S50, the optical device control apparatus 1 increments the process number n by 1, returns the process to step S42, and continues until the process number n exceeds the number of CGs 104.
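One possible reading of steps S44 to S48 as a force-directed iteration is sketched below. The force constants, step size, iteration count, and the representation of a shadow by a single center point are illustrative assumptions, not values from the text.

```python
def _repel(p, q, k):
    """Inverse-distance repulsion pushing p away from q (zero if coincident)."""
    dx, dy = p[0] - q[0], p[1] - q[1]
    d2 = dx * dx + dy * dy
    if d2 == 0.0:
        return 0.0, 0.0
    return k * dx / d2, k * dy / d2

def relocate(cgs, shadow, step=0.1, iters=300,
             k_shadow=4.0, k_cg=1.0, k_home=0.2, k_link=0.4):
    """Sketch of steps S44-S48 for the flagged CGs.

    `cgs` maps a CG name to a dict with:
      pos   -- current (x, y)
      home  -- original (x, y); a spring pulls the CG back toward it (S47)
      moved -- True if the movement flag 16e is set
      link  -- name of a CG sharing a common attribute, or None (S46)
    `shadow` is the (x, y) center of the obstructing shadow (S44).
    """
    pos = {name: tuple(c["pos"]) for name, c in cgs.items()}
    for _ in range(iters):
        for name, c in cgs.items():
            if not c["moved"]:
                continue                                    # flag not set
            fx, fy = _repel(pos[name], shadow, k_shadow)    # S44
            for other in cgs:                               # S45
                if other != name:
                    rx, ry = _repel(pos[name], pos[other], k_cg)
                    fx += rx; fy += ry
            if c["link"] is not None:                       # S46: common attribute
                fx += k_link * (pos[c["link"]][0] - pos[name][0])
                fy += k_link * (pos[c["link"]][1] - pos[name][1])
            fx += k_home * (c["home"][0] - pos[name][0])    # S47: original position
            fy += k_home * (c["home"][1] - pos[name][1])
            # S48: integrate the forces into a new arrangement position
            pos[name] = (pos[name][0] + step * fx, pos[name][1] + step * fy)
    return pos
```

With a single flagged CG placed almost on the shadow, the iteration pushes it away until the shadow repulsion balances the pull back toward its original position; unflagged CGs keep their coordinates.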
(1-3) Effect
As described above, according to the optical device control apparatus 1, the optical device control method, and the optical device control program of embodiment 1, the object detection unit 11, the overlap determination unit 12, and the layout calculation unit 15 are provided, and when the overlap determination unit 12 determines that the shadow of the object 100 and the CG104 are in an overlapped state, the layout calculation unit 15 moves (rearranges) the CG104 to a position where the shadow of the object 100 does not overlap it. Accordingly, even when a light shielding object (for example, the object 100) is present between the optical device (for example, the projector 202a) and the projection target (the projection surface 101), the CG104 can be displayed at an appropriate place, and the intended information can be provided to the observer.
According to the optical device control apparatus 1, the optical device control method, and the optical device control program of embodiment 1, when a plurality of CGs 104 have a common attribute with each other (e.g., the CGs 104b and 104c in fig. 7(c) and (d)), the arrangement positions of the CGs 104 are determined based on this information. This enables CGs 104 that have common attributes with each other to continue to be displayed together, providing the observer with the intended information.
Embodiment 2
(2-1) Structure
Fig. 12 is a functional block diagram showing a schematic configuration of an optical device control apparatus 1a according to embodiment 2 of the present invention. The optical device control apparatus 1a shown in fig. 12 is substantially the same as the optical device control apparatus 1 shown in fig. 1, but differs in that it includes a user line-of-sight estimation unit 17. In fig. 12, components that are the same as or correspond to those shown in fig. 1 are denoted by the same reference numerals as in fig. 1, and their description is omitted.
In embodiment 1 described above, when the CG104 projected from the projector 202a is hidden by the object 100, the CG104 is moved to provide information to the observer. However, embodiment 1 does not take into consideration the viewpoint and the line of sight of the observer (the line connecting the viewpoint, i.e., the center of the eye, and the viewed object). Embodiment 2 therefore describes a case in which, by taking the observer's line of sight into consideration, the intended information can be provided to the observer even if an object (an object 100a described later) exists between the observer and the projection target. In embodiment 2, similarly to embodiment 1, an object 100 existing between the optical device (projector 202a) and the projection target is also considered.
As shown in fig. 12, the optical device control apparatus 1a has a user line of sight estimation section 17. The user line-of-sight estimation unit 17 estimates the line of sight of the observer (the user 203 in fig. 13) from the physical space information G1 acquired by the physical space information acquisition device 201. The information G7 on the line of sight of the user 203 estimated by the user line of sight estimating unit 17 is sent to the overlap determining unit 12. Further, the object detection unit 11a in embodiment 2 detects an object (2 nd object) 100a located between the viewpoint of the user 203 and the projection target (projection plane 101).
The overlap determination unit 12 performs the overlap determination of the CG104 based on the detected object information G2 received from the object detection unit 11a, the information G7 relating to the line of sight of the user 203, and the information G3 of the CG104 read from the attribute management unit 13. In this overlap determination, it is determined whether or not, on the projection plane 101, the CG104 and a visually unrecognizable region (i.e., a region that the user 203 cannot visually recognize because the object 100a detected by the object detection unit 11a blocks the line of sight) are in an overlapped state. The layout calculation unit 15 changes the projection image data G6 so that the CG104 is projected onto a region other than the visually unrecognizable region caused by the object 100a.
Fig. 13 is a diagram for explaining the operation of the optical device control apparatus 1a according to embodiment 2. As shown in fig. 13, the optical device control apparatus 1a according to embodiment 2 differs from the optical device control apparatus 1 according to embodiment 1 in that the viewpoint 203a of the user 203, not the projector 202a, is the starting point. As shown in fig. 13, in embodiment 2, the projector coordinate system 2021a and the projector view frustum 2020a (see fig. 3) of embodiment 1 are replaced with a user coordinate system 2031 and a user view frustum 2030.
The view frustum 2030 of the user 203 represents the spatial range that the user 203 can visually recognize. As shown in fig. 13, the object (2nd object) 100a is contained in the user's view frustum 2030. Therefore, a part of the image visually recognized by the user 203 is blocked by the object 100a.
Fig. 14 is a flowchart illustrating an example of the object detection process in embodiment 2. Fig. 14 shows an example of the object detection processing in step S12 in fig. 8. In fig. 14, the processing of step S21 to step S23 is the same as the processing of step S21 to step S23 in fig. 9. In the next step S51, the optical device control apparatus 1a converts the position of the line of sight of the user 203 estimated by the user line of sight estimation section 17 into the user coordinate system 2031. In the next step S52, the optical device control apparatus 1a determines whether or not the object 100a is present in the user' S view frustum 2030.
(2-2) Effect
As described above, according to the optical device control apparatus 1a, the optical device control method, and the optical device control program of embodiment 2, by adding the user line-of-sight estimation unit 17, even when a light shielding object (the object 100a) is present between the viewpoint of the observer (the user 203) and the projection target (the projection surface 101), the CG104 can be displayed at a position that the observer can visually recognize, and the intended information can be provided to the observer.
Further, according to the optical device control apparatus 1a, the optical device control method, and the optical device control program of embodiment 2, the same effects as those of the optical device control apparatus 1, the optical device control method, and the optical device control program of embodiment 1 can be obtained.
(3) Variations
In embodiments 1 and 2 described above, the CG104 is made visually recognizable by moving it; however, the CG104 may also be made visually recognizable by displaying it at a reduced size, by changing its character orientation (for example, from horizontal writing to vertical writing), or by any combination of movement, reduction, and change of character orientation.
In addition, although fig. 7(c) and (d) were described with the CG104b and the CG104c having a parent-child hierarchy attribute, if any CG cannot be accommodated within the projection plane 101 under a placement that respects the parent-child hierarchy attribute, the placement may be switched to another one, such as a placement that ignores the parent-child hierarchy attribute as shown in fig. 7(a) and (b).
Description of the reference symbols
1, 1a: optical device control apparatus; 11, 11a: object detection unit; 12: overlap determination unit; 13: attribute management unit; 14: projection image generation unit; 15: layout calculation section; 16: storage unit; 16a: rearrangement coordinates; 16b: hierarchy; 16c: area; 16d: arrangement coordinates; 16e: movement flag; 17: user line-of-sight estimation unit; 21: CPU; 22: GPU; 23: main memory; 24: storage; 25: bus; 100: object; 101: projection surface; 102: projection image; 103: component; 104: CG; 105: non-projection region; 106: distance; 201: physical space information acquisition device; 201a: camera; 201b: sensor; 202: image projection device; 202a: projector; 203: user; G1 to G7: information.

Claims (7)

1. An optical device control apparatus, characterized by comprising:
a projection image generation unit that projects a projection image based on the projection image data from the optical device onto the object;
an object detection unit that detects a 1 st object present between the optical device and the target object, based on the physical space information acquired by the physical space information acquisition device;
an overlap determination unit that determines whether or not a graphic image included in the projection image and a shadow, which is an area where the projection image does not reach the target object due to the 1 st object, overlap each other, based on the projection image data and a detection result of the object detection unit;
a changing unit that, when the superimposition determining unit determines that the graphic image and the shadow are in the superimposed state, changes the projection image data so that the graphic image is projected onto an area other than the shadow;
a management unit configured to supply the projection image data changed by the changing unit to the projection image generating unit as the projection image data when the superimposition determining unit determines that the graphic image and the shadow are in the superimposed state; and
a storage section that stores attributes of the 1 st image and the 2 nd image,
the graphics image includes the 1 st image and the 2 nd image,
when the overlap determination unit determines that the 1 st image and the shadow are in the overlap state, the changing unit calculates an attractive force and a repulsive force to which the 1 st image is subjected based on a mechanical model, and changes the projection image data based on the calculated result so that the 1 st image is projected onto a region other than the projection region of the 2 nd image and other than the shadow,
the repulsive force includes a force borne by the 1 st image according to the position of the shadow and a force borne by the 1 st image according to the projected position of the 2 nd image,
the attractive force is a force to which the 1 st image is subjected according to a position at which the 1 st image is projected, and is a force to which the 1 st image is subjected according to a projected position of the 2 nd image in a case where the 1 st image and the 2 nd image have a common attribute.
2. The optical device control apparatus according to claim 1,
when the 1 st image and the 2 nd image have the common attribute, the changing unit changes the projection image data so that a distance between a projection position of the 1 st image and a projection position of the 2 nd image after the projection image data is changed is shorter than a distance between a projection position of the 1 st image and a projection position of the 2 nd image before the projection image data is changed or shorter than a predetermined distance.
3. The optical device control apparatus according to claim 2,
the positional relationship between the projection position of the 1 st image and the projection position of the 2 nd image is based on at least one of a distance between the projection position of the 1 st image and the projection position of the 2 nd image and an arrangement direction of the projection position of the 1 st image with respect to the projection position of the 2 nd image.
4. The optical device control apparatus according to any one of claims 1 to 3,
the superimposition determining unit determines that the graphics image and the shadow are in the superimposed state when an area of a portion where the graphics image and the shadow are in the superimposed state is equal to or larger than a predetermined area threshold or when a ratio of the area to the graphics image is equal to or larger than a predetermined ratio threshold.
5. The optical device control apparatus according to any one of claims 1 to 3,
the optical device control apparatus further includes a line-of-sight estimation unit that estimates a line of sight of an observer who observes the projection image, based on the physical space information acquired by the physical space information acquisition device,
the object detection unit detects a 2 nd object existing between the viewpoint of the observer and the target object,
the superimposition determining unit determines whether or not the graphic image and a visually imperceptible region that is a region visually imperceptible to the observer due to the 2 nd object detected by the object detecting unit are in the superimposed state on the object, based on the projection image data, the estimation result by the line-of-sight estimating unit, and the detection result by the object detecting unit,
when the superimposition determining unit determines that the graphic image and the visually imperceptible region are in the superimposed state, the changing unit changes the projection image data so that the graphic image is projected to a region other than the visually imperceptible region.
6. An optical apparatus control method characterized by having the steps of:
a projection image generation step of projecting a projection image based on the projection image data from an optical device onto an object;
an object detection step of detecting a 1 st object existing between the optical device and the object, based on the physical space information acquired by the physical space information acquisition device;
an overlap determination step of determining whether or not a graphic image included in the projected image and a shadow, which is an area where the projected image does not reach the object due to the 1 st object, overlap each other, are in an overlapped state, based on the projected image data and a detection result of the object detection step;
a changing step of changing the projection image data so that the graphic image is projected to an area other than the shadow when the superimposition determining step determines that the graphic image and the shadow are in the superimposed state; and
a management step of supplying the projected image data changed in the changing step to the projected image generating step as the projected image data when it is determined in the superimposition determining step that the graphics image and the shadow are in the superimposed state,
the graphics image comprises a 1 st image and a 2 nd image,
in the changing step, when it is determined in the overlapping determination step that the 1 st image and the shadow are in the overlapping state, the attractive force and the repulsive force to which the 1 st image is subjected are calculated from a mechanical model, and the projected image data is changed so that the 1 st image is projected to an area other than the projection area of the 2 nd image and other than the shadow based on the calculation result,
the repulsive force includes a force borne by the 1 st image according to the position of the shadow and a force borne by the 1 st image according to the projected position of the 2 nd image,
the attractive force is a force to which the 1 st image is subjected according to a position at which the 1 st image is projected, and is a force to which the 1 st image is subjected according to a projected position of the 2 nd image in a case where the 1 st image and the 2 nd image have a common attribute.
7. A computer-readable storage medium storing an optical device control program, the optical device control program causing a computer to execute:
a projection image generation process of projecting a projection image based on projection image data from an optical device onto an object;
an object detection process of detecting a 1st object existing between the optical device and the object, based on physical space information acquired by a physical space information acquisition device;
an overlap determination process of determining, based on the projection image data and a detection result of the object detection process, whether a graphic image included in the projection image and a shadow overlap each other, the shadow being an area of the object that the projection image does not reach because of the 1st object;
a change process of changing the projection image data so that the graphic image is projected onto an area other than the shadow when the overlap determination process determines that the graphic image and the shadow are in the overlapped state; and
a management process of supplying the projection image data changed by the change process to the projection image generation process as the projection image data when the overlap determination process determines that the graphic image and the shadow are in the overlapped state,
wherein the graphic image includes a 1st image and a 2nd image,
in the change process, when the overlap determination process determines that the 1st image and the shadow are in the overlapped state, an attractive force and a repulsive force acting on the 1st image are calculated from a mechanical model, and the projection image data is changed, based on the calculation result, so that the 1st image is projected onto an area other than the projection area of the 2nd image and other than the shadow,
the repulsive force includes a force acting on the 1st image according to the position of the shadow and a force acting on the 1st image according to the projected position of the 2nd image, and
the attractive force is a force acting on the 1st image according to the position at which the 1st image is projected and, in a case where the 1st image and the 2nd image have a common attribute, also includes a force acting on the 1st image according to the projected position of the 2nd image.
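The claims specify the mechanical model only in terms of its force components (repulsion from the shadow and the 2nd image, attraction toward the 1st image's original projected position and, given a common attribute, toward the 2nd image). As a rough illustration only, the following Python sketch treats each image as a point and relaxes the 1st image's position along the net force; the spring-and-inverse-square force laws, the gains, and all names are assumptions for illustration, not details taken from the patent.

```python
import math

def boxes_overlap(a, b):
    """Axis-aligned overlap test between boxes given as (x, y, w, h)."""
    ax, ay, aw, ah = a
    bx, by, bw, bh = b
    return ax < bx + bw and bx < ax + aw and ay < by + bh and by < ay + ah

def net_force(pos, home, shadow_pos, img2_pos, common_attribute,
              k_attract=0.05, k_repel=400.0):
    """Resultant of the attractive and repulsive forces on the 1st image."""
    # attraction: spring pulling the 1st image back to its original position
    fx = k_attract * (home[0] - pos[0])
    fy = k_attract * (home[1] - pos[1])
    if common_attribute:
        # extra attraction toward the 2nd image when attributes are shared
        fx += k_attract * (img2_pos[0] - pos[0])
        fy += k_attract * (img2_pos[1] - pos[1])
    # repulsion: inverse-square push away from the shadow and the 2nd image
    for sx, sy in (shadow_pos, img2_pos):
        dx, dy = pos[0] - sx, pos[1] - sy
        d2 = max(dx * dx + dy * dy, 1.0)   # clamp to avoid a singularity
        d = math.sqrt(d2)
        fx += k_repel * dx / (d * d2)
        fy += k_repel * dy / (d * d2)
    return fx, fy

def relocate(pos, home, shadow_pos, img2_pos, common_attribute,
             steps=300, dt=1.0):
    """Overdamped iteration: move the 1st image along the net force until it settles."""
    x, y = pos
    for _ in range(steps):
        fx, fy = net_force((x, y), home, shadow_pos, img2_pos, common_attribute)
        x, y = x + dt * fx, y + dt * fy
    return x, y
```

With the chosen gains the repulsion dominates near the shadow and the spring dominates far from it, so the 1st image settles at a position outside the shadow but near its original spot; a real implementation would additionally clamp the result to the projectable area and re-run the overlap determination on the changed projection image data.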
CN201780093840.9A 2017-08-31 2017-08-31 Optical device control apparatus, optical device control method, and storage medium Active CN111052218B (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2017/031300 WO2019043854A1 (en) 2017-08-31 2017-08-31 Optical device controller, method for controlling optical device, and optical device control program

Publications (2)

Publication Number Publication Date
CN111052218A CN111052218A (en) 2020-04-21
CN111052218B true CN111052218B (en) 2022-04-19

Family

ID=65037128

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201780093840.9A Active CN111052218B (en) 2017-08-31 2017-08-31 Optical device control apparatus, optical device control method, and storage medium

Country Status (4)

Country Link
JP (1) JP6456551B1 (en)
CN (1) CN111052218B (en)
DE (1) DE112017007791B4 (en)
WO (1) WO2019043854A1 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110197633A (en) * 2019-01-07 2019-09-03 深圳市佛光照明有限公司 A kind of device being provided with luminous mark
US20220400239A1 (en) 2019-11-15 2022-12-15 Ntt Docomo, Inc. Information processing apparatus and projection system

Family Cites Families (19)

Publication number Priority date Publication date Assignee Title
US7131060B1 (en) * 2000-09-29 2006-10-31 Raytheon Company System and method for automatic placement of labels for interactive graphics applications
JP3874737B2 (en) 2003-03-31 2007-01-31 株式会社東芝 Display device
JP2005156591A (en) * 2003-11-20 2005-06-16 Nippon Telegr & Teleph Corp <Ntt> Method and system for displaying article information, server system, program, and recording medium for the same
JP5092459B2 (en) 2006-07-18 2012-12-05 富士ゼロックス株式会社 Remote indication system and program for remote indication system
JP2008158419A (en) 2006-12-26 2008-07-10 Fuji Xerox Co Ltd Image projection device, image control system and image control program
US8007110B2 (en) 2007-12-28 2011-08-30 Motorola Mobility, Inc. Projector system employing depth perception to detect speaker position and gestures
JP2012058838A (en) * 2010-09-06 2012-03-22 Sony Corp Image processor, program, and image processing method
JP5434997B2 (en) 2010-10-07 2014-03-05 株式会社ニコン Image display device
JP5304848B2 (en) * 2010-10-14 2013-10-02 株式会社ニコン projector
JP6102215B2 (en) * 2011-12-21 2017-03-29 株式会社リコー Image processing apparatus, image processing method, and program
JP2013156324A (en) * 2012-01-27 2013-08-15 Seiko Epson Corp Projector and image projection method by the same
JP5924020B2 (en) * 2012-02-16 2016-05-25 セイコーエプソン株式会社 Projector and projector control method
US9081418B1 (en) * 2013-03-11 2015-07-14 Rawles Llc Obtaining input from a virtual user interface
JP6263881B2 (en) * 2013-07-10 2018-01-24 株式会社リコー Image projection apparatus, control method for image projection apparatus, and control program for image projection apparatus
JP2015130555A (en) * 2014-01-06 2015-07-16 株式会社東芝 image processing apparatus, image processing method, and image projection apparatus
JP6459194B2 (en) 2014-03-20 2019-01-30 セイコーエプソン株式会社 Projector and projected image control method
US9304599B2 (en) 2014-03-21 2016-04-05 Dell Products L.P. Gesture controlled adaptive projected information handling system input and output devices
WO2015170680A1 (en) * 2014-05-09 2015-11-12 コニカミノルタ株式会社 Projection system
US20170070711A1 (en) * 2015-09-03 2017-03-09 Disney Enterprises, Inc. Intensity correction for projection systems

Also Published As

Publication number Publication date
DE112017007791B4 (en) 2021-10-07
WO2019043854A1 (en) 2019-03-07
CN111052218A (en) 2020-04-21
JP6456551B1 (en) 2019-01-23
JPWO2019043854A1 (en) 2019-11-07
DE112017007791T5 (en) 2020-04-16

Similar Documents

Publication Publication Date Title
US20240062488A1 (en) Object centric scanning
CN106062862B (en) System and method for immersive and interactive multimedia generation
KR101266198B1 (en) Display apparatus and display method that heighten visibility of augmented reality object
US9336603B2 (en) Information processing device and information processing method
CN115509352A (en) Optimized object scanning using sensor fusion
JP6476657B2 (en) Image processing apparatus, image processing method, and program
KR20190045317A (en) Image processing apparatus, image generation method, and computer program
JP2016218905A (en) Information processing device, information processing method and program
KR20150082358A (en) Reference coordinate system determination
KR102279300B1 (en) Virtual object display control apparatus, virtual object display system, virtual object display control method, and virtual object display control program
KR102539427B1 (en) Image processing apparatus, image processing method, and storage medium
CN111833403B (en) Method and apparatus for spatial localization
KR102382247B1 (en) Image processing apparatus, image processing method, and computer program
CN112667179B (en) Remote synchronous collaboration system based on mixed reality
CN111052218B (en) Optical device control apparatus, optical device control method, and storage medium
KR101426378B1 (en) System and Method for Processing Presentation Event Using Depth Information
KR101308184B1 (en) Augmented reality apparatus and method of windows form
US20220300120A1 (en) Information processing apparatus, and control method
EP3723365A1 (en) Image processing apparatus, system that generates virtual viewpoint video image, control method of image processing apparatus and storage medium
JP6719945B2 (en) Information processing apparatus, information processing method, information processing system, and program
JP6632298B2 (en) Information processing apparatus, information processing method and program
CN111819603B (en) Virtual object display control device, virtual object display control method, recording medium, and virtual object display system
CN114201028B (en) Augmented reality system and method for anchoring display virtual object thereof
JP6641525B2 (en) Optical device control device, optical device control method, and optical device control program
KR101860215B1 (en) Content Display System and Method based on Projector Position

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant