WO2021117595A1 - Information processing device, information processing method, and program - Google Patents

Information processing device, information processing method, and program

Info

Publication number
WO2021117595A1
Authority
WO
WIPO (PCT)
Prior art keywords
distance
map
information processing
information
processing device
Prior art date
Application number
PCT/JP2020/045003
Other languages
English (en)
Japanese (ja)
Inventor
健志 後藤
誠史 友永
淳 入江
哲男 池田
英佑 藤縄
忠義 村上
洋祐 加治
Original Assignee
ソニーグループ株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by ソニーグループ株式会社
Publication of WO2021117595A1

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C3/00 Measuring distances in line of sight; Optical rangefinders
    • G01C3/02 Details
    • G01C3/06 Use of electric means to obtain final indication
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0346 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors

Definitions

  • This technology relates to information processing devices, information processing methods, and programs that can be applied to the control of distance measuring sensors.
  • In the information processing system described in Patent Document 1, contact between a real object and a display object projected on a table is recognized based on the detection result of a distance measuring sensor. Based on the recognition result, provided information corresponding to either the display object or the real object is displayed on the table, which improves user convenience (paragraphs [0038] and [0055] of Patent Document 1, and the like).
  • The purpose of the present technology is to provide an information processing device, an information processing method, and a program capable of operating a distance measuring sensor efficiently.
  • The information processing device according to one form of the present technology includes an acquisition unit and a range control unit.
  • The acquisition unit acquires map information in which distance information is associated with position information in a target area.
  • The range control unit controls the ranging range of the ranging sensor with respect to the target area based on the map information.
  • In this information processing device, map information in which distance information is associated with position information in the target area is acquired, and the ranging range of the ranging sensor with respect to the target area is controlled based on that map information. This makes it possible to operate the ranging sensor efficiently.
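  • As a rough illustration of the above (our sketch, not code from the publication), the map information can be modeled as a 2D array of distances from which a per-position ranging range is derived; the array shape, margin, and width values below are assumed for illustration only.

```python
import numpy as np

# Illustrative map information: each (y, x) position of the target area holds
# a distance value; the ranging range is derived per position from it.
distance_map = np.full((480, 640), 4.0)      # e.g. floor at 4 m everywhere
distance_map[100:200, 150:350] = 3.0         # a table-like surface at 3 m

def ranging_range(dist_map, margin=0.1, width=0.5):
    """Per-position (near, far) limits for the ranging sensor."""
    far = dist_map + margin                  # farthest position to be measured
    near = far - width                       # ranging width behind it
    return near, far

near, far = ranging_range(distance_map)
# The sensor then only searches for a surface between near[y, x] and far[y, x]
# at each position, instead of over its full measurable range.
```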
  • The range control unit may set the ranging range for each position in the target area.
  • The range control unit may set the ranging range for each area of the target area.
  • The range control unit may set the ranging range so that the amount of processing required for ranging the target area by the ranging sensor does not exceed a predetermined amount of processing.
  • The range control unit may control the ranging range so that the amount of processing required for ranging by the ranging sensor is constant for each position in the target area.
  • The range control unit may set a minimum distance to be measured, and set the range beyond the minimum distance as the ranging range.
  • The acquisition unit may acquire reference map information, which is the map information when the target area is in a predetermined reference state.
  • The range control unit may set the ranging range based on the reference map information.
  • The acquisition unit may acquire immediately preceding map information, which is the map information at the timing immediately before the ranging sensor executes ranging on the target area.
  • The range control unit may set the ranging range based on the immediately preceding map information.
  • The range control unit may set the ranging range for when the ranging sensor executes ranging on the target area with a first ranging accuracy.
  • The immediately preceding map information may be the map information generated with a second ranging accuracy lower than the first ranging accuracy.
  • The range control unit may set a maximum distance to be measured and a ranging width, and set the range defined by the ranging width from the maximum distance as the ranging range.
  • The range control unit may set the maximum distance based on reference map information, which is the map information when the target area is in a predetermined reference state, and set the ranging width based on immediately preceding map information, which is the map information at the timing immediately before the ranging sensor executes ranging on the target area.
  • The range control unit may set the ranging range based on the result of ranging the target area by another ranging sensor whose ranging direction differs from that of the ranging sensor.
  • The information processing device may further include a GUI output unit that outputs a GUI (Graphical User Interface) for inputting instructions regarding the ranging range.
  • The acquisition unit may generate the map information based on an environment map including the target area acquired by SLAM (Simultaneous Localization and Mapping).
  • The ranging sensor may be a stereo camera.
  • The range control unit may control the matching range of the stereo camera.
  • The ranging sensor may be a TOF (Time of Flight) camera having a light emitting unit and a light receiving unit.
  • The range control unit may control the range of time differences for which the distance is calculated, with respect to the time difference between the emission time of the light emitting unit and the reception time of the light receiving unit.
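  • As a hedged illustration of this time-difference control (our sketch, assuming a direct pulsed-TOF model; all names and values are hypothetical), a distance window can be converted into a time window via d = c·Δt/2, and echoes outside it ignored:

```python
C = 299_792_458.0  # speed of light, m/s

def tof_distance(delta_t_s):
    """Distance from the round-trip time difference of a pulsed TOF camera."""
    return C * delta_t_s / 2.0

def gated_distance(delta_t_s, near_m, far_m):
    """Calculate the distance only when the time difference falls within the
    window corresponding to [near_m, far_m]; other echoes are ignored."""
    if 2.0 * near_m / C <= delta_t_s <= 2.0 * far_m / C:
        return tof_distance(delta_t_s)
    return None  # outside the ranging range: no distance is calculated

print(gated_distance(26.7e-9, near_m=3.5, far_m=4.5))  # about 4.0 m
```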
  • The range control unit may set a light emission mode of the light emitting unit for each area of the target area, and set the areas for which the emission mode is set as the areas for which the distance is calculated when the light emitting unit emits light.
  • The information processing device may further include a processing execution unit that executes processing based on an operation of an operation object in the target area, based on the result of ranging the target area by the ranging sensor.
  • The information processing method according to one form of this technology is an information processing method executed by a computer system, and includes acquiring map information in which distance information is associated with position information in a target area, and controlling the ranging range of the ranging sensor with respect to the target area based on the map information.
  • The program according to one form of this technology causes a computer system to execute the above information processing method.
  • FIG. 1 is a schematic diagram for explaining an outline of a distance measuring system according to a first embodiment of the present technology.
  • The ranging system 1000 corresponds to an embodiment of an information processing system according to the present technology.
  • The ranging system 1000 includes the distance measuring sensor 5 and the information processing device 30.
  • The distance measuring sensor 5 and the information processing device 30 are communicably connected by wire or wirelessly.
  • The connection form between the devices is not limited; for example, wireless LAN communication such as WiFi or short-range wireless communication such as Bluetooth (registered trademark) can be used.
  • The distance measuring sensor 5 may be equipped with the functions of the information processing device 30; that is, the distance measuring sensor 5 and the information processing device 30 may be integrally configured.
  • The distance measuring sensor 5 can measure distances (perform ranging) with respect to the target area 1 to be measured and acquire distance information.
  • The distance information is the distance from an origin to each position in the target area 1.
  • In the present embodiment, the depth to each position of the target area 1, with the position of the distance measuring sensor 5 as the reference (origin), is acquired as the distance information.
  • An XYZ coordinate system is set with reference to the position of the distance measuring sensor 5. For example, as illustrated in FIG. 1, the coordinate system is set so that the ranging direction of the distance measuring sensor 5 (the direction of the measured distance) is the Z direction and the plane in which the target area 1 is set is the XY plane.
  • The distance measuring sensor 5 measures the distance from the sensor along the Z direction for each XY coordinate value, which is the position information of each position of the target area 1, and acquires it as the distance information.
  • The position information is information indicating each position of the target area 1. For example, it can be defined with a predetermined reference position as the origin, such as the central point of the target area 1 or one of its corner vertices. It is also possible to use the position (coordinate values) of a pixel in a captured image of the target area 1 as the position information of each position, or to specify position information for each object, area, and the like in the target area 1.
  • It is also possible to define the position information of an object or the like by representative position information such as the position of its center of gravity.
  • As the coordinate system, an absolute coordinate system (world coordinate system) may be used instead of a relative coordinate system based on the distance measuring sensor 5.
  • The information (data) that defines the distance information of each position in the target area 1 is not limited to distances associated with XY coordinate values.
  • For the information indicating each position of the target area 1 and the information indicating the distance, any other types of information may be used.
  • In FIG. 1, the space of the illustrated quadrangular pyramid is the measurement target space.
  • The area where the rectangular parallelepiped object 3 is placed on the ground 2 is the target area 1.
  • For the part of the target area 1 where the object 3 is not placed, the distance to the ground 2 is calculated.
  • For the part where the object 3 is placed, the distance to the surface of the object 3 on the distance measuring sensor 5 side is calculated.
  • In the example of FIG. 1, the ranging direction (Z direction) of the distance measuring sensor 5 is the vertical direction (height direction), and the target area 1 is set in the horizontal plane. In this case, it is also possible to calculate the height of each position in the target area 1 based on the distance information acquired by the distance measuring sensor 5.
  • The direction in which the distance measuring sensor 5 is arranged and its ranging direction are not limited, and neither are the size and shape of the target area 1 and the target space.
  • As the distance measuring sensor 5, for example, a stereo camera or a TOF (Time of Flight) camera is used.
  • In addition, this technology can be applied to sensor devices such as Structured Light sensor devices, laser ranging sensors, ultrasonic sensors, LiDAR (Light Detection and Ranging, Laser Imaging Detection and Ranging), and sonar.
  • The information processing device 30 has the hardware necessary for configuring a computer, such as a processor (a CPU, GPU, or DSP), memory (ROM and RAM), and a storage device such as an HDD (see FIG. 31).
  • The information processing method according to the present technology is executed when the CPU loads the program according to the present technology, recorded in advance in the ROM or the like, into the RAM and executes it.
  • The information processing device 30 can be realized by an arbitrary computer such as a PC (Personal Computer); of course, hardware such as an FPGA or ASIC may also be used.
  • The program is installed in the information processing device 30 via, for example, various recording media, or may be installed via the Internet or the like.
  • The type of recording medium on which the program is recorded is not limited, and any computer-readable recording medium may be used.
  • The information processing device 30 acquires the distance map 6 of the target area 1.
  • Acquisition of the distance map 6 by the information processing device 30 includes both receiving a distance map 6 transmitted from the outside and generating the distance map 6 in the information processing device 30 itself.
  • The distance map 6 is map information in which distance information is associated with position information in the target area 1.
  • Map information is information in which an arbitrary parameter is associated with position information in the target area 1; the distance map is one kind of map information.
  • In the present embodiment, a distance map 6 having a correlation with the ranging result of the distance measuring sensor 5 for the target area 1 is used. Specifically, there is a correlation between the position information of the target area 1 included in the distance map 6 and the position information of the target area 1 included in the ranging result of the distance measuring sensor 5, and between the distance direction of the distance information included in the distance map 6 and the distance direction of the distance information included in that ranging result. In the example shown in FIG. 1, a distance map is acquired in which the correspondence with the XY coordinate values can be grasped as the position information of the target area 1 and the distance from the distance measuring sensor 5 along the Z-axis direction can be calculated.
  • The ranging result acquired by the distance measuring sensor 5 itself is also included in the distance map 6.
  • The distance map 6 may also be generated by a ranging device, sensing device, or the like different from the distance measuring sensor 5.
  • In the following, the description assumes that a distance map 6 including position information (XY coordinate values) based on the distance measuring sensor 5 and distance information along the ranging direction (Z direction) of the distance measuring sensor 5 is acquired.
  • The range control unit 35 is configured as a functional block by the CPU executing a predetermined program.
  • Dedicated hardware such as an IC (integrated circuit) may also be used to realize the functional block.
  • The range control unit 35 can control the ranging range of the distance measuring sensor 5 with respect to the target area 1 based on the distance map 6.
  • The ranging range 7 is the range over which ranging is performed.
  • In the present embodiment, the ranging range 7 is defined based on the distance from the distance measuring sensor 5.
  • For example, the ranging range is set from a distance of 4 m from the distance measuring sensor 5 to a distance of 2 m.
  • Distance information is calculated at positions (XY coordinate values) where an object or the like exists within the ranging range.
  • Distance information is not calculated at positions (XY coordinate values) where no object or the like exists within the ranging range.
  • In the case of a stereo camera, the ranging range 7 is defined by the matching range over which stereo matching is performed; that is, the range control unit 35 controls the matching range. For example, a parameter such as 10 pixels is set as the matching range.
  • When matching succeeds within that range, distance information is calculated at the position (XY coordinate values) corresponding to the pixel position of the feature point; distance information is not calculated for feature points for which stereo matching does not succeed within the 10 pixels.
  • In the case of a TOF camera, the ranging range 7 is defined by the time between the start and end of monitoring the reflected light from the subject; that is, the range control unit 35 controls the range of time differences for which the distance is calculated, with respect to the time difference between the emission time of the light emitting unit and the reception time of the light receiving unit.
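  • The following sketch (an assumption-laden illustration, not the patent's implementation) shows how limiting the matching range bounds the work of block matching along an epipolar line; the block size, cost threshold, and toy images are invented for the example.

```python
import numpy as np

def match_disparity(left, right, y, x, max_disp=10, block=5):
    """Search at most max_disp pixels for the best-matching block; return the
    disparity, or None when no candidate matches well enough (in that case no
    distance information is calculated for this feature point)."""
    h = block // 2
    ref = left[y - h:y + h + 1, x - h:x + h + 1].astype(np.float32)
    best_d, best_cost = None, np.inf
    for d in range(max_disp + 1):            # the controlled matching range
        if x - d - h < 0:                    # candidate window out of bounds
            break
        cand = right[y - h:y + h + 1, x - d - h:x - d + h + 1].astype(np.float32)
        cost = np.abs(ref - cand).sum()      # SAD matching cost
        if cost < best_cost:
            best_d, best_cost = d, cost
    return best_d if best_cost < 1000.0 else None  # threshold: assumed value

left = np.random.randint(0, 255, (120, 160)).astype(np.uint8)
right = np.roll(left, -4, axis=1)            # toy pair with true disparity 4
print(match_disparity(left, right, y=60, x=80))   # expected: 4
# Depth follows from Z = f * b / d (focal length f, baseline b), so a narrower
# disparity search corresponds to a narrower band of depths.
```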
  • In the present embodiment, the ranging range 7 is set by a farthest position and a ranging width.
  • The farthest position is the position that serves as the reference of the ranging range 7. For example, in FIG. 1, the surface of the object 3 facing the distance measuring sensor 5 is the farthest position 8 of the object 3, and the region of the ground 2 is the farthest position 9 of the ground 2.
  • The ranging width is the distance (length) of the ranging range 7 with respect to the farthest position.
  • The distance from the farthest position 8 of the object 3 to the dotted space 10 is the ranging width 11 of the object 3.
  • The distance from the farthest position 9 of the ground 2 to the dotted space 12 is the ranging width 13 of the ground 2.
  • The direction of the ranging width is not limited to the Z direction; a ranging width in any direction, such as the X-axis or Y-axis direction shown in FIG. 1, may be set. That is, setting the ranging range 7 corresponds to setting the farthest position and the ranging width.
  • FIG. 2 is a schematic view showing an embodiment of the ranging system 1000 illustrated in FIG.
  • The ranging system 1000 has an input unit 15 and an output unit 20.
  • The ranging system 1000 displays information on the table 25 and lets the user operate the display object 21 displayed on the table 25.
  • The method of displaying the display object 21 on the top surface 26 of the table 25 in this way is also referred to as a projection type.
  • Such display of the display object 21 on the top surface 26 of the table 25 is also referred to as a table top screen.
  • The user can perform various operations on the display object 21 displayed on the table 25 by the output unit 20 with a finger or the like. Further, by placing an object on the top surface 26 of the table 25 and causing the input unit 15 to recognize it, the user can cause the ranging system 1000 to execute various processes related to the object.
  • The input unit 15 includes the distance measuring sensor 5 and can acquire various data regarding the target area 1.
  • The input unit 15 acquires operations by the user and the shapes and patterns of objects placed on the table 25.
  • The distance measuring sensor 5 acquires the heights (Z coordinate values) of the table 25, objects placed on the table 25, and the floor; that is, the distance measuring sensor 5 has the Z-axis direction as its ranging direction.
  • An object placed on the table 25 may also be recognized by analyzing a captured image.
  • An object such as a hand placed above the table 25 can be recognized based on the heights acquired by the stereo camera. Further, the ranging system 1000 can recognize the contact of the user's hand with the table 25 and the withdrawal of the hand from the table based on the heights.
  • When a microphone is used as the input unit 15, a microphone array for collecting sound from a specific direction may be used, and the sound collecting direction may be adjusted to any direction.
  • The configuration of the input unit 15 is not limited; for example, an imaging device such as a single-lens camera that images the table 25 may be used, or a microphone for collecting the sounds emitted by the user or environmental sounds may be used.
  • The output unit 20 displays the display object 21 on the table 25 according to the input information. For example, a GUI (Graphical User Interface) for inputting operations related to control of the ranging range 7 is output.
  • As the output unit 20, a projection device such as a projector is used.
  • The configuration of the output unit 20 is not limited; there may be a plurality of speakers or the like for outputting sound, and a lighting device or the like that projects lighting onto the table 25 may be included.
  • The input unit 15 and the output unit 20 are provided above the table 25, suspended from the ceiling.
  • The method of irradiating the table 25 from above with the output unit 20 to display the display object 21 on the top surface 26 of the table 25 is also referred to as an upper projection type.
  • FIG. 3 is a block diagram showing a functional configuration example of the information processing device 30.
  • The information processing device 30 includes a distance map generation unit 31, a recognition unit 32, a storage control unit 33, a map information DB 34, a range control unit 35, a SLAM (Simultaneous Localization and Mapping) execution unit 36, a GUI output unit 37, a command execution unit 38, and a display control unit 39.
  • The distance map generation unit 31 generates a distance map of the target area 1.
  • In the present embodiment, the distance map of the table 25 and the floor in the target area 1 is generated based on the detection result from the input unit 15. Further, the distance map generation unit 31 can cause the distance measuring sensor 5 to execute ranging according to the ranging range 7 controlled by the range control unit 35.
  • The generated distance map is stored in the map information DB 34 by the storage control unit 33.
  • The recognition unit 32 recognizes target objects in the target area 1.
  • In the present embodiment, target objects arranged on or around the table 25 are recognized.
  • The target objects are not limited and include, for example, any object such as the table 25, a chair, or a mug on the table 25.
  • The recognition unit 32 can also recognize a predetermined operation of an operation object based on the information input by the input unit 15; for example, the touched position of the operation object and the content of the operation are recognized.
  • An operation object is an object for inputting instructions related to operations, and includes, for example, a user's hand, a pen, or a dedicated device. The user can input various instructions to the ranging system 1000 by using an operation object.
  • The storage control unit 33 can control the various information stored in the map information DB 34.
  • The storage control unit 33 can store map information in the map information DB 34 and acquire map information from it.
  • It can also execute various controls such as updating and deleting map information.
  • The range control unit 35 controls the ranging range of the distance measuring sensor 5.
  • In the present embodiment, a farthest position map and a width map are generated based on the distance map generated by the distance map generation unit 31.
  • The farthest position map is map information in which a specified amount for dealing with ranging noise is added to the distance information of each pixel of the distance map.
  • The specified amount is a margin that takes the noise component of depth detection into account. For example, when the distance measuring sensor 5 can measure distances up to 10 m, the range from the sensor to 10 m becomes the distance map, and the map from the sensor to 11 m, obtained by adding 1 m to the distance map as the specified amount, becomes the farthest position map.
  • The specified amount is not limited, and an arbitrary value may be set; for example, it may be set appropriately according to the performance of the distance measuring sensor 5, such as the range over which it can measure distances.
  • The width map is map information in which a ranging width is set for each pixel of the farthest position map. That is, the dotted space 10 on the object 3 shown in FIG. 1 is the width map on the object 3, and the dotted space 12 on the floor is the width map on the floor; combining the two yields the width map of the target area 1.
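  • A minimal sketch of these maps follows (the array layout, the 0.1 m margin, and the per-pixel widths are illustrative assumptions; the text above only requires that the farthest position map equal the distance map plus the specified amount, and that a width be set per pixel):

```python
import numpy as np

distance_map = np.full((480, 640), 4.0)       # floor at 4 m
distance_map[100:200, 150:350] = 3.0          # table top at 3 m

SPECIFIED_AMOUNT = 0.1                        # noise margin, in meters
farthest_map = distance_map + SPECIFIED_AMOUNT

width_map = np.full_like(distance_map, 0.5)   # e.g. 0.5 m everywhere
width_map[distance_map > 3.5] = 1.0           # a wider window over the floor

# Per-pixel ranging window: measure only between near_map and farthest_map.
near_map = farthest_map - width_map
```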
  • The SLAM execution unit 36 executes SLAM based on the information input by the input unit 15. For example, the self-position is estimated from information acquired from a camera, LiDAR, gyro sensor, acceleration sensor, or the like included in the input unit 15. Further, the SLAM execution unit 36 can create an environment map of the surroundings of the device on which the input unit 15 is mounted.
  • The specific algorithm for executing SLAM is not limited, and any algorithm may be used.
  • Machine learning may also be used; for example, an arbitrary machine learning algorithm using a DNN (Deep Neural Network) or the like may be employed.
  • For example, a learning unit and an identification unit are constructed for estimating the self-position.
  • The learning unit performs machine learning based on input information (learning data) and outputs a learning result.
  • The identification unit identifies the input information (judges, predicts, etc.) based on the input information and the learning result.
  • For example, a neural network or deep learning is used as the learning method in the learning unit.
  • A neural network is a model imitating the neural circuits of the human brain and is composed of three types of layers: an input layer, an intermediate (hidden) layer, and an output layer.
  • Deep learning is a model using a multi-layered neural network; by repeating characteristic learning in each layer, it can learn complex patterns hidden in large amounts of data.
  • Deep learning is used, for example, to identify objects in images and words in sounds.
  • For example, a convolutional neural network used for recognizing images and moving images may be used.
  • A neurochip or neuromorphic chip incorporating the concept of a neural network may also be used.
  • Machine learning problem settings include supervised learning, unsupervised learning, semi-supervised learning, reinforcement learning, inverse reinforcement learning, active learning, and transfer learning.
  • The self-position may be estimated using any of these learning methods.
  • An arbitrary learning algorithm different from machine learning may also be used. By estimating the self-position according to a predetermined learning algorithm, it is possible to improve the estimation accuracy of the self-position; of course, the technology is not limited to the case where a learning algorithm is used.
  • The application of a learning algorithm may be executed for any process in the present disclosure.
  • The GUI output unit 37 outputs a GUI for inputting instructions regarding the ranging range. For example, the various information included in the GUI is updated according to operations from the user.
  • The display control unit 39 displays an image of the information updated by the GUI output unit 37 on the top surface 26 of the table 25 as the display object 21.
  • Instructions include a user's voice instruction, execution of a predetermined gesture such as a hand signal, and input of a predetermined operation.
  • Various other input means may also be included in the instructions.
  • A hand signal is an instruction in which operations are associated with hand shapes; examples include a scissors shape ("choki", a state in which only the index finger and the middle finger are extended) and an open hand ("par", a state in which the hand is open).
  • For example, a process of splitting the display object 21 in two and displaying the halves, as if cutting it, may be associated with the scissors shape.
  • Gestures are instructions input by predetermined movements of a hand or fingers; examples include touching the display object 21 with a finger, sliding the finger horizontally (drag), moving two fingers apart (pinch out), and bringing two fingers closer (pinch in).
  • The output of a GUI by the GUI output unit 37 includes both outputting a GUI transmitted from the outside and generating the GUI in the GUI output unit 37 itself.
  • In the following, outputting an image or GUI containing predetermined information may be expressed as displaying that image or GUI.
  • The command execution unit 38 executes the command corresponding to the action performed by the user.
  • In the present embodiment, processing based on the operation of an operation object in the target area 1 is executed. For example, when the user moves a finger while touching the display object 21 displayed on the top surface 26 of the table 25, the display object 21 is moved so as to follow the movement of the finger.
  • The type of command to be executed may be changed appropriately according to the application being executed.
  • The display control unit 39 controls the display objects output by the output unit 20; for example, the position and size of the display object 21 are controlled based on the command executed by the command execution unit 38.
  • In the present embodiment, the distance map generation unit 31 corresponds to the acquisition unit that acquires map information in which distance information is associated with position information in the target area.
  • The range control unit 35 corresponds to the range control unit that controls the ranging range of the ranging sensor with respect to the target area based on the map information.
  • The farthest position corresponds to the maximum distance to be measured.
  • The GUI output unit 37 corresponds to the GUI output unit that outputs a GUI for inputting instructions regarding the ranging range.
  • The command execution unit 38 corresponds to the processing execution unit that executes processing based on the operation of an operation object in the target area, based on the result of ranging the target area by the ranging sensor.
  • FIG. 4 is a flowchart showing a basic execution example of controlling the ranging range.
  • FIG. 4A is a flowchart showing a process until the map information is acquired.
  • As a preliminary preparation for controlling the ranging range, the farthest position map and the width map are generated based on the distance map.
  • The distance map generation unit 31 generates a distance map of the target area 1 based on the sensing result acquired by the distance measuring sensor 5 (step 101).
  • The range control unit 35 generates the farthest position map from the generated distance map (step 102).
  • The range control unit 35 generates a width map from the farthest position map by a predetermined method of determining the ranging width (step 103).
  • The distance map, farthest position map, and width map generated in steps 101 to 103 are included in the reference map information.
  • The reference map information is the map information when the target area 1 is in a predetermined reference state. For example, in FIG. 2,
  • the state in which the table 25 is arranged in the target area 1 is the reference state. The distance map, farthest position map, and width map generated as the reference map information can therefore be called the reference distance map, reference farthest position map, and reference width map.
  • The reference map information is not limited to this and may include arbitrary map information acquired as a preliminary preparation for controlling the ranging range.
  • FIG. 4B is a flowchart showing control for each distance measurement.
  • Ranging is performed by the distance measuring sensor 5 based on the generated width map.
  • Ranging is performed pixel by pixel by the distance measuring sensor 5 (step 201).
  • The range control unit 35 controls the ranging range based on the value of the farthest position map at each pixel (the Z coordinate of the farthest position) and the value of the width map at each pixel (the ranging width), and the distance measuring sensor 5 performs ranging accordingly (step 202).
  • When the ranging in step 202 has been performed for all pixels of the distance map, the ranging is completed (step 203).
  • Steps 201 to 203 are the ranging process performed once the ranging range has been determined.
  • That is, steps 201 to 203 are executed in a state where the reference map information has been generated in advance.
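  • Steps 201 to 203 can be sketched as follows (hypothetical code; measure_in_window stands in for the sensor's actual gated measurement):

```python
def measure_target_area(farthest_map, width_map, measure_in_window):
    h, w = farthest_map.shape
    result = [[None] * w for _ in range(h)]
    for y in range(h):                       # step 201: pixel-by-pixel ranging
        for x in range(w):
            far = farthest_map[y, x]         # Z coordinate of farthest position
            near = far - width_map[y, x]     # ranging width behind it
            # step 202: the sensor only searches between near and far
            result[y][x] = measure_in_window(y, x, near, far)
    return result                            # step 203: all pixels measured
```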
  • The timing at which the distance map, farthest position map, and width map are generated is not limited.
  • The distance map, farthest position map, and width map may also be generated at the timing immediately before the target area 1 is measured.
  • In the following, the map information at the timing immediately before the distance measuring sensor 5 executes ranging on the target area 1 is described as the immediately preceding map information.
  • The distance map, farthest position map, and width map generated as the immediately preceding map information can be called the immediately preceding distance map, immediately preceding farthest position map, and immediately preceding width map.
  • FIG. 5 is a schematic diagram showing a distance map and a farthest position map.
  • FIG. 5A is a schematic diagram showing a distance map.
  • FIG. 5B is a schematic view showing the farthest position map.
  • The distance map generation unit 31 acquires the distance map 40 of the target area.
  • In the present embodiment, a distance map relating to the area between the table 25 and the floor is acquired as the distance map 40 of the target area.
  • The range control unit 35 generates the farthest position map 41 from the generated distance map 40.
  • The farthest position map 41 is obtained by adding the specified amount a to the distance map 40.
  • FIG. 6 is a schematic view showing a width map.
  • FIG. 6A is a schematic view of the width map viewed from the Y direction.
  • FIG. 6B is a schematic view of the width map viewed from the Z direction.
  • The width map 43 is generated from the farthest position map by a predetermined method of determining the ranging width.
  • In the present embodiment, a width map is generated for the area 44 of the table 25 and the area 45 of the floor. That is, in ranging the area 44 of the table 25, the position obtained by adding the specified amount a to the top surface 26 of the table 25 is the farthest position 46, and the range extending the ranging width 47 from the farthest position 46 becomes the ranging range measured by the distance measuring sensor 5.
  • Similarly, the position obtained by adding the specified amount a to the floor is the farthest position 48, and the range extending the ranging width 49 from that farthest position becomes the ranging range measured by the distance measuring sensor 5.
  • In FIG. 6B, the regions with different distance information, the area 44 of the table 25 and the area 45 of the floor, are shown in different colors; the closer a region is to the distance measuring sensor 5, the darker its color is shown.
  • FIG. 7 is a schematic diagram showing another example of the width map.
  • FIG. 7A is a schematic view showing another example of the width map as viewed from the Y direction.
  • FIG. 7B is a schematic view of another example of the width map viewed from the Z direction.
  • When the table 25 is tilted, the value of the farthest position (the Z coordinate of the farthest position) differs for each position, and the shape of the width map 50 accordingly follows the tilt of the table 25. In this case, as shown in FIG. 7B, the width map is displayed in gradation.
  • FIG. 8 is a schematic diagram showing a specific example of a predetermined method for determining the distance measurement width.
  • The predetermined method of determining the ranging width shown in FIG. 8 determines the ranging width so that the detection processing load of the distance measuring sensor 5 becomes constant at each position in the target area.
  • The detection processing load is the processing load of detection by the distance measuring sensor 5.
  • The detection processing load increases when a wider range is measured more finely. That is, when ranging is performed for each pixel in the target area, controlling the ranging width to be narrow (small) reduces the detection processing load of the distance measuring sensor 5.
  • In pattern 1, the ranging width becomes wider (longer) the farther the farthest position is from the distance measuring sensor 5.
  • Therefore, the ranging width 53 of the floor is wider than the ranging width 54 of the table 25.
  • The "method of determining the ranging width so that the detection processing load of the distance measuring sensor 5 at each position in the target area is constant" shown in FIG. 8 is hereinafter described as the pattern 1 method of determining the ranging width, or simply pattern 1.
  • FIG. 9 is a schematic diagram showing a specific example of a predetermined method for determining the distance measurement width.
  • FIG. 9A is a schematic view showing a width map.
  • FIG. 9B is a schematic view of the width map viewed from the Z direction.
  • The predetermined method of determining the ranging width shown in FIG. 9 determines the ranging width so as to measure the portion beyond a predetermined distance from the distance measuring sensor 5.
  • In this case, the width map 60 covers the region separated from the distance measuring sensor 5 by at least the distance A.
  • The predetermined distance A is set so that the detection processing load of the distance measuring sensor 5 does not exceed a threshold value.
  • Here, (processing amount of the distance measuring sensor 5 on the table 25) × (area 61 of the top surface 26 of the table 25) + (processing amount of the distance measuring sensor 5 on the floor) × (floor area 62) gives the detection processing load.
  • The floor area 62 is the area obtained by subtracting the table area 61 from the total area of the target area 1.
  • The calculation method of the distance A is not limited, and it may be set arbitrarily by the user. In the present embodiment, the distance A corresponds to the minimum distance to be measured, and the threshold value of the detection processing load corresponds to the predetermined processing amount related to ranging.
  • The "method of determining the ranging width so as to measure the portion beyond a predetermined distance from the distance measuring sensor 5" shown in FIG. 9 is hereinafter described as the pattern 2 method of determining the ranging width, or simply pattern 2.
  • FIG. 10 is a schematic diagram showing a specific example of a predetermined method for determining the distance measurement width.
  • FIG. 10A is a schematic view of the width map viewed from the Y direction.
  • FIG. 10B is a schematic view of the width map viewed from the Z direction.
  • The predetermined method of determining the ranging width shown in FIG. 10 determines the ranging width manually for each pixel in the target area.
  • In FIG. 10, a person 65 with an arm extended into the target area is detected outside the table 25.
  • The user determines a ranging width 66 capable of detecting the height of the person 65 and the height of the person's arm.
  • The outside of the table 25 is the space not included in the three-dimensional space extending toward the distance measuring sensor 5 from the area enclosed by the XY coordinates of the four corners of the top surface 26 of the table 25.
  • The width map 67 of the person 65 has a rectangular shape elongated in the Y direction, taking into account that the person 65 moves in the Y direction.
  • In FIG. 10, a person 68 sitting on a chair outside the table 25 is also detected.
  • For the width map 69 of the person 68, the height of the person 68 in the sitting state is set as the ranging width 70, and the extent of the width map 69 around the person 68 is determined by the user.
  • The "method of manually determining the ranging width for each pixel in the target area" shown in FIG. 10 is hereinafter described as the pattern 3 method of determining the ranging width, or simply pattern 3.
  • FIG. 11 is a schematic diagram showing a ranging width control GUI related to pattern 3.
  • The ranging width control GUI 75 output by the GUI output unit 37 is displayed as a display object 21 on the top surface 26 of the table 25.
  • FIG. 11A is a schematic view showing the ranging width control GUI 75 when the distance map is viewed from the Z direction. As shown in FIG. 11A, the distance map 76 of the target area generated by the distance map generation unit 31 is displayed, and a rectangle 77 elongated in the X direction is displayed on the distance map 76.
  • The length of the rectangle 77 in the Y direction corresponds to one pixel of the distance map 76,
  • and its length in the X direction corresponds to the length of the distance map 76 in the X direction.
  • The user can move the rectangle 77 in the Y direction by executing a corresponding operation via the ranging width control GUI 75.
  • FIG. 11B is a schematic view showing the ranging width control GUI 75 viewed from the X direction at the rectangle 77 of the distance map 76.
  • The ranging width control GUI 75 shown in FIG. 11B may be displayed on the top surface 26 of the table 25 at the same time as FIG. 11A.
  • A rectangle 77 is displayed on the distance map 76 viewed from the X direction.
  • The convex portion 78 indicates the position of the table 25, which is close to the distance measuring sensor 5.
  • The recess 79 indicates the position of the floor, which is far from the distance measuring sensor 5.
  • The user can move the rectangle 77 in the Y direction by an operation via the ranging width control GUI 75. Further, as shown in FIG. 11B, a point 80 is displayed at the position indicating the top surface 26 of the table 25 within the rectangle 77. The user can move the point 80 within the rectangle 77 in the Z-axis direction via the GUI. In the present embodiment, the position at which the point 80 was placed before moving is the farthest position of that pixel (the position of the rectangle 77).
  • FIG. 11C is a schematic view showing the ranging width control GUI 75 when the point 80 is moved.
  • A rectangle 81 indicating the ranging width is displayed between the top surface 26 of the table 25 and the moved point 80 within the rectangle 77. That is, the length of the rectangle 81 in the Z direction indicates the ranging width in the Z direction, and the user thereby sets the ranging width of the one pixel of the distance map 76 at which the rectangle 77 is located.
  • FIG. 11D is a schematic view showing the ranging width control GUI 75 when the rectangle 77 is moved.
  • In FIG. 11D, the rectangle 81 is displayed so as to cover the distance map 76.
  • That is, whereas the ranging width of one pixel is set in FIG. 11C,
  • the ranging widths of all pixels in the target area, that is, the width map, are set in FIG. 11D.
  • In this way, via the ranging width control GUI 75, the user can control the ranging width with respect to the distance map 76 of the target area and generate the width map.
  • The user may be informed to that effect.
  • The information displayed on the ranging width control GUI 75 is not limited.
  • For example, the width map shown in FIG. 6B may be displayed on the GUI.
  • The user may arbitrarily set the type and shade of the colors.
  • The width map may also be displayed in gradation, as shown in FIG. 7B.
  • The width map shown in FIG. 6B may also be projected onto the top surface 26 of the table 25 when the distance measuring sensor 5 is calibrated. This lets the user visually confirm whether the distance information is correctly acquired by the distance measuring sensor 5.
  • FIG. 12 is a schematic view showing a specific example of a predetermined method for determining the distance measurement width.
  • FIG. 12A is a schematic view of the width map viewed from the Y direction.
  • FIG. 12B is a schematic view of the width map viewed from the Z direction.
  • The predetermined method of determining the ranging width shown in FIG. 12 divides the distance map into areas in which close distances are continuous, and determines the ranging width for each area.
  • In FIG. 12, ranging widths are manually determined for the areas of different height: the area 83 of the table 25 and the area 84 of the floor.
  • The range control unit 35 controls the ranging width for each set area.
  • The "method of dividing the distance map into areas in which close distances are continuous and determining the ranging width for each area" shown in FIG. 12 is hereinafter described as the pattern 4 method of determining the ranging width, or simply pattern 4.
  • FIG. 13 is a schematic view showing the ranging width control GUI 85 for pattern 4.
  • FIG. 13A is a schematic view of the distance map viewed from the Z direction.
  • In FIG. 13A, the distance map 86 of the target area generated by the distance map generation unit 31 is displayed.
  • The area 83 of the table 25 and the area 84 of the floor, which have different distance information, are displayed in different colors.
  • A rectangle 87 elongated in the X direction is displayed on the distance map 86.
  • The length of the rectangle 87 in the Y direction corresponds to one pixel of the distance map 86,
  • and its length in the X direction corresponds to the length of the distance map 86 in the X direction.
  • The user can move the rectangle 87 in the Y direction by an operation via the ranging width control GUI 85.
  • FIG. 13B is a schematic view of the distance map 86 viewed from the X direction.
  • The ranging width control GUI 85 shown in FIG. 13B may be displayed on the top surface 26 of the table 25 at the same time as FIG. 13A.
  • In FIG. 13B, a rectangle 87 corresponding to the area 83 of the table 25 shown in FIG. 13A is displayed.
  • In pattern 4, the user can move the rectangle 87 for each area.
  • A line 88 is displayed at the position indicating the top surface 26 of the table 25 within the rectangle 87.
  • The user can move the line 88 within the rectangle 87 in the Z-axis direction via the ranging width control GUI 85.
  • In the present embodiment, the position at which the line 88 was placed before moving is the farthest position of that area.
  • FIG. 13C is a schematic view showing the ranging width control GUI 85 when the line 88 is moved.
  • The width map 89 of the area 83 of the table 25 is displayed between the top surface 26 of the table 25 and the moved line 88 within the rectangle 87.
  • The length of the width map 89 in the Z-axis direction indicates the ranging width.
  • FIG. 14 is a schematic diagram showing a specific example of a predetermined method for determining the distance measurement width.
  • The predetermined method of determining the ranging width shown in FIG. 14 determines the ranging width by performing two-step detection.
  • Two-step detection consists of detection with a low processing load, such as a coarse distance-map resolution or a low resolution in the distance direction (Z-axis direction), followed by detection with a high processing load, such as a fine distance-map resolution or a high resolution in the distance direction.
  • In the following, detection with a low detection processing load may be described as low accuracy,
  • and detection with a high detection processing load as high accuracy.
  • In the present embodiment, the high accuracy corresponds to the first ranging accuracy, and the low accuracy corresponds to the second ranging accuracy.
  • FIG. 14A is a schematic view showing a state in which detection with a low detection processing load is executed in the target area.
  • In FIG. 14A, low-load detection is performed using a width map 92 with a fixed ranging width 91, based on the values of the farthest position map that the range control unit 35 generated from the distance map of the target area.
  • Alternatively, the low-load detection may be performed on a width map using a ranging width obtained by patterns 1 to 4, based on the values of the farthest position map.
  • FIG. 14B is a schematic view showing a state in which detection with a high detection processing load is executed in the target area.
  • The heights of the person 93 and the table 25 are detected by the low-load detection performed in FIG. 14A.
  • The range control unit 35 then controls the ranging width of each pixel corresponding to the position of the person 93 or the table 25 so as to sufficiently include the length of the person 93 or the table 25 in the height direction. As a result, a high-accuracy width map for each pixel in the target area is generated.
  • The "method of determining the ranging width by performing two-step detection" shown in FIG. 14 is hereinafter described as the pattern 5 method of determining the ranging width, or simply pattern 5.
  • FIG. 15 is a schematic view showing a specific example of a predetermined method for determining the distance measurement width.
  • The predetermined method of determining the ranging width shown in FIG. 15 determines the ranging width by using another distance measuring sensor 95 arranged at a position different from that of the distance measuring sensor 5.
  • FIG. 15A is a schematic view showing the target area being detected by the other distance measuring sensor 95.
  • The distance measuring sensor 95 detects the target area from a direction different from the ranging direction (Z-axis direction) of the distance measuring sensor 5.
  • In FIG. 15, the ranging direction of the distance measuring sensor 95 is the Y-axis direction; that is, the distance measuring sensor 95 is arranged on the Y axis.
  • The person 96 with an arm extended in the target area and the table 25 are detected based on the sensing result of the distance measuring sensor 95.
  • The range control unit 35 controls the ranging width of each pixel corresponding to the position of the person 96 or the table 25 so as to sufficiently include the length of the person 96 or the table 25 in the height direction.
  • In this case, the position information of the distance map acquired by the distance measuring sensor 95 is the XZ coordinate values based on the distance measuring sensor 95, and its distance information is along the ranging direction (Y direction) of the distance measuring sensor 95.
  • The "method of determining the ranging width by using another distance measuring sensor 95 arranged at a position different from that of the distance measuring sensor 5" shown in FIG. 15 is hereinafter described as the pattern 6 method of determining the ranging width, or simply pattern 6.
  • FIG. 16 is a flowchart showing a specific example of control of the ranging range.
  • The widest range over which the distance measuring sensor 5 can measure the target area is measured, and the distance map generation unit 31 generates the distance map A (step 301).
  • Step 301 is performed with the target objects on and around the table 25 removed.
  • The range control unit 35 generates the farthest position map B by adding the specified amount a to each pixel of the distance map A, and the storage control unit 33 saves the farthest position map B in the map information DB 34 (step 302).
  • The range control unit 35 determines whether one of patterns 1 to 4 of the predetermined methods of determining the ranging width is selected (step 303).
  • When patterns 1 to 4 are used (YES in step 303), the method of determining the ranging width selected by the user is applied to each pixel of the farthest position map B.
  • The range control unit 35 generates the width map C, and the storage control unit 33 saves the width map C in the map information DB 34 (step 304).
  • When step 304 has been executed, or when patterns 1 to 4 are not used (NO in step 303), the control of the ranging range ends.
  • The distance map A, farthest position map B, and width map C generated in steps 301 to 304 are reference map information.
  • In the following, they are referred to as the reference distance map A, reference farthest position map B, and reference width map C.
  • Similarly, the distance map, farthest position map, and width map included in the immediately preceding map information are described as the immediately preceding distance map, immediately preceding farthest position map, and immediately preceding width map.
  • FIG. 17 is a flowchart showing a specific example of the control performed for each distance measurement. It is first determined whether the user has selected pattern 5 or 6 of the predetermined methods for determining the distance measurement width (step 401). When pattern 5 or 6 is used (YES in step 401), the range control unit 35 generates the immediately preceding width map C based on the reference farthest position map B (step 402). When patterns 1 to 4 are used and the reference width map C has been generated (NO in step 401), or when pattern 5 or 6 is used and the immediately preceding width map C has been generated (YES in step 401), the distance measuring sensor 5 measures the distance pixel by pixel (step 403).
  • The range control unit 35 controls the distance measuring range based on the value of the reference farthest position map B and the value of the immediately preceding width map C at each pixel, and the distance measuring sensor 5 performs the distance measurement (step 404). When the distance measurement in step 404 has been performed for all the pixels of the reference distance map A, the distance measurement is completed (step 405). A per-pixel sketch of steps 403 to 405 follows below.
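  • The per-pixel control of steps 403 to 405 might look like the sketch below. The `sensor.measure` call is hypothetical, since the disclosure does not define a concrete sensor interface; the point is only that each pixel is searched inside its own [farthest − width, farthest] window.

```python
import numpy as np

def measure_all_pixels(sensor, farthest_map_b, width_map_c):
    """Steps 403-405: for every pixel, restrict the ranging range to
    [farthest - width, farthest] and measure (sensor API is hypothetical)."""
    result = np.zeros_like(farthest_map_b)
    for y, x in np.ndindex(farthest_map_b.shape):
        far = farthest_map_b[y, x]
        near = far - width_map_c[y, x]
        result[y, x] = sensor.measure(x, y, near=near, far=far)
    return result
```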
  • The number of predetermined width-determination methods that can be selected is not limited, and patterns 1 to 6 may be combined.
  • the following combination patterns 1 to 4 are examples of combinations of patterns 1 to 6.
  • Combination pattern 1: using pattern 3 or pattern 4, the distance measurement width is determined for a partial area A of the target area. For the area B other than area A, the distance measurement width is determined by pattern 1 or pattern 2.
  • steps 303 and 304 shown in FIG. 16 are executed for the area A. Further, step 303 and step 304 are similarly executed for the area B as well.
  • the width map in the area A and the width map in the area B are acquired, and the distance measurement of the area A and the area B is performed in steps 403 to 405.
  • Combination pattern 2: using pattern 3 or pattern 4, the distance measurement width is determined for a partial area A of the target area. For the area B other than area A, the distance measurement width is determined by pattern 5.
  • Combination pattern 3: using pattern 3 or pattern 4, the distance measurement width is determined for a partial area A of the target area. For the area B other than area A, the distance measurement width is determined by pattern 6.
  • Combination pattern 4: the distance measurement width of the distance measuring sensor 5 is determined using pattern 1 or pattern 2. When a target object detected using pattern 5 or pattern 6 does not fit within the distance measurement width determined by pattern 1 or pattern 2, the system either switches to detection with a low processing load or adjusts the distance measurement width using the detection result of the sensing device.
  • the distance measurement width of the area of the table 25 is manually determined by the pattern 3 or 4.
  • the width map of the area outside the table 25 can be set by a predetermined algorithm according to the patterns 1, 2, 5 and 6.
  • Moreover, the distance measurement width may fail to be set correctly due to erroneous detection by the distance measuring sensor 5; by using pattern 3 or 4, the target object on the table 25 can be detected reliably.
  • With patterns 5 and 6 alone, the detection processing load cannot be limited. Therefore, in combination pattern 4, the upper limit of the distance measurement width is determined by pattern 1 or 2. Further, by switching to detection with a low processing load for the portion exceeding that width in pattern 5 or 6, the detection processing load can be prevented from exceeding the threshold value (see the clamping sketch below).
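  • A minimal sketch of this upper-limit clamping, assuming per-pixel width maps held as NumPy arrays; flagging the clamped pixels for hand-over to low-load detection is one illustrative reading of combination pattern 4.

```python
import numpy as np

def clamp_width_map(dynamic_width, upper_limit_width):
    """Cap the pattern 5/6 widths with the pattern 1/2 upper limit; pixels whose
    requested width exceeded the limit are flagged for low-load detection."""
    exceeded = dynamic_width > upper_limit_width
    clamped = np.minimum(dynamic_width, upper_limit_width)
    return clamped, exceeded
```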
  • As described above, the information processing device 30 acquires the map information in which the distance information is associated with the position information in the target area 1, and controls the distance measuring range 7 of the distance measuring sensor 5 with respect to the target area 1 based on that map information. This makes it possible to operate the distance measuring sensor efficiently.
  • the ranging range of a ranging sensor such as a stereo camera is controlled based on the map information of the target area. This makes it possible to reduce the detection processing load of the distance measuring sensor and avoid erroneous detection. Further, by reducing the detection processing load, it is possible to expand the ranging range.
  • <Second embodiment> In the first embodiment, the distance measurement width was determined after the distance map had been acquired.
  • In the second embodiment, a low-precision distance map is acquired every frame, and the farthest position map is generated based on the low-precision distance map.
  • FIG. 18 is a schematic diagram showing a distance map and a width map of a second embodiment according to the present technology.
  • FIG. 18A is a schematic diagram showing a low-precision distance map.
  • FIG. 18B is a schematic view showing a high-precision width map.
  • the distance map generation unit 31 generates a low-precision distance map 98 for each frame.
  • a high-precision width map 99 is generated using the generated low-precision distance map 98.
  • FIG. 19 is a flowchart showing a specific example of control of the ranging range of the second embodiment according to the present technology.
  • the distance measuring sensor 5 measures the widest range it can cover in the target area, and the reference distance map A is generated by the distance map generation unit 31 (step 502).
  • the range control unit 35 generates a reference farthest position map B in which a specified amount a is added to each pixel of the reference distance map A. Further, the storage control unit 33 saves the reference farthest position map B in the map information DB 34 (step 503).
  • Using whichever of patterns 1 to 4 the user has selected for each pixel of the reference farthest position map B, the range control unit 35 generates the reference width map C, and the storage control unit 33 saves the reference width map C in the map information DB 34 (step 504).
  • When step 504 has been executed, or when patterns 1 to 4 are not used (NO in step 501), the control of the ranging range ends.
  • FIG. 20 is a flowchart showing a specific example of control for each distance measurement according to the second embodiment of the present technology.
  • the distance is measured with low accuracy over the range measurable by the distance measuring sensor 5, and the distance map generation unit 31 generates the immediately preceding distance map D (step 601).
  • the range control unit 35 generates the immediately preceding farthest position map E by adding a specified amount a to each pixel of the immediately preceding distance map D (step 602).
  • the range control unit 35 determines whether the pattern 5 or 6 is selected by the user (step 603). When the pattern 5 or 6 is used (YES in step 603), the range control unit 35 generates the immediately preceding width map C based on the immediately preceding farthest position map E (step 604).
  • the distance measuring sensor 5 performs distance measurement for each pixel (step 605).
  • the range control unit 35 controls the ranging range based on the value of the immediately preceding farthest position map E and the value of the immediately preceding width map C at each pixel, and the distance measuring sensor 5 performs the distance measurement (step 606).
  • When the distance measurement in step 606 has been performed for all the pixels of the immediately preceding distance map D, the distance measurement is completed (step 607).
  • Since the low-precision detection result of the same frame is used, the distance measurement width is adjusted with high accuracy even if the position of the target object 101 on the table 25 changes; a highly accurate immediately preceding width map that follows the moved target object 101 is therefore generated. One frame of this two-pass scheme is sketched below.
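  • One frame of the two-pass scheme might be written as follows. The sensor methods and the fixed width standing in for patterns 5 and 6 are assumptions for illustration, not an API taken from the disclosure.

```python
import numpy as np

SPECIFIED_AMOUNT_A = 50.0  # assumed value of the specified amount a (mm)

def per_frame_ranging(sensor, width_mm=100.0):
    """Steps 601-607: a cheap low-precision pass fixes the search window
    for the precise pass of the same frame (sensor API is hypothetical)."""
    distance_map_d = sensor.measure_low_precision()        # step 601
    farthest_map_e = distance_map_d + SPECIFIED_AMOUNT_A   # step 602
    width_map_c = np.full_like(distance_map_d, width_mm)   # step 604
    near = farthest_map_e - width_map_c
    return sensor.measure_high_precision(near=near, far=farthest_map_e)  # steps 605-607
```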
  • <Third embodiment> In the above embodiments, a stereo camera was used as the distance measuring sensor 5. In the third embodiment, a TOF camera is used as the distance measuring sensor 5. Further, in the third embodiment, grouping is performed on regions of the width map in which the farthest position and the distance measurement width are close.
  • FIG. 21 is a flowchart showing a specific example of control of the ranging range according to the third embodiment according to the present technology.
  • the distance measuring sensor 5 measures the widest range it can cover in the target area, and the distance map generation unit 31 generates the reference distance map A (step 701).
  • the range control unit 35 generates a reference farthest position map B in which a specified amount a is added to each pixel of the reference distance map A. Further, the storage control unit 33 saves the reference farthest position map B in the map information DB 34 (step 702).
  • the range control unit 35 determines whether patterns 1 to 4 are selected (step 703).
  • When patterns 1 to 4 are used (YES in step 703), whichever of patterns 1 to 4 the user has selected is applied to each pixel of the reference farthest position map B.
  • the range control unit 35 generates the reference width map C, and the storage control unit 33 saves the reference width map C in the map information DB 34 (step 704).
  • When step 704 has been executed, or when patterns 1 to 4 are not used (NO in step 703), the control of the ranging range ends.
  • When the combination patterns described above are used, steps 703 and 704 shown in FIG. 21 are executed for area A, and steps 703 and 704 are likewise executed for area B.
  • FIG. 22 is a flowchart showing a specific example of control for each distance measurement according to the third embodiment of the present technology.
  • the range control unit 35 determines whether the pattern 5 or 6 is selected by the user (step 801).
  • When pattern 5 or 6 is used (YES in step 801), the range control unit 35 generates the immediately preceding width map C based on the reference farthest position map B (step 802).
  • When patterns 1 to 4 are used and the reference width map C has been generated (NO in step 801), or when pattern 5 or 6 is used and the immediately preceding width map C has been generated (YES in step 801), a grouping list I is created for each area in which the per-pixel values of the reference farthest position map B and the immediately preceding width map C are close.
  • FIG. 23 is a schematic diagram showing a specific example of grouping.
  • FIG. 23A is a schematic view of the target region as viewed from the Y direction.
  • FIG. 23B is a schematic view showing a width map seen from the Z direction.
  • the target object 105 and the target object 106 are arranged on the table 25.
  • the distance map generation unit 31 generates a distance map of the target area including the table 25, the target object 105, and the target object 106. From the generated distance map, a width map is generated in step 704 of FIG. 21 or step 802 of FIG. 22.
  • As shown in FIG. 23B, a group list I including groups 1 to 3 is created, and the value J of the farthest position of each group and the value K of the distance measuring width that can cover the height of each group are determined (step 803). Distance measurement is then performed for each group in the group list I by the distance measuring sensor 5 (step 804).
  • the range control unit 35 controls the ranging range based on the value J of the farthest position of each group and the value K of the ranging width of each group, and the distance measuring sensor 5 performs the distance measurement (step 805).
  • When the distance measurement in step 805 has been performed for all the groups in the group list I, the distance measurement is completed (step 806). A sketch of the grouping in step 803 follows below.
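  • The grouping of step 803 can be sketched as a simple bucketing of pixels by quantized farthest position and width; the tolerance value and the dictionary-based clustering are assumptions chosen for illustration.

```python
import numpy as np

def build_group_list(farthest_map, width_map, tol=30.0):
    """Step 803 (sketch): bucket pixels whose farthest position and ranging
    width are close, so that one emission setting can cover each group."""
    groups = {}
    for y, x in np.ndindex(farthest_map.shape):
        key = (round(farthest_map[y, x] / tol), round(width_map[y, x] / tol))
        groups.setdefault(key, []).append((y, x))
    group_list = []
    for pixels in groups.values():
        j = max(farthest_map[y, x] for y, x in pixels)  # farthest position J of the group
        k = max(width_map[y, x] for y, x in pixels)     # width K covering the group
        group_list.append({"pixels": pixels, "J": float(j), "K": float(k)})
    return group_list
```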
  • In this way, pixels of the target area that belong to the same group can be measured at the same time, so the distance can be measured efficiently.
  • a TOF camera having a light emitting unit and a light receiving unit needs to weaken the emitted light intensity when measuring a close position, because strong light would saturate the light receiving unit when the reflection returns.
  • the TOF camera measures the distance by measuring the time until the emitted light is reflected by the subject and the reflected light is received. That is, the farther the distance measurement target is, the longer it takes to receive the reflected light.
  • the TOF camera needs to change the light emission pattern for each combination of farthest position and distance measurement width, and by limiting the measured distance width, the detection processing load and the processing time can be reduced. Further, the TOF camera calculates the distance from the waveform or intensity of the received light, and the processing amount can be reduced by limiting this calculation to the target range only (see the timing sketch below).
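  • Because light travels at a known speed, restricting the distance window directly restricts the time window of echoes that must be processed. A small worked sketch (units and window layout are assumptions):

```python
C_MM_PER_NS = 299.792458  # speed of light in mm per nanosecond

def tof_echo_window(farthest_mm, width_mm):
    """Round-trip time window for a ranging window [farthest - width, farthest];
    only echoes arriving inside it need distance calculation."""
    t_far = 2.0 * farthest_mm / C_MM_PER_NS
    t_near = 2.0 * (farthest_mm - width_mm) / C_MM_PER_NS
    return t_near, t_far

# A window from 800 mm to 1000 mm corresponds to echoes between about 5.3 and 6.7 ns.
print(tof_echo_window(1000.0, 200.0))
```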
  • <Fourth embodiment> In the above embodiments, the distance measuring sensor 5 (input unit 15) was fixed at a position facing the top surface 26 of the table 25. In the fourth embodiment, SLAM is used to generate an environment map by the distance map generation unit 31, and the environment map is stored in the map information DB 34 by the storage control unit 33.
  • An environment map is a map that includes the self-position of a device equipped with SLAM and a map of its surroundings. For example, when a device equipped with SLAM is indoors, its self-position and position information such as walls and furniture are generated as an environment map. That is, the environment map can be said to be a distance map in which distance information is associated with surrounding position information referenced to the device equipped with SLAM.
  • FIG. 24 is a flowchart showing a specific example of control of the ranging range according to the fourth embodiment according to the present technology.
  • the distance map generation unit 31 generates a distance map of an arbitrary viewpoint in the target area. Further, the SLAM execution unit 36 acquires the self-position information of the distance measuring sensor 5 (a device such as an HMD (Head Mounted Display) equipped with the distance measuring sensor 5). The self-position information includes the three-dimensional position of the distance measuring sensor 5 and the distance measuring direction.
  • the distance map generation unit 31 generates an environment map L based on the distance map and the self-position information (step 901). In the present embodiment, movable target objects in the target area are excluded.
  • Specifically, the environment map is generated based on the self-position information before and after the movement of the distance measuring sensor 5 and the distance maps corresponding to the frames before and after the movement.
  • A movable state is, for example, a state in which the object is not fixed to an immovable object such as the floor, or a state in which the target object can easily be carried.
  • the environment map L generated in step 901 is reference map information.
  • FIG. 25 is a flowchart showing a specific example of control for each distance measurement according to the fourth embodiment of the present technology.
  • the distance map generation unit 31 generates an immediately preceding distance map M based on the distance measurement sensor 5 from the environment map L (step 1001).
  • the range control unit 35 generates the immediately preceding farthest position map N by adding a specified amount a to each pixel of the immediately preceding distance map M (step 1002).
  • the range control unit 35 generates the immediately preceding width map O from the patterns 1, 2, 5, or 6 selected by the user (step 1003).
  • Distance measurement is performed pixel by pixel by the distance measurement sensor 5 (step 1004).
  • the range control unit 35 controls the ranging range based on the value of the immediately preceding farthest position map N and the value of the immediately preceding width map O at each pixel, and the distance measuring sensor 5 performs the distance measurement (step 1005). When the distance measurement in step 1005 has been performed for all the pixels of the immediately preceding distance map M, the distance measurement is completed (step 1006). A sketch of the projection in step 1001 follows below.
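  • Step 1001 amounts to rendering the environment map into the sensor's current view. The sketch below assumes a pinhole camera model with known intrinsics and a camera-to-world pose from SLAM; these specifics are not given in the disclosure.

```python
import numpy as np

def render_prior_depth(env_points, rot_cw, trans_cw, fx, fy, cx, cy, shape):
    """Project the SLAM environment map L (Nx3 world points) into the current
    sensor view to obtain the immediately preceding distance map M (step 1001)."""
    # World -> camera: p_cam = R^T (p_world - t), written row-wise.
    cam = (env_points - trans_cw) @ rot_cw
    depth = np.full(shape, np.inf)
    for x_c, y_c, z_c in cam:
        if z_c <= 0.0:
            continue  # point is behind the sensor
        u, v = int(fx * x_c / z_c + cx), int(fy * y_c / z_c + cy)
        if 0 <= v < shape[0] and 0 <= u < shape[1]:
            depth[v, u] = min(depth[v, u], z_c)  # keep the nearest surface
    return depth
```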
  • this technology can be applied even when the position changes due to the distance measurement sensor 5 being mounted on an HMD or the like.
  • <Fifth embodiment> In the fourth embodiment, SLAM was used when the position of the distance measuring sensor 5 fluctuated. In the fifth embodiment, the low-precision distance map generated for each frame, described in the second embodiment, is used instead.
  • FIG. 26 is a flowchart showing a specific example of control for each distance measurement according to the fifth embodiment of the present technology.
  • the range measurable by the distance measuring sensor 5 in the target area is measured with low accuracy, and the distance map generation unit 31 generates the immediately preceding distance map D (step 1101).
  • the range control unit 35 generates the immediately preceding farthest position map E by adding a specified amount a to each pixel of the immediately preceding distance map D (step 1102).
  • the range control unit 35 generates the immediately preceding width map P from the patterns 1, 2, 5, or 6 selected by the user (step 1103).
  • Distance measurement is performed pixel by pixel by the distance measurement sensor 5 (step 1104).
  • the range control unit 35 controls the ranging range based on the value of the immediately preceding farthest position map E and the value of the immediately preceding width map P at each pixel, and the distance measuring sensor 5 performs the distance measurement (step 1105).
  • When the distance measurement in step 1105 has been performed for all the pixels of the immediately preceding distance map D, the distance measurement is completed (step 1106).
  • this technology can be applied even when the position changes due to the distance measurement sensor 5 being mounted on an HMD or the like.
  • <Sixth embodiment> In the above embodiments, the distance measuring system 1000 controlled the distance measuring range so that the target object could be recognized with high accuracy. The sixth embodiment shows specific examples in which the distance measuring system 1000 is applied to various applications.
  • FIG. 27 is a schematic diagram showing a specific example of an application to which the ranging system 1000 according to the sixth embodiment is applied.
  • In the present embodiment, the user's hand is set as the operation object, and the target on which the operation is executed by the operation object is set as the table 25.
  • the recognition unit 32 determines whether or not the hand touches the table 25, the position of the table 25 touched by the hand, the positional relationship between the hand and the table 25, and the like based on the detection result of the distance measuring sensor 5.
  • the command execution unit 38 controls the processing for the command executed by the user based on the determination result of the recognition unit 32.
  • the operation for the table 25 in the present embodiment includes an operation on the table 25 and an operation on the display object 21 displayed on the table 25.
  • FIG. 27A is a schematic diagram showing a specific example of control of the command to be executed.
  • FIGS. 27B and 27C are schematic views showing specific examples of hand operations.
  • a hand 110 that overlaps the area of the table 25 and a hand 111 that is outside the table 25 are shown.
  • When the position information (XY coordinates) of the area of the hand 110 is included in the table 25, the recognition unit 32 determines that the hand 110 overlaps the table 25.
  • As shown in FIG. 27B, when the hand 110 is over the table 25, the operation by the hand 110 is limited to the touch operation, and the command is executed accordingly.
  • the icon 115 may be displayed at the position of the table 25 that matches the position information of the tip of the finger.
  • The operation by the hand 111, which is outside the table 25, is limited to operations other than touch operations, such as pointing, and the command is executed accordingly.
  • In this case, the icon 115 may be displayed at the position on the table 25 that coincides with the straight line 116 extending from the tip of the finger. A minimal decision sketch of this behaviour follows below.
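  • The behaviour of FIGS. 27B and 27C can be condensed into the following decision sketch; the rectangular table bounds, the touch threshold, and all names are illustrative assumptions.

```python
def select_operation(hand_xy, hand_height, table_bounds, table_top_height, touch_eps=10.0):
    """Touch commands only for a hand overlapping the table area (FIG. 27B);
    pointing commands only for a hand outside it (FIG. 27C)."""
    x0, y0, x1, y1 = table_bounds
    x, y = hand_xy
    if x0 <= x <= x1 and y0 <= y <= y1:  # hand 110: overlaps the table
        touching = abs(hand_height - table_top_height) < touch_eps
        return "touch" if touching else "hover"
    return "point"  # hand 111: outside the table, touch is not accepted
```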
  • FIG. 28 is a schematic diagram showing a specific example of an application to which the ranging system 1000 is applied.
  • FIG. 28A is a schematic view showing the positional relationship between the hand and the table.
  • the command execution unit 38 controls the processing for the command executed by the user depending on whether the height (distance information) of the hand 120 is above the height of the table 25 (closer to the distance measuring sensor 5) or the distance information of the hand 120 is below the distance information of the table 25.
  • a region 121 below the top surface 26 of the table 25 is set for the region of the hand 120.
  • FIG. 28B is a schematic view showing a specific example when the hand 120 is below the table 25. As shown in FIG. 28B, the user's hand 120 is in contact with the edge of the table 122 located below the top surface 26 of the table 25. In this case, the command execution unit 38 regulates the touch operation of the hand 120 on the table 25.
  • FIG. 28C is a schematic diagram showing another example of the case where the hand 120 is below the table. As shown in FIG. 28C, the user's hand 120 is gripping the end of the table 25.
  • the command execution unit 38 regulates the touch operation of the hand 120 on the table 25.
  • the process executed by the command execution unit 38 is not limited.
  • For example, the command may be switched according to the ratio of the region of the hand 120 that falls below the region 121, as sketched below.
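  • One way to realize such ratio-based switching is sketched below, assuming a boolean hand mask over a depth image; the 20 % threshold is an assumption for illustration.

```python
import numpy as np

def touch_permitted(hand_mask, depth_map, table_top_distance, max_below_ratio=0.2):
    """Regulate the touch operation (FIGS. 28B/28C) when too large a fraction of
    the hand region is farther from the sensor than the table top, i.e. below it."""
    hand_depths = depth_map[hand_mask]
    if hand_depths.size == 0:
        return False  # no hand detected
    below_ratio = float(np.mean(hand_depths > table_top_distance))
    return below_ratio <= max_below_ratio
```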
  • FIG. 29 is a schematic diagram showing a specific example of an application to which the ranging system 1000 is applied.
  • the hand 130 touches the display object 21 displayed on the top surface 26 of the table 25 and executes a drag operation.
  • the drag operation is an operation of moving the hand 130 while touching the display object 21.
  • the display object 21 moves following the movement of the hand 130.
  • the hand 130 drags the display object 21 toward the outside of the table 25.
  • the execution of the command is controlled according to the following two cases. In the first case, the hand 130 moves below the table 25 (see the left figure of FIG. 29): for example, suppose that while performing the drag operation, the hand 130 exits the area of the table 25 and moves below it.
  • the display control unit 39 deletes the display object 21 on the table 25.
  • In the other case, the display control unit 39 moves the display object 21 to the table 131, which is separated from (not adjacent to) the table 25.
  • FIG. 30 is a schematic diagram showing a specific example of an application to which the ranging system 1000 is applied.
  • since the thickness of the table 25 cannot be obtained from the distance measuring sensor 5, the thickness of the table 25 (its length in the Z direction) is input to the information processing device 30 in advance.
  • FIG. 30A is a schematic view showing an example of operation on the side surface 135 of the table 25.
  • when a touch operation is performed on the side surface 135, the command execution unit 38 executes a command different from the command executed when the touch operation is performed on the top surface 26 of the table 25.
  • when the recognition unit 32 determines that the distance information of the side surface 135 of the table 25 matches the distance information of the fingertip of the hand 136, a predetermined command is executed. For example, when the hand 136 touches the side surface 135 of the table 25, the settings related to the distance measuring width control GUI displayed on the table 25 may be changed.
  • FIG. 30B is a schematic view showing an example of an operation on the edge of the table 25.
  • the hand 136 is located outside the table 25 and the fingertip of the hand 136 touches the edge where the top surface 26 and the side surface 135 of the table 25 intersect.
  • the command execution unit 38 may execute a command different from the command when the touch operation is performed on the top surface 26 or the side surface 135 of the table 25.
  • FIG. 30C is a schematic diagram showing another example of the operation on the edge of the table 25. As shown in the left figure of FIG. 30C, it is assumed that the hand 136 is located outside the table 25 and the sliding operation is performed on the edge (or side surface 135) of the table 25.
  • the command execution unit 38 may execute a command different from the command when the slide operation is performed on the top surface 26 of the table 25. Further, as shown in the right figure of FIG. 30C, a different command may be executed by the command execution unit 38 based on the angle of the hand 136 touching the edge of the table 25.
  • the method of determining that the side surface 135 of the table 25 has been touched is not limited. For example, it may be determined that a touch operation has been performed when the hand 136 is placed under the table 25.
  • the ranging width control GUI shown in FIGS. 11 and 13 is output by the GUI output unit 37.
  • map information such as FIGS. 6B and 7B may be output as a ranging width control GUI.
  • the distance information of each region (for example, the distance from the distance measuring sensor 5 to the table 25) may be displayed.
  • In the above embodiments, the distance measuring width in the Z-axis direction was set by the distance measuring width control GUI shown in FIGS. 11 and 13.
  • The setting is not limited to this; the distance measuring width in the X-axis or Y-axis direction may also be set via the distance measuring width control GUI.
  • In the above embodiments, the farthest position map was generated based on the distance map. Alternatively, a histogram of the distribution of distances in the distance measuring direction, for example of the table 25 or the floor, may be generated, and the farthest position map may be generated by adding the specified amount a to this histogram. A sketch of this variant follows below.
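  • A sketch of this histogram variant, assuming the dominant bin corresponds to the table top or floor plane; the bin count and units are illustrative.

```python
import numpy as np

def farthest_from_histogram(distance_map, specified_a=50.0):
    """Take the dominant distance (e.g. the table top or floor) from a histogram
    of the measured distances and add the specified amount a to it."""
    d = distance_map[np.isfinite(distance_map)]
    hist, edges = np.histogram(d, bins=64)
    dominant = edges[np.argmax(hist)]  # left edge of the most frequent bin
    return float(dominant + specified_a)
```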
  • the ranging width control GUI is displayed on the top surface 26 of the table 25.
  • the ranging width control GUI may be displayed on a display such as a PC.
  • In the above embodiments, the farthest position was defined by adding a specified amount a to the pixels of the distance map.
  • The farthest position is not limited to this and may be determined arbitrarily; for example, it may be set by the user. Further, as shown in the second embodiment, the distance measuring width may be set in the direction opposite to the distance measuring direction, taking a position separated from the distance measuring sensor 5 by a distance A as the farthest position.
  • FIG. 31 is a block diagram showing a hardware configuration example of the information processing device 30.
  • the information processing device 30 includes a CPU 141, a ROM 142, a RAM 143, an input / output interface 145, and a bus 144 connecting these to each other.
  • a display unit 146, an input unit 147, a storage unit 148, a communication unit 149, a drive unit 150, and the like are connected to the input / output interface 145.
  • the display unit 146 is a display device using, for example, a liquid crystal or an EL.
  • the input unit 147 is, for example, a keyboard, a pointing device, a touch panel, or other operating device. When the input unit 147 includes a touch panel, the touch panel can be integrated with the display unit 146.
  • the storage unit 148 is a non-volatile storage device, for example, an HDD, a flash memory, or other solid-state memory.
  • the drive unit 150 is a device capable of driving a removable recording medium 151 such as an optical recording medium or a magnetic recording tape.
  • the communication unit 149 is a modem, router, or other communication device for communicating with another device that can be connected to a LAN, WAN, or the like.
  • the communication unit 149 may communicate using either wired or wireless.
  • the communication unit 149 is often used separately from the information processing device 30. In the present embodiment, the communication unit 149 enables communication with other devices via the network.
  • Information processing by the information processing device 30 having the hardware configuration as described above is realized by the cooperation between the software stored in the storage unit 148 or the ROM 142 or the like and the hardware resources of the information processing device 30.
  • the information processing method according to the present technology is realized by loading the program constituting the software stored in the ROM 142 or the like into the RAM 143 and executing the program.
  • the program is installed in the information processing device 30 via, for example, the recording medium 151.
  • the program may be installed in the information processing apparatus 30 via a global network or the like.
  • any non-transient storage medium that can be read by a computer may be used.
  • the information processing device, information processing method, and program according to the present technology may also be executed by linking a computer mounted on a communication terminal with another computer capable of communicating via a network or the like, thereby constructing the information processing device according to the present technology.
  • the information processing apparatus, information processing method, and program according to the present technology can be executed not only in a computer system composed of a single computer but also in a computer system in which a plurality of computers operate in conjunction with each other.
  • the system means a set of a plurality of components (devices, modules (parts), etc.), and it does not matter whether or not all the components are in the same housing. Therefore, a plurality of devices housed in separate housings and connected via a network, and one device in which a plurality of modules are housed in one housing are both systems.
  • Execution of the information processing device, information processing method, and program according to the present technology by a computer system includes, for example, both the case where the generation of a distance map, the control of a distance measuring range, the output of a GUI, and the like are executed by a single computer, and the case where each process is executed by a different computer. Execution of each process by a predetermined computer also includes causing another computer to execute a part or all of the process and acquiring the result.
  • the information processing device, information processing method, and program according to the present technology can also be applied to a cloud computing configuration in which one function is shared and jointly processed by a plurality of devices via a network.
  • The configurations of the distance map generation unit, the range control unit, the GUI output unit, the information processing device, and the like, as well as the control flows of the communication system described with reference to the drawings, are merely embodiments and can be arbitrarily modified without departing from the purpose of the present technology. That is, any other configurations, algorithms, and the like for implementing the present technology may be adopted.
  • the effects described in this disclosure are merely examples and are not limited, and other effects may be obtained.
  • the description of the plurality of effects above does not necessarily mean that those effects are exerted simultaneously. It means that at least one of the above effects can be obtained depending on the conditions and the like; effects not described in the present disclosure may of course also be exhibited.
  • the present technology can also adopt the following configurations.
  • (1) An information processing device comprising: an acquisition unit that acquires map information in which distance information is associated with position information in a target area; and a range control unit that controls a distance measuring range of a distance measuring sensor with respect to the target area based on the map information.
  • (2) The information processing device according to (1), wherein the range control unit sets the distance measuring range for each position in the target area.
  • (3) The information processing device according to (1) or (2), wherein the range control unit sets the distance measuring range for each area with respect to the target area.
  • (4) The information processing device according to any one of (1) to (3), wherein the range control unit sets the distance measuring range so that the processing amount required for distance measurement of the target area by the distance measuring sensor does not exceed a predetermined processing amount.
  • (5) The information processing device according to any one of (1) to (4), wherein the range control unit controls the distance measuring range so that the amount of processing required for distance measurement by the distance measuring sensor is constant for each position in the target area.
  • (6) The information processing device according to any one of (1) to (5), wherein the range control unit sets a minimum distance to be measured and sets a range larger than the minimum distance as the distance measuring range.
  • (7) The information processing device according to any one of (1) to (6), wherein the acquisition unit acquires reference map information, which is the map information when the target area is in a predetermined reference state, and the range control unit sets the distance measuring range based on the reference map information.
  • (8) The information processing device according to any one of (1) to (7), wherein the acquisition unit acquires immediately preceding map information, which is the map information at the timing immediately before distance measurement of the target area by the distance measuring sensor is executed, and the range control unit sets the distance measuring range based on the immediately preceding map information.
  • (9) The information processing device according to (8), wherein the range control unit sets the distance measuring range used when the distance measuring sensor executes distance measurement of the target area with a first ranging accuracy, and the immediately preceding map information is the map information generated with a second ranging accuracy lower than the first ranging accuracy.
  • (10) The information processing device according to any one of (1) to (9), wherein the range control unit sets a maximum distance to be measured and a distance measuring width to be measured, and sets the range of the distance measuring width based on the maximum distance as the distance measuring range.
  • (11) The information processing device according to any one of (1) to (10), wherein the range control unit sets the maximum distance based on reference map information, which is the map information when the target area is in a predetermined reference state, and sets the distance measuring width based on immediately preceding map information, which is the map information at the timing immediately before the distance measuring sensor executes distance measurement of the target area.
  • (12) The information processing device according to any one of (1) to (11), wherein the range control unit sets the distance measuring range based on the result of distance measurement of the target area by another distance measuring sensor having a distance measuring direction different from that of the distance measuring sensor.
  • (13) The information processing device according to any one of (1) to (12), further comprising a GUI output unit that outputs a GUI (Graphical User Interface) for inputting an instruction regarding the distance measuring range.
  • (14) The information processing device according to any one of (1) to (13), wherein the acquisition unit generates the map information based on an environment map including the target area acquired by SLAM (Simultaneous Localization and Mapping).
  • (15) The information processing device according to any one of (1) to (14), wherein the distance measuring sensor is a stereo camera, and the range control unit controls the matching range of the stereo camera.
  • (16) The information processing device according to any one of (1) to (15), wherein the distance measuring sensor is a TOF (Time of Flight) camera having a light emitting unit and a light receiving unit, and the range control unit controls, with respect to the time difference between the light emitting time of the light emitting unit and the light receiving time of the light receiving unit, the range of time differences for which the distance is calculated.
  • (17) The information processing device according to (16), wherein the range control unit sets the light emitting mode of the light emitting unit for each area of the target area, and sets the area for which the light emitting mode is set as the area for which the distance is calculated when the light emitting unit executes light emission.
  • (18) The information processing device according to any one of (1) to (17), further comprising a processing execution unit that executes processing based on an operation of an operation object in the target area, based on the result of distance measurement of the target area by the distance measuring sensor.
  • (19) The information processing device according to (11), wherein the range control unit sets the distance measuring range based on the difference between the reference map information and the immediately preceding map information.
  • (20) The information processing device according to any one of (1) to (19), further comprising an instruction receiving unit that receives an instruction regarding the distance measuring range from a user, wherein the range control unit controls the distance measuring range based on the received instruction.
  • (21) The information processing device according to (18), wherein the processing execution unit executes the processing based on the operation of the operation object with respect to a target object arranged in the operation area.
  • (22) An information processing system comprising: a distance measuring sensor capable of measuring a target area; and an information processing device including an acquisition unit that acquires map information in which distance information is associated with position information in the target area, and a range control unit that controls a distance measuring range of the distance measuring sensor with respect to the target area based on the map information.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Electromagnetism (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

To achieve this object, an information processing device according to one embodiment of the present technology comprises an acquisition unit and a range control unit. The acquisition unit acquires map information in which distance information is associated with position information in a target area. The range control unit controls a distance measuring range of a distance measuring sensor with respect to the target area on the basis of the map information. This enables efficient operation of the distance measuring sensor.
PCT/JP2020/045003 2019-12-13 2020-12-03 Dispositif de traitement d'informations, procédé de traitement d'informations et programme WO2021117595A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2019225687 2019-12-13
JP2019-225687 2019-12-13

Publications (1)

Publication Number Publication Date
WO2021117595A1 true WO2021117595A1 (fr) 2021-06-17

Family

ID=76330304

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2020/045003 WO2021117595A1 (fr) 2019-12-13 2020-12-03 Dispositif de traitement d'informations, procédé de traitement d'informations et programme

Country Status (1)

Country Link
WO (1) WO2021117595A1 (fr)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2017125764A (ja) * 2016-01-14 2017-07-20 株式会社リコー 物体検出装置、及び物体検出装置を備えた画像表示装置
WO2018216342A1 (fr) * 2017-05-24 2018-11-29 ソニー株式会社 Appareil de traitement d'informations, procédé de traitement d'informations et programme


Similar Documents

Publication Publication Date Title
US20220129060A1 (en) Three-dimensional object tracking to augment display area
US9430698B2 (en) Information input apparatus, information input method, and computer program
EP2907004B1 (fr) Saisie sans toucher pour interface d'utilisateur
JP5752715B2 (ja) デバイスレスな拡張現実およびインタラクションのためのプロジェクタおよび深度カメラ
CN104249371B (zh) 信息处理装置和信息处理方法
US8933882B2 (en) User centric interface for interaction with visual display that recognizes user intentions
US9619042B2 (en) Systems and methods for remapping three-dimensional gestures onto a finite-size two-dimensional surface
US9342925B2 (en) Information processing apparatus, information processing method, and program
JP6825087B2 (ja) 電子機器及びその動作方法
JP5783828B2 (ja) 情報処理装置およびその制御方法
US9262012B2 (en) Hover angle
TWI553508B (zh) 物件感測裝置與方法
JP6127564B2 (ja) タッチ判定装置、タッチ判定方法、およびタッチ判定プログラム
JP2016128970A (ja) 情報処理装置とその制御方法、プログラム、記憶媒体
CN105759955B (zh) 输入装置
WO2021117595A1 (fr) Dispositif de traitement d'informations, procédé de traitement d'informations et programme
KR101535738B1 (ko) 비접촉 동작 제어가 가능한 스마트 디바이스 및 이를 이용한 비접촉 동작 제어 방법
JP6141290B2 (ja) マルチポインタ間接入力装置の加速度によるインターラクション
WO2021075103A1 (fr) Dispositif de traitement d'informations, procédé de traitement d'informations et programme
JP2016110492A (ja) 光学式位置情報検出装置、プログラム、対象の紐付け方法
WO2022244296A1 (fr) Dispositif de traitement d'informations, procédé de traitement d'informations, programme et système de traitement d'informations
WO2021075102A1 (fr) Dispositif de traitement d'informations, procédé de traitement d'informations et programme
JP2021517314A (ja) 電子機器確定方法、システム、コンピュータシステムおよび読取り可能な記憶媒体
EP3059664A1 (fr) Procédé pour commander un dispositif par des gestes et système permettant de commander un dispositif par des gestes
CN117950533A (zh) 投影装置的遥控交互方法、装置及计算机可读储存介质

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20900106

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20900106

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP