CN111212241A - High-speed automatic driving automatic exposure control method based on image gradient and entropy fusion - Google Patents

High-speed automatic driving automatic exposure control method based on image gradient and entropy fusion

Info

Publication number
CN111212241A
CN111212241A
Authority
CN
China
Prior art keywords
image
value
camera
exposure time
gradient
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010032149.3A
Other languages
Chinese (zh)
Other versions
CN111212241B (en)
Inventor
杨培文
倪凯
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Heduo Technology Guangzhou Co ltd
Original Assignee
HoloMatic Technology Beijing Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by HoloMatic Technology Beijing Co Ltd filed Critical HoloMatic Technology Beijing Co Ltd
Priority to CN202010032149.3A priority Critical patent/CN111212241B/en
Publication of CN111212241A publication Critical patent/CN111212241A/en
Application granted granted Critical
Publication of CN111212241B publication Critical patent/CN111212241B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Images

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70Circuitry for compensating brightness variation in the scene
    • H04N23/73Circuitry for compensating brightness variation in the scene by influencing the exposure time
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70Circuitry for compensating brightness variation in the scene
    • H04N23/741Circuitry for compensating brightness variation in the scene by increasing the dynamic range of the image compared to the dynamic range of the electronic image sensors

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)

Abstract

The invention discloses a high-speed automatic driving automatic exposure control method based on the fusion of image gradient and entropy, which comprises the following steps: acquiring a frame of image with a camera; weighting the gradient and the entropy of the image to obtain a metric of the image's exposure quality; determining an optimal gamma value by gamma transformation; and using the metric and the gamma value to predict the exposure time of the next frame the camera will shoot. The method effectively enlarges the dynamic range of the images the camera acquires, ensuring good exposure control under complex lighting and thereby improving the quality of the images acquired by the autonomous vehicle.

Description

High-speed automatic driving automatic exposure control method based on image gradient and entropy fusion
Technical Field
The invention relates to the technical field of automatic driving camera control, in particular to a high-speed automatic driving automatic exposure control method based on image gradient and entropy fusion.
Background
Existing methods for improving a camera's exposure quality mainly compute the camera's exposure time from the image gradient alone. As a result, the dynamic range of the images the camera acquires is very limited and the control is not robust, so a new method is needed to control the camera's exposure time more accurately.
Disclosure of Invention
An object of the present invention is to solve at least the above problems and to provide at least the advantages described later.
Another object of the invention is to provide a high-speed automatic driving automatic exposure control method based on image gradient and entropy fusion that effectively enlarges the dynamic range of the images acquired by the camera, ensures good exposure control under complex lighting, and further improves the quality of the images acquired by autonomous vehicles.
In order to achieve the above objects and other objects, the present invention adopts the following technical solutions:
a high-speed automatic driving automatic exposure control method based on image gradient and entropy fusion comprises the following steps:
acquiring a frame of image by a camera;
carrying out gradient and image entropy weighting on the image to obtain a measurement value of the exposure quality of the image; the weighting method of the gradient and the image entropy is shown as formula 1:
I = α∑m_i + (1 − α)H    (equation 1)
where I is the metric value, α is the fusion proportion parameter between the gradient and the image entropy, m_i is the gradient value of the i-th pixel in the image, and H is the image entropy value;
determining an optimal gamma value by utilizing gamma transformation; and
and obtaining the predicted value of the exposure time of the next frame of image shot by the camera by using the metric value and the gamma value.
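As a rough sketch of the weighting step, the metric of equation 1 can be computed for a grayscale frame as below; the choice of gradient operator, the relative scaling of the two terms, and the value α = 0.5 are assumptions, since the text above does not fix them:

```python
import numpy as np

def exposure_metric(gray, alpha=0.5):
    """Exposure-quality metric I = alpha * sum(m_i) + (1 - alpha) * H (equation 1)."""
    # Gradient term: sum of per-pixel gradient magnitudes (finite differences).
    gy, gx = np.gradient(gray.astype(np.float64))
    grad_sum = np.sum(np.hypot(gx, gy))

    # Entropy term: Shannon entropy of the 256-bin intensity histogram.
    hist, _ = np.histogram(gray, bins=256, range=(0, 256))
    p = hist / hist.sum()
    p = p[p > 0]
    entropy = -np.sum(p * np.log2(p))

    return alpha * grad_sum + (1.0 - alpha) * entropy
```

A flat frame scores zero on both terms, while a frame with edges and a spread-out histogram scores higher, which is what the exposure controller rewards.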
Preferably, in the high-speed automatic driving automatic exposure control method based on image gradient and entropy fusion, the number of the cameras is not less than 1, and the exposure consistency of a plurality of cameras is ensured by the following method:
acquiring a public area image containing each camera shooting image;
calculating the public area image by using a formula 2 to obtain the optimal exposure time of each camera;
{t_1, …, t_N} = argmin Σ_{i=1}^{N} Σ_{j≠i} β_ij‖ROI_ij^i − ROI_ij^j‖²    (equation 2)
where N is the total number of cameras, i denotes the i-th camera, j denotes the j-th camera, β_ij is the weight of ROI_ij, ROI_ij^i is the area of camera i common with camera j, ROI_ij^j is the area of camera j common with camera i, and ‖·‖ measures the pixel-intensity difference over the common area.
Preferably, in the method for controlling high-speed automatic driving exposure based on image gradient and entropy fusion, acquiring a common-area image covering the images captured by the cameras comprises: extracting feature points from the images captured by adjacent cameras, matching the images against each other to obtain a common feature point set, and determining from that set the smallest square pixel region containing all of its feature points; this minimum region is the common-area image.
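The last step of this procedure, shrinking an already-matched feature point set to the smallest square pixel region that contains it, can be sketched as follows; the feature extraction and matching themselves (e.g. with ORB descriptors and a brute-force matcher) are assumed to have produced the point list and are not shown:

```python
import numpy as np

def common_region_square(points):
    """Smallest axis-aligned square region containing all matched feature points.

    points: iterable of (x, y) coordinates in one camera's image.
    Returns (x0, y0, side): top-left corner and side length in pixels.
    """
    pts = np.asarray(points, dtype=np.float64)
    x0, y0 = pts.min(axis=0)   # top-left of the bounding box
    x1, y1 = pts.max(axis=0)   # bottom-right of the bounding box
    # Expand the bounding box to a square by taking the larger extent.
    side = int(np.ceil(max(x1 - x0, y1 - y0)))
    return int(np.floor(x0)), int(np.floor(y0)), side
```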
Preferably, in the method for controlling high-speed automatic driving automatic exposure based on image gradient and entropy fusion, the gradient value m_i in formula 1 is determined by equation 3:
m_i = log(λ(m̄_i − δ) + 1)/N if m̄_i ≥ δ; m_i = 0 otherwise    (equation 3)
where m̄_i is the raw gradient magnitude of the i-th pixel, N = log(λ(1 − δ) + 1), λ is the gradient gain coefficient, and δ is the gradient threshold.
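The stated normaliser N = log(λ(1 − δ) + 1) suggests that equation 3 is a thresholded logarithmic mapping of raw gradient magnitudes; a sketch under that assumption, with illustrative values for λ and δ:

```python
import numpy as np

def gradient_value(m_raw, lam=1000.0, delta=0.06):
    """Per-pixel gradient value m_i of equation 3.

    m_raw: raw gradient magnitude(s) normalised to [0, 1].
    lam:   gradient gain coefficient (lambda); illustrative value.
    delta: gradient threshold; raw gradients below it count as noise.
    """
    m_raw = np.asarray(m_raw, dtype=np.float64)
    N = np.log(lam * (1.0 - delta) + 1.0)   # normaliser: maps m_raw = 1 to 1
    # Below the threshold the pixel contributes nothing (noise suppression);
    # above it, the log compresses the response into [0, 1].
    return np.where(m_raw >= delta,
                    np.log(lam * np.maximum(m_raw - delta, 0.0) + 1.0) / N,
                    0.0)
```

Mapping m_raw = 1 to exactly 1 and everything below δ to 0 is what lets the metric ignore sensor noise while still ranking well-exposed, high-gradient frames highest.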
Preferably, in the high-speed automatic driving automatic exposure control method based on image gradient and entropy fusion, a nonlinear method is used to obtain a predicted value of the exposure time of the next frame of image taken by the camera by using the metric value and the gamma value; the nonlinear method is specifically shown in formula 4:
t_{i+1} = (1 + K_p(R − 1))t_i    (equation 4)
where t_{i+1} is the predicted exposure time value of the next frame image, K_p is the convergence speed value from the current exposure time value to the target exposure time value, R is the update function, and t_i is the exposure time value of the current frame image.
Preferably, in the automatic exposure control method for high-speed automatic driving based on image gradient and entropy fusion, the update function R is obtained by formula 5:
R = d·tan((2 − γ̂)·arctan(1/d) − arctan(1/d)) + 1    (equation 5)
where d is the nonlinearity degree value of the update parameter and γ̂ is the optimal gamma value.
Preferably, in the method for controlling high-speed automatic driving exposure based on image gradient and entropy fusion, obtaining a predicted value of exposure time for the camera to capture the next frame of image by using the metric value and the γ value further includes writing the predicted value of exposure time into the camera as the value of exposure time for obtaining the next frame of image.
Preferably, in the method for controlling high-speed automatic driving exposure based on image gradient and entropy fusion, writing the predicted exposure time value into the camera as the exposure time value for obtaining the next frame of image further comprises: using the image captured with that predicted exposure time to calculate the exposure time prediction for the subsequent frame.
The invention at least comprises the following beneficial effects:
In the high-speed automatic driving automatic exposure control method based on image gradient and entropy fusion, the image acquired by the camera is processed by weighting gradient and image entropy, so the resulting exposure-quality metric is more accurate than one obtained from the gradient alone, and brightness changes in the image are analysed in finer detail; in other words, the dynamic range of the processed image is enlarged. This makes the final prediction of the next frame's exposure time more accurate, improves the quality of the images the camera acquires, and provides a strong guarantee for the driving safety of subsequent automatic driving.
Additional advantages, objects, and features of the invention will be set forth in part in the description which follows and in part will become apparent to those having ordinary skill in the art upon examination of the following or may be learned from practice of the invention.
Drawings
FIG. 1 is a graph of optimal gamma value versus camera exposure time;
FIG. 2 is a graph of exposure time versus exposure value for real-time control;
FIG. 3 is a diagram showing the relationship between the convergence speed value from the current exposure time value to the target exposure time value and the preset exposure time value when the non-linear degree value of the update parameter is 0.3;
fig. 4 is a graph showing the relationship between the nonlinear degree value of the update parameter and the predicted exposure time value when the convergence speed value is 1.0.
Detailed Description
The present invention is described in detail below with reference to the attached drawings so that those skilled in the art can implement the invention by referring to the description.
As shown in fig. 1 and 2, a high-speed automatic driving automatic exposure control method based on image gradient and entropy fusion includes: acquiring a frame of image by a camera;
carrying out gradient and image entropy weighting on the image to obtain a measurement value of the exposure quality of the image; the weighting method of the gradient and the image entropy is shown as formula 1:
I = α∑m_i + (1 − α)H    (equation 1)
where I is the metric value, α is the fusion proportion parameter between the gradient and the image entropy, m_i is the gradient value of the i-th pixel in the image, and H is the image entropy value;
determining an optimal gamma value by utilizing gamma transformation; and
and obtaining the predicted value of the exposure time of the next frame of image shot by the camera by using the metric value and the gamma value.
In this scheme, the image acquired by the camera is processed by weighting gradient and image entropy, so the resulting exposure-quality metric is more accurate than one obtained from the gradient alone, and brightness changes in the image are analysed in finer detail; in other words, the dynamic range of the processed image is enlarged. This makes the final prediction of the next frame's exposure time more accurate, improves the quality of the images the camera acquires, and provides a strong guarantee for the driving safety of subsequent automatic driving.
As shown in fig. 1 and fig. 2, the predicted exposure time of the next frame is determined from the optimal gamma value together with the metric. Compared with a method that controls the exposure time in real time, computing the optimal gamma lets the gamma value converge quickly to its optimum even while the light changes rapidly and in a complicated way, so an optimally exposed image, that is, an image with optimal gradient, is obtained quickly, and navigation methods in automatic driving that rely on image-gradient information achieve better precision.
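One way to realise "determining the optimal gamma value by gamma transformation" is to apply a small set of candidate gammas to the current frame and keep the one that maximises the exposure-quality metric; the candidate set and the search-by-enumeration strategy below are assumptions (the patent does not state how the optimum is searched):

```python
import numpy as np

def optimal_gamma(gray, metric,
                  candidates=(0.5, 0.7, 0.85, 1.0, 1.2, 1.5, 2.0)):
    """Pick the gamma whose transform of the frame scores best under `metric`.

    gray:   2-D uint8 grayscale frame.
    metric: callable scoring a uint8 image, e.g. the gradient/entropy
            fusion of equation 1.
    """
    norm = gray.astype(np.float64) / 255.0
    best_g, best_score = 1.0, -np.inf
    for g in candidates:
        # Gamma transform: out = in^g on [0, 1], rescaled to 8-bit.
        transformed = np.clip(norm ** g * 255.0, 0, 255).astype(np.uint8)
        score = metric(transformed)
        if score > best_score:
            best_g, best_score = g, score
    return best_g
```

For a dark frame the search lands on a gamma below 1 (brightening), which in turn drives the exposure-time update toward a longer exposure.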
In a preferred scheme, the number of the cameras is not less than 1, and the exposure consistency of a plurality of cameras is ensured by the following method:
acquiring a public area image containing each camera shooting image;
calculating the public area image by using a formula 2 to obtain the optimal exposure time of each camera;
{t_1, …, t_N} = argmin Σ_{i=1}^{N} Σ_{j≠i} β_ij‖ROI_ij^i − ROI_ij^j‖²    (equation 2)
where N is the total number of cameras, i denotes the i-th camera, j denotes the j-th camera, β_ij is the weight of ROI_ij, ROI_ij^i is the area of camera i common with camera j, ROI_ij^j is the area of camera j common with camera i, and ‖·‖ measures the pixel-intensity difference over the common area.
In the above solution: in an automatic driving system a vehicle often carries more than one camera, for example on the left side, the right side, and the front, while a traditional exposure control method handles only a single camera and is therefore unsuitable for the automatic driving field. A common-area image covering the images taken by the cameras is thus acquired, and evaluating it with equation 2 yields the optimal exposure time of each camera, minimising the intensity difference of the same area as shot by different cameras; the brightness of the images captured by the cameras is kept consistent, and obstacle analysis becomes more accurate.
In a preferred embodiment, acquiring a common-area image covering the images captured by the cameras comprises: extracting feature points from the images captured by adjacent cameras, matching the images against each other to obtain a common feature point set, and determining from that set the smallest square pixel region containing all of its feature points; this minimum region is the common-area image.
In a preferred embodiment, the gradient value m_i in formula 1 is determined by equation 3:
m_i = log(λ(m̄_i − δ) + 1)/N if m̄_i ≥ δ; m_i = 0 otherwise    (equation 3)
where m̄_i is the raw gradient magnitude of the i-th pixel, N = log(λ(1 − δ) + 1), λ is the gradient gain coefficient, and δ is the gradient threshold.
In the scheme, the gradient value of each pixel point in the image is calculated by introducing the gradient threshold value, so that the noise in the image can be effectively filtered, and the accuracy of the obtained gradient value is improved.
In a preferred scheme, a nonlinear method is adopted to obtain a predicted value of the exposure time of the next frame of image shot by the camera by utilizing the metric value and the gamma value; the nonlinear method is specifically shown in formula 4:
t_{i+1} = (1 + K_p(R − 1))t_i    (equation 4)
where t_{i+1} is the predicted exposure time value of the next frame image, K_p is the convergence speed value from the current exposure time value to the target exposure time value, R is the update function, and t_i is the exposure time value of the current frame image.
In a preferred embodiment, the update function R is obtained by equation 5:
R = d·tan((2 − γ̂)·arctan(1/d) − arctan(1/d)) + 1    (equation 5)
where d is the nonlinearity degree value of the update parameter and γ̂ is the optimal gamma value.
In the above scheme, referring to fig. 3, fixing the nonlinearity degree d of the update parameter reveals the influence of the convergence speed value K_p on the predicted exposure time: the larger K_p is, the greater the curvature of the curve and the faster the exposure time converges. Referring to fig. 4, fixing K_p and varying d shows that d controls the stability of the exposure update: the closer the exposure time is to its predicted (optimal) value, the smoother the exposure gain. Predicting the exposure time from both d and K_p therefore balances the stability and the convergence speed of the exposure-value update.
In a preferred embodiment, obtaining the predicted value of the exposure time for the next frame of image taken by the camera using the metric and the γ value further includes writing the predicted value of the exposure time into the camera as the value of the exposure time for obtaining the next frame of image.
In the above scheme, the predicted exposure time value is written into the camera, so that the camera can obtain the next frame of image according to the predicted exposure time value.
In a preferred embodiment, writing the predicted exposure time value into the camera as the exposure time value for obtaining the next frame of image further comprises: using the image captured with that predicted exposure time to calculate the exposure time prediction for the subsequent frame.
In the above scheme, the exposure time value of the next frame image is calculated by using the current frame image, and the process is repeated, so that the camera can continuously obtain the image with higher exposure quality.
While embodiments of the invention have been described above, it is not limited to the applications set forth in the description and the embodiments, which are fully applicable in various fields of endeavor to which the invention pertains, and further modifications may readily be made by those skilled in the art, it being understood that the invention is not limited to the details shown and described herein without departing from the general concept defined by the appended claims and their equivalents.

Claims (8)

1. A high-speed automatic driving automatic exposure control method based on image gradient and entropy fusion comprises the following steps:
acquiring a frame of image by a camera;
carrying out gradient and image entropy weighting on the image to obtain a measurement value of the exposure quality of the image; the weighting method of the gradient and the image entropy is shown as formula 1:
I = α∑m_i + (1 − α)H    (equation 1)
where I is the metric value, α is the fusion proportion parameter between the gradient and the image entropy, m_i is the gradient value of the i-th pixel in the image, and H is the image entropy value;
determining an optimal gamma value by utilizing gamma transformation; and
and obtaining the predicted value of the exposure time of the next frame of image shot by the camera by using the metric value and the gamma value.
2. The high-speed automatic driving automatic exposure control method based on image gradient and entropy fusion as claimed in claim 1, wherein the number of the cameras is not less than 1, and the exposure consistency of a plurality of the cameras is ensured by the following method:
acquiring a public area image containing each camera shooting image;
calculating the public area image by using a formula 2 to obtain the optimal exposure time of each camera;
{t_1, …, t_N} = argmin Σ_{i=1}^{N} Σ_{j≠i} β_ij‖ROI_ij^i − ROI_ij^j‖²    (equation 2)
where N is the total number of cameras, i denotes the i-th camera, j denotes the j-th camera, β_ij is the weight of ROI_ij, ROI_ij^i is the area of camera i common with camera j, ROI_ij^j is the area of camera j common with camera i, and ‖·‖ measures the pixel-intensity difference over the common area.
3. The high-speed automatic driving automatic exposure control method based on image gradient and entropy fusion of claim 2, wherein acquiring a common-area image covering the images captured by the cameras comprises: extracting feature points from the images captured by adjacent cameras, matching the images against each other to obtain a common feature point set, and determining from that set the smallest square pixel region containing all of its feature points; this minimum region is the common-area image.
4. The high-speed automatic driving automatic exposure control method based on image gradient and entropy fusion as claimed in claim 1, wherein the gradient value m_i in formula 1 is determined by equation 3:
m_i = log(λ(m̄_i − δ) + 1)/N if m̄_i ≥ δ; m_i = 0 otherwise    (equation 3)
where m̄_i is the raw gradient magnitude of the i-th pixel, N = log(λ(1 − δ) + 1), λ is the gradient gain coefficient, and δ is the gradient threshold.
5. The high-speed automatic driving automatic exposure control method based on image gradient and entropy fusion of claim 1, wherein a non-linear method is adopted to obtain an exposure time predicted value of the next frame of image shot by the camera by utilizing the metric value and the gamma value; the nonlinear method is specifically shown in formula 4:
t_{i+1} = (1 + K_p(R − 1))t_i    (equation 4)
where t_{i+1} is the predicted exposure time value of the next frame image, K_p is the convergence speed value from the current exposure time value to the target exposure time value, R is the update function, and t_i is the exposure time value of the current frame image.
6. The high-speed autopilot automatic exposure control method based on image gradient and entropy fusion of claim 5 wherein the update function R is obtained by equation 5:
R = d·tan((2 − γ̂)·arctan(1/d) − arctan(1/d)) + 1    (equation 5)
where d is the nonlinearity degree value of the update parameter and γ̂ is the optimal gamma value.
7. The method of claim 1 wherein deriving a predicted value of exposure time for a next frame of image captured by the camera using the metric and a gamma value further comprises writing the predicted value of exposure time to the camera as the value of exposure time for the next frame of image.
8. The method of image gradient and entropy fusion-based high-speed autopilot-based automatic exposure control of claim 7 wherein writing the predicted exposure time value to the camera as the exposure time value for the next frame of image further comprises calculating the predicted exposure time value for the next frame of image using images captured with the predicted exposure time value.
CN202010032149.3A 2020-01-13 2020-01-13 High-speed automatic driving automatic exposure control method based on image gradient and entropy fusion Active CN111212241B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010032149.3A CN111212241B (en) 2020-01-13 2020-01-13 High-speed automatic driving automatic exposure control method based on image gradient and entropy fusion


Publications (2)

Publication Number Publication Date
CN111212241A true CN111212241A (en) 2020-05-29
CN111212241B CN111212241B (en) 2021-06-18

Family

ID=70789046

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010032149.3A Active CN111212241B (en) 2020-01-13 2020-01-13 High-speed automatic driving automatic exposure control method based on image gradient and entropy fusion

Country Status (1)

Country Link
CN (1) CN111212241B (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112581440A (en) * 2020-12-10 2021-03-30 合肥英睿系统技术有限公司 Method and device for maintaining image quality of vehicle-mounted camera and vehicle-mounted camera
CN112689100A (en) * 2020-12-25 2021-04-20 北京灵汐科技有限公司 Image detection method, device, equipment and storage medium
CN113206949A (en) * 2021-04-01 2021-08-03 广州大学 Semi-direct monocular vision SLAM method based on entropy weighted image gradient
CN115379129A (en) * 2022-08-19 2022-11-22 广州虎牙信息科技有限公司 Exposure processing method, device, equipment and readable storage medium

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104881854B (en) * 2015-05-20 2017-10-31 天津大学 High dynamic range images fusion method based on gradient and monochrome information
CN107569248A (en) * 2017-08-07 2018-01-12 沈阳东软医疗系统有限公司 The exposure method and mammary gland machine equipment of a kind of mammary gland machine equipment
US9894285B1 (en) * 2015-10-01 2018-02-13 Hrl Laboratories, Llc Real-time auto exposure adjustment of camera using contrast entropy



Also Published As

Publication number Publication date
CN111212241B (en) 2021-06-18

Similar Documents

Publication Publication Date Title
CN111212241B (en) High-speed automatic driving automatic exposure control method based on image gradient and entropy fusion
CN110728697B (en) Infrared dim target detection tracking method based on convolutional neural network
CN110248112B (en) Exposure control method of image sensor
JP5890547B2 (en) Image processing device
CN109510949B (en) Camera automatic exposure method based on effective brightness of image feature points
CN111325711A (en) Chromosome split-phase image quality evaluation method based on deep learning
CN113344972B (en) Fish track prediction method based on intensive culture
CN114926498B (en) Rapid target tracking method based on space-time constraint and leachable feature matching
CN114708615B (en) Human body detection method based on image enhancement in low-illumination environment, electronic equipment and storage medium
CN110533692B (en) Automatic tracking method for moving target in aerial video of unmanned aerial vehicle
CN117036397A (en) Multi-target tracking method based on fusion information association and camera motion compensation
CN114495025A (en) Vehicle identification method and device, electronic equipment and storage medium
CN113888607A (en) Target detection and tracking method and system based on event camera and storage medium
CN112200829A (en) Target tracking method and device based on correlation filtering method
CN114387484B (en) Improved mask wearing detection method and system based on yolov4
CN116012421A (en) Target tracking method and device
CN115797205A (en) Unsupervised single image enhancement method and system based on Retinex fractional order variation network
CN110909670B (en) Unstructured road identification method
CN110147747B (en) Correlation filtering tracking method based on accumulated first-order derivative high-confidence strategy
CN109993776B (en) Related filtering target tracking method and system based on multi-level template
CN114140428A (en) Method and system for detecting and identifying larch caterpillars based on YOLOv5
CN109815905B (en) Method and system for detecting face image by backlight source
CN112465865A (en) Multi-target tracking method based on background modeling and IoU matching
CN116523774B (en) Shadow correction method suitable for video image
CN117635637B (en) Autonomous conceived intelligent target dynamic detection system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
PE01 Entry into force of the registration of the contract for pledge of patent right

Denomination of invention: Automatic exposure control method for high-speed autopilot based on image gradient and entropy fusion

Effective date of registration: 20230228

Granted publication date: 20210618

Pledgee: Bank of Shanghai Co.,Ltd. Beijing Branch

Pledgor: HOLOMATIC TECHNOLOGY (BEIJING) Co.,Ltd.

Registration number: Y2023980033668

CP03 Change of name, title or address
CP03 Change of name, title or address

Address after: 201, 202, 301, No. 56-4 Fenghuang South Road, Huadu District, Guangzhou City, Guangdong Province, 510806

Patentee after: Heduo Technology (Guangzhou) Co.,Ltd.

Address before: 100089 101-15, 3rd floor, building 9, yard 55, zique Road, Haidian District, Beijing

Patentee before: HOLOMATIC TECHNOLOGY (BEIJING) Co.,Ltd.