CN108362205B - Space distance measuring method based on fringe projection - Google Patents
- Publication number
- CN108362205B CN108362205B CN201810040762.2A CN201810040762A CN108362205B CN 108362205 B CN108362205 B CN 108362205B CN 201810040762 A CN201810040762 A CN 201810040762A CN 108362205 B CN108362205 B CN 108362205B
- Authority
- CN
- China
- Prior art keywords
- stripe
- point
- distance
- projection
- modulation
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B11/00—Measuring arrangements characterised by the use of optical techniques
- G01B11/02—Measuring arrangements characterised by the use of optical techniques for measuring length, width or thickness
- G01B11/022—Measuring arrangements characterised by the use of optical techniques for measuring length, width or thickness by means of tv-camera scanning
Landscapes
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Length Measuring Devices By Optical Means (AREA)
Abstract
A space distance measuring method based on fringe projection comprises: step 1), installing a camera-projector device in the space to be positioned and performing calibration experiments with a checkerboard calibration plate to obtain the internal and external parameters of the camera; step 2), extracting the modulated fringe image and obtaining the coordinate point set of each fringe centre line with a fringe thinning algorithm; step 3), judging the relative position of a target point in the modulated fringe image by the minimum-Euclidean-distance method; step 4), finding the two modulated-fringe centre lines closest to the target point and the endpoint coordinates on both sides, and calculating the spatial distance between the two endpoints as the actual fringe width; and step 5), establishing a projection model from the projection relation and determining the relation between the projection distance and the fringe width in the projected picture. Distance measurement can be carried out for any spatial target, and the number of sensors carried by the robot itself can be reduced.
Description
Technical Field
The invention provides a space distance measuring method based on fringe projection, and relates to the fields of grating projection technology, digital image processing and robotics.
Background
Robotics is a key direction of future scientific research and an important embodiment of a nation's level of science and technology. With the continuous development of science and technology, highly intelligent, vision-based machine systems have become a hot spot of current research, and service robots attract more and more attention. A complete service-robot system generally needs environmental awareness, positioning and navigation, path planning, and object-search capabilities, among which positioning technology is the focus of service-robot research.
Only after acquiring the corresponding environment information when entering an unknown environment can the robot carry out positioning, navigation and other tasks. It is very difficult for a service robot to accurately recognize a complicated indoor environment, and existing machine-vision techniques force the robot to carry too many sensors, which limits the overall development of mobile robots. Therefore, a positioning technology is needed that can both help the robot obtain more environmental information and let it travel light.
The main tasks of positioning are spatial distance estimation and attitude estimation. Existing distance-measuring methods include monocular, binocular and multi-view vision ranging. In a monocular vision system, the robot acquires external environment information through a single sensor; the information processing is simple, but only two-dimensional information of the environment can be acquired. Binocular and multi-view vision systems are relatively complex and can acquire three-dimensional spatial information, but their real-time performance is poor and the data-processing load is large. Although conventional distance-measurement methods have developed rapidly, they still depend on the number of cameras, artificial markers, ground constraints and other conditions. Therefore, an efficient, low-dependency spatial ranging method is an important point in positioning research.
Disclosure of Invention
The purpose of the invention is as follows:
the invention provides a space distance measuring method based on fringe projection, and aims to solve the problems in the prior art.
The technical scheme is as follows: a space distance measuring method based on fringe projection comprises the following steps:
step 1), installing a camera projection device in a space to be positioned, and performing a calibration experiment by using a checkerboard calibration plate to obtain internal and external parameters of the camera, wherein an imaging model of the camera is as follows:
Z_c·[x, y, 1]^T = M_1·M_2·[X, Y, Z, 1]^T, with M_1 = [f_u, s, u_0; 0, f_v, v_0; 0, 0, 1] and M_2 = [R T],
wherein (x, y) are the two-dimensional coordinates in the image coordinate system, (X, Y, Z) are the three-dimensional coordinates in the world coordinate system, f_u and f_v are the focal ratios in the two directions of the image, s is the distortion factor, (u_0, v_0) are the principal-point coordinates, and R and T are respectively the rotation matrix and translation vector transformed from the world coordinate system to the camera coordinate system. From the model it can be seen that M_1 is related only to the internal structure of the camera and is called the camera internal parameter matrix; M_2 is related only to the orientation of the camera with respect to the world coordinate system and is called the camera external parameter matrix.
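The imaging model above can be sketched in Python with NumPy; all matrix values below are illustrative placeholders, not calibration results from the invention:

```python
import numpy as np

# Intrinsic matrix M1 built from f_u, f_v, skew s and principal point (u0, v0).
# Values are illustrative placeholders, not real calibration results.
f_u, f_v, s, u0, v0 = 800.0, 800.0, 0.0, 320.0, 240.0
M1 = np.array([[f_u, s,   u0],
               [0.0, f_v, v0],
               [0.0, 0.0, 1.0]])

# Extrinsic matrix M2 = [R | T]; here the world and camera frames coincide.
R = np.eye(3)
T = np.zeros((3, 1))
M2 = np.hstack([R, T])

def project(point_world):
    """Project a 3-D world point to 2-D image coordinates via M1 * M2."""
    Xw = np.append(np.asarray(point_world, dtype=float), 1.0)  # homogeneous
    p = M1 @ M2 @ Xw
    return p[0] / p[2], p[1] / p[2]  # divide by depth

x, y = project([0.0, 0.0, 2.0])  # a point on the optical axis
# projects to the principal point (320.0, 240.0)
```

A real system would obtain M_1 and M_2 from the calibration experiment of step 1) rather than from hand-written constants.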
Step 2), extracting the modulated stripe image after preprocessing, and acquiring the coordinate point set of the stripe centre lines by a stripe thinning algorithm. The specific method is as follows:
the actual stripe image I_1 acquired by the camera and the reference stripe image I_0 are subtracted (the grey value of each corresponding pixel is subtracted) to obtain the modulated grating stripes I at the position of the target object; the modulated stripes I are binarised and morphologically processed, i.e. a stripe thinning algorithm is applied, to obtain the coordinate point set of the modulated-stripe centre lines.
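A minimal sketch of this extraction step, assuming 8-bit grayscale images; the row-wise run-midpoint step stands in for a full morphological thinning algorithm:

```python
import numpy as np

def centerline_points(actual, reference, thresh=40):
    """Extract modulated-fringe centre-line coordinates from two images.

    actual, reference: grayscale images as 2-D uint8 arrays.
    Subtraction isolates the fringes deformed by the object; the
    run-midpoint step is a simplified stand-in for thinning.
    """
    # Grey-value subtraction |I1 - I0| isolates the modulated fringes.
    diff = np.abs(actual.astype(int) - reference.astype(int))
    binary = diff > thresh                      # binarisation
    points = []
    for r, row in enumerate(binary):            # per image row, take the
        cols = np.flatnonzero(row)              # midpoint of each white run
        if cols.size:
            runs = np.split(cols, np.flatnonzero(np.diff(cols) > 1) + 1)
            points += [(r, int(run.mean())) for run in runs]
    return points

# Toy 4x8 example: one vertical fringe occupying columns 3-5.
ref = np.zeros((4, 8), dtype=np.uint8)
act = ref.copy()
act[:, 3:6] = 200
pts = centerline_points(act, ref)
# every row yields the run midpoint, column 4
```

A production implementation would use a proper thinning/skeletonization routine (e.g. morphological skeletonization) instead of the per-row midpoint shortcut.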
Step 3), a minimum Euclidean distance method is applied to judge the relative position of the target point in the modulation fringe image, and the implementation steps are as follows:
step 3.1), storing the coordinate point set extracted in the step 2) in a cell array form, and recording as C (x, y), wherein (x, y) is an image coordinate;
step 3.2), classifying according to the positions of the stripes, wherein the centre-line coordinates of each stripe are stored in the sub-cell arrays C_1 to C_n, i.e. C = [C_1 C_2 ··· C_n];
Step 3.3), if the target point A is just on the central line of the stripe, taking the average value of the stripe widths at the two sides of the point A as the final stripe width, otherwise, judging the distance d between the target point A and the central point of each stripe according to the minimum Euclidean distance methodij(i=1,2,...,n,j=1,2,...,length(Ci) Let the coordinate of point A be (x)0,y0) The calculation formula of the euclidean distance is as follows:
step 3.4), when d = min{d_ij}, the modulated-fringe centre line C_i containing that point has the minimum distance to the target point, and the target point can be determined to lie near that modulated fringe.
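Steps 3.1) to 3.4) can be sketched as follows, modelling the cell array C as a plain Python list of coordinate arrays (the coordinates are illustrative):

```python
import numpy as np

def nearest_stripe(centerlines, target):
    """Return the index i of the centre line C_i closest to the target.

    centerlines: list of (k_i, 2) point arrays mirroring C = [C1 ... Cn];
    target: the image coordinates (x0, y0) of point A.
    """
    t = np.asarray(target, dtype=float)
    # d_ij = sqrt((x_ij - x0)^2 + (y_ij - y0)^2); min over j, then over i
    d = [np.min(np.linalg.norm(np.asarray(C, dtype=float) - t, axis=1))
         for C in centerlines]
    return int(np.argmin(d))

C1 = [(0, 0), (0, 1), (0, 2)]          # toy centre lines
C2 = [(5, 0), (5, 1), (5, 2)]
i = nearest_stripe([C1, C2], (4, 1))   # point A is nearest to C2
```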
Step 4), finding two modulation fringe central lines closest to the target point and endpoint coordinates on two sides, converting the two-dimensional endpoint coordinates into three-dimensional space coordinates by a calibration result, and calculating the space distance of two endpoints as the actual fringe width, wherein the specific steps are as follows:
step 4.1), taking the coordinate point corresponding to the minimum distance in step 3.4) as one endpoint O_1 (O_1 ∈ C_i) for obtaining the stripe width, wherein x_1 and y_1 are respectively the abscissa and ordinate of O_1 (x_1 ≠ x_0);
Step 4.2), according to the target point A and the point O1The abscissa magnitude relationship of (a) determines the position of the other end point on the modulation fringe image, that is:
wherein, O2Is the other end point, and O2Can be according to Ci+1The minimum distance between all points on the graph and the point A is determined;
step 4.3), respectively substituting the internal and external parameters of the camera into the imaging model according to the calibration result of step 1) to obtain the three-dimensional space coordinates O_10(X, Y, Z) and O_20(X, Y, Z) corresponding to the two endpoints O_1 and O_2, and calculating the vertical distance between them as the modulated-fringe width W, as follows:
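Assuming the two endpoints have already been converted to three-dimensional coordinates through the calibration result, the width computation reduces to a point-to-point distance; the coordinates below are illustrative:

```python
import numpy as np

def fringe_width(O10, O20):
    """Fringe width W as the Euclidean distance between the two 3-D
    endpoints recovered through the calibration result."""
    return float(np.linalg.norm(np.asarray(O10, dtype=float) -
                                np.asarray(O20, dtype=float)))

# Illustrative endpoint coordinates (metres), already converted from
# image coordinates to world coordinates via the imaging model.
W = fringe_width([0.10, 0.20, 2.00], [0.13, 0.20, 2.00])
```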
step 5), establishing a projection model according to the projection relation, and determining the relation between the projection distance and the width of the stripes in the projection picture, wherein the steps are as follows:
step 5.1), determining the actual horizontal projection angle 2α of the projector through a number of experiments, namely 2α = 2·arctan(M_0 / (2·L_0)),
wherein M_0 and L_0 are respectively the projection-picture width and the vertical projection distance measured by experiment;
step 5.2), determining the actual vertical projection distance L according to the horizontal projection angle and the stripe width W, converting the coordinates of the target point A and the projection centre point P into the three-dimensional space coordinates (X_0, Y_0, Z_0) and (X_P, Y_P, Z_P), and calculating the distance L_P between the two points;
Step 5.3), calculating the space relative distance between the target point and the projector, namely:
the resolution of the projector is m×n, and the grating fringe period is T.
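The step 5 formulas appear in the original as figures, so the sketch below is only a plausible reading under stated assumptions: the half angle α satisfies tan α = M_0/(2L_0); a fringe period of T pixels out of the m horizontal pixels spans the measured width W at distance L; and the target-projector distance D is taken as the hypotenuse over L and the in-plane offset L_P. All numbers are illustrative:

```python
import math

def projector_distance(M0, L0, W, T, m, L_P):
    """Convert fringe width W to the target-projector distance D.

    Assumptions (the patent's own formulas are given only as figures):
    tan(alpha) = M0 / (2 * L0); a fringe period of T pixels out of m
    horizontal pixels spans W metres at distance L; D is the hypotenuse
    over L and the in-plane offset L_P.
    """
    tan_a = M0 / (2.0 * L0)
    L = (m * W) / (2.0 * T * tan_a)   # vertical projection distance
    D = math.hypot(L, L_P)
    return D, L

# Illustrative numbers: a 1 m wide picture at 1 m distance, 1024-pixel
# projector width, 64-pixel fringe period measuring 0.05 m, and a target
# 0.3 m from the projection centre in the picture plane.
D, L = projector_distance(M0=1.0, L0=1.0, W=0.05, T=64, m=1024, L_P=0.3)
```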
The advantages and effects are as follows:
the grating projection technology is widely applied to the three-dimensional measurement technology due to the advantages of simple measurement system, high precision, high resolution and the like. Therefore, the invention provides a space distance measuring method based on fringe projection by combining a grating projection technology and a distance measuring algorithm aiming at the problem of space distance estimation.
The invention firstly uses the grating projection technology and the related image processing algorithm to determine the actual width of the modulation fringe in the three-dimensional space, and then converts the modulation fringe into the space relative distance according to the projection relation, thereby realizing the purpose of space distance measurement. The invention has the advantages that:
1. a space distance measurement method based on stripe projection is provided, wherein a two-dimensional image processing algorithm and the projection relation of grating stripes are combined and applied to three-dimensional space distance measurement;
2. in the distance measurement process, the distance measurement can be carried out aiming at any space target without taking the ground as a reference;
3. the invention can be applied to a sensing system of a service robot, can assist the robot to acquire more environmental information and can reduce the number of sensors carried by the robot.
Drawings
FIG. 1 is an overall flow chart of the fringe projection-based spatial ranging method of the present invention;
FIG. 2 is a graph of the visualization of the calibration;
FIG. 3 is an input actual image and a reference image;
FIG. 4 is a modulation fringe image and fringe thinning results;
fig. 5 is a projection model relationship diagram.
Detailed Description
The following further describes embodiments of the present invention with reference to the drawings attached hereto.
The invention is a distance-measuring process realized under the Windows operating system. FIG. 1 is the overall flow chart of the invention; the concrete steps are as follows:
step 1), installing the camera shooting projection device in a space to be positioned, performing a calibration experiment by using a chessboard pattern calibration plate, storing internal and external parameters of a calibrated camera, and recording a calibrated reprojection error, wherein the specific steps are as follows:
step 1.1), installing and fixing a camera and a projector, and selecting a 7 x 8 black and white chessboard grid calibration plate, wherein the side length of each grid is 27 mm;
step 1.2), moving or rotating the calibration plate randomly, and shooting a group of images (20 images);
step 1.3), the imaging model of the camera is as follows:
Z_c·[x, y, 1]^T = M_1·M_2·[X, Y, Z, 1]^T, with M_1 = [f_u, s, u_0; 0, f_v, v_0; 0, 0, 1] and M_2 = [R T],
wherein (x, y) are the two-dimensional coordinates in the image coordinate system, (X, Y, Z) are the three-dimensional coordinates in the world coordinate system, f_u and f_v are the focal ratios in the two directions of the image, s is the distortion factor, (u_0, v_0) are the principal-point coordinates, and R and T are respectively the rotation matrix and translation vector transformed from the world coordinate system to the camera coordinate system. From the model it can be seen that M_1 is related only to the internal structure of the camera and is called the camera internal parameter matrix; M_2 is related only to the orientation of the camera with respect to the world coordinate system and is called the camera external parameter matrix;
step 1.4), determining the internal and external parameters of the camera by the least-squares method, storing and recording the calibrated reprojection error, and analysing the calibration result.
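The reprojection error recorded in step 1.4) can be computed as the mean distance between the detected checkerboard corners and the corners reprojected through the calibrated model; the corner coordinates below are illustrative, not real calibration data:

```python
import numpy as np

def reprojection_error(observed, reprojected):
    """Mean reprojection error in pixels: average distance between
    detected corners and corners reprojected through the calibrated
    imaging model."""
    obs = np.asarray(observed, dtype=float)
    rep = np.asarray(reprojected, dtype=float)
    return float(np.mean(np.linalg.norm(obs - rep, axis=1)))

# Illustrative corner coordinates (pixels).
obs = [(100.0, 100.0), (127.0, 100.0)]   # detected checkerboard corners
rep = [(100.3, 100.4), (126.6, 100.3)]   # corners predicted by the model
err = reprojection_error(obs, rep)       # sub-pixel values suggest a good calibration
```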
Step 2), projecting the grating stripes into the space to be positioned, respectively acquiring a reference stripe image and an actual stripe image, extracting the modulated stripe image after preprocessing, and acquiring the coordinate point set of the stripe centre lines by a stripe thinning algorithm. The specific method is as follows:
step 2.1), respectively collecting the actual stripe image I_1 and the reference stripe image I_0 of the indoor space, and carrying out binarization;
step 2.2), performing subtraction on the two images, namely subtracting the gray value of each corresponding pixel to obtain a modulation fringe image I at the position of the target object;
step 2.3), performing morphological processing of the image, i.e. thinning the stripes and extracting the centre line of each black or white stripe, to obtain the coordinate point set of the modulated-stripe centre lines.
Step 3), judging the relative position of the target point in the modulation fringe image by applying a minimum Euclidean distance method, and realizing the following steps:
step 3.1), storing the coordinate point set extracted in the step 2) in a cell array form, and recording as C (x, y), wherein (x, y) is an image coordinate;
step 3.2), classifying according to the positions of the stripes, and storing the centre-line coordinates of each modulated stripe in the sub-cell arrays, i.e. C = [C_1 C_2 ··· C_n];
Step 3.3), if the target point A is just on the central line of the stripe, taking the average value of the stripe widths at the two sides of the point A as the final stripe width, otherwise, judging the distance d between the target point A and the central point of each stripe according to the minimum Euclidean distance methodij(i=1,2,...,n,j=1,2,...,length(Ci) Let the coordinate of point A be (x)0,y0) The calculation formula of the euclidean distance is as follows:
step 3.4), when d = min{d_ij}, the modulated-fringe centre line C_i containing that point has the minimum distance to the target point, so the target point is determined to lie near that modulated fringe.
Step 4), finding two modulation fringe central lines closest to the target point and endpoint coordinates on two sides, converting the two-dimensional endpoint coordinates into three-dimensional space coordinates by a calibration result, and calculating the space distance of two endpoints as the actual fringe width, wherein the specific steps are as follows:
step 4.1), taking the coordinate point corresponding to the minimum distance in step 3.4) as one endpoint O_1 (O_1 ∈ C_i) for obtaining the stripe width, wherein x_1 and y_1 are respectively the abscissa and ordinate of O_1 (x_1 ≠ x_0);
Step 4.2), according to the target point A and the point O1The abscissa magnitude relationship of (a) determines the position of the other end point on the modulation fringe image, that is:
wherein, O2Is the other end point, and O2Is according to Ci+1The minimum distance between all points on the graph and the point A is determined;
step 4.3), respectively substituting the internal and external parameters of the camera into the imaging model according to the calibration result of step 1) to obtain the three-dimensional space coordinates O_10(X, Y, Z) and O_20(X, Y, Z) corresponding to the two endpoints O_1 and O_2, and calculating the vertical distance between them as the modulated-fringe width W, as follows:
step 5), establishing a projection model according to the projection relation, and determining the relation between the projection distance and the width of the stripes in the projection picture, wherein the steps are as follows:
step 5.1), determining the actual horizontal projection angle 2α of the projector through a number of experiments, namely:
wherein M_0 and L_0 are respectively the projection-picture width and the vertical projection distance measured by experiment;
step 5.2), determining the actual vertical projection distance L according to the horizontal projection angle and the stripe width W, converting the coordinates of the target point A and the projection centre point P into the three-dimensional space coordinates (X_0, Y_0, Z_0) and (X_P, Y_P, Z_P), and calculating the distance L_P between the two points;
Step 5.3), calculating the space relative distance between the target point and the projector, namely:
the resolution of the projector is m×n, and the grating fringe period is T. Thus, the stripe width W is converted into the spatial relative distance D, realizing the function of spatial distance measurement.
Claims (3)
1. A space distance measuring method based on fringe projection is characterized in that: the method comprises the following specific steps:
step 1), installing a camera shooting projection device in a space to be positioned, performing a calibration experiment by using a checkerboard calibration board to obtain internal and external parameters of a camera, and recording a calibrated reprojection error;
step 2), projecting the grating stripes into a space to be positioned, respectively collecting a reference stripe image and an actual stripe image, extracting a modulation stripe image after preprocessing, and acquiring a coordinate point set of a stripe center line by adopting a stripe thinning algorithm;
step 3), judging the relative position of the target point in the modulation fringe image by applying a minimum Euclidean distance method;
the step of judging the relative position of the target point in the modulation fringe image in the step 3) is as follows:
step 3.1), storing the coordinate point set extracted in the step 2) in a cell array form, and recording as C (x, y), wherein (x, y) is an image coordinate;
step 3.2), classifying according to the positions of the stripes, and storing the centre-line coordinates of each modulated stripe in the sub-cell arrays, i.e. C = [C_1 C_2 … C_n];
step 3.3), if the target point A lies exactly on a stripe centre line, the average of the stripe widths on the two sides of point A is taken as the final stripe width; otherwise the distance d_ij, i = 1, 2, ..., n, j = 1, 2, ..., length(C_i), between the target point A and each centre-line point is judged by the minimum-Euclidean-distance method, wherein length(C_i) is the length of C_i, i.e. the total number of coordinate points of the centre line of the i-th modulated stripe; the coordinates of point A are (x_0, y_0), and the Euclidean distance is calculated as:

d_ij = sqrt((x_ij - x_0)^2 + (y_ij - y_0)^2)
step 3.4), when d = min{d_ij}, the modulated-fringe centre line C_i containing that point has the minimum distance to the target point, and the target point is determined to lie near that modulated fringe;
step 4), finding two modulation fringe central lines closest to the target point and end point coordinates on two sides, converting the two-dimensional end point coordinates into three-dimensional space coordinates by a calibration result, and calculating the space distance of two end points as the actual fringe width;
the method for calculating the stripe width of the position of the target point in the step 4) comprises the following steps:
firstly, the coordinate point corresponding to the minimum distance in step 3.4) is taken as one endpoint O_1 (O_1 ∈ C_i) for obtaining the stripe width, letting x_1 and y_1 be respectively the abscissa and ordinate of O_1, with x_1 ≠ x_0; then the position of the other endpoint coordinate is determined from the abscissa magnitude relationship between the target point A and the point O_1, namely:
wherein O_2 is the other endpoint, and O_2 is determined by the minimum distance to the point A on C_{i+1}; finally, according to the calibration result of step 1), the internal and external parameters of the camera are respectively substituted into the imaging model to calculate the three-dimensional space coordinates O_10(X, Y, Z) and O_20(X, Y, Z) corresponding to the two endpoints O_1 and O_2, and the vertical distance between them is calculated as the modulated-fringe width W, as follows:
step 5), establishing a projection model according to the projection relation, determining the functional relation between the projection distance and the width of the stripes in the projection picture, and converting the width of the stripes into a space distance to achieve the purpose of space distance measurement;
the step of calculating the spatial distance in step 5) is as follows:
step 5.1), determining the actual horizontal projection angle 2α of the projector through experiments, namely:
wherein M_0 and L_0 are respectively the projection-picture width and the vertical projection distance measured by experiment;
step 5.2), determining the actual vertical projection distance L according to the horizontal projection angle 2α and the stripe width W, and obtaining the distance L_P between the two points from the three-dimensional coordinates of the target point A and the projection centre point P;
Step 5.3), calculating the space relative distance between the target point and the projector, namely:
wherein m×n is the resolution of the projector, T is the grating fringe period, and (X_0, Y_0, Z_0) and (X_P, Y_P, Z_P) are the three-dimensional space coordinates of the target point and the projection centre point; therefore, the relative spatial distance D between the target point and the projector is obtained from the calculation result W of step 4), achieving the purpose of spatial distance measurement.
2. The fringe projection-based spatial ranging method as claimed in claim 1, wherein: the step 1) comprises the following specific steps:
step 1.1), installing and fixing the camera and projector experimental devices, and selecting a 7×8 black-and-white checkerboard calibration plate, wherein the side length of each grid is 27 mm;
step 1.2), moving or rotating the calibration plate randomly, and shooting a group of images;
step 1.3), establishing an imaging model of the camera:
Z_c·[x, y, 1]^T = M_1·M_2·[X, Y, Z, 1]^T, with M_1 = [f_u, s, u_0; 0, f_v, v_0; 0, 0, 1] and M_2 = [R T],
wherein (x, y) are the two-dimensional coordinates in the image coordinate system, (X, Y, Z) are the three-dimensional coordinates in the world coordinate system, f_u and f_v are the focal ratios in the two directions of the image, s is the distortion factor, (u_0, v_0) are the principal-point coordinates, and R and T are respectively the rotation matrix and translation vector transformed from the world coordinate system to the camera coordinate system; M_1 is related only to the internal structure of the camera and is called the camera internal parameter matrix; M_2 is related only to the orientation of the camera with respect to the world coordinate system and is called the camera external parameter matrix;
and step 1.4), optimizing internal and external parameters of the camera by adopting a least square method, storing and recording the calibrated reprojection error, and analyzing the calibration result.
3. The fringe projection-based spatial ranging method as claimed in claim 1, wherein: the method for acquiring the coordinate of the central line of the modulation fringe in the step 2) comprises the following steps:
the actual stripe image I acquired by the camera1And a reference stripe image I0And performing subtraction, namely subtracting the gray value of each corresponding pixel to obtain a modulation grating stripe I at the position of the target object, performing binarization on the modulation stripe, and obtaining a central line coordinate point set of the modulation stripe by adopting a stripe thinning algorithm.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201711122948 | 2017-11-14 | ||
CN2017111229484 | 2017-11-14 |
Publications (2)
Publication Number | Publication Date |
---|---|
CN108362205A (en) | 2018-08-03 |
CN108362205B (en) | 2020-04-28 |
Family
ID=63006333
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201810040762.2A Active CN108362205B (en) | 2017-11-14 | 2018-01-16 | Space distance measuring method based on fringe projection |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN108362205B (en) |
Families Citing this family (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109767492A (en) * | 2018-12-06 | 2019-05-17 | 国网经济技术研究院有限公司 | Space calculation method for three-dimensional model of transformer substation |
CN109781739A (en) * | 2019-03-04 | 2019-05-21 | 杭州晶耐科光电技术有限公司 | Automobile finish surface appearance defects automatic detection system and method |
CN110231034B (en) * | 2019-06-10 | 2023-05-09 | 国网江苏省电力有限公司南京供电分公司 | Indirect positioning method for materials of outdoor storage yard and visual model |
CN110196062B (en) * | 2019-06-27 | 2022-03-25 | 成都圭目机器人有限公司 | Navigation method for tracking lane line by single camera |
CN110706271B (en) * | 2019-09-30 | 2022-02-15 | 清华大学 | Vehicle-mounted vision real-time multi-vehicle-mounted target transverse and longitudinal distance estimation method |
Family Cites Families (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6781676B2 (en) * | 2002-03-22 | 2004-08-24 | Trw Inc. | Structured lighting detection of vehicle occupant type and position |
US20040041996A1 (en) * | 2002-08-28 | 2004-03-04 | Fuji Xerox Co., Ltd. | Range finder and method |
US7066388B2 (en) * | 2002-12-18 | 2006-06-27 | Symbol Technologies, Inc. | System and method for verifying RFID reads |
JP2006162250A (en) * | 2004-12-02 | 2006-06-22 | Ushio Inc | Pattern inspection device for film workpiece |
WO2011013373A1 (en) * | 2009-07-29 | 2011-02-03 | Canon Kabushiki Kaisha | Measuring apparatus, measuring method, and program |
CN104111039B (en) * | 2014-08-08 | 2016-08-24 | 电子科技大学 | For arbitrarily putting the scaling method of fringe projection three-dimension measuring system |
JP6444233B2 (en) * | 2015-03-24 | 2018-12-26 | キヤノン株式会社 | Distance measuring device, distance measuring method, and program |
- 2018-01-16: application CN201810040762.2A filed in CN; granted as patent CN108362205B (status: Active)
Also Published As
Publication number | Publication date |
---|---|
CN108362205A (en) | 2018-08-03 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||