CA2070822A1 - Process and arrangement for optoelectronic measuring of objects - Google Patents
Process and arrangement for optoelectronic measuring of objectsInfo
- Publication number
- CA2070822A1 (application CA 002070822 A)
- Authority
- CA
- Canada
- Prior art keywords
- light
- camera
- light source
- cameras
- light sources
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B11/00—Measuring arrangements characterised by the use of optical techniques
- G01B11/24—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
- G01B11/245—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures using a plurality of fixed, simultaneously operating transducers
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B11/00—Measuring arrangements characterised by the use of optical techniques
- G01B11/24—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
Landscapes
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Length Measuring Devices By Optical Means (AREA)
- Wire Bonding (AREA)
- Investigating Or Analysing Materials By Optical Means (AREA)
- Photovoltaic Devices (AREA)
- Analysing Materials By The Use Of Radiation (AREA)
Abstract
An arrangement for optoelectronic measurement of the shape of objects (4) comprises at least two light sources (5) each of which projects at least one narrow light beam (6) onto the object to be measured (light-section procedure). The narrow light beams (6) are recorded by a number of video cameras (7) equal to the number of light sources (5). The video cameras (7) are connected to an evaluation device (9) which comprises a computer and which evaluates the photos or calculates the dimensions of the object (4). Each light source (5) is associated with an individual camera (7) which responds only to the light from that light source. Each light source (5) generates a narrow light beam (6) by emitting light rays of different wavelengths and/or polarizations, or the times at which the light strikes the object (4) and the times at which the individual cameras (7) are free to record the corresponding narrow light beam (6) are synchronized and the imaging and recording times of the individual pairs of light sources and cameras (5, 7) do not overlap.
Description
- Process and arrangement for the optoelectronic measuring of objects

The invention relates to a process according to the preamble of Patent Claim 1. In addition, the invention relates to an arrangement according to the preamble of Patent Claim 8.
Such processes are becoming increasingly important in practice for the purpose of rapidly and accurately measuring objects. Such a contactless optical measuring process for measuring cross sectional shapes is discussed in the journal Montanrundschau BHM, Vol. 134 (1989), No. 3, pp. 64-67. In many applications of this light section process it is necessary to use several light sources which illuminate the object to be measured from various directions, so that complicated cross sectional shapes can also be measured. The light emitted by the individual light sources, in particular a laser beam, is spread by a cylindrical optical system so that a line as thin as possible, but of maximum intensity, is projected onto the object to be measured. However, the individual lines should all lie in one plane, and this necessitates an accurate and complicated adjustment of the cameras. These lines (light sections) are then recorded by the cameras; if several strips are projected onto one area of the object, this can give rise to measuring errors, because the cameras record double lines or shifts occur in the brightness maximum, so that the image cannot be evaluated clearly or the evaluation may be incorrect.
EP-A-0305107, and in particular Fig. 2(a) therein, discloses a light section process using several light sources and allocated cameras in which each camera receives only the light from the allocated light strip or light source, and in which the signals of the individual cameras are at first separately evaluated. This set-up is also prone to errors caused by extraneous light or incorrect illumination.
Figs. 1 and 2 in the accompanying drawings show the brightness curves on the object to be measured in a section normal to the light section, with maladjusted light sources. The graphs in Figs. 1 and 2 depict on the abscissas the spatial extent S of the object and on the ordinates the intensity T of the strip projected onto the object. It can be seen from Fig. 1 that the brightness cross section 1 of the light section from a first laser light source and the brightness cross section 2 of the light section from a second laser light source are spatially offset from each other, so that a cumulative brightness distribution, depicted by curve 3, is formed between these two brightness cross sections, and a clear minimum forms between the two maxima. Such brightness distributions cannot be distinguished by a camera; the image evaluation system does not recognize which of the strips is the correct one to process.
In Fig. 2 the two brightness cross sections have been moved closer together, so that a cumulative brightness distribution according to curve 3 results and only one brightness maximum is present; this maximum is in fact evaluated by the image evaluation system, but it is spatially offset by a distance "d" from the brightness maximum of the one of the two brightness cross sections which actually needs to be evaluated.
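The shifted-maximum error of Fig. 2 can be reproduced numerically. The following sketch is purely illustrative (the Gaussian line profiles, their widths, amplitudes and offsets are assumed values, not taken from the patent): two offset brightness cross sections are summed, and the single maximum of the cumulative curve is located, which lies a distance "d" away from the centre of the strip that actually needs to be evaluated.

```python
import numpy as np

# Two offset Gaussian brightness cross sections (assumed shapes) and
# their cumulative curve, as seen by a camera that cannot separate them.
s = np.linspace(-2.0, 2.0, 4001)       # spatial coordinate S (arbitrary units)
sigma = 0.3                            # assumed strip half-width
centre1, centre2 = -0.2, 0.2           # maladjusted strip centres
b1 = np.exp(-((s - centre1) ** 2) / (2 * sigma ** 2))        # cross section 1
b2 = 0.8 * np.exp(-((s - centre2) ** 2) / (2 * sigma ** 2))  # cross section 2
cumulative = b1 + b2                   # curve 3: what the camera records
d = abs(s[np.argmax(cumulative)] - centre1)   # offset of evaluated maximum
print(f"evaluated maximum is shifted by d = {d:.3f} from strip 1")
```

With these assumed values the single cumulative maximum lies between the two true strip centres, so the evaluated position is wrong by a non-zero "d", exactly the situation the invention is intended to avoid.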
Such errors are disruptive, in particular when calibrating the measuring system because the calibrations form the basis for the subsequent measuring of objects and the evaluation of the measurements.
However, an exact adjustment of the light sources takes time and requires precise mechanical devices. In addition, disruptive thermal, mechanical or optical influences can cause the adjusted light sources to lose their adjustment once more, so that re-adjustment is required in order to have a valid calibration for the subsequent measurement. Finally, errors due to incorrect evaluation of brightness maxima should also be avoided at the measurement stage.
The purpose of the invention is to provide a measuring or calibration process, and an arrangement for implementing this process, by means of which the beams of light projected by the various light sources onto the object to be measured or onto the calibration object can be clearly separated from each other in the receiving systems, i.e. in one or more video cameras. Only in this way can double lines be clearly evaluated, shifts in the brightness maximum prevented, and re-adjustment of the optical systems avoided.
According to the invention, these objectives are met by a process of the type mentioned at the beginning and having the characteristics set forth in the characterizing clause of Patent Claim 1. These light signals, which are clearly allocated from the start to the individual cameras, can now be evaluated in such a way that, following separate evaluation of the signals from the cameras, the image data are combined and the dimensions of the object are calculated.
In a preferred embodiment of the process according to the invention, steps are taken to ensure that the light strips produced by each individual light source are precisely allocated and recorded; this is accomplished by generating the light strips from the various light sources using light of various wavelengths and/or different polarization, and by ensuring that the sensitivity of the camera allocated to the respective light strip(s) is tuned to respond only to the respective wavelength or polarization of the light emitted by the light source allocated to the camera. Thus, in the process according to the invention, light sources can be used which emit different light which in each case can be received only by the camera allocated to that specific light source. For example, laser light sources can be used which emit different wavelengths, and the cameras are fitted with filters of various transmissivity to make them sensitive only to the specific wavelength of the respective light source allocated to them. At the same time, or alternatively, it is possible for the light strips from the various light sources to be projected or formed on the object in a temporally non-overlapping sequence, and for the individual cameras to be allowed to record them in accordance with the allocation of the cameras to the individual strips just when they form, and also in accordance with the time interval between the formation of the strips.
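The wavelength-based separation described above can be sketched in a few lines of code. Everything here is a hypothetical illustration (the wavelengths, the 10 nm passband and the function name are assumptions, not values from the patent): each camera's filter passes only the wavelength of its allocated laser, so the strips of the other sources never reach it.

```python
# Hypothetical laser wavelengths (nm) for four light sources.
lasers = {"laser_1": 635, "laser_2": 670, "laser_3": 780, "laser_4": 830}

def camera_sees(filter_centre_nm, source_nm, passband_nm=10):
    """A camera records a strip only if the source wavelength falls
    inside the passband of its fitted filter (assumed 10 nm wide)."""
    return abs(source_nm - filter_centre_nm) <= passband_nm / 2

# The camera whose filter is tuned to laser_1 rejects all other strips:
for name, wavelength in lasers.items():
    print(name, camera_sees(635, wavelength))
```

Only `laser_1` prints `True`; overlapping strips from the other three sources are simply invisible to this camera, which is the effect exploited for both calibration and measurement.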
In this way, it is possible to ensure that the light emitted by an individual light source and projected onto the object as a light strip is clearly fed to the correspondingly adjusted camera allocated to the respective light source. Light emitted by the other light sources is not received by this camera so that, especially when taking calibration measurements, a slight lack of adjustment in the cameras, and also brightness maxima which are offset in relation to each other, no longer play a role, because no double lines or displacements of the brightness maximum are received. Thus the parameters of the aforementioned optical device which are measured in the course of the calibration process can be determined singly and with great accuracy for each light-source/camera pair, and the subsequent measurements of further objects can also be carried out exactly.
An arrangement according to the invention for conducting optoelectronic measurements in the manner described at the beginning is characterized by the features set forth in the characterizing clause of Patent Claim 8. In the manner according to the invention, provision is also made to ensure that the individual light sources used to form the respective light strips are designed to emit light beams of different wavelength and/or polarization from each other, and/or that the times when the light is beamed onto the object and the operating times of the camera allocated to the respective light strip(s) are temporally matched to each other, and the projection times and recording times of the individual light-source/camera pairs do not overlap. Appropriate control circuits can be provided to accomplish this.
Measuring arrangements according to the invention are especially suitable for checking the cross sectional dimensions of rolled items directly as they emerge from the roll stand, for verifying the cross sectional shape of continuously cast billets, and for measuring castings and forgings, e.g. turbine blades and the like. The arrangement according to the invention may also be used advantageously to take measurements for medical-technical and manufacturing purposes.
A process for optoelectronically measuring the shape of objects or for calibrating optoelectronic measurement systems, wherein in each case at least one light strip is projected (light section process) by each of at least two light sources, preferably laser light sources, onto the object to be measured or onto the calibration object, and wherein the light strips are recorded by the same number of video cameras as there are light sources or light strips to which the cameras are allocated, and wherein further the camera signals are supplied to an evaluation unit, is essentially characterized by the fact that light having different properties, e.g. wavelength and polarization, is emitted by the individual light sources, that a specific video camera is allocated to each light source or to each generated light strip and is sensitive only to the light emitted by that specific light source, so that only the light from this (these) particular light strip(s) is supplied to the allocated video camera, that signals are generated from the received light signals in each video camera and supplied to the evaluation unit, that the signals of each individual video camera are evaluated separately from those of the other cameras, and that the image data and image points are then combined and the dimensions of the object are calculated.
Furthermore, an arrangement according to the invention for optoelectronically measuring the shape, especially the cross sectional shape, e.g. of work pieces, or for calibrating optoelectronic measuring systems, wherein in each case at least one light strip is projected (light section process) by each of at least two light sources, preferably laser light sources, onto the object to be measured or onto the calibration object, and wherein the light strips are recorded by the same number of video cameras, preferably CCD solid-state cameras, as there are light sources or light strips, wherein a video camera is allocated to each light source, and wherein the video cameras are connected to an evaluation unit comprising a computer, is advantageously characterized by the fact that each light source is designed to emit light having different properties from the light emitted by the other light source(s), and by the fact that the video cameras are designed to receive light only from the respective light source allocated to them.
Especially when laser light is used to illuminate the objects, it is advantageous, for the purpose of the invention, to cause the objects to oscillate at least while they are being illuminated and scanned. In this way, interferences and imaging errors can be eliminated. The amplitude of the oscillation should be greater than half the wavelength of the light illuminating the object, and the period of the oscillation should be equal to or less than the measuring time or the measuring-time integration interval or the electronic shutter speed of the camera, preferably approximately 10 to 30 msec, and in particular approximately 20 msec. The object is actively caused to oscillate at an amplitude and period which differ considerably from the oscillation parameters which occur in manufacturing or processing processes, e.g. in rolling mills. It is particularly advantageous to measure oscillating objects when the objects are work pieces such as turbine blades, etc.
Advantageously, the oscillation component in the direction of the light source and/or video camera, or of the radiated and/or reflected laser light, should possess the indicated values. A total of 100 wavelengths, and preferably 10 wavelengths, is a sensible upper limit as regards the generation of the oscillations and also with regard to the measuring accuracy. The surface roughness is of little significance.
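The oscillation criteria of the two preceding paragraphs, an amplitude greater than half the illuminating wavelength but no more than roughly 100 wavelengths, and a period no longer than the camera's integration interval, can be collected into a small check. This is only a sketch of the stated bounds; the function name and the example values are assumptions.

```python
def oscillation_ok(amplitude_nm, period_ms, wavelength_nm, integration_ms):
    """True if the oscillation satisfies the bounds described above:
    wavelength/2 < amplitude <= 100 * wavelength, and
    period <= camera integration interval."""
    return (wavelength_nm / 2 < amplitude_nm <= 100 * wavelength_nm
            and period_ms <= integration_ms)

# E.g. an assumed 635 nm laser, 20 ms oscillation period, 30 ms integration:
print(oscillation_ok(amplitude_nm=1000, period_ms=20,
                     wavelength_nm=635, integration_ms=30))   # True
print(oscillation_ok(amplitude_nm=200, period_ms=20,
                     wavelength_nm=635, integration_ms=30))   # False: amplitude below wavelength/2
```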
However, in the case of shiny objects, the measuring accuracy can be improved by applying a matte coating to the surface of the objects before carrying out the measurements.
In the following, the invention is explained in more detail on the basis of the drawings. Fig. 3 shows in diagrammatic form an arrangement according to the invention. Figs. 4a and 4b depict diagrammatically the measurement of a complex object, and Fig. 5 depicts the measurement of an object mounted on an oscillating device.
Fig. 3 illustrates the principle of the measuring process. The object to be measured 4 is illuminated by a number of light sources 5, in this case four laser light sources, which emit light of different wavelengths and/or polarization. The light beams of the lasers 5 are spread in one plane by an optical system which is not shown here, so that a bright contour or a bright light strip 6 is projected onto the object to be measured. Each laser 5 creates a bright strip 6, and these generally overlap on the object. The light plane generated by each laser 5 is advantageously oriented at right angles to the longitudinal axis of the object (angle beta); the light planes of the individual lasers 5 are aligned in such a way that as far as possible they are all adjusted in one common plane, so that the light strips generated by the individual lasers overlap each other as far as possible or lie in a defined plane of intersection with the object. This is done in order to eliminate, right from the start, evaluation errors derived from positional inaccuracies.
However, these efforts are hampered by thermal or mechanical maladjustments. The light planes may also be arranged at an angle to the longitudinal axis. This is of advantage, for example, when measuring the planarity of metal sheet.
The light sections 6 projected by the lasers onto the object 4 are recorded by solid-state cameras 7 which are tilted at an angle alpha relative to the optical axis or light plane of the laser 5 allocated to the respective camera 7. Each camera 7 is fitted with a filter which permits the passage only of light having the wavelength emitted by the associated laser 5, so that each camera 7 can receive light only from its associated light source 5.
In this way, any influencing of each camera 7 by light emitted by other, non-allocated light sources 5 is totally eliminated. Thus, the light sections or the contour of the strips 6 on the object or on the calibration object can be evaluated with great accuracy.
The invention thus offers the advantage that not only can calibration measurements be performed quickly and accurately, but also the accuracy of the measurements can be enhanced, because light sections can be observed and evaluated without any mutual interference.
It is also possible to activate a single laser light source 5 and its associated camera 7, preferably simultaneously, for the same periods of illumination or observation. Thus, in each case one light-source/camera pair is activated while the other three pairs are left switched off. This eliminates any undesired influence, even when using light of the same wavelength.
The activation times of the respective light-source/camera pairs do not overlap; the individual pairs are activated in sequence, e.g. associated light-source/camera pairs can be activated up to 50 times a second for observation intervals of about 5 ms while still retaining adequate evaluation quality. Corresponding control wires for the light sources 5 or the cameras 7 are denoted by the reference number 11; the control unit 10, which regulates the light-dark settings, can be coupled with or can operate with the evaluation unit 9 for the camera signals.
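The time-multiplexing just described can be sketched as a schedule of non-overlapping activation windows. The 5 ms window length follows the figures above; the schedule layout and the function name are assumptions for illustration only.

```python
def activation_schedule(n_pairs=4, window_ms=5.0, cycles=2):
    """Return (pair, start_ms, end_ms) tuples; each light-source/camera
    pair is active alone, and no two windows overlap."""
    schedule, t = [], 0.0
    for _ in range(cycles):
        for pair in range(n_pairs):
            schedule.append((pair, t, t + window_ms))
            t += window_ms
    return schedule

for pair, start, end in activation_schedule():
    print(f"pair {pair}: laser and camera active {start:.0f}-{end:.0f} ms")
```

With four pairs and 5 ms windows, each pair is revisited every 20 ms, i.e. up to 50 activations per second, consistent with the rate mentioned above.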
The evaluation unit 9 comprises a computer with which the video signals from the cameras 7 are digitized and stored. The computer selects from the image matrix those image points which depict the light section.
When a measurement is taken, the positional data of the object, the position and alignment of the camera relative to the light plane, and also the focal length of the lens are known, so that the image points which have been found are geometrically rectified and can be calculated back to give the actual coordinates of the object.
It should be mentioned that, in favourable cases, it is enough to use just two cameras; for objects of round cross section at least three cameras are needed, and when four cameras are used, as shown in Fig. 3, almost all customary extruded sections with convex and concave shapes can be completely measured, as long as no undercuts are present.
The light sources 5 may be white light sources fitted with appropriate colour filters, or lasers and laser diodes with adjustable wavelengths, and the like. Appropriate optical systems are known for creating the very narrow light sections on the object.
The cameras which can be used for this purpose are video cameras, solid-state cameras, especially CCD cameras, or cameras fitted with sensors which respond to certain colours, known as colour cameras. Preferably, CCD cameras are used for imaging. These cameras possess a solid-state sensor comprising a matrix of about 500 x 500 photodiodes, which supplies substantially distortion-free images.
In order to evaluate the camera video signals stored in the evaluation unit 9, the image is first scanned for contour starting points by checking the brightness contrasts of the image points. Once corresponding series of connected lines (traverses) have been determined, separately for the individual signals put out by the video cameras, each traverse is transformed into the object coordinates, so that the traverses received from the individual cameras are combined to give the overall structure, and from this the desired dimensions are calculated.
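The per-camera evaluation step described above can be sketched as follows. The array sizes, the threshold value and the function name are assumptions, not the patent's actual algorithm: in each image row, the column of the brightness maximum is taken as a contour point of the light section, provided its brightness exceeds a contrast threshold, and the chained points of one camera form a traverse that is later transformed into object coordinates.

```python
import numpy as np

def extract_traverse(image, min_brightness=50):
    """Per-row brightness-maximum extraction (assumed threshold of 50)."""
    points = []
    for row in range(image.shape[0]):
        col = int(np.argmax(image[row]))
        if image[row, col] >= min_brightness:   # contrast check
            points.append((row, col))
    return points

# Synthetic 4 x 8 image with a bright diagonal light section:
img = np.zeros((4, 8), dtype=np.uint8)
for r in range(4):
    img[r, r + 2] = 200
print(extract_traverse(img))   # [(0, 2), (1, 3), (2, 4), (3, 5)]
```

Because each camera receives only its own light section, this per-row maximum is unambiguous; with overlapping strips from several sources it would suffer exactly the double-line and shifted-maximum errors of Figs. 1 and 2.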
The projection or rectification parameters are determined by calibration, for which purpose a calibration body whose exact dimensions are stored in the computer is placed in the measuring field of the cameras.
When the light section of the calibration body is recorded using the method referred to above, the rectification parameters can be calculated from a comparison of the stored dimensions with the measured light sections. In particular, it is important when carrying out this calibration process for the cameras to record clear signals from the light sections allocated to them, without interference from the light sections generated by the other light sources.
In Fig. 3 the reference number 12 denotes the unit used to take the signals received from the individual cameras and process them into traverses. Reference number 13 identifies the unit used to combine the individual traverses into the outline or cross section of the object. Reference number 14 refers to a monitor for displaying the measured object; 15 is a printer for printing out the dimensions of the object or other data obtained by measuring or in some other way, and 16 is a plotter for drawing the evaluated contour of the object examined.
In two sections arranged perpendicular to each other, Figs. 4a and 4b show the measuring of a roller-shaped body 19 with a cylindrical bearing bolt 20. For this measurement, the light from two lasers 5 is projected in two planes (A, B) onto the roller 19, and polarization filters 17 are fitted to the lasers 5 to produce oppositely polarized light; the planes A, B intersect the measured roller 19 in one single plane. It can be seen from the section in Fig. 4b, which runs perpendicular to the section in Fig. 4a, that the cameras 7, which are not shown in Fig. 4a, are arranged at an angle alpha to the respective plane of illumination A, B and are each fitted with a polarization filter 18 which is matched to the direction of polarization of the light of the allocated laser 5. The overlapping range of the two light sources 5, which would cause errors, is designated by the letter "G" in Fig. 4a.
In Fig. 4b only one camera 7 is shown; however, it is understood that two cameras are present, both of which are fitted with polarization filters whose directions of polarization are arranged at right angles to each other, corresponding to the perpendicular orientation of the polarization of the two polarization filters 17 of the lasers 5.
Lasers which directly emit polarized light may also be used, so that the provision of polarization filters on the lasers is avoided. It is also possible to give the light a circular polarization and to arrange appropriate circular polarization filters in front of the cameras.
Fig. 5 shows a spatial arrangement in which the object 4 is illuminated by four lasers and the light sections 6 are observed by means of CCD cameras 7 fitted with filters 8. It can be seen that the optical axes 21 of the lasers 5 and the optical axes 22 of the cameras 7 enclose an angle of, for example, 45°, and the planes generated by the laser light from each laser 5 lie in a common overall plane.
Fig. 5 shows the object 4 arranged on a vibration device 23, by means of which the object 4 can be caused to oscillate while the measurements are being taken. These oscillations are dimensioned in such a way that, in the light source - object direction, they preferentially possess a component or amplitude which is greater than half the smallest wavelength used in the measurement process. In addition, the period of the oscillation should be equal to or less than the measurement integration interval of the cameras (of the sensor element), i.e. it should be about 10 to 30 msec. As a result, the light is integrated over a certain path length and the light signals received by the cameras are uniformized; this is particularly important because laser light can produce interference effects on rough or technical surfaces, and these can disrupt the image evaluation, especially in the contrast recognition process. The oscillations of the devices 23 are actively generated by mechanical, electromechanical or electrical oscillators, e.g. ultrasonic heads or electromagnetic oscillators.
It is furthermore advantageous if, in order to eliminate interferences or imaging errors, a linear motion (possibly in addition to the oscillations) is imparted to the objects, e.g. turbine blades, such that within a measuring time or a period corresponding to the measurement integration interval or the electronic shutter speed of the camera, preferably within a period of 10 to 30 msec, and in particular within a period of approximately 20 msec, the path travelled is greater than half the wavelength of the light from the light source.
. ' .
Such processes are becoming increasingly important in practice for the purpose of rapidly and accurately measuring objects. Such a contactless optical measuring process for measuring cross sectional shapes is discussed in the journal Montanrundschau BHM, Vol. 13~ (1989), i~o. 3, pp. 64-67. In many applications of this light section process it is necessary to use several light sources which illuminate from various directions the object to be measured, so that complicated cross sectional shapes can also be measured. The light emitted by the individual light sources, in particular a laser beam, is spread by a cylindrical optical system so that a line as thin as possible, but af maximum intensity, is projected onto the object to be measured. However, the individual lines should all lie in one plane, and this necessitates an accurate and complicated adjustment of the cameras. These lines ~light sections) are then recorded by the cameras; if several strips are projected in one area of the object, this can give rise to measuring errors because the cameras record double lines or shifts occur in the maximum-brightness and thus it is not possible to evaluate the image clearly or the evaluation may be incorrect.
EP-A-0305107, and in particular Fig. 2(a) therein, discloses a light section process using several light sources and allocated cameras in which each camera receives only the light from the allocated light strip or light 25 source and in which the signals of the individual cameras are at first separately evaluated. This set-up is also prone to errors caused by extraneous light or incorrect illuminations.
Figs. 1 and 2 in the accompanying drawings show the brightness curves on the object to be measured in a section normal to the light section, 30 with maladjusted light sources. The graphs in Figs. 1 and 2 depict on the -''~ ' ' ` ~ ~D7032~
abscissas the spatial extent S of the object and on the ordinates the intensity T of the strip projected onto the object. It can be seen from Fig. 1 that the bri~htness cross section ~ of the li~ht section from a first laser light source, and the brightness cross section 2 of the li~ht section from a 5 second laser iight source are spatially offset from each other so that a cumulative brightness distribution is formed between these two brightness cross sections as depicted by curve 3; and a clear minimum forms between two maxima. Such brightness distributions cannot be distinguished by a camera; the ima~e evaluation system does not recognize which of the strips 10 is the correct one to process.
In Fig. 2 the two brightness cross sections have been moved closer together so that a cumulative brightness distribution according to curve 3 results and now only one brightness maximum is present; this is in fact evaluated by the image evaluation system, but it is spatially offset by 15 distance "d" from the brightness maximum of one of the two brightness cross sections which actually needs to be evaluated.
Such errors are disruptive, in particular when calibrating the measuring system because the calibrations form the basis for the subsequent measuring of objects and the evaluation of the measurements.
20 However, an exact adjustment of the light sources takes time and requires precise mechanical devices. In addition, disruptive thermal, mechanical or optical influences can cause the adjusted light sources to lose their out of adjustment once more so that re-adjustment is required in order to have a valid calibration for the subsequent measurement. Finally, errors due to 25 incorrect evaiuation of brightness maxima should be avoided also at the measurement stage.
The purpose of the invention is to produce a measuring or calibration process or an arrangement for implementing this process, by means of which it is possible, in the receiving systems, i.e. in one or more video 30 cameras, to clearly separate from each other the beams of light projected by the various light sources onto the object to be measured or onto the .
~J~J,~
: - -3-calibration object. This is the only way in which double lines can be clearly evaluated and shifts in the brightness maximum can be prevented and also re-adjustment of the optical systems can be avoided.
According to the invention, these objectives are met by a process of 5 the type mentioned at the beginning and having the characteristics set forth in the characterizing clause of Patent Claim 1. These light signals, which are clearly allocated from the start to the individual cameras, can now be evalua~ed in such a way that, following separate evaluation of the signals from the cameras, the image data are combined and the dimensions of the 10 object are calculated.
In a preferred embodiment of the process according to the invention steps are taken to ensure that the light strips produced by each individual light source are precisely allocated and recorded; this is accomplished by generating the light strips from the various iight sources by using light of 15 various wavelengths and/or different polarization, and by ensuring that the sensitivity of the camera allocated to the respective light strip(s) is tuned torespond only to the respective wavelength or polarization of the light emitted by the light source allocated to the camera. Thus, in the process according to the invention, light sources can be used which emit different 20 light which in each case can be received only by the camera allocated to that specific light source. For example, laser light sources can be used which emit different wavelengths, and the cameras are fitted with filters of various transmissivity to make them sensitive only to the specific wavelength of the respective light source allocated ~o them. At the same 25 time, or alternatively, it is possible for the light strips from various light sources to be projected or formed on the object in a temporally non-overlapping sequence and for the individual cameras to be allowed to record them in accordance with the allocation of the cameras to the individual strips just when they form, and also in accordance with the time interval 30 between the formation of the strips.
In this way, it is possible to ensure that the light emitted by an individual light source and projected onto the object as a light strip is clearly fed to the correspondingly adjusted camera allocated to the respective light source. Light emitted by the other light sources is not received by this camera, so that, especially when taking calibration measurements, a slight lack of adjustment in the cameras and brightness maxima which are offset in relation to each other no longer play a role, because no double lines or displacements of the brightness maximum are received. Thus the parameters of the aforementioned optical device which are measured in the course of the calibration process can be determined singly and with great accuracy for each light-source/camera pair, and the subsequent measurements of further objects can also be carried out exactly.
An arrangement according to the invention for conducting optoelectronic measurements in the manner described at the beginning is characterized by the features set forth in the characterizing clause of the Claim. In the manner according to the invention, provision is also made to ensure that the individual light sources used to form the respective light strips are designed to emit light beams of different wavelength and/or polarization from each other, and/or that the times when the light is beamed onto the object and the operating times of the camera allocated to the respective light strip(s) are temporally matched to each other, and the projection times and recording times of the individual light-source/camera pairs do not overlap. Appropriate control circuits can be provided to accomplish this.
Measuring arrangements according to the invention are especially suitable for checking the cross sectional dimensions of rolled items directly as they emerge from the roll stand, for verifying the cross sectional shape of continuously cast billets, and for measuring castings and forgings, e.g. turbine blades and similar items. The arrangement according to the invention may also be used advantageously to take measurements for medical-technical and manufacturing purposes.
A process for optoelectronically measuring the shape of objects or for calibrating optoelectronic measurement systems, wherein in each case at least one light strip is projected (light section process) by each of at least two light sources, preferably laser light sources, onto the object to be measured or onto the calibration object, and wherein the light strips are recorded by the same number of video cameras as there are light sources or light strips to which the cameras are allocated, and wherein further the camera signals are supplied to an evaluation unit, is essentially characterized by the fact that light having different properties, e.g. wavelength and polarization, is emitted by the individual light sources, and that a specific video camera is allocated to each light source or to each generated light strip, and the camera is sensitive only to the light emitted by that specific light source, so that only the light from this (these) particular light strip(s) is supplied to the allocated video camera, and that signals are generated from the received light signals in each video camera and supplied to the evaluation unit, and the signals of each individual video camera are evaluated separately from those of the other cameras, and then the image data and image points are combined and the dimensions of the object are calculated.
Furthermore, an arrangement according to the invention for optoelectronically measuring the shape, especially the cross sectional shape, e.g. of work pieces, or for calibrating optoelectronic measuring systems, wherein in each case at least one light strip is projected (light section process) by each of at least two light sources, preferably laser light sources, onto the object to be measured or onto the calibration object, and wherein the light strips are recorded by the same number of video cameras, preferably CCD solid-state cameras, as there are light sources or light strips, wherein a video camera is allocated to each light source, and wherein the video cameras are connected to an evaluation unit comprising a computer, is advantageously characterized by the fact that each light source is designed to emit light having different properties from the light emitted by the other light source(s), and by the fact that the video cameras are designed to receive light only from the respective light source allocated to them.
Especially when laser light is used to illuminate the objects, it is advantageous, for the purpose of the invention, to cause the objects to oscillate at least while they are being illuminated and scanned. In this way, interferences and imaging errors can be eliminated. The amplitude of the oscillation should be greater than half the wavelength of the light illuminating the object, and the period of the oscillation should be equal to or less than the measuring time, the measuring time integration interval, or the electronic shutter speed of the camera, preferably a period of approximately 10 to 30 msec, and in particular approximately 20 msec. The object is actively caused to oscillate at an amplitude and period which differ considerably from the oscillation parameters which occur in manufacturing or processing processes, e.g. in rolling mills. It is particularly advantageous to measure oscillating objects when the objects are work pieces such as turbine blades, etc.
Advantageously, the oscillation component in the direction of the light source and/or video camera, or of the radiated and/or reflected laser light, should possess the indicated values. A total of 100 wavelengths, and preferably 10 wavelengths, is a sensible upper limit as regards the generation of the oscillations and also with regard to the measuring accuracy. The surface roughness is of little significance.
However, in the case of shiny objects, the measuring accuracy can be improved by applying a matte coating to the surface of the objects before carrying out the measurements.
In the following, the invention is explained in more detail on the basis of the drawing. Fig. 3 shows in diagrammatic form an arrangement according to the invention. Figs. 4a and 4b depict diagrammatically the measurement of a complex object, and Fig. 5 depicts the measurement of an object mounted on an oscillating device.
Fig. 3 illustrates the principle of the measuring process. The object to be measured 4 is illuminated by a number of light sources 5, in this case four laser light sources, which emit light of different wavelengths and/or polarization. The light beams of the lasers 5 are spread in one plane by an optical system which is not shown here, so that a bright contour or a bright light strip 6 is projected onto the object to be measured. Each laser 5 creates a bright strip 6, and these generally overlap on the object. The light plane generated by each laser 5 is advantageously oriented at right angles to the longitudinal axis of the object (angle beta); the light planes of the individual lasers 5 are aligned in such a way that as far as possible they all lie in one common plane, so that the light strips generated by the individual lasers overlap each other as far as possible or lie in a defined plane of intersection with the object. This is done in order, right from the start, to eliminate evaluation errors derived from positional inaccuracies.
However, these efforts are hampered by thermal or mechanical maladjustments. The light planes may be arranged at an angle to the longitudinal axis. This is of advantage, for example, when measuring the planarity of metal sheet.
The light sections 6 projected by the lasers onto the object 4 are recorded by solid-state cameras 7 which are tilted at an angle alpha relative to the optical axis or light plane of the laser 5 allocated to the respective camera 7. Each camera 7 is fitted with a filter which permits the passage only of light having the wavelength emitted by the associated laser 5, so that each camera 7 can receive light only from its associated light source 5.
In this way, any influencing of each camera 7 by light emitted by other, non-allocated light sources 5 is totally eliminated. Thus, the light sections or the contour of the strips 6 on the object or on the calibration object can be evaluated with great accuracy.
The invention thus offers the advantage that not only can calibration measurements be performed quickly and accurately, but also the accuracy of the measurements can be enhanced, because light sections can be observed and evaluated without any mutual interference.
It is also possible to activate a single laser light source 5 and its associated camera 7, preferably simultaneously, for the same periods of illumination or observation. Thus, in each case one light-source/camera pair is activated while the other three pairs are left switched off. This eliminates any undesired influence, even when using light of the same wavelength.
The activation times of the respective light-source/camera pairs do not overlap; the individual pairs are activated in sequence, e.g. associated light-source/camera pairs can be activated up to 50 times a second for observation intervals of about 5 ms while still retaining adequate evaluation quality. Corresponding control wires for the light sources 5 or the cameras 7 are denoted by the reference number 11; the control unit 10, which regulates light-dark settings, can be coupled with or can operate with the evaluation unit 9 for the camera signals.
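The time-multiplexed activation described above can be sketched as a simple schedule. This is an illustrative model, not the control unit 10's actual logic; the cycle rate and window length are taken from the text, the structure is an assumption.

```python
# Four light-source/camera pairs, each observed for ~5 ms, cycled at up to
# 50 activations per second per pair, with no two windows overlapping.
N_PAIRS = 4
WINDOW_MS = 5.0      # observation interval per pair (from the text)
CYCLE_HZ = 50        # maximum activations of one pair per second

def schedule(n_cycles):
    """Return (pair_index, start_ms, end_ms) activation windows."""
    period_ms = 1000.0 / CYCLE_HZ   # 20 ms between activations of one pair
    windows = []
    for c in range(n_cycles):
        for p in range(N_PAIRS):
            start = c * period_ms + p * WINDOW_MS
            windows.append((p, start, start + WINDOW_MS))
    return windows

# Verify that no two windows overlap:
w = sorted(schedule(3), key=lambda t: t[1])
assert all(a[2] <= b[1] for a, b in zip(w, w[1:]))
```

With four pairs and 5 ms windows, one full round exactly fills the 20 ms period, which is consistent with the 50 Hz per-pair rate stated above.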
The evaluation unit 9 comprises a computer with which the video signals from the cameras 7 are digitized and stored. The computer selects from the image matrix those image points which depict the light section.
When a measurement is taken, the positional data of the object, the position and alignment of the camera relative to the light plane, and also the focal length of the lens are known, so that the image points which have been found are geometrically rectified and can be calculated back to give the actual coordinates of the object.
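The back-calculation step can be illustrated with a simplified pinhole model: for a camera tilted at angle alpha to the laser plane, one image axis maps directly to position along the light plane, while the other is foreshortened by sin(alpha). This small-field sketch is an assumption for illustration, not the patent's actual rectification procedure; all parameter values are hypothetical.

```python
import math

def image_to_plane(u_mm, v_mm, f_mm, d_mm, alpha_deg):
    """Map an image point to coordinates in the laser plane.

    u_mm: image coordinate along the in-plane axis
    v_mm: image coordinate along the axis foreshortened by the tilt alpha
    f_mm: focal length, d_mm: stand-off distance, alpha_deg: camera angle
    """
    m = f_mm / d_mm                       # pinhole magnification
    X = u_mm / m                          # coordinate along the light plane
    h = v_mm / (m * math.sin(math.radians(alpha_deg)))  # height in the plane
    return X, h

# Example: 50 mm lens at 500 mm stand-off, camera tilted 45 degrees.
X, h = image_to_plane(u_mm=1.0, v_mm=0.5, f_mm=50.0, d_mm=500.0, alpha_deg=45.0)
```

The same displacement on the sensor corresponds to a larger height change at shallow angles, which is why the camera tilt alpha enters the rectification.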
It should be mentioned that, in favourable cases, it is enough to use just two cameras; for objects of round cross section at least three cameras are needed, and when four cameras are used, as shown in Fig. 3, almost all customary extruded sections with convex and concave shapes can be completely measured, as long as no undercuts are present.
The light sources 5 may be white light sources fitted with appropriate colour filters, or lasers and laser diodes with adjustable wavelengths, and similar. Appropriate optical systems are known for creating the very narrow light sections on the object.
The cameras which can be used for this purpose are video cameras, solid-state cameras, especially CCD cameras, or cameras fitted with sensors which respond to certain colours, known as colour cameras. Preferably, CCD cameras are used for imaging. These cameras possess a solid-state sensor comprising a matrix of about 500 x 500 photodiodes which supplies substantially distortion-free images.
In order to evaluate the camera video signals stored in the evaluation unit 9, the image is first scanned for contour starting points by checking the brightness contrasts of the image points. Once corresponding series of connected lines (traverses) have been determined, separately for the individual signals put out by the video cameras, each traverse is transformed into the object coordinates, so that the traverses received from the individual cameras are combined to give the overall structure, and from this the desired dimensions are calculated.
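The contour search by brightness contrast can be sketched as a column-wise scan for the stripe peak. This is a minimal illustration under assumed conditions (one stripe point per column, a simple median background estimate); the threshold and the toy image are invented for the example.

```python
def stripe_points(image, min_contrast=50):
    """Scan each column of an image matrix for the brightest pixel and
    accept it as a stripe point only if it stands out against the
    background by at least min_contrast."""
    n_rows, n_cols = len(image), len(image[0])
    points = []
    for c in range(n_cols):
        column = [image[r][c] for r in range(n_rows)]
        peak = max(column)
        background = sorted(column)[n_rows // 2]  # median as background level
        if peak - background >= min_contrast:
            points.append((c, column.index(peak)))
    return points

# Toy image: a bright stripe crosses columns 1 and 2; columns 0 and 3
# contain only background noise.
toy = [
    [10, 12, 11, 10],
    [11, 200, 11, 12],
    [10, 13, 210, 11],
    [12, 11, 12, 10],
]
```

The accepted points form the raw material for the traverses, which are then transformed into object coordinates as described above.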
The projection or rectification parameters are determined by calibration, for which purpose a calibration body whose exact dimensions are stored in the computer is placed in the measuring field of the cameras.
When the light section of the calibration body is recorded using the method referred to above, the rectification parameters can be calculated from a comparison of the stored dimensions with the measured light sections. In particular, it is important when carrying out this calibration process for the cameras to record clear signals from the light sections allocated to them, without interference from the light sections generated by the other light sources.
In Fig. 3 the reference number 12 denotes the unit used to take the signals received from the individual cameras and process them into traverses. Reference number 13 identifies the unit used to combine the individual traverses into the outline or cross section of the object. Reference number 14 refers to a monitor for displaying the measured object; 15 is a printer for printing out the dimensions of the object or other data obtained by measuring or in some other way, and 16 is a plotter for drawing the evaluated contour of the object examined.
In two sections arranged perpendicular to each other, Figs. 4a and 4b show the measuring of a roller-shaped body 19 with a cylindrical bearing bolt 20. For this measurement, the light from two lasers 5 is projected in two planes (A, B) onto the roller 19, and polarization filters 17 are fitted to the lasers 5 to produce oppositely polarized light; the planes A, B intersect the measured roller 19 in one single plane. It can be seen from the section in Fig. 4b, which runs perpendicular to the section in Fig. 4a, that the cameras 7, which are not shown in Fig. 4a, are arranged at an angle alpha to the respective plane of illumination A, B and are each fitted with a polarization filter 18 which is matched to the direction of polarization of the light of the allocated laser 5. The overlapping range of the two light sources 5, which would cause errors, is designated by the letter "G" in Fig. 4a.
In Fig. 4b only one camera 7 is shown; however, it is understood that two cameras are present, both of which are fitted with polarization filters whose directions of polarization are arranged at right angles to each other, corresponding to the perpendicular orientation of the polarization of the two polarization filters 17 of the lasers 5.
Lasers which directly emit polarized light may also be used, so that the provision of polarization filters on the lasers is avoided. It is also possible to give the light a circular polarization and to arrange appropriate circular polarization filters in front of the cameras.
Fig. 5 shows a spatial arrangement in which the object 4 is illuminated by four lasers and the light sections 6 are observed by means of CCD cameras 7 fitted with filters 8. It can be seen that the optical axes 21 of the lasers 5 and the optical axes 22 of the cameras 7 enclose an angle of, for example, 45°, and the planes generated by the laser light from each laser 5 lie in a common overall plane.
Fig. 5 also shows the object 4 arranged on a vibration device 23, by means of which the object 4 can be caused to oscillate while the measurements are being taken. These oscillations are dimensioned in such a way that in the light-source/object direction they preferentially possess a component or amplitude which is greater than half the smallest wavelength used in the measurement process. In addition, the period of the oscillation should be equal to or less than the measurement integration interval of the cameras (of the sensor element), i.e. it should be about 10 to 30 msec. As a result, the light is integrated over a certain path length and the light signals received by the cameras are uniformized; this is particularly important because laser light can produce interferences on rough or technical surfaces, and these interferences can disrupt the image evaluation, especially in the case of the contrast recognition process. The oscillations of the devices 23 are actively generated by mechanical, electromechanical or electrical oscillators, e.g. ultrasonic heads or electromagnetic oscillators.
It is furthermore advantageous if, in order to eliminate interferences or imaging errors, a linear motion (possibly in addition to the oscillations) is imparted to the objects, e.g. turbine blades, such that within a measuring time, or a period corresponding to the measurement integration interval or the electronic shutter speed of the camera, preferably within a period of 10 to 30 msec, and in particular within a period of approximately 20 msec, the path travelled is greater than half the wavelength of the light from the light source.
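The two motion criteria stated above can be expressed as simple checks: the oscillation amplitude must exceed half the illumination wavelength and its period must not exceed the camera integration interval, while for a linear motion the path travelled within one integration interval must exceed half a wavelength. The wavelength and example values below are illustrative assumptions, not figures from the patent.

```python
def oscillation_ok(amplitude_m, period_s, wavelength_m, integration_s):
    """Amplitude > lambda/2 and period <= camera integration interval."""
    return amplitude_m > wavelength_m / 2 and period_s <= integration_s

def linear_motion_ok(speed_m_s, wavelength_m, integration_s):
    """Path travelled within one integration interval > lambda/2."""
    return speed_m_s * integration_s > wavelength_m / 2

LAMBDA = 633e-9   # assumed He-Ne laser wavelength, for illustration
T_INT = 20e-3     # 20 ms integration interval (from the text)

# A 1 micron amplitude at a 20 ms period satisfies both conditions;
# a 1 mm/s linear motion travels 20 microns per interval, well over lambda/2.
```

Sub-wavelength amplitudes fail the first condition, which is why the text requires the device 23 to generate the motion actively rather than relying on ambient vibration.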
Claims (13)
1. A process for optoelectronically measuring the shape, especially the cross sectional shape, of objects, e.g. work pieces, or for calibrating optoelectronic measuring systems, wherein at least one light strip is projected from each of at least two light sources, preferably laser light sources, onto the object to be measured or onto the calibration object (light section process), and the light strips are recorded by the same number of video cameras, preferably CCD solid-state cameras, as there are light sources, and the camera signals are fed to an evaluation unit comprising a computer for the purpose of image evaluation and for calculating the dimensions of the object or for determining the basic data and parameters for the calibration, and a specific camera is allocated to each light source or to each light strip generated by a light source, characterized in that each light source or each light strip generated by a light source is assigned its own recording camera which responds only to the light from that light source, and further characterized in that in the evaluation unit or in the computer, the signals from the individual cameras are interpreted at first separately from the signals of the other camera(s) with regard to shape and position of the light strip(s).
2. A process according to Claim 1, characterized in that after the signals from the individual cameras have been separately evaluated and the image points of the individual strips have been calculated, the image data and the image points are combined and the dimensions of the object are calculated.
3. A process according to Claim 1 or 2, characterized in that in order to ensure that the light strip projected by each individual light source and reflected by the object is clearly allocated to and recorded by a specific camera, the light strips are generated by the individual light sources by means of light of different wavelength and/or of different polarization, and the camera assigned to the respective light strip(s) responds only to the respective wavelength or polarization of the light emitted by the associated light source.
4. A process according to one of the Claims 1 to 3, characterized in that the light sources and cameras are activated simultaneously to illuminate the object and to record the light strip.
5. A process according to one of the Claims 1 to 4, characterized in that the light strips from various light sources are projected by these light sources onto the object in a temporally non-overlapping sequence, and the individual cameras are activated to record the strips, while the latter are being projected, in accordance with the allocation of each camera to each strip just as the latter is formed.
6. A process, in particular according to one of the Claims 1 to 5, characterized in that the objects are caused to oscillate in order to eliminate interferences or imaging errors at least while the objects are being illuminated or scanned, and the amplitude selected for the oscillation is greater than half the wavelength of the light illuminating the object, and the period of the oscillation is equal to or less than the measuring time, or the measuring time integration interval, or the electronic shutter speed of the camera, and preferably the period is 10 to 30 msec, in particular approximately 20 msec.
7. A process according to one of the Claims 1 to 6, characterized in that in order to eliminate interferences or imaging errors a linear motion is imparted to the objects such that within an interval corresponding to the measuring time or the measuring time integration interval, or the electronic shutter speed of the camera, and preferably an interval of 10 to 30 msec, and in particular approximately 20 msec, the distance travelled is greater than half the wavelength of the light from the light source.
8. An arrangement for optoelectronically measuring the shape, in particular the cross sectional shape, of objects, e.g. work pieces, or for calibrating optoelectronic measuring systems, wherein at least one light strip is projected by each of at least two light sources, preferably laser light sources, onto the object to be measured or onto the calibration object (light section process), and the light strips are recorded by the same number of video cameras, preferably CCD solid-state cameras, as there are light sources, and the cameras are connected to an evaluation unit comprising a computer for the purpose of evaluating the images or calculating the dimensions of the object or determining the basic data and basic parameters for carrying out calibration, and specifically for the purpose of implementing the process according to Claims 1 to 8, characterized in that to each light source (5) or to each light strip generated by a light source is allocated a recording camera (7) which responds only to the light from that light source (5).
9. An arrangement according to Claim 8, characterized in that in order to form their respective light strips (6) the individual light sources (5) are designed to transmit light rays having different wavelengths and/or polarizations, or the times when the light is projected onto the object (4) and the times when the camera (7) allocated to the respective strip(s) (6) is activated are matched to each other, and the projection and recording times of the individual light-source/camera pairs (5, 7) do not overlap each other.
10. An arrangement according to Claim 8 or 9, characterized in that the light sources (5) take the form of light sources fitted with colour filters (8) for different wavelengths of light, and /or laser light sources, e.g. laser diodes, operated at different frequencies, frequency tunable lasers, etc., and further characterized in that each camera (7) is fitted with a filter (8) permitting the transmission of light only at the wavelength of the associated light source (5), or each camera (7) (colour camera) is fitted with a sensor element which is specifically sensitive to the wavelength of the associated light source (5).
11. An arrangement according to one of the Claims 8 to 10, characterized in that the light sources (5) take the form of lasers emitting polarized light or of light sources fitted with polarization filters, and the light sources (5) are designed to emit light rays of different polarization, in particular different circular polarization, and further characterized in that each camera (7) is fitted with a polarization filter which permits transmission only of the polarized light from the associated light source (5).
12. An arrangement according to one of the Claims 8 to 11, characterized in that each light source (5), e.g. a laser light source, can be switched on and off by a switching device (10) to limit the illumination time of the object (4), and the recording time of the camera (7) allocated to this light source (5) is synchronized with the illumination time of the light source (5) by means of the switching device (10).
13. An arrangement in particular according to one of the Claims 8 to 12, characterized in that for the duration of the measurement process, the object (4) to be measured is connected to a vibration device or an oscillation generator (23), e.g. a coupled ultrasonic head, an electromagnet, a mechanical oscillator or similar, by means of which a velocity is imparted to the object (4) such that the object (4) travels a distance equal to at least half the wavelength of the light illuminating the object, within a time interval which is equal to the measuring time or to the measuring time integration interval or to the electronic shutter speed of the camera (7), and preferably an interval of approximately 10 to 30 msec, but in particular approximately 20 msec.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
ATA2764/89 | 1989-12-05 | ||
AT276489 | 1989-12-05 |
Publications (1)
Publication Number | Publication Date |
---|---|
CA2070822A1 true CA2070822A1 (en) | 1991-06-06 |
Family
ID=3539929
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CA002070822A Abandoned CA2070822A1 (en) | 1989-12-05 | 1990-12-05 | Process and arrangement for optoelectronic measuring of objects |
Country Status (8)
Country | Link |
---|---|
EP (1) | EP0502930B1 (en) |
JP (1) | JPH05502720A (en) |
KR (1) | KR920704092A (en) |
AT (1) | ATE109556T1 (en) |
AU (1) | AU647064B2 (en) |
CA (1) | CA2070822A1 (en) |
DE (1) | DE59006727D1 (en) |
WO (1) | WO1991008439A1 (en) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8064072B2 (en) | 2006-12-15 | 2011-11-22 | Fraunhofer-Gesellschaft Zur Foerderung Der Angewandten Forschung E.V. | Method and apparatus for thickness measurement |
US9285213B2 (en) | 2009-10-27 | 2016-03-15 | Formax, Inc. | Automated product profiling apparatus and product slicing system using the same |
EP1782929B2 (en) † | 1999-04-20 | 2019-12-25 | Formax, Inc. | Automated product profiling apparatus |
Families Citing this family (26)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE4208455A1 (en) * | 1992-03-17 | 1993-09-23 | Peter Dr Ing Brueckner | Contactless three=dimensional measurement e.g of teeth - using opto-electronic measurement in two dimensions and rotative or translation in third dimension and combining w.r.t actual position of measurement planes |
DE4301538A1 (en) * | 1992-03-17 | 1994-07-28 | Peter Dr Ing Brueckner | Method and arrangement for contactless three-dimensional measurement, in particular for measuring denture models |
US5383021A (en) * | 1993-04-19 | 1995-01-17 | Mectron Engineering Company | Optical part inspection system |
DE4411986A1 (en) * | 1994-04-10 | 1995-10-12 | Kjm Ges Fuer Opto Elektronisch | System for measuring cross-sectional accuracy of bar-shaped material |
US6175415B1 (en) * | 1997-02-19 | 2001-01-16 | United Technologies Corporation | Optical profile sensor |
IT1292544B1 (en) * | 1997-04-10 | 1999-02-08 | Microtec Srl | DEVICE FOR MEASURING THE DIMENSIONS OF A VERY LONGITUDINALLY EXTENDED OBJECT WITH A CURVED CONTOUR CROSS SECTION. |
DE19749435B4 (en) * | 1997-11-09 | 2005-06-02 | Mähner, Bernward | Method and device for the three-dimensional, areal, optical measurement of objects |
US6094269A (en) * | 1997-12-31 | 2000-07-25 | Metroptic Technologies, Ltd. | Apparatus and method for optically measuring an object surface contour |
US6636310B1 (en) | 1998-05-12 | 2003-10-21 | Metroptic Technologies, Ltd. | Wavelength-dependent surface contour measurement system and method |
EP2306228A1 (en) | 1998-05-25 | 2011-04-06 | Panasonic Corporation | Range finder device and camera |
US6285034B1 (en) * | 1998-11-04 | 2001-09-04 | James L. Hanna | Inspection system for flanged bolts |
EP1044770A1 (en) * | 1999-04-15 | 2000-10-18 | Hermann Wein GmbH & Co. KG, Schwarzwäder Schinkenräucherei | Method and apparatus for cutting pieces of predetermined weight out of smoked ham |
US6882434B1 (en) | 1999-04-20 | 2005-04-19 | Formax, Inc. | Automated product profiling apparatus and product slicing system using same |
DE10044032A1 (en) * | 2000-09-06 | 2002-03-14 | Deutsche Telekom Ag | 3-D vision |
WO2003027608A1 (en) * | 2001-09-28 | 2003-04-03 | Sergei Vasilievich Plotnikov | System for process control of parameters of rotating workpieces |
EP1391690B1 (en) * | 2002-08-14 | 2015-07-01 | 3D Scanners Ltd | Optical probe for measuring features of an object and methods therefor |
US7009717B2 (en) | 2002-08-14 | 2006-03-07 | Metris N.V. | Optical probe for scanning the features of an object and methods therefor |
DE102004057092A1 (en) * | 2004-11-25 | 2006-06-01 | Hauni Maschinenbau Ag | Measuring the diameter of rod-shaped articles of the tobacco processing industry |
DE102005054658A1 (en) * | 2005-11-16 | 2007-05-24 | Sick Ag | Method for automatically paramenting measuring systems |
DE102005058873A1 (en) * | 2005-12-09 | 2007-06-14 | Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. | Device and method for measuring the surface of a body |
DE102006040612B4 (en) * | 2006-08-30 | 2011-12-08 | Z-Laser Optoelektronik Gmbh | Distance measuring device |
DE102007063041A1 (en) * | 2007-12-28 | 2009-07-02 | Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. | Laser light section arrangement for determining e.g. elevation profile of object, has image processing device for identifying and separating laser sectional lines from each other in recorded image, and assigning lines to respective lasers |
DE102010011217A1 (en) * | 2010-03-11 | 2011-09-15 | Salzgitter Mannesmann Line Pipe Gmbh | Method and device for measuring the profile geometry of spherically curved, in particular cylindrical bodies |
DE102012102649A1 (en) * | 2012-03-27 | 2013-10-02 | Uwe Reifenhäuser | Method and device for weight-accurate cutting of a food strand |
EP3865813A1 (en) * | 2020-02-15 | 2021-08-18 | Hewlett-Packard Development Company, L.P. | Scanning of objects |
CN113551611B (en) * | 2021-06-15 | 2022-04-22 | 西安交通大学 | Stereo vision measuring method, system, equipment and storage medium for large-size moving object |
Family Cites Families (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4183672A (en) * | 1977-11-26 | 1980-01-15 | United Technologies Corporation | Optical inspection system employing spherical mirror |
SE412286B (en) * | 1978-07-10 | 1980-02-25 | Saab Scania Ab | SET AND DEVICE FOR PHOTOELECTRIC DIMENSION OF WIDES OR SIMILAR GEOMETRICALLY DETERMINED FORMS |
US4279513A (en) * | 1979-04-02 | 1981-07-21 | Sangamo Weston, Inc. | Optical inspection system for large parts and for multiple measurements |
US4297513A (en) * | 1979-05-16 | 1981-10-27 | Ciba-Geigy Corporation | Process for the preparation of benzophenone thioethers |
JPS57159148A (en) * | 1981-03-25 | 1982-10-01 | Fujitsu Ltd | Adaptive modulation system |
US4756007A (en) * | 1984-03-08 | 1988-07-05 | Codex Corporation | Adaptive communication rate modem |
DE3511347C2 (en) * | 1985-03-28 | 1987-04-09 | Gerd Dipl.-Phys. Dr. 8520 Erlangen Häusler | Method and device for optical multidimensional shape detection of an object |
ATE64196T1 (en) * | 1985-08-28 | 1991-06-15 | Igm Ind Geraete Maschf Gmbh | METHOD OF DETECTING THE POSITION AND GEOMETRY OF WORKPIECE SURFACES. |
DE3712513A1 (en) * | 1987-04-13 | 1988-11-03 | Roth Electric Gmbh | METHOD AND DEVICE FOR DETECTING SURFACE DEFECTS |
GB8719951D0 (en) * | 1987-08-24 | 1987-09-30 | Lbp Partnership | Three-dimensional scanner |
-
1990
- 1990-12-05 CA CA002070822A patent/CA2070822A1/en not_active Abandoned
- 1990-12-05 AU AU69000/91A patent/AU647064B2/en not_active Ceased
- 1990-12-05 JP JP3500895A patent/JPH05502720A/en active Pending
- 1990-12-05 WO PCT/AT1990/000117 patent/WO1991008439A1/en active IP Right Grant
- 1990-12-05 DE DE59006727T patent/DE59006727D1/en not_active Expired - Fee Related
- 1990-12-05 AT AT91900122T patent/ATE109556T1/en not_active IP Right Cessation
- 1990-12-05 KR KR1019920701336A patent/KR920704092A/en not_active Application Discontinuation
- 1990-12-05 EP EP91900122A patent/EP0502930B1/en not_active Expired - Lifetime
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP1782929B2 (en) † | 1999-04-20 | 2019-12-25 | Formax, Inc. | Automated product profiling apparatus |
US8064072B2 (en) | 2006-12-15 | 2011-11-22 | Fraunhofer-Gesellschaft Zur Foerderung Der Angewandten Forschung E.V. | Method and apparatus for thickness measurement |
US8228488B2 (en) | 2006-12-15 | 2012-07-24 | Fraunhofer-Gesellschaft Zur Foerderung Der Angewandten Forschung E.V. | Method and apparatus for thickness measurement |
US9285213B2 (en) | 2009-10-27 | 2016-03-15 | Formax, Inc. | Automated product profiling apparatus and product slicing system using the same |
US9888696B2 (en) | 2009-10-27 | 2018-02-13 | Formax, Inc. | Automated product profiling apparatus and product slicing system using the same |
Also Published As
Publication number | Publication date |
---|---|
EP0502930A1 (en) | 1992-09-16 |
EP0502930B1 (en) | 1994-08-03 |
AU6900091A (en) | 1991-06-26 |
WO1991008439A1 (en) | 1991-06-13 |
DE59006727D1 (en) | 1994-09-08 |
ATE109556T1 (en) | 1994-08-15 |
KR920704092A (en) | 1992-12-19 |
AU647064B2 (en) | 1994-03-17 |
JPH05502720A (en) | 1993-05-13 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CA2070822A1 (en) | Process and arrangement for optoelectronic measuring of objects | |
US5193120A (en) | Machine vision three dimensional profiling system | |
US6268923B1 (en) | Optical method and system for measuring three-dimensional surface topography of an object having a surface contour | |
US9170097B2 (en) | Hybrid system | |
KR100615576B1 (en) | Three-dimensional image measuring apparatus | |
US7679723B2 (en) | Measuring device and method that operates according to the basic principles of confocal microscopy | |
CN201974159U (en) | Contour sensor with MEMS reflector | |
JP7385743B2 (en) | equipment and equipment | |
US6765606B1 (en) | Three dimension imaging by dual wavelength triangulation | |
AU8869191A (en) | Process and device for the opto-electronic measurement of objects | |
US5379106A (en) | Method and apparatus for monitoring and adjusting the position of an article under optical observation | |
JPH06137826A (en) | Shape detecting method and its device | |
US10753734B2 (en) | Device, method and system for generating dynamic projection patterns in a confocal camera | |
US6529283B1 (en) | Method for measuring the width of a gap | |
JP4188515B2 (en) | Optical shape measuring device | |
CA2334225C (en) | Method and device for opto-electrical acquisition of shapes by axial illumination | |
CN105758294A (en) | Interference objective lens and light interference measuring device | |
JP2000111490A (en) | Detection apparatus for coating face | |
US5414517A (en) | Method and apparatus for measuring the shape of glossy objects | |
US5940566A (en) | 3D array optical displacement sensor and method using same | |
US11933597B2 (en) | System and method for optical object coordinate determination | |
JP2002131017A (en) | Apparatus and method of distance measurement | |
JPS6288906A (en) | Measuring method for solid shape | |
US7046348B2 (en) | Device for measuring color, luster and undulations on lacquered freeform surfaces | |
KR101677585B1 (en) | 3-D Shape Measuring Apparatus Using Multi-Frequency Light Source For High-Speed Focus Position Movement |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| FZDE | Discontinued | |