
CN118371857B - Real-time vision and characteristic spectrum laser welding interconnection monitoring method and device - Google Patents

Real-time vision and characteristic spectrum laser welding interconnection monitoring method and device

Info

Publication number
CN118371857B
CN118371857B (application CN202410797266.7A)
Authority
CN
China
Prior art keywords
data
defect
light
characteristic spectrum
welding
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202410797266.7A
Other languages
Chinese (zh)
Other versions
CN118371857A (en)
Inventor
鲍桥梁
王文兵
温生文
吴炳乾
李洪海
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nanjing Kenai Laser Technology Co ltd
Original Assignee
Nanjing Kenai Laser Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nanjing Kenai Laser Technology Co ltd filed Critical Nanjing Kenai Laser Technology Co ltd
Priority to CN202410797266.7A priority Critical patent/CN118371857B/en
Publication of CN118371857A publication Critical patent/CN118371857A/en
Application granted granted Critical
Publication of CN118371857B publication Critical patent/CN118371857B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B23MACHINE TOOLS; METAL-WORKING NOT OTHERWISE PROVIDED FOR
    • B23KSOLDERING OR UNSOLDERING; WELDING; CLADDING OR PLATING BY SOLDERING OR WELDING; CUTTING BY APPLYING HEAT LOCALLY, e.g. FLAME CUTTING; WORKING BY LASER BEAM
    • B23K26/00Working by laser beam, e.g. welding, cutting or boring
    • B23K26/20Bonding
    • B23K26/21Bonding by welding
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B23MACHINE TOOLS; METAL-WORKING NOT OTHERWISE PROVIDED FOR
    • B23KSOLDERING OR UNSOLDERING; WELDING; CLADDING OR PLATING BY SOLDERING OR WELDING; CUTTING BY APPLYING HEAT LOCALLY, e.g. FLAME CUTTING; WORKING BY LASER BEAM
    • B23K26/00Working by laser beam, e.g. welding, cutting or boring
    • B23K26/70Auxiliary operations or equipment
    • B23K26/702Auxiliary equipment
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/21Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/213Feature extraction, e.g. by transforming the feature space; Summarisation; Mappings, e.g. subspace methods
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/24Classification techniques
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/25Fusion techniques
    • G06F18/253Fusion techniques of extracted features
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/0464Convolutional networks [CNN, ConvNet]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/048Activation functions
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/08Learning methods
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/40Extraction of image or video features
    • G06V10/44Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/70Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/764Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/70Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/82Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2218/00Aspects of pattern recognition specially adapted for signal processing
    • G06F2218/08Feature extraction
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2218/00Aspects of pattern recognition specially adapted for signal processing
    • G06F2218/12Classification; Matching
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02PCLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/30Computing systems specially adapted for manufacturing

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • Artificial Intelligence (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • General Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Software Systems (AREA)
  • Computing Systems (AREA)
  • Computational Linguistics (AREA)
  • Multimedia (AREA)
  • Molecular Biology (AREA)
  • Biophysics (AREA)
  • Biomedical Technology (AREA)
  • Optics & Photonics (AREA)
  • Evolutionary Biology (AREA)
  • Mathematical Physics (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Medical Informatics (AREA)
  • Databases & Information Systems (AREA)
  • Plasma & Fusion (AREA)
  • Mechanical Engineering (AREA)
  • Investigating Materials By The Use Of Optical Means Adapted For Particular Applications (AREA)

Abstract

The invention relates to a real-time vision and characteristic spectrum laser welding interconnection monitoring method and device. The real-time vision and characteristic spectrum laser welding interconnection monitoring method comprises the following steps: (1) visual information and characteristic spectrum intensity electric signals of the welding site are collected in time sequence; (2) the collected visual information and characteristic spectrum intensity electric signals are each preprocessed, and the preprocessed data are then fused; (3) the fused data are input into a neural network analysis module, inference is carried out, an interconnection monitoring result is output, and a defect label is attached. By adopting a neural network analysis module that integrates spectral energy and vision to monitor laser welding quality, the accuracy of defect judgment in welding monitoring is improved and contradictions between different data sources are resolved.

Description

Real-time vision and characteristic spectrum laser welding interconnection monitoring method and device
Technical Field
The invention relates to laser welding regulation and control, in particular to a real-time vision and characteristic spectrum laser welding interconnection monitoring method and device.
Background
Laser welding exploits the excellent directivity, high power density and other characteristics of laser beams: an optical system focuses the beam onto a small area, forming a highly concentrated heat source at the joint in an extremely short time, so that the welded object melts and forms a firm weld spot and weld seam.
Owing to the interaction between the laser beam and the material, a molten pool forms in the processing area of the workpiece and radiates multiple characteristic spectrum signals, such as a visible light signal, a reflected light signal and an infrared heat radiation signal. Numerous studies have shown that these characteristic spectrum signals are closely related to laser processing quality. If humps, lack of penetration, spatter, contamination or other defects occur during laser processing, the radiated signals show different characteristics in different wave bands. Moreover, welds of different quality exhibit different appearances. Therefore, the weld quality can be monitored and judged in real time using the radiated light signals together with visual imaging.
The welding quality of a welded workpiece is affected by factors such as the workpiece material, beam quality, the degree of contact between workpieces and the auxiliary gas flow; once one of these factors changes, defects are easily produced and the quality of the whole product is affected. Therefore, how to detect defects quickly and accurately and diagnose their cause on an industrial site has long been a pain point for the industry.
The current ways of detecting laser welding defects include: manual visual inspection of the appearance, CCD photographing inspection after welding, in-welding energy detection, airtightness inspection of sealed workpieces, magnetic particle inspection, ultrasonic inspection, eddy-current inspection, penetrant inspection, magneto-optical imaging inspection, infrared inspection, structured-light visual inspection and the like. Constrained by production takt and cost, the detection modes most used on industrial sites are manual visual inspection of the appearance, CCD photographing detection after welding and in-welding energy detection. In industries such as automobiles and lithium batteries, destructive post-inspection is often adopted: samples are drawn from large batches and subjected to microscopic structural analysis of the fracture to judge the stability of the welding process and thus infer the welding quality of the same batch. These detection methods all have drawbacks. At present, combined monitoring modes that integrate spectral energy and vision exist, but such modes merely combine the two monitoring results; the results are not deeply fused, so judgments of welding defects are sometimes inaccurate and contradictory results can be obtained.
Disclosure of Invention
The invention addresses the problems described in the background through a real-time vision and characteristic spectrum laser welding interconnection monitoring method and device. By using a neural network analysis module that integrates spectral energy and vision to monitor laser welding quality, the accuracy of defect judgment during welding is improved and contradictions between different data sources are resolved; by regulating and controlling the laser welding, welding quality and the yield of workpieces are improved.
The application adopts the following scheme: a real-time vision and characteristic spectrum laser welding interconnection monitoring method comprises the following steps:
(1) Visual information and characteristic spectrum intensity electric signals of a welding part are collected according to a time sequence;
(2) Respectively preprocessing the acquired visual information and characteristic spectrum intensity electric signals, and then fusing the preprocessed data;
(3) And inputting the fused data into a neural network analysis module, carrying out reasoning calculation, outputting an interconnection monitoring result, and labeling the defect label. The interconnection monitoring result is defective or non-defective; if the defect exists, marking a defect label, wherein the information contained in the defect label comprises the following steps: defect grade, defect type, and cause; if the defect is not detected, marking the defect label as qualified.
Further, the visual information comprises n pictures arranged according to a time sequence, the data dimension of each picture is [ color, h, w ], wherein color is the number of channels of RGB colors of the picture, and h and w are the resolution in the height direction and the resolution in the width direction of the resolution of the picture corresponding to any color channel respectively;
The characteristic spectrum intensity electric signals are stored in a time sequence mode, the dimension is [ light, M ], wherein light represents the light path number of characteristic light, M is the number of time sequence data of the characteristic spectrum intensity electric signals corresponding to any path of characteristic light, and the characteristic light comprises visible light, infrared light and reflected light.
Further, the data preprocessing of the visual information specifically includes:
Splicing the n pictures according to the acquired time sequence, from left to right and top to bottom, into N rows and N columns to form a large picture with data dimension [color, N*h, N*w], where N = n^(1/2) rounded up to the nearest integer; when N^2 is larger than n after rounding up, the vacant cells of the large picture are filled with blanks; then at least one max-pooling operation is carried out, and the dimension of the pooled picture data is [color, H, W];
The pooling window size is kh×kw and the stride is sh×sw; the height and width dimensions of the output data after one max-pooling operation can be calculated by the following formulas:
H=(N*h-kh)/sh+1;
W=(N*w-kw)/sw+1;
If multiple max-pooling operations are needed, the H and W values calculated in the previous pooling take the place of N*h and N*w in the next pooling; this is conventional in the art and is not described further here.
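For illustration only, the following is a minimal numpy sketch of the splicing and pooling step just described; the function name, array layout and default pooling parameters are assumptions made for the example, not values fixed by the method.

```python
import numpy as np

def splice_and_pool(pictures, kh=4, kw=4, sh=4, sw=4):
    """Tile n pictures of shape [color, h, w] into an N x N grid
    (N = ceil(sqrt(n)), vacant cells left as zeros), then apply one
    max-pooling pass with window kh x kw and stride sh x sw."""
    n = len(pictures)
    color, h, w = pictures[0].shape
    N = int(np.ceil(np.sqrt(n)))
    big = np.zeros((color, N * h, N * w), dtype=pictures[0].dtype)
    for idx, pic in enumerate(pictures):        # left to right, top to bottom
        r, c = divmod(idx, N)
        big[:, r * h:(r + 1) * h, c * w:(c + 1) * w] = pic
    H = (N * h - kh) // sh + 1                  # output height, as in the formula above
    W = (N * w - kw) // sw + 1                  # output width
    pooled = np.empty((color, H, W), dtype=big.dtype)
    for i in range(H):
        for j in range(W):
            window = big[:, i * sh:i * sh + kh, j * sw:j * sw + kw]
            pooled[:, i, j] = window.max(axis=(1, 2))
    return pooled
```

A second pooling pass, if needed, simply feeds the pooled output back in as a new large picture, as described above.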
The data preprocessing of the characteristic spectrum intensity electric signal comprises the following steps:
setting a threshold value, and if, in the time series data of any one of the three paths of characteristic spectrum intensity electric signals, the data at a certain moment is smaller than the threshold value, synchronously deleting the data at that moment in the other two paths of time series data; if more than m data points remain after deletion, intercepting the first m data points; if fewer than m remain, appending zeros at the end until m data points are reached; thereby reducing the dimension of the time series data from [light, M] to [light, m].
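A minimal sketch of this spectral-signal preprocessing follows, assuming the three channels are stored row-wise in a numpy array of shape [light, M]; the function and variable names are illustrative only.

```python
import numpy as np

def preprocess_spectrum(signals, threshold, m):
    """signals: array of shape [light, M] holding the three characteristic-light channels.
    Every time step at which any channel falls below the threshold is deleted from all
    channels synchronously; the result is truncated to the first m samples or zero-padded
    at the end up to m, giving an array of shape [light, m]."""
    keep = (signals >= threshold).all(axis=0)   # keep a time step only if every channel passes
    kept = signals[:, keep]
    if kept.shape[1] >= m:
        return kept[:, :m]                      # intercept the first m data points
    pad = np.zeros((signals.shape[0], m - kept.shape[1]), dtype=signals.dtype)
    return np.concatenate([kept, pad], axis=1)  # append zeros up to m
```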
kh, kw, sh and sw can be set according to the value of m, so as to facilitate the subsequent fusion.
Further, the fusion method comprises the following steps:
Carrying out dimension lifting on the pooled picture data by adopting a reshape function of Python, and changing the [ color, H, W ] into [ k, color, H, W ];
Performing dimension lifting on the time series data of the simplified characteristic spectrum intensity electric signals by adopting a reshape function of Python, and changing from [ light, m ] to [ k, light, H, m/(kH) ];
The number of RGB color channels is 3, the same as the number of light paths of the characteristic light;
And splicing the dimension-lifted picture data and the dimension-lifted time series data of the characteristic spectrum intensity electric signal by adopting the concatenate function of Python to obtain fused data, wherein the dimension is [k, 3, H, W+m/(k*H)].
Further, the neural network analysis module adopts a neural network multi-classification model, and training of the neural network multi-classification model comprises the following steps:
(1) Constructing an initial neural network multi-classification model:
(2) Training an initial neural network multi-classification model by adopting a sample data set of characteristic spectrum intensity electric signals and visual information to obtain a trained neural network multi-classification model; the sample data set is marked with a defect label, and the information contained in the defect label comprises: the defect grade, the defect type and the cause, and the defect-free sample defect label is qualified; the characteristic spectrum intensity electric signal and the visual information are used as the input of the model, and the defect label is used as the output of the model.
Further, the training steps of the neural network multi-classification model are specifically as follows:
constructing a multi-input multi-output neural network multi-classification model by using a deep learning framework;
The input layer of the model comprises two input channels which are respectively used for inputting characteristic spectrum intensity electric signals and visual information; each channel has a data preprocessing step;
Defining a plurality of output neurons in an output layer of the model, wherein each neuron corresponds to one welding defect label, and outputting the probability of each defect label by using a softmax activation function;
collecting a sample data set containing characteristic spectrum intensity electric signals and visual information, and classifying and marking the defect type in each sample so as to generate a defect label; the information contained in the defect label includes: the defect grade, the defect type and the cause, and for defect-free samples the defect label is qualified; the sample data set is divided evenly into 10 parts, 5 to 9 of which are taken as the training set and the remainder as the test set; depending on the sample data set, the training effect is best when the test set accounts for 10% to 50%.
Training a neural network multi-classification model using the training set; in the training process, the model fuses the characteristic spectrum intensity electric signals and visual information to identify different types of welding defects;
Evaluating the performance of the model on the test set, using precision and recall; Precision: the ratio of the number of samples the model correctly identifies as positive to the total number of samples it identifies as positive;
Precision=TP/[TP+FP];
Recall: the ratio of the number of positive samples successfully detected by the model to the number of actual positive samples;
Recall=TP/[TP+FN];
Wherein:
TP is the number of true positives (True Positives), i.e. samples the model correctly identifies as positive;
FP is the number of false positives (False Positives), i.e. samples the model incorrectly identifies as positive;
FN is the number of false negatives (False Negatives), i.e. samples the model incorrectly identifies as negative;
The model parameters are adjusted according to the performance on the test set to improve the generalization capability of the model, and the method comprises the following steps:
1) Adjusting the model hyperparameters: according to the performance on the test set, the hyperparameters of the model are adjusted, such as the learning rate, the number of network layers (layers) and the regularization term (regularization_term);
2) Retraining the model: after the hyperparameters are adjusted, the neural network multi-classification model is trained again using the training set;
3) Evaluating again: after the hyperparameters are adjusted and the model retrained, the test set is used again to evaluate the performance of the neural network multi-classification model; training is stopped when precision (Precision) and recall (Recall) are both greater than 95% or performance no longer improves.
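The patent does not name a specific deep learning framework or network architecture. As one possible realization only, a two-input multi-classification model of the kind described could be sketched in PyTorch as follows; the layer sizes and branch structure are assumptions made for illustration.

```python
import torch
import torch.nn as nn

class WeldDefectClassifier(nn.Module):
    """Illustrative two-branch model: one branch for the preprocessed picture data,
    one for the spectral time series; the fused features feed an output layer with
    one neuron per defect label and softmax probabilities."""
    def __init__(self, num_labels):
        super().__init__()
        self.img_branch = nn.Sequential(                  # input: [batch, 3, H, W]
            nn.Conv2d(3, 16, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten())
        self.sig_branch = nn.Sequential(                  # input: [batch, 3, m]
            nn.Conv1d(3, 16, kernel_size=7, stride=4), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1), nn.Flatten())
        self.head = nn.Linear(32 + 16, num_labels)

    def forward(self, images, signals):
        feat = torch.cat([self.img_branch(images), self.sig_branch(signals)], dim=1)
        return torch.softmax(self.head(feat), dim=1)      # probability of each defect label
```

In practice such a model would be trained with a cross-entropy loss computed on these probabilities (or on the pre-softmax logits).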
A device for monitoring interconnection of real-time vision and characteristic spectrum laser welding, at least comprising:
the acquisition device is used for acquiring the characteristic spectrum intensity electric signals according to the time sequence and acquiring visual information according to the time sequence;
The data processing module is used for respectively preprocessing the acquired visual information and the characteristic spectrum intensity electric signals, and then fusing the preprocessed data;
The data analysis module is used for inputting the fused data into the neural network analysis module, carrying out reasoning calculation and outputting an interconnection monitoring result.
In some embodiments, the system further comprises a laser welding control module for regulating and controlling the laser welding process according to the interconnection monitoring result.
Further, the acquisition module comprises a characteristic spectrum intensity monitoring module and a visual information monitoring module, wherein the characteristic spectrum intensity monitoring module comprises an optical path calibration module, a focusing module and a first light splitting module which are sequentially arranged according to an optical path; the first light splitting module divides the light path into two parts, wherein one light path is sequentially provided with a first light filtering module and a first photoelectric conversion module, the other light path is provided with a second light splitting module, the second light splitting module divides the light path into two parts, one light path is sequentially provided with a second light filtering module and a second photoelectric conversion module, and the other light path is sequentially provided with a third light filtering module and a third photoelectric conversion module;
The optical path calibration module is used for enhancing the intensity of the input characteristic spectrum signal; the focusing module is used for focusing the enhanced characteristic spectrum signals; the first light splitting module and the second light splitting module are used for splitting the focused characteristic spectrum signals into visible light, infrared light and reflected light; the first filtering module, the second filtering module and the third filtering module are used for respectively filtering the three paths of divided light to obtain spectrums of required characteristic wave bands; the first photoelectric conversion module, the second photoelectric conversion module and the third photoelectric conversion module are used for converting the spectrum of the filtered characteristic wave band into an electric signal;
the visual information monitoring module comprises a camera for collecting pictures and an auxiliary light source for illuminating the welding part.
A method of regulating a laser welding process, comprising:
s1, executing the real-time vision and characteristic spectrum laser welding interconnection monitoring method to obtain a defect label;
s2, regulating and controlling a laser welding process according to the defect label, wherein the process comprises the following steps of:
If the defect label is slight defect-hole-moisture, welding is stopped and pre-cleaning is performed;
if the defect label is slight defect-hole-greasy dirt, welding is stopped and pre-cleaning is performed;
If the defect label is slight defect-poor weld formation-slight workpiece deformation, the welding path is modified to avoid the deformed area;
If the defect label is moderate defect-cold joint-power drop, the laser welding power is increased;
if the defect label is moderate defect-spatter-excessive power, the laser welding power is reduced;
if the defect label is serious defect-large-area incomplete welding-power too low, welding is stopped, the laser welding parameters are reset, and welding is then resumed;
if the defect label is serious defect-large crack-improper process parameters, welding is stopped, the laser welding parameters are reset, and welding is then resumed;
if the defect label is unqualified, welding is stopped and the workpiece is rejected;
if the defect label is qualified, the current laser welding parameters are kept unchanged.
An electronic device, comprising:
One or more processors;
A memory for storing one or more programs;
the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the real-time vision and characteristic spectrum laser welding interconnection monitoring method described above.
A computer readable storage medium having stored thereon computer instructions which when executed by a processor perform the steps of the real-time vision and characteristic spectrum laser welding interconnection monitoring method described above.
The invention has the beneficial effects that: by adopting a neural network analysis module that integrates spectral energy and vision to monitor laser welding quality, the accuracy of defect judgment during in-welding monitoring is improved, contradictions between different data sources are resolved, and the judgment is quick and accurate.
Drawings
FIG. 1 is a flow chart of a real-time vision and characteristic spectrum laser welding interconnection monitoring method of the present invention;
FIG. 2 is a schematic block diagram of an apparatus for real-time visual and characteristic spectrum laser welding interconnection monitoring in accordance with the present invention;
FIG. 3 is a visual information picture for the defect label slight defect-hole-greasy dirt in the real-time vision and characteristic spectrum laser welding interconnection monitoring method of the invention;
FIG. 4 is a graph of the characteristic spectrum intensity electric signals for the defect label slight defect-hole-greasy dirt in the real-time vision and characteristic spectrum laser welding interconnection monitoring method of the invention;
FIG. 5 is a visual information picture for the defect label moderate defect-cold joint-power drop in the real-time vision and characteristic spectrum laser welding interconnection monitoring method of the invention;
FIG. 6 is a graph of the characteristic spectrum intensity electric signals for the defect label moderate defect-cold joint-power drop in the real-time vision and characteristic spectrum laser welding interconnection monitoring method of the invention;
FIG. 7 is a visual information picture for the defect label moderate defect-spatter-excessive power in the real-time vision and characteristic spectrum laser welding interconnection monitoring method of the invention;
FIG. 8 is a graph of the characteristic spectrum intensity electric signals for the defect label moderate defect-spatter-excessive power in the real-time vision and characteristic spectrum laser welding interconnection monitoring method of the invention;
FIG. 9 is a schematic illustration of a pre-processing of a real-time vision and characteristic spectrum laser welding interconnection monitoring method of the present invention;
The device comprises a characteristic spectrum intensity monitoring module 1, an optical path calibration module 1-1, a focusing module 1-2, a first light splitting module 1-3, a first light filtering module 1-3-1, a first photoelectric conversion module 1-3-2, a second light splitting module 1-4, a second light filtering module 1-4-1, a second photoelectric conversion module 1-4-2, a third light filtering module 1-4-3, a third photoelectric conversion module 1-4-4, a visual information monitoring module 2, a camera 2-1, an auxiliary light source 2-2, a data processing module 3, a data analysis module 4, a laser welding control module 5, a welding galvanometer system 6 and a workpiece 7.
Detailed Description
In order to further describe the technical means and effects adopted by the present invention for achieving the intended purpose, the following detailed description will refer to the specific implementation, structure, characteristics and effects according to the present invention with reference to the accompanying drawings and preferred embodiments.
By adopting the neural network analysis module to integrate spectral energy and vision to monitor laser welding quality, the accuracy of defect judgment in welding monitoring is improved and contradictions between different data sources are resolved; by regulating and controlling the laser welding, welding quality and the yield of workpieces are improved.
Referring to fig. 1, a real-time vision and characteristic spectrum laser welding interconnection monitoring method comprises the following steps:
(1) Visual information and characteristic spectrum intensity electric signals of a welding part are collected according to a time sequence;
(2) Respectively preprocessing the acquired visual information and characteristic spectrum intensity electric signals, and then fusing the preprocessed data;
(3) Inputting the fused data into a neural network analysis module, carrying out reasoning calculation, and outputting an interconnection monitoring result, wherein the interconnection monitoring result is defective or non-defective; if the defect exists, marking a defect label, wherein the information contained in the defect label comprises the following steps: defect grade, defect type, and cause; if the defect is not detected, marking the defect label as qualified.
In the application, visual information comprises 100 pictures arranged according to a time sequence, wherein the data dimension of each picture is [ color, h, w ], color is the number of channels of RGB colors of the picture, color=3, h and w are the resolution in the height direction and the resolution in the width direction of the picture resolution corresponding to any color channel, and h=1200, w=1100.
In the application, the characteristic spectrum intensity electric signals are stored in a time sequence form, and the dimension is [ light, M ], wherein light represents the light path number of characteristic light, light=3, M represents the time sequence data number M=50000 of the characteristic spectrum intensity electric signals corresponding to any path of characteristic light, and the characteristic light comprises visible light, infrared light and reflected light.
In the application, the data preprocessing of the visual information is specifically as follows:
Splicing 100 pictures according to the acquired time sequence, and splicing 10 rows and 10 columns from left to right from top to bottom to form a large picture with data dimension of [3, 12000, 11000 ]; then, carrying out twice maximum pooling treatment, wherein the dimension of the pooled picture data is [3, 600, 550];
specifically, the pooling window size is kh×kw and the stride is sh×sw; in the first pooling, kh, kw, sh and sw are all taken as 5, and the height and width dimensions of the output data after the max-pooling operation can be calculated by the following formulas:
H=(10*1200-kh)/sh+1;
W=(10*1100-kw)/sw+1;
H=2400,W=2200;
In the second pooling, the pooling window size is kh×kw and the stride is sh×sw, with kh, kw, sh and sw all equal to 4; the height and width dimensions of the output data after the max-pooling operation can be calculated by the following formulas:
H=(2400-4)/4+1;
W=(2200-4)/4+1;
H=600,W=550。
The data preprocessing of the characteristic spectrum intensity electric signal comprises the following steps:
Setting a threshold value of 0.3, and if, in the time series data of any one of the three paths of characteristic spectrum intensity electric signals, the data at a certain moment is smaller than 0.3, synchronously deleting the data at that moment in the other two paths of time series data; if more than 30000 data points remain after deletion, intercepting the first 30000 data points; if fewer than 30000 remain, zeros are appended at the end until 30000 data points are reached; thereby reducing the dimension of the time series data from [3, 50000] to [3, 30000].
In some embodiments kh, kw, sh, sw may be set according to the value of m to facilitate subsequent fusion.
Then fusing the preprocessed picture data with the preprocessed characteristic spectrum intensity electric signal data, wherein the method comprises the following steps of:
Performing dimension lifting on the pooled picture data by adopting a reshape function of Python, and changing from [3, 600, 550] to [1, 3, 600, 550];
Specifically, the dimension of the picture is improved, the original data dimension is set to be the second dimension, the third dimension and the fourth dimension, and the first dimension is increased, so that new four-dimensional data [1,3,600,550] is obtained. During the transformation, the total data amount remains conserved: 3 x 600 x 550 = 1 x 3 x 600 x 550;
Prod_Img_Data = [3,600,550], where Prod_Img_Data is the preprocessed picture data;
Reshaped_Prod_Img_Data = reshape(Prod_Img_Data, [1,3,600,550]) = [1,3,600,550], where Reshaped_Prod_Img_Data is the dimension-lifted picture data;
The reshape function transforms the data dimensions; its parameters are, first, the data to be transformed and, second, the target dimensions of the transformed data.
Performing dimension lifting on the simplified time sequence data by adopting reshape functions of Python, and changing [3,30000] into [1, 3, 600, 50];
Specifically, the time series data is subjected to dimension lifting, the original data dimension is set to be a second dimension and a third dimension, and the first dimension and the fourth dimension are added to obtain new four-dimensional data [1,3,30000,1]. Further transformation to [1,3,600,50], the total data amount remains conserved during the transformation: 3×30000=1×3×30000×1=1×3×600×50;
Elec_Sig_Data = [3,30000], where Elec_Sig_Data is the preprocessed time-series data;
Reshaped_Elec_Sig_Data = reshape(Elec_Sig_Data, [1,3,30000,1]) = [1,3,30000,1];
Reshaped_Elec_Sig_Data = reshape(Elec_Sig_Data, [1,3,600,50]) = [1,3,600,50];
where Reshaped_Elec_Sig_Data is the dimension-lifted time-series data.
The reshape function transforms the data dimensions; its parameters are, first, the data to be transformed and, second, the target dimensions of the transformed data.
The number of RGB color channels is 3, the same as the number of light paths of the characteristic light;
And splicing the dimension-lifted picture data and the dimension-lifted time series data by adopting the concatenate function of Python to obtain fused data, wherein the dimension of the fused data is [1, 3, 600, 600].
Specifically, the transformed electric signal data and the picture data are spliced to obtain a final four-dimensional array, and the dimension of the final four-dimensional array is [1,3,600,600]. This can be achieved by stitching the two arrays along the appropriate dimensions.
Final_Array
=concatenate((Reshaped_Elec_Sig_Data, Reshaped_Prod_Img_Data), axis=3)
= concatenate(([1,3,600,50], [1,3,600,550]), axis=3)
=[1,3,600,550+50]= [1,3,600,600];
Final_Array is the fused data.
The concatenate function splices a plurality of data arrays; its parameters are the arrays to be spliced and the axis value, which specifies the dimension along which to splice (dimensions are numbered 0, 1, 2, 3, ...).
In the application, the dimension promotion and the fusion are carried out by adopting a function carried by Python language.
Through these array transformation operations, the original two-dimensional electrical signal data and three-dimensional picture data can be integrated into a four-dimensional array with a higher dimension for further processing and analysis.
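The array operations described above can be reproduced directly with numpy; the variable names follow the description, and zero-filled dummy arrays are used here only to show the shapes.

```python
import numpy as np

# dummy data with the shapes described above (values are placeholders)
Prod_Img_Data = np.zeros((3, 600, 550))     # preprocessed picture data
Elec_Sig_Data = np.zeros((3, 30000))        # preprocessed spectral time-series data

Reshaped_Prod_Img_Data = Prod_Img_Data.reshape(1, 3, 600, 550)
Reshaped_Elec_Sig_Data = Elec_Sig_Data.reshape(1, 3, 30000, 1).reshape(1, 3, 600, 50)

Final_Array = np.concatenate((Reshaped_Elec_Sig_Data, Reshaped_Prod_Img_Data), axis=3)
print(Final_Array.shape)                    # (1, 3, 600, 600)
```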
In the application, a neural network analysis module adopts a neural network multi-classification model, and the training of the neural network multi-classification model comprises the following steps:
constructing an initial neural network multi-classification model:
training an initial neural network multi-classification model by adopting a sample data set of characteristic spectrum intensity electric signals and visual information to obtain a trained neural network multi-classification model; the sample data set is marked with a defect label, and the information contained in the defect label comprises: the defect grade, the defect type and the cause, and the defect-free sample defect label is qualified; the characteristic spectrum intensity electric signal and the visual information are used as the input of the model, and the defect label is used as the output of the model.
In the application, the training steps of the neural network multi-classification model are specifically as follows:
constructing a multi-input multi-output neural network multi-classification model by using a deep learning framework;
The input layer of the model comprises two input channels which are respectively used for inputting characteristic spectrum intensity electric signals and visual information; each channel has a data preprocessing step;
Defining a plurality of output neurons in an output layer of the model, wherein each neuron corresponds to one welding defect label, and outputting the probability of each defect label by using a softmax activation function;
Collecting a sample data set containing characteristic spectrum intensity electric signals and visual information, and classifying and marking the defect type in each sample so as to generate a defect label; the information contained in the defect label includes: the defect grade, the defect type and the cause, and for defect-free samples the defect label is qualified; the sample data set is then divided evenly into 10 parts, 9 of which are taken as the training set and the remaining part as the test set; in some embodiments, as few as 5 parts may be used as the training set and the rest as the test set; depending on the sample data set, the training effect is best when the test set accounts for 10% to 50%.
Training a neural network multi-classification model using the training set; in the training process, the model fuses the characteristic spectrum intensity electric signals and visual information to identify different types of welding defects;
Evaluating the performance of the model on the test set, using precision and recall; Precision: the ratio of the number of samples the model correctly identifies as positive to the total number of samples it identifies as positive;
Precision=TP/[TP+FP];
Recall: the ratio of the number of positive samples successfully detected by the model to the number of actual positive samples;
Recall=TP/[TP+FN];
Wherein:
TP is the number of true positives (True Positives), i.e. samples the model correctly identifies as positive;
FP is the number of false positives (False Positives), i.e. samples the model incorrectly identifies as positive;
FN is the number of false negatives (False Negatives), i.e. samples the model incorrectly identifies as negative;
The model parameters are adjusted according to the performance on the test set to improve the generalization capability of the model, and the method comprises the following steps:
1) Adjusting the model hyperparameters: according to the performance on the test set, the hyperparameters of the model are adjusted, such as the learning rate, the number of network layers (layers) and the regularization term (regularization_term);
2) Retraining the model: after the hyperparameters are adjusted, the neural network multi-classification model is trained again using the training set;
3) Evaluating again: after the hyperparameters are adjusted and the model retrained, the test set is used again to evaluate the performance of the neural network multi-classification model; training is stopped when precision (Precision) and recall (Recall) are both greater than 95% or performance no longer improves.
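A minimal sketch of the evaluation and stopping criterion described above; the per-label counts TP, FP and FN are assumed to have already been accumulated from the test-set predictions.

```python
def precision_recall(tp, fp, fn):
    """Precision = TP / (TP + FP); Recall = TP / (TP + FN)."""
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    return precision, recall

def should_stop_training(tp, fp, fn, performance_improved):
    """Stop when precision and recall both exceed 95% or performance no longer improves."""
    precision, recall = precision_recall(tp, fp, fn)
    return (precision > 0.95 and recall > 0.95) or not performance_improved
```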
Referring to fig. 2, an apparatus for real-time visual and characteristic spectrum laser welding interconnection monitoring includes:
an acquisition module, comprising a characteristic spectrum intensity monitoring module 1 for acquiring characteristic spectrum intensity electric signals in time sequence and a visual information monitoring module 2 for acquiring visual information in time sequence;
a data processing module 3, used for respectively preprocessing the acquired visual information and characteristic spectrum intensity electric signals, and then fusing the preprocessed data;
And the data analysis module 4 is used for inputting the fused data into the neural network analysis module, carrying out reasoning calculation, outputting an interconnection monitoring result and labeling the defect label.
The device of the application also comprises a laser welding control module 5 for regulating and controlling the laser welding process according to the interconnection monitoring result.
The hardware of the data processing module 3, the data analysis module 4 and the laser welding control module 5 of the application consists of a digital signal conversion module, an IO communication module and an industrial personal computer.
In the application, a characteristic spectrum intensity monitoring module 1 comprises a light path calibration module 1-1, a focusing module 1-2 and a first light splitting module 1-3 which are sequentially arranged according to a light path; the first light splitting module 1-3 divides the light path into two parts, wherein one light path is sequentially provided with the first light filtering module 1-3-1 and the first photoelectric conversion module 1-3-2, the other light path is provided with the second light splitting module 1-4, the second light splitting module 1-4 divides the light path into two parts, one light path is sequentially provided with the second light filtering module 1-4-1 and the second photoelectric conversion module 1-4-2, and the other light path is sequentially provided with the third light filtering module 1-4-3 and the third photoelectric conversion module 1-4-4;
The optical path calibration module 1-1 is used for enhancing the intensity of the input characteristic spectrum signal and ensuring that the intensity reaches the maximum value; the focusing module 1-2 is used for focusing the enhanced characteristic spectrum signals; the first light-splitting module 1-3 and the second light-splitting module 1-4 are used for splitting the focused characteristic spectrum signals into visible light, infrared light and reflected light; the first filtering module 1-3-1, the second filtering module 1-4-1 and the third filtering module 1-4-3 are used for respectively filtering the divided three paths of light to obtain a spectrum of a required characteristic wave band; the first photoelectric conversion module 1-3-2, the second photoelectric conversion module 1-4-2 and the third photoelectric conversion module 1-4-4 are used for converting the spectrum of the filtered characteristic wave band into an electric signal;
In the application, the focusing module 1-2 selects the plano-convex lens with long focal length, the plane of the lens faces the first light splitting module, and the focusing module with long focal length has the following advantages:
1) Small spherical aberration: a long focal length means a small lens curvature, and a small curvature means the lens introduces little spherical aberration;
2) Small aberration: a lens with a long focal length causes less aberration than a lens with a short focal length;
3) Large depth of field: a lens with a long focal length has a larger depth of field than one with a short focal length, since the depth of field is proportional to the object-side focal length of the lens; the longer that focal length, the larger the depth of field;
4) Signal light from outside the molten pool can be prevented from entering the sensor. During actual welding, reflected light from outside the molten pool is not a paraxial beam when it is coupled into the optical path; its incident angle is larger, so the lens focuses it far from the focal plane and it cannot be received by the photoelectric conversion module, which has a small photosensitive area. This avoids interference from light outside the molten pool, so that only the optical signal from within the molten pool is collected and accurate detection is achieved.
The visual information monitoring module 2 includes a camera 2-1 for taking pictures and an auxiliary light source 2-2 for illuminating the welding site.
Most traditional visual inspection uses a coaxial photographing mode, in which the incident beam is attenuated by multiple lenses, so the light intensity reaching the camera is very weak and a bright photo cannot be obtained. The visual information monitoring module 2 therefore adopts a paraxial photographing mode. The camera 2-1 consists of a protective lens, a filter module, a lens, a camera and an auxiliary light source; with paraxial photographing the incident light intensity is sufficient, and, combined with the filter module, a clear picture can be obtained.
Based on the application of the real-time vision and characteristic spectrum laser welding interconnection monitoring method, the laser welding process is regulated and controlled according to the interconnection monitoring result, and the method specifically comprises the following steps:
If the defect label is slight defect-hole-moisture, the welding is stopped, and the pre-cleaning is performed;
If the defect label is slight defect-hole-greasy dirt, welding is stopped and pre-cleaning is carried out; as shown in figs. 3-4, the defect label here is slight defect-hole-greasy dirt;
the pre-cleaning process of interconnection monitoring comprises the following steps:
Interconnection monitoring discovers contamination through visual information monitoring; the contamination is first pre-cleaned and only then is laser welding carried out, which avoids welding directly over the contamination and improves the welding yield. The main process comprises the following steps: 1) the workpiece is loaded and the visual information monitoring module 2 is started; 2) the visual information monitoring module 2 finds that the workpiece surface is contaminated; 3) the data analysis module 4 sends a pre-cleaning instruction to the laser welding control module 5; 4) the laser welding control module 5 outputs low-power laser to clean off the contamination; 5) the visual information monitoring module 2 continues to photograph the workpiece to confirm that no contamination remains; 6) the workpiece is laser welded; 7) the visual information monitoring module 2 and the characteristic spectrum intensity monitoring module 1 collect the in-welding pictures and the characteristic spectrum; 8) the neural network analysis module in the data analysis module 4 processes the collected pictures and characteristic spectrum intensity electric signals and outputs a welding quality result.
The surface contamination may be oil stains, particles or other adhering foreign matter, which is captured in the photographs taken by the visual information monitoring module 2 and then fed back to the data analysis module 4.
The data analysis module 4 and the laser welding control module 5 are connected together through a DB25 data line, the pre-cleaning process is to preset process parameters, the process parameters are pre-stored in the laser welding control module 5, and the process parameters are called out when the process needs to be started. For example, for a 2000W fiber laser, varying the magnitude of the input voltage may vary the magnitude of the output power. When the input voltage is varied in the range of 0-4V, the output power is varied in the range of 0-100%.
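For illustration, assuming the 0-4 V control voltage maps linearly onto 0-100% of the 2000 W rated power (the linearity is an assumption; the text only states the two ranges), the control value for a requested power could be computed as follows.

```python
def control_voltage(target_power_w, rated_power_w=2000.0, max_voltage_v=4.0):
    """Map a requested laser power to the analog control voltage, assuming a
    linear 0-4 V to 0-100% power relationship."""
    fraction = min(max(target_power_w / rated_power_w, 0.0), 1.0)
    return fraction * max_voltage_v

# e.g. the 100 W pre-cleaning pass mentioned below would correspond to 0.2 V
```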
The laser pre-cleaning path is the same as the actual welding path: the laser starts the pre-cleaning process and scans once along the welding path at an output of 100 W, which cleans off contamination on the welding path and prepares for the subsequent formal welding. The visual information monitoring module 2 continues to photograph the workpiece, ensuring that the workpiece is laser welded free of contamination. For example, when welding a thin aluminum plate to a thick aluminum plate at a welding power of 2000 W and a beam travel speed of 200 mm/s, the laser scans the weld seam along a programmed path. The visual information monitoring module 2 and the characteristic spectrum intensity monitoring module 1 collect the in-welding pictures and the characteristic spectrum intensity electric signals, and the neural network analysis module in the data analysis module 4 processes them and outputs an interconnection monitoring result with a defect label.
Then, a regulation and control method of a laser welding process is adopted, comprising the following steps:
s1, executing the real-time vision and characteristic spectrum laser welding interconnection monitoring method to obtain a defect label;
s2, regulating and controlling a laser welding process according to the defect label, wherein the process comprises the following steps of:
if the defect label is moderate defect-cold joint-power drop, the laser welding power is increased; as shown in figs. 5-6, the defect label here is moderate defect-cold joint-power drop;
if the defect label is moderate defect-spatter-excessive power, the laser welding power is reduced; as shown in figs. 7-8, the defect label here is moderate defect-spatter-excessive power;
If the defect label is slight defect-poor weld formation-slight workpiece deformation, the welding path is modified to avoid the deformed area;
if the defect label is serious defect-large-area incomplete welding-power too low, welding is stopped, the laser welding parameters are reset, and welding is then resumed;
if the defect label is serious defect-large crack-improper technological parameters, stopping welding, resetting parameters of laser welding, and then welding;
if the defect label is unqualified, welding is stopped, and the workpiece is invalidated;
if the defective label is qualified, the current laser welding parameters are kept unchanged.
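As a sketch of how the regulation rules listed above might be dispatched in software; the label strings and action names are illustrative, and the actual control commands depend on the laser welding control module.

```python
# illustrative mapping from defect label to a control action
REGULATION_RULES = {
    "slight defect-hole-moisture": "stop_and_preclean",
    "slight defect-hole-greasy dirt": "stop_and_preclean",
    "slight defect-poor weld formation-slight deformation": "modify_welding_path",
    "moderate defect-cold joint-power drop": "increase_power",
    "moderate defect-spatter-excessive power": "decrease_power",
    "serious defect-large-area incomplete welding-power too low": "stop_and_reset_parameters",
    "serious defect-large crack-improper process parameters": "stop_and_reset_parameters",
    "unqualified": "stop_and_reject_workpiece",
    "qualified": "keep_current_parameters",
}

def regulate(defect_label):
    """Return the control action for a defect label; unknown labels stop welding for safety."""
    return REGULATION_RULES.get(defect_label, "stop_welding")
```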
In the application, the visual information monitoring module 2 can be adopted independently for pretreatment before the device for interconnection monitoring operates, so that the problems of workpiece deformation and the like are found, the welding path is modified, and then the welding process for interconnection monitoring is carried out, wherein the main steps are as follows:
1. the visual information monitoring module 2 finds the deformation of the workpiece and positions the deformation position of the workpiece;
2. the data analysis module 4 re-plans the weld path;
3. the data analysis module 4 sends the changed weld information to the laser welding control module 5 through a DB25 interface;
4. the laser welding control module 5 performs laser welding according to the new weld information, so that the welding quality of the deformed peripheral area of the workpiece is ensured;
5. the device for interconnection monitoring monitors visual information and characteristic spectrum intensity electric signals in the welding process;
6. And the device for interconnection monitoring outputs a monitoring result.
As shown in fig. 9, for welding the seal top cover of the cylindrical battery, the default welding path of the workpiece is a circular ring (left diagram of fig. 9), but the visual information monitoring module 2 photographs the workpiece to find that the surface of the workpiece has deformation (right diagram of fig. 9), and the deformation is sunken to one side, so that the visual information monitoring module 2 transmits the relevant picture to the data analysis module 4, the data analysis module 4 modifies the welding path according to the information such as the position size of the deformation, and sends the modified welding path to the laser welding control module 5, and then the laser welding control module 5 carries out laser welding according to the information, so that the welding path of the deformation area is finally modified, and the yield is improved.
The application itself also provides an electronic device comprising:
One or more processors;
A memory for storing one or more programs;
the one or more programs, when executed by the one or more processors, cause the one or more processors to implement a real-time vision and characteristic spectrum laser welding interconnection monitoring method.
The application itself also provides a computer readable storage medium having stored thereon computer instructions which, when executed by a processor, implement the steps of a real-time vision and characteristic spectrum laser welding interconnection monitoring method.
The present invention is not limited to the above-described preferred embodiments. Any person skilled in the art may derive equivalent embodiments through modification or variation based on the disclosure above; any modifications, equivalent variations and alterations made to the above embodiments according to the technical principles of the present invention fall within the scope of the present invention.

Claims (8)

1. The real-time vision and characteristic spectrum laser welding interconnection monitoring method is characterized by comprising the following steps of:
(1) Visual information and characteristic spectrum intensity electric signals of a welding part are collected according to a time sequence;
(2) Respectively preprocessing the acquired visual information and characteristic spectrum intensity electric signals, and then fusing the preprocessed data;
(3) Inputting the fused data into a neural network analysis module, carrying out reasoning calculation, outputting an interconnection monitoring result, and labeling a defect label;
The visual information comprises n pictures arranged in time sequence, the data dimension of each picture being [color, h, w], where color is the number of RGB color channels of the picture, and h and w are respectively the resolution of the picture in the height direction and in the width direction for any one color channel;
The characteristic spectrum intensity electric signals are stored as time series with dimension [light, M], where light denotes the number of optical paths of characteristic light, M is the number of time-series data points of the characteristic spectrum intensity electric signal corresponding to any one path of characteristic light, and the characteristic light comprises visible light, infrared light and reflected light;
The data preprocessing of the visual information specifically comprises the following steps:
Splicing the n pictures in their acquired time sequence into N rows and N columns, from left to right and from top to bottom, to form one large picture with data dimension [color, N x h, N x w], wherein N = n^(1/2) rounded up to an integer; when N^2 is larger than n after rounding up, the deficient part of the large picture is filled with blanks; then performing max pooling at least once, the dimension of the pooled picture data being [color, H, W];
The data preprocessing of the characteristic spectrum intensity electric signal comprises the following steps:
Setting a threshold value; if, in the time series data of any one of the three paths of characteristic spectrum intensity electric signals, the data at a certain moment is smaller than the threshold value, the data at that moment is deleted and the data at that moment in the corresponding other two paths of time series data is deleted synchronously; if the number of time series data remaining after deletion exceeds m, the first m data are retained; if there are fewer than m data, zeros are appended at the end until m data are reached; thereby reducing the dimension of the time series data from [light, M] to [light, m];
The fusion method comprises the following steps:
Performing dimension lifting on the pooled picture data by means of the reshape function of Python, changing [color, H, W] into [k, color, H, W];
performing dimension lifting on the simplified time series data of the characteristic spectrum intensity electric signals by means of the reshape function of Python, changing [light, m] into [k, light, H, m/(kH)];
the number of RGB color channels is 3, the same as the number of optical paths of the characteristic light;
and splicing the dimension-lifted picture data and the dimension-lifted time series data of the characteristic spectrum intensity electric signals by means of the concatenate function of Python to obtain the fused data, whose dimension is [k, 3, H, W+m/(kH)].
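The claim references Python's reshape and concatenate operations without fixing a concrete implementation. The following minimal NumPy sketch, under assumed values for n, h, w, k, m and M, illustrates the picture tiling with max pooling, the thresholded trimming of the three-path spectrum time series, and the final concatenation along the last axis. The picture dimension lift to [k, color, H, W] is shown here as a repeat along a new leading axis, which is one possible reading of the claim rather than the claimed method itself.

```python
import numpy as np

def tile_and_pool(pictures, pool=2):
    """Tile n pictures [n, color, h, w] into an N x N grid, then max-pool once."""
    n, color, h, w = pictures.shape
    N = int(np.ceil(np.sqrt(n)))                      # N = n^(1/2) rounded up
    grid = np.zeros((color, N * h, N * w), dtype=pictures.dtype)  # blanks fill the deficit
    for i in range(n):
        r, c = divmod(i, N)                           # left to right, top to bottom
        grid[:, r * h:(r + 1) * h, c * w:(c + 1) * w] = pictures[i]
    # One round of max pooling with a pool x pool window.
    Hp, Wp = (N * h) // pool, (N * w) // pool
    pooled = grid[:, :Hp * pool, :Wp * pool].reshape(color, Hp, pool, Wp, pool).max(axis=(2, 4))
    return pooled                                     # [color, H, W]

def trim_spectrum(signals, threshold, m):
    """Drop any time step where any of the 3 paths falls below threshold, then fix length m."""
    keep = (signals >= threshold).all(axis=0)         # signals: [light=3, M]
    kept = signals[:, keep]
    if kept.shape[1] >= m:
        kept = kept[:, :m]                            # keep the first m samples
    else:
        kept = np.pad(kept, ((0, 0), (0, m - kept.shape[1])))  # zero-pad to m
    return kept                                       # [light, m]

# Assumed sizes for illustration only.
n, h, w, k, m, M = 9, 64, 64, 4, 3840, 5000
pics = np.random.rand(n, 3, h, w)
spec = np.random.rand(3, M)

pooled = tile_and_pool(pics)                          # [3, H, W], here H = W = 96
H, W = pooled.shape[1:]
pic4d = np.repeat(pooled[None], k, axis=0)            # [k, 3, H, W] (assumed reading of the lift)
spec4d = trim_spectrum(spec, 0.05, m).reshape(k, 3, H, m // (k * H))  # [k, 3, H, m/(kH)]
fused = np.concatenate([pic4d, spec4d], axis=-1)      # [k, 3, H, W + m/(kH)]
```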
2. The method of claim 1, wherein the neural network analysis module employs a neural network multi-classification model, and wherein training of the neural network multi-classification model comprises the steps of:
(1) constructing an initial neural network multi-classification model;
(2) training the initial neural network multi-classification model with a sample data set of characteristic spectrum intensity electric signals and visual information to obtain the trained neural network multi-classification model; the sample data set is marked with defect labels, the information contained in a defect label comprising the defect grade, the defect type and the cause, the defect label of a defect-free sample being qualified; the characteristic spectrum intensity electric signals and the visual information are used as the input of the model, and the defect label is used as the output of the model.
3. The method according to claim 2, wherein the training step of the neural network multi-classification model is specifically as follows:
constructing a multi-input multi-output neural network multi-classification model by using a deep learning framework;
the input layer of the model comprises two input channels, used respectively for inputting the characteristic spectrum intensity electric signals and the visual information, and each channel has a data preprocessing step;
defining a plurality of output neurons in the output layer of the model, each neuron corresponding to one welding defect label, and outputting the probability of each defect label by using a softmax activation function;
collecting a sample data set containing characteristic spectrum intensity electric signals and visual information, and classifying and marking the defect type in each sample so as to generate a defect label; the information contained in a defect label comprises the defect grade, the defect type and the cause, the defect label of a defect-free sample being qualified; dividing the sample data set evenly into 10 parts, taking 5 to 9 parts as the training set and the remaining parts as the test set;
training the neural network multi-classification model using the training set; during training, the model fuses the characteristic spectrum intensity electric signals and the visual information to identify different types of welding defects;
evaluating the performance of the model on the test set, using precision and recall as evaluation metrics;
precision: the ratio of the number of samples the model correctly identifies as positive to the number of all samples identified as positive;
Precision = TP/(TP+FP);
recall: the ratio of the number of positive samples successfully detected by the model to the number of actual positive samples;
Recall = TP/(TP+FN);
wherein:
TP (true positives) is the number of samples correctly identified by the model as positive;
FP (false positives) is the number of samples incorrectly identified by the model as positive;
FN (false negatives) is the number of samples incorrectly identified by the model as negative;
The model parameters are adjusted according to the performance on the test set to improve the generalization capability of the model, comprising the following steps:
1) adjusting the model hyperparameters: according to the performance on the test set, the hyperparameters of the model are adjusted, including increasing or decreasing the number of network layers (layers), adjusting the learning rate, and adjusting the regularization term (regularization_term);
2) retraining the model: after the hyperparameters are adjusted, the neural network multi-classification model is trained again using the training set;
3) re-evaluating: after the hyperparameters are adjusted and the model is retrained, the test set is used again to evaluate the performance of the neural network multi-classification model; training is stopped when both precision and recall are greater than 95% or the performance no longer improves.
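The claim names a generic deep learning framework without specifying one. Below is a minimal PyTorch sketch, offered as an assumption-laden illustration rather than the claimed implementation: a two-branch network takes the preprocessed picture tensor and spectrum tensor, fuses their features, and emits softmax probabilities over defect labels; precision and recall are computed per the formulas above. The layer sizes, optimizer, number of defect classes, and stand-in data are illustrative choices.

```python
import torch
import torch.nn as nn

NUM_CLASSES = 9  # assumed number of defect labels, including "qualified"

class WeldDefectClassifier(nn.Module):
    """Two-input multi-classification sketch: one branch per data source."""
    def __init__(self, num_classes=NUM_CLASSES):
        super().__init__()
        self.vision_branch = nn.Sequential(          # picture tensor [B, 3, H, W]
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(8), nn.Flatten()
        )
        self.spectrum_branch = nn.Sequential(        # spectrum tensor [B, 3, H, W']
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(8), nn.Flatten()
        )
        self.head = nn.Linear(16 * 8 * 8 * 2, num_classes)

    def forward(self, pictures, spectra):
        feats = torch.cat([self.vision_branch(pictures), self.spectrum_branch(spectra)], dim=1)
        return self.head(feats)                      # logits; softmax gives per-label probabilities

def precision_recall(pred, target, positive_class):
    """Precision = TP/(TP+FP), Recall = TP/(TP+FN) for one defect class."""
    tp = ((pred == positive_class) & (target == positive_class)).sum().item()
    fp = ((pred == positive_class) & (target != positive_class)).sum().item()
    fn = ((pred != positive_class) & (target == positive_class)).sum().item()
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return precision, recall

# Illustrative training step on random stand-in data.
model = WeldDefectClassifier()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

pictures = torch.rand(8, 3, 96, 96)
spectra = torch.rand(8, 3, 96, 10)
labels = torch.randint(0, NUM_CLASSES, (8,))

optimizer.zero_grad()
logits = model(pictures, spectra)
loss = criterion(logits, labels)
loss.backward()
optimizer.step()

probs = torch.softmax(logits, dim=1)                 # per-label probabilities
preds = probs.argmax(dim=1)
print(precision_recall(preds, labels, positive_class=0))
```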
4. An apparatus for real-time vision and characteristic spectrum laser welding interconnection monitoring based on the method of claim 1, comprising at least:
an acquisition module for acquiring the characteristic spectrum intensity electric signals in time series and the visual information in time series according to the method of claim 1;
a data processing module for respectively preprocessing the collected visual information and characteristic spectrum intensity electric signals according to the method of claim 1, and then fusing the preprocessed data;
a data analysis module for inputting the fused data into the neural network analysis module according to the method of claim 1, performing inference computation, and outputting the interconnection monitoring result.
5. The device of claim 4, wherein the acquisition module comprises a characteristic spectrum intensity monitoring module and a visual information monitoring module, and the characteristic spectrum intensity monitoring module comprises an optical path calibration module, a focusing module and a first light splitting module which are sequentially arranged according to an optical path; the first light splitting module divides the light path into two parts, wherein one light path is sequentially provided with a first light filtering module and a first photoelectric conversion module, the other light path is provided with a second light splitting module, the second light splitting module divides the light path into two parts, one light path is sequentially provided with a second light filtering module and a second photoelectric conversion module, and the other light path is sequentially provided with a third light filtering module and a third photoelectric conversion module;
the visual information monitoring module comprises a camera for collecting pictures and an auxiliary light source for illuminating the welding part.
6. A method for regulating a laser welding process, comprising:
S1, executing the real-time vision and characteristic spectrum laser welding interconnection monitoring method of claim 1 to obtain a defect label;
S2, regulating and controlling the laser welding process according to the defect label, comprising the following steps:
if the defect label is slight defect - hole - moisture, welding is stopped and pre-cleaning is performed;
if the defect label is slight defect - hole - oil contamination, welding is stopped and pre-cleaning is performed;
if the defect label is slight defect - poor weld formation - slight workpiece deformation, the welding path is modified to avoid the deformed area;
if the defect label is moderate defect - false weld - power too low, the laser welding power is increased;
if the defect label is moderate defect - spatter - power too high, the laser welding power is decreased;
if the defect label is serious defect - large-area incomplete welding - power too low, welding is stopped, the laser welding parameters are reset, and welding is then resumed;
if the defect label is serious defect - large crack - improper process parameters, welding is stopped, the laser welding parameters are reset, and welding is then resumed;
if the defect label is unqualified, welding is stopped and the workpiece is scrapped;
if the defect label is qualified, the current laser welding parameters are kept unchanged.
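The defect-label-to-action mapping of claim 6 lends itself to a simple lookup table. The following Python sketch is a hedged illustration of such a dispatch; the label strings, action identifiers, and controller interface are assumptions for illustration, not the claimed control logic.

```python
from typing import NamedTuple

class Action(NamedTuple):
    stop_welding: bool
    command: str            # illustrative action identifier

# Dispatch table mirroring the defect-label handling of claim 6 (labels assumed as strings).
DEFECT_ACTIONS = {
    "slight defect - hole - moisture":                               Action(True,  "pre_clean"),
    "slight defect - hole - oil contamination":                      Action(True,  "pre_clean"),
    "slight defect - poor weld formation - slight deformation":      Action(False, "modify_path"),
    "moderate defect - false weld - power too low":                  Action(False, "increase_power"),
    "moderate defect - spatter - power too high":                    Action(False, "decrease_power"),
    "serious defect - large-area incomplete welding - power too low": Action(True, "reset_parameters"),
    "serious defect - large crack - improper process parameters":     Action(True, "reset_parameters"),
    "unqualified":                                                   Action(True,  "scrap_workpiece"),
    "qualified":                                                     Action(False, "keep_parameters"),
}

def regulate(defect_label: str) -> Action:
    """Return the control action for a monitored defect label (default: keep parameters)."""
    return DEFECT_ACTIONS.get(defect_label, Action(False, "keep_parameters"))

print(regulate("moderate defect - spatter - power too high"))
# Action(stop_welding=False, command='decrease_power')
```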
7. An electronic device, comprising:
one or more processors;
a memory for storing one or more programs;
wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method of any of claims 1-3.
8. A computer readable storage medium having stored thereon computer instructions which, when executed by a processor, implement the steps of the method of any of claims 1-3.