CN112360699A - Intelligent inspection and diagnosis analysis method for blades of full-automatic wind generating set - Google Patents
- Publication number
- CN112360699A CN112360699A CN202011139483.5A CN202011139483A CN112360699A CN 112360699 A CN112360699 A CN 112360699A CN 202011139483 A CN202011139483 A CN 202011139483A CN 112360699 A CN112360699 A CN 112360699A
- Authority
- CN
- China
- Prior art keywords
- unmanned aerial
- aerial vehicle
- blade
- image
- blades
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- F—MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
- F03—MACHINES OR ENGINES FOR LIQUIDS; WIND, SPRING, OR WEIGHT MOTORS; PRODUCING MECHANICAL POWER OR A REACTIVE PROPULSIVE THRUST, NOT OTHERWISE PROVIDED FOR
- F03D—WIND MOTORS
- F03D17/00—Monitoring or testing of wind motors, e.g. diagnostics
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64C—AEROPLANES; HELICOPTERS
- B64C39/00—Aircraft not otherwise provided for
- B64C39/02—Aircraft not otherwise provided for characterised by special use
- B64C39/024—Aircraft not otherwise provided for characterised by special use of the remote controlled vehicle type, i.e. RPV
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64U—UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
- B64U10/00—Type of UAV
- B64U10/10—Rotorcrafts
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64U—UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
- B64U30/00—Means for producing lift; Empennages; Arrangements thereof
- B64U30/20—Rotors; Rotor supports
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/10—Terrestrial scenes
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64U—UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
- B64U2201/00—UAVs characterised by their flight controls
- B64U2201/20—Remote controls
-
- F—MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
- F05—INDEXING SCHEMES RELATING TO ENGINES OR PUMPS IN VARIOUS SUBCLASSES OF CLASSES F01-F04
- F05B—INDEXING SCHEME RELATING TO WIND, SPRING, WEIGHT, INERTIA OR LIKE MOTORS, TO MACHINES OR ENGINES FOR LIQUIDS COVERED BY SUBCLASSES F03B, F03D AND F03G
- F05B2260/00—Function
- F05B2260/80—Diagnostics
-
- F—MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
- F05—INDEXING SCHEMES RELATING TO ENGINES OR PUMPS IN VARIOUS SUBCLASSES OF CLASSES F01-F04
- F05B—INDEXING SCHEME RELATING TO WIND, SPRING, WEIGHT, INERTIA OR LIKE MOTORS, TO MACHINES OR ENGINES FOR LIQUIDS COVERED BY SUBCLASSES F03B, F03D AND F03G
- F05B2260/00—Function
- F05B2260/84—Modelling or simulation
-
- F—MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
- F05—INDEXING SCHEMES RELATING TO ENGINES OR PUMPS IN VARIOUS SUBCLASSES OF CLASSES F01-F04
- F05B—INDEXING SCHEME RELATING TO WIND, SPRING, WEIGHT, INERTIA OR LIKE MOTORS, TO MACHINES OR ENGINES FOR LIQUIDS COVERED BY SUBCLASSES F03B, F03D AND F03G
- F05B2270/00—Control
- F05B2270/80—Devices generating input signals, e.g. transducers, sensors, cameras or strain gauges
- F05B2270/804—Optical devices
- F05B2270/8041—Cameras
Landscapes
- Engineering & Computer Science (AREA)
- Mechanical Engineering (AREA)
- Aviation & Aerospace Engineering (AREA)
- Remote Sensing (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Multimedia (AREA)
- Theoretical Computer Science (AREA)
- Life Sciences & Earth Sciences (AREA)
- Sustainable Development (AREA)
- Sustainable Energy (AREA)
- Chemical & Material Sciences (AREA)
- Combustion & Propulsion (AREA)
- General Engineering & Computer Science (AREA)
- Image Analysis (AREA)
Abstract
The invention provides an intelligent inspection and diagnosis analysis method for the blades of a full-automatic wind generating set, comprising the following steps: the unmanned aerial vehicle reconstructs a model of the fan through fused machine-vision and lidar computation and automatically generates an inspection path; the operation starting point is confirmed through the control end, after which the unmanned aerial vehicle flies along the inspection path and identifies the blades with an optimal image semantic segmentation model to inspect automatically; the unmanned aerial vehicle inspects the three blades of the fan in sequence along the inspection path and completely acquires images of every blade; the blade foreground images collected along the same path are stitched together to verify the completeness of the acquisition; fault recognition is then performed on the images with a deep learning algorithm, the category and severity of each fault image are evaluated, and the images are uploaded to a cloud platform for classified storage and digital processing. By combining machine vision, deep learning, image semantic segmentation and related techniques, the invention realises fully automatic inspection and fault diagnosis of the blades by the unmanned aerial vehicle.
Description
Technical Field
The invention relates to the technical field of unmanned aerial vehicle inspection, and in particular to an intelligent inspection and diagnosis analysis method for the blades of a full-automatic wind generating set.
Background
The blade is a critical component of a wind generating set, and its aerodynamic efficiency determines how well the set can harvest wind energy. The blade must therefore combine optimum mechanical properties and fatigue strength with resistance to corrosion, ultraviolet irradiation and lightning strikes. At high rotational speeds the blades inevitably rub and collide with airborne dust and particles, grinding down the leading edges and cracking the leading-edge bonds. In addition, as the fan's operating life increases, the surface gel coat of the blade wears; once it flakes off, pinholes and cracks develop. Pinholes increase blade drag and reduce the energy generated, and once a pinhole grows into a cavity that collects water, the blade's lightning-protection rating is degraded.
In the traditional technology, fan blades are generally inspected by manual means (a telescope, a ground-based high-power camera, a suspended basket, or a manually piloted unmanned aerial vehicle), all of which are slow; automatic inspection by unmanned aerial vehicle can therefore replace manual detection well. However, automatic inspection in the prior art is usually realised by planning the inspection path in advance; because the fan is a moving structure and stops in a different orientation every time, the fan must be modelled again each time, which increases the inspection workload. In addition, errors in the pre-planned inspection path, or in blade identification by the camera carried on the unmanned aerial vehicle, make the vehicle prone to yawing off course during automatic inspection. Finally, the actual size and position of a given defect on the blade cannot be calculated accurately from the blade images the vehicle gathers, which makes fault analysis and processing considerably more difficult.
Summary of the invention
The invention provides an intelligent inspection and diagnosis analysis method for the blades of a full-automatic wind generating set that combines unmanned aerial vehicles, machine vision, deep learning and related technologies to realise automatic inspection of fan blades. It addresses the technical problems in the prior art of large automatic-inspection path errors and the inability to accurately measure blade defects, so that fan blades can be inspected efficiently, more blade data can be obtained, blade damage can be tracked, maintenance work can be scheduled conveniently, the frequency of serious problems is reduced, and the stability and safety of blades operating in a wind farm are further improved.
In order to solve the above technical problems, the embodiments of the invention are realised by the following technical scheme. The invention provides an intelligent inspection and diagnosis analysis method for the blades of a full-automatic wind generating set, comprising the following steps:
s1, automatically generating a routing inspection path, and enabling an unmanned aerial vehicle to receive a control signal sent by a control end and fly vertically upwards from the front of a fan hub until the unmanned aerial vehicle flies to the position of the hub; the unmanned aerial vehicle reconstructs a model of the fan under the assistance of fusion calculation of vision and the laser radar and automatically generates a preset routing inspection path;
s2, confirming an operation starting point, automatically searching a central point of a hub by the unmanned aerial vehicle, confirming whether the positions of the unmanned aerial vehicle and the central point of the hub have deviation through a control end, fine-tuning the unmanned aerial vehicle through the control end if the positions of the unmanned aerial vehicle and the central point of the hub have deviation, confirming the operation starting point of the unmanned aerial vehicle through the control end, and executing a step S3; if not, the unmanned aerial vehicle executes the step S3 after confirming the operation starting point through the control end;
s3, unmanned aerial vehicle automatic operation, wherein when the unmanned aerial vehicle flies along the preset inspection path from an operation starting point, the unmanned aerial vehicle obtains a blade foreground image according to the collected blade foreground and background images and an image semantic segmentation technology based on deep learning, so that the unmanned aerial vehicle can automatically identify the blade to realize automatic inspection;
s4, image splicing, namely splicing the blade foreground images segmented by the image semantic segmentation technology based on deep learning into a complete path by the unmanned aerial vehicle through a feature extraction and shape area entropy matching algorithm so as to verify the integrity of the acquisition process;
s5, fault recognition is carried out, the images acquired by the unmanned aerial vehicle are subjected to fault recognition through a deep learning algorithm, the images without faults are directly filtered, the classes and the severity of the images with faults are evaluated, and a fault recognition report is generated to be finally confirmed through the control end;
and S6, digital blade management: the fault images acquired by the unmanned aerial vehicle are uploaded through the control end to a cloud platform, stored by category, and digitally processed.
According to a preferred embodiment, the method for automatically generating the inspection path in step S1 comprises:
s11, shooting a plurality of pictures of the fan by the unmanned aerial vehicle in the vertical upward flying process;
s12, obtaining pixel coordinates of a fan wind tower, the center of a fan hub and the tips of three blades from the picture through fusion calculation of machine vision and a laser radar;
s13, calculating a theoretical three-dimensional coordinate reconstruction fan 3D model of the pixel coordinate by using the position state of the unmanned aerial vehicle, the three-dimensional coordinate of the camera during shooting and camera internal reference information;
and S14, outputting key parameters of the fan, such as yaw direction, fan wind wheel rotation angle, hub center height and blade length, and automatically generating a routing inspection path by the unmanned aerial vehicle according to the key parameters.
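As a rough illustration of how the key parameters output in step S14 could be turned into a flight path, the sketch below generates equally spaced waypoints along one blade at a fixed camera standoff. The function name, the local ENU frame convention, and the 10 m standoff and 5 m spacing are illustrative assumptions, not values taken from the patent:

```python
import math

def generate_inspection_path(hub_center, blade_length, rotor_angle_deg,
                             yaw_deg, standoff=10.0, step=5.0):
    """Generate waypoints along one blade axis at a fixed standoff distance.

    hub_center: (x, y, z) of the hub in a local ENU frame (assumed known from
    the 3D reconstruction). rotor_angle_deg is the blade's in-plane angle
    (0 deg = straight up); yaw_deg orients the rotor plane about the vertical.
    """
    cx, cy, cz = hub_center
    a = math.radians(rotor_angle_deg)
    lateral, up = math.sin(a), math.cos(a)   # blade direction in rotor plane
    y = math.radians(yaw_deg)
    # Rotate the rotor plane's lateral axis by the fan yaw about the z axis.
    bx, by, bz = lateral * math.cos(y), lateral * math.sin(y), up
    # Camera standoff normal to the rotor plane (along the yaw direction).
    nx, ny = -math.sin(y), math.cos(y)
    waypoints = []
    d = 0.0
    while d <= blade_length:
        waypoints.append((cx + bx * d + nx * standoff,
                          cy + by * d + ny * standoff,
                          cz + bz * d))
        d += step
    return waypoints
```

For a vertical blade (rotor angle 0) on a 100 m hub, this yields a straight climb parallel to the blade at a 10 m offset; a full inspection path would chain several such legs, one per blade surface.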
According to a preferred embodiment, the unmanned aerial vehicle automatic operation method of step S3 includes:
s31, obtaining foreground and background images of the blade;
s32, labeling the foreground and background images of the blade to generate corresponding label images;
s33, segmenting the foreground and background images and the label images of the blades at a p ixel level by adopting an image semantic segmentation technology to obtain segmented foreground and background images of the blades and segmented label images; then, data enhancement is carried out to form a data set;
s34, building a semantic segmentation network model of a 60+ layer network;
s35, training and optimizing parameters of the semantic segmentation network model by using the data set to obtain an optimal semantic segmentation network model; therefore, the unmanned aerial vehicle carrying the optimal semantic segmentation network model can automatically identify the blade foreground, and automatic inspection is achieved.
According to a preferred embodiment, the method for acquiring the foreground and background images of the blade in step S31 includes:
s311, after confirming the operation starting point, the unmanned aerial vehicle collects the front edge windward side of the first blade to the blade tip of the front edge windward side;
s312, the unmanned aerial vehicle automatically flies for about 5 meters in the leaf length direction, whether the visual field of the unmanned aerial vehicle has a leaf tip or not is confirmed through a control end, if yes, the unmanned aerial vehicle continuously flies in the leaf length direction until no leaf tip exists in the visual field of the unmanned aerial vehicle; if not, the unmanned aerial vehicle can safely pass through the blade tip; the unmanned aerial vehicle automatically records the distance of automatic flight along the leaf length direction, so that the unmanned aerial vehicle can pass through the leaf tip next time without being confirmed by the control end;
s314, acquiring a front scene image and a back scene image of the blade by the unmanned aerial vehicle according to a mode that an operation starting point is from a blade root of a front edge windward side, a blade root of a front edge windward side is from a blade root of the front edge windward side to a blade tip of the front edge windward side, a blade tip of the front edge windward side is from a blade tip of the front edge windward side to a blade tip of a rear edge windward side, a blade tip of the rear edge windward side is from a blade tip of the rear edge windward side to a blade root of the rear edge windward side, a blade root of the rear edge windward side is from a blade root of the rear edge leeward side to a blade;
and S315, the unmanned aerial vehicle automatically repeats step S314 to complete the foreground and background image acquisition for the remaining blades.
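The record-and-reuse logic of step S312 can be sketched as a small state machine; the class and method names, and the fixed 5 m increment, are illustrative assumptions:

```python
class TipPassageRecorder:
    """Track the spanwise distance flown to clear the blade tip.

    Mirrors step S312: the UAV advances in ~5 m increments along the blade
    axis while an operator confirms whether the tip is still in view. Once
    the tip leaves the field of view, the accumulated distance is stored so
    later passes over the same blade tip need no operator confirmation.
    """

    STEP_M = 5.0

    def __init__(self):
        self.recorded_distance = None  # set after the first confirmed pass
        self._flown = 0.0

    def advance(self, tip_in_view):
        """Advance one increment; return True once the tip has been cleared."""
        if self.recorded_distance is not None:
            return True                # already learned: pass autonomously
        self._flown += self.STEP_M
        if not tip_in_view:            # operator reports tip out of view
            self.recorded_distance = self._flown
            return True
        return False
```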
According to a preferred embodiment, the image stitching method of step S4 includes:
s41, inputting an image A and an image B which are adjacent in an image set and are formed by at least 20 leaf foreground images under the same path, taking the image A as a template image and taking the image B as an image to be matched;
s42, acquiring the inclination angle alpha of the blade when the fan is stopped and is in an inverted Y shape and the included angle between the blade and the horizontal plane is +/-30 degrees;
s43, moving the section edge part of the image B in the retrieval area of the image A according to the inclination angle alpha through an NCC algorithm to perform template matching;
s44, determining a maximum similarity metric value matched with the template according to difference summation, and preliminarily determining the position relation of the image A and the image B;
s45, detecting and judging the position relation through a feature extraction and shape area entropy matching algorithm to obtain the optimal position relation;
s45, splicing the overlapped areas of the image A and the image B;
s46, repeating the steps from S41 to S45, and automatically splicing the image set into a complete path.
According to a preferred embodiment, the fault identification method of step S5 includes:
s51, obtaining foreground and background images of the blade;
s52, marking the irregular fault from the foreground and background images of the blade at a p ixel level through a brush to generate a p ixel-level irregular fault image; meanwhile, based on the foreground and background segmentation of the blade, dividing a non-fault block and a fault block of the blade into negative and positive samples to form a data set;
s53, building a two-classification model;
s54, training the two classification models by using the data set to obtain the optimal two classification models;
s55, utilizing an optimal two-classification model to perform fault identification on the foreground and background images of the blade, directly filtering the images without faults, and evaluating the classification and severity of the images with faults;
and generating a fault identification report and finally confirming the fault identification report through the control terminal.
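The screen-then-grade behaviour of step S55 can be sketched as follows. The trained binary classification model is stood in for by any callable that maps an image patch to 0/1, and the patch area and severity thresholds are illustrative assumptions, not values from the patent:

```python
def assess_blade_image(patches, classifier, area_per_patch_cm2=25.0):
    """Screen one blade image: classify fixed-size patches as fault/no-fault
    and grade severity by total faulty area.

    `classifier` is any callable patch -> 0/1 standing in for the trained
    binary model. Returns None for fault-free images (filtered out directly),
    otherwise a small report dict for operator confirmation.
    """
    fault_flags = [int(classifier(p)) for p in patches]
    n_fault = sum(fault_flags)
    if n_fault == 0:
        return None                    # no fault: image filtered out
    area = n_fault * area_per_patch_cm2
    severity = ("minor" if area < 100 else
                "moderate" if area < 500 else "severe")
    return {"fault_patches": n_fault,
            "fault_area_cm2": area,
            "severity": severity}
```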
According to a preferred embodiment, the blade digital management method of step S6 includes:
the images acquired by the unmanned aerial vehicle are uploaded through the control end to a cloud platform; the cloud platform stores the uploaded images by blade region and digitally processes them, so that operations such as defect marking, defect measurement and report export can be carried out conveniently, realising full-life-cycle management of the blade.
According to a preferred embodiment, a re-positioning module is provided on the unmanned aerial vehicle; if the blade is lost from view during automatic inspection, the re-positioning module starts automatically, scans up and down to re-acquire and track the blade position, and the automatic inspection task continues after re-positioning.
According to a preferred embodiment, the unmanned aerial vehicle is provided with a system monitoring module and a safety monitoring module; during automatic inspection, when either module issues an abnormality warning, the vehicle triggers an emergency hover, after which it either receives a one-key return command from the control end or manual take-over is started, ensuring that the vehicle lands safely.
The technical scheme of the embodiment of the invention at least has the following advantages and beneficial effects:
(1) The unmanned aerial vehicle can start, take off and land, identify blades, fly, and relocate after yawing, which solves the heavy workload of manual inspection and improves inspection efficiency. (2) The vehicle generates the inspection path intelligently and automatically: its flight parameters are derived from the state of the fan, so no inspection path needs to be preset, which reduces the workload and the error of automatic inspection. (3) The images acquired by the vehicle completely cover the whole blade, and stitching them into a complete blade verifies the completeness of the acquisition process, avoids missed detections caused by omitted areas, and improves inspection precision. (4) Segmenting the blade foreground and background images with the semantic segmentation technique and stitching the results makes it possible, on the one hand, to verify whether the shooting and inspection left any dead angles, and on the other hand to eliminate other interfering factors, improving inspection precision; combined with the lidar technique, the position and size of a defect on the blade are located directly. (5) Fault identification by a deep learning algorithm finds defects, judges their severity and category, and generates a fault identification report that is finally confirmed through the control end; the data can be uploaded to a cloud platform to generate an inspection report, and full-life-cycle management of the blade is realised through the digital management platform.
Drawings
In order to illustrate the technical solutions of the embodiments more clearly, the drawings needed in the embodiments are briefly described below. It should be understood that the following drawings illustrate only some embodiments of the invention and should not be regarded as limiting its scope; for those skilled in the art, other related drawings can be derived from them without inventive effort.
Fig. 1 is a flowchart of a method for inspecting, diagnosing and analyzing a blade according to embodiment 1 of the present invention;
FIG. 2 is a schematic diagram of a hardware system framework provided in embodiment 1 of the present invention;
fig. 3 is a schematic diagram of automatic inspection of an unmanned aerial vehicle according to embodiment 1 of the present invention;
fig. 4 is a schematic flow chart of an AI segmentation model according to embodiment 1 of the present invention;
fig. 5 is a schematic view of an automatic splicing process provided in embodiment 1 of the present invention;
fig. 6 is a schematic diagram of fault screening model training provided in embodiment 1 of the present invention;
fig. 7 is a schematic diagram of fault screening model identification provided in embodiment 1 of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, but not all, embodiments of the present invention. The components of embodiments of the present invention generally described and illustrated in the figures herein may be arranged and designed in a wide variety of different configurations.
Thus, the following detailed description of the embodiments of the present invention, presented in the figures, is not intended to limit the scope of the invention, as claimed, but is merely representative of selected embodiments of the invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
In the description of the present invention, it should be further noted that unless otherwise explicitly stated or limited, the terms "disposed," "mounted," "connected," and "connected" should be interpreted broadly, and may be, for example, fixedly connected, detachably connected, or integrally connected; can be mechanically or electrically connected; they may be connected directly or indirectly through intervening media, or they may be interconnected between two elements. The specific meanings of the above terms in the present invention can be understood in specific cases to those skilled in the art.
Example 1
The following method embodiments are described with the unmanned aerial vehicle as the executing subject. It should be noted that, as shown in the hardware system framework diagram of fig. 2, the hardware system of the unmanned aerial vehicle in this embodiment consists of a gimbal, a lidar, an onboard computer, a camera, an image-transmission radio, a control end, and the unmanned aerial vehicle body;
the onboard computer receives the point cloud data sent by the lidar; receives state data sent by the unmanned aerial vehicle body and sends control instructions to the vehicle; receives the gimbal state and sends control commands to the gimbal; receives the camera's video stream and sends exposure-adjustment and photograph-control instructions to the camera; and sends computed images to the image-transmission radio. The image-transmission radio is connected bidirectionally to the unmanned aerial vehicle, exchanging control and state data; it also sends real-time images and state data to the control end and receives control data from the control end.
As shown in the flowchart of the blade inspection and diagnosis analysis method in fig. 1, the invention provides an intelligent inspection and diagnosis analysis method for the blades of a full-automatic wind turbine generator system, which may comprise the following steps:
s1, automatically generating a routing inspection path, wherein as shown in an automatic routing inspection schematic diagram of the unmanned aerial vehicle shown in FIG. 3, the unmanned aerial vehicle receives a control signal sent by a control end, and vertically flies upwards from the right front of a fan hub until flying to the position of the hub; the unmanned aerial vehicle reconstructs a model of the fan under the assistance of fusion calculation of vision and the laser radar, and automatically generates a preset routing inspection path, so that the routing inspection path is not required to be planned in advance, the workload can be reduced, and the automatic routing inspection error of the unmanned aerial vehicle can be reduced;
specifically, prepare before unmanned aerial vehicle takes off and need to inspect unmanned aerial vehicle self-checking state and hardware state to shut down the fan and be down "Y" font (+ -30 °), the blade is with the horizontal contained angle for (-30) to +/-30 °, deadly the fan wheel hub, and operating personnel places unmanned aerial vehicle in fan wheel hub dead ahead 3m flat air space region.
Optionally, the process of step S1 may be as follows: the operator presses the take-off button on the control end to send a take-off instruction; the unmanned aerial vehicle receives the control signal and enters the automatic height-measurement stage, flying vertically upwards directly in front of the fan hub while facing the fan and taking several pictures of the fan until it reaches the hub position. Through machine vision fused with the point cloud data sent by the lidar, the vehicle obtains the pixel coordinates of the fan's wind tower, hub centre and three blade tips from the pictures. The onboard computer calculates theoretical three-dimensional coordinates for those pixel coordinates, using the vehicle's position, the camera's three-dimensional coordinates (gimbal pose) at the moment of shooting, and the camera intrinsic parameters, and reconstructs a 3D model of the fan; it then outputs the key parameters (fan yaw direction, wind-wheel rotation angle, hub centre height, blade length, etc.), from which the unmanned aerial vehicle automatically generates the inspection path.
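The pixel-to-3D calculation described above can be realised with a standard pinhole back-projection, using the lidar-measured depth and the camera pose; the patent does not spell out its exact formulation, so this is one plausible sketch with assumed names:

```python
import numpy as np

def pixel_to_camera_point(u, v, depth, fx, fy, cx, cy):
    """Back-project a pixel (u, v) with a lidar-measured depth into a 3-D
    point in the camera frame, using pinhole intrinsics (fx, fy: focal
    lengths in pixels; cx, cy: principal point)."""
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return np.array([x, y, depth])

def camera_to_world(p_cam, R, t):
    """Transform a camera-frame point into the world frame given the camera
    pose (rotation matrix R, translation t) from the UAV/gimbal state."""
    return R @ p_cam + t
```

Applying this to the tower base, hub centre and the three blade tips yields the 3D key points from which parameters such as hub height and blade length follow directly.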
S2, confirmation of the operation starting point. The unmanned aerial vehicle automatically searches for the centre point of the hub, and the control end is used to check whether the vehicle's position deviates from the hub centre; if it does, the vehicle is fine-tuned through the control end, the operation starting point is confirmed through the control end, and step S3 is executed; if not, the operation starting point is confirmed through the control end and step S3 is executed;
optionally, the process of step S2 may be as follows: the unmanned aerial vehicle automatically searches for the centre point of the fan hub's air guide sleeve according to the hub-centre-height parameter output in step S1. The video collected by the camera is processed by the onboard computer and sent to the control end for real-time display; the operator determines from the display and the calibration frame shown on the control end whether the vehicle's position deviates from the hub centre, and if so adjusts the attitude of the gimbal and the camera carried on it with the fine-tuning buttons on the control-end screen until they are aligned with the hub centre. After confirming that there is no deviation, the operator presses the confirm button to set the operation starting point.
S3, automatic operation of the unmanned aerial vehicle. Starting from the operation starting point, with the flight distance, the high/medium/low flight-speed parameters automatically set according to the fan model, and the shooting frequency (by period or by distance) configured, the vehicle flies along the preset inspection path and obtains the blade foreground image from the collected blade foreground and background images using the deep-learning-based image semantic segmentation technique, so that it can automatically identify the blade and realise automatic inspection. In this embodiment, the vehicle obtains the blade foreground image by segmenting the blade images acquired by the camera; on the one hand this avoids the influence of the background on blade identification, improving identification precision and reducing the probability of yawing; on the other hand it also benefits the monitoring of the blade's safety condition and greatly reduces the manual workload of blade inspection.
Optionally, the process of step S3 may be: acquiring foreground and background images of the blades;
the blade foreground and background images may be obtained as follows: after the operation starting point is confirmed, the unmanned aerial vehicle acquires images of the first blade's leading-edge windward side from the blade root to the blade tip. The vehicle automatically flies about 5 metres along the blade-length direction, and the control end confirms whether the blade tip is still in its field of view; if so, it continues flying along the blade-length direction until the tip is no longer in view; if not, the vehicle can pass the blade tip safely. The vehicle automatically records the distance flown along the blade-length direction so that it can pass the blade tip on subsequent passes without confirmation from the control end. The vehicle then acquires the blade foreground and background images along the acquisition path: operation starting point, then leading-edge windward-side blade root to leading-edge windward-side blade tip, then trailing-edge windward-side blade tip to trailing-edge windward-side blade root, then trailing-edge leeward-side blade root, and so on around the remaining surfaces until the whole blade is covered with no dead angles. Finally, the vehicle automatically repeats this acquisition path to complete the foreground and background image acquisition for the remaining blades.
The process of segmenting the blade foreground image further comprises: as shown in the AI segmentation model flow diagram of fig. 4, labeling the foreground and background images of the blade to generate corresponding label images; segmenting the blade foreground and background images and the label images at the pixel level by an image semantic segmentation technique, to obtain segmented blade foreground and background images and segmented label images; then performing data enhancement to form a data set; building a semantic segmentation network model with a 60+ layer network; and training and optimizing the parameters of the semantic segmentation network model with the data set to obtain an optimal semantic segmentation network model; the unmanned aerial vehicle carrying the optimal semantic segmentation network model can thereby automatically identify the blade foreground and realize automatic inspection; wherein a system monitoring module and a safety monitoring module are provided on the unmanned aerial vehicle during automatic inspection, and when either module issues an abnormality alarm, the unmanned aerial vehicle triggers an emergency hover action, wherein the emergency hover action includes: receiving a one-key return command from the control end, or starting a manual take-over, so as to ensure the safe landing of the unmanned aerial vehicle.
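The pixel-level dataset preparation described above might look like the following minimal sketch; the patch size, the numpy array representation, and the flip-only augmentation are assumptions, since the patent does not specify them:

```python
import numpy as np

def make_patches(image, mask, patch=64):
    """Split an (H, W, C) image and its (H, W) label mask into aligned
    pixel-level patches, the raw material for the segmentation data set."""
    h, w = mask.shape
    pairs = []
    for y in range(0, h - patch + 1, patch):
        for x in range(0, w - patch + 1, patch):
            pairs.append((image[y:y+patch, x:x+patch],
                          mask[y:y+patch, x:x+patch]))
    return pairs

def augment(pairs):
    """Simple data enhancement: add a horizontally flipped copy of every
    (image patch, mask patch) pair, keeping image and label aligned."""
    out = list(pairs)
    for img, m in pairs:
        out.append((img[:, ::-1], m[:, ::-1]))
    return out
```

The resulting pairs would feed the training of the 60+ layer semantic segmentation network mentioned above.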
S4, image splicing: as shown in the automatic splicing flow diagram of FIG. 5, the unmanned aerial vehicle splices the blade foreground images segmented by the deep-learning-based image semantic segmentation technique into a complete path through a feature extraction and shape area entropy matching algorithm, so as to verify the integrity of the acquisition process, avoid missed detections caused by omitted regions, and improve the inspection precision;
optionally, the process of step S4 may be: inputting two adjacent images A and B from an image set formed by at least 20 blade foreground images on the same path, taking image A as the template image and image B as the image to be matched; acquiring the inclination angle alpha of the blades when the fan is stopped in an inverted-Y position with the blades at +/-30 degrees to the horizontal plane; moving the section edge part of image B through the search region of image A along the inclination angle alpha with the NCC algorithm, so as to perform template matching; determining the maximum similarity metric of the template matching from the difference summation, and preliminarily determining the positional relationship between image A and image B; checking this positional relationship with the feature extraction and shape area entropy matching algorithm to obtain the optimal positional relationship; splicing the overlapping region of image A and image B; and repeating the steps from S41 to S45 to automatically splice the image set into a complete path. According to this embodiment, the blade foreground images segmented by the image semantic segmentation technique can be automatically spliced to verify whether any regions were missed or any dead angles remain, other influencing factors can be eliminated, the inspection precision is improved, and the position and size of a defect on the blade can be directly located in combination with the laser radar technology.
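A minimal sketch of the NCC template-matching step is given below. The exhaustive slide over the whole search region is a simplification: per the description, the actual search is constrained along the inclination angle alpha, and the preliminary result is further checked by the shape area entropy matching algorithm. Function names are illustrative:

```python
import numpy as np

def ncc(a, b):
    """Normalized cross-correlation of two equally shaped grayscale patches."""
    a = a - a.mean()
    b = b - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return float((a * b).sum() / denom) if denom > 0 else 0.0

def match_template(search, template):
    """Slide `template` over `search`; return the (row, col) offset with the
    highest NCC score, i.e. the preliminary positional relationship."""
    sh, sw = search.shape
    th, tw = template.shape
    best, best_pos = -1.0, (0, 0)
    for y in range(sh - th + 1):
        for x in range(sw - tw + 1):
            score = ncc(search[y:y+th, x:x+tw], template)
            if score > best:
                best, best_pos = score, (y, x)
    return best_pos, best
```

Once the offset is known, the overlapping regions of the two images can be blended to extend the stitched path.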
S5, fault identification: the images acquired by the unmanned aerial vehicle are subjected to fault identification by a deep learning algorithm; images without faults are directly filtered out, the category and severity of images with faults are evaluated, and a fault identification report is generated and finally confirmed through the control end;
optionally, the process of step S5 may be: acquiring the foreground and background images of the blade; marking irregular faults on the blade foreground and background images at the pixel level with a brush tool, to generate pixel-level irregular fault images; meanwhile, based on the blade foreground/background segmentation, dividing the non-fault blocks and fault blocks of the blade into negative and positive samples to form a data set; building a binary classification model; training the binary classification model with the data set, as shown in the fault screening model training diagram of fig. 6, to obtain an optimal binary classification model; performing fault identification on the blade foreground and background images with the optimal binary classification model, as shown in the fault screening model identification diagram of fig. 7, and returning a fault identification result: if an image has no fault, the returned result is 0; if an image has a fault, the returned result is 1; directly filtering out the images without faults, and evaluating the category and severity of the images with faults; and generating a fault identification report that is finally confirmed through the control end.
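The 0/1 screening stage above can be sketched as follows, with `classify` standing in for the trained binary classification model:

```python
def screen_images(images, classify):
    """Split images into (faulty, filtered_out) using a 0/1 classifier.

    classify(img) returns 1 when a fault is present, 0 otherwise;
    fault-free images are filtered out immediately, and only the faulty
    ones proceed to category and severity evaluation.
    """
    faulty, filtered = [], []
    for img in images:
        if classify(img) == 1:
            faulty.append(img)    # kept for category/severity grading
        else:
            filtered.append(img)  # result 0: discarded immediately
    return faulty, filtered
```

This keeps the expensive evaluation and reporting stage focused on the small subset of images that actually show damage.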
And S6, carrying out blade digital management, uploading the images with faults acquired by the unmanned aerial vehicle to a cloud platform through a control end, carrying out classified storage, and carrying out digital processing on the images.
Optionally, the process of step S6 may be: the images acquired by the unmanned aerial vehicle are uploaded to the cloud platform through the control end, and the cloud platform stores the uploaded images by blade region and digitally processes them, so as to facilitate operations such as defect marking, defect measurement and report export, realizing full life cycle management of the blade.
Optionally, a relocation module is provided on the unmanned aerial vehicle; when the blade is lost during automatic inspection, the relocation module starts automatically, acquires and tracks the blade position by scanning up and down, and the automatic inspection task continues after relocation; even if the unmanned aerial vehicle yaws, the relocation module enables it to return home.
Optionally, in order to guarantee the quality of the images shot by the camera, an automatic-exposure camera is selected on the basis of the above embodiment, so that during automatic inspection the lighting conditions of the blade are automatically identified and reasonable exposure parameters are automatically set, making the captured pictures sharper.
The above is only a preferred embodiment of the present invention, and is not intended to limit the present invention, and various modifications and changes will occur to those skilled in the art. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention should be included in the protection scope of the present invention.
Claims (9)
1. An intelligent inspection and diagnosis analysis method for blades of a full-automatic wind generating set is characterized by comprising the following steps:
s1, automatically generating a routing inspection path, and enabling an unmanned aerial vehicle to receive a control signal sent by a control end and fly vertically upwards from the front of a fan hub until the unmanned aerial vehicle flies to the position of the hub; the unmanned aerial vehicle reconstructs a model of the fan under the assistance of fusion calculation of vision and the laser radar and automatically generates a preset routing inspection path;
s2, confirming an operation starting point, automatically searching a central point of a hub by the unmanned aerial vehicle, confirming whether the positions of the unmanned aerial vehicle and the central point of the hub have deviation through a control end, fine-tuning the unmanned aerial vehicle through the control end if the positions of the unmanned aerial vehicle and the central point of the hub have deviation, confirming the operation starting point of the unmanned aerial vehicle through the control end, and executing a step S3; if not, the unmanned aerial vehicle executes the step S3 after confirming the operation starting point through the control end;
s3, unmanned aerial vehicle automatic operation, wherein when the unmanned aerial vehicle flies along the preset inspection path from an operation starting point, the unmanned aerial vehicle obtains a blade foreground image according to the collected blade foreground and background images and an image semantic segmentation technology based on deep learning, so that the unmanned aerial vehicle can automatically identify the blade to realize automatic inspection;
s4, image splicing, namely splicing the blade foreground images segmented by the image semantic segmentation technology based on deep learning into a complete path by the unmanned aerial vehicle through a feature extraction and shape area entropy matching algorithm so as to verify the integrity of the acquisition process;
s5, fault recognition is carried out, the images acquired by the unmanned aerial vehicle are subjected to fault recognition through a deep learning algorithm, the images without faults are directly filtered, the classes and the severity of the images with faults are evaluated, and a fault recognition report is generated to be finally confirmed through the control end;
and S6, carrying out blade digital management, uploading the images with faults acquired by the unmanned aerial vehicle to a cloud platform through a control end, carrying out classified storage, and carrying out digital processing on the images.
2. The intelligent inspection and diagnosis analysis method for blades of a full-automatic wind generating set according to claim 1,
the method for automatically generating the routing inspection path in the step S1 comprises the following steps:
s11, shooting a plurality of pictures of the fan by the unmanned aerial vehicle in the vertical upward flying process;
s12, obtaining pixel coordinates of a fan wind tower, the center of a fan hub and the tips of three blades from the picture through fusion calculation of machine vision and a laser radar;
s13, calculating theoretical three-dimensional coordinates for the pixel coordinates by using the position and attitude of the unmanned aerial vehicle, the three-dimensional coordinates of the camera during shooting and the camera intrinsic parameters, and reconstructing a 3D model of the fan;
and S14, outputting key parameters of the fan, such as yaw direction, fan wind wheel rotation angle, hub center height and blade length, and automatically generating a routing inspection path by the unmanned aerial vehicle according to the key parameters.
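Under a pinhole-camera assumption, the back-projection in steps S12 and S13 (a pixel coordinate, a range from the laser radar, and the camera intrinsics yielding a 3D point) might be sketched as follows; the symbols fx, fy, cx, cy and the function names are illustrative, not specified in the claim:

```python
import numpy as np

def pixel_to_camera(u, v, depth, fx, fy, cx, cy):
    """Pinhole back-projection: pixel (u, v) at lidar range `depth`
    -> a 3D point (X, Y, Z) in the camera frame."""
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return np.array([x, y, depth])

def camera_to_world(p_cam, R, t):
    """Rotate/translate a camera-frame point into the drone/fan frame,
    using the drone's pose (rotation R, translation t) at shooting time."""
    return R @ p_cam + t
```

Applying this to the tower, hub-center and tip pixels would give the 3D anchor points from which the fan model and the inspection path are derived.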
3. The intelligent inspection and diagnosis analysis method for blades of a full-automatic wind generating set according to claim 2,
the unmanned aerial vehicle automatic operation method of step S3 includes:
s31, obtaining foreground and background images of the blade;
s32, labeling the foreground and background images of the blade to generate corresponding label images;
s33, segmenting the foreground and background images and the label images of the blades at a pixel level by adopting an image semantic segmentation technology to obtain segmented foreground and background images of the blades and segmented label images; then, data enhancement is carried out to form a data set;
s34, building a semantic segmentation network model of a 60+ layer network;
s35, training and optimizing parameters of the semantic segmentation network model by using the data set to obtain an optimal semantic segmentation network model; therefore, the unmanned aerial vehicle carrying the optimal semantic segmentation network model can automatically identify the blade foreground, and automatic inspection is achieved.
4. The intelligent inspection and diagnosis analysis method for blades of a full-automatic wind generating set according to claim 3,
the method for acquiring the foreground and background images of the blade in the step S31 comprises the following steps:
s311, after the operation starting point is confirmed, the unmanned aerial vehicle collects along the front edge windward side of the first blade to the blade tip of the front edge windward side;
s312, the unmanned aerial vehicle automatically flies about 5 meters along the blade length direction, and the control end confirms whether a blade tip remains in the unmanned aerial vehicle's field of view; if so, the unmanned aerial vehicle continues flying along the blade length direction until no blade tip remains in its field of view; if not, the unmanned aerial vehicle can safely pass the blade tip; the unmanned aerial vehicle automatically records the distance flown along the blade length direction, so that it can pass the blade tip next time without confirmation by the control end;
s313, the unmanned aerial vehicle acquires the foreground and background images of the blade along the collection path and automatically returns to the operation starting point after collecting to the blade root of the front edge leeward side;
s314, the unmanned aerial vehicle automatically repeats steps S311 to S313 to complete the blade foreground and background image acquisition for the remaining blades, and automatically returns home after the acquisition is finished.
5. The intelligent inspection and diagnosis analysis method for blades of a full-automatic wind generating set according to claim 4,
the image stitching method of step S4 includes:
s41, inputting an image A and an image B which are adjacent in an image set and are formed by at least 20 leaf foreground images under the same path, taking the image A as a template image and taking the image B as an image to be matched;
s42, acquiring the inclination angle alpha of the blade when the fan is stopped and is in an inverted Y shape and the included angle between the blade and the horizontal plane is +/-30 degrees;
s43, moving the section edge part of the image B in the retrieval area of the image A according to the inclination angle alpha through an NCC algorithm to perform template matching;
s44, determining a maximum similarity metric value matched with the template according to difference summation, and preliminarily determining the position relation of the image A and the image B;
s45, detecting and judging the position relation through a feature extraction and shape area entropy matching algorithm to obtain the optimal position relation;
s46, splicing the overlapped area of the image A and the image B;
s47, repeating the steps from S41 to S46, and automatically splicing the image set into a complete path.
6. The intelligent inspection and diagnosis analysis method for blades of a full-automatic wind generating set according to claim 5,
the fault identification method of step S5 includes:
s51, obtaining foreground and background images of the blade;
s52, marking the irregular fault from the foreground and background images of the blade at a pixel level through a brush to generate a pixel-level irregular fault image; meanwhile, based on the foreground and background segmentation of the blade, dividing a non-fault block and a fault block of the blade into negative and positive samples to form a data set;
s53, building a two-classification model;
s54, training the two classification models by using the data set to obtain the optimal two classification models;
s55, utilizing an optimal two-classification model to perform fault identification on the foreground and background images of the blade, directly filtering the images without faults, and evaluating the classification and severity of the images with faults;
and generating a fault identification report and finally confirming the fault identification report through the control terminal.
7. The intelligent inspection and diagnosis analysis method for blades of a full-automatic wind generating set according to claim 6,
the blade digital management method of the step S6 comprises the following steps:
the images acquired by the unmanned aerial vehicle are uploaded to the cloud platform through the control end, and the cloud platform stores the uploaded images by blade region and digitally processes them, so as to facilitate operations such as defect marking, defect measurement and report export, realizing full life cycle management of the blade.
8. The intelligent inspection and diagnosis analysis method for blades of a full-automatic wind generating set according to claim 7,
a relocation module is provided on the unmanned aerial vehicle; when the blade is lost during automatic inspection, the relocation module starts automatically, acquires and tracks the blade position by scanning up and down, and the automatic inspection task continues after relocation.
9. The intelligent inspection and diagnosis analysis method for blades of a full-automatic wind generating set according to claim 8,
a system monitoring module and a safety monitoring module are provided on the unmanned aerial vehicle during automatic inspection; when either the system monitoring module or the safety monitoring module issues an abnormality alarm, the unmanned aerial vehicle triggers an emergency hover action, wherein the emergency hover action comprises: receiving a one-key return command from the control end or starting a manual take-over, so as to ensure the safe landing of the unmanned aerial vehicle.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202011139483.5A CN112360699A (en) | 2020-10-22 | 2020-10-22 | Intelligent inspection and diagnosis analysis method for blades of full-automatic wind generating set |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202011139483.5A CN112360699A (en) | 2020-10-22 | 2020-10-22 | Intelligent inspection and diagnosis analysis method for blades of full-automatic wind generating set |
Publications (1)
Publication Number | Publication Date |
---|---|
CN112360699A true CN112360699A (en) | 2021-02-12 |
Family
ID=74511593
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202011139483.5A Pending CN112360699A (en) | 2020-10-22 | 2020-10-22 | Intelligent inspection and diagnosis analysis method for blades of full-automatic wind generating set |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN112360699A (en) |
Citations (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20150043769A1 (en) * | 2013-03-15 | 2015-02-12 | Digital Wind Systems, Inc. | Method and apparatus for remote feature measurement in distorted images |
US20150278224A1 (en) * | 2013-12-12 | 2015-10-01 | Nant Holdings Ip, Llc | Image Recognition Verification |
CN105869120A (en) * | 2016-06-16 | 2016-08-17 | 哈尔滨工程大学 | Image stitching real-time performance optimization method |
US20170011520A1 (en) * | 2015-07-09 | 2017-01-12 | Texas Instruments Incorporated | Window grouping and tracking for fast object detection |
CN106683040A (en) * | 2016-11-21 | 2017-05-17 | 云南电网有限责任公司电力科学研究院 | NCC algorithm based infrared panoramic image splicing method |
CN108915959A (en) * | 2018-06-27 | 2018-11-30 | 上海扩博智能技术有限公司 | By unmanned plane to blower tip region detour detection method and system |
CN110282143A (en) * | 2019-06-14 | 2019-09-27 | 中国能源建设集团广东省电力设计研究院有限公司 | A kind of marine wind electric field unmanned plane method for inspecting |
CN110554704A (en) * | 2019-08-15 | 2019-12-10 | 成都优艾维智能科技有限责任公司 | unmanned aerial vehicle-based fan blade autonomous inspection method |
KR20200033822A (en) * | 2018-03-15 | 2020-03-30 | (주)니어스랩 | Apparatus and Method for Detecting/Analyzing Defect of Windturbine Blade |
CN111259809A (en) * | 2020-01-17 | 2020-06-09 | 五邑大学 | Unmanned aerial vehicle coastline floating garbage inspection system based on DANet |
CN111259898A (en) * | 2020-01-08 | 2020-06-09 | 西安电子科技大学 | Crop segmentation method based on unmanned aerial vehicle aerial image |
WO2020163455A1 (en) * | 2019-02-05 | 2020-08-13 | Urugus S.A. | Automatic optimization of machine learning algorithms in the presence of target datasets |
Non-Patent Citations (2)
Title |
---|
AKITO TAKEKI ET AL.: "DETECTION OF SMALL BIRDS IN LARGE IMAGES BY COMBINING A DEEP DETECTOR WITH SEMANTIC SEGMENTATION", 《2016 IEEE INTERNATIONAL CONFERENCE ON IMAGE PROCESSING》 * |
王丽娟: "基于深度卷积神经网络的绝缘子故障检测算法研究", 《中国优秀硕士学位论文全文数据库 工程科技Ⅱ辑》 * |
Cited By (21)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
TWI852003B (en) * | 2021-05-17 | 2024-08-11 | 日商日立電力解決方案股份有限公司 | Structure display device and structure display method |
CN113107788A (en) * | 2021-05-27 | 2021-07-13 | 上海扩博智能技术有限公司 | Blade inspection method based on pure vision |
CN113323815A (en) * | 2021-05-28 | 2021-08-31 | 上海扩博智能技术有限公司 | Handheld fan blade inspection equipment |
CN113339206A (en) * | 2021-06-10 | 2021-09-03 | 槃汩工业技术(岳阳)有限公司 | Unmanned aerial vehicle wind power inspection method and unmanned aerial vehicle |
CN113723192A (en) * | 2021-07-30 | 2021-11-30 | 鹏城实验室 | Blade image acquisition method in running state of fan |
CN113764996A (en) * | 2021-07-30 | 2021-12-07 | 华能大理风力发电有限公司 | Method and device for judging state of power distribution cabinet pressure plate of booster station in non-contact manner |
CN113764996B (en) * | 2021-07-30 | 2024-05-03 | 华能大理风力发电有限公司 | Method and device for judging state of power distribution cabinet pressing plate of booster station in non-contact mode |
CN114021906A (en) * | 2021-10-19 | 2022-02-08 | 广东邦鑫数据科技股份有限公司 | Unattended wind power generation operation and maintenance management method and system |
CN113960068A (en) * | 2021-11-23 | 2022-01-21 | 北京华能新锐控制技术有限公司 | Wind power blade damage detection method |
CN114564031A (en) * | 2022-01-25 | 2022-05-31 | 西安因诺航空科技有限公司 | Path planning method for realizing fan inspection shooting based on rotor unmanned aerial vehicle |
CN114439702A (en) * | 2022-01-28 | 2022-05-06 | 华能盐城大丰新能源发电有限责任公司 | Blade state monitoring method and device of wind driven generator |
CN114639025A (en) * | 2022-03-11 | 2022-06-17 | 湖南科技大学 | Unmanned aerial vehicle assisted lower wind turbine blade maintenance method and device |
CN114639025B (en) * | 2022-03-11 | 2024-08-27 | 湖南科技大学 | Wind turbine blade overhauling method and device under assistance of unmanned aerial vehicle |
CN114967741A (en) * | 2022-05-27 | 2022-08-30 | 国能定边新能源有限公司 | Path planning method and device for unmanned aerial vehicle automatic inspection fan and storage medium |
CN115078381B (en) * | 2022-06-15 | 2024-05-31 | 智冠华高科技(大连)有限公司 | Online wind driven generator blade damage detection method based on biaxial holder |
CN115078381A (en) * | 2022-06-15 | 2022-09-20 | 智冠华高科技(大连)有限公司 | Wind driven generator blade damage online detection method based on two-axis holder |
CN115661970A (en) * | 2022-12-26 | 2023-01-31 | 海外远景(北京)科技有限公司 | Wind power equipment inspection system based on image recognition technology |
CN116501091B (en) * | 2023-06-27 | 2023-11-07 | 珠海优特电力科技股份有限公司 | Fan inspection control method and device based on unmanned aerial vehicle automatic adjustment route |
CN116501091A (en) * | 2023-06-27 | 2023-07-28 | 珠海优特电力科技股份有限公司 | Fan inspection control method and device based on unmanned aerial vehicle automatic adjustment route |
CN117514646A (en) * | 2023-11-22 | 2024-02-06 | 辽宁高比科技有限公司 | Dynamic inspection analysis method and system for ground type fan blade |
CN117514646B (en) * | 2023-11-22 | 2024-06-07 | 辽宁高比科技有限公司 | Dynamic inspection analysis method and system for ground type fan blade |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN112360699A (en) | Intelligent inspection and diagnosis analysis method for blades of full-automatic wind generating set | |
CN108416294B (en) | Fan blade fault intelligent identification method based on deep learning | |
CN112884931B (en) | Unmanned aerial vehicle inspection method and system for transformer substation | |
CN110282143B (en) | Inspection method for offshore wind farm unmanned aerial vehicle | |
CN111198004A (en) | Electric power inspection information acquisition system based on unmanned aerial vehicle | |
CN110703800A (en) | Unmanned aerial vehicle-based intelligent identification method and system for electric power facilities | |
CN109060826B (en) | Wind-powered electricity generation blade detection device that does not shut down | |
CN113610749B (en) | Fan blade defect detection method based on neural network | |
CN111952883B (en) | Power transmission line fault recognition system and method based on three-dimensional laser radar | |
WO2024040566A1 (en) | Transformer substation intelligent inspection system and method based on image recognition | |
CN112506214B (en) | Operation flow of unmanned aerial vehicle autonomous fan inspection system | |
CN113406107B (en) | Fan blade defect detection system | |
CN111038721A (en) | Wind turbine blade inspection unmanned aerial vehicle and inspection method based on image recognition | |
CN112950634A (en) | Method, equipment and system for identifying damage of wind turbine blade based on unmanned aerial vehicle routing inspection | |
CN112947511A (en) | Method for inspecting fan blade by unmanned aerial vehicle | |
CN112455676A (en) | Intelligent monitoring and analyzing system and method for health state of photovoltaic panel | |
CN113762183A (en) | Intelligent checking and analyzing system for existing building safety and operation method | |
CN116501091B (en) | Fan inspection control method and device based on unmanned aerial vehicle automatic adjustment route | |
CN110967600A (en) | Composite insulator degradation diagnosis method based on unmanned aerial vehicle infrared detection | |
CN111708380B (en) | Wind turbine generator appearance defect detection method, platform, unmanned aerial vehicle and system | |
CN112801432B (en) | Intelligent inspection system for fan set blades and inspection method for fan set blades | |
CN112085694A (en) | Artificial intelligence automatic inspection wind energy fan blade system | |
CN117514646B (en) | Dynamic inspection analysis method and system for ground type fan blade | |
WO2022117616A1 (en) | Unmanned airborne visual diagnosis of an operating wind turbine generator | |
CN116906276A (en) | Intelligent inspection method for fan blade |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
RJ01 | Rejection of invention patent application after publication | ||
RJ01 | Rejection of invention patent application after publication |
Application publication date: 20210212 |