1. Introduction
The transition of the energy sector from fossil-based energy carriers to renewable energy sources is one of the biggest challenges of the 21st century. The proportion of renewable energy in power distribution networks is continuously increasing [1,2]. However, renewable energies are not continuously available but depend on environmental weather conditions such as wind and sun [3]. This leads to challenging requirements with respect to network stability, the individual control of the energy feed-in from renewables, and the monitoring of energy facilities and infrastructure [4,5]. To overcome these challenges, control systems apply the latest digitalization technologies available, i.e., the industrial Internet of Things (IoT) and Industry 4.0, to enable smart and adaptable load configurations. These can take into account information about the surroundings, the environment, and other impact factors. As huge amounts of data are handled, the control systems have to be on par with the latest digitalization technology to allow for overall interconnection and mutual harmonization using internet and cloud technologies. Automated cloud-centric condition monitoring systems for power generators or switchgear can contribute to this demanding challenge.
Optical fiber sensors (OFSs) offer distinctive advantages in harsh environments, as they are chemically inert, non-conductive, and flexible in size and setup [6]. They are especially well suited for sensing in high-power environments because of their inherent immunity to electromagnetic interference (EMI) [7]. To monitor the power infrastructure, e.g., power generators or switchgear, ideally, distributed optical fiber sensing would be combined with point sensing to supervise both the environment and so-called critical hot spots [8]. Fiber Bragg gratings (FBGs) are well-established point sensors for temperature and strain sensing [9]. With interrogator systems based on optical time or frequency domain reflectometry (OTDR or OFDR), it is possible to read out hundreds to thousands of closely spaced FBGs [10]. Thus, FBG sensor arrays with many measurement points can provide quasi-distributed sensing for energy applications. Especially in incoherent OFDR, the measurement range and spatial resolution of the interrogator system are determined by the electrical modulation parameters and are thus easily adaptable to the requirements of the FBG sensor arrays. However, interrogators for optical fiber sensors come with disadvantages with regard to IoT connectivity: OFS systems are often standalone solutions that do not communicate with control systems or other IoT devices and participants. In addition, fiber sensors are truly passive devices and cannot identify themselves.
In this paper, we present a solution for a digitalized optical sensor network for intelligent facility monitoring in the energy sector. The interrogator system is based on bidirectional iOFDR (biOFDR) and is able to identify the connected sensor fibers by an all-optical fiber identification marker inside the fiber. This is a central part of the concept, allowing for continuous identification and tracking of process data from the point of origin through the data processing up to the cloud level. In addition, it enables the straightforward deduction of auxiliary information, ranging from simple statistical values from soft sensors up to more sophisticated value compensation by digital twin technology. Communication of the proposed system is enabled by the Open Platform Communications Unified Architecture (OPC UA) standard, which is one of the most important communication protocols in IoT and Industry 4.0 [11]. Cloud connectivity is provided by the industrial IoT platform Insights Hub (formerly known as MindSphere) established by Siemens. Sensor data fusion from different sources is realized while respecting minimum standards for data quality to increase the integrity and reliability of processed operational characteristics. The digitalized optical sensor network was developed in the joint research project DigiMonet, and a demonstration of the system's functionality was performed during a field test at the power plant test facility of Siemens Energy AG in Mülheim, Germany.
This paper is organized as follows. Section 2 gives an overview of the developed bidirectional iOFDR measurement technology and its implementation. Section 3 describes the manufacturing and realization of the used fiber Bragg identification markers. Section 4 gives an overview of the network topology of the digital sensor network architecture and the functionality of the implemented OPC UA microservices and the cloud onboarding services. The field test is described in Section 5. Section 6 concludes this paper with a discussion and summary of the results.
2. Optical Measurement Technology
The physical basis of the presented optical sensor network is an FBG interrogation setup based on incoherent optical frequency domain reflectometry (iOFDR) [12,13]. Compared to classical iOFDR systems, our approach enables homodyne optical down-conversion and thus works without a vector network analyzer (VNA), offering a cheaper alternative to conventional iOFDR systems [13]. An electro-optical Mach–Zehnder modulator is employed in bidirectional operation for amplitude modulation with a stepped RF frequency for both modulation and demodulation [14]. The configuration of the optical setup enables the interrogation of arrays with FBGs at different Bragg wavelengths (wavelength division multiplexing, WDM) as well as the spatial resolution of FBGs with the same Bragg wavelength in one sensor fiber (time division multiplexing, TDM) [14]. This facilitates the interrogation of a high number of measurement points (FBGs) in a single fiber. The experimental setup and the concept of the fiber identification are depicted in Figure 1a and described in detail in the following. Figure 1b shows an image of the portable demonstrator of the biOFDR setup.
To achieve wavelength resolution, broadband light in the wavelength range from 1525 nm to 1575 nm from the amplified spontaneous emission (ASE) of an erbium-doped fiber amplifier (EDFA, EDFA100S from Thorlabs, Bergkirchen, Germany) is used. A fiber-coupled optical circulator (model 6015-3-APC from Thorlabs, Bergkirchen, Germany) directs the broadband light to a Mach–Zehnder modulator (MZM, IOAP-MOD 9201 from SDL, San Jose, CA, USA). A signal generator (SG, SML01 from Rohde & Schwarz, Munich, Germany) supplies the electrical RF signal that drives the MZM. The RF signal with frequency $f$ is stepped over an RF frequency bandwidth $f_\mathrm{BW}$ in steps of $\delta f$, imprinting an amplitude modulation on the optical carrier. The amplitude modulation of the laser power $P_\mathrm{mod}(t)$ over time $t$ is described according to [13] by:

$$P_\mathrm{mod}(t) = P_0\, a_\mathrm{mod} \left[ 1 + m \cos(2\pi f t) \right],\qquad (1)$$

where $P_0$ is the optical pump power, $a_\mathrm{mod}$ the combined insertion and bias loss of the modulator, and $m$ the modulation index.
The amplitude-modulated light then passes through a polarization scrambler (PS, PolaMIX PCD-104 from General Photonics, Chino, CA, USA), which provides uniformly distributed polarization states. This step is important because the fiber is not polarization-maintaining, the MZM is polarization-sensitive, and the MZM is additionally used in the backward direction for optical re-modulation with the same RF frequency. The light is then guided to the sensor fiber with the FBG point sensors. The sensor response to the measurand imprints amplitude and phase changes along the fiber onto the modulated light, which can be described by the transfer function:

$$H(f) = |H(f)|\, e^{\,j \varphi(f)},\qquad (2)$$

where $|H(f)|$ denotes the magnitude and $\varphi(f)$ the phase.
Subsequently, the back-reflected light with power $P_\mathrm{refl}(t)$ carries the sensors' information:

$$P_\mathrm{refl}(t) = a_\mathrm{DC}\, a_\mathrm{mod}\, P_0 \left[ 1 + m\, |H(f)| \cos\big(2\pi f t + \varphi(f)\big) \right],\qquad (3)$$

where $a_\mathrm{DC}$ is the DC return loss of the fiber. The light is then demodulated by the same modulator and RF frequency. The power of the demodulated signal can be expressed by:

$$P_\mathrm{demod}(t) = a_\mathrm{mod} \left[ 1 + m \cos(2\pi f t) \right] P_\mathrm{refl}(t).\qquad (4)$$
The demodulated light is then detected by a spectrometer (I-MON USB 256 from Ibsen Photonics, Farum, Denmark), and one spectrometer measurement is taken for each RF modulation frequency step. As the spectrometer exhibits a low-pass characteristic, only the DC component $\bar{P}_\mathrm{demod}$ of $P_\mathrm{demod}(t)$ is detected over $t$. Inserting Equation (3) in Equation (4) and observing the DC component yields:

$$\bar{P}_\mathrm{demod}(f) = a_\mathrm{mod}^{2}\, a_\mathrm{DC}\, P_0 \left[ 1 + \frac{m^{2}}{2}\, \mathrm{Re}\{H(f)\} \right].\qquad (5)$$

Thus, with this method, the real part of the transfer function, $\mathrm{Re}\{H(f)\}$, of the sensor fiber is recorded in the frequency domain [13]. To obtain the backscattering trace in the time domain, the data are subjected to a Fourier transform. Since both the spectrum and the impulse response are detected simultaneously, the described bidirectional iOFDR setup enables the attainment of both spatial and wavelength resolution of the sensor fiber.
In iOFDR, the spatial distance $z$ is derived from the time–frequency correspondence of the Fourier transform as $z = c\,t/2$, with the speed of light in the fiber $c = c_0/n_\mathrm{eff}$, where $c_0$ is the speed of light in vacuum and $n_\mathrm{eff}$ is the effective refractive index of the fiber [13]. The factor ½ accounts for the round-trip propagation through the sensor fiber. The characteristic values for the spatial resolution (i.e., two-point resolution) and the maximum unambiguous range of iOFDR systems are $\Delta z = c/(2 f_\mathrm{BW})$ and $z_\mathrm{max} = c/(4\,\delta f)$, respectively. The factor of 4 takes into account that only real values are detected by homodyne down-conversion. In this study, the modulation bandwidth $f_\mathrm{BW}$ is 1.1 GHz, and the minimum step size $\delta f$ is 9 kHz. Therefore, we achieve a maximum measurement range of $z_\mathrm{max} \approx 5.75$ km and a two-point resolution of $\Delta z \approx 9.4$ cm with our bidirectional iOFDR. In general, the bidirectional usage of the MZM limits the maximum modulation bandwidth for this setup [14].
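To make these relations concrete, the following minimal Python sketch computes the range and resolution figures from the stated modulation parameters and converts a recorded frequency-domain trace to the spatial domain. The effective refractive index and the measured data are assumptions (placeholders), not values from the actual system.

```python
import numpy as np

# Parameters from the text; n_eff is a typical value for standard
# single-mode fiber at 1550 nm (assumed, not from the paper).
c0 = 299_792_458.0      # speed of light in vacuum [m/s]
n_eff = 1.45            # effective refractive index (assumption)
f_bw = 1.1e9            # RF modulation bandwidth [Hz]
delta_f = 9e3           # minimum RF frequency step [Hz]

c = c0 / n_eff                  # speed of light in the fiber
dz = c / (2 * f_bw)             # two-point resolution -> ~9.4 cm
z_max = c / (4 * delta_f)       # unambiguous range    -> ~5.7 km
print(f"resolution: {dz*100:.1f} cm, range: {z_max/1e3:.2f} km")

# Hypothetical samples of Re{H(f)} at f = 0, delta_f, 2*delta_f, ...
re_H = np.random.default_rng(0).normal(size=1024)   # placeholder data

# Real-valued frequency-domain data -> backscatter trace via inverse FFT.
trace = np.fft.irfft(re_H)
t = np.arange(trace.size) / (trace.size * delta_f)  # time axis [s]
z = c * t / 2                                       # spatial axis, z = c*t/2
```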
The implementation of the signal processing is conducted in MATLAB on a measurement laptop. A graphical user interface has been developed for the control of the iOFDR system. Additionally, the measurement laptop functions as an OPC UA client, establishing a connection between the bidirectional iOFDR and the server. Along with the optical fiber identification, the OPC UA connectivity represents a key feature of the presented system, as it enables the integration of an optical sensor with an embedded identification feature into a comprehensive monitoring system. This will be discussed in Section 3 and Section 4, respectively.
3. Optical Fiber Identification
In contrast to electrical sensors, optical fiber sensors lack the ability to self-identify because of their passive nature. To adapt optical fiber sensors for IoT and Industry 4.0 applications, we investigate a fiber identification method based on an all-optical marker. This marker utilizes a unique combination of weakly reflecting fiber Bragg gratings. Specifically, we employ fiber identification markers (ID markers) consisting of three FBGs with a specific combination of position and Bragg wavelength. These identification markers are inscribed near the connector of the sensor fiber and are read out with the proposed biOFDR.
The wavelength and position information of all ID markers is stored in a database. When a sensor fiber is connected to the interrogator, the ID marker is first read out and compared with the entries in the database by a correlation-based algorithm to find the matching one. To allow for slight temperature changes, margins for the Bragg wavelengths are taken into account. To obtain the correct position, a calibration measurement is performed to define a reference plane [13,14].
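As an illustration of this lookup, the following Python sketch matches a read-out marker against database entries within given wavelength and position margins. It is a simplified stand-in for the correlation-based algorithm; the margins, fiber ID, and reference values are hypothetical.

```python
# Simplified marker lookup: the measured marker (three FBGs, each with a
# position and a Bragg wavelength) is compared against database entries.
# Wavelength margins absorb small temperature drifts; positions are
# referenced to the calibrated reference plane.

WL_MARGIN_NM = 0.5   # assumed margin for thermal wavelength shifts
POS_MARGIN_M = 0.05  # assumed positional tolerance after calibration

def match_marker(measured, database):
    """measured: [(position_m, wavelength_nm), ...] of the three ID FBGs.
    database: {fiber_id: [(position_m, wavelength_nm), ...]}."""
    for fiber_id, reference in database.items():
        if len(reference) != len(measured):
            continue
        if all(abs(pm - pr) <= POS_MARGIN_M and abs(wm - wr) <= WL_MARGIN_NM
               for (pm, wm), (pr, wr) in zip(measured, reference)):
            return fiber_id
    return None

# Hypothetical database entry and measurement:
db = {"fiber_01": [(0.2, 1545.0), (0.4, 1550.0), (0.6, 1555.0)]}
print(match_marker([(0.21, 1545.1), (0.40, 1550.2), (0.61, 1554.9)], db))
```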
An example of the FBG-based fiber ID marker is shown in Figure 2a. Here, the marker 1-1-1 is shown over wavelength and position. Since each ID is defined by the positions and Bragg wavelengths of its fiber Bragg gratings, we have two degrees of freedom. Thus, when choosing three out of five wavelengths in a specific order (position), 20 individual combinations are possible. The created fiber ID markers are shown in Figure 2b, where the ID markers applied during the field test are highlighted in blue and red. Each of these ID markers is connected to one sensor fiber such that the information on the sensor fiber can be fetched directly from the OPC UA server, as described in detail in Section 4.
To enable the identification of additional fibers, truly unique patterns can be inscribed into the fiber. This capability is facilitated by the femtosecond (fs) direct inscription method used for fabricating the fiber Bragg gratings. The presented optical identification approach enables the digitalization of the optical sensor network, as the sensors can be recognized by software, allowing the corresponding calibration parameters to be retrieved from the OPC UA server. The process of digitalizing the optical sensor network is described in the following section.
4. Digital Sensor Network
4.1. Sensor Network Architecture
In this section, we present our solution for an automated condition monitoring system based on the iOFDR optical fiber sensor system. The optical system becomes a digital sensor network by the interplay of OPC UA-based microservices (µ-services) from the local to the cloud level [11,15]. The architecture of the proposed digital sensor network is depicted in Figure 3 and explained in the following, starting from the biOFDR system.
The fiber optic sensor arrays are connected to the DigiMonet biOFDR system, which is operated via the MATLAB user interface (implemented in MATLAB R2021b). The sensor arrays consist of multiple FBG sensor arrays and their corresponding identification markers, as shown in Figure 2. The optical system generates measurement files in a specific data format, including the measured temperature together with the complete spectral and local position and identification information. Next, the measurement files are transmitted via file transfer to the main OPC UA data server on a connected network drive. This simple yet powerful solution was selected to allow for the independent development of the bidirectional iOFDR measurement system on the one hand and the OPC UA network services on the other hand. The hardware configuration parameters for the optical measurement are provided by the OPC UA CONFIG (configuration) server, as shown by the grey connections in Figure 3. Python was used to execute embedded OPC UA client calls from within the MATLAB control software to allow for the subscription and application of configuration parameters provided by OPC UA nodes of the CONFIG server. With the identification information embedded in the fiber sensor array, a unique link can be established to corresponding parameters in the OPC UA node structure. According to this link, all related individual calibration information can be derived from the OPC UA DCS (digital calibration sheet) server. The advantage of this approach is that all associated fiber and calibration parameters are automatically retrievable as soon as a fiber array is connected to the measurement system. Hence, errors typically associated with manual onboarding, especially for a large number of fiber sensor arrays, are eliminated. Data safety is realized by copying the original measurement data files to a connected backup drive, while processed measurement data can be additionally secured by the data server itself (server backup).
With the measured data now available on the OPC UA data server, several OPC UA-based µ-services with integrated client/server functionality can subscribe to published values on the data server and to related value properties. These can include, for example, the SI unit, the number of valid digits, or the address of the service where the value originates from. Daisy-chained µ-services append their own address to that of the predecessor. Hence, all services that modify the values are traceable. If necessary, the services have access to the DCS and CONFIG servers to subscribe to necessary calibration parameters or to post changes in control parameters, which can be used to reconfigure the iOFDR measurement system for the next measurements. When many µ-services are running, it is beneficial to collect relevant values on a central server before pushing them to the cloud level. This functionality is embedded in the optional OPC UA EXIT server. EXIT servers are used in the industrial IoT to move data from a local network to the cloud. In our solution, the EXIT server has the additional function of comparing processed values against individual limit values defined in the configuration file provided by the CONFIG server. The triggering of events upon the violation of a value limit is a further function of the EXIT server and can be monitored by Python-based event clients or any suitable commercially available OPC UA client tool, as sketched below.
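A limit-monitoring event client of the kind mentioned above could be realized, for example, with the Python asyncua package. The sketch subscribes to a temperature node and reports limit violations; the endpoint URL, node identifier, and limit value are placeholders, not the project's actual configuration.

```python
import asyncio
from asyncua import Client

class LimitHandler:
    """Data-change handler; flags violations of a configured limit."""
    def __init__(self, limit):
        self.limit = limit
    def datachange_notification(self, node, val, data):
        if val > self.limit:
            print(f"limit violation on {node}: {val}")

async def main():
    # Endpoint and node id are placeholders, not the project's real ones.
    url = "opc.tcp://localhost:34841/"          # OPC UA data server
    async with Client(url=url) as client:
        temp_node = client.get_node("ns=2;s=Fiber[0]FBG_T[1]")
        sub = await client.create_subscription(500, LimitHandler(limit=100.0))
        await sub.subscribe_data_change(temp_node)
        await asyncio.sleep(60)                  # receive notifications

asyncio.run(main())
```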
The presented solution for a digitalized optical sensor network is based on the industrial IoT platform MindSphere Digital Service Platform (MDSP) from Siemens, nowadays known as Insights Hub. This platform allows the user to connect and operate industrial devices and infrastructure. The configuration and operation space (highlighted in green in Figure 3) is spanned by the MDSP and µ-services, as well as the CONFIG, DCS, and EXIT servers. Configuration management can be performed by LAN access to the underlying network. This network can be realized with discrete components and industrial computers, as was done during our field test installation described in Section 5. If needed, all related services could be hosted on virtual devices at the cloud level. The transfer of measurement and processed data to the cloud level is realized by the MDSP service with an integrated OPC UA client. This service subscribes to selected values directly from different µ-services or, preferably, from the EXIT server. The Siemens Insights Hub timeline service, evolving from the Siemens MindSphere services, finally gathers all transferred data and provides standard tools for displaying the obtained data in web browser-based dashboards. In addition, it raises alarms for limit violations and operates other Insights Hub data services, which are managed by one or more control systems with internet access to the cloud services.
4.2. OPC UA Microservices
In this section, the developed microservice and server properties are described. With these, we want to enable the integration of measurement data from fiber sensors as well as from electrical sensors and the safe storage of those data. At the same time, the possibility needs to be opened up for processing the data with suitable algorithms, offering them to succeeding services, and finally transferring the data to the cloud level. For this purpose, the unambiguous fiber identification marker serves as a link connecting every sensor element in an FBG sensor array to the individual calibration and manufacturing data previously determined. These data are saved in XML-type configuration sheets and calibration sheets, whose content is published by the CONFIG and DCS servers. The raw measurement data are published by the OPC UA data server along with auxiliary information for each value. This information is provided by several node properties in the OPC UA structure, as depicted in Figure 4.
These properties are exemplarily shown for the measurement value of the spectral full-width half-maximum (FWHM) bandwidth of an FBG reflection spectrum of temperature sensor element number 1 in attached fiber 0, resulting in the variable “FWHM_Fiber[0]FBG_T[1]”. Using these properties, each subscribing OPC UA µ-service or client can gain information on the physical SI unit of the measurement value. For this purpose, four different properties of SI-unit descriptors were used to broaden the applicability. The first SI-unit property, “D-SI”, is derived from the conventions of the joint research project SmartCom [16]. The other SI-unit properties, “Description”, “UNECE”, and “UnitID”, are based on the EUInformation data type structure of the OPC Foundation [17]. A further property of the measurement value is “Decimals”, which gives the number of valid decimals regarding the accuracy of the measured value. This is particularly beneficial when values undergo different processing steps by different daisy-chained services and shall maintain the given accuracy during the complete data processing. The property “Origin” makes the data processing traceable: every µ-service involved in the processing of a value appends its own name to the “Origin” value acquired from its predecessor. To illustrate the implemented strategy, we propose the “OPC UA data server 34841” as the data source, a first “µ-service A 34842” for the first processing step, and a second “µ-service B 34843” for the final processing step. The “Origin” values with their appended content can be seen in Figure 4. On the one hand, every µ-service has to identify the correct source of the data value needed for its own processing. On the other hand, a µ-service has no a priori knowledge of already invoked µ-services that preprocess that value. To solve this problem, each µ-service starts a value search directly at the OPC UA data server. The “WhereIs” property of that value contains the address where the actual value can be found. In our example, the “WhereIs” property of the “OPC UA data server 34841” points to port 34842 of µ-service A, while the “WhereIs” property of µ-service A points to port 34843 of µ-service B. When the address of a µ-service is equal to its “WhereIs” property (see µ-service B), no additional µ-services are involved, and a succeeding µ-service can subscribe to that value after registering its own server address in the “WhereIs” property of µ-service B. Using this technique, a succeeding µ-service will find the server address of the last µ-service incorporated in the data processing, starting from the OPC UA data server.
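The described search can be summarized in a few lines. The following pure-Python sketch mimics the “WhereIs” walk with a dictionary standing in for the OPC UA servers; the addresses follow the ports used in the example above, and the registration step is simplified.

```python
# Dictionary standing in for the OPC UA servers: each key is a server
# address, each value the "WhereIs" property of the measurement value.
where_is = {
    "opc.tcp://host:34841": "opc.tcp://host:34842",  # data server -> µ-service A
    "opc.tcp://host:34842": "opc.tcp://host:34843",  # µ-service A -> µ-service B
    "opc.tcp://host:34843": "opc.tcp://host:34843",  # µ-service B: last stage
}

def find_last_provider(data_server: str) -> str:
    """Follow "WhereIs" from the data server until a service points to itself."""
    addr = data_server
    while where_is[addr] != addr:
        addr = where_is[addr]
    return addr

def register_successor(successor: str) -> str:
    """Subscribe at the last provider and register the successor's address."""
    provider = find_last_provider("opc.tcp://host:34841")
    where_is[provider] = successor   # successor now publishes the value
    where_is[successor] = successor  # ...and is itself the last stage
    return provider

print(register_successor("opc.tcp://host:34844"))  # -> opc.tcp://host:34843
```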
4.3. MindSphere Digital Service Platform Onboarding
Once the data processing steps are completed within all contributing microservices, a measurement dataset is prepared for transfer to cloud storage. To facilitate this transfer, the cloud connection of the sensor platform is established through an automated, secure, two-step process, which is illustrated in Figure 5.
The first onboarding step is to establish a secure connection between the sensor platform and the cloud services. This is accomplished through a user token (2), which is issued by the cloud platform following a secure login process. This token is loaded into the MDSP Manager application and grants secure access to the cloud platform for a limited duration. Before measurement data can be transferred to the cloud storage, an appropriate structure must be created in the cloud. This involves the generation of assets and aspect variables that accurately represent the required data structure. This structure is provided by a thing description model (1), which is generated automatically from the selected OPC UA node structure of the incorporated OPC UA services. From this, the MDSP Manager creates the MDSP model (3) and automatically establishes the related data structure in the Insights Hub cloud from the OPC UA data hierarchy and node names.
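To illustrate step (3), the following sketch derives a cloud data model from a selected OPC UA node hierarchy. The node paths and the resulting asset/aspect layout are purely hypothetical; in the real system, this mapping is generated automatically by the MDSP Manager.

```python
# Illustrative derivation of a cloud data model from OPC UA node names.
opcua_nodes = [
    "DataServer/Fiber[0]/FBG_T[1]/Temperature",
    "DataServer/Fiber[0]/FBG_T[1]/FWHM",
    "DataServer/Fiber[0]/FBG_T[2]/Temperature",
]

def build_model(nodes):
    """Group OPC UA variables into one aspect per parent node."""
    model = {}  # aspect name -> list of variable names
    for path in nodes:
        *_, parent, variable = path.split("/")
        model.setdefault(parent, []).append(variable)
    return {"asset": "DigiMonet_biOFDR", "aspects": model}

print(build_model(opcua_nodes))
# {'asset': 'DigiMonet_biOFDR', 'aspects': {'FBG_T[1]': ['Temperature',
#  'FWHM'], 'FBG_T[2]': ['Temperature']}}
```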
Once the onboarding is completed, the actual measurement data can be transferred to the cloud in a second step, either immediately or even after a prolonged interruption. In this phase, the processed measurement data of the involved OPC UA services are transferred to the MDSP OPC UA client by the IE connector, a containerized application. During the data uplink (5) by the MDSP client, the correct assignment of measurement values to the related assets and aspects is achieved through the previously generated data point mapping (4). Consequently, the processed values can be transferred and stored as time series data within the Insights Hub data cloud through an automated process, making a time-consuming and complex manual onboarding procedure superfluous.
5. Field Test and Results
To test the digitalized optical sensor network, a field test was conducted on a mockup section of a power plant power generator at a test facility of Siemens Energy AG in Mülheim, Germany. The mockup is used to run test cycles on the stator bars with repeated current loads to examine the behavior of the insulation material. It consists of an original replica of a standalone section of the stator windings, which heats up during the test cycles due to the current load. A picture of the mockup with the two installed fiber sensor arrays on top of the stator cooling channels is shown in Figure 6.
5.1. Preparational Work
For the preparation of the field test, two sensor strips with integrated FBG fiber arrays were manufactured. To this end, a sturdy design based on a glass fiber-reinforced epoxy strip was chosen. To eliminate strain effects, the FBG fibers were fed into PEEK capillaries, as described in detail in the next section. In lab tests, the sensor arrays were calibrated, allowing for the creation of a digital calibration sheet (DCS) deposited at the DCS server. The configuration of the field test setup, including all auxiliary information, i.e., connected sensor type, sensor identification, location, and measurement hardware parameters, was compiled into the configuration sheet and stored at the CONFIG server. The cloud data structure was created according to the procedure described in the previous section, enabling the measurement data handover to the Insights Hub aspects and assets created during the onboarding step.
5.1.1. Packaging of the FBG Sensor Fibers and Application to Mockup
Two fs-direct-inscribed FBG sensor fibers from Engionic Femto Gratings were packaged for the field test. Each sensor array consists of a 2 m long standard single-mode fiber with 10 FBGs at 1530 nm and a spacing of 20 cm. The coating was stripped off along the whole fiber length, i.e., for nearly 2 m excluding the connector end, to eliminate strain influences of the coating. The stripped FBG sensor fiber was then fed into a PEEK tube with a 0.9 mm diameter. The fiber was fixed only at the connector side, lying loosely in the tube so that strain effects of the generator or the packaging could not affect the temperature measurements. The tube was glued onto a ground glass fiber-reinforced epoxy strip and protected with glass fiber textile tape. This packaging allowed for easy transportation of the sensor arrays to the field test and mounting onto the generator mockup. Figure 6 shows the packaged sensor strips mounted on the generator.
5.1.2. Hardware Setup and Service Configuration
During the field test, all measurement equipment had to be placed in a shielded container next to the generator mockup. The biOFDR setup was mounted in a 19″ rack, which also included a SIMATIC industrial computer of type IPC 427E (MICROBOX) serving as the OPC UA data server and a FritzBox router of type 6850 LTE for internet access. The Industrial Edge Management (IEM) and the configuration management were performed by additional laptops within the local LAN spanned by the FritzBox router and LAN switches. An additional SIMATIC industrial edge computer of type IPC 227E (IE NANOBOX) was included to host additional services. A configuration scheme of the used hardware and the connected OPC UA services is depicted in Figure 7.
The data transfer mechanism between the MATLAB-controlled biOFDR instrument and the OPC UA data server is based on ASCII file transfer. Control and calibration parameters are provided by the CONFIG and DCS servers via a Python script called from the MATLAB control software. In case of a network breakdown, or for pre-initialization in the startup phase, a copy of the relevant configuration and calibration information is stored in file format.
During the field test, two additional µ-services were active. The µ-service Gauss subscribed to the FBG reflection spectra saved in the original measurement files and published by the OPC UA data server. This µ-service conducted a standalone Gauss fit to determine the Bragg center wavelength and the FWHM in spectrometer pixel units. By additionally subscribing to the calibration information at the DCS server, the Gauss µ-service can transform the calculated peak center data into units of degrees Celsius and the FWHM data into units of picometers. Furthermore, access to the data processing of the Gauss µ-service allows for an automatically adapted background correction or changes in the fit parameters due to induced attenuation or polarization effects occurring during the measurements. The µ-service Statistics subscribed to the original temperature values at the OPC UA data server and delivered additional information regarding the minimum, mean, and maximum temperature values of each FBG sensor within the array for each sampling time step. For cloud transfer, the original temperature data, the temperature data recalculated by the Gauss µ-service, and the statistical temperature data of the Statistics µ-service were selected.
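A condensed sketch of the Gauss µ-service processing is given below: a Gaussian model is fitted to the reflection spectrum in pixel units, and the resulting peak center is converted to a temperature. The calibration constants are placeholders; in the actual system, they are subscribed from the DCS server of the identified fiber.

```python
import numpy as np
from scipy.optimize import curve_fit

def gauss(px, amp, center, sigma, offset):
    """Gaussian model of an FBG reflection peak in spectrometer pixel units."""
    return amp * np.exp(-0.5 * ((px - center) / sigma) ** 2) + offset

def fit_fbg_peak(pixels, spectrum):
    """Standalone Gauss fit for peak center and FWHM (both in pixels)."""
    p0 = [spectrum.max() - spectrum.min(),       # amplitude guess
          pixels[np.argmax(spectrum)], 3.0,      # center and width guesses
          spectrum.min()]                        # background guess
    (amp, center, sigma, offset), _ = curve_fit(gauss, pixels, spectrum, p0=p0)
    fwhm_px = 2 * np.sqrt(2 * np.log(2)) * abs(sigma)
    return center, fwhm_px

# Calibration constants (placeholders; retrieved from the DCS server):
PM_PER_PX = 4.0          # spectrometer dispersion [pm/pixel] (assumed)
PM_PER_K = 10.0          # typical FBG temperature sensitivity [pm/K]
T_REF, CENTER_REF = 22.0, 128.0  # reference temperature [°C] and pixel

def center_to_celsius(center_px):
    """Linear pixel-to-temperature conversion (assumed calibration model)."""
    return T_REF + (center_px - CENTER_REF) * PM_PER_PX / PM_PER_K
```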
The transfer of data to the cloud during the field test was organized in a parallel, twofold way: first to the Insights Hub cloud service and second to the ThingsBoard service. For data structure onboarding and data transfer to both cloud services, the same onboarding and transfer services were used as described in Section 4. The onboarding and transfer service to the Insights Hub cloud level was executed on a Siemens Industrial Edge IE NANOBOX PC in a containerized Docker environment, while the onboarding and transfer service to the ThingsBoard cloud level was executed on a separate standard Windows laptop.
5.2. Field Test Measurements and Results
The field test measurements were performed during an accelerated aging test of the generator mockup. Such a test is designed to reproduce the complete lifetime stress within a relatively short time scale of several weeks. For this purpose, the stator bars of the generator are exposed to driving currents simulating the expected operating conditions. The driving current cycles lead to periodic heating and cooling of the generator, during which the structure itself and the behavior of the insulation material are monitored. During the test, the applied current and voltage values, as well as the temperature distribution in the mockup, are tracked continuously to provide insights into the performance of the insulation structure.
In order to test our digitalized fiber sensor network, we installed the packaged FBG arrays on the mockup, as depicted in Figure 6. The sensor points are marked with black dots, and it can be seen from the picture that some FBGs were mounted directly on the stator stack while others were mounted above cooling air slots. A set of seven single heating cycles was monitored for 20 h. One biOFDR measurement took 150 s; thus, overall, more than 500 measurement files were created. With the start of the measurements in the MATLAB control program, the initialization routine was triggered: the identification marker of the fiber was read out automatically. From this information, the configuration parameters for the FBG sensors with the coding “D42200036112-BT-0” to “-BT-9” were acquired from the CONFIG server. The experimental results of the 10 measurement FBGs, together with the three identification FBGs acquired with the biOFDR, are depicted in Figure 8a. The heating cycles and the temperatures detected with each sensor element are shown in Figure 8b. The heating phase of every cycle took approx. two hours, while the cooling took one hour. The measured temperature levels of the individual sensors varied depending on the location and vicinity of the cooling slots of the stator stack and, to a certain extent, on the conditions in the test facility. The sensor element “…FBG_T_0” showed the lowest temperature over time, while one of the maximum temperature time series was measured for sensor element “…FBG_T_2” (see Figure 9).
After each measurement, the temperature values, spectral data, and location data of all FBG sensors were saved into a file on the MICROBOX network drive. The OPC UA data server has access to this network drive and publishes the values accordingly. The measured temperatures over time were transferred to the ThingsBoard cloud and are depicted in Figure 9.
The ThingsBoard transfer service subscribed to the temperature values from the OPC UA data server and applied slight smoothing before displaying the data in a browser-based time series plot. Because the OPC UA data server allows for parallel subscriptions of data nodes by different clients, the same measurement data were transferred in parallel to the Insights Hub cloud, where they can be displayed in a browser-based time series plot. To test the processing impact of the µ-service Gauss, the calibration parameters were modified by the Gauss µ-service in such a way that the stator bar temperature values between 17.5 °C and 104 °C were mapped to a temperature range between 22.0 °C and 22.1 °C. These modified values were subscribed from the Gauss µ-service and transferred into the Insights Hub cloud to be displayed in any browser. To demonstrate the impact of a statistical evaluation of the stator bar temperature values, the determined minimum (Min), mean (Mean), and maximum (Max) temperature values were subscribed from the Statistics µ-service. These values were transferred to the Insights Hub cloud and can be visualized on the related Insights Hub Monitor dashboard, as displayed in a simplified screenshot in Figure 10, to support intuitive understanding.
6. Discussion and Conclusions
In this paper, we proposed a possible solution for the digitalization of an optical fiber sensor system for the monitoring of energy infrastructure. In our case, the optical fiber sensor system is based on a bidirectional approach to incoherent optical frequency domain reflectometry for the interrogation of fiber Bragg gratings in a standard single-mode fiber. By combining a high number of closely spaced FBGs in a single fiber, quasi-distributed temperature sensing is achieved. Here, the packaged sensor fibers include 10 FBGs at the same Bragg wavelength; however, the interrogation of hundreds of gratings in one or more wavelength channels is feasible. The digital optical sensor network was realized with the OPC UA protocol, enabling both the remote control of the optical system and the monitoring of the energy infrastructure in the cloud. The utilized cloud structures are based exemplarily on the industrial IoT platform MindSphere Digital Service Platform from Siemens. With the optical identification markers integrated into each sensor fiber, the optical sensor network is able to identify the individual FBG arrays, and calibration and measurement parameters can be retrieved automatically from the server. The digitalization of the optical system by the described OPC UA server structure can be transferred to any sensor system and therefore represents a universal solution for the integration of standalone systems into an IoT infrastructure. The supervision of sensor data quality, as well as data fusion and modeling, is performed by the various implemented microservices, which open up the possibility of digital twin modeling of the sensors. The performance of the proposed digital sensor network was demonstrated successfully during a field test in an energy infrastructure-related environment. The described approach for packaging the FBGs did not influence the temperature distribution of the mockup, but care has to be taken depending on the application. In conclusion, we presented a comprehensive solution for a digital optical sensor network that enables intelligent facility monitoring from the optical system level to the cloud level, thus establishing a milestone for the future economic and scientific usability of optical sensor networks in digital control systems.