
WO2021032800A1 - Sensor data processing - Google Patents

Sensor data processing

Info

Publication number
WO2021032800A1
Authority
WO
WIPO (PCT)
Prior art keywords
data
sensor
sensor data
conduit
components
Application number
PCT/EP2020/073248
Other languages
French (fr)
Inventor
Neil Edwards
Original Assignee
Carfiguano Ltd
Application filed by Carfiguano Ltd
Publication of WO2021032800A1

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01M TESTING STATIC OR DYNAMIC BALANCE OF MACHINES OR STRUCTURES; TESTING OF STRUCTURES OR APPARATUS, NOT OTHERWISE PROVIDED FOR
    • G01M 3/00 Investigating fluid-tightness of structures
    • G01M 3/005 Investigating fluid-tightness of structures using pigs or moles
    • G01M 3/2815 Investigating fluid-tightness of structures by using fluid or vacuum, by measuring rate of loss or gain of fluid (e.g. by pressure-responsive devices, by flow detectors), for pipes, using pressure measurements
    • G01M 3/2823 Investigating fluid-tightness of structures by using fluid or vacuum, by measuring rate of loss or gain of fluid, for pipes, using pigs or moles travelling in the pipe
    • G01M 3/38 Investigating fluid-tightness of structures by using light
    • G01M 3/40 Investigating fluid-tightness of structures by using electric means, e.g. by observing electric discharges

Definitions

  • the device 600 may include processors 610, memory/storage 630, and I/O components 650, which may be configured to communicate with each other such as via a bus 602.
  • the memory/storage 630 may include a memory 632, such as a main memory, or other memory storage, and a storage unit 636, both accessible to the processors 610 such as via the bus 602.
  • the storage unit 636 and memory 632 store the instructions 616 embodying any one or more of the methodologies or functions described herein.
  • the instructions 616 may also reside, completely or partially, within the memory 632, within the storage unit 636, within at least one of the processors 610 (e.g., within the processor's cache memory), or any suitable combination thereof, during execution thereof by the device 600. Accordingly, the memory 632, the storage unit 636, and the memory of processors 610 are examples of machine-readable media.
  • the I/O components 650 may include a wide variety of components to receive input, provide output, produce output, transmit information, exchange information, capture measurements, and so on.
  • the specific I/O components 650 that are included in a particular device 600 will depend on the type of machine. It will be appreciated that the I/O components 650 may include many other components that are not shown in Figure 6.
  • the I/O components 650 are grouped according to functionality merely for simplifying the following discussion and the grouping is in no way limiting. In various example embodiments, the I/O components 650 may include output components and input components.
  • the I/O components 650 may include motion components 658, environment components 660, or position components 662 among a wide array of other components.
  • the motion components 658 may include an inertial measurement unit (IMU), acceleration sensor components (e.g., accelerometer), gravitation sensor components, rotation sensor components (e.g., gyroscope, gyrometer), and so forth.
  • the environment components 660 may include, for example, illumination sensor components (e.g., photometer), temperature sensor components (e.g., one or more thermometers that detect ambient temperature), humidity sensor components, pressure sensor components (e.g., barometer), acoustic sensor components (e.g., one or more microphones/hydrophones that detect background noise), proximity sensor components (e.g., infrared sensors that detect nearby objects), gas sensors (e.g., gas detection sensors to detect concentrations of hazardous gases for safety or to measure pollutants in the atmosphere), or other components that may provide indications, measurements, or signals corresponding to a surrounding physical environment.
  • the position components 662 may include location sensor components (e.g., a Global Positioning System (GPS) receiver component), altitude sensor components (e.g., altimeters or barometers that detect air pressure from which altitude may be derived), orientation sensor components (e.g., magnetometers), and the like.
  • the I/O components 650 may include communication components 664 operable to couple the device 600 to a network 680 or other devices 670 via coupling 682 and coupling 672 respectively.
  • the communication components 664 (referred to above as a “communication module”) may include at least one of a LoRa, WiFi or cellular communications module.
  • the communication components 664 may include a network interface component or other suitable device to interface with the network 680.
  • communication components 664 may include wired communication components, wireless communication components, cellular communication components, Near Field Communication (NFC) components, Bluetooth® components (e.g., Bluetooth® Low Energy), Wi-Fi® components, and other communication components to provide communication via other modalities.
  • the other devices 670 may be devices identical to the device 600 or any of a wide variety of peripheral devices (e.g., a peripheral device coupled via a Universal Serial Bus (USB)).
  • the communication components 664 may detect identifiers or include components operable to detect identifiers.
  • the communication components 664 may include Radio Frequency Identification (RFID) tag reader components, NFC smart tag detection components, optical reader components (e.g., an optical sensor to detect one dimensional bar codes such as Universal Product Code (UPC) bar code, multi-dimensional bar codes such as Quick Response (QR) code, and other optical codes), or acoustic detection components (e.g., microphones to identify tagged audio signals).
  • the instructions 616 can be transmitted or received over the network 680 using a transmission medium via a network interface device (e.g., a network interface component included in the communication components 664) and utilizing any one of a number of well- known transfer protocols (e.g., HTTP). Similarly, the instructions 616 can be transmitted or received using a transmission medium via the coupling 672 (e.g., a peer-to-peer coupling) to devices 670.
  • the term “transmission medium” shall be taken to include any intangible medium that is capable of storing, encoding, or carrying the instructions 616 for execution by the device 600, and includes digital or analog communications signals (i.e., carrier signals) or other intangible medium to facilitate communication of such software.
  • Figure 7 is a schematic illustration of certain components of a server in the sensor data processing system in accordance with an exemplary embodiment of the disclosed subject matter.
  • Figure 7 shows a diagrammatic representation of a server 700 in the sensor data processing system in the example form of a computer system, within which instructions 716 (e.g., software, a program, an application, an applet, an app, or other executable code) for causing the server 700 to perform any one or more of the methodologies discussed herein may be executed.
  • the instructions 716 may be used to implement modules or components described herein.
  • the instructions 716 transform the general, non-programmed server 700 into a particular machine 700 programmed to carry out the described and illustrated functions in the manner described.
  • the server 700 may be coupled (e.g., networked) to other machines, including one or more sensor devices, such as sensor device 600 in Figure 6. Further, while only a single server 700 is illustrated, the server may operate as one of a collection of similar servers that individually or jointly execute the instructions 716 to perform any one or more of the methodologies discussed herein.
  • the server 700 may include processors 710, memory/storage 730, and communication components 774, which may be configured to communicate with each other such as via a bus 702.
  • the memory/storage 730 may include a memory 732, such as a main memory, or other memory storage, and a storage unit 736, both accessible to the processors 710 such as via the bus 702.
  • the storage unit 736 and memory 732 store the instructions 716 embodying any one or more of the methodologies or functions described herein.
  • the instructions 716 may also reside, completely or partially, within the memory 732, within the storage unit 736, within at least one of the processors 710 (e.g., within the processor's cache memory), or any suitable combination thereof, during execution thereof by the server 700. Accordingly, the memory 732, the storage unit 736, and the memory of processors 710 are further examples of machine-readable media.
  • Communication may be implemented using a wide variety of technologies.
  • the communication components 774 are operable to couple the server 700 to a network 780 and/or other devices 770 (e.g., devices 600, 670 of Figure 6) via couplings 782 & 772 respectively.
  • the network 780 may correspond with the network 680 illustrated in Figure 6.
  • the communication components 774 may include a network interface component or other suitable device to interface with the network 780.
  • the instructions 716 can be transmitted or received over the network 780 using a transmission medium via a network interface device (e.g., a network interface component included in the communication components 774) and utilizing any one of a number of well- known transfer protocols (e.g., HTTP). Similarly, the instructions 716 can be transmitted or received using a transmission medium via the coupling 772 (e.g., a peer-to-peer coupling) to devices 770.
  • the fluid medium may be a hydrocarbon "product" such as oil, diesel, petroleum, etc., as well as a water-based medium.
  • the fluid conduit may be a waste pipe, fuel line etc. rather than water supply pipework.
  • Example 1 A method for characterising fluid flow in a conduit, the method comprising: receiving a plurality of data streams from a plurality of sensor devices, each data stream including sensor data gathered by a respective sensor device following a respective trajectory through the conduit, the sensor data including a time-separated series of acceleration and orientation measurements; processing the sensor data to determine a flow profile, the flow profile including at least one conduit feature; and outputting the flow profile characterising the fluid flow.
  • Example 2 The method of Example 1, wherein the series of acceleration and orientation measurements includes corresponding timestamps for each measurement; and wherein the processing of the sensor data comprises aligning timestamps for respective data streams.
  • Example 3 The method of any one of the preceding Examples, wherein the processing of the sensor data comprises processing the acceleration and orientation measurements to derive a set of inertial navigation data.
  • Example 4 The method of Example 3, wherein the sensor data further includes at least one absolute location measurement.
  • Example 5 The method of Example 4, wherein the or each absolute location measurement is determined by a GPS location device.
  • Example 6 The method of any one of Examples 3, 4 or 5, wherein the sensor data includes an initial location fix and a final location fix; and wherein the processing of the sensor data comprises applying a normalising correction to the inertial navigation data based on the initial location fix and the final location fix.
  • Example 7 The method of any one of Examples 3 to 6, wherein the respective inertial navigation data for each data stream are differentially analysed to identify conduit features.
  • Example 8 The method of any one of the preceding Examples, wherein the sensor data further includes video data.
  • Example 9 The method of any one of the preceding Examples, wherein the sensor data further includes audio data.
  • Example 10 The method of Example 9, further comprising: analysing the audio data to determine patterns; accessing a database of reference patterns corresponding to the presence of a conduit feature; and comparing determined patterns with reference patterns to detect the signature of the conduit feature.
  • Example 11 The method of any one of the preceding Examples, wherein the acceleration and orientation measurements are output by at least one of an inertial measurement unit, IMU, an accelerometer and a gyrometer.
  • Example 12 The method of any one of the preceding Examples, wherein the conduit feature is at least one of a leak and a pump at a location in the conduit.
  • Example 13 A sensor data processing system having a memory means (730), communication means (774) and at least one processor means (710), the at least one processor means being configured to execute a set of instructions that causes the at least one processor to carry out the method of any one of Examples 1 to 12.
  • Example 14 A non-transitory computer readable storage medium having stored thereon instructions for causing a machine, when executing the instructions, to perform operations comprising the method of any one of Examples 1 to 12.

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Arrangements For Transmission Of Measured Signals (AREA)

Abstract

There is provided a method and system for characterising fluid flow in a conduit. Using sensor devices that move freely with the flow of fluid through a conduit such as a pipe, data from more than one sensor device is collected and processed to derive a profile of the conduit features in the pipe.

Description

SENSOR DATA PROCESSING
[0001] The present disclosure relates to a system for processing sensor data. In particular, the disclosure relates to a system that may be used in processing the data generated by a plurality of sensor devices arranged to move freely through a fluid medium, such as drinking water in utilities pipework.
[0002] Utility companies transporting fuel, water and other fluids through legacy pipework and other conduits often experience unforeseen obstacles to efficient delivery of their services. Leaks in drinking water supplies, for example, cause loss of water pressure and unacceptable wastage.
[0003] Leak detection can be achieved by laborious and often inconclusive techniques such as pressure testing (isolating sections of pipework at a time to introduce a neutral, compressible fluid, such as air, and determining whether the compressible fluid maintains an initial pressure).
[0004] In larger diameter (>10cm diameter) pipework, it is possible to introduce probes bearing cameras and/or acoustic sensors at preinstalled access ports along the extent of the pipework in order to inspect the accessible portions of the pipework for anomalous features. It is even known to provide a sensor device arranged to move freely with the flow of the surrounding fluid medium so that probe sensors may be carried by the flow of the fluid to otherwise inaccessible points along the extent of such pipework. In many cases, the introduction of probes, whether tethered or free-moving, requires an interruption to the services delivered using the pipework under investigation.
[0005] Existing free-moving sensor devices, however, provide a data stream of images, audio data and other sensor measurements that is peculiar to the specific path taken by the sensor device and can lead to idiosyncratic data sets that are difficult to interpret.
[0006] The difficulties in obtaining reliable sample data sets lead to latency in extracting and processing that data and then outputting results in a form that the responsible engineer can interpret.
[0007] Furthermore, the deployment and retrieval of such sensor devices can be time consuming and error prone, requiring specialist training for users. This cumulatively means that many pipework anomalies are deemed too trivial to warrant the resources necessary for an effective investigation.
[0008] It is an object of the invention to at least ameliorate one or more of the above or other shortcomings of the prior art and/or to provide a useful alternative.
SUMMARY OF THE INVENTION
[0009] Various aspects and embodiments of the present disclosure are defined by the appended claims. However, it will be appreciated that features and aspects of the present disclosure may be combined with other different aspects of the disclosure as appropriate, and not just in the specific illustrative combinations described herein.
BRIEF DESCRIPTION OF DRAWINGS
[0010] The present disclosed subject matter will be understood and appreciated more fully from the following detailed description taken in conjunction with the drawings in which corresponding or like numerals or characters indicate corresponding or like components. Unless indicated otherwise, the drawings provide exemplary embodiments or aspects of the disclosure and do not limit the scope of the disclosure. In the drawings:
[0011] Figures 1A to 1F illustrate the deployment of a collection of sensor devices in the case of leak detection;
[0012] Figure 2 is a plot of speed against time that illustrates a case where sensor devices are deployed and encounter a leak;
[0013] Figure 3 is an example of a visualisation of a length of pipework in accordance with an exemplary embodiment of the disclosed subject matter;
[0014] Figure 4 is a schematic diagram summarising the main functional blocks in the operation of a sensor data processing system in accordance with some exemplary embodiments of the disclosed subject matter;
[0015] Figure 5 is a schematic diagram illustrating the main steps in the analysis and processing of sensor data obtained by sensor devices, in accordance with some exemplary embodiments of the disclosed subject matter;
[0016] Figure 6 is a schematic illustration of certain components of a sensor device suitable for generating a data stream such as that processed by the sensor data processing system of the disclosed subject matter; and
[0017] Figure 7 is a schematic illustration of certain components of a server in the sensor data processing system in accordance with an exemplary embodiment of the disclosed subject matter.
DESCRIPTION OF EXAMPLE EMBODIMENTS
[0018] The detailed description set forth below in connection with the appended drawings is intended as a description of various exemplary embodiments of the disclosure and is not intended to represent the only forms in which the present disclosure may be practised. It is to be understood that the same or equivalent functions may be accomplished by different embodiments that are intended to be encompassed within the scope of the invention.
[0019] One technical solution is to expand the number of sensor devices deployed during any given investigation. While the increased number of sensor devices might appear to increase the demand for resources unnecessarily, the sensor devices themselves may be constructed more simply since they are to be used in numbers. The simpler individual construction facilitates a smaller form factor (for instance, <50mm diameter) allowing the sensors to be deployed efficiently in smaller-bore pipework and without significant disruption to service delivery.
[0020] Individual sensor devices may include a plurality of component sensors. The sensor devices deployed in a single investigation may include identical arrangements of component sensors. Alternatively, certain sensor devices among the plurality of sensor devices deployed in a single investigation may differ from the other sensor devices, with different sensor devices including differing sets of component sensors. In certain embodiments, the component sensors include at least one of an inertial measurement unit (IMU), a magnetometer, a gyroscope, accelerometer, acoustic sensor and a camera module. The camera module may be arranged to provide a sequence of 360 degree images and/or 3D images.
[0021] The sensor device may further include a controller module for controlling the operation of the component sensors: the controller module may include one or more processors. In addition, the sensor devices may be provided with a communication module for receiving and/or transmitting signals and a memory means configured to store instructions for the controller module and measurement data generated by the component sensors.
[0022] The communication module may include at least one of a LoRa, WiFi or cellular communications module. In certain embodiments, the sensor device may include a location module, the location module providing an absolute location measurement: for example, the location module may be a GPS location module. In certain embodiments, absolute location measurements are provided by an external location device (such as a smartphone or a dedicated GPS tracker) and the absolute location measurements are received by the sensor device via the communication module, when the external location device is in communicative connection with the communication module.
[0023] Sensor devices are conveniently provided with a power supply, such as a rechargeable battery, so that the components of the sensor device are each supplied with electrical power as required. Such rechargeable batteries may be arranged to be recharged wirelessly, through induction for example.
[0024] To protect the sensor components from the surrounding fluid and to reduce the influence of the sensor device upon the flow of the fluid, the sensor device is encased in a (spherical) sealed casing.
[0025] In certain embodiments, the sensor device is arranged to be neutrally buoyant at expected flow pressures: as a result, the sensor device is suspended within the flowing fluid and rises when the fluid becomes still (in water, the sensor device will remain immersed while moving with the flowing water but rise to the air/water interface in still water).
[0026] In certain use cases, each of the sensor devices in the collection is captured (e.g. netted) at the end point of the trajectory and the data generated by the component sensors (and stored in the memory means) is transferred, via the communication module, to a receiving device. The receiving device may be a dedicated reader or an external device having a transceiver suitable for establishing a communicative connection with the sensor device via the communication module, for example a smartphone having a Bluetooth, BLE, ZigBee, LoRa, NFC and/or WiFi communication function.
[0027] The data from each sensor device is received as a respective data stream. In certain embodiments, each data stream will include measurement data that act as markers of anomalies and, as each device passes each anomaly, the specific location and properties of the anomaly can be identified by comparing the changes in movement of each sensor device relative to the others and to a known initial starting point, rather than their absolute movement, and by using their combined processed navigation data.
[0028] By analysing and comparing multiple streams of data produced by a collection of sensor devices following a similar trajectory along a pipe, highly accurate information about the location of anomalies in the flow can be obtained. Examples of anomalies include leaks, pumps, and other causes of boundaries between laminar and chaotic flow.
[0029] By using and comparing multiple data streams, error levels are greatly reduced, giving much higher location accuracy than can be achieved by a single sensor device.
[0030] The data analysis is based on the fact that all data streams start in the same known location and all finish in a common, alternate, known location. The locations may be set using high resolution GPS values.
[0031] Figures 1A-1F illustrate the deployment of a collection of sensor devices 104, 106, 108 in the case of leak detection.
[0032] In Figure 1A, a collector 122 (e.g. a net) and a dispenser (for inserting the collection of sensor devices) are installed at access ports 110, 120 (e.g. hydrant points) spaced apart along the extent of pipework to be investigated.
[0033] In Figure 1B, while the sensor devices are in the dispenser, communication is established between an external processing device 130 and the sensor devices 104, 106, 108 via the communication module of the respective devices (e.g. using WiFi). In the illustrated case, the external processing device 130 is a smartphone with an integral GPS location module. The smartphone 130 executes an application (i.e. App) that accesses the GPS location generated by the GPS module and a system clock to set the initial GPS location value in each of the sensor devices and to ensure that the controller modules of each sensor device are synchronised.
[0034] In Figure 1C, the valve of the first hydrant point 110, at which the dispenser is installed, is opened, releasing the sensor devices 104, 106, 108 one at a time, at (time) intervals, into the fluid flow. From this time onwards, the plurality of sensor components of each sensor device generate sensor measurement data at successive times as the sensor device progresses along a trajectory along the pipework being investigated, carried by the fluid flow. Each sensor measurement is stored together with a timestamp identifying the measurement time.
[0035] In Figure 1D, the sensor devices encounter a leak 150. Leaks create turbulence, speed/pressure loss and noise, which are experienced (and measured) by the respective sensor devices in turn as they are carried by the fluid flow.
[0036] In Figure 1E, the collector 122 retrieves the sensor devices 104, 106, 108 at the second hydrant point 120. In the illustrated case, the sensor devices are arranged to begin signalling upon sensing that they are stationary in the collector trap.
[0037] In Figure 1F, the sensor devices 104, 106, 108 (re)connect with an external processing device 130' (which may be the same external processing device used to initialise the sensor devices in Figure 1B) in a second communication connection, via WiFi for example. The second communication connection allows the respective sensor devices to upload data files including the measured sensor data to the external processing device for processing. The external processing device again accesses the GPS location generated by a location module (e.g. the GPS module of the external processing device) and a system clock to fix the final GPS location value for the end point of the trajectory taken by the sensor devices.
[0038] Figure 2 is a plot of speed against time and illustrates a simple case in which a collection of three sensor devices is deployed and encounters a leak (as in Figure 1D). As the leading sensor device passes a leak in the pipe it will slow down, but following devices will not change their velocity until they each pass the leak in turn. In Figure 2, the velocity for the lead sensor device is plotted as a first curve, 204; for the first following sensor device, as a second curve, 206; and, for the second following sensor device, as a third curve, 208. This allows the normal variances in pipe speed to be eliminated from the data, allowing corrected navigational data to pinpoint the location of the leak in 3D space with greater accuracy.
[0039] While the external processing device may itself perform analysis on the respective data streams, certain embodiments make use of the networking capabilities in processing devices such as smartphones, tablets and laptop devices to upload the data streams to a distributed processing network (i.e. a cloud computing platform). Consistent and scalable levels of processing may be applied to data streams from collections of different numbers and types of sensor devices by the servers of the distributed processing network.
[0040] Identification of the size and location of leaks detected in a normally pressurised system using multiple recorded data streams and differential analysis gives unprecedented sensitivity.
[0041] In certain embodiments, the output from the sensor data processing system may be visualised through a visualisation tool. Depending upon requirements and available data, the visualisation tool may present a view of the pipework under investigation that highlights detected anomalies in their context. Examples of display representations include displays in the forms of tables, vector graphics, and point clouds.
[0042] Figure 3 is an exemplary visualisation from the visualisation tool. Anomalies 302 and 304 are identified and their spatial context is apparent from the neighbouring extent of the pipework.
[0043] Further visualisations may require the correlation of the absolute location values with map data provided by surface mapping tools. For example, the visualisation tool may be provided with an interface that facilitates the overlay of the generated model of the investigated pipework in a familiar mapping application such as Google Maps (e.g. using a marked up KML file). Through the use of such "mash-up" interfaces, engineers may readily determine specific geographic locations at which ground works should take place in order to resolve identified leak conditions.
[0044] In certain embodiments, the sensor data processing system may generate a user presentation of movement analysis with the audio (and visual) streams. Multiple still images taken at multiple angles may be combined and image processed to produce a single, composite 'internal view' of the pipe. Such internal view visualisations may be conveniently presented through virtual reality display means such as the Hololens [RTM] system of Microsoft [RTM].
[0045] The operation of the sensor data processing system is illustrated in Figures 4 and 5.
[0046] Figure 4 summarises the main functional blocks in the operation of the sensor data processing system. At step 402, the sensor data is acquired as data streams from respective sensor devices in a collection of such devices. The sensor data is then processed to correlate the data in each of the data streams with relative locations along the trajectory of the sensor devices through a fluid conduit, step 404. The correlated sensor data is used to generate a flow profile for a series of spatial locations along the trajectory, step 406. This output flow profile is then, optionally, used to generate a visualisation for display, step 408.
[0047] The correlation of data at step 404 is discussed in more detail below.
[0048] In certain embodiments, the sensor data is received in source files of different file types. These source files may be grouped into a file folder for processing.
[0049] A first file type (File Type A) includes the following fields: GPS start location; for each measurement event, Time Stamp (e.g. from a millisecond-counting clock), Location co-ordinates, Accelerometer values x, y, z, Gyro values x, y, z; and GPS end location.
[0050] A second, optional, file type (File Type B) includes audio data (e.g. as an MP3 audio recording). A third, optional, file type (File Type C) includes video data (e.g. as a JPEG image sequence).
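The fields of File Type A map naturally onto a simple record structure. The following Python sketch is purely illustrative: the class and field names are hypothetical and are not defined by the source files themselves.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class MeasurementEvent:
    """One measurement event in a File Type A stream (hypothetical field names)."""
    timestamp_ms: int                      # millisecond-counting clock value
    location: Tuple[float, float, float]   # location co-ordinates (x, y, z)
    accel: Tuple[float, float, float]      # accelerometer values (x, y, z)
    gyro: Tuple[float, float, float]       # gyro values (x, y, z)

@dataclass
class FileTypeA:
    """A complete File Type A stream: start fix, timestamped events, end fix."""
    gps_start: Tuple[float, float]         # GPS start location (lat, lon)
    events: List[MeasurementEvent]         # measurement events in time order
    gps_end: Tuple[float, float]           # GPS end location (lat, lon)
```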
[0051] The main steps in the analysis and processing of the sensor data in files of the first type are illustrated in Figure 5.
[0052] At step 502, the timestamp for each file is aligned to a master clock time and a correction offset is applied to each file to normalise movement. Thus, for example, if the data of Stream A starts at time t=0.000s and the data in Stream B starts at t=20.000s, say, Stream A and Stream B may be aligned by subtracting 20.000s from the time stamp values in Stream B.
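As a minimal sketch of the alignment in step 502, assuming the offset is simply the difference between each stream's first timestamp and the master clock start (as in the Stream A/Stream B example above):

```python
def align_to_master(streams, master_start=0.0):
    """Rebase each stream's timestamps so that all streams start at the master clock time.
    `streams` maps a stream name to a list of (timestamp_s, sample) pairs."""
    aligned = {}
    for name, samples in streams.items():
        offset = samples[0][0] - master_start       # e.g. 20.000 s for Stream B
        aligned[name] = [(t - offset, s) for t, s in samples]
    return aligned

# Stream A starts at t = 0.000 s, Stream B at t = 20.000 s.
streams = {
    "A": [(0.000, 1.20), (0.010, 1.30)],
    "B": [(20.000, 1.10), (20.010, 1.20)],
}
print(align_to_master(streams)["B"][0])  # (0.0, 1.1)
```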
[0053] At step 504, conventional mathematical models are applied to derive xyz movement from xyz acceleration and gyro information. This is done by double integration (i.e. integrating acceleration values to derive velocity values and then integrating those velocity values, again, to derive distance values).
[0054] Techniques such as Kalman filtering can also be applied at this stage. Kalman filtering applies forward analysis techniques to reduce error.
[0055] As it is assumed the data streams are all measured by sensor devices in a pipe, the possible movements between measurement events can be assumed to be constrained, so that unexpected or outlier movements can be eliminated. This is known as assisted inertial navigation.
[0056] The derivation of displacement information from the x-axis component of xyz acceleration and gyro information may be seen in the example of Table 1 below.
[0057] Table 1:

ms      rx    vx      dx
32372    2    0       0
32382    2    0.02    0.0001
32392    5    0.055   0.000475
32402    2    0.09    0.0012
32412    0    0.1     0.00215
32422    3    0.115   0.003225
32432    6    0.16    0.0046
32442    7    0.225   0.006525
32452    3    0.275   0.009025
32462    2    0.3     0.0119
32472   -1    0.305   0.014925
32482   -2    0.29    0.0179

[0058] In Table 1, the x-axis data rx is integrated to give vx, then integrated again to give dx.
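The vx and dx columns of Table 1 are consistent with trapezoidal integration of rx over the 10 ms sample interval. The sketch below illustrates step 504 under that assumption; the integration rule itself is not specified in the description.

```python
def double_integrate(times_ms, accel):
    """Integrate acceleration twice (trapezoidal rule) to obtain velocity and distance."""
    v, d = [0.0], [0.0]
    for i in range(1, len(accel)):
        dt = (times_ms[i] - times_ms[i - 1]) / 1000.0            # 10 ms -> 0.01 s
        v.append(v[-1] + 0.5 * (accel[i - 1] + accel[i]) * dt)   # rx -> vx
        d.append(d[-1] + 0.5 * (v[-2] + v[-1]) * dt)             # vx -> dx
    return v, d

ms = [32372, 32382, 32392, 32402, 32412, 32422]
rx = [2, 2, 5, 2, 0, 3]
vx, dx = double_integrate(ms, rx)
print(vx)  # [0.0, 0.02, 0.055, 0.09, 0.1, 0.115]  (matches Table 1, to rounding)
print(dx)  # [0.0, 0.0001, 0.000475, 0.0012, 0.00215, 0.003225]
```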
[0059] At step 506, each individual inertial navigation course is normalised using the known values of the (GPS) start and end points in 3D space. The normalisation applies a correction to the inertial navigation data (i.e. back propagates a 3D error correction). For instance, for a known start point {x, y, z} = {0, 0, 0} and end point at {1, 1, 1}, processing of Stream A at step 504 may generate a location of {1.1, 0.97, 1.22} over, for the sake of illustration, 1000 data points. The value at each data point in the x axis, say, may be corrected (i.e. normalised) by calculating:
(measured end x - known end x) / (number of data points) x (the index of the specific data point).
[0060] So that point 592 of the 1000 data points would have an 'x' value correction derived by the calculation: ((1.1 - 1)/1000) x 592.
[0061] The normalisation process compensates for any inherent error level in the IMUs used to obtain acceleration and orientation readings and is applied for each axis and for each stream.
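A minimal sketch of the back-propagated correction of paragraphs [0059] to [0061], assuming the end-point error is distributed linearly over the data points and subtracted, as in the worked example above:

```python
def normalise_axis(values, known_end):
    """Back-propagate the end-point error linearly along one axis.
    `values` are the raw inertially derived positions for that axis."""
    n = len(values)
    error = values[-1] - known_end                 # e.g. 1.1 - 1.0 = 0.1 for Stream A, x axis
    return [v - (error / n) * i for i, v in enumerate(values, start=1)]

# Worked example: point 592 of 1000 receives an x correction of ((1.1 - 1) / 1000) * 592.
print(round(((1.1 - 1) / 1000) * 592, 4))          # 0.0592
print(normalise_axis([0.5, 1.1], known_end=1.0))   # ~[0.45, 1.0]: end point pinned to the known fix
```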
[0062] Having normalised data sets, it is now possible to differentially analyse the streams, step 508, as illustrated in Figure 2. Here, we can see that as the speed of flow in the pipe varies, all three data streams change speed simultaneously; however, when a feature is passed, the velocity changes for each sensor device in sequence.
[0063] From observation of the (normalised) locations of such features one may infer where a feature occurs. Characteristic differences between the streams may also be inferred, which also helps to synchronize the timing differences (1 sec offset between successive data streams in the example). As such, the exact time in the flow at which the feature is encountered (averaged over three samples to improve accuracy) may be determined in a 3D flow profile model and, by overlaying the 3D flow profile model, we can determine the leak location within the pipe.
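One way to realise the differential analysis of step 508 is to detect, for each normalised stream, the point at which its velocity drops while the others have not yet changed, and then to average the corresponding locations. The sketch below is illustrative only; the drop threshold and detection rule are assumptions, not part of the described method.

```python
def locate_feature(streams, drop_fraction=0.2):
    """Return the average along-pipe distance at which each stream's velocity first
    drops by more than `drop_fraction` of its previous value.
    `streams` is a list of lists of (distance_m, velocity_mps) samples."""
    positions = []
    for samples in streams:
        for (d0, v0), (d1, v1) in zip(samples, samples[1:]):
            if v0 > 0 and (v0 - v1) / v0 > drop_fraction:
                positions.append(d1)
                break
    return sum(positions) / len(positions) if positions else None

# Three sensor devices record a velocity drop as they pass the same leak in turn.
lead      = [(10.0, 1.00), (12.0, 1.00), (14.0, 0.70), (16.0, 0.70)]
follower1 = [(10.0, 1.00), (12.0, 1.00), (14.1, 0.71), (16.0, 0.70)]
follower2 = [(10.0, 1.00), (12.0, 1.00), (13.9, 0.69), (16.0, 0.70)]
print(locate_feature([lead, follower1, follower2]))  # ~14.0 m along the pipe
```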
[0064] Knowing the exact loss in velocity at the leak point, it is also possible to calculate the leak size. One suitable calculation would be as follows:
[0065] (Vdiff x Pipe Diameter) / (V1 x Pipe Diameter) = loss of water in litres/sec
[0066] where Vdiff is the velocity at or after the feature and V1 is the velocity at a time point before the feature is encountered.
[0067] Additionally, a leak gives rise to turbulence. Water flow rates are typically set so that the fluid flow is kept stable in laminar flow (i.e. smooth) rather than turbulent flow (i.e. chaotic), which is inefficient. By examining the turbulence at feature points, the system may be arranged to alert users to the presence of features.
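As an illustration of using turbulence as a feature marker, a simple rolling-spread heuristic over the acceleration magnitudes can flag candidate feature points. The window size and threshold below are arbitrary choices for the sketch, not values taken from the description.

```python
import statistics

def turbulence_flags(accel_magnitudes, window=5, threshold=1.5):
    """Flag sample indices where the rolling spread of acceleration magnitude exceeds
    `threshold` times the whole-run baseline spread (an illustrative heuristic only)."""
    baseline = statistics.pstdev(accel_magnitudes) or 1e-9
    flags = []
    for i in range(window, len(accel_magnitudes) + 1):
        if statistics.pstdev(accel_magnitudes[i - window:i]) > threshold * baseline:
            flags.append(i - 1)                     # index of the window's last sample
    return flags

# Smooth flow, a turbulent burst around samples 10-14, then smooth flow again.
readings = [1.0, 1.1, 0.9, 1.0, 1.05, 1.0, 0.95, 1.0, 1.1, 1.0,
            3.0, 0.2, 2.8, 0.1, 2.9, 1.0, 1.0, 1.05, 0.95, 1.0]
print(turbulence_flags(readings))  # [13, 14, 15]: windows ending in or just after the burst
```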
[0068] Similarly, pressure variance is another marker of leakage. Where the sensor components include pressure sensors, this additional parameter may also be used.
[0069] Analysis and processing of the sensor data in files of the second type (i.e. audio recordings) is now discussed. Using the analysis for files of the first type (as illustrated in Figures 4 and 5), it is possible to navigate to a time point in the audio recording where audio signals that indicate a leak or pipe tap might be expected.
[0070] In certain embodiments, the system may process sensor data in files of the second type using one or more of the following techniques:
a. making the audio signal available to the operator to listen to;
b. accessing a database of reference feature profiles for different characteristic features (i.e. signatures of leaks of different types and sizes) and applying machine learning techniques to pre-identify the size of the leak (a sketch of this matching step is given after this list). As the system is applied in a greater number of live sites, the data received from those live sites will improve the quality of the reference feature profiles and thus the ability to predict leak size will improve;
c. matching the speed variance data from the analysis for files of the first type with the audio-stream machine learning of b. to give a highly accurate predicted leak size along with a specific location.
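Purely by way of illustration, the profile-matching step of (b) might be sketched as below, where a recorded audio segment is compared against stored reference spectral profiles; the use of a spectral cosine-similarity match, and all names, are assumptions for this sketch (a deployed system could instead use a trained classifier).

```python
import numpy as np

def match_reference_profile(audio_segment, reference_profiles, n_bins=128):
    """Return the reference label whose spectral profile best matches the segment.

    reference_profiles: dict mapping a label (e.g. a leak type and size) to a
    stored spectral profile of length n_bins. Illustrative only.
    """
    spectrum = np.abs(np.fft.rfft(audio_segment, n=2 * n_bins))[:n_bins]
    spectrum /= (np.linalg.norm(spectrum) + 1e-12)          # normalise for comparison
    best_label, best_score = None, -1.0
    for label, profile in reference_profiles.items():
        profile = np.asarray(profile, dtype=float)
        profile = profile / (np.linalg.norm(profile) + 1e-12)
        score = float(np.dot(spectrum, profile))             # cosine similarity
        if score > best_score:
            best_label, best_score = label, score
    return best_label, best_score
```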
[0071] Analysis and processing of the sensor data in files of the third type (i.e. image data) is now discussed. Where files of the third type are generated by the sensor devices, a further information stream is obtained by capturing a sequence of still images using a 360 degree camera arrangement and a flash to improve resolution. By electronically stitching the captured images together, a composite view of the inside of the pipe can be produced. Using the analysis for files of the first type (as illustrated in Figures 4 and 5), it is possible for an operator to ‘see’ the inside of the pipe and inspect those portions of the pipe where features are indicated.
[0072] All of the above data streams may optionally be combined to produce a single output file (in XML format, for example; an illustrative layout is sketched after this list) containing:
a. GPS co-ordinates to 6 decimal places (the equivalent of ~40cm accuracy at UK latitudes);
b. velocity of flow at each point;
c. leak size at each point;
d. the audio stream at the leak locations; and
e. the image set at the leak locations.
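Purely by way of illustration, one possible layout for such an XML output file is sketched below; the element and attribute names, and the structure chosen, are assumptions for this example and do not form part of the disclosure.

```python
import xml.etree.ElementTree as ET

def write_output_file(points, path):
    """Write an illustrative combined survey file; element names are assumed.

    points: iterable of dicts with keys 'lat', 'lon' and 'velocity', plus
    optional 'leak_size', 'audio_file' and 'image_files' at leak locations.
    """
    root = ET.Element("survey")
    for p in points:
        pt = ET.SubElement(root, "point")
        ET.SubElement(pt, "gps", lat=f"{p['lat']:.6f}", lon=f"{p['lon']:.6f}")  # 6 decimal places
        ET.SubElement(pt, "velocity").text = str(p["velocity"])
        if "leak_size" in p:
            ET.SubElement(pt, "leak_size").text = str(p["leak_size"])
            if "audio_file" in p:
                ET.SubElement(pt, "audio").text = p["audio_file"]
            for image in p.get("image_files", []):
                ET.SubElement(pt, "image").text = image
    ET.ElementTree(root).write(path, encoding="utf-8", xml_declaration=True)
```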
[0073] The file can then be processed by an image viewer and displayed to the operator, giving a view similar to Figure 3, for example.
[0074] Analysis of the collected data may further include the application of a predictive movement algorithm to indicate when a sensor device has been deflected from its expected course by turbulence or an obstruction.
[0075] Figure 6 is a schematic illustration of certain components of a sensor device suitable for generating a data stream such as that processed by the sensor data processing system of the disclosed subject matter. Specifically, Figure 6 shows a diagrammatic representation of the sensor device 600 in the example form of a computer system, within which instructions 616 (e.g., software, a program, an application, an applet, an app, or other executable code) for causing the device 600 to perform any one or more of the methodologies discussed herein may be executed. As such, the instructions 616 may be used to implement modules or components described herein. The instructions 616 transform the general, non-programmed device 600 into a particular machine 600 programmed to carry out the described and illustrated functions in the manner described. In alternative embodiments, the device 600 operates as a standalone device or may be coupled (e.g., networked) to other machines. Further, while only a single device 600 is illustrated, the device may operate as one of a collection of similar devices that individually or jointly execute the instructions 616 to perform any one or more of the methodologies discussed herein.
[0076] The device 600 may include processors 610, memory/storage 630, and I/O components 650, which may be configured to communicate with each other such as via a bus 602. The memory/storage 630 may include a memory 632, such as a main memory, or other memory storage, and a storage unit 636, both accessible to the processors 610 such as via the bus 602. The storage unit 636 and memory 632 store the instructions 616 embodying any one or more of the methodologies or functions described herein. The instructions 616 may also reside, completely or partially, within the memory 632, within the storage unit 636, within at least one of the processors 610 (e.g., within the processor’s cache memory), or any suitable combination thereof, during execution thereof by the device 600. Accordingly, the memory 632, the storage unit 636, and the memory of processors 610 are examples of machine-readable media.
[0077] The I/O components 650 may include a wide variety of components to receive input, provide output, produce output, transmit information, exchange information, capture measurements, and so on. The specific I/O components 650 that are included in a particular device 600 will depend on the type of machine. It will be appreciated that the I/O components 650 may include many other components that are not shown in Figure 6. The I/O components 650 are grouped according to functionality merely for simplifying the following discussion and the grouping is in no way limiting. In various example embodiments, the I/O components 650 may include output components and input components.
[0078] In further example embodiments, the I/O components 650 may include motion components 658, environment components 660, or position components 662 among a wide array of other components. For example, the motion components 658 may include an inertial measurement unit (IMU), acceleration sensor components (e.g., accelerometer), gravitation sensor components, rotation sensor components (e.g., gyroscope, gyrometer), and so forth. The environment components 660 may include, for example, illumination sensor components (e.g., photometer), temperature sensor components (e.g., one or more thermometers that detect ambient temperature), humidity sensor components, pressure sensor components (e.g., barometer), acoustic sensor components (e.g., one or more microphones/hydrophones that detect background noise), proximity sensor components (e.g., infrared sensors that detect nearby objects), gas sensors (e.g., gas detection sensors to detect concentrations of hazardous gases for safety or to measure pollutants in the atmosphere), or other components that may provide indications, measurements, or signals corresponding to a surrounding physical environment. The position components 662 may include location sensor components (e.g., a Global Positioning System (GPS) receiver component), altitude sensor components (e.g., altimeters or barometers that detect air pressure from which altitude may be derived), orientation sensor components (e.g., magnetometers), and the like.
[0079] Communication may be implemented using a wide variety of technologies. The I/O components 650 may include communication components 664 operable to couple the device 600 to a network 680 or other devices 670 via coupling 682 and coupling 672 respectively. The communication components 664 (referred to above as a “communication module”) may include at least one of a LoRa, WiFi or cellular communications module. For example, the communication components 664 may include a network interface component or other suitable device to interface with the network 680. In further examples, communication components 664 may include wired communication components, wireless communication components, cellular communication components, Near Field Communication (NFC) components, Bluetooth® components (e.g., Bluetooth® Low Energy), Wi-Fi® components, and other communication components to provide communication via other modalities. The other devices 670 may be devices identical to the device 600 or any of a wide variety of peripheral devices (e.g., a peripheral device coupled via a Universal Serial Bus (USB)).
[0080] Moreover, the communication components 664 may detect identifiers or include components operable to detect identifiers. For example, the communication components 664 may include Radio Frequency Identification (RFID) tag reader components, NFC smart tag detection components, optical reader components (e.g., an optical sensor to detect one dimensional bar codes such as Universal Product Code (UPC) bar code, multi-dimensional bar codes such as Quick Response (QR) code, and other optical codes), or acoustic detection components (e.g., microphones to identify tagged audio signals). In addition, a variety of information may be derived via the communication components 664, such as, location via Internet Protocol (IP) geo-location, location via Wi-Fi® signal triangulation, location via detecting a NFC beacon signal that may indicate a particular location, and so forth.
[0081] The instructions 616 can be transmitted or received over the network 680 using a transmission medium via a network interface device (e.g., a network interface component included in the communication components 664) and utilizing any one of a number of well-known transfer protocols (e.g., HTTP). Similarly, the instructions 616 can be transmitted or received using a transmission medium via the coupling 672 (e.g., a peer-to-peer coupling) to devices 670. [0082] The term “transmission medium” shall be taken to include any intangible medium that is capable of storing, encoding, or carrying the instructions 616 for execution by the device 600, and includes digital or analog communications signals (i.e., carrier signals) or other intangible medium to facilitate communication of such software.
[0083] Figure 7 is a schematic illustration of certain components of a server in the sensor data processing system in accordance with an exemplary embodiment of the disclosed subject matter. Specifically, Figure 7 shows a diagrammatic representation of a server 700 in the sensor data processing system in the example form of a computer system, within which instructions 716 (e.g., software, a program, an application, an applet, an app, or other executable code) for causing the server 700 to perform any one or more of the methodologies discussed herein may be executed. As such, the instructions 716 may be used to implement modules or components described herein. The instructions 716 transform the general, non-programmed server 700 into a particular machine 700 programmed to carry out the described and illustrated functions in the manner described. The server 700 may be coupled (e.g., networked) to other machines, including one or more sensor devices, such as sensor device 600 in Figure 6. Further, while only a single server 700 is illustrated, the server may operate as one of a collection of similar servers that individually or jointly execute the instructions 716 to perform any one or more of the methodologies discussed herein.
[0084] The server 700 may include processors 710, memory/storage 730, and communication components 774, which may be configured to communicate with each other such as via a bus 702. The memory/storage 730 may include a memory 732, such as a main memory, or other memory storage, and a storage unit 736, both accessible to the processors 710 such as via the bus 702. The storage unit 736 and memory 732 store the instructions 716 embodying any one or more of the methodologies or functions described herein. The instructions 716 may also reside, completely or partially, within the memory 732, within the storage unit 736, within at least one of the processors 710 (e.g., within the processor’s cache memory), or any suitable combination thereof, during execution thereof by the server 700. Accordingly, the memory 732, the storage unit 736, and the memory of processors 710 are further examples of machine-readable media.
[0085] Communication may be implemented using a wide variety of technologies. The communication components 774 are operable to couple the server 700 to a network 780 and/or other devices 770 (e.g., devices 600, 670 of Figure 6) via couplings 782 & 772 respectively. The network 780 may correspond with the network 680 illustrated in Figure 6. For example, the communication components 774 may include a network interface component or other suitable device to interface with the network 780.
[0086] The instructions 716 can be transmitted or received over the network 780 using a transmission medium via a network interface device (e.g., a network interface component included in the communication components 774) and utilizing any one of a number of well-known transfer protocols (e.g., HTTP). Similarly, the instructions 716 can be transmitted or received using a transmission medium via the coupling 772 (e.g., a peer-to-peer coupling) to devices 770.
[0087] The reader will readily appreciate that, while the above description of the embodiments illustrated in Figures 1A to 7 is described in terms of pipework delivering water utilities, the disclosed system and method have wider applicability. The fluid medium may be a hydrocarbon “product” such as oil, diesel, petroleum, etc. as well as a water-based medium. Likewise, the fluid conduit may be a waste pipe, fuel line, etc. rather than water supply pipework.
SELECTED EXAMPLE EMBODIMENTS
[0088] From the preceding description it will be seen that a number of example embodiments and combinations of example embodiments are disclosed. The disclosed embodiments include, but are not limited to, the enumerated list of example embodiments that follow.
Example 1. A method for characterising fluid flow in a conduit, the method comprising: receiving a plurality of data streams from a plurality of sensor devices, each data stream including sensor data gathered by a respective sensor device following a respective trajectory through the conduit, the sensor data including a time-separated series of acceleration and orientation measurements; processing the sensor data to determine a flow profile, the flow profile including at least one conduit feature; and outputting the flow profile characterising the fluid flow.
Example 2. The method of Example 1, wherein the series of acceleration and orientation measurements includes corresponding timestamps for each measurement; and wherein the processing of the sensor data comprises aligning timestamps for respective data streams. Example 3. The method of any one of the preceding Examples, wherein the processing of the sensor data comprises processing the acceleration and orientation measurements to derive a set of inertial navigation data.
Example 4. The method of Example 3, wherein the sensor data further includes at least one absolute location measurement.
Example 5. The method of Example 4, wherein the or each absolute location measurement is determined by a GPS location device.
Example 6. The method of any one of Examples 3, 4 or 5, wherein the sensor data includes an initial location fix and a final location fix; and wherein the processing of the sensor data comprises applying a normalising correction to the inertial navigation data based on the initial location fix and the final location fix.
Example 7. The method of any one of Examples 3 to 6, wherein the respective inertial navigation data for each data stream are differentially analysed to identify conduit features.
Example 8. The method of any one of the preceding Examples, wherein the sensor data further includes video data.
Example 9. The method of any one of the preceding Examples, wherein the sensor data further includes audio data.
Example 10. The method of Example 9, further comprising: analysing the audio data to determine patterns; accessing a database of reference patterns corresponding to the presence of a conduit feature; and comparing determined patterns with reference patterns to detect the signature of the conduit feature.
Example 11. The method of any one of the preceding Examples, wherein the acceleration and orientation measurements are output by at least one of an inertial measurement unit, IMU, an accelerometer and a gyrometer. Example 12. The method of any one of the preceding Examples, wherein the conduit feature is at least one of a leak and a pump at a location in the conduit. Example 13. A sensor data processing system having a memory means (730), communication means (774) and at least one processor means (710), the at least one processor means being configured to execute a set of instructions that causes the at least one processor to carry out the method of any one of Examples 1 to 12. Example 14. A non-transitory computer readable storage medium having stored thereon instructions for causing a machine, when executing the instructions, to perform operations comprising the method of any one of Examples 1 to 12.
[0089] Further particular and preferred aspects of the present invention are set out in the accompanying independent and dependent claims. It will be appreciated that features of the dependent claims may be combined with features of the independent claims in combinations other than those explicitly set out in the claims.
[0090] Terms such as "comprises", "comprising", "has", "contains", "includes" or any other grammatical variation thereof are intended to cover a non-exclusive inclusion, such that a module, circuit, device component, structure or method step that comprises a list of elements or steps does not necessarily include only those elements or steps but, rather, may include other elements or steps not expressly listed or inherent to such module, circuit, device component or step. An element or step preceded by "comprises ... a" does not, without more constraints, preclude the existence of additional identical elements or steps that comprise the element or step.

Claims

1. A method for characterising fluid flow in a conduit, the method comprising: receiving a plurality of data streams from a plurality of sensor devices, each data stream including sensor data gathered by a respective sensor device following a respective trajectory through the conduit, the sensor data including a time-separated series of acceleration and orientation measurements; processing the sensor data to determine a flow profile, the flow profile including at least one conduit feature; and outputting the flow profile characterising the fluid flow.
2. The method of claim 1, wherein the series of acceleration and orientation measurements includes corresponding timestamps for each measurement; and wherein the processing of the sensor data comprises aligning timestamps for respective data streams.
3. The method of any one of the preceding claims, wherein the processing of the sensor data comprises processing the acceleration and orientation measurements to derive a set of inertial navigation data.
4. The method of claim 3, wherein the sensor data further includes at least one absolute location measurement.
5. The method of claim 4, wherein the or each absolute location measurement is determined by a GPS location device.
6. The method of any one of claims 3, 4 or 5, wherein the sensor data includes an initial location fix and a final location fix; and wherein the processing of the sensor data comprises applying a normalising correction to the inertial navigation data based on the initial location fix and the final location fix.
7. The method of any one of claims 3 to 6, wherein the respective inertial navigation data for each data stream are differentially analysed to identify conduit features.
8. The method of any one of the preceding claims, wherein the sensor data further includes video data.
9. The method of any one of the preceding claims, wherein the sensor data further includes audio data.
10. The method of claim 9, further comprising: analysing the audio data to determine patterns; accessing a database of reference patterns corresponding to the presence of a conduit feature; and comparing determined patterns with reference patterns to detect the signature of the conduit feature.
11. The method of any one of the preceding claims, wherein the acceleration and orientation measurements are output by at least one of an inertial measurement unit, IMU, an accelerometer and a gyrometer.
12. The method of any one of the preceding claims, wherein the conduit feature is at least one of a leak and a pump at a location in the conduit.
13. A sensor data processing system having a memory means (730), communication means (774) and at least one processor means (710), the at least one processor means being configured to execute a set of instructions that causes the at least one processor to carry out the method of any one of claims 1 to 12.
14. A computer readable storage medium carrying instructions for causing a machine, when executing the instructions, to perform operations comprising the method of any one of claims 1 to 12.
PCT/EP2020/073248 2019-08-19 2020-08-19 Sensor data processing WO2021032800A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
GB1911874.4 2019-08-19
GBGB1911874.4A GB201911874D0 (en) 2019-08-19 2019-08-19 Sensor data processing

Publications (1)

Publication Number Publication Date
WO2021032800A1 true WO2021032800A1 (en) 2021-02-25

Family

ID=68099431

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2020/073248 WO2021032800A1 (en) 2019-08-19 2020-08-19 Sensor data processing

Country Status (2)

Country Link
GB (1) GB201911874D0 (en)
WO (1) WO2021032800A1 (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030056607A1 (en) * 1999-05-28 2003-03-27 Baker Hughes Incorporated Method for utilizing microflowable devices for pipeline inspections
DE102015206535A1 (en) * 2015-04-13 2016-10-13 Robert Bosch Gmbh Mobile device, method and system for monitoring material transport lines

Also Published As

Publication number Publication date
GB201911874D0 (en) 2019-10-02

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20771185

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20771185

Country of ref document: EP

Kind code of ref document: A1