Steel Wire Rope Bending Cycle Fatigue

Part Five – Structural and Hoist Integrity Management

The series titled Wire Rope Integrity Management is to be read in conjunction with the Whole Life Management of Steel Wire Ropes procedure and consists of the following elements:

Part One – Bending Cycle Counting

Part Two – Initial Setting of Trigger Action and Trigger Discard Values

Part Three – Empirical Assessment of Steel Wire Rope Service Life Using Bending Fatigue Calculation

Part Four – Practical Example of Bending Fatigue Calculator Hardware

Part Five – Structural Fatigue Cycle Counting

The following describes a hardware solution provided to a Well Intervention System which had wire rope cycle fatigue issues.

Counting Process

The Counting Process or Program (CP) is a mechanism allowing the cumulative assessment of stress cycles affecting an item, or items, of Lifting Equipment.

To allow for this the Lifting Equipment must have a load cell. Load information is recorded either automatically or manually, along with a time stamp for when the discrete recording entries were made.

The CP is a standalone programme which interprets the load data into a number of stress cycles.

Data received into the CP is initially normalised to ensure it is in a format on which the CP can run its various processes.

Smoothing may be applied to data which is considered to be particularly noisy.

Typically, Trend Consistency filtering is applied. As this is considered an essential process in this document, it is integrated with the requisite data slicing process.

The Stress Cycles are counted in the discrete slices; these are accumulated and compared against Trigger Action Values. The CP will output information in a recognisable format.
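The overall flow described above can be sketched as follows. All function names are illustrative assumptions, and the simple peak/trough count here stands in for the full Rainflow procedure described later in this document.

```python
# Sketch of the CP flow: slice the load history wherever it rises above the
# Load Threshold Trigger (LTT), count stress reversals per slice, accumulate,
# and compare against a Trigger Action Value. Names are illustrative only.

def slice_by_ltt(loads, ltt):
    """Yield runs of consecutive readings above the LTT (one data slice each)."""
    current = []
    for load in loads:
        if load > ltt:
            current.append(load)
        elif current:
            yield current
            current = []
    if current:
        yield current

def count_reversals(data_slice):
    """Count direction changes (peaks and troughs) within one slice."""
    reversals = 0
    for a, b, c in zip(data_slice, data_slice[1:], data_slice[2:]):
        if (b - a) * (c - b) < 0:  # sign change: a peak or a trough
            reversals += 1
    return reversals

def run_counting_process(loads, ltt, trigger_action):
    """Return the accumulated count and whether it reaches the Trigger Action Value."""
    total = sum(count_reversals(s) for s in slice_by_ltt(loads, ltt))
    return total, total >= trigger_action
```

Each slice is processed independently, which matches the data-slicing behaviour described in the Stress Cycle Counting section below.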

Data Smoothing and Trend Consistency Filtering

For data streams containing noisy data the detection of what is a peak or a trough may become very difficult.

The above trace shows a peak and a trough, but without some form of filtering any CP would identify the rapid fluctuations as stress reversals and attempt to count them. Processing such noisy data may take a long time.

As such, two methods are offered to ease this process:

Trend Consistency

Data Smoothing.

A third method may be developed which determines the validity of a datapoint based upon statistical analysis. This approach is not developed in this revision of the document.

Trend Consistency FT(x,y)

This is the simplest form of filtering. To acknowledge a reversal the CP must detect two or more (typically three) readings of consistently increasing or decreasing magnitude.

Two parameters are required to perform this filtering:

Number of Points required  before a reversal is considered to have occurred.

Exclusion deviation (the amount by which a datapoint must vary from the preceding datapoint for it to be considered).

Trend Consistency Filtering is given the nomenclature FT(x,y) where

X is the number of continuous points required to determine that a reversal has occurred.

Y is the Exclusion deviation; this is a fractional value relative to the preceding datapoint which the subsequent point must exceed (or fall below) for the points to be considered for reversal. A typical value is 0.01. For example, with an SWL of 100, stress variations of less than 1 are ignored.
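The exclusion deviation test can be expressed as a single comparison. The function name below is a hypothetical illustration, implementing Y as a fraction of the preceding datapoint per the definition above; with a preceding value of 100 and y = 0.01 it reproduces the worked example.

```python
# Sketch of the Exclusion deviation test (the y in FT(x,y)): a point is only
# considered for reversal detection when its change from the preceding
# datapoint exceeds the fraction y of that preceding value.

def passes_exclusion(previous, current, y=0.01):
    """True when |current - previous| exceeds the exclusion deviation y * |previous|."""
    return abs(current - previous) > abs(previous) * y
```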

Methodology of Trend Consistency

Trend direction is given the nomenclature Td and is Boolean for programming purposes. For a rising load Td is True; for a falling load Td is False.

Trend Consistency is applied at the beginning of a data slice, and Td is always initially considered True. Depending on operational experience there may be advantage in setting Td to False when counting starts from elevated loads. This is not explored further.

A special case is where x is one, i.e. FT(1,y); each point is then treated as if Trend Consistency filtering were not applied, and only the Exclusion deviation is applied. Care should be taken when applying this mode, as the additional validity checks must be sufficiently robust to recognise low-amplitude, long-time-base fluctuations. There may be advantage in applying Smoothing before running in such a mode.

A Data Slice start signal initiates the process.

The CP reads the Load-Time datapoint and sets Td to True.

The load is compared to the LTT plus any exclusion deviation applied (the exclusion deviation is applied to the LTT as there is no previous DP).

If the LTT plus Y is not exceeded then the next DP is stepped to.

If the LTT plus Y is exceeded then the next DP is stepped to and the program moves to the main counting program.

Once initiated, Trend Consistency progresses in a series of Do Until…Loop routines checking for Stress Reversals.

Once detected, these are flagged in the CP Dataset/Dataslice. Subsequent to this the Stress Counting Routine is run against the data.
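The scan described above can be sketched as follows. This is an illustrative assumption of one way to implement FT(x,y), not the CP's actual routine: starting with Td = True, a reversal is flagged only after x consecutive counter-trend readings, each exceeding the exclusion deviation y relative to its predecessor.

```python
# Sketch of a Trend Consistency scan FT(x, y). Returns the indices of the
# confirmed turning points (peaks/troughs) which the Stress Counting Routine
# would later process. Names and structure are illustrative.

def find_reversals(points, x=3, y=0.01):
    """Return indices in `points` where a trend reversal is confirmed."""
    reversals = []
    td = True   # trend direction: True = rising load
    run = 0     # consecutive points moving against the current trend
    for i in range(1, len(points)):
        step = points[i] - points[i - 1]
        if abs(step) <= abs(points[i - 1]) * y:
            continue            # within the exclusion deviation: ignore
        rising = step > 0
        if rising == td:
            run = 0             # trend continues
        else:
            run += 1
            if run >= x:        # x consistent counter-trend points seen
                reversals.append(i - x)   # index of the turning point itself
                td = not td
                run = 0
    return reversals
```

With x = 2, the sequence 10, 11, 12, 13, 12, 11, 10, 11, 12 yields the peak at index 3 and the trough at index 6.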

Application of Trend Consistency Filtering

It is recommended that this rule be applied to all but the coarsest of datastreams. The number of consistent data points should be set based on the expected noise in the system. It is recommended that this be a user-variable parameter.

Data Smoothing FS(x)

Data smoothing is a more aggressive form of data manipulation, as it involves the modification of data by averaging several points into a single averaged one.


Da = (1/n) × Σ D(y), for y = p to p + n − 1

where:

Da = Averaged Data Point Value

n = Number of datapoints to be averaged

D(y) = Data Point value at position y

p = Position of the first datapoint of the averaged group in the Dataset

Where a visual representation of the data is to be made, it is necessary to assign the time Ya at which the Averaged Data Point (Da) is considered to have occurred.

The time value Ya for placement of the Averaged Data value Da is determined by the following:

Ya = (1/n) × Σ Y(y), for y = p to p + n − 1

where:

Ya = Time point at which the Averaged Datapoint is assigned

Y(y) = Time value at position y

n = Number of datapoints to be averaged

p = Position of the first datapoint of the averaged group in the Dataset

Data smoothing has the nomenclature FS(x), where x is the number of datapoints averaged.

Application of data smoothing Filter

It is recommended for all but the coarsest of data streams that some degree of Smoothing is applied. Too high a degree of smoothing may prevent the CP from accurately reporting the stress magnitudes and cycles. It is recommended this is a user-variable parameter.


An FS(3), or 3-point, Data Smoothing is applied.
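Under the formulas above, FS(n) smoothing can be sketched as a block average: each group of n datapoints is replaced by its average Da, placed at the averaged time Ya of the group. Non-overlapping groups are assumed here for simplicity; the function name is illustrative.

```python
# Sketch of FS(n) data smoothing: Da is the mean load of each n-point block
# and Ya is the mean time of the same block, per the Da and Ya formulas above.

def smooth_fs(times, loads, n=3):
    """Return (Ya, Da) lists of n-point block averages of a load history."""
    ya, da = [], []
    for p in range(0, len(loads) - n + 1, n):  # step one whole block at a time
        da.append(sum(loads[p:p + n]) / n)
        ya.append(sum(times[p:p + n]) / n)
    return ya, da
```

For example, FS(3) applied to loads 10, 20, 30, 30, 20, 10 at times 0–5 reduces six points to two averaged points of 20, placed at times 1 and 4.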

Stress Cycle Counting

Calculating Theory

Extraction of the load cycles from the load recordings will use Rainflow Counting. This requires the imagination of the load history as a pagoda roof down which water flows. The load magnitude is considered to be the horizontal distance the water is able to flow until it spills off the edge of the roof or is interrupted by flow from the roof above. The 'Load Threshold Trigger' [LTT] is introduced to prevent spurious counting with an empty hook.

It is beyond the scope of this document to fully describe Rainflow counting, but the following golden rules apply:

Counting is made in independent Upstream and Downstream processes. The two values are accumulated into a single cycle during the processing stage.

Each Data Slice is considered independent of the others, including those adjacent.

A cycle (or, to be more accurate, a half cycle) completes when the following is true; this is always taken as within the specific Dataslice, except when in Continuous mode:

A count magnitude is complete when the flow over the edge does not meet another roof section (see termination of flow from A at point E below).

The flow over the roof is terminated where it is interrupted by flow from a roof above (see termination of flow from point C at the interruption occurring at D).

It flows opposite a peak or trough of greater magnitude (see Fig 6.1.b; flow A-B does not interrupt flow C-D).
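One common formulation of these rules is the three-point rainflow method of ASTM E1049, sketched below for a single data slice. Note this sketch counts full and half cycles directly rather than in the separate upstream/downstream passes described above; it is offered as an illustration of the technique, not as the CP's implementation.

```python
# Sketch of three-point rainflow extraction (after ASTM E1049) applied to
# one data slice, i.e. a sequence of already-filtered peaks and troughs.
# Returns the load ranges of full cycles and of residual half cycles.

def rainflow(turning_points):
    """Return (full_cycle_ranges, half_cycle_ranges) for one data slice."""
    stack, full, half = [], [], []
    for point in turning_points:
        stack.append(point)
        while len(stack) >= 3:
            x = abs(stack[-1] - stack[-2])  # range of the newest pair
            y = abs(stack[-2] - stack[-3])  # range of the previous pair
            if x < y:
                break                       # flow continues; read more points
            if len(stack) == 3:
                half.append(y)              # range contains the start point
                stack.pop(0)
            else:
                full.append(y)              # inner range closes a full cycle
                del stack[-3:-1]
    # whatever remains on the stack counts as residual half cycles
    half.extend(abs(stack[i + 1] - stack[i]) for i in range(len(stack) - 1))
    return full, half
```

Applied to the ASTM worked sequence -2, 1, -3, 5, -1, 3, -4, 4, -2, this yields one full cycle of range 4 and half cycles of ranges 3, 4, 6, 8, 8, 9.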

Data-Slicing  or Continuous Reading Mode

The typical mode of processing the load history is Data-Slicing. Each time the load rises above the LTT a new data slice is started. Load fluctuations due to dynamic effects are recognised and counted.

Continuous Reading Mode counts continuously, with the cycle stop points determined by the relationship between load 'lows': when the minimum load in a cycle is less than that seen in the previous cycle, the previous cycle is considered complete and the upstream count is made against it. The downstream count may be done in real time.

The former method lends itself to the processing of noisy historical data. The latter method may better allow for real-time demonstration of the current accumulation status, but requires complex and resource-hungry processing. As such it is not further considered in this revision of the document.

Calculating Effects of Load Cycles

Load cycles are summated and utilised in one of two ways, according to the available information. These ways are:

Manufacturer Limitation

Duty Counting


Safety Factor for Duty Counting

A safety factor shall be applied depending upon the method by which the data has been recorded.

Manufacturer Limitation

This is the preferred method as it greatly simplifies the assessment of utilisation. The calculation is made depending on what information is released by the Manufacturer.

For Structure

There are a number of Limitations that may be used

Number of Work Cycles U

Cumulative Loading K

Crane Classification A

For Hoisting Arrangement

Design Limit Duration D

Mechanism Classification M

Hoisting Arrangement calculations differ from Structural calculations in that they