rescaled range analysis

Rescaled range analysis (R/S analysis) is a statistical method, introduced by Harold Edwin Hurst, for measuring the variability and long-range dependence of a time series. For a window of n observations, one computes the range R of the cumulative deviations from the mean and divides it by the standard deviation S of the window; the expected value of the rescaled range R/S grows roughly as n^H, where H is the Hurst exponent. This scaling helps identify persistence in the data: H near 0.5 is consistent with uncorrelated noise, H above 0.5 indicates persistent (trending) behavior, and H below 0.5 indicates anti-persistent (mean-reverting) behavior.
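The procedure above can be sketched in a few lines of Python. This is a minimal illustration, not a reference implementation: the function names `rescaled_range` and `hurst_exponent` are made up for this example, and the simple doubling of window sizes plus a least-squares fit on the log-log scaling is one common but basic way to estimate H.

```python
import numpy as np

def rescaled_range(x):
    """R/S statistic for a 1-D series: range of the cumulative
    deviations from the mean, divided by the standard deviation."""
    x = np.asarray(x, dtype=float)
    z = np.cumsum(x - x.mean())      # cumulative deviations from the mean
    r = z.max() - z.min()            # range R of those deviations
    s = x.std()                      # standard deviation S of the series
    return r / s

def hurst_exponent(x, min_window=8):
    """Estimate H from the scaling E[R/S] ~ n**H over growing windows."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    sizes, rs_means = [], []
    size = min_window
    while size <= n // 2:
        # average R/S over non-overlapping windows of this size
        rs = [rescaled_range(x[i:i + size])
              for i in range(0, n - size + 1, size)]
        sizes.append(size)
        rs_means.append(np.mean(rs))
        size *= 2
    # slope of log(R/S) versus log(n) estimates the Hurst exponent
    h, _ = np.polyfit(np.log(sizes), np.log(rs_means), 1)
    return h

rng = np.random.default_rng(0)
white_noise = rng.standard_normal(4096)
print(hurst_exponent(white_noise))  # should be near 0.5, with some small-sample bias
```

Note that the simple R/S estimator is known to be biased upward for short series; corrections such as the Anis-Lloyd adjustment are often applied in practice.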
