H = DFA(X) calculates the Hurst exponent of the time series X using Detrended Fluctuation Analysis (DFA).

DFA(X,D) uses the second input parameter D to control the box sizes. If D is a vector of increasing natural numbers, it defines the box sizes that the sample is divided into (each value in D must be a divisor of the length of X). If D is a scalar (default D = 10), it is treated as the smallest box size that the sample can be divided into. In that case the optimal sample length OptN and the vector of divisors for that length are computed automatically. OptN is defined as the length that has the most divisors among all lengths shorter than X by no more than 1%, and the input series X is truncated at its OptN-th value.

[H,PV95] = DFA(X) additionally returns the empirical confidence intervals PV95 at the 95% level.

[H,PV95,P] = DFA(X) additionally returns P, the average standard deviations of the detrended walk for all the divisors.