Welcome back to the Ubiq research series, dedicated to explaining key parts of the Ubiq ecosystem.

This time we are focusing on the Flux Difficulty Algorithm, our home-grown solution for ensuring a consistent block time and mining difficulty, despite fluctuating mining hashrate on the network.

### Terminology

As with our other research blog posts, it is useful to first set out the terminology we use in this article.

Difficulty — A measure of how much mathematical computation is required to find the solution for the next block. The greater the difficulty, the more computation cycles (hashes) will be required, on average, to find a valid solution.

Hashrate — The speed at which a miner (or the network as a whole) computes hashes, typically measured in hashes per second.

Block Time — The time elapsed between the previous block being found and the current block being found.

Flux Difficulty Algorithm — Code that the Ubiq development team created for difficulty adjustment, targeting an average block time of 88 seconds despite fluctuating hashrate on the network.
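To make the relationship between these terms concrete, here is an illustrative sketch. The real relationship on Ubiq depends on the proof-of-work target encoding, but to a first approximation the expected block time is the difficulty (hashes needed) divided by the network hashrate (hashes per second). The numbers below are made up for illustration.

```python
def expected_block_time(difficulty_hashes: float, hashrate_hps: float) -> float:
    """Approximate seconds to find a block: hashes needed / hashes per second.

    A simplification for illustration only; real PoW difficulty encodings
    differ in detail.
    """
    return difficulty_hashes / hashrate_hps

# Hypothetical numbers: a difficulty requiring ~8.8e12 hashes on a
# network hashing at 1e11 H/s yields roughly the 88-second target.
print(expected_block_time(8.8e12, 1.0e11))  # 88.0
```

This is also why difficulty must track hashrate: if hashrate doubles and difficulty stays fixed, the expected block time halves.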

### Ubiq’s nucleus and changing the difficulty algorithm

At the nucleus of Ubiq, the DigiByte DigiShield algorithm was originally used to adjust the difficulty per block. This did not fare well on a network with a large hashrate, as seen in the graphic below.

The first section (until around block 4000) represents the original algorithm. Difficulty fluctuated greatly over short periods, leading to network instability. The second section (from around block 4000–8000) saw the parameters of the DigiShield algorithm modified, which improved the difficulty adjustments but was still not stable enough: block times still ranged from as little as 2 seconds to as long as 10–20 minutes.

Observing this instability, the Ubiq development team set to work creating and testing an alternative solution, which would become the Flux Difficulty Algorithm. From block 8000 onwards you can observe the reduced variability in difficulty, bringing stable blocks to the network in a timely manner.

### How does Flux Function?

Flux targets an average block time of 88 seconds and dynamically changes block difficulty by using two mechanisms.

The first looks at block times over the previous 88 blocks. It determines whether the difficulty needs to be increased or decreased based on how far the average over those blocks sits above or below the 88-second target.

The second acts as a throttle/dampener, analyzing the block time of the most recent block to determine whether the difficulty adjustment should be a full or partial change, and, if partial, what the value of the change should be.
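The two mechanisms can be sketched as follows. This is not the actual Flux implementation (which lives in Ubiq's Go client); the window size matches the 88-block average described above, but the step sizes, the halving rule for partial adjustments, and all function and constant names are hypothetical, chosen only to show how an averaging mechanism and a last-block dampener can combine.

```python
TARGET_SECONDS = 88   # target block time
WINDOW = 88           # number of previous blocks averaged by mechanism one

def flux_adjust_sketch(difficulty: int, recent_block_times: list[int]) -> int:
    """Toy two-step difficulty adjustment (hypothetical constants).

    Mechanism one: compare the average of the last WINDOW block times to
    the 88-second target to pick the direction of the change.
    Mechanism two: use the most recent block time as a dampener, taking
    only a partial step when it disagrees with the windowed average.
    """
    window = recent_block_times[-WINDOW:]
    avg = sum(window) / len(window)
    last = window[-1]

    # Mechanism one: slow blocks (avg above target) -> lower difficulty.
    direction = -1 if avg > TARGET_SECONDS else 1

    # Mechanism two: full step only if the last block deviated in the
    # same direction as the average; otherwise a partial (halved) step.
    full_step = (last > TARGET_SECONDS) == (avg > TARGET_SECONDS)
    step = difficulty // 100              # hypothetical 1% base step
    if not full_step:
        step //= 2                        # partial adjustment

    return difficulty + direction * step

# Example: 88 consecutive slow blocks of 100s each -> full downward step.
print(flux_adjust_sketch(1_000_000, [100] * 88))  # 990000
```

The design point this illustrates is that the windowed average sets the direction while the most recent block throttles the magnitude, which is what prevents a single outlier block from swinging the difficulty violently.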

As you can see from the example above, the block time was running above the 88-second target. Based on both the 88-block average and the previous block time, the Flux algorithm identified that a full adjustment was required to lower the difficulty.

### Tracking the performance of Flux

All of this information is good in theory. However, if we benchmark the performance of Flux, what results do we see?

The chart above benchmarks the average block time over a three-month period, using verifiable data from the Ubiq blockchain. As you can see, the average line in the lighter color oscillates around the 88-second mark, moving above and then below the target as the algorithm adjusts. Overall, the average stays within the margin of error of the 88-second block time target.

In addition, https://ubiq.darcr.us/ has collected network statistics since the nucleus block. It reports a total average block time across all blocks of 87.49 seconds, close to the 88-second target.
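A long-run figure like this can be reproduced directly from chain data: given the timestamps of consecutive blocks, the average block time is simply the timestamp span divided by the number of intervals. A minimal sketch (the timestamps below are made up, not real Ubiq data):

```python
def average_block_time(timestamps: list[int]) -> float:
    """Average seconds per block across consecutive block timestamps."""
    if len(timestamps) < 2:
        raise ValueError("need at least two blocks")
    return (timestamps[-1] - timestamps[0]) / (len(timestamps) - 1)

# Hypothetical timestamps (Unix seconds) for four consecutive blocks:
print(average_block_time([0, 90, 176, 264]))  # 88.0
```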

All of these factors contribute to a more stable network for the developers and businesses using it, underlining our commitment to being the most Enterprise Stable smart contracts platform.