The square root of a number that is not a perfect square can only be obtained to whatever precision the hardware or the algorithm you are using provides. The best precision you can expect from 128-bit arithmetic is the 128^{th} bit. Depending on your rounding mode, that last bit is rounded according to the 129^{th} bit, which there is no room to keep.

In the 'down' and 'truncate' modes, the value of that 129^{th} bit is disregarded entirely, but it can be significant in the 'nearest' mode. The reported difference would then be that 129^{th} bit, which has essentially no significance when the value is converted to a decimal (ASCII) string, whether you display it with 1 or 100 decimal digits: 2^{-128} is approximately 10^{-39}.
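A sketch of this effect using Python's `decimal` module (the choice of sqrt(2) and of 40 significant digits, roughly the decimal equivalent of 2^{-128}, are illustrative assumptions, not anything mandated by the discussion above): compute the root with extra guard digits, then round back to the target precision under two different modes. The two results differ by exactly one unit in the last place, which is the "129^{th} bit" of this decimal analogy.

```python
from decimal import Decimal, getcontext, ROUND_DOWN, ROUND_HALF_EVEN

# Work at higher precision so the "extra" digits exist...
getcontext().prec = 50
full = Decimal(2).sqrt()  # sqrt(2) carried to 50 significant digits

# ...then round back to 40 significant digits (1 integer digit + 39
# fractional digits) under two different rounding modes.
target = Decimal("1E-39")
nearest = full.quantize(target, rounding=ROUND_HALF_EVEN)  # 'nearest'
truncated = full.quantize(target, rounding=ROUND_DOWN)     # 'truncate'

print(nearest)
print(truncated)
print(nearest - truncated)  # at most one unit in the last place
```

For sqrt(2) the guard digit happens to round the 'nearest' result up, so the two strings disagree in their final digit by 10^{-39}, mirroring the 2^{-128} difference described above.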