Random number visualization tool

Started by NoCforMe, August 07, 2024, 03:50:59 PM


NoCforMe

How 'bout starting by just 'splaining it to us? How it works, without too many gory details.
Assembly language programming should be fun. That's why I do it.

daydreamer

When I made a program for the simt challenge using a 128-bit SIMD PRNG instead, I also made an .inc file with a prime-number LUT for the primes between 0 and 65535.
I downloaded ent and also tested the entropy of those prime numbers, and got the highest entropy (= uniqueness).
That suggests that randomly shuffled prime numbers would make the highest-entropy PRNG, but probably not a useful one if you want random numbers between 0 and 1000.

What about the PRNG that's included in the masm32 package?
I used it in my unfinished labyrinth game for the randomly generated treasure found in treasure chests: a random amount of coins, where a deeper level of the labyrinth (level 1, 2, 3, 4) means 1, 2, 3, 4 digits of coins, better-quality items, and more items found.
I use random numbers to index into 8-char strings: one describes what kind of item you get, plus another string for the material it's made of.
For example sword, shield, armour, axe, ring, helmet, combined with wood, bronze, iron, steel, mythril, silver, gold, platinum, shows what quality the item is.
A message box pops up with all that info when you open a treasure chest.
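The fixed-width string lookup described above could be sketched like this in C; the item and material names come from the post, but the 8-char entry width layout and the pick helper are my assumptions about the scheme:

```c
#include <string.h>

#define ENTRY 8  /* each name padded to 8 chars, as described */

static const char items[]     = "sword   shield  armour  axe     ring    helmet  ";
static const char materials[] = "wood    bronze  iron    steel   mythril silver  gold    platinum";

/* Map a random number to a fixed-width entry in a string table:
   no per-string pointers needed, just (index * width). */
const char *pick(const char *table, int nentries, unsigned r)
{
    return table + (r % nentries) * ENTRY;
}
```

Two PRNG outputs then give e.g. "mythril" + "sword", and the same index arithmetic works unchanged in assembly with a scaled addressing mode.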
my non-asm creations
https://masm32.com/board/index.php?topic=6937.msg74303#msg74303
I am an Invoker
"An Invoker is a mage who specializes in the manipulation of raw and elemental energies."
Like SIMD coding

NoCforMe

I created a couple of random-# files with a tool I wrote and ran them through the utility I linked to above (which I renamed randtest.exe). Here are the results:

1,000 random #s:
Entropy = 7.840096 bits per byte.

Optimum compression would reduce the size
of this 1000 byte file by 1 percent.

Chi square distribution for 1000 samples is 213.95, and randomly
would exceed this value 97.10 percent of the times.

Arithmetic mean value of data bytes is 129.8050 (127.5 = random).
Monte Carlo value for Pi is 3.036144578 (error 3.36 percent).
Serial correlation coefficient is 0.051742 (totally uncorrelated = 0.0).

1,000,000 random #s:
Entropy = 7.999793 bits per byte.

Optimum compression would reduce the size
of this 1000000 byte file by 0 percent.

Chi square distribution for 1000000 samples is 287.07, and randomly
would exceed this value 8.17 percent of the times.

Arithmetic mean value of data bytes is 127.4709 (127.5 = random).
Monte Carlo value for Pi is 3.143628575 (error 0.06 percent).
Serial correlation coefficient is -0.001882 (totally uncorrelated = 0.0).

I still don't really understand what all those numbers mean; perhaps someone here could explain. I assume "entropy" is a rough measure of randomness; the closer it is to 8 (bits/byte), the better?

The larger file seems to do better than the smaller one.
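For what it's worth, the Monte Carlo figure in that output can be approximated with the sketch below. It treats successive byte pairs as (x, y) points in a square and counts how many land inside the inscribed quarter circle; ent itself groups six bytes into 24-bit coordinates, so this 8-bit version is coarser:

```c
#include <stddef.h>

/* Simplified version of ent's Monte Carlo pi test: successive byte
   pairs become (x, y) points in a 256x256 square; the fraction that
   falls inside the quarter circle of radius 256 approaches pi/4
   for uniformly random input, so pi ~= 4 * inside / total. */
double monte_carlo_pi(const unsigned char *buf, size_t len)
{
    size_t pairs = len / 2, inside = 0;
    for (size_t i = 0; i < pairs; i++) {
        double x = buf[2 * i];
        double y = buf[2 * i + 1];
        if (x * x + y * y < 256.0 * 256.0)
            inside++;
    }
    return pairs ? 4.0 * (double)inside / (double)pairs : 0.0;
}
```

That's why the 1,000,000-byte file lands so much closer to pi than the 1,000-byte one: the estimate converges slowly, as 1/sqrt(n).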

I've included my random-# generating tool, RandFile.exe. Usage is
randfile n:<# of random #s> f:<name of file to write>

(Don't include the angle brackets.) It writes 1-byte numbers between 0 and 255 to the file, using the Lehmer function.
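A minimal Lehmer generator of the kind described might look like this; the multiplier 48271 and modulus 2^31-1 are the standard MINSTD constants, not necessarily the ones RandFile.exe uses:

```c
#include <stdint.h>

/* Lehmer / Park-Miller generator: state = state * a mod m, with
   a = 48271 and m = 2^31 - 1 (MINSTD). The seed must be nonzero.
   The low 8 bits of the state serve as the output byte. */
static uint32_t lehmer_state = 1;

unsigned char lehmer_byte(void)
{
    lehmer_state = (uint32_t)(((uint64_t)lehmer_state * 48271u) % 0x7fffffffu);
    return (unsigned char)(lehmer_state & 0xff);
}
```

Note that taking only the low byte is the weak spot of a plain multiplicative generator; using the high bits of the state instead generally tests better in ent.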

daydreamer

Entropy is good for testing whether your own PRNG gets better or worse results.
What about the VC++ directive
#define WIN32_LEAN_AND_MEAN
Does "mean" have to do with the arithmetic mean, or something else?
Does it mean a compromise among the library include files?
