best way to handle large amounts of information

Started by mikeburr, December 01, 2021, 06:32:22 AM


mikeburr

I'm thinking of creating a database related to the number work I do here. The problem is that the prime numbers alone constitute, I think, 14 (or possibly 16) files, each 2**14 in size. There's a lot of other data, some of it much larger than the primes files and in a variable-length format, often created in memory as linked lists with sub-lists and indexes. There's a bit of a mish-mash of functions in some programs, and a couple are in 64-bit, which I'm beginning to like quite a lot for its more rigorous style.
So my questions revolve around:
1) Should I use a server-like arrangement, with small pieces of functionality called through something like a named-pipe arrangement? (Is there an example of this somewhere?)
2) And/or should I use memory-mapped files in another stand-alone process and leave the functionality where it is, in the programs?
Another possibility is that the files could be combined and then overlaid into the programs. Does 64-bit give greater memory per program? (I vaguely recall that you can get 4 GB under 32-bit, not that this probably helps much.) And are there any examples of large file handling? I think that you, Hutch, have probably done some.

Any advice gratefully received. I've no specific ideas yet, so any suggestions would be helpful; there's obviously no single correct way to solve this sort of thing.
Regards, Mike B

hutch--

Mike,

I have not fully comprehended what you are after, but in terms of how and where you are working on big data, it is usually a hierarchy depending on location. On a local machine with enough memory, a memory-mapped file has the legs. If it's a machine on your local network with reliable networking, probably UDP; but if it's a remote machine, it's TCP for its reliability.

I have been working for a while on using memory-mapped files for interfacing between 32-bit and 64-bit apps, and as long as you stay within the memory limits of 32-bit, it's slick and fast. Allocate a memory-mapped file just under the 32-bit limit and you can pass big data both ways.

Now, there is big data and there is big data; depending on how big the data is, you will have to design a tiling technique to move it from one location to another if it is very large. Sounds like a fun project you have in mind.

mikeburr

@hutch
Most excellent reply. I remember you posted an example of a memory-mapped file arrangement, and this is pretty much what I had in mind. What I'm not so certain about is whether to retain the functionality in the individual programs or to have them largely look up data (akin to smart versus dumb terminal configurations).
Many thanks, Mike B