Using memory mapping for large files in Delphi
I rarely use memory mapping of large files, but I recently ran into a requirement for it, so I am writing down the process as an introduction. Since the application was not complicated, there may be cases I have not considered; comments and corrections are welcome.
For small files, an ordinary file stream is enough to get the job done. For very large files, however, say 2 GB or more, a file stream is no longer practical, so you need the memory-mapping functions of the Windows API. Even with memory mapping, the entire file cannot be mapped at once, so the mapping must be done in chunks, processing a small portion at a time.
Let’s look at the relevant API functions first:
CreateFile: open the file
GetFileSize: get the file size
CreateFileMapping: create a file-mapping object
MapViewOfFile: map a view of the file into memory
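Before going further, the basic call sequence for a read-only mapping looks roughly like this. This is a minimal sketch, assuming the Windows and SysUtils units; the procedure name MapFirstBlock and the 64 KB view size are only illustrative and are not part of the example later in this article.

procedure MapFirstBlock(const AFileName: string);
var
  hFile, hMap: THandle;
  fileSize, viewSize: DWORD;
  pView: Pointer;
begin
  // Open the file for reading
  hFile := CreateFile(PChar(AFileName), GENERIC_READ, FILE_SHARE_READ, nil,
    OPEN_EXISTING, FILE_ATTRIBUTE_NORMAL, 0);
  if hFile = INVALID_HANDLE_VALUE then Exit;
  try
    fileSize := GetFileSize(hFile, nil);
    // Create the mapping object; 0/0 for the size means "the whole file"
    hMap := CreateFileMapping(hFile, nil, PAGE_READONLY, 0, 0, nil);
    if hMap = 0 then Exit;
    try
      // Map only the first 64 KB (or less, if the file is smaller);
      // a view does not have to cover the whole file
      if fileSize < 65536 then viewSize := fileSize else viewSize := 65536;
      pView := MapViewOfFile(hMap, FILE_MAP_READ, 0, 0, viewSize);
      if pView = nil then Exit;
      try
        // ... read the data through pView here ...
      finally
        UnmapViewOfFile(pView);
      end;
    finally
      CloseHandle(hMap);
    end;
  finally
    CloseHandle(hFile);
  end;
end;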
According to the help for MapViewOfFile, the file offset passed to it must be an integer multiple of the system's allocation granularity, which on most machines is 64 KB (65536 bytes). In practice, however, our reads rarely line up that way: we may want to start at any position and read any length, so some extra handling is needed.
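For example, one way to handle an arbitrary start position is to round the offset down to a granularity boundary and then skip the difference in the returned pointer. The sketch below assumes the Windows and Classes units, a read-only mapping object hMap that already exists, and offsets that fit in 32 bits; ReadAtArbitraryOffset, Offset, Count and Dest are illustrative names, not part of the code later in this article.

// Read Count bytes starting at an arbitrary Offset within the mapping hMap
// and write them to Dest.
procedure ReadAtArbitraryOffset(hMap: THandle; Offset, Count: Cardinal; Dest: TStream);
var
  sysInfo: TSystemInfo;
  granularity, alignedOffset, delta: Cardinal;
  pBase: Pointer;
  p: PAnsiChar;
begin
  GetSystemInfo(sysInfo);
  granularity := sysInfo.dwAllocationGranularity;  // usually 65536 bytes
  // MapViewOfFile requires the offset to be a multiple of the granularity,
  // so round it down and remember how many bytes were skipped
  alignedOffset := (Offset div granularity) * granularity;
  delta := Offset - alignedOffset;
  // Map enough extra bytes to cover the part that was rounded away
  pBase := MapViewOfFile(hMap, FILE_MAP_READ, 0, alignedOffset, Count + delta);
  if pBase = nil then Exit;
  try
    p := pBase;
    Inc(p, delta);            // step past the skipped bytes
    Dest.Write(p^, Count);    // this is the data we actually wanted
  finally
    UnmapViewOfFile(pBase);   // always unmap with the base address returned by MapViewOfFile
  end;
end;

The same idea appears in the full code further down, except that there the offset is always advanced in whole blocks, so it stays aligned by construction.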
The task in this example is to read length values one by one from a length list (FInfoList), and then read data of those lengths, in sequence, from a large file (FSourceFileName). With a small file this would be easy: load the file into a stream once and read from it sequentially. With a large file, we have to keep moving the mapped view to reach the data we want.
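Each FInfoList entry in my case has the form a:b:c, with the length in the second field. Splitting one entry looks roughly like the sketch below, which assumes the Classes and SysUtils units; ParseInfoEntry and the sample entry are illustrative names and values, not taken from the code further down.

procedure ParseInfoEntry(const AEntry: string; out ALen, AInterval: Integer);
var
  list: TStringList;
begin
  list := TStringList.Create;
  try
    list.Delimiter := ':';
    list.StrictDelimiter := True;   // keep blanks from splitting entries (Delphi 2006 and later)
    list.DelimitedText := AEntry;   // e.g. '7:1024:30', i.e. the a:b:c form
    ALen := StrToInt(list.Strings[1]);       // number of bytes to read
    AInterval := StrToInt(list.Strings[2]);
  finally
    list.Free;
  end;
end;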
In this example, the allocation granularity is first obtained through GetSystemInfo, and 10 times that granularity is used as one mapped data block. Inside the for loop, we check whether the bytes already read (totallen) plus the length about to be read still fall within the current mapping range (10 times the granularity). If so, we keep reading; if not, the leftover data at the end of the current view is saved, the next block of the file is mapped, and the saved leftover is merged with the data read from the new view. It is a bit convoluted (maybe my approach is the convoluted part).
The code is listed below.
procedure TGetDataThread.DoGetData;
var
  FFile_Handle: THandle;
  FFile_Map: THandle;
  list: TStringList;
  p, pBase: PAnsiChar;   // PAnsiChar so that Inc(p, n) advances by bytes even in Unicode versions of Delphi
  i, len, interval: Integer;
begin
  // totallen, offset, readlen, filesize, blocksize, sysinfo, tstream and stream
  // are assumed to be fields of TGetDataThread (not shown here)
  FFile_Handle := INVALID_HANDLE_VALUE;
  FFile_Map := 0;
  pBase := nil;
  try
    totallen := 0;
    offset := 0;
    tstream := TMemoryStream.Create;
    stream := TMemoryStream.Create;
    list := TStringList.Create;
    // Get system information
    GetSystemInfo(sysinfo);
    // Allocation granularity
    blocksize := sysinfo.dwAllocationGranularity;
    // Open the file
    FFile_Handle := CreateFile(PChar(FSourceFileName), GENERIC_READ, FILE_SHARE_READ,
      nil, OPEN_EXISTING, FILE_ATTRIBUTE_NORMAL, 0);
    if FFile_Handle = INVALID_HANDLE_VALUE then Exit;
    // Get the file size (for files over 4 GB use the lpFileSizeHigh parameter or GetFileSizeEx)
    filesize := GetFileSize(FFile_Handle, nil);
    // Create the mapping object
    FFile_Map := CreateFileMapping(FFile_Handle, nil, PAGE_READONLY, 0, 0, nil);
    if FFile_Map = 0 then Exit;
    // Use 10 times blocksize as one data block; if the file is smaller than that,
    // map the entire file length directly
    if filesize div blocksize > 10 then
      readlen := 10 * blocksize
    else
      readlen := filesize;
    for i := 0 to FInfoList.Count - 1 do
    begin
      list.Delimiter := ':';
      list.DelimitedText := FInfoList.Strings[i];
      // Get the length; the entries I stored have the form a:b:c, so the fields are separated by ':'
      len := StrToInt(list.Strings[1]);
      interval := StrToInt(list.Strings[2]);
      if (i = 0) or (totallen + len >= readlen) then
      begin
        // If the bytes already read plus the length to be read exceed the current block,
        // keep the tail of the previous mapping so it can be merged with the new mapping
        if i > 0 then
        begin
          offset := offset + readlen;
          // Write the tail to the temporary stream
          tstream.Write(p^, readlen - totallen);
          tstream.Position := 0;
        end;
        // If the unread data is less than one allocation granularity, map only the remaining length
        if filesize - offset < blocksize then
          readlen := filesize - offset;
        // Unmap the previous view before mapping the next one
        if pBase <> nil then
          UnmapViewOfFile(pBase);
        // Map the view; p is a pointer into the mapped region.
        // Note that the third parameter (the high part of the offset) is always 0 here;
        // set it according to your actual situation
        pBase := PAnsiChar(MapViewOfFile(FFile_Map, FILE_MAP_READ, 0, offset, readlen));
        p := pBase;
      end;
      // If there is data in the temporary stream, it needs to be merged
      if tstream.Size > 0 then
      begin
        // Copy the temporary stream data over
        stream.CopyFrom(tstream, tstream.Size);
        // Then append the new data at the end to complete the merge
        stream.Write(p^, len - tstream.Size);
        totallen := len - tstream.Size;
        // Move the pointer to the beginning of the next piece of data
        Inc(p, len - tstream.Size);
        tstream.Clear;
      end
      else
      begin
        stream.Write(p^, len);
        totallen := totallen + len;
        Inc(p, len);
      end;
      stream.Position := 0;
      // Save the stream to a file
      stream.SaveToFile(IntToStr(i) + '.txt');
      stream.Clear;
    end;
  finally
    if pBase <> nil then
      UnmapViewOfFile(pBase);
    list.Free;
    stream.Free;
    tstream.Free;
    if FFile_Map <> 0 then
      CloseHandle(FFile_Map);
    if FFile_Handle <> INVALID_HANDLE_VALUE then
      CloseHandle(FFile_Handle);
  end;
end;
If you have any questions, please leave a message or discuss them in this site's community. Thank you for reading; I hope this helps, and thank you for supporting this site!