Thread class in Delphi
Raptor[Mental Studio]
http://mental.mentsu.com
Part Four
A critical section (CriticalSection) is a technique for protecting access to shared data. Conceptually it resembles a global Boolean variable with two states, which can be read as True and False: inside the critical section or outside it. It supports only two operations, Enter and Leave, and both are primitives (atomic), which is why it can protect shared data from access conflicts in multithreaded programs.
Using a critical section to protect shared data is simple: call Enter to mark the critical section as occupied before each access to the shared data, operate on the data, and then call Leave to leave the critical section. The protection works like this: once a thread has entered the critical section, any other thread that wants to access the same data will find, when it calls Enter, that a thread is already inside, and it will be suspended until the thread inside calls Leave. When that thread finishes its work and calls Leave, the waiting thread is awakened, marks the critical section as occupied, and starts operating on the data. Access conflicts are thereby prevented.
Taking the earlier InterlockedIncrement as an example, here is how it could be implemented with a critical section (using the Windows API):
Var
  InterlockedCrit : TRTLCriticalSection;

Procedure InterlockedIncrement( var aValue : Integer );
Begin
  EnterCriticalSection(InterlockedCrit);
  Inc(aValue);
  LeaveCriticalSection(InterlockedCrit);
End;
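Note that a TRTLCriticalSection is an operating system resource: it must be initialized before first use and deleted afterwards. A minimal sketch, assuming InterlockedCrit is declared in a unit that has Windows in its uses clause, with the calls placed in the unit's initialization and finalization sections:

initialization
  InitializeCriticalSection(InterlockedCrit);   // create the OS resource before any thread uses it
finalization
  DeleteCriticalSection(InterlockedCrit);       // release it when the unit is finalized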
Now look at the previous example:
1. Thread A enters the critical section (assuming the data is 3)
2. Thread B enters the critical section. Because A is already in the critical section, B is suspended.
3. Thread A adds one to the data (now 4)
4. Thread A leaves the critical section and wakes up thread B (the current data in the memory is 4)
5. Thread B wakes up and adds one to the data (now it is 5)
6. Thread B leaves the critical section, and the current data is correct.
This is how critical sections protect access to shared data.
One thing to watch out for when using critical sections is exception handling during the data access. If an exception is raised while the data is being operated on, the Leave call is never executed, so the thread that should be awakened never is, and the program may stop responding. In general, the correct pattern for using a critical section is therefore:
EnterCriticalSection
Try
  // operate on the data in the critical section
Finally
  LeaveCriticalSection
End;
The last thing to note is that Events and CriticalSections are both operating system resources: they must be created before use and released afterwards. For example, the global Event SyncEvent and the global CriticalSection ThreadLock used by the TThread class are created and released in InitThreadSynchronization and DoneThreadSynchronization, which are called from the Initialization and Finalization sections of the Classes unit.
Because TThread operates on the Event and the CriticalSection through the Windows API, the API was used in the examples above. In fact, Delphi already provides wrappers for them: the TEvent and TCriticalSection classes in the SyncObjs unit. They are used in almost the same way as the API shown earlier. Because TEvent's constructor takes quite a few parameters, Delphi also provides, for simplicity, an Event class initialized with default parameters: TSimpleEvent.
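As an illustration, here is a minimal sketch of the earlier counter protection rewritten with the SyncObjs wrapper class instead of the raw API (FLock and ProtectedIncrement are illustrative names, not from the original example):

uses SyncObjs;

var
  FLock: TCriticalSection;

procedure ProtectedIncrement( var aValue : Integer );
begin
  FLock.Acquire;    // plays the role of EnterCriticalSection
  try
    Inc(aValue);
  finally
    FLock.Release;  // plays the role of LeaveCriticalSection
  end;
end;

initialization
  FLock := TCriticalSection.Create;
finalization
  FLock.Free;

The wrapper creates and deletes the underlying critical section in its constructor and destructor, so the explicit Initialize/Delete calls are no longer needed.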
By the way, let me mention another class used for thread synchronization: TMultiReadExclusiveWriteSynchronizer, defined in the SysUtils unit. As far as I know, it is the longest class name defined in the Delphi RTL; fortunately it has a short alias, TMREWSync. As for its use, the name says it all, so I won't elaborate.
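For completeness, a minimal sketch of how it is typically used (SharedValue, RWLock, ReadValue and WriteValue are illustrative names): any number of threads may read at the same time, but a writer gets exclusive access.

uses SysUtils;

var
  SharedValue: Integer;
  RWLock: TMultiReadExclusiveWriteSynchronizer;  // short alias: TMREWSync

function ReadValue: Integer;
begin
  RWLock.BeginRead;        // several readers may hold the lock concurrently
  try
    Result := SharedValue;
  finally
    RWLock.EndRead;
  end;
end;

procedure WriteValue( aValue : Integer );
begin
  RWLock.BeginWrite;       // exclusive: waits until all readers and writers have left
  try
    SharedValue := aValue;
  finally
    RWLock.EndWrite;
  end;
end;

initialization
  RWLock := TMultiReadExclusiveWriteSynchronizer.Create;
finalization
  RWLock.Free;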
With the previous preparatory knowledge about Event and CriticalSection, we can officially start discussing Synchronize and WaitFor.
We know that Synchronize achieves thread synchronization by having part of the code executed in the main thread, which works because a process has only one main thread.
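Before looking at the implementation, here is a minimal sketch of how Synchronize is typically called from a worker thread (TMyThread, UpdateCaption and Form1 are illustrative names; it assumes a VCL application with a form):

uses Classes;

type
  TMyThread = class(TThread)
  private
    procedure UpdateCaption;
  protected
    procedure Execute; override;
  end;

procedure TMyThread.UpdateCaption;
begin
  Form1.Caption := 'Work finished';  // runs in the main thread, so touching the VCL is safe
end;

procedure TMyThread.Execute;
begin
  // ... do the background work here ...
  Synchronize(UpdateCaption);        // ask the main thread to run UpdateCaption and wait for it
end;

Now let's look at the implementation of Synchronize: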
procedure TThread.Synchronize(Method: TThreadMethod);
begin
  FSynchronize.FThread := Self;
  FSynchronize.FSynchronizeException := nil;
  FSynchronize.FMethod := Method;
  Synchronize(@FSynchronize);
end;
where FSynchronize is of the following record type:
PSynchronizeRecord = ^TSynchronizeRecord;
TSynchronizeRecord = record
  FThread: TObject;
  FMethod: TThreadMethod;
  FSynchronizeException: TObject;
end;
It is used to exchange data between the worker thread and the main thread: the thread object, the method to be synchronized, and any exception that occurs.
Synchronize calls an overloaded version of itself, and this overloaded version is rather special: it is a "class method". A class method is a special kind of class member whose invocation does not require a class instance; it is called through the class name, rather like a constructor. The reason a class method is used here is that it can be called even when no thread object has been created. In practice, another overloaded version (also a class method) and the class method StaticSynchronize are used as well.
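As a quick illustration of what a class method looks like (TExample is an illustrative name, not taken from the RTL):

type
  TExample = class
    class procedure Report(const Msg: string);  // class method: no instance required
  end;

class procedure TExample.Report(const Msg: string);
begin
  Writeln(Msg);
end;

// called through the class name, without creating an instance:
TExample.Report('hello');

Here is the code of this Synchronize overload: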
class procedure TThread.Synchronize(ASyncRec: PSynchronizeRecord);
var
  SyncProc: TSyncProc;
begin
  if GetCurrentThreadID = MainThreadID then
    ASyncRec.FMethod
  else
  begin
    SyncProc.Signal := CreateEvent(nil, True, False, nil);
    try
      EnterCriticalSection(ThreadLock);
      try
        if SyncList = nil then
          SyncList := TList.Create;
        SyncProc.SyncRec := ASyncRec;
        SyncList.Add(@SyncProc);
        SignalSyncEvent;
        if Assigned(WakeMainThread) then
          WakeMainThread(SyncProc.SyncRec.FThread);
        LeaveCriticalSection(ThreadLock);
        try
          WaitForSingleObject(SyncProc.Signal, INFINITE);
        finally
          EnterCriticalSection(ThreadLock);
        end;
      finally
        LeaveCriticalSection(ThreadLock);
      end;
    finally
      CloseHandle(SyncProc.Signal);
    end;
    if Assigned(ASyncRec.FSynchronizeException) then
      raise ASyncRec.FSynchronizeException;
  end;
end;
This code is a bit long, but not complicated.
First it checks whether the current thread is the main thread; if so, it simply executes the synchronization method and returns.
If it is not the main thread, the synchronization process begins.
The data to be exchanged (the parameter record) and an Event handle are recorded in the local variable SyncProc, whose record type is as follows:
TSyncProc = record
  SyncRec: PSynchronizeRecord;
  Signal: THandle;
end;
Then an Event is created and the critical section is entered (through the global variable ThreadLock; since only one thread at a time may be inside Synchronize, a global variable is sufficient here), and the record is stored in the SyncList list (creating the list if it does not yet exist). As you can see, the ThreadLock critical section protects access to SyncList; we will see this again when CheckSynchronize is introduced later.
The next step is to call SignalSyncEvent, whose code was shown earlier when the TThread constructor was introduced; all it does is perform a Set operation on SyncEvent. The purpose of SyncEvent will be explained in detail later, when WaitFor is discussed.
Next comes the most important part: firing the WakeMainThread event to carry out the synchronization. WakeMainThread is a global event of type TNotifyEvent. An event is used here because Synchronize essentially hands the procedure to be synchronized to the main thread for execution via messages, which does not work in applications without a message loop (such as a console application or a DLL), so those cases are handled through this event.
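For example, in a console application without a message loop, the WakeMainThread hook can be set manually and the main thread can call CheckSynchronize whenever it is signalled. A minimal sketch, assuming the main thread has nothing else to do (ConsoleSyncDemo, TConsoleSync and MainWakeEvent are illustrative names):

program ConsoleSyncDemo;
{$APPTYPE CONSOLE}

uses Windows, Classes;

type
  TConsoleSync = class
    procedure WakeMainThread(Sender: TObject);
  end;

var
  MainWakeEvent: THandle;
  ConsoleSync: TConsoleSync;

procedure TConsoleSync.WakeMainThread(Sender: TObject);
begin
  SetEvent(MainWakeEvent);  // tell the main thread that a Synchronize request is queued
end;

begin
  MainWakeEvent := CreateEvent(nil, False, False, nil);
  ConsoleSync := TConsoleSync.Create;
  Classes.WakeMainThread := ConsoleSync.WakeMainThread;  // TNotifyEvent needs an object method
  // ... create and start worker threads here ...
  while True do
  begin
    WaitForSingleObject(MainWakeEvent, INFINITE);
    CheckSynchronize;  // execute any pending Synchronize requests in the main thread
  end;
end.

In a VCL application, however, this is taken care of by the Application object, as shown next.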
The Application object responds to this event. The following two methods set and clear the handler for the WakeMainThread event (from the Forms unit):
procedure TApplication.HookSynchronizeWakeup;
begin
  Classes.WakeMainThread := WakeMainThread;
end;

procedure TApplication.UnhookSynchronizeWakeup;
begin
  Classes.WakeMainThread := nil;
end;
The above two methods are called in the constructor and destructor of the TApplication class respectively.
This is the code in the Application object that responds to the WakeMainThread event; this is where the message is posted, and an empty message (WM_NULL) is used for the purpose:
procedure TApplication.WakeMainThread(Sender: TObject);
begin
  PostMessage(Handle, WM_NULL, 0, 0);
end;
The response to this message is also in the Application object; see the following code (irrelevant parts removed):
procedure TApplication.WndProc(var Message: TMessage);
…
begin
  try
    …
    with Message do
      case Msg of
        …
        WM_NULL:
          CheckSynchronize;
        …
      end;
  except
    HandleException(Self);
  end;
end;
CheckSynchronize is also defined in the Classes unit. Since it is relatively complex, we will not examine it in detail for now; just know that it is the part that actually carries out the Synchronize work. Now let's continue analyzing the Synchronize code.
After the WakeMainThread event has been fired, the critical section is left, and WaitForSingleObject is called to wait on the Event created before entering the critical section. This Event's purpose is to wait until the synchronized method has finished executing; this will be explained later when CheckSynchronize is analyzed.
Note that after WaitForSingleObject returns, the critical section is entered again, only to be left immediately without doing anything. This looks pointless, but it is necessary, because Enter and Leave calls on a critical section must be strictly paired one-to-one. So could the code be changed to this instead:
        if Assigned(WakeMainThread) then
          WakeMainThread(SyncProc.SyncRec.FThread);
        WaitForSingleObject(SyncProc.Signal, INFINITE);
      finally
        LeaveCriticalSection(ThreadLock);
      end;
The biggest difference from the original code is that WaitForSingleObject is now executed while still inside the critical section. That seems to make no difference and it simplifies the code considerably, but is it really acceptable?
In fact, it is not!
Remember that once a thread has entered a critical section, any other thread trying to enter is suspended, and WaitForSingleObject suspends the current thread until some other thread calls SetEvent. If the code were changed as above, and the thread that calls SetEvent also needed to enter the same critical section, a deadlock would occur: this thread holds ThreadLock while waiting for the Event, while the signalling thread waits for ThreadLock and can never set the Event (for the theory of deadlock, consult a text on operating system principles).
Deadlock is one of the most important aspects of thread synchronization!
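To make the scenario concrete, here is an illustrative (hypothetical) sketch of such a deadlock; SomeLock and SomeEvent are made-up names, not RTL code:

// Worker thread:
EnterCriticalSection(SomeLock);
WaitForSingleObject(SomeEvent, INFINITE);  // waits here, still holding SomeLock
LeaveCriticalSection(SomeLock);

// Signalling thread:
EnterCriticalSection(SomeLock);            // blocks forever: the worker still holds SomeLock
SetEvent(SomeEvent);                       // never reached, so the worker never wakes up
LeaveCriticalSection(SomeLock);

Each thread waits for something only the other can release, so neither can proceed; that is exactly why the RTL code leaves ThreadLock before waiting on the Event.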
Finally, the Event created at the beginning is closed. If the synchronized method raised an exception, the exception is re-raised here.
(to be continued)