Yesterday I posted an article on how to maximize .NET performance, and it drew many comments. Thank you all; several readers patiently pointed out errors in the article, and I have benefited a great deal.
Yesterday's article focused on squeezing speed out of small coding details. You may not feel the performance gains directly, but as a programmer, continually improving the quality of your own code is a goal worth pursuing.
In fact, as hardware has advanced, its raw speed now far exceeds the needs of most users, and some people even argue that algorithms matter less and less in modern software development. I remember watching an MIT data structures video in which the lecturing professor asked (I am paraphrasing from memory): if algorithms are no longer important, why are we still researching them? The answer he gave was "SPEED". We pursue speed the way racing drivers pursue speed!
In many systems built today, speed is not the first priority; stability, security, reusability, and so on often come first. Design patterns and development architectures are not mainly there to solve performance problems, and those concerns belong to the analysts and architects. Ordinary programmers like us can only optimize the program in the small places: a class, a method, a line of code. I think paying attention to those details is still worthwhile.
Okay, enough rambling; on to today's topic. In many networked systems, the main performance cost lies in reading and transmitting data. Faster reads and lower network bandwidth usage are the goals we pursue, so here is how to improve .NET performance from that angle.
1. Page data in the data layer. This can be implemented with ExecuteReader or with stored procedures; there are many approaches, so I won't go into detail. (You can read what I wrote earlier.)
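As a sketch of one approach (assuming SQL Server 2005 or later; the Orders table and its columns are illustrative assumptions, not from the original article), the paging can be done in the query itself so that only one page of rows ever leaves the database:

```csharp
using System.Data;
using System.Data.SqlClient;

public static class OrderData
{
    // Sketch: return one page of rows using ROW_NUMBER().
    public static DataTable GetPage(string connString, int pageIndex, int pageSize)
    {
        const string sql =
            @"SELECT OrderID, OrderDate
              FROM (SELECT OrderID, OrderDate,
                           ROW_NUMBER() OVER (ORDER BY OrderID) AS RowNum
                    FROM Orders) AS Paged
              WHERE RowNum BETWEEN @First AND @Last";

        using (SqlConnection conn = new SqlConnection(connString))
        using (SqlCommand cmd = new SqlCommand(sql, conn))
        {
            cmd.Parameters.AddWithValue("@First", pageIndex * pageSize + 1);
            cmd.Parameters.AddWithValue("@Last", (pageIndex + 1) * pageSize);
            conn.Open();

            DataTable page = new DataTable("Orders");
            using (SqlDataReader reader = cmd.ExecuteReader())
            {
                page.Load(reader); // only this page of rows crosses the wire
            }
            return page;
        }
    }
}
```

The same WHERE-on-ROW_NUMBER pattern works equally well inside a stored procedure.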
2. Prefer ExecuteReader for reading data. ExecuteReader is the most efficient option; in Microsoft's PetShop 4.0, all data access is implemented with ExecuteReader, unless you have special disconnected requirements (such as a SmartClient).
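A minimal PetShop-style read might look like the sketch below (the Products table and column are assumptions for illustration). Passing CommandBehavior.CloseConnection ties the connection's lifetime to the reader, so disposing the reader closes the connection:

```csharp
using System.Collections.Generic;
using System.Data;
using System.Data.SqlClient;

public static class ProductData
{
    // Sketch: forward-only, read-only access with ExecuteReader.
    public static IList<string> GetProductNames(string connString)
    {
        List<string> names = new List<string>();
        SqlConnection conn = new SqlConnection(connString);
        SqlCommand cmd = new SqlCommand("SELECT Name FROM Products", conn);
        conn.Open();

        // CloseConnection releases the connection when the reader is disposed.
        using (SqlDataReader reader =
               cmd.ExecuteReader(CommandBehavior.CloseConnection))
        {
            while (reader.Read())
                names.Add(reader.GetString(0));
        }
        return names;
    }
}
```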
3. In disconnected scenarios, a DataTable performs better than a DataSet, unless you need to hold multiple related tables.
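For example (again assuming a hypothetical Products table), a single result set can be filled straight into a DataTable with no wrapping DataSet:

```csharp
using System.Data;
using System.Data.SqlClient;

public static class ProductTable
{
    // Sketch: a DataSet is only needed when several related
    // tables (and their relations) must travel together.
    public static DataTable GetProducts(string connString)
    {
        DataTable products = new DataTable("Products");
        using (SqlDataAdapter adapter = new SqlDataAdapter(
                   "SELECT ProductID, Name FROM Products", connString))
        {
            adapter.Fill(products); // the adapter opens and closes the connection
        }
        return products;
    }
}
```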
4. Use the ImportRow method of DataTable.
In some cases you need to copy a large number of rows from one DataTable to another, and DataTable.ImportRow can greatly improve performance there. With small amounts of data there is little difference, but beyond about 10,000 rows the improvement is significant and can reach several-fold.
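A self-contained sketch of the technique (table and column names are made up for the demo):

```csharp
using System;
using System.Data;

class ImportRowDemo
{
    static void Main()
    {
        // Build a source table with a simple schema.
        DataTable source = new DataTable("Source");
        source.Columns.Add("ID", typeof(int));
        source.Columns.Add("Name", typeof(string));
        for (int i = 0; i < 10000; i++)
            source.Rows.Add(i, "Item" + i);

        // Clone() copies the schema only, not the rows.
        DataTable target = source.Clone();

        // ImportRow copies each row, preserving its values and row
        // state, without going through NewRow()/Rows.Add() again.
        foreach (DataRow row in source.Rows)
            target.ImportRow(row);

        Console.WriteLine(target.Rows.Count); // prints 10000
    }
}
```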
5. Serialize data into binary files for easy transmission.
When we work with DataSet and DataTable objects, we can serialize them to XML files. But if they are to be transmitted over the network, XML files cost memory and network bandwidth. Instead, we can serialize them to a binary file, which is much smaller. The code is as follows:
FileStream fs = new FileStream(@"XMLData.bin", FileMode.Create);
BinaryFormatter bf = new BinaryFormatter();
bf.Serialize(fs, XMLData);
fs.Close();
The binary file generated this way is sometimes called XMLBinary; if you open it in WinHex you can still see XML tags inside. If the amount of data is large, add one line of code before serializing:
XMLData.RemotingFormat = SerializationFormat.Binary;
The file generated this time is a true binary file. With a large amount of data (more than 10,000 rows), it is a fraction of the size of the XMLBinary file. The schema is saved automatically during serialization, which keeps deserialization simple. I have not yet measured how much slower deserializing this is compared to reading the XML directly.
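For completeness, here is a sketch of reading such a file back. Because the schema was saved with the data, a plain Deserialize restores the DataSet directly:

```csharp
using System.Data;
using System.IO;
using System.Runtime.Serialization.Formatters.Binary;

public static class DataSetIo
{
    // Sketch: the reverse of the serialization code above.
    public static DataSet LoadBinary(string path)
    {
        using (FileStream fs = new FileStream(path, FileMode.Open))
        {
            BinaryFormatter bf = new BinaryFormatter();
            return (DataSet)bf.Deserialize(fs);
        }
    }
}
```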
6. Make reasonable use of connection pools.
Connection pooling plays a large role in performance and is turned on by default. The default Min Pool Size is 0; it is usually worth setting it to a small value such as 5. The default Max Pool Size is 100, which is enough for most web sites; for large ones, increase it appropriately.
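The pool settings live in the connection string itself; the server and database names below are placeholders and the values are illustrative, not universal recommendations:

```csharp
// Sketch: an explicit pool configuration in the connection string.
string connString =
    "Data Source=.;Initial Catalog=MyDb;Integrated Security=SSPI;" +
    "Pooling=true;Min Pool Size=5;Max Pool Size=100";
```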
7. Develop using SQLCLR
If your work centers on SQL Server, you should study SQLCLR. It is very powerful and can improve performance in many situations, especially in large enterprise-level applications.
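As a small taste (all names here are illustrative), a SQLCLR scalar function is just a static method marked with the SqlFunction attribute; after compiling it to an assembly, you register it in SQL Server with CREATE ASSEMBLY and CREATE FUNCTION:

```csharp
using System.Data.SqlTypes;
using Microsoft.SqlServer.Server;

public static class StringFunctions
{
    // Sketch: a deterministic scalar function callable from T-SQL.
    [SqlFunction(IsDeterministic = true)]
    public static SqlString ReverseString(SqlString input)
    {
        if (input.IsNull)
            return SqlString.Null;

        char[] chars = input.Value.ToCharArray();
        System.Array.Reverse(chars);
        return new SqlString(new string(chars));
    }
}
```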
8. Access APP.Config/Web.Config through static classes
We keep a lot of configuration information in APP.Config/Web.Config, and it is accessed very frequently. In that case, create a static class and expose everything through its static properties, which can improve performance to a certain extent: the static class reads the configuration only once, whereas reading APP.Config/Web.Config directly each time generates a lot of I/O.
public static class MyWebConfig
{
    // The backing field was missing in the original listing.
    private static readonly string ConnString;

    static MyWebConfig()
    {
        ConnString =
            ConfigurationManager.ConnectionStrings["Connection"].
                ConnectionString;
    }

    public static string DbConnectionString
    {
        get
        {
            return ConnString;
        }
    }
}
Okay, that's it for today. Please point out any mistakes or shortcomings; better suggestions are welcome, so we can improve together.