Meta has released the Meta LLM Compiler, and it has caused a stir in the programming field. The fine-tuned variant, LLM Compiler FTD, excels at code size optimization, beating the -Oz option and far outperforming GPT-4 Turbo. Built on the Meta Code Llama model series and released in 7B and 13B versions, it also shows a significant advantage in disassembly.
Meta recently launched the Meta LLM Compiler (Meta Large Language Model Compiler), which has shaken the programming industry. It is reported that LLM Compiler FTD (the fine-tuned version) achieves a 5.24% improvement in code size over the -Oz optimization option, while GPT-4 Turbo manages only 0.03%.
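To make the reported figures concrete, here is a minimal sketch of how a relative code-size improvement over the -Oz baseline could be computed. The function name and the example sizes are hypothetical, chosen only to illustrate the arithmetic behind a number like 5.24%.

```python
def improvement_over_oz(oz_size: int, model_size: int) -> float:
    """Relative code-size saving versus the -Oz baseline, in percent.

    oz_size:    object-code size produced by compiling with -Oz (bytes)
    model_size: object-code size produced using the model's suggested
                optimization passes (bytes)
    A positive result means the model beats -Oz.
    """
    return (oz_size - model_size) / oz_size * 100


# Hypothetical example: a 10,000-byte -Oz binary shrunk to 9,476 bytes
# corresponds to a 5.24% improvement over -Oz.
print(improvement_over_oz(10_000, 9_476))
```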
The Meta LLM Compiler is a model series built on Meta Code Llama, released in 7B and 13B versions. On disassembly, LLM Compiler FTD achieves a round-trip BLEU score of 0.96, far exceeding GPT-4 Turbo's 0.43.
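For readers unfamiliar with the round-trip BLEU metric, the idea is to disassemble a binary, have the model lift it back, reassemble the result, and compare the reassembled code against the original as token sequences. The sketch below is a simplified, self-contained BLEU implementation (uniform n-gram weights plus a brevity penalty), not Meta's evaluation harness; the tokenization and scoring details in the actual benchmark may differ.

```python
import math
from collections import Counter


def ngrams(tokens, n):
    """Count all n-grams of length n in a token sequence."""
    return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))


def bleu(reference, candidate, max_n=4):
    """Sentence-level BLEU: geometric mean of 1..max_n n-gram precisions,
    scaled by a brevity penalty when the candidate is shorter than the
    reference. Here `reference` would be the original code's tokens and
    `candidate` the tokens recovered after the round trip."""
    if not candidate:
        return 0.0
    precisions = []
    for n in range(1, max_n + 1):
        ref, cand = ngrams(reference, n), ngrams(candidate, n)
        overlap = sum((ref & cand).values())        # clipped n-gram matches
        precisions.append(overlap / max(sum(cand.values()), 1))
    if min(precisions) == 0:
        return 0.0
    geo_mean = math.exp(sum(math.log(p) for p in precisions) / max_n)
    # Brevity penalty: punish candidates shorter than the reference.
    bp = 1.0 if len(candidate) >= len(reference) else \
        math.exp(1 - len(reference) / len(candidate))
    return bp * geo_mean
```

A perfect round trip (identical token sequences) scores 1.0, so 0.96 indicates the reassembled code is nearly token-for-token identical to the original.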
The emergence of the Meta LLM Compiler marks significant progress for large language models in code optimization and disassembly. It gives developers a more powerful tool and suggests that LLM technology will further improve code efficiency and developer productivity, which will undoubtedly have a profound impact on future software development.