CIMPS 2025: LLM Optimization Techniques for Compliance Assessment
PhD student Victor Terrón presented his research on LLM optimization techniques for compliance assessment at CIMPS 2025 in Lima, Peru.
Research Overview
The work provides a comprehensive analysis of nine optimization techniques for Large Language Models (LLMs) in the context of software compliance assessment:
Techniques Analyzed
- RAG (Retrieval-Augmented Generation) - Enhancing LLM responses with retrieved context
- LoRA (Low-Rank Adaptation) - Efficient fine-tuning of large models
- Prompt Engineering - Optimizing input prompts for better outputs
- Fine-tuning strategies - Domain-specific model adaptation
- Chain-of-thought prompting - Structured reasoning approaches
- And more…
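To give a flavor of the first technique, here is a minimal sketch of a RAG-style pipeline. Everything in it is an illustrative assumption rather than the system presented at CIMPS: the corpus passages and query are invented, retrieval is plain keyword overlap instead of embedding search, and the final LLM call is stubbed out as prompt assembly.

```python
# Toy RAG sketch for compliance assessment (illustrative only).
# Retrieval is keyword overlap, not a real vector search, and the
# "LLM step" is just assembling the augmented prompt.

def retrieve(query, corpus, k=2):
    """Rank corpus passages by word overlap with the query (toy retriever)."""
    q_words = set(query.lower().split())
    scored = sorted(
        corpus,
        key=lambda p: len(q_words & set(p.lower().split())),
        reverse=True,
    )
    return scored[:k]

def build_prompt(query, passages):
    """Assemble the context-augmented prompt an LLM would receive."""
    context = "\n".join(f"- {p}" for p in passages)
    return f"Context:\n{context}\n\nQuestion: {query}\nAnswer citing the context:"

# Hypothetical mini-corpus of compliance-related passages.
corpus = [
    "ISO/IEC 29110 defines lifecycle profiles for very small entities.",
    "Code reviews must be logged before each release.",
    "The cafeteria menu changes weekly.",
]

query = "Which standard applies to very small entities?"
prompt = build_prompt(query, retrieve(query, corpus))
print(prompt)
```

A real deployment would replace the overlap scorer with embedding similarity over a compliance-document index and send the assembled prompt to an LLM, but the shape of the pipeline (retrieve, augment, generate) is the same.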
Key Findings
The research demonstrates how these techniques can be applied to improve the accuracy and reliability of LLMs when assessing software compliance with industry standards and regulations.
Congratulations to Victor on this excellent presentation!
Originally shared on LinkedIn