NALABS is a tool designed to detect "bad smells" in natural language requirements and test specifications. It identifies vague, ambiguous, or poorly structured language, helping improve clarity, maintainability, and precision in software documentation.
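Detection of this kind typically rests on lexical indicators (weak modal verbs, vague qualifiers) combined with readability metrics. Below is a minimal Python sketch of the keyword-based part; the word lists and function name are illustrative assumptions, not NALABS's actual code.

```python
import re

# Illustrative indicator lists; NALABS's actual smell lexicons live in the repo.
VAGUE_WORDS = {"appropriate", "adequate", "efficient", "flexible", "user-friendly"}
WEAK_VERBS = {"may", "might", "could", "should"}

def find_smells(requirement: str) -> list[str]:
    """Return a list of detected smell descriptions for one requirement."""
    smells = []
    for word in re.findall(r"[a-z-]+", requirement.lower()):
        if word in VAGUE_WORDS:
            smells.append(f"vague term: '{word}'")
        if word in WEAK_VERBS:
            smells.append(f"weak modal verb: '{word}'")
    return smells

print(find_smells("The system should respond in an appropriate time."))
# ["weak modal verb: 'should'", "vague term: 'appropriate'"]
```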
This thesis proposes extending NALABS with LLMs to address a fundamental limitation in its requirements quality assessment: NALABS currently detects ambiguous or poorly structured language but cannot suggest targeted revisions. The research will introduce an LLM-driven improvement module that proposes specific, context-aware edits, enabling NALABS to support both detection and refinement. The thesis will focus on measuring the deltas in readability, clarity, and maintainability achieved through the LLM-based suggestions. By quantifying these deltas, the thesis aims to demonstrate the impact of the extended tool on the quality of software documentation.
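One way the improvement module and delta measurement could be wired together is sketched below, assuming an OpenAI-style chat API and the textstat package as an automated readability proxy; the model name, prompt, and helper functions are hypothetical, not a committed design.

```python
import textstat            # third-party readability metrics
from openai import OpenAI  # assumed: OpenAI-style chat completion API

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def suggest_revision(requirement: str) -> str:
    """Ask the LLM for a single, targeted rewrite of one requirement."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative model choice
        messages=[
            {"role": "system",
             "content": "Rewrite the requirement to remove vague or ambiguous "
                        "language. Preserve its meaning. Return only the rewrite."},
            {"role": "user", "content": requirement},
        ],
    )
    return response.choices[0].message.content.strip()

def readability_delta(original: str, revised: str) -> float:
    """Positive delta = the revision scores as easier to read (Flesch Reading Ease)."""
    return (textstat.flesch_reading_ease(revised)
            - textstat.flesch_reading_ease(original))

original = "The system should respond in an appropriate time."
revised = suggest_revision(original)
print(revised, readability_delta(original, revised))
```

Readability has established automated proxies like the one above; clarity and maintainability do not, so the measured deltas would likely need to be complemented by expert judgment.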
https://github.com/eduardenoiu/NALABS