Large language models (LLMs) have become essential in software development,
especially for issue resolution. However, despite their widespread use,
significant challenges persist in the quality of LLM responses to issue
resolution queries. LLM interactions often yield incorrect, incomplete, or
ambiguous information, largely due to knowledge gaps in prompt design, which
can lead to unproductive exchanges and reduced developer productivity. In this
paper, we analyze 433 developer-ChatGPT conversations within GitHub issue
threads to examine the impact of prompt knowledge gaps and conversation styles
on issue resolution. We identify four main knowledge gaps in developer prompts:
Missing Context, Missing Specifications, Multiple Context, and Unclear
Instructions. Assuming that conversations within closed issues contributed to
successful resolutions while those in open issues did not, we find that
ineffective conversations contain knowledge gaps in 54.7% of prompts, compared
to only 13.2% in effective ones. Additionally, we observe seven distinct
conversational styles, with Directive Prompting, Chain of Thought, and
Responsive Feedback being the most prevalent. We find that knowledge gaps are
present in all styles of conversations, with Missing Context being the most
repeated challenge developers face in issue-resolution conversations. Based on
our analysis, we identify key textual and code-related heuristics (Specificity,
Contextual Richness, and Clarity) that are associated with successful issue
closure and help assess prompt quality. These heuristics lay the foundation for
an automated tool that can dynamically flag unclear prompts and suggest
structured improvements. To test feasibility, we developed a lightweight
browser extension prototype for detecting prompt gaps, that can be easily
adapted to other tools within developer workflows.
Details
Title
Towards Detecting Prompt Knowledge Gaps for Improved LLM-guided Issue Resolution
Creators
Ramtin Ehsani
Sakshi Pathak
Preetha Chatterjee
Resource Type
Preprint
Language
English
Academic Unit
Computer Science (Computing); College of Computing and Informatics
Other Identifier
991022020437904721