Will they like this? Evaluating Code Contributions With Language Models

Publication Type: Conference Proceedings
Year of Publication: 2015
Authors: Hellendoorn, VJ, Devanbu, PT, Bacchelli, A
Secondary Title: 12th Working Conference on Mining Software Repositories (MSR 2015)
Date Published: 05/2015

Popular open-source software projects receive and review contributions from a diverse array of developers, many of whom have little to no prior involvement with the project. A recent survey reported that reviewers consider conformance to the project's code style to be one of the top priorities when evaluating code contributions on GitHub. We propose to quantitatively evaluate the existence and effects of this phenomenon. To this aim, we use language models, which were shown to accurately capture stylistic aspects of code. We find that rejected changesets do contain code significantly less similar to the project than accepted ones; furthermore, the less similar changesets are more likely to be subject to thorough review. Armed with these results, we further investigate whether new contributors learn to conform to the project style and find that experience is positively correlated with conformance to the project's code style.
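The abstract's core measurement idea can be illustrated with a toy sketch: train a token n-gram language model on a project's code, then score a candidate changeset by its cross-entropy under that model, where lower cross-entropy means the change looks more "natural" for the project. This is a minimal, hypothetical illustration of the general technique (additive smoothing, a single training snippet), not the paper's actual models or data; all names and snippets below are invented for the example.

```python
import math
from collections import Counter

def ngrams(tokens, n):
    """All contiguous n-token tuples from a token list."""
    return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

class NgramModel:
    """Tiny token n-gram model with additive smoothing (illustrative only)."""

    def __init__(self, n=3, alpha=0.1):
        self.n = n
        self.alpha = alpha          # smoothing constant for unseen n-grams
        self.ngram_counts = Counter()
        self.context_counts = Counter()
        self.vocab = set()

    def train(self, tokens):
        self.vocab.update(tokens)
        for g in ngrams(tokens, self.n):
            self.ngram_counts[g] += 1
            self.context_counts[g[:-1]] += 1

    def cross_entropy(self, tokens):
        """Average negative log2-probability per n-gram; lower values mean
        the token sequence is more similar to the training corpus."""
        grams = ngrams(tokens, self.n)
        if not grams:
            return float("inf")
        vocab_size = max(len(self.vocab), 1)
        total = 0.0
        for g in grams:
            num = self.ngram_counts[g] + self.alpha
            den = self.context_counts[g[:-1]] + self.alpha * vocab_size
            total += -math.log2(num / den)
        return total / len(grams)

# Hypothetical "project corpus" and two candidate changesets.
project = "for ( int i = 0 ; i < n ; i ++ ) { sum += a [ i ] ; }".split()
model = NgramModel(n=3)
model.train(project)

in_style = "for ( int j = 0 ; j < n ; j ++ ) { sum += b [ j ] ; }".split()
off_style = "while True : total = total + x".split()

# The stylistically similar change scores a lower cross-entropy.
assert model.cross_entropy(in_style) < model.cross_entropy(off_style)
```

In this framing, a reviewer-facing signal falls out directly: changesets with unusually high cross-entropy relative to the project corpus are candidates for closer review, matching the abstract's finding that less-similar changesets attract more thorough scrutiny.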

Full Text: msr2015.pdf (395.86 KB)