- Submission: June 30, 2020
- Notification: July 30
- Camera ready: Aug 30
- Meeting: Nov 8
Track chairs
- Michael Hilton, CMU
- Gema Rodriguez-Perez, U. Waterloo
How to Participate in the Challenge
Familiarize yourself with the proposed papers to be replicated. Read the suggested papers, analyze their tools and datasets, and evaluate which one best fits your interests. Remember that you can always select any prior paper of your choosing to participate in the replication challenge.
Then, replicate the paper (fully or partially) and report your findings in a four-page paper (see the information below).
Submit your final paper before June 30, 2020. Replication papers are four pages maximum, including references. For details on the submission process, see the PROMISE submission requirements.
If your paper is accepted, present your results at PROMISE in Sacramento, California, United States.
PROMISE 2020 Replication Challenge Track: Call for papers
The International Conference on Predictive Models and Data Analytics in Software Engineering is hosting a replication challenge track. With this challenge, we call upon everyone interested to reproduce prior results in software analytics and help verify, evaluate, or improve the research outcomes of fellow researchers. While negative replications may raise concerns about the reliability of previous results, positive replications can provide even greater trust in the results of the original paper.
Although replication challenge participants may choose any prior result in SE to replicate, we highly encourage them to replicate one of the papers listed below. These papers were proposed by their authors and selected by the organizing committee based on the availability of the data needed to replicate each paper and the support its authors can provide to replication challenge participants.
Papers Proposed for Replication
- Mohanani, R., Turhan, B., & Ralph, P. Requirements framing affects design creativity. IEEE Transactions on Software Engineering, 12 pages. DOI: 10.1109/TSE.2019.2909033.
- Reproduction package: https://goo.gl/r5mPLv
- Ralph, P., & Tempero, E. (2016, June). Characteristics of decision-making during coding.
In Proceedings of the 20th International Conference on Evaluation and Assessment in Software Engineering (p. 34). ACM. DOI: 10.1145/2915970.2915990
- Author support: They can provide their interview guide, some guidance on open coding, and answer the occasional question about Leximancer.
- Jesus M. Gonzalez-Barahona, Gregorio Robles, Israel Herraiz, and Felipe Ortega. Studying the laws of software evolution in a long-lived FLOSS project. Journal of Software: Evolution and Process, 26(7):589–612, 2014. doi:10.1002/smr.1615.
- Reproduction package: http://gsyc.urjc.es/jgb/repro/2012-jsme-long-evolution
- Accessible tools: CVSAnalY (https://metricsgrimoire.github.io/)
- McKee, S., Nelson, N., Sarma, A., & Dig, D. (2017, September). Software practitioner perspectives on merge conflicts and resolutions. In 2017 IEEE International Conference on Software Maintenance and Evolution (ICSME). DOI: 10.1109/ICSME.2017.53
- Publicly-accessible artifacts: https://nomatic.dev/icsme17.html
- Hilton, M., Tunnell, T., Huang, K., Marinov, D., & Dig, D. (2016, September). Usage, costs, and benefits of continuous integration in open-source projects. In 2016 IEEE International Conference on Automated Software Engineering (ASE). DOI: 10.1145/2970276.2970358
- Data information: http://cope.eecs.oregonstate.edu/CISurvey/