Dr. Sarah S. Heckman


Research

Computing Education Research

Developing Integrated Teaching Platforms to Enhance Blended Learning in STEM

Funding: National Science Foundation [$597,529]

Principal Investigators: Collin Lynch, Tiffany Barnes, Sarah Heckman

Summary: In most college courses, students use multiple online tools to support collaboration and learning. However, little is known about how students navigate and integrate their use of online tools, or about the collective impact of using a specific set of online tools. This project aims to address this knowledge gap by developing an open platform to collect, integrate, and analyze data from students’ use of multiple online tools. This platform, called Concert, will actively track student progress and allow instructors to identify students’ help-seeking and collaboration behaviors. It will also enable research to develop a model of how students use the online resources that are available to them. It is expected that the results of this project will increase understanding of students’ help-seeking behaviors, study behaviors, and social relationships within classes, and of how these behaviors and relationships affect student performance.

Using open application programming interfaces, the Concert platform will gather data from commonly used systems, such as the Piazza forum, Jenkins Automated Grader, the GitHub submission system, MyDigitalHand, and Moodle. It will integrate data from these online tools and provide a single student interface for notifications and help seeking, as well as a single instructor interface for data analysis and student evaluation. It will monitor students’ use of the online tools and their study habits, and respond with automated guidance. Although the project will initially focus on computer science courses, it is designed to support students in any STEM field. The Concert platform will collect large sets of detailed, anonymous data about students’ online actions and class performance, providing a rich dataset to support further educational research. If successful, this project has the potential to empower STEM students and broaden participation by reducing the complexity of selecting and using online tools, thus supporting increased student engagement and learning.
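The integration step above can be sketched as merging per-tool event streams into a single chronological timeline per student. This is a minimal illustration only; the tool names, fields, and `ToolEvent` record are assumptions, not the actual Concert schema.

```python
from dataclasses import dataclass
from datetime import datetime

# Hypothetical unified event record; field names are illustrative,
# not the actual Concert data model.
@dataclass
class ToolEvent:
    tool: str        # e.g., "Piazza", "GitHub", "Moodle"
    student: str     # anonymized student identifier
    action: str      # e.g., "post", "commit", "quiz_attempt"
    timestamp: datetime

def merge_timelines(*sources):
    """Merge per-tool event streams into one chronological timeline."""
    merged = [event for source in sources for event in source]
    return sorted(merged, key=lambda e: e.timestamp)

# Example: interleave a forum post with an earlier repository commit.
piazza = [ToolEvent("Piazza", "s1", "post", datetime(2019, 1, 10, 9, 30))]
github = [ToolEvent("GitHub", "s1", "commit", datetime(2019, 1, 10, 9, 5))]
timeline = merge_timelines(piazza, github)
```

A unified, time-ordered stream like this is what makes cross-tool analyses (e.g., help seeking on Piazza shortly after a failed build) straightforward to express.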

Publications

Collaborative Research: Transforming Computer Science Education Research Through Use of Appropriate Empirical Research Methods: Mentoring and Tutorials

Website: http://empiricalcsed.org/

Funding: National Science Foundation [$406,557]

Principal Investigators: Jeffrey Carver (University of Alabama), Sarah Heckman (NC State), and Mark Sherriff (University of Virginia)

Summary: The computer science education (CSEd) research community consists of a large group of passionate CS educators who often contribute to other disciplines of CS research. There has been a trend in other disciplines toward more rigorous and empirical evaluation of various hypotheses. However, many of the practices that we apply to demonstrate rigor in our discipline research are ignored or actively avoided when performing research in CSEd. This suggests that CSEd is "theory scarce" because most publications are not research and do not provide the evidence or replication required for meta-analysis and theory building. An increase in empiricism in CSEd research will move the field from "scholarly teaching" to the "scholarship of teaching and learning" (SoTL), providing the foundation for meta-analysis and the generation of theories about teaching and learning in computer science. We propose the creation of training workshops and tutorials to educate the educators about appropriate research design and evaluation of educational interventions. The creation of laboratory packages, "research-in-a-box," will support sound evaluation and replication, leading to meta-analysis and theory building in the CSEd community.

Status: We have mentored three cohorts of 8-15 participants. Our participants have completed, or are currently conducting, an educational study.

Publications

Research Triangle Peer Teaching Fellows: Scalable Evidence-Based Peer Teaching for Improving CS Capacity and Diversity

Website: Peer Teaching Fellows Program

Funding: Google Computer Science Capacity Award

Principal Investigators: Jeff Forbes (Duke University), Ketan Mayer-Patel (UNC Chapel Hill), Kristy Boyer (University of Florida), Sarah Heckman (NC State)

Summary: The project seeks to increase undergraduate retention and diversity in introductory programming courses. We are creating a scalable, effective, and evidence-based peer teacher training program at NC State, Duke University, and UNC Chapel Hill.

Publications

Jenkins: Automated Grading while Reinforcing Software Engineering Best Practices

Summary: Automated grading systems provide students with feedback early and often. Automated grading systems and other software engineering tooling can also reinforce software engineering best practices during development. We have developed a custom configuration of Jenkins, a continuous integration server, to automatically grade assignments in CSC216, CSC230, and CSC326 while reinforcing the software engineering best practices of test-driven development, continuous integration, version control, code coverage, and static analysis.
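The grading step can be thought of as a policy applied to build results after Jenkins runs the tests, coverage, and static analysis. The sketch below is a minimal illustration of such a policy; the weights, the 80% coverage target, and the per-warning deduction are illustrative assumptions, not the actual CSC216/230/326 rubric.

```python
# Hypothetical auto-grading policy applied to continuous-integration
# build results. Weights and thresholds are illustrative only.
def grade_submission(tests_passed, tests_total, coverage, sa_warnings):
    """Return a 0-100 score from build results.

    tests_passed / tests_total: unit test results
    coverage: statement coverage in [0.0, 1.0]
    sa_warnings: static analysis warning count
    """
    if tests_total == 0:
        return 0.0
    test_score = 70.0 * tests_passed / tests_total    # correctness
    coverage_score = 20.0 * min(coverage / 0.8, 1.0)  # 80% coverage target
    style_score = max(0.0, 10.0 - 2.0 * sa_warnings)  # deduct per warning
    return round(test_score + coverage_score + style_score, 1)

print(grade_submission(18, 20, 0.85, 1))  # 63.0 + 20.0 + 8.0 = 91.0
```

Because the policy rewards coverage and penalizes static analysis warnings directly, students see the practices reflected in their grade on every push, not just in the final product.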

Publications

CPATH II: Incorporating Communication Outcomes into the Computer Science Curriculum

Funding: National Science Foundation

Role: Senior Personnel

Summary: In partnership with industry and faculty from across the country, this project will develop a transformative approach to developing the communication abilities (writing, speaking, teaming, and reading) of Computer Science and Software Engineering students. We will integrate communication instruction and activities throughout the curriculum in ways that enhance, rather than replace, students’ learning of technical content and that support the development of their computational thinking abilities. We will implement the approach at two institutions. By creating concepts and resources that can be adapted by all CS and SE programs, this project also has the potential to increase higher education’s ability nationwide to meet industry need for CS and SE graduates with much stronger communication abilities than is the case today. In addition, by using the concepts and resources developed in this project, CS and SE programs will be able to increase their graduates’ mastery of technical content and computational thinking.

Publications

Course Redesigns

CSC326 Course Redesign – Creating an Agile Course to Support Software Engineering Process

Funding: NC State DELTA Course Redesign Grant [$36,400 + $9,500 supplement]

Principal Investigators: Sarah Heckman, Katie Stolee, Chris Parnin

Summary: CSC 326: Software Engineering is a required course for CSC majors, typically taken in their Junior year, and is a prerequisite for CSC 492: Senior Design. Students find CSC 326 challenging due to the new technologies, required collaborations, and the use of software engineering practices to complete programming assignments on a large, legacy system. We have added a credit hour to CSC 326 starting in Fall 2017, which increases the twice-weekly lecture meetings from 50 to 75 minutes and better reflects the workload students will face in practice. We want to redesign the course to 1) better support students in learning new technologies for completing the course project and in preparation for work in industry, 2) provide better team training to support stronger collaborative experiences, and 3) align the delivery of CSC 326 with earlier courses in the curriculum that have also been redesigned with DELTA support (CSC 116, CSC 216, and CSC 316).

Publications

Course Projects

Incorporating Software Engineering Best Practices into CSC216

Funding:

Principal Investigators:

Summary: Students in CSC216, a second-semester programming course at NC State, struggle with the transition from a small integrated lecture-lab classroom to a large lecture class. While I use active learning in the classroom, the exercises are small (e.g., write a single method) and do not demonstrate how the piece might fit into a larger solution. Student evaluations have consistently requested additional time to practice programming larger solutions in class. I revised a linear data structure unit to use in-class laboratory assignments. Before class, students watch short videos about the topic, and during class they engage with the topic in small teams. While working on the in-class labs, students are encouraged to follow software engineering best practices like test-driven development. Several research questions guide this work.

Current findings show no major difference in student learning between in-class labs and standard active learning lectures. Pass rates are lower than what we might expect to see from active learning classes [Freeman2014], but are higher than traditional lecture pass rates [Freeman2014] and the pass rates of CS1 courses from the literature [Bennedsen2007, Watson2014].

The DELTA Course Redesign grant is supporting work to move CSC216 to a lab-based course. The in-class labs will be transitioned into an affiliated lab of 20-24 students run by Peer Teaching Fellows, starting in Fall 2016.

Publications

Software Engineering Research

SHF: Small: Enabling Scalable and Expressive Program Analysis Notifications

Funding: National Science Foundation [$265,853] (a continuation of earlier work)

Principal Investigators: Emerson Murphy-Hill and Sarah Heckman

Summary: Program analysis tools are necessary for high quality software. The goal of this research is to understand how expressiveness and scalability can be increased within and across these tools, which is important to advancing knowledge by transforming how software development environments converse with the developers who use them. There will be three outcomes: a framework designed to enable toolsmiths to create program analysis tools that are expressive and scalable, three re-engineered program analysis tools that use the framework, and validation that program analysis tools built using our framework provide positive results. It will have significant benefits to society by enabling developers to fully reap the benefits of program analysis tools: more correct, more reliable, and more on-time software systems.

Program analysis tools such as static analysis tools, restructuring tools, and code coverage tools communicate with the software developer through notifications, but these notifications must balance two competing priorities. First, they must be expressive enough that software developers can understand what the notification is trying to convey. Second, they must be scalable enough that as understanding a notification becomes increasingly cognitively demanding, the developer does not abandon the tool in favor of an error-prone process of manual diagnosis. The project designs a new integrated development environment (IDE) framework for notifications. Many program analysis notifications have a common structure, which can be leveraged to enable expressiveness and scalability for IDE notifications.
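The "common structure" idea above can be sketched as a single notification record shared across tools, rendered tersely by default (scalable) with detail available on demand (expressive). This is an illustrative sketch only; the field names and `render` method are assumptions, not the project's actual framework API.

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical shared notification structure across program analysis
# tools; field names are illustrative assumptions.
@dataclass
class Notification:
    tool: str               # originating analysis, e.g., "static-analysis"
    file: str
    line: int
    summary: str            # short, always-visible text (scalability)
    detail: str             # expanded explanation (expressiveness)
    suggested_fix: Optional[str] = None

    def render(self, expanded=False):
        """Progressive disclosure: terse by default, detail on demand."""
        head = f"{self.file}:{self.line} [{self.tool}] {self.summary}"
        return head + ("\n  " + self.detail if expanded else "")

n = Notification("static-analysis", "Foo.java", 42,
                 "possible null dereference",
                 "Variable 'x' may be null on the path from line 37.")
print(n.render())              # one-line summary only
print(n.render(expanded=True)) # summary plus explanation
```

Because every tool emits the same record shape, the IDE can apply one rendering and filtering policy to coverage gaps, style warnings, and refactoring suggestions alike.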

Expressive and Scalable Notifications from Program Analysis Tools

Funding: National Science Foundation

Principal Investigators: Emerson Murphy-Hill (NC State) and Sarah Heckman (NC State)

Summary: A wide variety of program analysis tools have been created to help software developers do their jobs, yet the output of these tools is often difficult to understand and varies significantly from tool to tool. As a result, software developers may waste time trying to interpret the output of these tools, instead of making their software more capable and reliable. This proposal suggests a broad investigation of several types of program analysis tools, with the end goal being an improved understanding of how program analysis tools can inform developers in the most expressive and uniform way possible. Once this goal is reached, we can create program analysis tools that enable developers to make tremendous strides toward more correct, more reliable, and more on-time software systems.

Resources

The resources below are used in CSC216: Programming Concepts - Java, a second-semester programming course for CSC majors. The resources relate directly to exercises and materials that were added to CSC216 as part of this grant. The PPTX files and the original Google forms are available upon request. Please note that some of the slides are modified from Reges and Stepp’s Building Java Programs website. See the available supplements.

Lecture: Code Coverage and Static Analysis
Exercises: Exercise 05.01, Exercise 05.02, Exercise 05.03, Exercise 05.04, Exercise 05.05

Lecture: Linked Lists
Exercises: Exercise 11.03, Exercise 11.06
Code: Lists.zip

Lecture: Inspections
Exercises: Exercise 12.01, Exercise 12.02, Exercise 12.03
Code: InspectionExercises.zip

Lecture: Iterators and Inner Classes
Exercises: Exercise 13.03
Code: Lists.zip

Lecture: Recursion
Exercises: Exercise 18.03
Code: Recursion.zip

Publications

Predicting Actionable Static Analysis Alerts

Funding:

Summary: Automated static analysis (ASA) can identify potential source code anomalies, such as null pointer dereferences, that could lead to field failures. Each alert or notification from an ASA tool requires inspection by a developer to determine if the alert is important enough to fix; such an alert is called an actionable alert. Automated identification of actionable alerts generated by ASA could reduce this inspection overhead. Actionable alert identification techniques (AAITs) supplement ASA by using ASA alerts and other information about the software under analysis, called artifact characteristics, to prioritize or classify alerts.

We proposed an AAIT that applies machine learning to the history of a software development project to classify alerts as actionable or unactionable. Using the FAULTBENCH benchmark, we completed a comparative evaluation between our AAIT and other AAITs from the literature and found that, for the subject programs, our AAIT tended to better classify actionable and unactionable alerts when focusing on the goal of reducing false positives.
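The classification idea can be sketched as scoring each alert by artifact characteristics drawn from project history. The real AAIT trains a model on that history; the toy version below instead uses hand-picked characteristics and weights, all of which are illustrative assumptions rather than the published technique.

```python
# Toy sketch of history-based actionable alert classification.
# Characteristic names, weights, and the 0.5 threshold are
# illustrative assumptions, not the evaluated AAIT.
def classify_alert(alert):
    """Return 'actionable' or 'unactionable' for an alert dict with:
      open_revisions - revisions the alert has persisted across
      file_churn     - recent changes to the containing file
      past_fix_rate  - fraction of this alert type fixed historically
    """
    score = 0.0
    score += alert["past_fix_rate"]          # this type tends to get fixed
    score += 0.1 * alert["file_churn"]       # actively edited code
    score -= 0.05 * alert["open_revisions"]  # long-lived alerts go unfixed
    return "actionable" if score > 0.5 else "unactionable"

# A stale alert that survived 40 revisions vs. a fresh one in hot code.
stale = {"open_revisions": 40, "file_churn": 0, "past_fix_rate": 0.2}
fresh = {"open_revisions": 1, "file_churn": 5, "past_fix_rate": 0.6}
print(classify_alert(stale), classify_alert(fresh))
```

The intuition encoded here mirrors the text: alerts that linger unfixed across many revisions are probably unactionable, while alert types developers have historically fixed, occurring in actively changing files, are probably worth a developer's inspection time.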

Publications