Supplemental Instruction: An Effective Component of Student Affairs Programming

Journal of College Student Development; 
November/December 1997, Vol. 38, No. 6

E.O. Kochenour, D.S. Jolley, J.G. Kaup, D.L. Patrick, K.D. Roach, L.A. Wenzler

Based in part on a paper presented at the American College Personnel Association Convention, Baltimore, March 1996

At the time this was written, Edith O. Kochenour, M.A., was the director of the Learning Success Center at the University of Utah; Kenneth Roach was a graduate student in Educational Psychology at the University of Utah; Dennis S. Jolley, M.S., taught science at Rowland Hall-St. Mark's School in Salt Lake City; John Kaup, Ph.D., was a Postdoctoral Fellow in chemistry at Furman University; David Patrick, Ph.D., was an Assistant Professor of chemistry at Western Washington University; Lisa Wenzler, Ph.D., was a Postdoctoral Fellow in chemistry at New York University.


The effectiveness of Supplemental Instruction (SI) was examined using data from more than 11,000 students enrolled in 8 varied courses at a large Rocky Mountain research university. Correlational analyses and analysis of covariance (ANCOVA) support the hypothesis that SI is an effective program that student affairs practitioners can use in their efforts to enhance student learning and development.

In 1994, the American College Personnel Association (ACPA) published The Student Learning Imperative: Implications for Student Affairs, which addressed the need for student affairs practitioners to tie their work closely to student learning. The authors recommended the allocation of student affairs resources to programs that encourage student learning and development and the promotion of collaboration between student affairs professionals and other campus leaders, including faculty.

An academic support program, Supplemental Instruction (SI), is a dynamic example of a student affairs service whose primary purposes are the enhancement of student learning in academic courses, the expansion of student development through the mastery of effective academic skills, and the collaboration of the program's professionals with numerous individual faculty and academic departments. The SI program also fulfills a recommendation (US Department of Education, 1984) to focus more resources on the freshman year and to create learning communities that incorporate aspects of cooperative learning.

The authors of Involvement in Learning (US Department of Education, 1984) and the authors of The Student Learning Imperative (ACPA, 1994) stressed the importance of assessment and program evaluation. To determine the effectiveness of the SI program in meeting student learning goals, the current researchers conducted a study in which they evaluated the program's impact on student success in the targeted courses and examined the cost effectiveness of this use of student affairs resources.

The SI program was originally developed in the Division of Student Affairs at the University of Missouri-Kansas City in the 1970s. Estimates of the number of institutions that have adopted SI range up to 500 (Blanc & Martin, 1994), including 2-year community colleges, 4-year liberal arts colleges, major research universities, and medical schools. The SI model differs from traditional academic support programs in two major ways. First, instead of targeting high-risk students, SI focuses on high-risk courses, mostly at the freshman level. Second, instead of offering individual tutoring help or class meetings concerning study techniques, SI provides group study sessions led by student peers who not only lead the sessions but also attend the class, so they know firsthand what is covered. SI thus provides a framework for small-group discussion of course material in a cooperative, nonremedial setting (Blanc, DeBuhr, & Martin, 1983) and allows students to benefit from interacting with both the SI leader and fellow classmates.

Previous studies have demonstrated that SI is effective at increasing student success, as measured by course grade, in a variety of undergraduate courses (Blanc, et al., 1983; Congos & Schoeps, 1993), including chemistry (Lockie & Van Lanen, 1994; Lundeberg, 1990), mathematics (Burmeister et al., 1994; Kenney & Kallison, 1994), and history (Martin & Blanc, 1981; Wolfe, 1987). The pattern of course grades improves as a result of SI, with more As and Bs and fewer Ds, Fs, and Ws (withdrawals) (Martin & Blanc, 1981). Students who attend SI earn higher average course grades, as well as higher overall semester grade point averages (GPAs), than do their peers who do not attend (Blanc, et al., 1983).

Nevertheless, academic support programs in general, and SI in particular, face criticism regarding their effectiveness and appropriateness as a use of university resources. Schwartz (1992), for example, was unable to demonstrate any effect of SI attendance on course grade among students in his sociology courses. Instead, he found that course attendance and prior GPA predicted course performance. Schwartz suggested that SI attendance was the result of academic achievement, not the other way around. In another study, Visor, Johnson, and Cole (1992) found that high-risk psychology students--defined as students with an external locus of control, low self-efficacy, and low self-esteem--did not take full advantage of SI. The authors questioned whether SI was helping those students most in need of help.

Of the research that supports SI, much is anecdotal, is based on small or nonrepresentative samples, or does not adequately consider student ability as a possible explanation for the apparent "effect" of SI. Studies that did attempt to control for student ability (e.g., Congos & Schoeps, 1993) often failed to demonstrate that SI helped those students who needed help the most.

The goal of the current study was to undertake a thorough program evaluation that would respond to some of the limitations of other SI research and would explore some additional questions that have emerged from this university's experience with SI.

Method

The first issue examined was SI effectiveness. The large sample--more than 11,000 students enrolled in 8 courses over 2 academic years--and the statistical procedures employed allowed for a more definitive assessment of the relationship between SI participation and student performance than is found in much of the SI literature. In addition, the researchers investigated whether students can benefit from participation in SI regardless of their predicted ability. The experience with SI at this large Rocky Mountain research university also led investigators to question whether SI's impact is consistent across different academic areas, specifically the social versus the physical sciences.

The second issue explored in this study, that of SI attendance patterns, was prompted by observations of SI program coordinators at this university that SI attendance in courses with large enrollments differs from the patterns found in smaller classes.

Because student affairs resources are limited on many campuses, the final issue addressed was whether SI is a cost-effective way of reaching students and helping them to succeed academically.

Procedure

To evaluate SI, the researchers performed analyses of covariance (ANCOVAs) and correlational analyses on data collected by the SI program at a public research university during the 1992-93 and 1993-94 academic years. The cost effectiveness of SI as an academic support program was analyzed using program costs from the 1994-95 academic year.

An SI program was instituted at this large Rocky Mountain research university in the Spring of 1986 to assist new and transfer students by identifying challenging courses and providing undergraduate students to lead study sessions. The program was designed to follow the SI model developed at the University of Missouri-Kansas City and has grown from its small beginning in 2 sections of a single course to its current size, attached to 16 sections of 10 courses each quarter.

The SI program is managed by a team composed of the program director and a small group of graduate students. Responsibilities of this team include selecting courses to which SI will be attached, hiring and training undergraduate students as SI leaders, and supervising the leaders. The undergraduate SI leaders are selected on the basis of their academic records, interpersonal skills, creativity, empathy, insight, and the recommendations of faculty members in the disciplines in which they will be working. The undergraduate SI leaders advertise the service and conduct voluntary study sessions for the students in the designated courses. The undergraduate SI leaders' responsibilities include attending all classes and conducting three study sessions a week for the students in the course. Also, the leaders frequently offer review sessions prior to midterm exams and finals. Students can attend as few or as many sessions as they wish during the term, or they can choose not to attend at all. These sessions are run as discussions rather than lectures, and the SI leaders guide the discussion by suggesting agenda items, posing questions, and giving verbal feedback. In addition to focusing on course content, the SI leaders weave ideas into the discussion about ways to approach studying and learning.

Permission to have SI attached to courses is requested of the departments and professors involved. Beyond granting this permission and allowing time for in-class announcements by the SI leaders, the faculty have no responsibility for the program. An important part of the SI model is the way in which courses are selected. Courses considered for SI usually exhibit one or more of the following characteristics: (a) have a high percentage of students who do not successfully complete the course, defined as receiving grades of D, E, or W; (b) have a large course enrollment, especially with a high proportion of freshman, sophomore, and transfer students; and (c) fulfill important prerequisite or graduation requirements. In addition, courses in the SI program often involve lengthy readings from challenging texts and lecture-style instruction with limited opportunity for questions.

Data Collection

This university's SI program has maintained a strong commitment to program evaluation and research. Information about the SI program is collected quarterly from four sources: (a) a survey given at the beginning of the term to all students enrolled in courses offering SI, (b) an evaluation completed by these same students at the end of the term, (c) attendance records from each SI session, and (d) information from university student records.

During the 1st week of the quarter, a single-page questionnaire designed to encourage participation in SI and assist leaders in scheduling their sessions is distributed in all SI classes. This questionnaire requests the student's name; social security number; self-reported high school GPA; Scholastic Aptitude Test (SAT) or American College Testing Program (ACT) scores; course load; and work hours.

At the end of the term, students in each course offering SI are asked to complete an anonymous evaluation form regarding satisfaction with the SI program, perceived effectiveness of the SI leader, expected course grade, and satisfaction with that grade. Students who did not attend are asked why they did not attend and how they perceived the SI program.

The SI leaders maintain attendance rolls which provide information regarding the number of contact hours per leader and the number of sessions attended by each student. Finally, a limited amount of information is extracted from the university's student records database to obtain course grades and verify the self-reported data.

Results and Discussion

With the exception of the financial analysis of the program, the results are based on data collected over 6 consecutive 10-week quarters (Autumn, Winter, and Spring of the 1992-93 and 1993-94 academic years).

Summary statistics for the SI program during this period are presented in Table 1. Physical science classes included chemistry, mathematics, and physics; social science classes included history, political science, and psychology.

TABLE 1
Summary Data for Autumn, Winter, and Spring Quarters, 1992-93 and 1993-94

  Physical Science Social Science Total
Number of SI sections 50 32 82
Number of students enrolled in SI sections 6,738 5,192 11,930
Number of students attending at least one SI session 2,189 1,882 4,071
Percentage of students attending at least one SI session 32.5 36.2 34.1
Average number of SI sessions attended 3.75 3.66 3.71

SI Effectiveness

A goal of the researchers was to evaluate the effectiveness of SI in introductory physical and social science courses. There are many potential measures of the effectiveness of a program like SI. Some of the measures are amenable to quantitative analysis, and others incorporate measures or descriptions that are more qualitative. In this section, the authors focus on one particular measure: the relationship between student performance and participation in SI.

This relationship was examined by first determining the number of students participating in SI at each level ("no", "low", and "average" participation) and the unadjusted average course grade (A = 4.0) of students at each level of participation. The analysis was performed over all 6 quarters covered by this study. The relationship between these variables is reported in Table 2. A student who attended zero sessions of SI was recorded as having had no participation; attendance at one or two sessions was judged to be low participation; and attendance at three or more sessions was deemed to be average participation.
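For readers who wish to replicate this tabulation with their own attendance and grade records, the following sketch (written in Python with pandas) shows one way the participation levels described above could be assigned and the unadjusted average grade computed at each level. The column names sessions_attended and course_grade are illustrative assumptions, not fields from the program's database, and the code is an illustration of the procedure rather than the analysis code used in this study.

```python
import pandas as pd

# Hypothetical records: one row per student, with the number of SI sessions
# attended and the course grade on a 4.0 scale (column names are illustrative).
# df = pd.read_csv("si_records.csv")  # columns: sessions_attended, course_grade

def participation_level(sessions: int) -> str:
    """Assign the participation level used in Table 2."""
    if sessions == 0:
        return "no"        # no participation
    if sessions <= 2:
        return "low"       # one or two sessions
    return "average"       # three or more sessions

def unadjusted_grades(df: pd.DataFrame) -> pd.DataFrame:
    """Count students and average the unadjusted course grade at each level."""
    levels = df["sessions_attended"].apply(participation_level)
    return (df.assign(level=levels)
              .groupby("level")["course_grade"]
              .agg(n="count", mean_grade="mean")
              .reindex(["no", "low", "average"]))
```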

The data in Table 2 indicate that students who did not attend SI received, on average, a lower course grade than students who attended SI. Students who attended SI three or more times earned average grades higher than those of the class as a whole, and students with low participation in SI received course grades slightly higher than the class average. A positive correlation was found (r = .165, p < .001) between SI participation level and standardized student grade (the student's course grade converted to a z score), demonstrating a modest but significant positive relationship between course grade and SI attendance.

TABLE 2 
Unadjusted Course Grade by SI Participation Level for All Classes

SI Participation Level Number of Students at This Level of Participation Average Course Grade for Students at This Level of Participation
No = 0 7,545 2.387
Low = 1 to 2 1,953 2.597
Average = 3+ 1,892 2.848
Total 11,390 2.499
Note: Excluding students who (a) received a course grade of W or I, (b) enrolled for credit/no credit, or (c) audited the course.

Correlation coefficients were also calculated to examine the relationships between standardized predicted GPA (PGPA) and standardized student grade (r = .404, p < .001), and between standardized PGPA and SI participation level (r = -.013, ns). PGPA is a measure that includes both achievement (high school GPA) and aptitude (composite ACT score) dimensions. The former correlation coefficient established that PGPA was a strong indicator of student success; PGPA is thus both a theoretically and a statistically sound covariate. The latter correlation coefficient indicated that no significant relationship existed between PGPA and SI attendance. Thus, there was no evidence that students with higher PGPAs attended SI more. These results are consistent with conclusions from previous studies (e.g., Blanc, et al., 1983; Congos & Schoeps, 1993).
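As a minimal sketch of how these coefficients could be reproduced, grades can be standardized within each course and Pearson coefficients computed as follows. The data frame and its columns (course_grade, course_id, pgpa, and a numeric si_level coded 0, 1, 2 for no, low, and average participation) are hypothetical names introduced for illustration only.

```python
import pandas as pd
from scipy.stats import pearsonr

def zscore_within_course(df: pd.DataFrame, col: str, course_col: str = "course_id") -> pd.Series:
    """Standardize a column separately within each course section."""
    grouped = df.groupby(course_col)[col]
    return (df[col] - grouped.transform("mean")) / grouped.transform("std")

def si_correlations(df: pd.DataFrame) -> dict:
    """Pearson correlations among SI participation level, standardized grade, and PGPA.

    Assumes numeric columns si_level, course_grade, and pgpa plus a course_id
    identifier; rows with missing values should be dropped beforehand.
    """
    df = df.assign(
        z_grade=zscore_within_course(df, "course_grade"),
        z_pgpa=(df["pgpa"] - df["pgpa"].mean()) / df["pgpa"].std(),
    )
    return {
        "SI level vs. grade": pearsonr(df["si_level"], df["z_grade"]),
        "PGPA vs. grade": pearsonr(df["z_pgpa"], df["z_grade"]),
        "PGPA vs. SI level": pearsonr(df["z_pgpa"], df["si_level"]),
    }
```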

The presence of a positive correlation between SI attendance and course grade supported the hypothesis that SI attendance has a positive influence on grades. However, the correlation coefficient ignored other factors, such as aptitude, that could potentially be more important than SI attendance in accounting for differences in students' grades. In an attempt to control statistically for these other influences, three ANCOVAs (all courses, physical science courses, and social science courses) were performed.

The ANCOVA dependent variable was the student's standardized grade in the class in which SI was offered. Students who audited the class, who withdrew, or who received an incomplete were not included in these analyses. Students who attended SI were significantly less likely, χ²(2) > 140, p < .001, to withdraw from the class than were students who did not attend SI. Grades were converted to standardized scores to control for differences in grading patterns among classes and then were converted back to A = 4.0 format for reporting purposes. The independent variable was the SI participation level.
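The withdrawal comparison can be framed as a test of independence between participation level and withdrawal status. A minimal sketch follows, assuming hypothetical si_level and withdrew columns rather than the study's actual records.

```python
import pandas as pd
from scipy.stats import chi2_contingency

def withdrawal_test(df: pd.DataFrame):
    """Chi-square test of independence between SI participation and withdrawal.

    Assumes illustrative columns: si_level (no/low/average) and withdrew
    (True if the student received a W). The resulting 3 x 2 table has
    2 degrees of freedom, matching the chi-square(2) statistic in the text.
    """
    table = pd.crosstab(df["si_level"], df["withdrew"])
    chi2, p, dof, _expected = chi2_contingency(table)
    return chi2, p, dof
```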

Two independent sets of covariates were used in this analysis. The first, a student's standardized PGPA, was included to help control for internal student factors such as aptitude and innate academic ability. Although any single covariate cannot entirely represent all of these factors, the strong relationship found between PGPA and course grade (r > .4) indicates that PGPA was an appropriate choice. As a further check of the statistical validity of PGPA as a covariate, the researchers verified that PGPA was normally distributed and displayed homogeneity of variance.

A significant interaction was found between PGPA and participation in SI. Students with lower PGPAs who attended SI showed greater improvement than did students with higher PGPAs. This finding could be the result of a ceiling effect or could indicate that students with lower PGPAs derive greater benefit from attending SI.

The second covariate used, a student's course affiliation, was included to help control for factors external to the student, such as instructor's teaching style, individual SI leader's characteristics, and course enrollment. For each course, an indicator variable was created and set to 1 for students enrolled in that course and 0 for everyone else.
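A sketch of how an ANCOVA of this general form could be specified appears below, reusing the hypothetical column names from the earlier sketches: z_grade for the within-course standardized grade, si_level for the participation factor, z_pgpa for the standardized PGPA, and course_id for the course-affiliation indicators. The interaction noted above can be examined by adding a PGPA by participation term. This illustrates the modeling approach under those naming assumptions; it is not the analysis code used in the study.

```python
import statsmodels.api as sm
import statsmodels.formula.api as smf

def si_ancova(df, with_interaction: bool = False):
    """Fit an ANCOVA-style linear model for SI participation on standardized grade.

    C(si_level) is the participation factor, z_pgpa the standardized predicted
    GPA covariate, and C(course_id) the course-affiliation indicator variables.
    All column names are hypothetical.
    """
    formula = "z_grade ~ C(si_level) + z_pgpa + C(course_id)"
    if with_interaction:
        formula += " + z_pgpa:C(si_level)"  # PGPA x participation interaction
    model = smf.ols(formula, data=df).fit()
    return model, sm.stats.anova_lm(model, typ=2)
```

The separate analyses for physical and social science courses reported below amount to fitting the same model on each discipline's subset of students.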

A total of 11,390 students were candidates for inclusion in the ANCOVA. A candidate was any student who was enrolled, for credit, in an SI course. Candidates were eliminated from the ANCOVA if they did not complete the course or if necessary information was missing or incomplete. In most cases, students were excluded from the ANCOVA because no PGPA was on their student record. Although how these students differed from the students included in these analyses is not entirely clear, certain conclusions are reasonable. For example, the presence of a PGPA likely indicates that the student was new in his or her academic career. Academically mature transfer and nontraditional students are less likely to have a PGPA, because the university often does not require from them information, such as ACT or SAT scores, necessary to compute the PGPA. However, because newer students are frequently considered to be at greatest risk of academic failure (Dwyer, 1989), the subset of students included in this analysis is arguably of greatest interest. The total number of students included in the ANCOVA after exclusion for these reasons was 8,047 (70.6%).

The adjusted average course grade computed using PGPA and course affiliation as covariates is presented in Table 3. The differences between low participation and no participation, and average participation and no participation, are shown in the third column. The ANCOVA demonstrates that the average grade of students who attended SI three or more times was 0.603 (p < .001) higher than the average grade of students who did not attend, after controlling for internal student factors measured by PGPA and for external student factors measured by class affiliation. This effect could not be explained by the number of hours per week the student worked, the number of hours the student planned to study, or the number of credit hours for which the student was enrolled; none of these variables demonstrated any correlation with either class grade or SI participation.

TABLE 3 
Adjusted Course Grade by SI Participation Level for All Classes

SI Participation Level Adjusted Course Grade of Students at This Level of Participation (N = 8047) Difference in Adjusted Course Grade for Students at This Level of Participation
No = 0 2.319 (N = 5397)  
Low = 1 to 2 2.596 (N = 1388) +0.277***
Average = 3+ 2.922 (N = 1262) +0.603***
***p < .001.

The researchers offer two reasons that the ANCOVA may have underestimated the actual differences in grade between SI and non-SI students. First, these analyses did not identify students who participated in SI in more than one course or at other institutions. These students experienced an advantage from their earlier participation in the program, yet some of them may have been grouped with students who never attended SI. Blanc et al. (1983), for example, found that SI attendance in one class improved the student's overall semester GPA, and Martin and Blanc (1981) demonstrated over a 3-year period that students who attended SI continued to receive higher grades and reenrolled at a higher rate than their peers. Second, these analyses did not identify students who attended SI sessions but who failed to fill out attendance records. These students would also have been grouped with students who never attended SI or with students with low participation (one or two sessions).

To examine the relative effectiveness of SI in physical and social science courses, the researchers performed separate ANCOVAs on each group. The results of this analysis are shown in Table 4. The differences between low participation and no participation, and average participation and no participation, are presented in the final two columns. The results indicate that students who attended SI achieved higher average course grades in both physical and social science courses (p < .001) than did their peers who did not attend SI. However, the analyses demonstrated a stronger positive effect for students in the social sciences, even though the average number of SI sessions attended was roughly equal for students in the social and physical sciences (see Table 1).

TABLE 4 
Adjusted Course Grade by SI Participation Level for Physical and Social Science Courses

SI Participation Level  Adjusted Course Grade, Physical (N = 4188)  Adjusted Course Grade, Social (N = 3859)  Difference, Physical  Difference, Social
No = 0  2.400 (N = 2936)  2.250 (N = 2461)
Low = 1 to 2  2.562 (N = 642)  2.611 (N = 746)  +0.162***  +0.361***
Average = 3+  2.843 (N = 610)  2.972 (N = 652)  +0.443***  +0.722***
***p < .001.

SI Attendance Patterns

The findings demonstrate a clear positive relationship between SI attendance and academic performance, as measured by the grade received for the course. Therefore, examining some of the factors that influenced a student's decision to participate is of interest. As stated, neither a student's PGPA (a good indicator of academic aptitude) nor any of several other indicators examined appears to predict SI attendance. In this section, the researchers consider another factor related to SI attendance--course enrollment.

TABLE 5 
Correlations (r) Between Course Size and SI Attendance

Relationship Between Physical Science Social Science Total
Number of students attending SI and course size 0.47*** 0.67*** 0.55***
Percentage of students attending SI and course size -0.40** -0.53** -0.42***
Average number of SI sessions attended and course size -0.25 -0.44* -0.29**
Total contact hours and course size 0.23 0.24 0.25*
*p < .05. **p < .01. ***p < .001.

Table 5 contains data on several relationships between course size and SI attendance. The key results may be summarized as follows: (a) although the number of students who attended at least one SI session during the term increased as course size increased, the percentage of students attending--as well as the average number of SI sessions attended--generally decreased with increasing course size; (b) the offsetting effects of this trend caused the total number of contact hours to remain relatively independent of course size (the correlations were not found to be statistically significant in the subgroups, although a slight positive relationship existed overall); and (c) both physical and social science courses showed similar trends.

An explanation for the first result has yet to be found. The researchers note, for example, that a lack of student awareness is not the reason, because surveys revealed that an average of 96.4% of all students knew that SI was offered, with no important differences in knowledge between large and small courses. In addition, no relationship existed between course size and the provision of teaching assistant-led discussion sections (which could conceivably compete with SI).

Although direct evidence is lacking, a possible explanation for the first result is that in smaller classes, students who attend SI can more easily recognize and associate with other SI participants and the SI leader during class lectures. This contact outside of the SI sessions could reinforce SI attendance. In larger classes, students would have more difficulty finding and associating with one another during lectures.

SI Costs

To determine the costs of the SI program at this institution, data from 3 quarters were analyzed (Autumn 1994, Winter 1995, and Spring 1995). The overall cost for the program was determined by summing the pay for the undergraduate leaders, the graduate supervisors, a research assistant, and a work-study student for data entry. Costs for the 1994-95 academic year are presented in Figure 1.

To calculate a cost per contact hour for the program, information regarding the number of students attending SI and the frequency with which they attended was tabulated. A contact hour was defined as one student attending one hour-long SI session. The average cost per contact hour for the SI program for each individual quarter and the average cost for the 1994-95 academic year are shown in Table 6.
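The per-quarter and academic-year figures reported in Table 6 follow directly from this definition; the short sketch below reproduces the arithmetic from the cost and contact-hour totals in that table.

```python
# Quarterly program cost and contact hours, from Table 6 (1994-95 academic year).
quarters = {
    "Autumn 1994": (11_668, 3_836),
    "Winter 1995": (11_793, 2_643),
    "Spring 1995": (10_618, 2_067),
}

for name, (cost, hours) in quarters.items():
    print(f"{name}: ${cost / hours:.2f} per contact hour")

total_cost = sum(cost for cost, _ in quarters.values())     # $34,079
total_hours = sum(hours for _, hours in quarters.values())  # 8,546
print(f"1994-95 academic year: ${total_cost / total_hours:.2f} per contact hour")  # about $3.99
```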

TABLE 6 
Average Cost Per Contact Hour

  Cost Contact Hours Cost/Contact Hour
Autumn 1994 $11,668 3,836 $3.04
Winter 1995 $11,793 2,643 $4.46
Spring 1995 $10,618 2,067 $5.14
Total $34,079 8,546 $3.99

The cost per contact hour presented here is clearly an underestimation of the actual cost of the program, because the costs for the part-time program administrator and part-time secretary were not included in the calculation. These expenses were omitted for two reasons. First, the researchers wanted to make the information generalizable to SI programs on other campuses. Often, SI administration is folded into existing administrative positions or funded from different sources. Second, estimating the actual percentage of time spent by these employees on SI-related work compared to their other responsibilities proved difficult.

Although SI provides a different service than other learning assistance programs offered by the university (e.g., tutoring and learning-skills courses), a comparison of the cost per student contact hour demonstrates SI's relative cost efficiency when similar administrative costs are excluded. Students receiving tutoring services at the same institution are charged $8.00 per hour, and students enrolled in learning-skills courses pay an average of $6.00 per contact hour. SI thus costs less per contact hour than either of these other learning assistance programs.

Conclusion

The current research is particularly important because of the large data pool and the efforts to control for student ability. Strong positive evidence was found that students who participate in SI earn higher average course grades than do students who do not participate. In addition, no relationship was found in this research between a student's expected performance in the course (using PGPA) and SI attendance. Thus, there was no evidence that only the "better" students attend SI.

The research findings that SI was more effective in the social sciences than in the physical sciences and that a relationship existed between class size and SI attendance cannot yet be explained, suggesting a promising area for future research. Finally, this study demonstrated that SI is a cost-effective component of the student affairs division's academic support programs.

Returning to the student learning responsibility of the student affairs division, much information in the recent literature supports the role that an SI program can play in this mission. Terenzini, Pascarella, and Blimling (1996) cited the importance of one aspect of SI, the positive effect of peer interactions on student learning, particularly interactions that focus on academic topics. These same authors also stressed the importance of interventions that take place early in a student's college experience, another focus of the SI program, along with the need for active student involvement in the process.

This latter point, involvement, was one of the seven core conditions of an optimal learning environment discussed by Blocher (1978). Schroeder and Hurst (1996), in elaborating on Blocher's work, stated: "Learning is not a spectator sport--it is an active, not a passive, enterprise. Accordingly, a learning environment must invite, even demand, the active engagement of the student" (p. 174). Professionals at the University of Missouri-Kansas City designed the SI program with a central premise that the SI leader's role in these study sessions is to induce students to discuss the course material actively, to explore their understanding, and to look to themselves and each other for information rather than referring passively to an authority.

At least two of Blocher's (1978) other conditions are relevant in assessing the value of SI. One is support, which is provided to students by the SI leader and the other students participating in the SI sessions. Another is structure, which Blocher described as providing role models who are at a more advanced level than the beginning college student. SI leaders are exactly that, model students; and because they have no evaluative role with the students they help, they can provide structure in a way that is different from what professors and graduate student teaching assistants can offer.

Those who argue for the student affairs division's involvement in student learning stress the important cumulative effect of having an environment that provides numerous and varied experiences in which students can learn. The results of this study indicate that SI is a very important and effective program for student affairs divisions to offer.

References

American College Personnel Association. (1994). The student learning imperative: Implications for student affairs. Washington, DC: Author.

Blanc, R., DeBuhr, L., & Martin, D. (1983). Breaking the attrition cycle. Journal of Higher Education, 54(1), 80-90.

Blanc, R., & Martin, D. (1994). Supplemental instruction: Increasing student performance and persistence in difficult academic courses. Academic Medicine, 69, 452-454.

Blocher, D. (1978). Campus learning environments and the ecology of student development. In J. H. Banning (Ed.), Campus ecology: A perspective for student affairs [Monograph] (pp. 17-23). NASPA.

Burmeister, S., Carter, J., Hockenberger, L., Kenney, P., McLaren, A., & Nice, D. (1994). Supplemental instruction sessions in college algebra and calculus. In D. Martin & D. Arendale (Eds.), Supplemental instruction: Increasing achievement and retention. New Directions for Teaching and Learning, 60 (pp. 53-61). San Francisco: Jossey-Bass.

Congos, D., & Schoeps, N. (1993). Does supplemental instruction really work, and what is it anyway? Studies in Higher Education, 18, 165-176.

Dwyer, J. (1989). A historical look at the freshman year experience. In The freshman year experience (pp. 25-39). San Francisco: Jossey-Bass.

Kenney, P., & Kallison, J., Jr. (1994). Research studies on the effectiveness of supplemental instruction in mathematics. In D. Martin & D. Arendale (Eds.), Supplemental instruction: Increasing achievement and retention. New Directions for Teaching and Learning, 60 (pp. 75-82). San Francisco: Jossey-Bass.

Lockie, N., & Van Lanen, R. (1994). Supplemental instruction for college chemistry courses. In D. Martin & D. Arendale (Eds.), Supplemental instruction: Increasing achievement and retention. New Directions for Teaching and Learning, 60 (pp. 63-74). San Francisco: Jossey-Bass.

Lundeberg, M. (1990). Supplemental instruction in chemistry. Journal of Research in Science Teaching, 27, 145-155.

Martin, D., & Blanc, R. (1981). The learning center's role in retention: Integrating student support services with departmental instruction. Journal of Developmental and Remedial Education, 4(3), 2-4, 21-23.

Schroeder, C., & Hurst, J. (1996). Designing learning environments that integrate curricular and cocurricular experiences. Journal of College Student Development, 37, 174-181.

Schwartz, M. (1992). Study sessions and higher grades: Questioning the causal link. College Student Journal, 26, 292-299.

Terenzini, P., Pascarella, E., & Blimling, G. (1996). Students' out-of-class experiences and their influence on learning and cognitive development: A literature review. Journal of College Student Development, 37, 149-162.

U.S. Department of Education. (1984). Involvement in learning: Realizing the potential of American higher education. Washington, DC: Author.

Visor, J., Johnson, J., & Cole, L. (1992). The relationship of supplemental instruction to affect. Journal of Developmental Education, 16(2), 12-14, 16-18.

Wolfe, R. (1987). The supplemental instruction program: Developing learning and thinking skills. Journal of Reading, 31, 228-232.