By Hannah Lawler
What are institution-set standards? And how are they different from target goals? As part of the 2016 accreditation self-evaluation process, Santa Monica College will be required to respond to a new standard:
The institution establishes institution-set standards for student achievement, appropriate to its mission, assesses how well it is achieving them in pursuit of continuous improvement, and publishes this information (Standard IB4).
If you are familiar with the 2013 or 2014 Institutional Effectiveness (IE) Reports, you should recognize the term “institution-set standards,” as the College began setting standards for all college-wide student success and achievement metrics (e.g., course success rates, student retention rates, and number of degree completers) in spring of 2013. Recent changes in federal guidelines have prompted the Accrediting Commission for Community and Junior Colleges (ACCJC) to include, as part of the accreditation process, an evaluation of whether institutions are setting appropriate standards of performance for student success and achievement metrics.
Institution-set standards are the standards reflecting satisfactory performance in student learning and achievement. A standard can be interpreted as the minimum level of performance signaling that the college is maintaining educational quality and institutional effectiveness. Target goals, also included in the IE Dashboards/Reports, differ from institution-set standards in that targets are aspirational in nature. Targets are goals we hope to achieve in order to improve student learning and achievement and meaningfully move the needle on institutional effectiveness.
Institution-set standards are defined for each key indicator of the IE Dashboards that directly measures student performance, such as course success, transfer, degree completion, and licensure exam pass rates. The standards were initially set by the Academic Senate Joint Institutional Effectiveness Committee and reported to the college’s central planning body, the District Planning and Advisory Council (DPAC), and the Board of Trustees. The standards are reviewed for appropriateness every year. Records of SMC’s standards are kept at the Institutional Effectiveness Dashboard link below.
Target goals are defined and monitored for all key indicators on the “Institutional Priorities” Institutional Effectiveness Dashboard. Targets were set by the primary sponsors or personnel directly responsible for or affected by a key indicator of an IE Dashboard (for example, the Dean of Counseling, Department Chair of Counseling, and the Transfer Center Faculty Leader set the target for the Transfer Rate metric). Historical targets for SMC are also kept on the Institutional Effectiveness Dashboard.
Institution-set standards and targets are critical in telling us how well we are doing in terms of student success and achievement. When the College falls below a standard or is not meeting a target goal, the data is brought to the attention of multiple campus groups, including DPAC, and subsequently, plans to improve performance on the measure are developed.
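As an illustration of the distinction described above (a sketch, not an actual SMC procedure), a metric can be compared against its standard (the floor) and its target (the aspiration) with simple threshold checks. The rates below are hypothetical, not actual SMC figures:

```python
def classify_metric(value, standard, target):
    """Classify a student-achievement metric against an
    institution-set standard (minimum) and a target goal (aspiration)."""
    if value < standard:
        # Falling below the standard triggers improvement planning
        return "below standard: improvement plan needed"
    elif value < target:
        # Educational quality is satisfactory, but the aspirational goal is unmet
        return "meets standard, below target"
    else:
        return "meets or exceeds target"

# Hypothetical course success rates
print(classify_metric(0.62, standard=0.65, target=0.72))  # below standard: improvement plan needed
print(classify_metric(0.68, standard=0.65, target=0.72))  # meets standard, below target
print(classify_metric(0.74, standard=0.65, target=0.72))  # meets or exceeds target
```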
Want more on institution-set standards and target goals? Visit the following resource pages or email us at firstname.lastname@example.org:
- To see the newly adopted (as of June 2014) criteria that will be used to evaluate community colleges, see the ACCJC Accreditation Standards.
- To see the directions given to ACCJC Visiting Teams Regarding Implementation of New U.S. Department of Education Regulations, see this memo from the ACCJC.
- To see a yearly record of SMC’s institution-set standards and targets and whether each was met, see the SMC Institutional Effectiveness Dashboard.
If you find yourself browsing through SMC’s 2010 ACCJC Self-Study (also known as the Self-Evaluation Report), you may notice the term “Institutional Research” pops up a mere… 35 times.
So it’s safe to say that data plays an important part in any well-written accreditation self-study report. While your humble blog curator was not involved with previous self-studies, and will have to make a daring escape from his LS office to participate in our upcoming process (hmm, I wonder if this guy’s available), he does have some guesses as to what you may see included in the future.
- ILO & SLO Data: Obviously, a big component of the data in the ACCJC Self-Evaluation Report is the reporting of Institutional Learning Outcomes (ILOs). Since the 2010 study, the college has fully implemented ILOs and Student Learning Outcomes (SLOs). As an example of the reporting we do, the program ILO data is publicly available here.
- Completion Data: Want to know how many students earned degrees? Certificates? Look no further. Hey, want to know how long it takes students who intend to transfer to do so? Check it out here. Having students complete their college goals is the ultimate goal of the college, and you can bet we’ll find that information in a proper self-study.
- Program Review: How well are our programs doing in accomplishing their goals? How effective are our intervention programs? Flip through our program review documents at SMC’s ProgRev site, and you’ll see report after report filled to the brim with evaluation information.
- THE DASHBOARD!: How many students begin in our Basic Skills sequence? How many students persist to the next term? What does our Equity Gap look like? All this (and more!) you can find in the 2014 Institutional Effectiveness Dashboard. As one of the points of emphasis in the 2010 report, the IE reports are of paramount importance to the college’s evaluation efforts.
Interested in more, you say? Can we interest you in our newly remodeled IR front page?
As part of a larger rethinking of the basic skills, or pre-collegiate, sequence, a significant amount of research has been done on community colleges’ efforts to implement course acceleration practices. Accelerated courses are just that: courses that take a large amount of material (usually from two or more courses) and present it in a shorter time frame (usually one term).
The thinking behind these changes is that students are finding themselves “stuck” in long basic skills sequences, sometimes taking up to four courses before they are able to enroll in college-level (and, more importantly, transferable) courses in math and English. The longer students take to move through the sequence, the more opportunities they have to drop out. By providing a faster and more challenging pathway toward completion, students will be much more likely to complete transfer-level math and English, and consequently much more likely to complete degrees, certificates, or transfer.
Just in the last two months, two major studies on accelerated courses have been published online.
The RP Group, in conjunction with 3CSN, released “Curricular Redesign and Gatekeeper Completion: A Multi-College Evaluation of the California Acceleration Project.” The California Acceleration Project was an intervention that provided support to colleges interested in redesigning the math and English basic skills sequence by shortening the time in remediation by at least one semester. The study included data from 16 California community colleges and almost 2,500 students who enrolled in accelerated courses.
Their study showed a significant increase in the number of students who completed a transfer-level course. They found that the intervention “showed higher outcomes…regardless of demographics such as ethnicity, gender, financial need, disability status, and prior English as a second language course taking”. The study showed that these gains were also evident at different basic skills placement levels.
Similarly, the Community College Research Center published a study that focused on the English sequence acceleration efforts of Chabot College. Their study showed that “participation in the accelerated courses was positively associated with a range of positive short-, medium- and long-term outcomes including entry-level College English completion.”
In an effort to increase the number of SMC students who progress through our English and math basic skills sequences, the college began offering accelerated courses in both subjects. In our next post on acceleration, we will provide some preliminary data on these efforts.
Aside from the never-ending barrage of research requests, our office has also been inundated with awesome education-related links! Having “run” this blog for a few years now, we are noticing a significant uptick in important education policy and research news items that are must-reads for higher-ed folks.
While we comb through and gather our thoughts for longer posts on these topics, we thought it useful to give everyone a nice reading list of items we have seen during the last few days.
From our Dean, Dr. Lawler:
- The LA Times highlights a study on the achievement gaps between Asian American and white students that focuses on non-cognitive differences.
- NASH, the National Association of System Heads, released a study on the functions of IR offices around the country. In short, IR offices need to get bigger and need to provide support for a wider array of college functions.
- A Slate article (we’ll let you decide if it’s a #SlatePitch) on the problems of student evaluations of faculty. It argues that currently available alternatives to faculty evaluation still leave a lot to be desired.
- The Data Points blog from the Chronicle of Higher Education has a post on the difficulty of measuring diversity at colleges. While it offers two approaches to tackling this issue (using a diversity index and comparing college demographics with the state’s), it notes that there is still a limit to what exactly we can measure.
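For readers curious what a diversity index looks like in practice, here is a minimal sketch using the Gini-Simpson index, one common choice (the post does not say which index the Chronicle used); the enrollment counts below are hypothetical:

```python
def simpson_diversity_index(counts):
    """Gini-Simpson index: the probability that two randomly chosen
    students belong to different demographic groups (0 = uniform,
    higher = more diverse)."""
    n = sum(counts)
    if n < 2:
        return 0.0
    return 1 - sum(c * (c - 1) for c in counts) / (n * (n - 1))

# Hypothetical enrollment counts by demographic group
print(round(simpson_diversity_index([500, 300, 150, 50]), 3))  # → 0.636
```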
- The Christian Science Monitor has a high level summary of some of the efforts the country’s colleges are making to reduce the “transfer deficit”.
And as a special treat, some #HETTSLINKS. These are just some of the articles you’ll find on LBCC’s own #IRFamous Director of Institutional Effectiveness, Dr. Hetts’ twitter account:
- EdSource has a summary of Assembly Bill 1451, which would allow high schools and colleges to enter into formal co-enrollment program partnerships. It highlights a pilot program that allowed 10 colleges to fund dual enrollment programs that increased the likelihood of graduation.
- Off The Charts, a blog from the folks at the Center on Budget and Policy Priorities, has a post on how higher ed funding remains “well below pre-recession levels”. It also contains what should be a pretty chilling chart for anyone who lives in Arizona.
- The Washington Post has a summary of new research into the problems with students using laptops during class. The ease in taking notes causes students to “use long verbatim quotes, which they type somewhat mindlessly”. Students who took handwritten notes had a better understanding of the material presented.
- The Root has a great article titled “9 Biggest Lies About Black Males and Academic Success”.
- The Atlantic has an essay by a community college philosophy professor on the merits of the humanities.
- And finally, a Press Telegram summary of a forum held at LBCC that included a speech from Joanna Newsom's cousin.