"Actually, because I enjoy this type of thing – I could make a pretty strong argument that achievement of these outcomes will get better under the revised structured pathways model.  At the moment, under the traditional model, it’s up to the student to arrange the 10-14 courses from the long lists of possibilities, and we make the assumption that somehow – magically – they will do so in a way that will arrange 10-14 courses that produce a high level of achievement of the general education outcomes.  I have to say, this seems an odd argument to me – and the evidence from industry suggests it may not be working as well as we might like.  It also seems that if we empower the subject matter experts – discipline faculty in the fields in which students are graduating – to do as Sinclair CC in Ohio did under CBD, and have each discipline’s faculty suggest a short list of general education electives that would be best for students who graduate in their disciplines, that we will have a much better sense of how the combination of classes arrange for each student.   Ultimately, our ability to monitor and improve students’ achievement of these general education outcomes – the hallmark of a liberal arts education – seems likely to improve under a structured pathways approach.”

– Rob Johnstone, Founder & President at the National Center for Inquiry & Improvement.


Speaking with Randy Lawson

by Ani Aharonian

Santa Monica College’s last self-study and site visit took place in 2010, with the next round scheduled for 2016. It is no secret that when it comes to accreditation, one would be hard-pressed to find someone more knowledgeable than SMC’s Executive Vice President, Randal “Randy” Lawson. Randy serves as Accreditation Liaison Officer, has co-chaired the 1998, 2004, and 2010 accreditation self-studies, and has served on visiting accreditation teams to other colleges. Randy graciously spoke to us about the last accreditation self-study and site visit and the preparations for the 2016 accreditation year.

Randy explains that the 2010 site visit could be thought of perhaps “as an example of ‘not a very good visit’ which was very disappointing to us because …we felt we were so well prepared particularly in comparison to 2004…” 

SMC’s accreditation was reaffirmed, and the college received praise for the high instructional quality it is known for, but it also received nine recommendations for improvement. Some were relatively minor and easily addressed (e.g., although the college had ethics statements for each employee group, the team recommended developing a district-wide code of ethics). However, two were of primary importance and required a follow-up with the commission in six months. The first involved completing the college’s master planning process. Randy explained that the college had changed the planning process in 2005 and so had not yet completed a full cycle at the time of the site visit. In addition to not having had an opportunity to experience a full cycle, clarify the entire process, and evaluate its components, the college’s research function was very weak at the time due to recent staffing changes and a staffing shortage in Institutional Research. The Office of Institutional Research is critical to supporting institutional effectiveness and improvement by assisting the various campus constituents with ongoing and systematic evaluation processes; the second task of primary importance, therefore, was to increase and strengthen the research function of the college.

Recommendation 1: To meet the standards, the team recommends that the college complete the development of a sustainable comprehensive master planning process with the Master Plan for Education at its core. The resultant multi-year plan should contain explicit links to instructional and student services programs, human resources, facilities, technology, and other planning needs that are revealed by the program review process or other assessments of institutional effectiveness. The team further recommends that the college work to achieve among its constituents a uniform understanding of the planning cycle and documentation processes through a mechanism accessible to all audiences regardless of their previous experience with the institution (Standards I.A, I.A.1, I.A.4, I.B.1, I.B.3, I.B.4, I.B.6, I.B.7, II.A.1.a, II.A.1.c, II.A.2.f, III.A.6, III.B.2.b, III.C.2, III.D.3, IV.A.5, and IV.B.2.b).

Recommendation 3: To meet the standards, the team recommends that the college evaluate the efficacy of the current staffing model for the institutional research function with a goal of providing timely, in-depth analysis of effectiveness measures and other key institutional metrics to move the college toward the goal of becoming a culture of evidence (Standards I.B.3, I.B.4, I.B.6, I.B.7, II.A.1.c, II.A.2.e, II.A.2.f, II.A.2.g, and II.B.3).

The college had a very short time frame of about four months in which to address these recommendations in a follow-up report to the ACCJC. In hindsight, Randy feels the visiting team did SMC a great favor: “…it made us get to work and complete things, and it probably got us past any minor bickering that we might have had along the way about how we were going to do this.” In that short time, SMC’s master planning process was finalized, and the Master Plan for Education, the college’s core planning document, was revised to more clearly describe the planning process and the interrelationships of its many components. The Office of Institutional Research (IR) expanded the available data on the college’s website, and the college developed two new employment classifications (research analyst and senior research analyst) and quickly began recruiting for a research analyst.

The SMC community, particularly the District Planning and Advisory Council (DPAC), banded together and worked hard, making what Randy describes as “incredible progress”: SMC “moved from needing to do this follow-up report to being asked, the following spring, to present at the Northern Regional Workshop for the ACCJC on the relationship between program review and the integrated planning process.” In other words, in a very short time, SMC went from receiving a serious recommendation for improvement of its planning process to being exemplary in its master planning process.

Through his role as Accreditation Liaison Officer, and as a member of several visiting teams evaluating other colleges, Randy has observed that planning, program review, and student learning outcomes (SLOs) have become, and are likely to remain, major areas of focus in accreditation: “…there is more of a trend for those kinds of things to be the reason why a college is asked to do a follow-up report or, in more extreme cases, placed on warning or other kinds of sanction or so on.” With most colleges now having had an opportunity to set up processes for planning, program review, and SLOs, Randy believes the focus will now be on how well colleges evaluate and assess these processes and use the data for continuous improvement. SMC’s current processes were designed to make this easier to accomplish, and Randy is confident that we are doing this well as an institution, having made significant improvements in setting, meeting, and following up on annual objectives in a meaningful way. Randy recommends the Master Plan for Education and the Institutional Effectiveness Report to those seeking a deeper understanding of the planning process at SMC and the institutional goals that guide it.

In preparation for the 2016 self-study and subsequent site visit, the college is now forming teams for all accreditation standards and sub-standards and has recently sent out a “Call for Volunteers” in a district-wide email. Participating in the accreditation process by joining one of these teams and being part of the discussions is an excellent way for new faculty and classified staff to get involved. Randy encourages any interested employees to participate: “Participating may mean just being on one of the committees and it doesn’t necessarily mean you have to write something… Being involved is just being part of the discussion. We think the broader that discussion can be, the better. So we want to encourage people to join one of the standard committees or subcommittees in an area that interests them.” Volunteers who wish to do so are even encouraged to branch out and join committees in areas different from those they work in as a way to learn more. Randy stresses that no particular background or knowledge is necessary and that “in some cases it is better to have someone who doesn’t know anything about that standard there because their questions are often the kinds of questions that someone on a [visiting] team might ask or they might notice something that is a small weakness that we who know it well may overlook.”

While SMC prepares for accreditation, any significant findings will be shared with the college as one of the first steps in addressing any campus needs. “If we notice something that needs to be addressed, and we can address it now or at least begin to address it now, it’s a strength and it puts us ahead of the game.”

If you are interested in learning more about accreditation in general or about the planning process at Santa Monica College, please consult the reading list of helpful resources and documents below: 


2014 Institutional Effectiveness Report


By Hannah Lawler

During the May 2014 Board of Trustees meeting, the Office of Institutional Research presented a summary of Santa Monica College’s (SMC) 2014 Institutional Effectiveness Report. If you missed the riveting discussion of student success, target goals, and institution-set standards, you are in luck, as the current blog post is dedicated to getting you acquainted with the 2014 Institutional Effectiveness Report and Dashboard!

Institutional Effectiveness (IE) is the systematic and continuous process of measuring the extent to which a college achieves its mission. The primary purpose of IE is to advance educational quality and institutional improvement through the review and discussion of the college’s performance on key data metrics that are aligned with the college’s major goals. The IE process is intimately linked with the accreditation self-evaluation process, as current ACCJC standards require institutions to systematically and regularly evaluate institutional quality through the collection and analysis of data for the planning and improvement of student learning and success and of college practices and programs.

At SMC, the IE data metrics, called “key indicators” (KIs), are organized by the five supporting goals (major areas) of SMC:

  1. Innovative and Responsive Academic Environment (instructional programs and curriculum),
  2. Supportive Learning Environment (academic and student support services),
  3. Stable Fiscal Environment (fiscal operations),
  4. Sustainable Physical Environment (physical infrastructure), and
  5. Supportive Collegial Environment (human resources and collegiality).

The data are organized into six dashboards. Dashboards are visual tools that monitor the college’s performance on the KIs. Five of the six dashboards correspond to the supporting goals of the college, and the sixth contains KIs that were identified as institutional priorities by the District Planning and Advisory Council (DPAC) and the Academic Senate Joint Institutional Effectiveness Committee. The “Institutional Priorities Dashboard” is aligned with SMC’s mission, current and former strategic initiatives, the objectives of the Master Plan for Education, and the Board of Trustees’ goals and priorities. The six dashboards, when reviewed together, provide a balanced view of institutional effectiveness at SMC.
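For readers who like to see the structure concretely, the sketch below (in Python) shows one hypothetical way the six dashboards could group their KIs. The dashboard names come from this post; the indicator lists are illustrative stand-ins, not SMC’s actual schema or complete KI list.

    # Hypothetical sketch: key indicators (KIs) grouped by dashboard.
    # Dashboard names come from the post; the KI lists are illustrative
    # stand-ins, not SMC's actual schema or full set of indicators.
    dashboards = {
        "Innovative and Responsive Academic Environment": [
            "course success rate",
            "degrees and certificates awarded",
        ],
        "Supportive Learning Environment": [
            "persistence rate",
            "basic skills course improvement rate",
        ],
        "Stable Fiscal Environment": ["fund balance"],
        "Sustainable Physical Environment": ["facilities utilization"],
        "Supportive Collegial Environment": ["employee satisfaction"],
        "Institutional Priorities": [
            "transfers to UC/CSU",
            "CTE completion rate",
        ],
    }

    # Reviewing all six together gives the "balanced view" described above.
    for dashboard, kis in dashboards.items():
        print(f"{dashboard}: {', '.join(kis)}")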

The following bullet points highlight some of the findings from the 2014 IE Report:

  • SMC meets or exceeds the institution-set standards for 22 of the 23 student success indicators, including number of transfers to UC/CSU, number of degrees and certificates awarded, course success, CTE license examination pass rates, and ILO mastery rates.
  • The College fell below the standard set for CTE Completion Rate (42.0%), missing the institution-set standard of 43.8% by 1.8 percentage points.
  • SMC fell below the target goals for four indicators: Persistence Rate, Basic Skills Course Improvement Rate, Basic Skills Transition to Degree Course Rate, and CTE Completion Rate.

Want to learn more about institution-set standards and target goals? The following blog post describes what standards and targets are in the context of institutional effectiveness at SMC: http://irsmc.tumblr.com/post/92439784701

The IE reports and dashboards are updated on an annual basis. To access past and current reports, visit the Institutional Effectiveness Dashboard website: www.smc.edu/iedashboard.


by Ani Aharonian

Educational accreditation in the United States is essentially a peer-review, or self-regulatory, quality assurance process. Whereas in most countries a governmental office oversees education (e.g., a ministry of education), in the United States independent, but federally recognized, regional accrediting organizations formed from member institutions ensure educational quality. While the federal government does not directly oversee accreditation, only accredited institutions are eligible to receive federal financial aid dollars. The Western Association of Schools and Colleges (WASC) has jurisdiction over the states of California and Hawaii as well as East Asia and several Pacific Island locations. WASC has two accrediting commissions: the Accrediting Commission for Senior Colleges and Universities (ACSCU) for four-year colleges and universities and the Accrediting Commission for Community and Junior Colleges (ACCJC) for two-year colleges. Santa Monica College is a member of, and is accredited by, the ACCJC. By becoming members of the ACCJC, colleges agree to meet its eligibility requirements and standards and to participate in the accreditation process, which involves carrying out a self-study and undergoing peer review on a six-year cycle. According to the ACCJC, meeting the accreditation standards provides assurance to the public that an institution provides a high-quality education and is committed to continuous improvement.


What are the accreditation standards?

Standard I: Mission, Academic Quality and Institutional Effectiveness, and Integrity

The college emphasizes student learning and achievement, continuously and systematically uses data for evaluation, planning, and improvement of educational programs and services, and demonstrates integrity in all policies, actions, and communications.

Standard II: Student Learning Programs and Support Services

Programs are of appropriate quality and rigor; quality is assessed, and the results of assessments are made publicly available and used for the improvement of educational quality and institutional effectiveness.

Standard III: Resources

The institution uses its human, physical, technology, and financial resources to achieve its mission and to improve academic quality and institutional effectiveness.

Standard IV: Leadership and Governance

The college recognizes and uses the contributions of leadership to promote student success and to sustain academic quality, integrity, fiscal stability, and continuous institutional improvement. The governing board, administrators, faculty, staff, and students work together for the good of the institution.

What is the accreditation process?

Accreditation occurs on a six-year cycle, with each cycle beginning with the delivery of the ACCJC’s “action letter,” which outlines what action the commission is taking with regard to the accreditation status of the institution. The accreditation process can be summarized in four phases.

  1. The first phase consists of a self-evaluation (previously called a self-study), in which the institution assesses its own adherence to the ACCJC eligibility requirements and accreditation standards and develops a plan for improvement in a written Self-Evaluation Report of Educational Quality and Institutional Effectiveness. The college must provide sufficient evidence to support the claims made in the Self-Evaluation.
  2. The second phase is the external evaluation, during which a team of evaluators from other ACCJC-accredited institutions visits to validate the assertions and evidence in the Self-Evaluation Report. The external team then writes an evaluative report, called the External Evaluation Report of Educational Quality and Institutional Effectiveness, and makes recommendations for what the college should do to meet standards or improve.
  3. During the third phase, the ACCJC reviews both the self- and external evaluations and makes a decision about the accreditation status of the institution. The ACCJC sends an “action letter” to the institution that outlines the commission’s decision and may provide recommendations to meet standards and make improvements.
  4. In the fourth and final, “follow-up,” phase of the accreditation process, the institution engages in a follow-up and improvement process based on the recommendations from the ACCJC’s action letter, the External Evaluation Report, and the Self-Evaluation Report.

CIRCLE Study of SMC Student Voting Patterns


By Christopher Gibson

Like most of you, I first assumed C.I.R.C.L.E. to be either an organization planning to cut James Bond in half with a laser beam or some robe-wearing offshoot sect of the Illuminati. I was pleasantly surprised to learn that the Center for Information & Research on Civic Learning and Engagement conducts research on civic and political engagement in the American education system. Recently, SMC participated in a study conducted by CIRCLE on student voting rates in the 2012 federal election. To perform the study, CIRCLE obtained, with SMC’s permission, data on SMC students through external data sources. Here is the report (PDF).

Some of the findings discussed in the study are:

  1. According to the CIRCLE study, 62.8%† of the eligible student population at SMC was registered to vote, and 69.6%† of those SMC students who were registered to vote actually voted. (Together, these figures imply that roughly 43.7% of eligible SMC students voted, since 0.628 × 0.696 ≈ 0.437.)
  2. The trends for SMC groups’ voting participation reflect the trends for the nation.  Older people tend to vote at higher rates than younger people, even among students.
  3. According to this study, 19% of SMC students were absentee voters, which is in line with some national trends, although that national number does not focus on students alone, as this study does.

The results for SMC can be compared to the overall group comparison results here.

One interesting aspect of this study is that SMC did not provide CIRCLE with any institutional data directly. CIRCLE used three external data sources (the National Student Clearinghouse, Catalist, and NCES’s IPEDS) to complete the analysis, enabling it to use the same method for the more than 260 participating campuses across the country.

For a complete list of what was pulled from which data source, and the methodology used to arrive at their conclusions, consult CIRCLE’s Report FAQ.

The report provided by CIRCLE goes on to list recommendations to increase student engagement in the voting process, including suggestions like “Get political, not partisan” to encourage discussion without appearing biased, and “Improving voter registration efforts on campus may help to increase voter turnout” because, given the high likelihood that registered voters actually turn up to vote, more registrations lead to more actual votes.

†These numbers do not reflect the 10,340 students who blocked their records from being used for research. For the purposes of this study, it is assumed that voter registration and participation happen at the same rate for students who block such uses of their data as for students who do not.

"Slate education writer Dana Goldstein talks about the "flipped classroom," in which students watch lectures at home and do homework at school, with Jonathan Bergmann, a teacher and educational technology advocate, and Frank Noshcese, a science teacher who’s skeptical of flipping".

In the spring of 2013, Laude and his staff sat down with the Dashboard to analyze the 7,200 high-school seniors who had just been admitted to the class of 2017. When they ran the students’ data, the Dashboard indicated that 1,200 of them — including Vanessa Brewer — had less than a 40 percent chance of graduation on time. Those were the kids Laude decided to target. He assigned them each to one or more newly created or expanded interventions. The heart of the project is a portfolio of “student success programs,” each one tailored, to a certain extent, for a different college at U.T. — natural sciences, liberal arts, engineering — but all of them following the basic TIP model Laude dreamed up 15 years ago: small classes, peer mentoring, extra tutoring help, engaged faculty advisers and community-building exercises.

Institution-set Standards vs. Target Goals

By Hannah Lawler

What are institution-set standards? And how are they different from target goals? As part of the 2016 accreditation self-evaluation process, Santa Monica College will be required to respond to a new standard:

The institution establishes institution-set standards for student achievement, appropriate to its mission, assesses how well it is achieving them in pursuit of continuous improvement, and publishes this information (Standard IB4).

If you are familiar with the 2013 or 2014 Institutional Effectiveness (IE) Reports, you should recognize the term “institution-set standards,” as the College began setting standards for all college-wide student success and achievement metrics (e.g., course success rates, student retention rates, and number of degree completers) in spring 2013. Recent changes in federal guidelines have prompted the Accrediting Commission for Community and Junior Colleges (ACCJC) to include, as part of the accreditation process, an evaluation of whether institutions are setting appropriate standards of performance for student success and achievement metrics.

Institution-set standards are the standards reflecting satisfactory performance in student learning and achievement. A standard can be interpreted as the minimum level of performance signaling that the college is meeting expectations for educational quality and institutional effectiveness. Target goals, also included in the IE Dashboards/Reports, differ from institution-set standards in that targets are aspirational: they are the goals we hope to achieve in order to improve student learning and achievement and meaningfully move the needle on institutional effectiveness.

The following figure illustrates the current performance relative to the institution-set standard and target goal for one of the IE key indicators, Career Technical Education Completion Rate.

[Figure: CTE Completion Rate performance relative to the institution-set standard and the target goal]

Institution-set standards are defined for each key indicator of the IE Dashboards that directly measures student performance, such as course success, transfer, degree completion, and license exam pass rates. The standards were initially set by the Academic Senate Joint Institutional Effectiveness Committee and reported to the college’s central planning body, the District Planning and Advisory Council (DPAC), and the Board of Trustees. The standards are reviewed for appropriateness every year. Records of SMC’s standards are kept at the Institutional Effectiveness Dashboard link below.

Target goals are defined and monitored for all key indicators on the “Institutional Priorities” Institutional Effectiveness Dashboard. Targets were set by the primary sponsors or personnel directly responsible for or affected by a key indicator of an IE Dashboard (for example, the Dean of Counseling, Department Chair of Counseling, and the Transfer Center Faculty Leader set the target for the Transfer Rate metric). Historical targets for SMC are also kept on the Institutional Effectiveness Dashboard.

Institution-set standards and targets are critical in telling us how well we are doing in terms of student success and achievement. When the College falls below a standard or is not meeting a target goal, the data are brought to the attention of multiple campus groups, including DPAC, and plans to improve performance on the measure are subsequently developed.
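As a thought experiment, the short Python sketch below illustrates this review logic: compare an indicator’s current value against its standard (the floor) and its target (the aspiration), and flag anything below standard. It is a minimal illustration of the idea, not SMC’s actual tooling, and the target value in the example is hypothetical.

    # Minimal sketch of the review logic described above: compare a key
    # indicator's current value against its institution-set standard (the
    # floor for satisfactory performance) and its target goal (the
    # aspirational level), and flag anything below standard for review.
    def classify_indicator(current, standard, target):
        if current < standard:
            return "below institution-set standard: flag for campus review and an improvement plan"
        if current < target:
            return "meets the standard but not yet the target: continue improvement efforts"
        return "meets or exceeds the target goal"

    # Example using the CTE Completion Rate figures cited in the 2014 IE
    # Report (current 42.0%, standard 43.8%); the 50.0% target here is a
    # hypothetical value for illustration only.
    print(classify_indicator(current=42.0, standard=43.8, target=50.0))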

Want more on institution-set standards and target goals? Visit the following resource pages or email us at research@smc.edu:

  • To see the newly adopted (as of June 2014) criteria that will be used to evaluate community colleges, see the ACCJC Accreditation Standards:

http://www.accjc.org/wp-content/uploads/2014/06/Accreditation_Standards_Adopted_June_2014.pdf 

  • To see the directions given to ACCJC Visiting Teams Regarding Implementation of New U.S. Department of Education Regulations, see this memo from the ACCJC:

http://www.accjc.org/wp-content/uploads/2013/03/ACCJC-Cover-memo-AND-External-Evaluation-Team-Resp-for-Compliance_2-5-13.pdf

  • To see a yearly record of SMC’s institution-set standards and targets and whether each was met, see the SMC Institutional Effectiveness Dashboard

www.smc.edu/iedashboard


SMC Accreditation Preview: The Data!

If you find yourself browsing through SMC’s 2010 ACCJC Self-Study (also known as the Self-Evaluation Report), you may notice the term “Institutional Research” pops up a mere… 35 times.

So it’s safe to say that data plays an important part in any well-written accreditation self-study report. While your humble blog curator was not involved with previous self-studies, and will have to make a daring escape from his LS office to participate in our upcoming process (hmm, I wonder if this guy’s available), he does have some guesses as to what you may see included in the future.

  1. ILO & SLO Data: A big component of the data in the ACCJC Self-Evaluation Report is the reporting of Institutional Learning Outcomes (ILOs). Since the 2010 study, the college has fully implemented ILOs and SLOs. As an example of the reporting we do, the program ILO data is publicly available here.
  2. Completion Data: Want to know how many students earned degrees? Certificates? Look no further. Hey, want to know how long it takes students who intend to transfer to do so? Check it out here. Having students complete their college goals is the ultimate goal of the college, and you can bet we’ll find that information in a proper self-study.
  3. Program Review: How well are our programs doing in accomplishing their goals? How effective are our intervention programs? Flip through our program review documents at SMC’s ProgRev site, and you’ll see report after report teeming with evaluation information.
  4. THE DASHBOARD!: How many students begin in our Basic Skills sequence? How many students persist to the next term? What does our Equity Gap look like? All this (and more!) you can find in the 2014 Institutional Effectiveness Dashboard. As one of the points of emphasis in the 2010 report, the IE reports are of paramount importance to the college’s evaluation efforts.

Interested in more, you say? Can we interest you in our newly remodeled IR front page?

Online course enrollment in California’s Community Colleges (CCC) has grown remarkably in the last 10 years, with nearly 20 percent of the students who took courses for credit taking at least one course online in 2012. However, students are less likely to complete an online course than a traditional course, and they are less likely to complete an online course with a passing grade. These are among the key findings of a report released today by the Public Policy Institute of California (PPIC). It is based on longitudinal student- and course-level data from all 112 community colleges.

This tumblr is full of “Big Data ____” articles. We find this particular article interesting because it uses the program Signals as its focal point.