"Sabermetrics Get Soft"

While this article from Grantland is not about education research or policy, we thought this passage, focusing on changes being made by baseball’s Houston Astros, interesting: 

Last year, the Astros hired a new bench coach, Eduardo Perez, who was receptive to front-office recommendations. With Perez (who’s now an ESPN analyst) on hand to help corral rebellious players, the Astros shifted 96 times in April. Then, Luhnow said, “The complaints started to come from the pitchers, from some of the infielders, from the media, from basically anybody out there, and sure enough, as the season wore on, we found more and more reasons not to do the shift.” The Astros’ shift totals fell to 33 in May and 32 in June (though their rate rebounded later in the year, particularly in September).

As a result, Luhnow said, the Astros realized that in order to make an experiment stick, “You’ve got to market it to the people that are involved.” This spring, the Astros spent an hour explaining the thought process behind the shift to the team’s pitching staff, building a tool to display evidence that would answer any questions the players might ask. “I think they weren’t completely satisfied, but I think they felt like we had at least given them a lot more information, and this year we haven’t had anywhere near the pushback from our pitching staff that we did last year,” Luhnow said. Houston also started shifting in the minor leagues to ease the adjustment to the majors. This year, the Astros are leading the majors in shifts by a wide margin, and their monthly totals tell a more consistent story: 272, 263, 208, and 216, with a pace that would put them close to 300 in August.

Does this process sound familiar to the institutional researchers out there? 

On a related note, Leonhardt repeats his errors in overstating the resources available to the middle class and in failing to recognize the role that government-financed aid plays in keeping college prices down. Given Amherst’s endowment of more than $1.8 billion, taxpayers could reasonably ask why the school should be allowed to benefit from funds provided by the federal Pell Grant program and federally backed student loans. Those funds help the school first and foremost, and they leave students and families far short of what they need to actually afford such a pricey institution.

That question is a major theme of a 213-page report released on Monday by a committee at the Massachusetts Institute of Technology exploring how the 153-year-old engineering powerhouse should innovate to adapt to new technologies and new student expectations.

“The very notion of a ‘class’ may be outdated,” the report argues. That line appears in the context of online courses, but one of the report’s authors, Sanjay Sarma, who leads MIT’s experiments with massive open online courses, said in an email interview that the sentiment could apply to in-person settings as well.


"Actually, because I enjoy this type of thing – I could make a pretty strong argument that achievement of these outcomes will get better under the revised structured pathways model.  At the moment, under the traditional model, it’s up to the student to arrange the 10-14 courses from the long lists of possibilities, and we make the assumption that somehow – magically – they will do so in a way that will arrange 10-14 courses that produce a high level of achievement of the general education outcomes.  I have to say, this seems an odd argument to me – and the evidence from industry suggests it may not be working as well as we might like.  It also seems that if we empower the subject matter experts – discipline faculty in the fields in which students are graduating – to do as Sinclair CC in Ohio did under CBD, and have each discipline’s faculty suggest a short list of general education electives that would be best for students who graduate in their disciplines, that we will have a much better sense of how the combination of classes arrange for each student.   Ultimately, our ability to monitor and improve students’ achievement of these general education outcomes – the hallmark of a liberal arts education – seems likely to improve under a structured pathways approach.”

-Rob Johnstone, Founder & President at the National Center for Inquiry & Improvement.


Speaking with Randy Lawson

by Ani Aharonian

Santa Monica College’s last self-study and site visit took place in 2010, with the next round scheduled for 2016.  It is no secret that when it comes to accreditation, one would be hard-pressed to find someone more knowledgeable than SMC’s Executive Vice President, Randal “Randy” Lawson. Randy serves as Accreditation Liaison Officer, has co-chaired the 1998, 2004, and 2010 accreditation self-studies, and has served on visiting accreditation teams at other colleges.  Randy graciously spoke to us about the last accreditation self-study and site visit and the preparations for the 2016 accreditation year.

Randy explains that the 2010 site visit could be thought of perhaps “as an example of ‘not a very good visit’ which was very disappointing to us because …we felt we were so well prepared particularly in comparison to 2004…” 

SMC’s accreditation was reaffirmed, and the college received praise for the high instructional quality it is known for, but it did receive nine recommendations for improvement. Some were relatively minor and easily addressed (e.g., although the college had ethics statements for each employee group, the team recommended developing a district-wide code of ethics).  However, two were of primary importance and required a follow-up with the commission in six months.  The first involved completing the college’s master planning process. Randy explained that the college had changed the planning process in 2005 and so had not yet completed a full cycle at the time of the site visit.  In addition to not having had an opportunity to experience a full cycle in order to clarify the entire process and evaluate its components, the college’s research function was very weak at the time due to very recent staffing changes and a staffing shortage in Institutional Research.  The Office of Institutional Research is critical to supporting institutional effectiveness and improvement by assisting the various campus constituents with ongoing and systematic evaluation processes; therefore, the second task of primary import was to increase and strengthen the research function of the college.

Recommendation 1: To meet the standards, the team recommends that the college complete the development of a sustainable comprehensive master planning process with the Master Plan for Education at its core. The resultant multi-year plan should contain explicit links to instructional and student services programs, human resources, facilities, technology, and other planning needs that are revealed by the program review process or other assessments of institutional effectiveness. The team further recommends that the college work to achieve among its constituents a uniform understanding of the planning cycle and documentation processes through a mechanism accessible to all audiences regardless of their previous experience with the institution (Standard I.A, I.A.1, I.A.4, I.B.1, I.B.3, I.B.4, I.B.6, I.B.7, II.A.1.a, II.A.1.c, II.A.2.f, III.A.6, III.B.2.b, III.C.2, III.D.3, IV.A.5, and IV.B.2.b).

Recommendation 3: To meet the standards, the team recommends that the college evaluate the efficacy of the current staffing model for the institutional research function with a goal of providing timely, in-depth analysis of effectiveness measures and other key institutional metrics to move the college toward the goal of becoming a culture of evidence (Standards I.B.3, I.B.4, I.B.6, I.B.7, II.A.1.c, II.A.2.e, II.A.2.f, II.A.2.g, and II.B.3).

The college had a very short time frame of about four months in which to address these recommendations in a follow-up report to the ACCJC. In hindsight, Randy feels the visiting team did SMC a great favor: “…it made us get to work and complete things, and it probably got us past any minor bickering that we might have had along the way about how we were going to do this.”  In that short time, SMC’s master planning process was finalized, and the Master Plan for Education, the college’s core planning document, was revised to describe more clearly the planning process and the interrelationships of its many components.  The Office of Institutional Research (IR) expanded the data available on the college’s website, and the college developed two new employment classifications (research analyst and senior research analyst) and quickly began recruiting for a research analyst.

The SMC community, particularly the District Planning and Advisory Committee (DPAC), banded together and worked hard, making what Randy describes as “incredible progress”: SMC “moved from needing to do this follow-up report to being asked, the following spring, to present at the Northern Regional Workshop for the ACCJC on the relationship between program review and the integrated planning process.”  In other words, in a very short time, SMC went from receiving a serious recommendation for improvement of its planning process to being held up as exemplary in its master planning.

Through his role as Accreditation Liaison Officer, and as a member of several visiting teams evaluating other colleges, Randy has observed that planning, program review, and student learning outcomes (SLOs) have become, and are likely to remain, major areas of focus in accreditation: “… there is more of a trend for those kinds of things to be the reason why a college is asked to do a follow-up report or, in more extreme cases, placed on warning or other kinds of sanction or so on.”  With most colleges now having had an opportunity to set up processes for planning, program review, and SLOs, Randy believes the focus will shift to how well colleges evaluate and assess these processes and use the resulting data for continuous improvement.  SMC’s current processes were designed to make this easier to accomplish, and Randy is confident that the college is doing this well as an institution, having made significant improvements in setting, meeting, and following up on annual objectives in a meaningful way.  Randy recommends the Master Plan for Education and the Institutional Effectiveness Report to those seeking a deeper understanding of the planning process at SMC and the institutional goals that guide it.

In preparation for the 2016 self-study and subsequent site visit, the college is now forming teams for all accreditation standards and sub-standards and recently sent out a “Call for Volunteers” in a district-wide email.  Participating in the accreditation process by joining one of these teams and taking part in the discussions is an excellent way for new faculty and classified staff to get involved.  Randy encourages any interested employees to do so: “Participating may mean just being on one of the committees, and it doesn’t necessarily mean you have to write something… Being involved is just being part of the discussion.  We think the broader that discussion can be, the better.  So we want to encourage people to join one of the standard committees or subcommittees in an area that interests them.”  Volunteers who wish to do so are even encouraged to branch out and join committees in areas different from those they work in as a way to learn more.  Randy stresses that no particular background or knowledge is necessary and that “in some cases it is better to have someone who doesn’t know anything about that standard there because their questions are often the kinds of questions that someone on a [visiting] team might ask or they might notice something that is a small weakness that we who know it well may overlook.”

While SMC prepares for accreditation, any significant findings will be shared with the college as one of the first steps in addressing any campus needs. “If we notice something that needs to be addressed, and we can address it now or at least begin to address it now, it’s a strength and it puts us ahead of the game.”

If you are interested in learning more about accreditation in general or about the planning process at Santa Monica College, please consult the reading list of helpful resources and documents below: 

Read More


by Ani Aharonian

Educational accreditation in the United States is essentially a peer-review, or self-regulatory, quality assurance process.  Whereas in most countries a governmental office oversees education (e.g., a ministry of education), in the United States independent, but federally recognized, regional accrediting organizations formed from member institutions ensure educational quality.  While the federal government does not directly oversee accreditation, only accredited institutions are eligible to receive federal financial aid dollars.  The Western Association of Schools and Colleges (WASC) has jurisdiction over the states of California and Hawaii as well as East Asia and several Pacific Island locations.  WASC has two accrediting commissions: the Accrediting Commission for Senior Colleges and Universities (ACSCU) for 4-year colleges and universities and the Accrediting Commission for Community and Junior Colleges (ACCJC) for 2-year colleges.  Santa Monica College is a member of and is accredited by the ACCJC.  By becoming members of the ACCJC, colleges agree to meet its eligibility requirements and standards and to participate in the accreditation process, which involves carrying out a self-study and undergoing peer review on a 6-year cycle.  According to the ACCJC, meeting the accreditation standards provides assurance to the public that an institution provides a high-quality education and is committed to continuous improvement.


What are the accreditation standards?

Standard I: Mission, Academic Quality and Institutional Effectiveness, and Integrity

The college emphasizes student learning and achievement, continuously and systematically uses data for the evaluation, planning, and improvement of educational programs and services, and demonstrates integrity in all policies, actions, and communications.

Standard II: Student Learning Programs and Support Services

Programs are of appropriate quality and rigor; quality is assessed, and the results of assessments are made publicly available and used for the improvement of educational quality and institutional effectiveness.

Standard III: Resources

The institution uses its human, physical, technology, and financial resources to achieve its mission and to improve academic quality and institutional effectiveness.

Standard IV: Leadership and Governance

The college recognizes and uses the contributions of leadership to promote student success and to sustain academic quality, integrity, fiscal stability, and continuous institutional improvement. The governing board, administrators, faculty, staff, and students work together for the good of the institution.

What is the accreditation process?

Accreditation occurs on a 6-year cycle, with each cycle beginning with the delivery of the ACCJC’s “action letter,” which outlines what action the commission is taking with regard to the accreditation status of the institution.  The accreditation process can be summarized in four phases.

  1. The first phase consists of a self-evaluation (previously called a self-study) in which the institution assesses its own adherence to ACCJC eligibility requirements and accreditation standards and develops a plan for improvement in a written Self-Evaluation Report of Educational Quality and Institutional Effectiveness.  The college must provide sufficient evidence to support the claims made in the Self-Evaluation.
  2. The second phase is the external evaluation, during which a team of evaluators from other ACCJC-accredited institutions visits in order to validate the assertions and evidence in the Self-Evaluation Report.  The external team then writes an evaluative report, called the External Evaluation Report of Educational Quality and Institutional Effectiveness, and makes recommendations for actions the college should undertake to meet standards or to improve.
  3. During the third phase, the ACCJC reviews both the self- and external evaluations and makes a decision about the accreditation status of the institution.  The ACCJC sends an “action letter” to the institution, which outlines the commission’s decision and may provide recommendations to meet standards and make improvements.
  4. In the fourth and final, “follow-up” phase of the accreditation process, the institution engages in a follow-up and improvement process based on the recommendations from the ACCJC’s action letter, the External Evaluation Report, and the Self-Evaluation Report.

2014 Institutional Effectiveness Report


By Hannah Lawler

During the May 2014 Board of Trustees meeting, the Office of Institutional Research presented a summary of Santa Monica College’s (SMC) 2014 Institutional Effectiveness Report. If you missed the riveting discussion of student success, target goals, and institution-set standards, you are in luck, as the current blog post is dedicated to getting you acquainted with the 2014 Institutional Effectiveness Report and Dashboard!

Institutional Effectiveness (IE) is the systematic and continuous process of measuring the extent to which a college achieves its mission. The primary purpose of IE is to advance educational quality and institutional improvement through the review and discussion of the college’s performance on key data metrics that are aligned with the college’s major goals. The IE process is intimately linked with the accreditation self-evaluation process as current ACCJC standards require institutions to systematically and regularly evaluate institutional quality through the collection and analyses of data for planning and improvement of student learning and success, and college practices and programs.

At SMC, the IE data metrics, called “key indicators” (KIs), are organized by the five supporting goals (major areas) of SMC:

  1. Innovative and Responsive Academic Environment (instructional programs and curriculum),
  2. Supportive Learning Environment (academic and student support services),
  3. Stable Fiscal Environment (fiscal operations),
  4. Sustainable Physical Environment (physical infrastructure), and
  5. Supportive Collegial Environment (human resources and collegiality).

The data are organized into six dashboards. Dashboards are visual tools that monitor the college’s performance on the KIs. Five of the six dashboards correspond to the supporting goals of the college, and the sixth contains KIs identified as institutional priorities by the District Planning Advisory Council (DPAC) and the Academic Senate Joint Institutional Effectiveness Committee. The “Institutional Priorities Dashboard” is aligned with SMC’s mission, current and former strategic initiatives, the objectives of the Master Plan for Education, and the Board of Trustees’ goals and priorities.  The six dashboards, when reviewed together, provide a balanced view of institutional effectiveness at SMC.
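To make the dashboard structure concrete, here is a minimal sketch in Python of how KIs could be grouped by supporting goal, with a sixth dashboard collecting institutional priorities. The indicator names and values are invented for illustration; the actual KIs are listed in the reports linked below.

```python
# Hypothetical sketch: grouping key indicators (KIs) into dashboards.
# The goal names come from the post; indicators and values are invented.

from collections import defaultdict

# Each KI: (name, supporting goal, current value)
key_indicators = [
    ("Course Success Rate", "Innovative and Responsive Academic Environment", 68.2),
    ("Degrees Awarded", "Innovative and Responsive Academic Environment", 1450),
    ("Counseling Contacts per Student", "Supportive Learning Environment", 2.1),
    ("Reserve as % of Budget", "Stable Fiscal Environment", 12.5),
]

# KIs flagged as institutional priorities (an arbitrary choice for this sketch)
priorities = {"Course Success Rate"}

# Five dashboards mirror the five supporting goals; a sixth collects priorities.
dashboards = defaultdict(list)
for name, goal, value in key_indicators:
    dashboards[goal].append((name, value))
    if name in priorities:
        dashboards["Institutional Priorities"].append((name, value))

for dashboard, kis in dashboards.items():
    print(dashboard)
    for name, value in kis:
        print(f"  {name}: {value}")
```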

The following bullet points highlight some of the findings from the 2014 IE Report:

  • SMC meets or exceeds the institution-set standards for 22 of the 23 student success indicators, including number of transfers to UC/CSU, number of degrees and certificates awarded, course success, CTE license examination pass rates, and ILO mastery rates.
  • The College fell below the standard set for the CTE Completion Rate (42.0%), missing the institution-set standard of 43.8% by 1.8 percentage points.
  • SMC fell below the target goals for four indicators: Persistence Rate, Basic Skills Course Improvement Rate, Basic Skills Transition to Degree Course Rate, and CTE Completion Rate.
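The arithmetic behind these flags is straightforward: an indicator meets its institution-set standard when its current value is at or above the standard, and any shortfall is measured in percentage points. Here is a minimal sketch using the CTE Completion figures reported above (the target goal value is invented for illustration):

```python
# Flag indicators against institution-set standards and target goals.
# The CTE Completion actual (42.0) and standard (43.8) come from the report;
# the target goal value here is invented for illustration.
indicators = {
    "CTE Completion Rate": {"actual": 42.0, "standard": 43.8, "target": 50.0},
}

for name, v in indicators.items():
    gap = v["actual"] - v["standard"]
    std_status = "meets" if gap >= 0 else f"misses by {abs(gap):.1f} points"
    tgt_status = "meets" if v["actual"] >= v["target"] else "below"
    print(f"{name}: actual {v['actual']}% vs. standard {v['standard']}% ({std_status}); "
          f"target {v['target']}% ({tgt_status})")
```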

Want to learn more about institution-set standards and target goals? The following blog post describes what standards and targets are in the context of institutional effectiveness at SMC: http://irsmc.tumblr.com/post/92439784701

The IE reports and dashboards are updated on an annual basis. To access past and current reports, visit the Institutional Effectiveness Dashboard website: www.smc.edu/iedashboard.

CIRCLE Study of SMC Student Voting Patterns


By Christopher Gibson

Like most of you, I first assumed C.I.R.C.L.E. to be either an organization planning to cut James Bond in half with a laser beam or some robe-wearing offshoot sect of the Illuminati.  I was pleasantly surprised to learn that the Center for Information & Research on Civic Learning and Engagement conducts research on civic and political engagement in the American education system. Recently, SMC participated in a study conducted by CIRCLE on student voting rates in the 2012 federal election. To perform the study, CIRCLE obtained, with SMC’s permission, data on SMC students through external data sources.  Here is the report (PDF).

Some of the findings discussed in the study are:

  1. According to the CIRCLE study, 62.8%† of the eligible student population at SMC was registered to vote, and 69.6%† of those SMC students who were registered to vote actually voted.
  2. The trends for SMC groups’ voting participation reflect the trends for the nation.  Older people tend to vote at higher rates than younger people, even among students.
  3. According to this study, 19% of SMC students were absentee voters, which is in line with some national trends, although that national number does not focus on students alone, as this study does.

The results for SMC can be compared to the overall group comparison results here.

One interesting aspect of this study is that SMC did not provide CIRCLE with any institutional data directly.  CIRCLE used three external data sources (the Student Clearinghouse, Catalist, and NCES’s IPEDS) to complete the analysis, which enabled it to use the same method for the more than 260 participating campuses across the country.
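As a rough illustration of how such a match-based study can work, here is a hypothetical sketch in Python: enrollment records are matched against a voter file, records blocked from research use are excluded, and registration and turnout rates are computed over the remainder. The record layout and matching here are invented; CIRCLE’s actual methodology is described in its Report FAQ.

```python
# Hypothetical sketch of a match-based voting study.
# Field names and matching are invented; real studies link enrollment
# records to voter-file data through external providers.

students = [  # from an enrollment data source
    {"id": 1, "blocked": False},
    {"id": 2, "blocked": False},
    {"id": 3, "blocked": True},   # blocked records are excluded, per the footnote below
]
voter_file = {  # matched voter-file records, keyed by student id
    1: {"registered": True, "voted": True},
    2: {"registered": True, "voted": False},
}

eligible = [s for s in students if not s["blocked"]]
registered = [s for s in eligible if voter_file.get(s["id"], {}).get("registered")]
voted = [s for s in registered if voter_file[s["id"]]["voted"]]

print(f"Registration rate: {len(registered) / len(eligible):.1%}")
print(f"Turnout among registered: {len(voted) / len(registered):.1%}")

# With the reported SMC figures, overall turnout among eligible students
# would be roughly 0.628 * 0.696 ≈ 43.7%.
```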

For a complete list of what was pulled from which data source, and the methodology used to arrive at their conclusions, consult CIRCLE’s Report FAQ.

The report provided by CIRCLE goes on to list recommendations for increasing student engagement in the voting process, including suggestions like “Get political, not partisan” to encourage discussion without appearing biased, and “Improving voter registration efforts on campus may help to increase voter turnout” because, given the high likelihood that registered voters will actually turn up to vote, more registrations lead to more actual votes.

†These numbers do not reflect the 10,340 students who blocked their records from being used for research.  For the purposes of this study, it is assumed that voter registration and participation occur at the same rates for students who block such uses of their data as for students who do not.

"Slate education writer Dana Goldstein talks about the "flipped classroom," in which students watch lectures at home and do homework at school, with Jonathan Bergmann, a teacher and educational technology advocate, and Frank Noshcese, a science teacher who’s skeptical of flipping".