"Slate education writer Dana Goldstein talks about the "flipped classroom," in which students watch lectures at home and do homework at school, with Jonathan Bergmann, a teacher and educational technology advocate, and Frank Noshcese, a science teacher who’s skeptical of flipping".

In the spring of 2013, Laude and his staff sat down with the Dashboard to analyze the 7,200 high-school seniors who had just been admitted to the class of 2017. When they ran the students’ data, the Dashboard indicated that 1,200 of them — including Vanessa Brewer — had less than a 40 percent chance of graduation on time. Those were the kids Laude decided to target. He assigned them each to one or more newly created or expanded interventions. The heart of the project is a portfolio of “student success programs,” each one tailored, to a certain extent, for a different college at U.T. — natural sciences, liberal arts, engineering — but all of them following the basic TIP model Laude dreamed up 15 years ago: small classes, peer mentoring, extra tutoring help, engaged faculty advisers and community-building exercises.

Institution-set Standards vs. Target Goals


By Hannah Lawler

What are institution-set standards? And how are they different from target goals? As part of the 2016 accreditation self-evaluation process, Santa Monica College will be required to respond to a new standard:

The institution establishes institution-set standards for student achievement, appropriate to its mission, assesses how well it is achieving them in pursuit of continuous improvement, and publishes this information (Standard IB4).

If you are familiar with the 2013 or 2014 Institutional Effectiveness (IE) Reports, you should recognize the term “institution-set standards,” as the College began setting standards for all college-wide student success and achievement metrics (e.g., course success rates, student retention rates, and number of degree completers) in spring of 2013. Recent changes in federal guidelines have prompted the Accrediting Commission for Community and Junior Colleges (ACCJC) to include, as part of the accreditation process, an evaluation of whether institutions are setting appropriate standards of performance for student success and achievement metrics.

Institution-set standards are defined as the standards reflecting satisfactory performance in student learning and achievement. A standard can be interpreted as the minimum level of performance signaling that the college is meeting expectations for educational quality and institutional effectiveness. Target goals, also included in the IE Dashboards/Reports, differ from institution-set standards because targets are aspirational in nature. Targets are goals we hope to achieve in order to improve student learning and achievement and meaningfully move the needle on institutional effectiveness.

Institution-set standards are defined for each key indicator of the IE Dashboards that directly measures student performance, such as course success, transfer, degree completion, and license exam pass rates. The standards were initially set by the Academic Senate Joint Institutional Effectiveness Committee and reported to the college’s central planning body, the District Planning and Advisory Council (DPAC), and the Board of Trustees. The standards are reviewed for appropriateness every year. Records of SMC’s standards are kept at the Institutional Effectiveness Dashboard linked below.

Target goals are defined and monitored for all key indicators on the “Institutional Priorities” Institutional Effectiveness Dashboard. Targets were set by the primary sponsors or personnel directly responsible for or affected by a key indicator of an IE Dashboard (for example, the Dean of Counseling, Department Chair of Counseling, and the Transfer Center Faculty Leader set the target for the Transfer Rate metric). Historical targets for SMC are also kept on the Institutional Effectiveness Dashboard.

Institution-set standards and targets are critical in telling us how well we are doing in terms of student success and achievement. When the College falls below a standard or is not meeting a target goal, the data is brought to the attention of multiple campus groups, including DPAC, and subsequently, plans to improve performance on the measure are developed.
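To make the distinction concrete, here is a minimal sketch of the flagging logic described above. The indicator names and numbers are entirely invented for illustration; they are not SMC's actual standards, targets, or figures.

```python
def classify(value, standard, target):
    """Compare a metric to its institution-set standard (the floor)
    and its target goal (the aspiration). Illustrative only."""
    if value < standard:
        return "below standard"      # escalate: an improvement plan is developed
    if value < target:
        return "meets standard, below target"
    return "target met"

# Hypothetical indicators: (current value, institution-set standard, target goal)
indicators = {
    "course_success_rate": (0.68, 0.65, 0.72),
    "fall_to_spring_retention": (0.74, 0.78, 0.82),
    "degree_completers": (1450, 1500, 1700),
}

for name, (value, standard, target) in indicators.items():
    print(f"{name}: {classify(value, standard, target)}")
```

The key design point is that a single metric is checked against two thresholds: falling below the standard triggers escalation to the planning bodies, while sitting between standard and target simply means the aspirational goal has not yet been reached.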

Want more on institution-set standards and target goals? Visit the following resource pages or email us at research@smc.edu:

  • To see the newly adopted (as of June 2014) criteria that will be used to evaluate community colleges, see the ACCJC Accreditation Standards:


  • To see the directions given to ACCJC Visiting Teams regarding implementation of new U.S. Department of Education regulations, see this memo from the ACCJC:


  • To see a yearly record of SMC’s institution-set standards and targets and whether each was met, see the SMC Institutional Effectiveness Dashboard



SMC Accreditation Preview: The Data!

If you find yourself browsing through SMC’s 2010 ACCJC Self-Study (also known as the Self-Evaluation Report), you may notice the term “Institutional Research” pops up a mere… 35 times.

So it’s safe to say that data plays an important part in any well-written accreditation self-study report. While your humble blog curator was not involved with previous self-studies, and will have to make a daring escape from his LS office to participate in our upcoming process (hmm, I wonder if this guy’s available), he does have some guesses as to what you may see included in the future.

  1. ILO & SLO Data: Obviously a big component of data in the ACCJC Self-Evaluation Report includes reporting of Institutional Learning Outcomes (ILOs) and Student Learning Outcomes (SLOs). Since the 2010 study, the college has fully implemented ILOs and SLOs. As an example of the reporting we do, the program ILO data is publicly available here.
  2. Completion Data: Want to know how many students earned degrees? Certificates? Look no further. Hey, want to know how long it takes students who intend to transfer to do so? Check it out here. Having students complete their college goals is the ultimate goal of the college, and you can bet we’ll find that information in a proper self-study.
  3. Program Review: How well are our programs doing in accomplishing their goals? How effective are our intervention programs? Flip through our program review documents at SMC’s ProgRev site, and you’ll see report after report brimming with evaluation information.
  4. THE DASHBOARD!: How many students begin in our Basic Skills sequence? How many students persist to the next term? What does our Equity Gap look like? All this (and more!) you can find in the 2014 Institutional Effectiveness Dashboard. As one of the points of emphasis in the 2010 report, the IE reports are of paramount importance to the college’s evaluation efforts.

Interested in more you say? Can we interest you in our newly remodeled IR front page?

Online course enrollment in California’s Community Colleges (CCC) has grown remarkably in the last 10 years, with nearly 20 percent of students taking credit courses having taken at least one online course in 2012. However, students are less likely to complete an online course than a traditional course, and they are less likely to complete an online course with a passing grade. These are among the key findings of a report released today by the Public Policy Institute of California (PPIC). It is based on longitudinal student- and course-level data from all 112 community colleges.

This tumblr is full of “Big Data ____” articles. We find this particular article interesting because it uses the program Signals as its focal point.


Are you watching Obama’s Q&A with Tumblr founder David Karp? They’re discussing education and student debt at the White House.
Our May report found that households headed by young adults owing student debt lag far behind their peers in terms of wealth accumulation. Dig into the data here. 



Accelerated Courses Pt. 1

As part of a larger rethinking of the basic skills, or pre-collegiate, sequence, a significant amount of research has been done on community colleges’ efforts to implement course acceleration practices. Accelerated courses are just that: courses that take a large amount of material (usually from two or more courses) and present it in a shorter time frame (usually one term).

The thinking behind these changes is that students are finding themselves “stuck” in long basic skills sequences, sometimes taking up to four courses before they are able to enroll in college-level (and, more importantly, transferable) courses in math and English. The longer students spend taking courses, the more opportunities they have to drop out. By providing a faster and more challenging pathway toward completion, students will be much more likely to complete transfer-level math and English, and consequently much more likely to complete degrees, certificates, or transfer.
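The arithmetic behind this reasoning can be sketched quickly. Assuming, purely for illustration, that students pass and persist through each course at the same independent rate (a back-of-the-envelope simplification, not an empirical model), the share who survive a sequence shrinks multiplicatively with its length:

```python
def sequence_completion(per_course_rate, num_courses):
    """Probability of passing and persisting through every course in a
    sequence, assuming the same independent rate at each step.
    Purely illustrative; real persistence rates vary by course."""
    return per_course_rate ** num_courses

# With a hypothetical 70% pass-and-persist rate per course:
four_course_path = sequence_completion(0.70, 4)  # about 24%
two_course_path = sequence_completion(0.70, 2)   # about 49%
```

Under these hypothetical numbers, cutting a four-course sequence to two courses roughly doubles the share of students who make it through, which is the intuition driving acceleration.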

Just in the last two months, two major studies on accelerated courses have been published online.

The RP Group, in conjunction with 3CSN, released “Curricular Redesign and Gatekeeper Completion: A Multi-College Evaluation of the California Acceleration Project.” The California Acceleration Project was an intervention that provided support to colleges interested in redesigning the math and English basic skills sequences by shortening the time in remediation by at least one semester. The study included data from 16 California community colleges and almost 2,500 students who enrolled in accelerated courses.

Their study showed a significant increase in the number of students who completed a transfer-level course. They found that the intervention “showed higher outcomes…regardless of demographics such as ethnicity, gender, financial need, disability status, and prior English as a second language course taking.” The study showed that these gains were also evident at different basic skills placement levels.

Similarly, the Community College Research Center published a study that focused on the English sequence acceleration efforts of Chabot College. Their study showed that “participation in the accelerated courses was positively associated with a range of positive short-, medium- and long-term outcomes including entry-level College English completion.”

In an effort to increase the number of SMC students who progress through our English and math basic skills sequences, the college began offering accelerated courses in both subjects. In our next post on acceleration, we will provide some preliminary data on these efforts.

Links, Damned Links and #HETTSLINKS

Aside from the never-ending barrage of research requests, our office has also been inundated with awesome education-related links! Having “run” this blog for a few years now, we are noticing a significant uptick in important education policy and research news items that are must-reads for higher-ed folks.

While we comb through and gather our thoughts for longer posts on these topics, we thought it useful to give everyone a nice reading list of items we have seen during the last few days.

From our Dean, Dr. Lawler

  • The LA Times highlights a study on the achievement gaps between Asian American and white students that focuses on non-cognitive differences.
  • NASH, the National Association of System Heads, released a study on the functions of IR offices around the country. In short, IR offices need to get bigger and need to provide support for a wider array of college functions.
  • A Slate article (we’ll let you decide if it’s a #SlatePitch) on the problems of student evaluations of faculty. It argues that currently available alternatives to faculty evaluation still leave a lot to be desired.
  • The Data Points blog from the Chronicle of Higher Education has a post on the difficulty of measuring diversity at colleges. While it offers two approaches to tackling this issue (using a diversity index and comparing college demographics with the state’s), it notes that there is still a limit to what exactly we can measure.
  • The Christian Science Monitor has a high level summary of some of the efforts the country’s colleges are making to reduce the “transfer deficit”.

And as a special treat, some #HETTSLINKS.  These are just some of the articles you’ll find on LBCC’s own #IRFamous Director of Institutional Effectiveness, Dr. Hetts’ twitter account:

  • EdSource has a summary on Assembly Bill 1451, which would allow high schools and colleges to enter into formal co-enrollment program partnerships. It highlights a pilot-program that allowed 10 colleges to fund dual enrollment programs that increased the likelihood of graduation. 
  • Off The Charts, a blog from the folks at the Center on Budget and Policy Priorities, has a post on how higher ed funding remains “well below pre-recession levels”. It also contains what should be a pretty chilling chart for anyone who lives in Arizona.
  • The Washington Post has a summary of new research into the problems with students using laptops during class. The ease of taking notes causes students to “use long verbatim quotes, which they type somewhat mindlessly.” Students who took handwritten notes had a better understanding of the material presented.
  • The Root has a great article titled “9 Biggest Lies About Black Males and Academic Success.”
  • The Atlantic has an essay by a community college philosophy professor on the merits of the humanities.
  • And finally, a Press Telegram summary of a forum held at LBCC that included a speech from Joanna Newsom's cousin.


As mentioned around these parts before, the Statway math program is an attempt to accelerate students’ progress through the basic skills math sequence. Students enroll in a remedial algebra course and then proceed to enroll in a transfer-level statistics course. This process is completed in one academic year. The program includes the creation of learning communities for professors, and significant student support.

This LA Times article shares Pierce College’s experience with the program, which includes this important bit of information:

The Cal State system accepts Statway for transfer credit on a temporary basis, but University of California administrators have ruled several times that the course isn’t rigorous enough for their standards.

"The faculty at UC are interested in alternative pathways, but, so far, Statway has not reached the level of quality we expect," said George Johnson, a UC Berkeley mechanical engineering professor who has reviewed courses.

Pierce administrators believe that they have temporarily solved the problem by combining the Statway curriculum with another course that was already accepted for UC transfer, although they say they want Statway to be recognized by the state system.

UC administrators are scheduled to review Statway again this spring and Martinez said he thinks that it should pass.

"We’re math people, we don’t push things unless we have the data to show we’re right," he said. "And I think we’re right."

You can click here for more information on the Statway program.


Does your house have a flush toilet? What time did you usually leave home to go to work last week? What was your total income during the past 12 months? Do you have trouble concentrating, remembering, or making decisions because of a physical, mental, or emotional condition?

These are some of the questions the Census Bureau is considering eliminating from the American Community Survey due to privacy concerns.


Since 2000, the U.S.-born Latino population has grown at a faster rate than the immigrant population. As a result, the foreign-born share of Latinos is now in decline.

