<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>New England Board of Higher Education &#187; Community College of Vermont</title>
	<atom:link href="http://www.nebhe.org/tag/community-college-of-vermont/feed/" rel="self" type="application/rss+xml" />
	<link>http://www.nebhe.org</link>
	<description>NEBHE</description>
	<lastBuildDate>Mon, 12 Aug 2013 19:54:35 +0000</lastBuildDate>
	<language>en-US</language>
	<sy:updatePeriod>hourly</sy:updatePeriod>
	<sy:updateFrequency>1</sy:updateFrequency>
	<generator>http://wordpress.org/?v=3.5.1</generator>
		<item>
		<title>Implementing System-Level Graduation Standards</title>
		<link>http://www.nebhe.org/thejournal/a-look-at-implementing-system-level-graduation-standards/?utm_source=rss&#038;utm_medium=rss&#038;utm_campaign=a-look-at-implementing-system-level-graduation-standards</link>
		<comments>http://www.nebhe.org/thejournal/a-look-at-implementing-system-level-graduation-standards/#comments</comments>
		<pubDate>Mon, 21 Nov 2011 11:00:06 +0000</pubDate>
		<dc:creator>John O. Harney</dc:creator>
				<category><![CDATA[Analysis]]></category>
		<category><![CDATA[College Readiness]]></category>
		<category><![CDATA[Commentary]]></category>
		<category><![CDATA[Demography]]></category>
		<category><![CDATA[Economy]]></category>
		<category><![CDATA[Homeslide]]></category>
		<category><![CDATA[Journal Type]]></category>
		<category><![CDATA[Students]]></category>
		<category><![CDATA[The Journal]]></category>
		<category><![CDATA[Topic]]></category>
		<category><![CDATA[Trends]]></category>
		<category><![CDATA[assessment]]></category>
		<category><![CDATA[Carol Moore]]></category>
		<category><![CDATA[Castleton State]]></category>
		<category><![CDATA[City University of New York]]></category>
		<category><![CDATA[Community College of Vermont]]></category>
		<category><![CDATA[graduation standards]]></category>
		<category><![CDATA[Johnson State]]></category>
		<category><![CDATA[Karrin Wilks]]></category>
		<category><![CDATA[Lyndon State College]]></category>
		<category><![CDATA[Vermont State Colleges]]></category>

		<guid isPermaLink="false">http://www.nebhe.org/?post_type=thejournal&#038;p=11309</guid>
		<description><![CDATA[<p> Driven by external pressure for increased accountability and internal pressure for improved learning outcomes, colleges across the country have been developing and refining assessment systems for several decades. In some cases, assessment results have significant positive impact, for example, when used to enhance teaching and learning or as a lever for organizational change. In ...]]></description>
				<content:encoded><![CDATA[<p>Driven by external pressure for increased accountability and internal pressure for improved learning outcomes, colleges across the country have been developing and refining assessment systems for several decades. In some cases, assessment results have significant positive impact, for example, when used to enhance teaching and learning or as a lever for organizational change. In other cases, the results have little impact, are not seen as useful, or were not designed for program improvement purposes in the first place. Assessment can have substantial negative effects as well, including ill will among faculty or other key constituents, reputational damage or reduced funding.</p>
<p>In 1999, the Vermont State Colleges (VSC)—comprising Castleton, Johnson and Lyndon state colleges, the Community College of Vermont and the Vermont Technical College—initiated a systemwide planning process that identified multiple strategic initiatives, including several designed to improve outcomes assessment and accountability. One initiative called for the establishment of common graduation standards for all students across the five colleges, at both the associate and bachelor’s levels. The board of trustees wanted to provide a “guarantee” to the public and employers that every graduate of the VSC could demonstrate essential skills for success after college.</p>
<p>The chancellor established a systemwide steering committee to oversee the graduation standards initiative. The committee included faculty representatives from each college and academic deans, and was co-chaired by the academic vice president of the system and the president of one of the four-year colleges. Faculty on the committee were expected to serve as liaisons to the faculty assemblies on each campus, to allow for broader faculty input and to facilitate endorsement of the committee’s plan. Likewise, reports were provided frequently for the state colleges’ Council of Presidents, the chancellor and the broader VSC community.</p>
<p><strong>Areas of competency</strong></p>
<p>The steering committee ultimately proposed six areas of competency: writing, quantitative reasoning, information literacy, oral communication, civic engagement, and critical thinking. Facing significant opposition to the entire initiative from a vocal group of faculty, the steering committee formed faculty-majority subcommittees to define the outcomes and propose assessment strategies for each standard. Several months into this process, <em>civic engagement</em> and <em>critical thinking</em> were permanently tabled as the subcommittees were sharply divided about the feasibility of valid assessment in those areas. This elevated the political challenges associated with assessing a limited set of skills rather than a broad set of learning outcomes such as those identified by the Association of American Colleges and Universities through Liberal Education and America’s Promise (LEAP).</p>
<p>Unexpectedly, it was easier to come to agreement about specific language for defining learning outcomes than about what to call the entire set of competencies. Faculty vehemently opposed the initial label of “minimum competencies,” on the grounds that it potentially conflated expectations for collegiate learning with those at the high school level. Faculty ultimately agreed to the term “graduation standards.” Of course, this semantic shift did not mitigate the challenges associated with establishing appropriate performance levels for the standards, made politically charged by the VSC’s public access mission and the fact that over 60% of students are the first in their families to attend college. Many expressed concerns about creating barriers to graduation. But by far, the most controversy centered on the assessment tool itself.</p>
<p>Fundamental methodological questions were debated. Would faculty design the assessments or would the VSC select commercially available instruments? Who would set the standards for passing? Would all students be assessed or would a sampling technique be employed? At what point in time would students be assessed? Ultimately the steering committee recommended a politically acceptable compromise—adoption of common statements of learning outcomes across the five colleges and agreement on a set of parameters for assessing the outcomes (including that every student would be assessed), while allowing each college to develop and implement campus-specific assessments for each standard. This plan satisfied the demands of the board of trustees and chancellor for common learning outcomes and a “guarantee” of minimum competency, and provided a mechanism for faculty buy-in at the campus level.</p>
<p><strong>Implementation</strong></p>
<p>The academic vice president in the system office worked closely with the college presidents and academic deans to ensure progress on the development of local assessments. The implementation timeline was staggered over a five-year period, beginning with the development of a writing assessment that met the requirements established by the steering committee. One college already had in place an institutional writing proficiency exam, and another had in place portfolio-based writing assessment. These models and others were shared among faculty and provided a foundation for the timely and relatively smooth implementation of writing assessments across the system.</p>
<p>The other three areas proved more difficult to implement. There was wide disagreement about the level at which students should demonstrate proficiency in quantitative reasoning, especially for students in STEM fields as opposed to those majoring in the humanities. There was disagreement about how to differentiate minimum competency in information literacy from what might be expected of high school graduates. Finally, there was ongoing confusion about how to differentiate expectations at the associate and bachelor’s levels. Concerns arose about the potential for wide variation across colleges in the performance levels being assessed, as well as in the overall quality of the assessments.</p>
<p>Several years into the implementation process, the academic vice president in the system office and academic deans at the colleges designed and implemented a process to regularly review the assessment methods and results at the colleges. In addition to annual monitoring of results across all assessments, one competency is evaluated comprehensively per year on a rotating basis. Faculty from across the colleges come together in a retreat format to reconsider the common learning outcomes, analyze local assessment methodology and results, and make recommendations to the presidents and chancellor for improving the process. This provides a mechanism for faculty to have a significant role in the ongoing improvement of the assessment system, while supporting the broader strategy of engaging faculty in assessment as part of the regular work of teaching and learning.</p>
<p>Given that writing was the first area to be implemented, it was also the first to be evaluated. As a result, revisions were made to the learning outcomes, as were recommendations for improving the reliability and validity of the local assessments. Writing faculty from across the system shared student writing samples and assessment rubrics, a process they found both useful and engaging, particularly given the opportunity for expanded colleagueship beyond the small departments in VSC colleges. Most recently, the assessment of information literacy was reviewed, which identified areas of concern in the current approach, including the wide variability in expectations across departments within colleges. Additionally, there was agreement that the standards and implementation are not rigorous enough in relation to intellectual property and the ethical use of information.</p>
<p><strong>Results and lessons learned</strong></p>
<p>Given that <em>all</em> students would be assessed across <em>all</em> standards, the instruments developed by faculty at each college were, in theory, high-stakes. The VSC policy remains that no student can graduate without demonstrating competency in all four graduation standards. However, as assessments were implemented and have now been in place for several years, very few students fail to pass the assessments in time to graduate. Students routinely require multiple attempts to pass (and benefit from a variety of academic supports in place to help them), but none of the colleges limited the number of times a student could attempt to demonstrate competence. The <em>de facto</em> pass rate, then, remains nearly 100%.</p>
<p>The perception of a high-stakes model may have brought about low standards (as did the original concept of “minimum” competencies). But the most consequential decision was to allow for the design of local assessments within a system-level model. This approach provided for substantial faculty ownership of the process but precluded any cross-college analysis or national benchmarking with similar institutions (although two colleges use a nationally normed online assessment of information literacy). Equally significant was the decision to measure competence at a single point in time rather than at multiple points in order to measure learning gains over time. While the notion of measuring the “value added” by a college degree is fraught with methodological problems related to isolating the effects of the institution (versus those resulting from maturation or experiences outside the institution), it has become the gold standard in outcomes assessment, particularly at a time when popular books such as <em>Academically Adrift: Limited Learning on College Campuses</em> (Arum and Roksa, 2011) have raised questions about the extent to which students learn anything at all in college. Further, measuring competence at a single point in time provides little insight into how students acquire skills and the extent to which particular curricular or pedagogical approaches impact learning gains.</p>
<p>To a large extent, the approach did not take advantage of the opportunity to aggregate and analyze system-level data to improve teaching and learning. Despite having a single administrative information system across the colleges, inadequate attention was paid to developing robust data-collection and analysis systems to support the graduation standards initiative. The strategy of early compromise was critical to ensuring faculty engagement in the assessment process, but it leaned too far in the direction of local autonomy. This manifests an inherent tension in higher education system leadership: supporting strong, unique colleges while maximizing the benefits of the system.</p>
<p>In other respects, the assessment approach did maximize the benefits of being a system. VSC policy remains that meeting the graduation requirements at one college also meets the graduation requirements at any other VSC college, despite the variation of assessment methodology. This benefits transfer students and encourages community college students to continue their studies in the VSC. Other benefits of the assessment model include systemwide awareness of national trends in assessment and accountability, faculty agreement on essential learning outcomes for all VSC graduates, and increased student awareness of performance expectations for college graduates.</p>
<p>Perhaps most valuable has been the annual systemwide retreat devoted to analyzing assessment methods and results in particular areas. In order for an assessment model to ultimately succeed as a means of improving learning outcomes, systemic processes must be in place at all levels to continually monitor, evaluate and strengthen the approach. The annual review process could potentially be enhanced through student involvement, reflecting the growing body of literature speaking to the potential benefits of engaging students in the study of teaching and learning. But by bringing together faculty from across colleges, systems have the opportunity to establish what the Carnegie Foundation calls “networked improvement communities,” which provide for highly structured, cross-functional, cross-institutional inquiry. Finally, the decision to focus on a limited set of outcomes, while for some creating the perception of diluting the greater purpose of a college education, provides the opportunity for in-depth analysis of how students learn a discrete set of skills commonly viewed as essential for success in and beyond college.</p>
<p><em><strong>Carol Moore </strong>is the past president of Lyndon State College and currently works as a consultant. <strong>Karrin Wilks</strong> is the past senior vice president of the Vermont State Colleges and currently serves as university dean for undergraduate studies at the City University of New York.</em></p>
]]></content:encoded>
			<wfw:commentRss>http://www.nebhe.org/thejournal/a-look-at-implementing-system-level-graduation-standards/feed/</wfw:commentRss>
		<slash:comments>1</slash:comments>
		</item>
		<item>
		<title>Book Review: Harnessing America&#8217;s Wasted Talent</title>
		<link>http://www.nebhe.org/thejournal/book-review-harnessing-americas-wasted-talent/?utm_source=rss&#038;utm_medium=rss&#038;utm_campaign=book-review-harnessing-americas-wasted-talent</link>
		<comments>http://www.nebhe.org/thejournal/book-review-harnessing-americas-wasted-talent/#comments</comments>
		<pubDate>Sun, 12 Dec 2010 12:00:41 +0000</pubDate>
		<dc:creator>John O. Harney</dc:creator>
				<category><![CDATA[Commentary]]></category>
		<category><![CDATA[Journal Type]]></category>
		<category><![CDATA[The Journal]]></category>
		<category><![CDATA[Topic]]></category>
		<category><![CDATA[Trends]]></category>
		<category><![CDATA[Alan R. Earls]]></category>
		<category><![CDATA[book review]]></category>
		<category><![CDATA[California State University at Monterey Bay]]></category>
		<category><![CDATA[Community College of Vermont]]></category>
		<category><![CDATA[for-profit]]></category>
		<category><![CDATA[Government Accountability Office]]></category>
		<category><![CDATA[Harnessing America's Wasted Talent]]></category>
		<category><![CDATA[Kaplan]]></category>
		<category><![CDATA[Peter Smith]]></category>

		<guid isPermaLink="false">http://www.nebhe.org/?p=7105</guid>
		<description><![CDATA[
<p>Harnessing America's Wasted Talent: A New Ecology of Learning, Peter Smith, Jossey-Bass, San Francisco, 2010</p>
<p>In 1970, I was a high school student in a suburban New England town. The invasion of Cambodia and the shootings at Kent State had brought spectacular illumination to the end of the academic year and dimmed hopes that the war ...]]></description>
				<content:encoded><![CDATA[
<p><img class="alignright size-medium wp-image-7109" title="peter smith book cover" src="http://www.nebhe.org/wp-content/uploads/peter-smith-book-cover1-203x300.jpg" alt="" width="203" height="300" /><strong><em>Harnessing America's Wasted Talent: A New Ecology of Learning, Peter Smith, Jossey-Bass, San Francisco, 2010</em></strong></p>
<p>In 1970, I was a high school student in a suburban New England town. The invasion of Cambodia and the shootings at Kent State had brought spectacular illumination to the end of the academic year and dimmed hopes that the war in Vietnam would soon be over. But optimism and idealism left over from the 1960s still percolated in our midst. That summer, a group of students, aided by a few like-minded parents and educators, came up with the idea of setting up a “free school” in town over the vacation period. Free schools, which at the time were springing up in cities and college towns across the country, were intended to be places where education would finally be democratized; teachers and students would be equals, and the focus would be on real learning rather than meeting pre-established academic standards or simply earning credits. Thanks to several thousand dollars in start-up funding, provided with some reluctance by the school committee, our free school began and flourished, albeit only for an eight-week run, during which we had free use of parts of the high school. It attracted people who had knowledge to share and people, young and old, who wanted to learn. Courses ranged from radio electronics and cooking to rock climbing, foreign languages and simulation games.</p>
<p>Sadly, our free school never managed a second act. By the following summer, idealism had turned to cynicism and the first signs of the decade's economic malaise had begun to make officials more parsimonious and everyone perhaps less experimental. However, having witnessed this wondrous phenomenon, I never entirely let go of the idea that education could be done differently.</p>
<p>Peter Smith, the author of <a href="http://books.google.com/books?id=gyEMiWxZLv8C&amp;printsec=frontcover&amp;dq=peter+smith+harnessing+America%27s+Wasted+talent&amp;source=bl&amp;ots=p6C5ftRlyZ&amp;sig=11HUIPPWNuypSbvUBHT7HfheShg&amp;hl=en&amp;ei=-m3-TPrHBYL78Ab0pMiRBw&amp;sa=X&amp;oi=book_result&amp;ct=result&amp;resnum=3&amp;ved=0CCcQ6AEwAg#v=onepage&amp;q&amp;f=false" target="_blank"><em>Harnessing America's Wasted Talent</em></a>, also has had occasion to see education from different vantage points, thanks to a long and varied career in education and politics. Founding president of Community College of Vermont and California State University at Monterey Bay, Smith has also served as Vermont's lieutenant governor and as a Vermont congressman. In recent years, he has authored a slew of books serving up thoughtful critiques of American higher education along with nostrums rooted in his experience.</p>
<p>On a perhaps more controversial note, Smith currently serves as vice president of academic strategies and development at Kaplan University, one of more than a dozen for-profit institutions skewered by the Government Accountability Office for allegedly deceptive statements made to investigators posing as applicants. And for the most part, for-profits are anathema to mainstream educators.</p>
<p>Leaving aside any temptation to shoot the messenger, though, Smith's arguments come across as both persuasive and simple without being simplistic. His central thesis, what he calls his “Law of Thirds” is that higher education has done a generally good job of serving the needs of the “top” one-third of learners who have the means and/or the skills to access and navigate the formal structures of K-12 learning and the college world that follows. However, the remaining two-thirds of learners either never make it out of high school or graduate but do not go on to college. This, he says, is not good enough given that so much job growth is in fields requiring advanced skills.</p>
<p>The cure he proposes is not dismantling higher education, nor does he really fault the higher education “establishment.” Instead, he suggests that higher education is simply “maxed out” and cannot and should not be expected to solve the two-thirds problem by itself. It is what he characterizes as a cottage industry rather than a system—with each school issuing its own currency in the form of academic credits. Still, despite its faults, he is largely content to let much of the higher ed establishment do what it has been doing, often with great success.</p>
<p>What does need to change, he argues, is the notion that traditional schools, traditional curricula, traditional classrooms and traditional methods for assessing and awarding credit should remain the only way to serve up education. Like the American automobile industry, which fattened on cheap petroleum and government-subsidized highways and ignored foreign challenges for too long, the education establishment must recognize that change has arrived and a revolution is brewing, Smith writes.</p>
<p>With so many people effectively excluded from the benefits of higher education, with a deep and persistent need for more skilled and capable people in the workforce, and with unlimited quantities of information on the web and communication technologies that have grown ubiquitous and cheap, Smith says America can no longer wait for miracles that will never happen. He points out that the U.S. is the only developed nation where younger workers are less educated than older workers. Therefore, he suggests, educators must devise ways to recognize learning in all its forms and engage learners from cradle to grave, using more innovative methods and recognizing each individual’s personal learning capabilities.</p>
<p>One of the solutions he proposes is the creation of Colleges of the 21<sup>st</sup> Century (C21C). Instead of focusing on exclusion—with admission standards as the gate—he says, “For the first time in history, we have the knowledge and the tools available to educate through new designs,” including “emerging information technology.”</p>
<p>C21Cs will, in his vision, thoroughly personalize learning, connecting it to all aspects of life and ensuring the mobility of credit and credentials so no one will be left out of the system. For example, C21Cs would find ways to identify and recognize learning done on the job, in the home and through leisure. The competent and intelligent people who often hold crucial positions in our world—albeit without benefit of formal credentials—would be embraced and given opportunities to grow. In the end, he writes, “the new ecology of learning will change forever the balance of power between the learner and his or her learning.”</p>
<p>Smith’s vision of a democratized, wide-ranging and humanized education system is everything an idealist might hope for, supplemented by plausible means of implementation that should satisfy the pragmatist. It will be interesting to see how far he gets.</p>
<p>_______________________________________________________________________</p>
<p><em>Reviewed by <a href="http://www.alanearls.com/" target="_blank">Alan R. Earls</a>, a Boston-area writer.</em></p>
]]></content:encoded>
			<wfw:commentRss>http://www.nebhe.org/thejournal/book-review-harnessing-americas-wasted-talent/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
	</channel>
</rss>
