<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	>
<channel>
	<title>
	Comments on: The Smarter, Balanced Sample Items	</title>
	<atom:link href="/2012/the-smarter-balanced-sample-items/feed/" rel="self" type="application/rss+xml" />
	<link>/2012/the-smarter-balanced-sample-items/</link>
	<description>less helpful</description>
	<lastBuildDate>Sat, 24 Nov 2012 23:02:47 +0000</lastBuildDate>
	<sy:updatePeriod>
	hourly	</sy:updatePeriod>
	<sy:updateFrequency>
	1	</sy:updateFrequency>
	<generator>https://wordpress.org/?v=6.8.2</generator>
	<item>
		<title>
		By: Dan Meyer		</title>
		<link>/2012/the-smarter-balanced-sample-items/#comment-587030</link>

		<dc:creator><![CDATA[Dan Meyer]]></dc:creator>
		<pubDate>Sat, 24 Nov 2012 23:02:47 +0000</pubDate>
		<guid isPermaLink="false">/?p=15403#comment-587030</guid>

					<description><![CDATA[&lt;strong&gt;Alice&lt;/strong&gt;, thanks a mil for the write-up here. I always appreciate your critical eye on policy.]]></description>
			<content:encoded><![CDATA[<p><strong>Alice</strong>, thanks a mil for the write-up here. I always appreciate your critical eye on policy.</p>
]]></content:encoded>
		
			</item>
		<item>
		<title>
		By: Alice Mercer		</title>
		<link>/2012/the-smarter-balanced-sample-items/#comment-583511</link>

		<dc:creator><![CDATA[Alice Mercer]]></dc:creator>
		<pubDate>Thu, 22 Nov 2012 05:16:39 +0000</pubDate>
		<guid isPermaLink="false">/?p=15403#comment-583511</guid>

					<description><![CDATA[I did a recent session at Fall CUE that was a last-minute addition and sparsely attended, discussing the SBAC sample questions in both Language Arts and Mathematics. This was a blessing, as it allowed for a more seminar-style discussion of the subject.

First, attendees brought up that there had been a state math test with constructed response questions (I believe it was CLASS) that predated the CST/CAP. New York has already used this.

The small group I had liked this question (as do I), but I think the video is crappy and more likely to distract (the order in which the swimmers finish in the animation doesn&#039;t match the times listed). My spectrum-y students will perseverate on things like that and make mistakes--I have one student who tries to take a protractor to the triangles I draw freehand on the whiteboard to figure out the &quot;missing&quot; angle measure, rather than just subtracting. I vote thumbs-down on tech making a difference on this one.

So, the question I had was: is the tech worth it? The exercise I liked was Item 43051. Someone here may have commented on the calculator being confusing, but the thing that was interesting to me was that it would take a variety of answers in either fractional or decimal form, including un-reduced fractions. This is something that is not as easily handled on a paper-and-pencil test (although you could allow for multiple answers with a constructed response, I suppose).

I&#039;m not sanguine about scoring by machine or by a roomful of temps. Frankly, I look forward to more and better formative assessment with Common Core, not these tests or, frankly, any high-stakes assessment. The only thing I heard that was intriguing at my session was someone sharing the idea that only a sample of students may be tested at any given time, but that could blow up in our faces as much as NCLB did.

I am happier about these questions than the current ones being asked on CST, and what SBAC came up with for ELA, and being able to adjust instruction to better prepare students for this change, but it will be an adjustment.

Problems lurking out there are the always-fun &quot;math wars&quot;; a preview can be seen here:

http://truthinamericaneducation.com/common-core-state-standards/common-core-math-standards-making-the-simple-complicated/

I am trying to blog more about the implementation, as my district is apparently at the forefront of the roll-out in our state, and I&#039;m part of a &quot;leadership&quot; cadre that is developing new curricula for ELA. Thanks for allowing me to see what others are seeing, doing, etc.]]></description>
			<content:encoded><![CDATA[<p>I did a recent session at Fall CUE that was a last-minute addition and sparsely attended, discussing the SBAC sample questions in both Language Arts and Mathematics. This was a blessing, as it allowed for a more seminar-style discussion of the subject.</p>
<p>First, attendees brought up that there had been a state math test with constructed response questions (I believe it was CLASS) that predated the CST/CAP. New York has already used this.</p>
<p>The small group I had liked this question (as do I), but I think the video is crappy and more likely to distract (the order in which the swimmers finish in the animation doesn&#8217;t match the times listed). My spectrum-y students will perseverate on things like that and make mistakes&#8211;I have one student who tries to take a protractor to the triangles I draw freehand on the whiteboard to figure out the &#8220;missing&#8221; angle measure, rather than just subtracting. I vote thumbs-down on tech making a difference on this one.</p>
<p>So, the question I had was: is the tech worth it? The exercise I liked was Item 43051. Someone here may have commented on the calculator being confusing, but the thing that was interesting to me was that it would take a variety of answers in either fractional or decimal form, including un-reduced fractions. This is something that is not as easily handled on a paper-and-pencil test (although you could allow for multiple answers with a constructed response, I suppose).</p>
<p>I&#8217;m not sanguine about scoring by machine or by a roomful of temps. Frankly, I look forward to more and better formative assessment with Common Core, not these tests or, frankly, any high-stakes assessment. The only thing I heard that was intriguing at my session was someone sharing the idea that only a sample of students may be tested at any given time, but that could blow up in our faces as much as NCLB did.</p>
<p>I am happier about these questions than the current ones being asked on CST, and what SBAC came up with for ELA, and being able to adjust instruction to better prepare students for this change, but it will be an adjustment.</p>
<p>Problems lurking out there are the always-fun &#8220;math wars&#8221;; a preview can be seen here:</p>
<p><a href="http://truthinamericaneducation.com/common-core-state-standards/common-core-math-standards-making-the-simple-complicated/" rel="nofollow ugc">http://truthinamericaneducation.com/common-core-state-standards/common-core-math-standards-making-the-simple-complicated/</a></p>
<p>I am trying to blog more about the implementation, as my district is apparently at the forefront of the roll-out in our state, and I&#8217;m part of a &#8220;leadership&#8221; cadre that is developing new curricula for ELA. Thanks for allowing me to see what others are seeing, doing, etc.</p>
]]></content:encoded>
		
			</item>
		<item>
		<title>
		By: Jeff Layman		</title>
		<link>/2012/the-smarter-balanced-sample-items/#comment-561582</link>

		<dc:creator><![CDATA[Jeff Layman]]></dc:creator>
		<pubDate>Sun, 04 Nov 2012 08:13:13 +0000</pubDate>
		<guid isPermaLink="false">/?p=15403#comment-561582</guid>

					<description><![CDATA[A few of those videos seemed like they were straight out of your brain if you were an ELA teacher. Reminded me a lot of what you and Dave Major did.]]></description>
			<content:encoded><![CDATA[<p>A few of those videos seemed like they were straight out of your brain if you were an ELA teacher. Reminded me a lot of what you and Dave Major did.</p>
]]></content:encoded>
		
			</item>
		<item>
		<title>
		By: Dan Meyer		</title>
		<link>/2012/the-smarter-balanced-sample-items/#comment-561075</link>

		<dc:creator><![CDATA[Dan Meyer]]></dc:creator>
		<pubDate>Sat, 03 Nov 2012 20:21:38 +0000</pubDate>
		<guid isPermaLink="false">/?p=15403#comment-561075</guid>

					<description><![CDATA[@&lt;strong&gt;Rob&lt;/strong&gt;, I don&#039;t see that working. The second act is ideally constructed between teacher and student.]]></description>
			<content:encoded><![CDATA[<p>@<strong>Rob</strong>, I don&#8217;t see that working. The second act is ideally constructed between teacher and student.</p>
]]></content:encoded>
		
			</item>
		<item>
		<title>
		By: Rob Kessler		</title>
		<link>/2012/the-smarter-balanced-sample-items/#comment-560998</link>

		<dc:creator><![CDATA[Rob Kessler]]></dc:creator>
		<pubDate>Sat, 03 Nov 2012 18:59:31 +0000</pubDate>
		<guid isPermaLink="false">/?p=15403#comment-560998</guid>

					<description><![CDATA[These tasks make me wonder if 3 Acts lessons might be well suited for assessment in addition to sense-making or introduction of concepts. Any plans on doing that, Dan?]]></description>
			<content:encoded><![CDATA[<p>These tasks make me wonder if 3 Acts lessons might be well suited for assessment in addition to sense-making or introduction of concepts. Any plans on doing that, Dan?</p>
]]></content:encoded>
		
			</item>
		<item>
		<title>
		By: Justine C		</title>
		<link>/2012/the-smarter-balanced-sample-items/#comment-560971</link>

		<dc:creator><![CDATA[Justine C]]></dc:creator>
		<pubDate>Sat, 03 Nov 2012 18:31:04 +0000</pubDate>
		<guid isPermaLink="false">/?p=15403#comment-560971</guid>

					<description><![CDATA[I completely agree that assessment drives instruction.  As a newer teacher at a high-performing school, I was astonished by how much teaching revolves around test scores and that they are the main focus for administration.  I got into teaching to help better student education.  I am excited about the new Common Core assessments and hope they drive better math instruction to replace the drill-and-kill I see every day.  The more I learn about the new assessments, the more excited I get about changing the way math is taught.  At the same time, I too am afraid of how the students will perform in the beginning years and what actions will be taken.  Our students are definitely not prepared for what is about to come.]]></description>
			<content:encoded><![CDATA[<p>I completely agree that assessment drives instruction.  As a newer teacher at a high-performing school, I was astonished by how much teaching revolves around test scores and that they are the main focus for administration.  I got into teaching to help better student education.  I am excited about the new Common Core assessments and hope they drive better math instruction to replace the drill-and-kill I see every day.  The more I learn about the new assessments, the more excited I get about changing the way math is taught.  At the same time, I too am afraid of how the students will perform in the beginning years and what actions will be taken.  Our students are definitely not prepared for what is about to come.</p>
]]></content:encoded>
		
			</item>
		<item>
		<title>
		By: Lorraine Baron		</title>
		<link>/2012/the-smarter-balanced-sample-items/#comment-560869</link>

		<dc:creator><![CDATA[Lorraine Baron]]></dc:creator>
		<pubDate>Sat, 03 Nov 2012 16:26:27 +0000</pubDate>
		<guid isPermaLink="false">/?p=15403#comment-560869</guid>

					<description><![CDATA[Hi Dan:
This is such a good question!
These types of machine-scorable questions address students’ basic skills and understanding by applying the Cognitive Processes, which are based on a revision of Bloom&#039;s Taxonomy by Anderson and Krathwohl (2001). From what I understand, the University of Iowa, with the goal of developing machine-scorable questions that address lower- and higher-order thinking skills, supported the creation of these types of questions. Samples of these questions, designed by Scalise and Gifford (2006) are available on Scalise’s (2012) website. These question-types were designed and funded by partnerships such as the Smarter Balanced Assessment Consortium (SBAC) (2010) and the Partnership for Assessment of Readiness for College and Careers (PARCC) (2010). As progressive as they are, while striving to measure higher-level understanding, it could be argued that they were designed using an ‘instrumental’ and assessment-driven perspective which includes goals such as the development of a common assessment system that “will help make accountability policies better drivers of improvement” (p. 8). Politics are really hard to tease out of assessment… but I diverge…
Whatever perspective or purpose you (the general “you”) may have, you may still feel it is important for students to engage in technologically dependent assessments to address the ‘reality’ of the future of their own testing experiences. 
Many others, such as Burkhardt (2012), would argue for a more problem-based approach to assessment, such as YOURS Dan ☺, or those of the Shell Centre (Shell Centre for Mathematical Education, 2012), since teachers DO teach to the test. Burkhardt suggested that well-designed assessment includes “short items and substantial performance tasks so that teachers who teach to the test, as most teachers will, are led to deliver a balanced curriculum that reflects the standards” (p. 2).
I do not dismiss machine-scorable items when they are well designed. I would agree with a ‘balanced approach’  - that’s one of those statements that people say that makes it hard to disagree ;)

References that are worth Checking Out!
Anderson, L. W., &#038; Krathwohl, D. R. (Eds.). (2001). A taxonomy for learning, teaching, and assessing: A revision of Bloom&#039;s Taxonomy of Educational Objectives. New York, NY: Longman.

Burkhardt, H. (2012, October 3). Engineering good math tests, Education Week. 

Partnership for Assessment of Readiness for College and Careers. (2010). PARCC: Application for the race to the top comprehensive assessment systems competition. Tallahassee, FL: Florida Department of Education.

Scalise, K. (2012). Intermediate constraint taxonomy: Open source assessment objects, from http://edfotech.com/archives/257

Scalise, K., &#038; Gifford, B. (2006). Computer-based assessment in e-learning: A framework for constructing &quot;intermediate constraint&quot; questions and tasks for technology platforms. The Journal of Technology, Learning, and Assessment, 4(6), 4-45. Retrieved from http://www.jtla.org/

Shell Centre for Mathematical Education. (2012). Mathematics assessment project, from http://map.mathshell.org/materials/index.php

Smarter Balanced Assessment Consortium. (2010). Smarter balanced assessment consortium: Theory of action - an excerpt from the smarter balanced race to the top application. Olympia, WA: Office of Superintendent of Public Instruction.]]></description>
			<content:encoded><![CDATA[<p>Hi Dan:<br />
This is such a good question!<br />
These types of machine-scorable questions address students’ basic skills and understanding by applying the Cognitive Processes, which are based on a revision of Bloom&#8217;s Taxonomy by Anderson and Krathwohl (2001). From what I understand, the University of Iowa, with the goal of developing machine-scorable questions that address lower- and higher-order thinking skills, supported the creation of these types of questions. Samples of these questions, designed by Scalise and Gifford (2006) are available on Scalise’s (2012) website. These question-types were designed and funded by partnerships such as the Smarter Balanced Assessment Consortium (SBAC) (2010) and the Partnership for Assessment of Readiness for College and Careers (PARCC) (2010). As progressive as they are, while striving to measure higher-level understanding, it could be argued that they were designed using an ‘instrumental’ and assessment-driven perspective which includes goals such as the development of a common assessment system that “will help make accountability policies better drivers of improvement” (p. 8). Politics are really hard to tease out of assessment… but I diverge…<br />
Whatever perspective or purpose you (the general “you”) may have, you may still feel it is important for students to engage in technologically dependent assessments to address the ‘reality’ of the future of their own testing experiences.<br />
Many others, such as Burkhardt (2012), would argue for a more problem-based approach to assessment, such as YOURS Dan ☺, or those of the Shell Centre (Shell Centre for Mathematical Education, 2012), since teachers DO teach to the test. Burkhardt suggested that well-designed assessment includes “short items and substantial performance tasks so that teachers who teach to the test, as most teachers will, are led to deliver a balanced curriculum that reflects the standards” (p. 2).<br />
I do not dismiss machine-scorable items when they are well designed. I would agree with a ‘balanced approach’  &#8211; that’s one of those statements that people say that makes it hard to disagree ;)</p>
<p>References that are worth Checking Out!<br />
Anderson, L. W., &amp; Krathwohl, D. R. (Eds.). (2001). A taxonomy for learning, teaching, and assessing: A revision of Bloom&#8217;s Taxonomy of Educational Objectives. New York, NY: Longman.</p>
<p>Burkhardt, H. (2012, October 3). Engineering good math tests, Education Week. </p>
<p>Partnership for Assessment of Readiness for College and Careers. (2010). PARCC: Application for the race to the top comprehensive assessment systems competition. Tallahassee, FL: Florida Department of Education.</p>
<p>Scalise, K. (2012). Intermediate constraint taxonomy: Open source assessment objects, from <a href="http://edfotech.com/archives/257" rel="nofollow ugc">http://edfotech.com/archives/257</a></p>
<p>Scalise, K., &amp; Gifford, B. (2006). Computer-based assessment in e-learning: A framework for constructing &#8220;intermediate constraint&#8221; questions and tasks for technology platforms. The Journal of Technology, Learning, and Assessment, 4(6), 4-45. Retrieved from <a href="http://www.jtla.org/" rel="nofollow ugc">http://www.jtla.org/</a></p>
<p>Shell Centre for Mathematical Education. (2012). Mathematics assessment project, from <a href="http://map.mathshell.org/materials/index.php" rel="nofollow ugc">http://map.mathshell.org/materials/index.php</a></p>
<p>Smarter Balanced Assessment Consortium. (2010). Smarter balanced assessment consortium: Theory of action &#8211; an excerpt from the smarter balanced race to the top application. Olympia, WA: Office of Superintendent of Public Instruction.</p>
]]></content:encoded>
		
			</item>
		<item>
		<title>
		By: Dan Meyer		</title>
		<link>/2012/the-smarter-balanced-sample-items/#comment-560093</link>

		<dc:creator><![CDATA[Dan Meyer]]></dc:creator>
		<pubDate>Sat, 03 Nov 2012 00:50:37 +0000</pubDate>
		<guid isPermaLink="false">/?p=15403#comment-560093</guid>

					<description><![CDATA[Yeah, +1 on the tech files. Really interesting constructed response items in the &quot;Movie Files&quot; folder.]]></description>
			<content:encoded><![CDATA[<p>Yeah, +1 on the tech files. Really interesting constructed response items in the &#8220;Movie Files&#8221; folder.</p>
]]></content:encoded>
		
			</item>
		<item>
		<title>
		By: Candice Frontiera		</title>
		<link>/2012/the-smarter-balanced-sample-items/#comment-560034</link>

		<dc:creator><![CDATA[Candice Frontiera]]></dc:creator>
		<pubDate>Fri, 02 Nov 2012 23:28:02 +0000</pubDate>
		<guid isPermaLink="false">/?p=15403#comment-560034</guid>

					<description><![CDATA[Next thing to explore:
Go to this site: http://www.smarterbalanced.org/smarter-balanced-assessments/

Scroll down until you see the link for &quot;Technology Enhanced Item Supporting Materials (ZIP)&quot;

It has both Math and ELA videos and templates of how students will use the tools to label number lines, partition shapes, create angles, etc.  The folder is a hidden gem.  I totally agree that the possibilities this type of testing offers is a step in the right direction!]]></description>
			<content:encoded><![CDATA[<p>Next thing to explore:<br />
Go to this site: <a href="http://www.smarterbalanced.org/smarter-balanced-assessments/" rel="nofollow ugc">http://www.smarterbalanced.org/smarter-balanced-assessments/</a></p>
<p>Scroll down until you see the link for &#8220;Technology Enhanced Item Supporting Materials (ZIP)&#8221;</p>
<p>It has both Math and ELA videos and templates of how students will use the tools to label number lines, partition shapes, create angles, etc.  The folder is a hidden gem.  I totally agree that the possibilities this type of testing offers is a step in the right direction!</p>
]]></content:encoded>
		
			</item>
		<item>
		<title>
		By: Colleen		</title>
		<link>/2012/the-smarter-balanced-sample-items/#comment-559974</link>

		<dc:creator><![CDATA[Colleen]]></dc:creator>
		<pubDate>Fri, 02 Nov 2012 22:21:59 +0000</pubDate>
		<guid isPermaLink="false">/?p=15403#comment-559974</guid>

					<description><![CDATA[I am really impressed.  In Massachusetts, we do have open response questions, but I do not think they are as cognitively demanding as these questions.  So far, the released items from PARCC do not seem as demanding, but we have only seen a small sample from PARCC.]]></description>
			<content:encoded><![CDATA[<p>I am really impressed.  In Massachusetts, we do have open response questions, but I do not think they are as cognitively demanding as these questions.  So far, the released items from PARCC do not seem as demanding, but we have only seen a small sample from PARCC.</p>
]]></content:encoded>
		
			</item>
	</channel>
</rss>
