Fake news, critical thinking

Some ideas for student activities:


http://learning.blogs.nytimes.com/2014/10/08/guest-post-practical-tools-for-teaching-news-literacy/

More on the fake news-makers:


Crummy sites to use for evaluation:


“False, Misleading, Clickbait-y, and Satirical ‘News’ Sources” by Melissa Zimdars

Construct your own matrix:

Benjes-Small, C., Archer, A., Tucker, K., Vassady, L., & Resor Whicker, J. (2013). Teaching Web Evaluation: A Cognitive Development Approach. Communications In Information Literacy, 7(1). Retrieved March 6, 2014, from http://www.comminfolit.org/index.php?journal=cil&page=article&op=view&path%5B%5D=v7i1p39

Especially Appendix A

and: Alyssa Archer and Candice Benjes-Small also developed “Evaluating news sites: Credible or Clickbait?” at http://www.projectcora.org/assignment/evaluating-news-sites-credible-or-clickbait


Stop using Likert scales: 3 minute version


I’ll present a performance-based assessment model that produces actionable data as an alternative to Likert scale questions. [longer article forthcoming in the journal linked below]

Dominique Turnbow and I have been working on developing performance-based replies for our end-of-class evaluations, based on work by Will Thalheimer, who does this in business settings. We have a lot to say about this in an upcoming article in Communications in Information Literacy; we’ll be sure to let you know when you can read all about it. We suggest turning your Likert scales into something more useful. Dominique and Amanda Roth, both at UCSD, have this lovely example in Qualtrics.

Or like this in Google Forms.

I have also printed out copies and brought them to class for students to fill out; when I don’t have students working on computers, that’s a good strategy. Qualtrics is much nicer, but anything will work. Whenever possible I use our LMS (now Canvas) to make a simple survey for students to fill out. We don’t have a formal system for reporting our results, but I can use them to see what worked and what didn’t, and for my own documentation of my instruction for contract renewals and performance reviews. Here is one that I used:

If I haven’t convinced you about this, that’s OK, but for Likert scales please use Qualtrics, which lets you label each reply, NOT Google Forms.

The forthcoming journal article will be here [I’ll update the link when it’s available]: http://www.comminfolit.org/index.php?journal=cil

COPUS in Library instruction

The recording instrument is the COPUS.xlsx file in the folder linked here.

You will need some training first. I am happy to show you what I have learned.

Using another department’s template, I was able to make two documents for two different library instruction sessions with two different instructors. One was a single class session within a full-term course; the other was a one-off session.

The teaching inventory would work for the full-term course, but not as well for the one-off session. I’m working on a version that will work for one-off classes, but any changes will mean it’s no longer nationally calibrated. I am open to ideas.

Why active learning? part 1

For the beginner:

I have gone so far down the path of figuring out how to use active learning effectively in a classroom that I was more than a bit taken aback recently when I had to defend the reasoning behind it. Active learning matters not just for understanding material and acquiring skills (which we do a lot in our one-off sessions), but also for understanding important concepts in a field. This is especially true when you are trying to change what learners feel or think is true: everything isn’t on Google, the world is complicated, etc., etc.

While we might change our minds in another 10 years, here is what we know now:

Across STEM fields and at all levels, students fail less and learn more when classes involve more active learning and less lecture. Female students and those from disadvantaged backgrounds do even better in active learning classes than in traditional lectures.

“The authors examined two outcome measures: the failure rate in courses and the performance on tests. They found the average failure rate decreased from 34% with traditional lecturing to 22% with active learning (Fig. 1A), whereas performance on identical or comparable tests increased by nearly half the SD of the test scores (an effect size of 0.47).”


“However, in undergraduate STEM education, we have the curious situation that, although more effective teaching methods have been overwhelmingly demonstrated, most STEM courses are still taught by lectures—the pedagogical equivalent of bloodletting. Should the goals of STEM education research be to find more effective ways for students to learn or to provide additional evidence to convince faculty and institutions to change how they are teaching? Freeman et al. do not answer that question, but their findings make it clear that it cannot be avoided.”

from Wieman, C. E. (2014). Large-scale comparison of science teaching methods sends clear message. Proceedings of the National Academy of Sciences, 111(23), 8319–8320. doi:10.1073/pnas.1407304111

Referring to this work: Freeman, S., Eddy, S. L., McDonough, M., Smith, M. K., Okoroafor, N., Jordt, H., & Wenderoth, M. P. (2014). Active learning increases student performance in science, engineering, and mathematics. Proceedings of the National Academy of Sciences, 111(23), 8410–8415. doi:10.1073/pnas.1319030111


Active learning in classrooms designed for it: I wish there were more references here, but it seems clear that students are happier in active learning classrooms, and they think they are learning more (perceived learning). There is likely some study with better data about actual learning and interactivity (see slide 12). I am not sure what “higher-order active learning” might refer to.

Pearling and footnote chasing

A recent post on the Medical Libraries Discussion List suggests that this is yet another topic that might need a new article. I learned the following in library school from the ever-so-clever Dr. Marcia Bates.

footnote chasing (Bates, 1989) or backwards chaining (Ellis, 1989)

citation searching (Bates, 1989) – I could swear I learned “pearling” in library school, but maybe not from Dr. Bates? Booth (2008)* is a likely suspect, but I still don’t know where I got pearling from.

berrypicking (Bates, 1989)  – “Browsing is undirected, while berrypicking is more directed.” (Bates, 2002)

My attempts at representing these ideas in PowerPoint slides. Feel free to use, share, and remix: pearling footnote chasing

*Booth (2008) makes a distinction between “citation pearl growing” and “citation searching.” What I call pearling is what he calls “citation searching.” I will figure out where to put his citation pearl growing (building search terms from a relevant article) in my model. I also think he mis-characterizes Bates’ summary of berrypicking on p.
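For anyone who likes to see the directionality of these strategies made concrete, here is a toy sketch (entirely my own illustration; the paper names are made up) of footnote chasing as backward chaining and citation searching as forward chaining over a tiny citation graph:

```python
# Hypothetical mini citation graph: each key is a paper, and its list
# holds the papers it cites (its reference list).
CITES = {
    "review_2015": ["study_2010", "study_2012"],
    "study_2012": ["study_2010", "classic_1989"],
    "study_2010": ["classic_1989"],
    "classic_1989": [],
}

def footnote_chasing(paper):
    """Backward chaining: follow the reference list of a paper you already have."""
    return CITES.get(paper, [])

def citation_searching(paper):
    """Forward chaining: find newer papers that cite a paper you already have."""
    return sorted(p for p, refs in CITES.items() if paper in refs)

# Starting from the same "pearl", the two strategies move in opposite
# directions through the literature:
print(footnote_chasing("study_2012"))     # older works it cites
print(citation_searching("classic_1989")) # newer works citing it
```

In a real database this is what a reference list (backward) versus a “cited by” link (forward) gives you; the code just makes the two directions explicit.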

Bates, M. (2002). Toward An Integrated Model of Information Seeking and Searching. Retrieved February 7, 2016, from https://pages.gseis.ucla.edu/faculty/bates/articles/info_SeekSearch-i-030329.html

Bates, M. (1989). The design of browsing and berrypicking techniques for the online search interface. Online Review. Retrieved from http://www.emeraldinsight.com/doi/abs/10.1108/eb024320  [http://comminfo.rutgers.edu/~tefko/Courses/e530/Readings/Bates_Berrypicking.pdf]

Booth, A. (2008). Unpacking your literature search toolbox: on search styles and tactics. Health Information and Libraries Journal, 25(4), 313–7. doi:10.1111/j.1471-1842.2008.00825.x

Ellis, D. (1989). A behavioural approach to information retrieval system design. Journal of Documentation. Retrieved from http://www.emeraldinsight.com/doi/abs/10.1108/eb026843

pearling and footnote chasing list summary

From the recent discussion on the Medical Libraries Discussion List:
pearl growing/building/purling/citation pearling – citation associations (or keyword associations through pearl growing) can be conceptually incestuous when it comes to building a comprehensive search. For example, you would be unlikely to see someone who uses a behavioural approach to intervention citing a cognitive theorist, or vice versa. In either case, if you stayed within one paradigm you’d miss important points of view/evidence. Reviewing and using additional index or key terms.
citation tracking/tracking/searching/chasing/referencing/mining.
Ancestral searching
tree – bread crumbing – staircase
In The Oxford Guide to Library Research by Thomas Mann (4th edition, 2015), Chapter Six is titled “Citation Search,” and Mann uses the term “citation searching.” I and other faculty have used this text in our library school online research classes at SJSU School of Information.
Citation Searching
Snowballing is sometimes referred to as *reference harvesting* or *pearl growing search method*; here are some other terms that evoke similar kinds of search strategies:
   – forward citation searching, footnote chasing, reference scanning, reference harvesting, hand-searching & powerbrowsing
   – backward chaining, forward chaining, digital browsing, footnote
   – pearl growing, reference harvesting, reference lists, reference searches, ‘cited by searching’ ….and so on
Bates articulates 29 different search tactics in an era before online databases:
Bates, M. (1979). Information search tactics. Journal of the American Society for Information Science, 30(4), 205–214. Retrieved from http://onlinelibrary.wiley.com/doi/10.1002/asi.4630300406/full [https://pages.gseis.ucla.edu/faculty/bates/articles/Information%20Search%20Tactics.html]

math and science apps and sites for school-age kids

I’ll update these pages someday. Until then, other bits and pieces:


And this Scratch activity book: Super Scratch Programming Adventure! (Covers Version 2): Learn to Program by Making Cool Games by The LEAD Project

Khan Academy’s summer of scripting has a nice interface and sends the grown-up (if you signed your kids up through your account) email updates.


Dogo News – http://www.dogonews.com

Science Buddies – fantastic site for finding science fair projects. Take the quiz to get suggestions. (The quiz includes questions about how much time you have, and tries to get a sense of what your kid might find interesting.)

Middle school apps:


Elementary school apps:

Both of these were recommended by my nephew, a 2nd-grade teacher; the first one is iPad-only.


and this one: http://www.thinkingblocks.com/

Science Library Instruction Program pt. 2


Curriculum mapping: UNLV’s terrific work on the topic is all here:


I think of it as something more like the following, which I’ve applied to a specific UO department.
Worksheets_CurriculumMapping_HPHY_Fall_2014 [downloads as a .xls file with several worksheets]

More mapping using the ACRL Framework:



http://cas.uoregon.edu/learning-outcomes/ [note the curriculum map template at the top]

Rubric for new learning outcomes for UO

End of the term evaluations

Instead of the usual questions, try these:

Your insights into your learning in this course can help me see our course from your side of the desk. Please respond to any three of the statements below (more if you’d like). Submit these anonymously; I will use them as I plan for my courses next semester. 

In this course …

  • it most helped my learning of the content when…because…
  • it would have helped my learning of the content if…because…
  • the assignment that contributed the most to my learning was… because…
  • the reading that contributed the most to my learning was… because…
  • the kinds of homework problems that contributed most to my learning were…because…
  • the approach I took to my own learning that contributed the most for me was…because…
  • the biggest obstacle for me in my learning the material was… because…
  • a resource I know about that you might consider using is…because…
  • I was most willing to take risks with learning new material when… because…
  • during the first day, I remember thinking…because…
  • what I think I will remember five years from now is…because…

You can also add a query such as the following: What is something covered in this course material that you can do now that you could not do or did not fully understand at the beginning of the term?

from: http://www.facultyfocus.com/articles/philosophy-of-teaching/a-new-twist-on-end-of-semester-evaluations/


…If you’re interested in improving something like organization, if you define it behaviorally, you can change what you do, which is a lot easier than changing what you are. Organization has never been one of my strong suits and I didn’t make much progress trying to “be” organized. But when I started putting a skeleton outline on the board, when I stopped five minutes before the end of period and used the outline to summarize, when I began class working with students to create a list of points to remember from last class, I was seen by students as being more organized.

But it isn’t all good news. A collection of dashed off student comments collected at the end of the semester doesn’t easily translate into an action-based improvement agenda. What the student comments mean is probably not what you think they mean. Communication about the impact of teaching policies and practices on efforts to learn needs to be ongoing so there’s an opportunity for clarifying feedback, adjustments and then more feedback. We can and should make efforts to change the way our institutions collect student assessments, but, until that glacier melts, we need to take matters into our own hands and solicit a different kind of feedback and at different times during the course….

Italics and emphasis mine

from: http://www.facultyfocus.com/articles/teaching-professor-blog/end-of-course-evaluations-making-sense-of-student-comments/

“The Teaching and Learning Quality survey, created by Theodore W. Frick, who is now an emeritus professor in Indiana University at Bloomington’s School of Education, attracted interest from dozens of institutions about five years ago. Its questions focus on students’ perceptions of effective educational practices (prompts include “I was able to connect my past experience to new ideas and skills I was learning” and “My instructor demonstrated skills I was expected to learn”).

To study the instrument, instructors assessed student work in 12 courses one month after the courses had ended. Researchers compared those assessments with the results of Mr. Frick’s survey, finding a clear relationship: Students who’d said they frequently saw effective practices in use showed high levels of mastery.

For critics, the problems with student evaluations are too fundamental to be fixed….

“What they really measure is ‘student satisfaction,’” Ms. Nilson wrote in an email to The Chronicle. “They bear no relationship at all to learning.””

emphasis mine


Authentic assessment

From the terrific librarians at Carleton College:

Iris Jastram and Heather Tompkins presented a workshop at Oregon State University

Links to the materials they used are in a Google folder linked from the story above.

They also mentioned the work done by the librarian at the Claremont Colleges.