Because he did all this work so I don’t have to, Ben Wagner “…review[ed] studies on the citation advantage (or, in a few studies, non-advantage) that open access articles have over non-OA articles”: http://www.buffalo.edu/~abwagner/OA-CiteImpact-Bibliography.doc.
I normally would not note this update, except that a colleague in Great Britain called my attention to an amazing, massive October 2014 study of OA articles done for the European Commission, an analysis of over 1 million articles indexed in Scopus<http://www.elsevier.com/__data/assets/pdf_file/0007/148714/scopus_facts_and_figures.pdf> from 1996-2013. Data on the growth of OA, the proportion of various types of OA articles, and the OA citation advantage are reported. It is the only study to my knowledge that includes a breakdown of OA articles by country, region, discipline, publication year, and type of OA: gold, green (in an official repository), or other (free, but not in an official repository).
Since this is not a burning issue for everyone on this list, please refer to my blog post <http://libweb.lib.buffalo.edu/blog/sciences/?p=962> or the bibliography cited above for a slightly expanded summary. The report’s 7-page executive summary is well worth a read: http://science-metrix.com/files/science-metrix/publications/d_1.8_sm_ec_dg-rtd_proportion_oa_1996-2013_v11p.pdf.
–A. Ben Wagner, Sciences Librarian
Science & Engineering Information Center
226 Capen Hall (Silverman Library)
University at Buffalo
I read the summary and would recommend you do the same.
the cheap kind, not the interactive fancy ones.
finally this one, where you can purchase white boards:
not hoaxes, but bad sites for students to develop critiques of; annotations from ILI:
member of Medical Veritas, which was a “journal” from 2004 to 2008 whose sole purpose seemed to be to provide “medical” research showing the dangers of vaccines. On its list of Purposes, Medical Veritas says it “values the experiences of laypersons as a means of encouraging physicians and scientists to consider medical evidence instead of medical theories.” [there is also http://www.vaccines.gov/ ] – Jennifer Farquhar
DrDay.com is not a hoax website. I talk about it to my students because this is a woman who has a medical degree, but not in oncology (cancer), and her website is really just for getting money from people who don’t know better than to buy her DVDs and other merchandise promising to cure them of all cancers if they watch the videos. If you look at all the information about herself, and at the publicity she includes about how great she is and how her critics are unjustly attacking her, you should see that she is a scam artist and a con woman. Compare her site to actual websites run by medical programs at universities, such as this one: www.oncolink.upenn.edu. You will see that Dr. Day is taking money from people and is not going to help them cure their cancer. – Miriam Laskin
Here is a biased website that has a professional appearance, and is hosted by an organization that at first glance seems reputable and authored by people with strong credentials. Students might have to do some digging on the web to discover that there is skepticism of the research (and especially the funding) of this organization. http://www.nipccreport.org/ – Oliver Zeff
http://www.globalissues.org. At first glance it looks good, but I point out the About section. Most of the articles (at least as of 2008) have been written by one person–who is not an expert on these global issues. Most students trust .org to be a “good” domain name so I like having a negative example. – Kelly Frost
http://www.smokingaloud.com/corrupt.html – Candice Benjes-Small
How about a site like The Daily Currant http://dailycurrant.com/ which is the source of so many articles shared casually on Facebook. – Amy Burger
One of my favorite bad websites is “About,” because they never date their pages or give good background on the authors.
From the most fantastic Elly Vandegrift:
2011 workshop – AssessmentWorksheet_Elly_Spring 2011
2014 assessment journal club – Fall 2014 Week 5 Rubrics_Elly
This site by the awesome Megan Oakleaf looks amazing. But, I need more context for the rubrics. Still, I hope to sit down with them soon and pick out stuff to repurpose.
-The Essentials of Instructional Design (Brown & Green). Meant for undergrads, but it’s a fantastic overview and introduction to the topic.
-Instructional Design (Smith & Ragan). Advanced and a slog to get through. Nonetheless, it’s a standard-bearer, and with good reason.
-The Systematic Design of Instruction (Dick & Carey). Another advanced one, but another “essential” for good reasons.
-Understanding by Design (Wiggins & McTighe). A little tangential, and not an easy read, but really great ideas. Changed the way I think about ID.
Someone mentioned Richard Mayer’s work – I’m not a big fan, as some of his conclusions about best practices have been shown to be a little… off. (Probably the nicest way to put it.) That said, the Clark & Mayer book E-Learning and the Science of Instruction… though it has a few fundamental flaws… is still pretty good. I’d say about 60%-75% of the information is quite good, so it’s worth reading.
Booth, C. (2011). Reflective teaching, effective learning: Instructional literacy for library educators. Chicago: American Library Association.
Brown, A., & Green, T. D. (2011). The essentials of instructional design:
Connecting fundamental principles with process and practice. Boston:
Clark, R. C., & Mayer, R. E. (2011). E-learning and the science of instruction: Proven guidelines for consumers and designers of multimedia learning (3rd ed.). San Francisco, CA: Pfeiffer.
Dick, W., Carey, L., & Carey, J. O. (2009). The systematic design of
instruction. Upper Saddle River, NJ: Merrill/Pearson.
Dirksen, J. (2012). Design for how people learn. Berkeley, CA: New Riders.
Heinich, R. (Ed.). (1996). Instructional media and technologies for
learning. Englewood Cliffs, NJ: Merrill.
Larson, M. B., & Lockee, B. B. (2013). Streamlined ID: A practical guide
to instructional design.
Mayer, R. E. (2009). Multimedia learning. Cambridge: Cambridge University
Press. (two votes for this one)
Morrison, G. R., Ross, S. M., Kalman, H. K., & Kemp, J. E. (2013). Designing effective instruction (7th ed.). Hoboken, NJ: John Wiley & Sons. (two votes)
Piskurich, G. M. (2015). Rapid instructional design: Learning ID fast and right. Place of publication not identified: John Wiley. (this newest edition due out in January)
Smith, P. L., & Ragan, T. J. (2005). Instructional design. Hoboken, NJ: J.
Wiley & Sons.
Wiggins, G. P., & McTighe, J. (2005). Understanding by design. Alexandria,
VA: Association for Supervision and Curriculum Development. (two votes)
Michael Allen has several excellent titles regarding instructional design.
The following site offers a lot of book recommendations as well as details
on different Instructional Design models:
Nesting similar courses in one tab, from Ken Simon at Pasadena City College:
“We just switched to doing this, and our librarians are relieved not to be swimming in a huge list of largely similar guides for different courses within a discipline — and even guides for different instructors’ sections of the same course. Since we just started doing it this way, we haven’t built up a lot of them yet, but here’s an example. You’ll see the individual course sections under the Sections tab:
No one has complained about this yet, and we’re not concerned about the extra click, since students visit this in the context of an instruction session, and the first thing they’re shown is how to get to the guide and to their course tab, if one exists.”
I like some of these, but I’m not sure how many I’d feel comfortable using for a one-shot. I’m going to try a facts-based think-pair-share today using facts from Project Information Literacy. Click here to download a .pdf file of a bunch of instruction ideas_ icebreakers
“This page is to share references to good articles on Assessment of One-Shot Instruction Sessions. There is also a link to my [Kathy Stroud’s] presentation at the June 19, 2014 Subject Specialists meeting. The bulk of the references are to summative assessment – assessment that occurs after the instruction session and can be used to improve the outcomes of future sessions. However, I have included one resource that is entirely about formative assessment – assessment that occurs at any point in the instruction and provides feedback to the instructor and learner.
Sobel, Karen and Kenneth Wolf. “Updating Your Tool Belt: Redesigning Assessments of Learning in the Library.” Reference & User Services Quarterly 50.3 (2011): 245-258. Library Literature & Information Science Full Text. Web. 19 June 2014.
Summary of Learning Assessment Tools applicable for one-shot instruction. It has example pre- and post-tests, and activities with scoring rubrics.
Whitlock, Brandy and Julie Nanavati. “A Systematic Approach to Performative and Authentic Assessment.” Reference Services Review 41.1 (2013): 32-48. Emerald Publishing Group. Web. 19 June 2014.
Good overview of assessment with a table comparing assessment tools. It also has example activities and scoring rubrics.
Grassian, Esther S. and Joan R. Kaplowitz. Information Literacy Instruction: Theory and Practice. New York: Neal-Schuman Publishers, 2009. Print.
Chapter 11 is “Assessment: Improving Learning; Improving Teaching.” Good overview of assessment, discussion of levels of assessment, and choosing assessment tools. Knight Library’s copy is at ZA3075.G73 2009.
Choinski, Elizabeth and Michelle Emanuel. “The One-Minute Paper and the One-Hour Class: Outcomes Assessment for One-Shot Library Instruction.” Reference Services Review 34.1 (2006): 148-155. Emerald Publishing Group. Web. 19 June 2014.
Example questions, but scoring rubric not provided
Carter, Toni M. “Use What You Have: Authentic Assessment of In-Class Activities.” Reference Services Review 41.1 (2013): 49-61. Emerald Publishing Group. Web. 19 June 2014.
Example of topic development activity and discussion of developing scoring rubrics
Buchanan, Heidi E. and Beth McDonough. The One-Shot Library Instruction Survival Guide. Chicago: ALA Editions, 2014. Print.
The Library does not own a copy of this. Chapter 6 “How Will I Know What Worked?” is a reassuring guide to helping you assess your performance as a one-shot session instructor.
Veldof, Jerilyn R. Creating the One-shot Library Workshop: A Step-by-step Guide. Chicago: American Library Association, 2006. Print.
Also not owned by the UO Libraries. Step 9, “Build Evaluation Tools,” and Step 19, “Evaluate Workshop,” discuss types of evaluations and evaluation design, and provide examples.
Broussard, Mary Snyder, Rachel Hickoff-Cresko, and Jessica Urick Oberlin. Snapshots of Reality: A Practical Guide to Formative Assessment in Library Instruction. Chicago: Association of College and Research Libraries, 2014. Print.
Also not owned by the UO Libraries. Contains many examples of formative assessment techniques and creative ideas for assessing what students know and have learned without cutting into teaching time.”
-Kathy Stroud from https://iris.uoregon.edu/cms/node/6258
This is basically the presentation that I saw yesterday at the UO:
He has a very cool piece about testing by group work, including using an IF AT form (scratch off multiple choice), starts around minute 40.
more on the concept:
more on the IF-AT system in the lower right of the page.
The web based version Mazur uses (because he dislikes multiple choice with a passion) is here:
It’s really reasonably priced, with some very cool features. It reminds me in many ways of what we were hoping to do with parts of ripple.
The Calibrated Peer Review system (the thing I understood as peer and self evaluation) part is around minute 54. And, I forgot that it’s based on a project at UCLA:
Where it looks like they have the whole system set up and ready to use/purchase.
Demonstrating Success through Assessment: Don’t Leave Outcomes to Chance – STS
With Dominique Turnbow, UCSD Librarian
1) Title: First write the learning outcomes, then plan the rest: assessment to make one shot sessions successful
2) Abstract (150-250 words):
Many librarians know that assessment should be part of the planning process for instruction; however, it is usually an afterthought. Well-written learning outcomes can lead to thoughtful, effective assessment and sound instruction. Learning outcomes that focus on goals that aren’t possible to measure during a one-shot workshop can lead to a feeling that assessment and instruction aren’t working. In these situations, outcomes likely focus on behaviors that should be assessed summatively, and only after the learners have had practice (e.g., database searching techniques). Instead, most one-shot outcomes should focus on formative assessment – that is, behaviors that can be reasonably observed during the workshop.
This presentation will raise awareness about how to write formative learning outcomes for in-person instruction for large classes without computers and in computer labs, as well as for online instruction. The presenters will discuss concrete examples that have been used in science classes (primarily biomedical). This presentation will include best practices for:
Participants will leave with a solid understanding of formative and summative assessment, concrete examples of outcomes that can be used in one-shot library sessions or online and assessment strategies to address them.
Copyright © 2015 making science librarianship wonderful - All Rights Reserved