Day/Time: Friday, May 14, 2:45 PM to 3:45 PM

Presenters

  • Matt Ruen, Grand Valley State University
  • Brianne Selman, University of Winnipeg
  • Stephanie Towery, Texas State University
  • Leila Sterman, Montana State University
  • Joshua Neds-Fox, Wayne State University
  • Teresa Schultz, University of Nevada, Reno

Description

Attendees of this workshop will get the most out of it if they preview the project at https://vimeo.com/543381147.

Concerns about “predatory” or questionable journals have led many academics to seek out simple checklists of good or bad journals, but such checklists obscure the contextual and constructed nature of authority in information.

A group of librarians has banded together to address this problem by creating Reviews: the Journal of Journal Reviews (RJJR), which would invite peer-reviewed evaluations of journals from around the world, both open and paywalled, in any language. The idea is to create an open rubric for thoughtfully evaluating a journal, as well as a platform for sharing those evaluations as resources. Authors considering a journal could look to RJJR for evaluations already completed on it, while the reviews themselves would model the practice of nuanced, contextual evaluation. The process is designed to be iterative, open to feedback, and updated over time.

The values that shape this project include: taking a critical approach to prestige, supporting labor not traditionally seen as scholarly work, ensuring an environment inclusive of diverse voices, being transparent about the process, emphasizing nuance in journal evaluation, and accepting that change happens.

Prior to the Forum, we will share our rubric and a video overview of this project so that the session can focus on deeper engagement with the idea of context-centered journal reviews as a form of scholarly publication. The heart of this session will be a facilitated conversation using Mentimeter to gather and discuss recommendations, critiques, and other feedback from participants.

We invite participants to help us revise and reflect on our project, specifically considering these questions:

  • For authors, does the context-centric rubric make sense when evaluating an unfamiliar journal, and how can we improve the tool for your future use?
  • For editors, is our framework appropriate to evaluate your journal?
  • For librarians, does this publication structure adequately recognize the labor of librarians who support scholarly communications as scholarship in its own right, particularly for promotion and advancement?
  • For everyone, how can the practice of contextual evaluation critically reflect on racial, gender, and global biases that shape perceptions of publication venues?

Read the Rubric: https://tinyurl.com/rjjrrubric

Try it Out: https://tinyurl.com/rjjrevaluation