Engaging Students with MS Teams: 3 Case Studies from Lab Courses
View Recording: https://jh.hosted.panopto.com/Panopto/Pages/Viewer.aspx?id=fd09945d-3663-4bfa-8718-ad4900fca224
- Eileen Haase, PhD, Whiting School of Engineering
- Jamie Young, PhD, Krieger School of Arts and Sciences
- Rebecca Pearlman, PhD, Krieger School of Arts and Sciences
The 2020 pandemic forced many faculty to shift how they teach and to rethink how they engage students. This session describes the work of three faculty members in WSE and KSAS who used Microsoft Teams to engage students with each other and with their instructors. While the case studies focus on lab courses, the lessons learned are relevant to any course.
Future of Storytelling: 360 Video and the Case for Quality
View Recording: https://jh.hosted.panopto.com/Panopto/Pages/Viewer.aspx?id=8c6d7d34-7825-4672-a127-ad4900fca242
- David Toia, M.A., Video Production Supervisor, JHSPH Center for Teaching and Learning
- Amy Pinkerton, M.A., Instructional Designer, JHSPH Center for Teaching and Learning
In this session, Amy Pinkerton, Instructional Designer, and David Toia, Video Production Supervisor, from the Center for Teaching and Learning at the School of Public Health will highlight the pedagogical benefits of storytelling using 360 video to engage students with communities around the globe. The presenters will cover the technical aspects of immersive 360 video technologies and share clips from “Inside COVID-19,” a 360-degree video documentary produced for Oculus TV. They will then discuss some of the filmmaking techniques used and how those techniques can be applied to create academic content. The presentation will include an examination of the importance of quality in educational video and demonstrate an analytical approach to evaluating content development. There will be opportunities for audience questions and comments throughout, and the session will conclude with a Q&A. Participants are encouraged to watch the “Inside COVID-19” documentary before the session.
Note: If you are watching on a computer, use your mouse to click and drag around the video to change your viewing angle; viewers on mobile devices can simply use their fingers to navigate the scene.
Mixed Reality Headsets in Teaching Laboratory Courses: Changing the Pedagogy through Remote Collaboration and Experimentation [DELTA Grant]
View Recording: https://jh.hosted.panopto.com/Panopto/Pages/Viewer.aspx?id=afcba4ec-c065-403d-a6ac-ad4900fe4ecc
- Sakul Ratanalert, PhD, Whiting School of Engineering
- Orla Wilson, PhD, Whiting School of Engineering
- Meredith Safford, PhD, Krieger School of Arts and Sciences
- Patty McGuiggan, PhD, Whiting School of Engineering
- Luo Gu, PhD, Whiting School of Engineering
- Robert Leheny, PhD, Krieger School of Arts and Sciences
The transition to remote teaching has exposed a critical need in the science and engineering curriculum: how can the necessary laboratory skills be learned remotely? These skills involve not only scientific problem solving but also observational acuity and the ability to work collaboratively. To address this need, we are using mixed reality headsets coupled with electronic notebooks in lab-based courses in WSE and KSAS. The headsets give remote students not only real-time remote visualization but an entire interactive laboratory experience. This system will modernize the laboratory training experience, expand our educational offerings, and better prepare our students for their careers. In this session, we will showcase some basic and advanced functionalities of the mixed reality headsets, outline our journey to bring this technology to the classroom, and demonstrate both current and projected implementations of this suite of tools in teaching and research laboratories. We will also explore how instructors and students have responded and adapted to its use, including observations about unexpected barriers to student engagement and the steps we have taken to overcome them.
Improving Pediatric Cardiopulmonary Resuscitation Care via Augmented Reality [DELTA Grant]
View Recording: https://jh.hosted.panopto.com/Panopto/Pages/Viewer.aspx?id=5f800b23-d9d8-42e7-a7ef-ad4900fde0a4
- Keith Kleinman, MD, School of Medicine
- Justin M Jeffers, MD MEHP, School of Medicine
- Therese Canares, MD, School of Medicine
- James Dean, MS, Applied Physics Laboratory
Augmented reality (AR), although prominent in other sectors, has been slow to infiltrate healthcare education. There are numerous opportunities to use AR in training current and future healthcare providers, particularly around resuscitation and cardiopulmonary resuscitation (CPR) performance. High-quality CPR has been shown to improve pediatric cardiac arrest survival, yet adherence to current CPR guidelines is poor. Numerous attempts have been made to improve guideline adherence with varying degrees of success, but there is still room for improvement. This project, in collaboration with the Johns Hopkins University Applied Physics Laboratory, aims to develop AR software that improves CPR education and performance by providing real-time feedback in the direct field of vision of the person performing chest compressions. The software will be integrated into existing AR hardware, creating an affordable, portable, and effective means of training and debriefing chest compression performance. The platform will be applied to a variety of learner groups across multiple disciplines.
Turning Zoombies into Students: Encouraging Community and Engagement in the Virtual University
View Recording: https://jh.hosted.panopto.com/Panopto/Pages/Viewer.aspx?id=6243783a-2311-4a1f-985e-ad4a0013a440
- Daniel H. Foster, PhD, Peabody Institute
This presentation focuses on my First-Year Seminar (FYS), “What Can Music Do For Us?” Instead of a topic-based course like “Math and Music” or “Opera and Literature,” I based my seminar on a “big question” answerable from many academic perspectives, in order to attract students equally from Whiting, Peabody, and Krieger. Because of the course’s online modality and its status as an inaugural FYS, and because I aimed to achieve some of the main pedagogical goals in CUE2, I tried both tech and non-tech ways to support student-led learning communities and to stimulate student engagement. Humanities, science, engineering, and music students determined much of the syllabus’s content and were encouraged to “dream big,” both as individuals and in collaboration when designing large-scale projects. Each student provided the class with texts (e.g., articles, podcasts, videos, or websites) that answered our class question about the uses of music. To further encourage collaboration, students were regularly given time together in Zoom without my presence, sometimes to discuss an assigned question and sometimes just to chat casually. More structured collaboration included student-designed large-scale projects; giving students class time to talk privately in breakout rooms allowed them to choose partners for collaboration. Having exhausted the possibilities of Zoom, with its problems of disembodiment, distraction, and the digital divide, we then turned to Virtual Reality (VR). Extending these experiments further, next year I plan to teach a class in which the students and I will meet regularly in a virtual classroom wearing standalone VR headsets.