Q&A: How Data and Technology are Transforming Educational Services


Peter Tuerk is the director of the Sheila C. Johnson Center for Human Services in the Curry School of Education and Human Development. (Photo by Dan Addison, University Communications)

When excellent service and training align, it’s the definition of a win-win situation: a family gets the educational and psychological services they need, and students gain valuable experience providing them.

As the new director of the Sheila C. Johnson Center for Clinical Services at the University of Virginia’s Curry School of Education and Human Development, Peter Tuerk hopes to create this type of experience for as many UVA students and clients as possible.

The Sheila Johnson Center, housed in Bavaro Hall on Grounds, is a multidisciplinary training center that provides a range of clinical services to local children and families – including speech and language, hearing, reading, clinical psychology, educational assessment, autism spectrum services, and more. At the same time, it provides UVA students with important clinical training.

There are a lot of pieces to fit together – but Tuerk, a problem-solver at heart, is up to the challenge. He began his career as a clinical psychologist working with childhood obsessive-compulsive disorder and, eventually, post-traumatic stress disorder. These days, he wears many hats as a clinical psychologist, administrator, researcher and educator specializing in evidence-based treatments for anxiety spectrum and depressive disorders.

Tuerk has also become a futurist of sorts in his field. He has investigated the effects of virtual reality, artificial intelligence and psycho-physiological wearables, among other technologies. His work treating veterans suffering from PTSD through telehealth (treatment administered over long distances via video) led to a popular TEDx talk and the Veterans Administration’s highest honor, the Olin E. Teague Award for outstanding career achievement in the rehabilitation of war-injured veterans.

Tuerk describes the work of merging health care and training processes with data tools as complex but gratifying, like putting together a puzzle. Here, he shares his vision for putting together the puzzle of the Sheila Johnson Center – and how data and technology can help make it happen.

Q. What is the primary goal of the Sheila Johnson Center? 

A. Ultimately, providing high-quality clinical training for our students is job No. 1.

We have a lot of services under one roof, and we want to leverage that environment to have a clinic where students benefit from working in a multidisciplinary, measurement-informed model. We want to provide a learning environment where students work with electronic medical records, relevant technologies and charting standards, learn how to help families navigate insurance, and collaborate with a team – many of the things they will likely encounter in their future workplaces.

Of course, our mission statement also highlights providing clinical services to the community. There are different ways to think about balancing these priorities, but to me, they are two sides of the same coin. How can clinical training be robust, well-rounded and science-based if the clinical services we provide to the community are not also effective and evidence-based? Likewise, the present and future of clinical care and educational services are interdisciplinary, so clinical training should be as interdisciplinary as possible.

Q. What’s an example of how those dual goals – training and service – can support each other effectively?

A. One example of that, for the Sheila Johnson Center, might be a “one-stop shop” for clinical triage. So, when people call the clinic and don’t know exactly what they need, we would have a student clinician ready to triage that client. Using a tested decision-tree interviewing model, that student could identify whether our services and training match the family’s needs, or whether those needs would be better met somewhere else. If they would be better met somewhere else, a high-quality, data-informed referral could be made right on the spot.

Then, it’s about being proactive and calling that family back in a few weeks to say, “How did that work out?” and tracking those data. Or if we keep the case, being proactive by offering evidence-based assessment and intervention, with ongoing measurements to ensure the care is effective.  

In this scenario, we’re providing a service to the community that’s also in service of our training model – giving students the opportunity to practice basic clinical skills that we teach in our courses here, like reflective listening, showing empathy, making sure that you’re addressing people’s needs and not the needs that you’re assuming and, most importantly, practicing evidence-based assessment and intervention.

Q. How do you bring more evidence-based assessment into the equation?

A. There are a lot of process-oriented outcome measures in behavioral health: number of clients seen, wait times, productivity metrics – all the things that clinical administrators typically care about and should care about.

What we’re also trying to move toward, more generally, is measurement-based care. That means the measurements we’re concerned about are answering a different question: Are your clients improving? Did someone go from having a mental health diagnosis to not having one as a result of treatment? Or even, did this clinical assessment accomplish what it was supposed to?

Both types of measures, process and clinical, are important. But I’m excited about focusing on measurement-based care, and doing that in a way that’s clinically meaningful for training and for the families who are receiving services. We can’t, and don’t want to, collect all potential data. We want to be strategic.

This all can sound very dry, but on the ground it can be empowering. We’re using data to help make sure people can afford their care here, to reduce waste and to demonstrate clinical improvement to clients.

There is really great work being done here at Sheila Johnson, so shining a light on it by collecting valid and reliable clinical measures is an exciting place to be.

Q. You’ve done a lot of work with telehealth, which is emerging as a way to provide clinical services to a wider range of people. How is telehealth different from traditional in-person care?

A. If you’ve ever been broken up with on the telephone, you know it can hurt just as much as it would in person. My point is, when you’re using technology to communicate, the important thing is that you’re communicating – not the technology. In clinical videoconferencing, the screen sort of fades away after the first few minutes and you’re left with the person. I’m saying this as a researcher who has studied telehealth, but also as a clinician who has regularly used it in standard care settings.

There are some habits that are important for a clinician to develop, like becoming familiar with the communities you are treating. For example, if you’re working with a family and suggest that a child be rewarded with a trip out for ice cream, you might not know that the only thing in that family’s town is a post office, with nothing else for 30 miles. Those sorts of opportunities for disconnect are more prevalent over telehealth, but they are easily addressed – and when addressed proactively, they can actually help create a productive therapeutic relationship.

Q. How can telehealth, and other new technologies, impact clinical training and services?

A. Telehealth can help deliver effective treatments or education to people who need them. It’s here to stay, and it’s evolving.

I think it’s important in the near future that we strive to provide a training environment that includes a telehealth component. But telehealth is not a panacea – it requires more resources, more problem-solving, more tech support, and it has its own unique issues related to referral stream development and patient motivation. We’ve seen over and over that just purchasing telehealth equipment or infrastructures without a needs-based strategic plan leads clinics to underutilization and waste and can be frustrating for clinicians.

If we want to include student training, we also have to make sure that our technology capabilities match the educational programs’ needs and values.

Q. How do you assess new technologies and decide whether or not to use them?

A. There are lots of interesting and innovative technologies on the horizon; the struggle is not to confuse novelty and “wow” factor with clinical effectiveness. I think there is often a pull to adopt technology for technology’s sake, which is the wrong approach in a clinical setting. We want to use technology strategically, to solve specific problems or to improve specific outcomes.

Conducting a needs assessment, even an informal one, can be an important first step. Many technology products, especially human service products, are not sufficiently tested for usability, so they can be difficult to implement. But a strategic plan, informed by an initial assessment of the need or problem to be solved, can help build support and patience for the learning curve, even before a technology is implemented. The need should always drive the technology, not the other way around.

Q. What is your overall vision for the future of the Sheila Johnson Center?

A. The goal is to provide a vibrant, welcoming, multidisciplinary and science-based environment for students to learn and serve in.

Overall, we want to be a collaborative link between our degree programs and the community – where faculty and students in different programs can come together and be creative in a service environment. To this end, I envision us expanding educational, psychoeducational and treatment opportunities, so that more of the children we assess can also get the interventions they need. We will be opening up opportunities for cross-disciplinary observation of clinical or educational services, implementing behavioral parent training models of intervention, and strategically building on our relationships with local schools and services.

In all of these efforts, we will devote attention and resources to tracking the information required to assess and improve the programming.

We’ve got this wonderful resource here in the clinic, and an education school ranked in the top 5 percent in the country, so the opportunity to optimize those resources to serve the community and enhance our training mission is exciting. But in all these endeavors, we want to be focused and deliberate and able to qualify and quantify expected outcomes.

Media Contact

Laura Hoxworth

School of Education and Human Development