Illumination by Modern Campus

Matthew Gunkel (University of California Riverside) on Exploring the Myths, Realities and Innovations of Generative AI in Higher Ed

February 22, 2024 Modern Campus

On today's episode of the Illumination by Modern Campus podcast, podcast co-founder Amrit Ahluwalia was joined by Matthew Gunkel to discuss the introduction of generative AI in higher ed and the need for a digital-first approach to shaping the future of student learning.

Voiceover: Welcome to Illumination by Modern Campus, the leading podcast focused on transformation and change in the higher education space. On today's episode, we speak with Matthew Gunkel, who is CIO at the University of California Riverside. Matthew and podcast co-founder Amrit Ahluwalia discuss the introduction of generative AI in higher ed and the need for a digital-first approach to shaping the future of student learning.

Amrit Ahluwalia (00:02): Well, Matt, welcome to the Illumination podcast. It's great to be chatting with you.

Matthew Gunkel (00:06): Wonderful. Thank you for having me today.

Amrit Ahluwalia (00:08): Absolutely. Well, you did a fantastic presentation at last year's EDUCAUSE conference, talking a little bit about generative AI, its progression and its introduction to the post-secondary space, and I think any conversation about generative AI needs to start off with a little bit of debunking. So I'm hoping we can kick off just by talking a little bit about what is generative AI, and what isn't it?

Matthew Gunkel (00:34): Yeah, I mean, I think it's a really interesting question when we begin to consider AI as just sort of this broad field, and then the different types of machine learning or artificial intelligence, or just these different things and concepts around large language models. And of course, large language models are a very specific modality with regards to AI. And so I think when you consider that, I mean, universities are broadly engaged in large science endeavors. And so in that regard, we're looking at lots of different concepts for applied ways in which we can use different data mechanisms for essentially portions of artificial intelligence. And so this is where this multimodal concept starts to come into play. And so for me, I think large language models are the sexy new thing on the market. It's the thing that's making the news cycle, but ultimately we've had long-term engagement with lots of these different kinds of technologies. This is just the one that's now starting to have an emerging impact, not only in what we're hearing about, but also in the everyday work and lives that people are taking on.

Amrit Ahluwalia (01:58): Well, I think what's so interesting about generative AI is truly the confusion that surrounds it. I think there's a lot of fascination with tools that can do what people feel they're designed to do in the first place. But getting to that concept of the misconceptions, this is where I think people might be getting confused about what the technology can actually do. What are some of the most common misconceptions you've bumped into when you're in a conversation with someone on campus about generative AI and the impact it can have?

Matthew Gunkel (02:34): Yeah, I think there's this concept of automation that doesn't fully come to life. So when you're in there and you're now using this prompt engineering, or you're watching these people on TikTok and they're like, well, if you just ask it X, Y, Z, you'll now get the magical result you were looking for. And then I was talking with an associate provost just the other day, and he was doing some statistical analysis and he was like, it keeps giving me the wrong answer. And so finally we actually reversed it. We sort of said, hey, here's what we're trying to do. Actually write us a prompt for how we would ask you to correctly solve this problem. But it highlights the challenge, which is just that the things these tools generate can be very interesting, but they're often not correct, or you're now trying to balance the correctness with application.

(03:26): So all too often, right, I'll hear these very high-level, generic responses, and then when we actually get into real work, we're like, that's a good start. It helps me with my thinking, but it's not actually the applied answer that I'm now looking for, or it lacks real depth. And so this is where a lot of people, I think, are very hopeful that they're going to see automation around that depth piece and the application in their real work. So one example for me: I've recently been looking at some strategic planning, and I was asking it some deep questions on emerging technology, including AI. And again, it comes back with these really nice high-level answers, and you read 'em, and then you're just kind of like, well, yeah, but when you push on it, that response really doesn't fully stand up. I think that's a big misconception, or it's a desire that we're seeing from people: they hear the hype, and then they go and they use it, and then they're like, but it didn't deliver.

(04:29): And I think that's one of the things that we're really seeing as a challenge, or they want more interconnectedness, so they want awareness. So a funny one: the provost here was actually trying to use AI to write a poem about our chancellor, and ultimately the AI just didn't have the contextual awareness, and it couldn't cobble together an understanding of who our chancellor was, even though there's lots of social media, there's lots of content that could be pulled in. And so again, it's the application: it wrote a poem, it just wasn't very good, and it wasn't really contextually relevant about the chancellor. And so I think there's sort of that human and machine disconnect that still exists with the tools that we're seeing. In some cases, they're useful and they're valuable. I mean, if you're wanting to check code, or you're wanting a quick answer to some complicated question, then it's useful. But as far as getting full automation, for instance, I'd really want to do things where it has awareness of my email and can actually write prompts based on historical knowledge of conversations in email and then actually provide responses that are useful, where I can now quickly vet the response and send it versus having to sort of craft it all from the beginning.
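[Editor's note: for readers who want to experiment with the prompt-reversal idea Gunkel describes, below is a minimal Python sketch of that two-step flow. It assumes the OpenAI Python client, an API key in the environment, and an illustrative model name and task; it is a sketch under those assumptions, not a UCR workflow.]

# Step 1: ask the model to draft the prompt a person should have written,
# so a human can vet it. Step 2: run the vetted prompt as the real request.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

# Hypothetical task, standing in for the statistical-analysis example above.
task = (
    "We have survey results with a Likert-scale column and want to test "
    "whether responses differ between two colleges."
)

draft = client.chat.completions.create(
    model="gpt-4o-mini",  # illustrative model choice
    messages=[
        {"role": "system",
         "content": "You write precise prompts for data-analysis tasks."},
        {"role": "user",
         "content": f"Here's what we're trying to do: {task} "
                    "Write the prompt we should ask you so you solve it correctly."},
    ],
)
generated_prompt = draft.choices[0].message.content
print("Proposed prompt to review:\n", generated_prompt)

# After a human reviews (and edits) the proposed prompt, run it for real.
answer = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": generated_prompt}],
)
print(answer.choices[0].message.content)

The human review step in the middle reflects the point above: the generated output is a starting point to vet, not an answer to trust.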

Amrit Ahluwalia (05:52): Well, it's so interesting. I mean, this is where it starts to get into this question of technology versus magic, and it's an alignment piece that I think is really interesting, because once we start to veer into magic, I think we maybe lose a little bit of sight as to what humans happen to be great at. I don't have the data point offhand, but there was, I think, a McKinsey report that spoke to AI not necessarily replacing jobs but rather replacing paid work activities. So when you parse it to that extent, you say, well, if folks have, say, 30% of their day opened up, what more can they accomplish? What more can they really be impactful around, as opposed to doing things that are automatable? I'm curious, as you look to your colleagues, as you survey your own campus environment, what are some of the unique areas where generative AI is really capturing the imagination of your colleagues?

Matthew Gunkel (06:58): So some of these, I mean, there are things that we're hopeful for, and so then we're trying to figure out how we apply AI to those things. I've got some work in the School of Medicine right now where we're really looking at how it can assist with charting or making patient connections. So you hear five obscure data points, and then of course a physician is always trying to triangulate, well, then what does that mean is wrong with you? And of course an AI can then help with, well, here are other suggestions the physician might consider, here's medical research that might further back those up, especially if you're trying several different treatment options or plans. We're also just looking at how we can expedite levels of patient care through that charting process, through the capturing of notes, through the sharing of information back to nurses, and those kinds of things on people's plans.

(07:58): I think some of those kinds of things, to your point, where those are tasks that a computer can do well, I think we're going to see those continue to come to fruition quickly in the next six to nine months. We're also actively working on things like advising. How can we begin to tackle more complex problems? How do we assist with all of the different nuances of all of our different students who are trying to work through the bureaucracy of a university, navigate it, and then actually craft a program for the kind of learning experience and outcomes that they want to have? And so that's where we're really excited about leveraging AI for advising, because it can have awareness of schedules, it can have awareness of course outcomes. It can have an understanding of who that student is, and then how we actually might ask them survey questions around, well, what are you trying to do? What are you trying to accomplish? What do you want to be? And then how can we now customize and guide you down those paths in a much more nuanced fashion, while also then working to meet the mission of the institution, which is to graduate students, help them avoid student debt, and do some of these sort of very administrative things from an educational perspective.
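[Editor's note: below is a hypothetical Python sketch of the advising idea described above, giving a model structured awareness of a student's record and stated goals before asking for course suggestions. The data model, course codes, and prompt are illustrative assumptions, not UCR's advising system.]

from dataclasses import dataclass
from openai import OpenAI

@dataclass
class StudentContext:
    completed_courses: list[str]
    current_schedule: list[str]
    stated_goal: str       # e.g., the answer to "What do you want to be?"
    units_remaining: int

def advising_prompt(student: StudentContext) -> str:
    # Assemble the contextual awareness (schedule, outcomes, goals) into one prompt.
    return (
        "You are assisting a university academic advisor.\n"
        f"Completed courses: {', '.join(student.completed_courses)}\n"
        f"Current schedule: {', '.join(student.current_schedule)}\n"
        f"Student goal: {student.stated_goal}\n"
        f"Units remaining to graduate: {student.units_remaining}\n"
        "Suggest next-term courses that keep the student on track to graduate "
        "on time, and flag anything an advisor should double-check."
    )

# Hypothetical student record, for illustration only.
student = StudentContext(
    completed_courses=["CS 010", "MATH 009A"],
    current_schedule=["CS 011", "STAT 010"],
    stated_goal="Work in health informatics",
    units_remaining=96,
)

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment
response = client.chat.completions.create(
    model="gpt-4o-mini",  # illustrative model choice
    messages=[{"role": "user", "content": advising_prompt(student)}],
)
print(response.choices[0].message.content)

As with the charting example, the output here would be a draft for a human advisor to review, not an automated decision.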

Amrit Ahluwalia (09:18): That's so interesting. One thing before we get to hype, and I do want to talk a little bit about hype: you've sort of always been in the public institutional space. You were at Indiana, you went to the Missouri system, now you're at UC Riverside. Do you think that there's a unique sort of level of responsibility or a unique level of risk awareness that public university technology leaders in particular need to bear in mind when it comes to exploring the possibilities of technologies like, call it, ChatGPT specifically, but generative AI more generally?

Matthew Gunkel (09:59): Yeah, I mean, this is one of those where data is the new commodity, and it's the thing that's going to continue to drive us and rule us into the future. And this is where, when you look at historical privacy policies or you look through click-through agreements, where it's the thing that pops up for 30 seconds and then you say, yes, yes, company, you can have my life, here's my firstborn, and all these things. I think this is where part of our responsibility as educational institutions, or really just institutions in general, is to help provide increased transparency for people around what we are doing: where are you giving us data and information, and then how are we effectively using it? Or where do you have an opportunity to opt out or say, please not that data, please don't use my information for X, Y, Z? Or where can we give people more control over the portability of that information, so that when they leave, they can take it with them, and then they know that we're not using it, or that it's not going back and driving the development of models, et cetera.

(11:15): And so I think, for me, that's really a big responsibility that we have: to increase this transparency and to work on simplifying that narrative for people so that they can understand it, because we are seeing abstraction of the technology. The technology is getting more advanced. I mean, you even have AI scientists who look at the AI and they go, I don't know how it got there, I don't know how it learned a new language. And when they dig through the code, it is jargon, right? They can't follow it either. And so we are going to have this explainability factor that I think we have to continue to directly address. And we're seeing that in federal policy, and local governments are looking at it. But I think the more that we can do to help people understand the importance of their data and their information, the more we can really provide service to them in that regard, as the population at the university that we're trying to help and support. It can be part of the educational experience as well, which is interesting. But I think that's the big one, right? Because it's all about data and how we're leveraging and using data. It's not so much explicitly about the large language model or the artificial intelligence piece that it might be being used in.

Amrit Ahluwalia (12:30): That makes sense. That makes sense. So getting to the hype cycle, and this is, again, a topic you brought up in, oh geez, this was months ago, whatever town we were in for EDUCAUSE, and you talked a little bit about the zone of disillusionment, and it's a term you've used in this conversation as well. So as a campus technology leader, how can you sort of navigate the waning interest and exhaustion that people will inevitably run into with this technology to ensure that its adoption still stays on track as a valuable tool or a valuable asset to the operation of the institution?

Matthew Gunkel (13:09): So I think one of the big ones, first, as people are running up that hype cycle curve, is to give them runway and ramp to do that. So trying to let them play with tools, let them understand where the value is, because that's what everyone's looking for. When you're trying 10, 20, 30 tools, or you're out Googling and you're trying to find the perfect resume builder or whatever it is, the more that people can explore and try things and work with the tools to see if it facilitates different use cases or scenarios, or if it actually alleviates that workload where they're looking to solve a really operational problem, the better.

(13:51): That's sort of one way. And then I think, once that is underway, where this exploration is occurring and people are working through trying new tools, it's then pulling together community. And so this is one thing that we've really been doing heavily. So UCR heavily leverages Slack, but I think you can do this in any chat platform or community platform. But basically pulling together the people who are spending the time doing that analysis and then having a conversation around where they're deriving value, where they're finding useful answers. It's really less about the specific piece of technology, because as we're sort of seeing the expansion of all of these different tools, we're seeing however many hundreds a week or whatever we're still at, we'll eventually begin to see consolidation. So as we start to come back off that hype cycle and see the consolidation, then I think it'll really flow back into where we were seeing explicit value, and how we can start to hone in on that piece in particular and then begin to move that forward in our work.

(15:02): And that can be with continued partnerships. That can really be, if somebody truly derived something very specific in the market space and we go, ah, that's it, that one had value, starting to get those in place. On the IT side, it's also about building out the ability to integrate quickly. So again, as we're kind of trying things and people are saying, I found the value, now we're going, great, now I've got to get that integrated into ServiceNow, Oracle, whatever the big gnarly system is that we're either trying to replace, just replaced, or have had for 30 years.

Amrit Ahluwalia (15:43): One thing, stepping back from generative AI and looking more broadly at the role and the place of technology in the modern campus: I mean, you've seen over the past decade, decade and a half, technology playing a much larger role in the operation and administration of the post-secondary institution. And that shift has really happened in concert with a very distinct change in the way that students think of themselves and the way that consumers behave, right? Folks have higher and higher expectations around technology being part of their learner experience. Like, you're seeing a greater intersection of technology and physical space in terms of how folks want to interact with any service provider. And so from the perspective of the post-secondary institution, how are you seeing the role of technology in general changing in the way that the institution offers a student experience that today's learners actually expect?

Matthew Gunkel (16:44): Yeah, I think it's one of those where we're now really seeing digital-first as an expectation. So, I don't know, since the beginning of the last century we've really seen people just showing up with more technology, but the expectation wasn't there. And we're now starting to really see a student population where they've lived with technology their entire lives, and now when they're coming, not only do they want to use the tech, but they also now are a little bit more self-aware around how they like to use it and their preferences for engagement. And so that's a shift, and that's been sort of further amplified as we went through COVID, because people were then also forced to consider for themselves, how do I want to engage? And so that level of engagement and that awareness has now shifted, and so we're kind of seeing this rebalancing with how institutions are needing to consider that engagement with their student populations. Or we're seeing lots of students bringing lots of new technology into the classroom space on a regular basis that we're now having to be responsive to or beginning to take into consideration. We're even seeing that with AI. So as it's coming into the classroom, it's not that you can't stand on historical pedagogical approaches, you definitely can, but you need to be prepared for students to be bringing new narratives, new ways of thinking, and new tools to the conversation.

Amrit Ahluwalia (18:28): Well, to that end, I mean, how are you seeing the role of campus technology leaders evolving as technology in and of itself starts taking a more front-and-center role in the way that learners, and the way that stakeholders in general, interact with the institution?

Matthew Gunkel (18:44): Yeah, I think for the technology leaders, it's really facilitating the microcosm of a university and having platforms that people can operate these pieces of technology safely on. As you know, cybersecurity is an ever-increasing threat for various operations, and universities are often targeted. And so we're working every day to figure out not only how we can do all of this sort of breadth of different things that people want to run on our networks and environments, but then how we can also do that in a secure way. And so I think as we continue to see sort of more tools come onto the market, we're going to continue to see technical people needing to really build out and improve knowledge around data: how are you managing data and information and access to that information where people are managing different workflows and filling out different forms? We're seeing a number of changes just in how marketing and Google are now collecting data and information, and shifts there. And so it's similar in that regard, because we have to be in a place where we can help support all of academia and sort of what they're wanting to do from a teaching perspective, but then also bringing in all of this technology that people want to leverage and use and/or learn from, because it is becoming more and more a requirement for the learning experience.

Amrit Ahluwalia (20:27): Well, Matt, this is the phase of the interview that I warned you about, where we'll pivot from being a higher ed podcast to a food podcast. And for those of you who don't have a sense of the geography of Southern California, Riverside is about halfway between downtown LA and Palm Springs. Matt, someone's going to dinner in Riverside, which in fairness you might have to do on the six-hour drive between downtown LA and Palm Springs. Where do you need to go for dinner?

Matthew Gunkel (20:55): Oh, okay. I'm going to pivot on you and I'm going to give you my favorite. Not for, I mean, I'd say dinner, but I don't think they're open for dinner. So I'm really a breakfast guy. And so there's a place in Riverside, just downtown, called Simple Simons, and they make it all their own, so they bake all their own breads, they make their own croissants, they make their own jams and jellies. They also grind their own sausage there. Yeah, it's so good. So that is, hands down, one of my favorite places, if not my favorite place. I go there all the time, so absolutely fantastic. But again, it's breakfast, and I don't just...

Amrit Ahluwalia (21:43): Schedule your drive appropriately.

Matthew Gunkel (21:46): That's right. Yeah, well, you just got to get up extra early and hit it on the way in. You can definitely get it for lunch, so no doubt about that, but it's absolutely amazing food.

Amrit Ahluwalia (21:57): That's awesome, Matt. Hey, I so appreciate your time and thank you so much.

Matthew Gunkel (22:01): I really appreciate it. Thanks for having me.

Voiceover: This podcast is made possible by a partnership between Modern Campus and The EvoLLLution. The Modern Campus engagement platform powers solutions for non-traditional student management, web content management, catalog and curriculum management, student engagement and development, conversational text messaging, career pathways, and campus maps and virtual tours. The result: innovative institutions can create a learner-to-earner lifecycle that engages modern learners for life, while providing modern administrators with the tools needed to streamline workflows and drive high efficiency. To learn more and to find out how to modernize your campus, visit moderncampus.com. That's moderncampus.com.