Why Fighting AI in the Classroom Is the Wrong Battle with Chris Campbell - Ep 207
Chris Campbell is the CIO at DeVry University, a career-aligned institution focused on delivering education in high-demand technology fields that translates directly into workforce-ready skills. In his role, Chris leads technology, cybersecurity, and digital innovation across the university, with a current focus on integrating AI into both operations and curriculum in ways that prepare working adult students for the jobs they are already heading into. DeVry has made a public commitment to embed AI literacy into every program and every course regardless of discipline, and Chris is the technology leader driving that commitment from the inside out.
Here’s a glimpse of what you’ll learn:
- Why DeVry made a formal commitment to embed AI literacy into every program and course regardless of field of study and what that looks like in practice
- Why Chris prefers the term durable skills over soft skills, and why critical thinking, problem solving, and human collaboration are the career-long differentiators that AI cannot replace
- How DeVry created an internal AI Labs function to manage prioritization, ROI, and the scaling of AI initiatives without losing focus on student outcomes
- Why the intersection of quantum computing and AI is the one development that concerns Chris most as a security leader and what quantum-resistant encryption means right now
- How DeVry uses AI on both sides of the academic equation, helping students build stronger outputs and helping faculty assess work more effectively while maintaining academic integrity
- Why the power users of AI treat it like an employee rather than a tool and why that distinction determines whether AI delivers lasting value
- Why vibe coding and other AI-assisted development shortcuts still require a foundation of architectural thinking, and why fundamentals matter more now, not less
- How cybersecurity education is shifting away from manual monitoring toward systems thinking, threat understanding, and working alongside automated tools
In this episode…
Chris opens with a position that should be the default for every higher education institution but still is not: if students are walking into an AI-influenced workforce, then keeping AI out of the classroom is not protecting academic integrity; it is leaving students unprepared. DeVry has gone further than most by making a formal public commitment to embed AI literacy across every program, from accounting to software engineering, not as a standalone course but as a thread running through everything. The framing matters too. Chris does not talk about AI replacing work. He talks about AI redefining it, and that distinction shapes every decision his team makes about how students learn, how faculty teach, and how the university itself operates.
The internal structure Chris has built is worth noting. Rather than letting AI initiatives proliferate without direction, DeVry stood up an organization called AI Labs that routes every AI initiative through a single prioritization and ROI framework. That discipline has allowed them to evaluate tools seriously, adopt what works, and avoid the trap of chasing every new product that claims to be AI-powered. On the security side, Chris draws a sharp distinction between machine learning applied purposefully in products like Darktrace and Abnormal versus traditional security products with an LLM bolted on. He describes the emerging dynamic as an LLM war, where the organization is putting AI in on one side and the threat actors are putting AI in on the other, and the organizations that do not deploy deliberately will find themselves on the losing end of that exchange.
The conversation about durable skills is where Chris brings his most forward-looking thinking. He rejects the term soft skills precisely because it undersells what these capabilities are worth in an AI-powered world. Critical thinking, problem solving, the ability to articulate how you arrived at a conclusion, and the ability to collaborate with both human and agentic workers are the skills that will carry people across their entire career arc regardless of what the technology landscape looks like in five or ten years. DeVry builds these into every level of the curriculum, from certificates through bachelor's degrees, because the institution's defining purpose is relevance. What they teach must align with where the workforce is going, not where it was.
Resources mentioned in this episode
Matthew Connor on LinkedIn
CyberLynx Website
Chris Campbell on LinkedIn
DeVry University Website
Sponsor for this episode...
This episode is brought to you by CyberLynx.com. That's Cyber-L-Y-N-X dot com.
CyberLynx is a complete technology solution provider to ensure your business has the most reliable and professional IT service.
The bottom line: we help protect you from cyber attacks, malware, and the dreaded Dark Web.
Our professional support includes managed IT services, IT help desk services, cybersecurity services, data backup and recovery, and VoIP services. Our reputable and experienced team, quick response time, and hassle-free process ensures that clients are 100% satisfied.
To learn more, visit cyberlynx.com, email us at help@cyberlynx.com, or give us a call at 202-996-6600.
Check out previous episodes:
Maritime Cybersecurity, AI Governance, and the Threats No One Sees Coming with Amit Basu - Ep 206
The Arms Race, the Energy Gap, and the Ethics of Teaching AI to Be Good with Alex Dalay - Ep 205
Transcript:
Chris Campbell Interview Transcript
Cyber Business Podcast
Guest: Chris Campbell, CIO, DeVry University
Matthew Connor: Matthew Connor here, host of the Cyber Business Podcast. Today we're joined by Chris Campbell, CIO at DeVry University. Chris, welcome to the show.
Chris Campbell: Thank you, Matthew. Glad to be here.
Matthew Connor: It's great to have you. Before we get too far in, a quick word from our sponsors. Hackers are getting smarter — is your security keeping up? CyberLynx offers industry-leading, AI-powered cybersecurity solutions that detect threats in real time, so you know about an attack before the damage is done, not after. Learn more at cyberlynx.com. And now back to our show.
Chris, for those who aren't familiar, can you tell us about DeVry University and your role there as CIO?
Chris Campbell: Sure. At DeVry University, we're focused on delivering career-aligned education, especially in high-demand technology fields. Our mission is to help learners build skills that translate directly into the workforce and carry them across their career arc. As CIO, I lead our technology, cybersecurity, and digital innovation efforts, with a major focus right now on integrating AI into both how we operate and how we prepare our learners for the future of work.
Matthew Connor: Those are two topics on everybody's mind right now. Let's start with how you're integrating AI into the teaching side — how you're preparing students for the AI-powered future.
Chris Campbell: One of the things we've done is publish our AI commitments, which include incorporating AI literacy and AI skills into every program and course we offer, regardless of the area of focus. Whether you're studying accounting or software engineering, it doesn't matter — the skills required to use AI effectively are relevant to all of them. We've been focused on this for about the last twelve months. We don't see AI as replacing work; it's really more about redefining it. We operate from the belief that every graduate is walking into an AI-influenced role, whatever field they're entering.
Matthew Connor: I think that's fantastic. You hear a lot in the news about educational institutions grappling with how to keep students from using AI, and to me that makes no sense. A large part of what a university is for is helping people get jobs. It doesn't make sense to shield them from AI if they're going into an AI-powered job market. How are they going to be competitive if their education looks the same as it did twenty or thirty years ago? I love that DeVry University is integrating it into everything. That's brilliant.
Chris Campbell: We really believe that not everyone needs to be an AI engineer, but every single person is going to need to know how to use it effectively. And what's interesting about the world of AI is that the durable skills employers have always prized — critical thinking, problem solving — are still required to use AI well. You still have to think critically, do the planning, and bring those distinctly human capabilities to the work. The skill of leveraging AI to propel that work is simply another layer on top of that.
Matthew Connor: Absolutely. And can you walk us through what that actually looks like in a course? Because I think a lot of people still associate the discussion of AI in education with cheating — the student just had AI do the work for them. What's the DeVry University approach?
Chris Campbell: What we're focused on is practical exposure. We're working to help students evaluate how they leverage AI to get strong outputs, how they assess whether those outputs are actually strong, and how they take the analysis further. What we're deliberately avoiding is what I've heard recently called "AI slop" — just blindly accepting whatever the tool produces. We firmly believe that if you don't have practical, meaningful ways to use AI, you're going to be at a disadvantage the moment you hit the workforce. We're already seeing real challenges emerge for some entry-level roles, and we want to make sure our students know how to leverage this powerful tool to their advantage.
Matthew Connor: One hundred percent. The goal of education is learning how to think. If you're just passing everything off to AI, you're passing time, not building capability. But the approach you're describing — you still do the thinking, and you use AI to get the job done better and be more competitive — that's the right model. I've spoken with at least 150 CIOs across a range of universities, and I think this is the first time I've heard an institution embrace it this directly and thoughtfully. I really hope others take note and start emulating it.
Now on the CIO side — how does integrating AI at the student level change how you're managing things internally?
Chris Campbell: It's certainly put a renewed focus on upskilling and driving AI literacy within our own organization. We've taken lessons learned from the academic side and applied them internally, committing not only to the curriculum changes but also to ensuring that all of our colleagues are upskilled, oriented, and have access to the tools that will make them more effective.
What we're also seeing shift is the nature of the work itself. We used to focus on very technical, procedural skills — how to create a variable, how to move data from point A to point B. Now there's a much greater emphasis on critical evaluation, understanding risks like bias, and higher-order thinking. We're even seeing that early planning work move into the functional areas of the university. It used to be strictly architects in a room figuring things out. Now we're getting better requirements, better problem definition, and more creative thinking about potential solutions from partners across the whole institution — which makes delivery dramatically faster and better. What AI is bringing to the table is genuinely powerful, and also genuinely fraught with risk. The more knowledge and literacy we build across the university, the better positioned we all are.
Matthew Connor: I couldn't agree more. And I think one of the most exciting areas is AI in cybersecurity specifically. We're seeing tools like Darktrace using machine learning in really effective, well-designed ways — the right kind of AI doing the right job. Then there are traditional security products where someone just bolts an LLM onto the stack and calls it AI-powered, which actually introduces new problems like prompt injection vulnerabilities. Products like Darktrace and Abnormal Security give us a real glimpse of where AI can take cybersecurity when it's done thoughtfully.
But on your end, is integrating AI into security and other parts of the university a straightforward process, or is it something you're approaching with great caution? Because there are hundreds of new products launching every day and only so many hours to evaluate them. How do you manage that as a CIO, in an industry that has never moved this fast?
Chris Campbell: You're right, and I've been in this industry for quite a while — I haven't seen anything move at this pace either. One of the first things we did when the landscape shifted was stand up an internal organization we call AI Labs, and we run all of our AI initiatives through it. That helps with prioritization, ensures we have a clear ROI to pursue, and helps us scale our initiatives thoughtfully — whether on the security side or anywhere else. The idea is that there are a million things you could chase, and AI Labs gives us the structure to be selective and deliberate.
On the cybersecurity side specifically, what we're starting to see is what I'd call the LLM war — AI going in on our end, and adversaries putting AI in on the other end, and these things beginning to compete with each other. That is frankly a frightening aspect of where this is heading. So we tend to be very deliberate in our pace, ensure we have clear ROI, and confirm we can at least see a roadmap to scale before we commit. We may run a test — we've looked at and actually used Abnormal Security, and it performs well. But for every one product like that, there are probably a dozen that simply didn't produce what we expected.
My bigger fear, honestly, is the intersection of quantum computing and AI. That one makes me nervous. That's going to be a step-change difference.
Matthew Connor: That is a game changer, and a scary one because it's so unknown. I think for the foreseeable future AI will create more jobs than it eliminates — much like the early days of the internet. But quantum computing combined with AI really does change the equation on everything, including encryption. Governments are already worried about adversaries stockpiling encrypted data today that they can't crack yet, knowing that quantum computing will eventually let them decrypt it retroactively. Quantum-resistant encryption is becoming a genuine near-term priority in government circles, though we haven't seen as much urgency on the commercial side yet. It's a strange moment — there have been so many breaches already that people sometimes wonder what's truly private anymore. You still have to fight for privacy, but the ground has shifted.
Chris Campbell: Right, and cycling back to your question about cybersecurity — it is still one of the most in-demand fields we track. But the nature of the work is changing rapidly. There's far less emphasis on manual monitoring, the room full of security analysts watching dashboards. There's much more energy on understanding systems, understanding how threats materialize through an organization's workflows, and working effectively with automated tools. That's why, both for our learners and internally, we prioritize hands-on experience — labs, simulations, real-world scenarios — and building that practitioner mindset. We're developing it in our students and developing it in ourselves simultaneously.
Matthew Connor: That's really the differentiator between DeVry University and a lot of other institutions. Many take a traditional, academic approach. You're taking a very real-world, hands-on approach — here's what's happening today, let's get you up to speed so you're an asset the moment you join an organization. And you're doing it in real time, which is remarkable, because unlike printing an encyclopedia that's outdated before it ships, you're keeping pace with a field where everything changes week to week.
Chris Campbell: The pace of higher education is an interesting thing to navigate. We are an industry that still wears and celebrates medieval robes at commencement, which says something about how quickly the sector traditionally moves. Higher education is not the fastest-turning ship. But we find ourselves working hard to figure out how to pivot and bring this rapidly changing technology to bear more quickly. Many universities are doing this successfully, as is DeVry University. Make no mistake, though — it is a foundational kind of shift. It touches the very structure of how we operate.
Matthew Connor: How has it been received on the student side? Are they embracing it, or do some students still want the traditional model — here's the material, take the test, get the grade?
Chris Campbell: You'll always have students who prefer to absorb information, take their assessment, and move on. But the students who are genuinely hungry for development, the ones who can see the road to their career (and that's the majority of our students), are very excited about the opportunity to work with this technology, often for the first time.
It's also worth noting that the DeVry University student is quite distinct from a traditional college-age student. Almost all of our students are working adults with full-time jobs. They deeply resonate with the opportunity to learn something today that they can take to work tomorrow. That's central to our career-focused model. And one of the things we've learned at DeVry University over the years is that early, tangible exposure makes the biggest difference. When a student can see how something works in practice, and can see people who look like them doing it successfully, it changes their trajectory. That's one of the reasons we layer mentorships and outreach opportunities on top of the academic work — to create those connective moments early in a student's journey.
Matthew Connor: That makes a lot of sense. And how does grading work when students are expected to use AI? If the old model was "write a paper," what does assessment look like now?
Chris Campbell: We leverage AI on both sides of the equation. Academic integrity is still very much a real and important thing at DeVry University, and the way we address it is by designing learning experiences that genuinely require critical thinking, deep analysis, and the problem-solving and organizational skills that don't simply fall out of asking ChatGPT or Claude a question. The assignment structure demands that students demonstrate their reasoning, not just produce an output.
Our faculty also use AI platforms to assist with grading and assessment. And in a lot of ways it comes back to what we all learned in school: show your work. How did you get there? Does your reasoning hold up? That kind of accountability ensures we understand what the student actually contributed. We've found our faculty are fully up to the challenge, and our students are doing well. We really haven't seen a significant wave of academic dishonesty — some, of course, but far less than one might fear.
Matthew Connor: That's great. Looking into your crystal ball — where does this go? Does DeVry University continue down this same path, or do you see it changing dramatically as AI becomes even more capable?
Chris Campbell: I think we continue down this path, but with an ever-sharper focus on discrete skills. The other major emphasis is on what we call durable skills — what others might call soft skills, but we prefer "durable" because they're the ones that carry you across your entire career arc. Deep thinking, critical evaluation, the ability to assess and fact-check information, human collaboration — and increasingly, knowing how to collaborate with a digital worker alongside a human one. Agentic AI is already raising that question in very practical ways. I don't think this is going to stand still. It's going to keep us on our toes.
Matthew Connor: That's a great point. I read an article — I believe it was Forbes or Fortune — about when Microsoft first rolled out Copilot internally. After three or four months, they assessed usage and found that about 10 to 15% of people were heavy power users who continued to leverage it deeply. The other 85% tried it and tapered off. When they dug into why, they found the power users treated AI like a new employee — they communicated with it, taught it what it needed to know, gave it context, and managed it intentionally. The others just wanted it to instantly do their task, got frustrated when it didn't, and reverted to doing things themselves. That divide maps exactly to what you're describing. The communication skills, the ability to direct and collaborate — those translate directly into how effectively someone uses AI.
Chris Campbell: Exactly. And the real superpower of AI is context — context of your customer, context of your organization. That's where the genuine value lives. You've probably read about vibe coding and similar trends. But if you don't understand how to architect a software platform that is secure, functional, and reliable, vibe coding is not going to carry you very far. You'll end up with real challenges. It can absolutely be an accelerator, but the fundamentals and the foundational understanding are still the superpower. Our adult learners already have that superpower in their domain knowledge and work experience. What DeVry University is working to do is amplify it — help them do even more with what they already know. AI is accelerating change, but the fundamentals matter. Adaptability, critical thinking, continuous learning — those are what will carry you across a career arc.
Matthew Connor: One thing that strikes me about this conversation is how mission-focused you are as a CIO. There's been a real shift over the last decade from IT being the department of "no" — always checking, always cautioning — to CIOs who are genuinely driving organizational outcomes. You clearly embody that. Your focus is on student success, not on which data center you're in or which productivity suite you're running.
Chris Campbell: You have to do both — I want to be clear about that. But I will say that as CIO, I spend far more time talking about student success and persistence than I do about infrastructure decisions or productivity app choices. Those conversations are just not that relevant when you measure everything against the mission and the outcomes we're here to deliver. Our challenge — and I think this is true for most CIOs today — is figuring out how to do both well. And it's partly what's driving a resurgence of the generalist in IT: someone who knows something about a lot of things and can leverage these powerful tools to go deep when needed. It's an interesting time.
Matthew Connor: It really is. I think we're living in the most exciting time in human history — the most interesting, the fastest-paced, and by any measure the most prosperous, thanks to the technology we've built. And it's only going to accelerate. Being in technology right now, and especially being in a position like yours where you can help others get into it and thrive in it — that's remarkable work. DeVry University is genuinely preparing people to be more competitive, more capable, and more valuable to the organizations they join. That's a lot more than a diploma.
Chris Campbell: Thank you. We think we're approaching it right. The common thread for everything we do is relevance — making sure what we're teaching aligns with where the workforce is actually going. That's the north star.
Matthew Connor: Well said. Chris, this has been an absolute pleasure. Before we go, can you tell everyone where they can find out more about you and DeVry University?
Chris Campbell: Absolutely. www.devry.edu is a great place to start — you'll find everything we're doing and how we're focused on workforce outcomes. And you can find me on LinkedIn as well. The intersection of technology and humanity is a fascinating space, and I have the privilege of waking up every morning thinking about how to help learners create career mobility — whether it's a pivot, a promotion, or starting from scratch. We'd love to hear from anyone who wants to reach out.
Matthew Connor: Fantastic. Chris, thanks so much. Until next time.
Chris Campbell: Thank you. Have a great day.
Matthew Connor: You too.