In the latest episode of our AI-first podcast series, Box's Chief Customer Officer sits down with Jon Allen, who serves in a dual role as CIO and CISO at Baylor University. Episode timestamps:
(00:00) Jon Allen’s dual role at Baylor University
(03:10) Defining what it means to be AI-first at Baylor
(07:00) Real-world examples of AI transforming workflows
(12:30) Leveraging AI to enhance collaboration and speed up operations
(18:00) Balancing cybersecurity with AI adoption in higher education
(24:00) The importance of education and transparency in AI adoption
(30:00) Managing data privacy and security challenges with AI
(36:00) Preparing students for an AI-first workforce
(42:00) The future of AI in higher education and content management
(48:00) How Baylor is evolving to support AI-driven innovation
(54:00) Closing thoughts on AI adoption, culture, and strategy in education
Highlights and quotes
AI as an Enabler of Human Connection
A recurring theme in the conversation was that AI should supplement, not replace, human interaction. Allen debunked the outdated narrative of AI as “the rise of the robots” and reframed it as a virtual companion or collaborative team member. He emphasized: “AI isn’t the replacement… AI is actually an enabler to greater human interaction.”
For Allen, AI is a powerful tool for reducing time spent staring at screens while empowering people to connect more deeply with each other. This philosophy challenges common assumptions and underscores why organizations should position AI as a bridge to, rather than a barrier against, human connection.
The Role of Education and Accessibility
As AI integrates into everyday tools, accessibility has become critical. When it comes to system design, Allen noted practical shifts that make AI tools more impactful. Simple improvements—like moving a button within an application rather than forcing users to navigate an external web platform—can significantly increase adoption and value.
“Small adjustments like these directly influence how users interact with AI, encouraging use and showcasing real enablement,” he asserted.
Additionally, Allen highlighted the emergence of “AI natives,” referring to the younger generation who intuitively rely on AI for context, decision-making, and problem-solving. He shared a relatable example:
“My youngest two kids… they think AI natively. When faced with a problem, the first thing they do is engage with AI.”
Preparing Students and Teams for the Workforce
Higher education is at an inflection point as students approach their careers armed with AI tools that align with their instincts. According to Allen, institutions must provide similar AI technologies in the classroom to prepare students for workplace applications—along with guardrails to ensure responsible use.
Speaking to these concerns, Allen explained: “The context that students learn AI in is often divorced from intellectual property realities. They shouldn’t have to learn those lessons when they enter their first jobs.”
For technology leaders, this highlights the need to balance innovation with practical training on security, ownership, and ethical AI usage.
Balancing Innovation and Risk
Another central topic was the tension between innovation and risk management. As both Baylor’s CIO and CISO, Allen underscored that his role involves balancing these two often opposing forces. He explained:
“Good risk management is about enabling the business while safeguarding intellectual property and constituent data.”
This mindset applies directly to AI adoption: Allen urged leaders to stop fearing AI technologies and to accept that, unlike traditional black-and-white systems, they operate in gray areas, which makes them feel “more human-y.”
To navigate this duality, decisions must meet key criteria:
“Any technology should be evaluated based on whether it’s truly an enabler, the best solution for the problem, and aligned with continuous improvement.”
Lessons Learned as a Technology Leader
Reflecting on his experiences over the past three years, Allen shared valuable lessons for leaders navigating technological transformation:
- Adaptability Over Assumptions: “Don’t always assume you can predict marketplace trends.”
- Principles Over Policies: “I’m not one to run to policy. Creating unnecessarily rigid policies could hinder adaptation; principles should be flexible enough to evolve with the needs.”
- Enabling Bottom-Up Innovation: “The best ideas often come from individuals embedded in the organization. Empower them to surface problems and propose solutions.”
Leaders must enable frontline workers both as architects of innovation and as stewards of safe technology use—embracing their expertise while guiding strategic alignment.
Allen pointed to HECVAT (the Higher Education Community Vendor Assessment Toolkit) as the standard for vendor evaluation in higher education, one that now also helps institutions evaluate AI offerings. On the time savings involved:
“Well, and if you could take that forty to sixty hours of staff time and not only reduce it, but also compress it in a way that the turnaround back to the requester is faster, it just means we accelerated the velocity of getting that through the system.”
“So maybe a tip for anyone out there listening who sells to higher ed: get on the HECVAT, make sure you understand what’s in there and how to complete one.”
Democratization of AI Platforms
The podcast highlighted key advancements in AI accessibility and democratization, particularly as platforms evolve to include multiple AI models. Discussing this evolution, Allen remarked:
“Six months ago, there was one model. Now, platforms offer three or four lineages… these tools must democratize access so anyone—not just technologists—can use them effectively.”
Furthermore, security stood out as a priority:
“Maintaining data in one secure domain rather than moving content across platforms simplifies risk management and strengthens posture.”
He continued: “Researchers, especially on our campus, would say, ‘I need to get access to multiple models, and there’s no easy way for me to do that.’ Right? You guys have enabled that in a huge way, to be able to go into a single platform and just click a drop-down and select from multiple lineages of models, and see how the results vary so greatly.”
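To make that multi-model idea concrete, here is a minimal sketch of running one prompt against several model lineages through a single entry point. The model names and the query_model helper are illustrative placeholders for whatever the platform exposes, not a real Box (or other vendor) API.

```python
# Hypothetical sketch: compare the same prompt across multiple model lineages
# surfaced by a single platform. Names below are placeholders, not real APIs.

PROMPT = "Summarize the key risks in this vendor security questionnaire."

# Placeholder identifiers standing in for "multiple lineages of models".
MODELS = ["model-lineage-a", "model-lineage-b", "model-lineage-c"]


def query_model(model_name: str, prompt: str) -> str:
    """Placeholder for a single-platform call that routes to the chosen model."""
    # A real integration would call the platform's API with the selected model;
    # here we just return a stub string so the sketch is runnable.
    return f"[{model_name}] response to: {prompt!r}"


def compare_models(prompt: str, models: list[str]) -> dict[str, str]:
    """Run one prompt against every available model and collect the answers."""
    return {model: query_model(model, prompt) for model in models}


if __name__ == "__main__":
    for model, answer in compare_models(PROMPT, MODELS).items():
        print(f"--- {model} ---")
        print(answer)
```

The point of the sketch is the workflow Allen describes: one prompt, one secure platform, several model lineages side by side, so the comparison work happens without moving content between systems.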
Wrapping Up: A Culture Shift from “No” to “How”
The conversation concluded with Allen’s philosophy of technology enablement, highlighting the shift from being “the office of no” to “the office of how.” His advice to leaders? Always evaluate tools on whether they truly enable strategic goals or continuous improvement.
We would love to hear from our Higher Ed community how these use cases and Jon Allen’s comments resonate with your experience. Feel free to comment in the reply!
