Strategic Leaders and Partners
Institutional leaders are cautiously optimistic about AI. Just over half (52%) of executive respondents reported that leaders at their institution are approaching AI with a mix of caution and optimism (see figure 5). Further, nearly a third (29%) of executives reported that leaders at their institution are enthusiastic or very enthusiastic about AI. In contrast, managers, directors, and frontline faculty and staff were more likely to indicate that their leaders are very cautious, cautious, or indifferent toward AI. Together, these results point to an opportunity for colleagues to communicate across silos about their sentiments.

At most institutions, someone is working on AI-related strategy. A majority of respondents (73%) indicated that some or most units at their institution are working on AI-related strategy (see figure 6). In "other" open-ended responses, respondents described situations in which only a few interested individuals are working on AI. One respondent explained, "Most are aware that we need an AI strategy but are awaiting central guidance on how to start."

There's no clear front-runner for AI-related strategy leadership. Just over a quarter of respondents (28%) indicated that executives are the primary leaders for AI-related strategy at their institutions (see figure 7). That response is closely followed by faculty, managers/directors, and professionals/staff, at 21%, 17%, and 16%, respectively. Most of the respondents who selected "other" described collaborative teams comprising multiple job roles. As one respondent wrote, "Everyone. We have an 'AI is for Everyone' campaign, and strategies are both grassroots and top down." This sentiment is echoed in a closed-ended survey item in which over half (55%) of respondents agreed or strongly agreed that collaborative groups spanning units are working on AI-related strategy at their institution.

Disaggregation of these data by respondent job role reveals an interesting pattern (see table 1). Respondents in leadership roles were more likely to identify executive leaders, managers, and directors as the primary leaders for AI-related strategy at their institutions. In contrast, respondents in frontline roles were more likely to identify faculty and professionals/staff as the primary leaders for AI-related strategy at their institutions. This pattern indicates that opportunities exist for more institution-wide communication, agreement about roles and responsibilities, and collaborative work.
| Respondent Job Role | Professionals/staff are the primary leaders. | Faculty are the primary leaders. | Managers/directors are the primary leaders. | Executive leaders are the primary leaders. | Don't know/other |
|---|---|---|---|---|---|
| Leadership (i.e., executives, managers, directors) (N = 392) | 11% | 17% | 21% | 37% | 14% |
| Frontline (i.e., professionals, staff, faculty) (N = 386) | 22% | 23% | 12% | 19% | 24% |

Everyone has a role to play in AI-related strategic planning. More than half of respondents (56%) indicated that they have personally been given responsibilities related to AI strategy. When these data are disaggregated by job role, a larger proportion of respondents in leadership positions than in frontline roles reported being tasked with AI strategy (see figure 8).

In a follow-up question, respondents described the specific AI-related tasks they are working on:
- Educating students, staff, and faculty about AI
- Developing guidance and support for faculty using AI in their teaching
- Facilitating faculty teaching and learning groups (e.g., faculty learning communities)
- Coordinating and supporting task forces and working groups
- Creating and running AI-focused academic programs
- Evaluating AI-powered tools
- Developing and evaluating a wide variety of AI-related policies and guidelines related to faculty, staff, and students, both for teaching and learning and for work
- Supporting AI for research computing
- Advising stakeholders about AI-related privacy and security issues
- Researching AI-related topics (e.g., use cases, exemplars, privacy and security)
Notably, some respondents emphasized the ethical and transparent use of AI, for example, "working with faculty…on an ethical use statement" and "crafting a policy for ethical and transparent student use of generative AI in coursework and assessments."
Similarly, most respondents indicated that all functional areas are at least somewhat responsible for AI-related strategy (see figure 9). Unsurprisingly, teaching/learning and technology top this list, with 86% and 78% of respondents (respectively) indicating that these units are somewhat or to a great extent responsible for AI-related strategy. This finding is reinforced by respondents' answers to a separate closed-ended survey item. Most respondents (68%) agreed or strongly agreed that their faculty's interest in incorporating AI into teaching is on the rise, and 38% agreed or strongly agreed that their IT leaders consider AI technology mission-critical in terms of the support provided.

For "others not listed," respondents described functional areas such as accessibility, admissions, health care, continuing education, public safety, board of trustees, libraries, marketing and communications, and student affairs.
Most institutions are probably not leveraging third-party partnerships for AI-related strategy and cost. More than half of respondents (57%) indicated either that their institution is not working with third-party partners to develop AI strategy or that they don't know whether it is (see figure 10). The most commonly chosen partners were peer-institution consortia or networks (30%) and professional associations (22%).

Similarly, 84% of respondents indicated either that their institution is not working with third-party partners to share AI-related investment costs or that they don't know whether it is (see figure 11). The top three sources of funding selected by respondents were government funding agencies, corporations, and foundations (8%, 7%, and 6%, respectively). The lack of third-party partnerships for funding AI investments points to an opportunity for funders to take early action.
