2024 AI Breakout Report: Perceptions of Planning and Readiness

Sentiments Toward Strategic Planning and Readiness

Respondents were asked to rate their sentiment toward sixteen statements focusing on their institution's AI-focused strategic planning and readiness. We report findings in the following areas: AI as an investment and strategic priority, strategic planning and implementation, technology and IT support, and teaching and learning. Comparisons are provided by institutional position, by primary area of responsibility, and by whether a respondent has been given AI responsibilities. Based on sample sizes and similarities in some responses, we grouped respondents along three dimensions (an illustrative sketch of the percent-agreement computation behind these comparisons follows the list):

  • Position: Faculty, professionals/staff, managers/directors, and executives
  • Primary Area of Responsibility: Technology, data and analytics, cybersecurity and privacy, teaching and learning, and business and operations (including those selecting "other" areas)
  • AI responsibilities: Whether or not an individual has been given AI responsibilities
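
For readers who want to reproduce this style of comparison from raw survey data, the sketch below shows one way to collapse Likert-style ratings into the percent-agreement figures reported throughout this section. It is a minimal illustration only: the DataFrame contents, column names, and response labels are hypothetical, not the study's actual instrument or analysis code.

    # Minimal sketch (hypothetical data and column names) of computing
    # percent agreement by respondent group from Likert-style ratings.
    import pandas as pd

    # Hypothetical long-format responses: one row per respondent per statement.
    responses = pd.DataFrame({
        "position": ["Faculty", "Faculty", "Executive", "Manager/director"],
        "statement": ["We view AI as a strategic priority"] * 4,
        "rating": ["Agree", "Disagree", "Strongly agree", "Agree"],
    })

    # Collapse the rating scale to a binary "agreed" indicator.
    responses["agreed"] = responses["rating"].isin(["Agree", "Strongly agree"])

    # Percent agreement with each statement within each position group.
    agreement = (
        responses.groupby(["statement", "position"])["agreed"]
        .mean()
        .mul(100)
        .round(1)
    )
    print(agreement)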

AI as an Investment and Strategic Priority

Respondents in executive-level positions were more likely to agree that their institution views AI as an investment rather than an added cost (55%) than were managers/directors (36%), professionals/staff (31%), and faculty (24%) (see figure 1). Executives were also more likely to agree that their institution views AI as a strategic priority (65%), followed by managers/directors (54%), professionals/staff (43%), and faculty (33%). Overall, the further a respondent's position was from executive leadership, the less likely they were to agree that their institution views AI as an investment and strategic priority.

Figure 1. Views of AI as an Investment and Strategic Priority, by Position
Bar chart showing percentages of position levels who agreed with two statements: 'We view AI as an investment rather than as an added cost': Faculty (24%), Professional/staff (31%), Managers/directors (36%), Executives (55%). 'We view AI as a strategic priority': Faculty (33%), Professional/staff (43%), Managers/directors (54%), Executives (65%).

Across the three areas of responsibility, sentiments toward AI as an investment and a strategic priority were more similar than they were across position levels (see figure 2). However, business and operations professionals (44%) and technology, data and analytics, and cybersecurity and privacy professionals (40%) were more likely than teaching and learning professionals (31%) to agree that their institutions view AI as an investment.

Figure 2. Views of AI as an Investment and Strategic Priority, by Primary Area of Responsibility
Bar chart showing percentages of role groups who agreed with two statements: 'We view AI as an investment rather than as an added cost': Teaching and learning (31%), Business and operations (44%), Technology, data and analytics, cybersecurity and privacy (40%). 'We view AI as a strategic priority': Teaching and learning (46%), Business and operations (49%), Technology, data and analytics, cybersecurity and privacy (52%).

Respondents who have personally been given AI responsibilities were much more likely to agree that their institution views AI as an investment (45%) than individuals without AI responsibilities (24%) (see figure 3). These individuals were also more likely to agree that their institution views AI as a strategic priority (59% versus 36%).

Figure 3. Views of AI as an Investment and Strategic Priority, by AI Responsibility
Bar chart showing percentages of agreement by whether the person has AI responsibilities: 'We view AI as an investment rather than as an added cost': Doesn’t have AI responsibilities (24%), Does have AI responsibilities (45%). 'We view AI as a strategic priority': Doesn’t have AI responsibilities (36%), Does have AI responsibilities (59%).

Strategic Planning and Implementation

Sentiments toward institutional strategic planning and implementation were largely similar across respondents in different positions (see figure 4). Overall, not many respondents agreed that their institutions have implemented appropriate/effective policies and mechanisms for AI or that their institution's AI services, programs, and technologies are adaptable and scalable (rates of agreement fell at or below 30%). The largest differences were in agreement about the implementation of collaborative working groups. Executives and managers/directors were more likely to agree that their institution has implemented collaborative groups spanning units to work on AI-related strategy (69% and 60%, respectively), compared to professionals/staff (47%) and faculty (44%).

Figure 4. Strategic Planning and Implementation, by Position
Bar chart showing percentages of position levels who agreed with six statements: 'We have appropriate policies and guidelines in place to enable ethical and effective decision-making about AI use': Faculty (20%), Professional/staff (19%), Managers/directors (24%), Executives (23%). 'We have an effective, established mechanism in place for AI governance (responsible for policy, quality, etc.)': (13%, 9%, 15%, 20%). 'We have instituted sufficient and effectual analytics to ensure that AI use is aligned with our strategic goals': (8%, 7%, 8%, 12%). 'Our AI services, programs, and technologies are scalable; we will be able to handle a growing number of AI applications in the coming years': (14%, 10%, 13%, 20%). 'Our AI services, programs, and technologies are adaptable; we will be able to accommodate new uses of AI applications in the coming years': (25%, 23%, 23%, 30%). 'There are collaborative groups spanning units to work on AI-related strategy': (44%, 47%, 60%, 69%).

Sentiments toward institutional strategic planning and implementation were also largely similar among individuals with different areas of responsibility (see figure 5). Overall, across responsibility areas, not many respondents agreed that their institutions have implemented appropriate/effective policies and mechanisms for AI or that their institution's AI services, programs, and technologies are adaptable and scalable (rates of agreement fell at or below 27%). The largest difference was in perceived scalability: business and operations professionals were more likely to agree that their institution's AI services, programs, and technologies are scalable (23%) than were technology, data and analytics, and cybersecurity and privacy professionals (14%) or teaching and learning professionals (11%).

Figure 5. Strategic Planning and Implementation, by Primary Area of Responsibility
Bar chart showing percentages of role groups who agreed with six statements: 'We have appropriate policies and guidelines in place to enable ethical and effective decision-making about AI use': Teaching and learning (17%), Business and operations (20%), Technology, data and analytics, cybersecurity and privacy (25%). 'We have an effective, established mechanism in place for AI governance (responsible for policy, quality, etc.)': (13%, 16%, 13%). 'We have instituted sufficient and effectual analytics to ensure that AI use is aligned with our strategic goals': (9%, 12%, 7%). 'Our AI services, programs, and technologies are scalable; we will be able to handle a growing number of AI applications in the coming years': (11%, 23%, 14%). 'Our AI services, programs, and technologies are adaptable; we will be able to accommodate new uses of AI applications in the coming years': (24%, 27%, 25%). 'There are collaborative groups spanning units to work on AI-related strategy': (56%, 53%, 54%).

Although not many respondents (with or without AI responsibilities) agreed that their institutions have implemented appropriate/effective policies and mechanisms for AI or that their institution's AI services, programs, and technologies are adaptable and scalable (rates of agreement fell at or below 30%), a trend emerged indicating that individuals who have been given AI responsibilities were consistently more likely to agree with statements in these areas (see figure 6). Most notably, 30% of respondents who have AI responsibilities agreed that their institution's AI services, programs, and technologies are adaptable, compared to just 18% of those without AI responsibilities. Far more of them also agreed that their institution has implemented collaborative groups spanning units to work on AI-related strategy (68% versus 38%).

Figure 6. Strategic Planning and Implementation, by AI Responsibility
Bar chart showing percentages of agreement by whether the person has AI responsibilities: 'We have appropriate policies and guidelines in place to enable ethical and effective decision-making about AI use': Doesn’t have AI responsibilities (15%), Does have AI responsibilities (26%). 'We have an effective, established mechanism in place for AI governance (responsible for policy, quality, etc.)': (7%, 19%). 'We have instituted sufficient and effectual analytics to ensure that AI use is aligned with our strategic goals': (6%, 11%). 'Our AI services, programs, and technologies are scalable; we will be able to handle a growing number of AI applications in the coming years': (10%, 16%). 'Our AI services, programs, and technologies are adaptable; we will be able to accommodate new uses of AI applications in the coming years': (18%, 30%). 'There are collaborative groups spanning units to work on AI-related strategy': (38%, 68%).

Technology and IT Support

Executives were somewhat more likely to agree that their institution has the appropriate technology in place to ensure the security and privacy of data used for AI (26%) than managers/directors (19%), professionals/staff (16%), or faculty (14%) (see figure 7). They were also more likely to agree that IT leaders at their institution consider AI technology as mission critical (61%) than managers (39%), professionals/staff (30%), or faculty (26%). Finally, executives and managers/directors were more likely to agree that providing AI support is straining IT resources and staff at their institution (30% and 29%, respectively) than professionals/staff (22%) or faculty (17%).

Figure 7. Technology and IT Support, by Position
Bar chart showing percentages of position levels who agreed with four statements: 'We have the appropriate technology in place to ensure the privacy and security of data used for AI': Faculty (14%), Professional/staff (16%), Managers/directors (19%), Executives (26%). 'Most of our AI technology is supported through a centralized system': (15%, 11%, 17%, 20%). 'IT leaders consider AI technology as mission critical in terms of the support provided': (26%, 30%, 39%, 61%). 'Providing AI support is straining our IT resources and staff': (17%, 22%, 29%, 30%).

Across responsibility areas, respondents felt similarly about whether their institution has the appropriate technology in place to ensure the privacy and security of data used for AI, whether most AI technology is supported through a centralized system, and whether providing AI support is straining IT resources and staff (see figure 8). However, technology, data and analytics, and cybersecurity and privacy professionals were more likely to agree that IT leaders at their institution consider AI technology as mission critical (47%) than business and operations professionals (37%) or teaching and learning professionals (29%).

Figure 8. Technology and IT Support, by Primary Area of Responsibility
Bar chart showing percentages of role groups who agreed with four statements: 'We have the appropriate technology in place to ensure the privacy and security of data used for AI': Teaching and learning (19%), Business and operations (24%), Technology, data and analytics, cybersecurity and privacy (16%). 'Most of our AI technology is supported through a centralized system': (11%, 19%, 19%). 'IT leaders consider AI technology as mission critical in terms of the support provided': (29%, 37%, 47%). 'Providing AI support is straining our IT resources and staff': (24%, 21%, 26%).

Respondents who have AI responsibilities were more likely to agree that IT leaders at their institution consider AI technology as mission critical (46%), compared to individuals without AI responsibilities (27%) (see figure 9). Although the gaps were smaller across the other three items, those with AI responsibilities consistently reported greater agreement with measures of AI progress/maturity in the area of technology and IT support.

Figure 9. Technology and IT Support, by AI Responsibility
Bar chart showing percentages of agreement by whether the person has AI responsibilities: 'We have the appropriate technology in place to ensure the privacy and security of data used for AI': Doesn’t have AI responsibilities (15%), Does have AI responsibilities (21%). 'Most of our AI technology is supported through a centralized system': (12%, 18%). 'IT leaders consider AI technology as mission critical in terms of the support provided': (27%, 46%). 'Providing AI support is straining our IT resources and staff': (20%, 28%).

Teaching and Learning

Sentiments about the sufficiency of resources and support for students, faculty, and staff with disabilities in their use of AI were fairly similar across positions (see figure 10). Overall, only a small percentage agreed that their institution has adequate resources and knowledge to effectively support users with disabilities in using AI (10–19% agreed). A majority of respondents did agree, however, that faculty interest in incorporating AI into teaching is on the rise at their institution and that faculty at their institution have autonomy in choosing which AI technologies to use. Notably, faculty themselves were the most likely to agree that they have autonomy in choosing AI technologies (72%), while just 55% of professionals/staff said that faculty have such autonomy.

Figure 10. Teaching and Learning, by Position
Bar chart showing percentages of position levels who agreed with four statements: 'We have adequate resources and knowledge to effectively provide support for students with disabilities to use AI tools': Faculty (19%), Professional/staff (10%), Managers/directors (13%), Executives (13%). 'We have adequate resources and knowledge to effectively provide support for faculty and staff with disabilities to use AI tools': (11%, 13%, 12%, 15%). 'Our faculty’s interest in incorporating AI into teaching is on the rise': (74%, 63%, 70%, 71%). 'Our faculty have autonomy to choose which AI technologies are used in their courses': (72%, 55%, 69%, 64%).

Teaching and learning professionals were more likely to agree that faculty interest in incorporating AI into teaching is on the rise at their institution (74%) than were technology, data and analytics, and cybersecurity and privacy professionals (67%) and business and operations professionals (49%) (see figure 11). They were also more likely to agree that faculty at their institution have autonomy in choosing AI technologies (74%), compared to technology, data and analytics, and cybersecurity and privacy professionals (58%) and business and operations professionals (47%).

Figure 11. Teaching and Learning, by Primary Area of Responsibility
Bar chart showing percentages of role groups who agreed with four statements: 'We have adequate resources and knowledge to effectively provide support for students with disabilities to use AI tools': Teaching and learning (11%), Business and operations (11%), Technology, data and analytics, cybersecurity and privacy (14%). 'We have adequate resources and knowledge to effectively provide support for faculty and staff with disabilities to use AI tools': (14%, 11%, 10%). 'Our faculty’s interest in incorporating AI into teaching is on the rise': (74%, 49%, 67%). 'Our faculty have autonomy to choose which AI technologies are used in their courses': (74%, 47%, 58%).

Respondents who have AI responsibilities were more likely to agree that faculty interest in incorporating AI into their teaching is on the rise at their institution (77%) than were those without AI responsibilities (58%) (see figure 12). Respondents with AI responsibilities were also more likely to agree that faculty at their institution have autonomy in choosing AI technologies (71% versus 56%).

Figure 12. Teaching and Learning, by AI Responsibility
Bar chart showing percentages of agreement by whether the person has AI responsibilities: 'We have adequate resources and knowledge to effectively provide support for students with disabilities to use AI tools': Doesn’t have AI responsibilities (9%), Does have AI responsibilities (15%). 'We have adequate resources and knowledge to effectively provide support for faculty and staff with disabilities to use AI tools': (8%, 15%). 'Our faculty’s interest in incorporating AI into teaching is on the rise': (58%, 77%). 'Our faculty have autonomy to choose which AI technologies are used in their courses': (56%, 71%).