2024 AI Breakout Report: Perceptions of Planning and Readiness

Who's Uncertain about AI Strategy and Planning?

Across the four sentiment areas, a notable proportion of respondents selected "don't know" as their answer. These respondents may lack opinions or attitudes about their institution's AI-related strategies and planning simply because they are not informed about them. Identifying where awareness gaps exist will be important for institutions that aim to build collective buy-in and spur effective AI adoption and implementation.

  • Faculty, professionals, and staff know less than executives about AI strategy and readiness. Across all areas and items, executives were the least likely to select "don't know," while faculty and professionals/staff were the most likely, except for some teaching and learning items focused specifically on faculty (see table 1).
  • Respondents know less about AI strategy in areas outside their responsibilities. Technology, data and analytics, and cybersecurity and privacy professionals, along with business and operations professionals, were less likely to select "don't know" for items related to technology and IT support (see table 2). Business and operations professionals and teaching and learning professionals were somewhat more likely to select "don't know" for a number of the items related to strategy (AI as an investment, analytics, scalability, and adaptability), and business and operations professionals were more likely to select "don't know" for items related to teaching and learning. Teaching and learning professionals were the least likely to select "don't know" for items related to faculty.
  • Respondents without AI responsibilities know less about AI strategy and readiness. Across all sixteen items, respondents with AI responsibilities were less likely to select "don't know" than those without AI responsibilities (see table 3).
Table 1. Percentage of Respondents Who Answered "Don't Know," by Position

Item | Faculty (N = 113–116) | Professionals, staff (N = 263–266) | Managers, directors (N = 257–260) | Executives (N = 130–132)
We view AI as an investment rather than as an added cost. | 32% | 33% | 19% | 7%
We view AI as a strategic priority. | 16% | 20% | 6% | 3%
We have appropriate policies and guidelines in place to enable ethical and effective decision-making about AI use. | 13% | 16% | 6% | 5%
We have an effective, established mechanism in place for AI governance (responsible for policy, quality, etc.). | 20% | 20% | 6% | 3%
We have instituted sufficient and effectual analytics to ensure that AI use is aligned with our strategic goals. | 31% | 33% | 15% | 5%
Our AI services, programs, and technologies are scalable; we will be able to handle a growing number of AI applications in the coming years. | 37% | 40% | 26% | 12%
Our AI services, programs, and technologies are adaptable; we will be able to accommodate new uses of AI applications in the coming years. | 31% | 35% | 20% | 11%
There are collaborative groups spanning units to work on AI-related strategy. | 22% | 20% | 8% | 5%
We have the appropriate technology in place to ensure the privacy and security of data used for AI. | 37% | 30% | 19% | 11%
Most of our AI technology is supported through a centralized system. | 28% | 31% | 13% | 6%
IT leaders consider AI technology as mission critical in terms of the support provided. | 32% | 34% | 17% | 4%
Providing AI support is straining our IT resources and staff. | 45% | 35% | 20% | 8%
We have adequate resources and knowledge to effectively provide support for students with disabilities to use AI tools. | 22% | 27% | 20% | 12%
We have adequate resources and knowledge to effectively provide support for faculty and staff with disabilities to use AI tools. | 24% | 26% | 16% | 10%
Our faculty's interest in incorporating AI into teaching is on the rise. | 10% | 15% | 8% | 7%
Our faculty have autonomy to choose which AI technologies are used in their courses. | 9% | 22% | 11% | 5%
Table 2. Percentage of Respondents Who Answered "Don't Know," by Responsibility Area

Item | Technology, data and analytics, cybersecurity and privacy (N = 339–343) | Business and operations, other (N = 73–76) | Teaching and learning (N = 378–384)
We view AI as an investment rather than as an added cost. | 19% | 27% | 28%
We view AI as a strategic priority. | 10% | 22% | 12%
We have appropriate policies and guidelines in place to enable ethical and effective decision-making about AI use. | 11% | 18% | 8%
We have an effective, established mechanism in place for AI governance (responsible for policy, quality, etc.). | 9% | 20% | 14%
We have instituted sufficient and effectual analytics to ensure that AI use is aligned with our strategic goals. | 17% | 23% | 25%
Our AI services, programs, and technologies are scalable; we will be able to handle a growing number of AI applications in the coming years. | 24% | 33% | 35%
Our AI services, programs, and technologies are adaptable; we will be able to accommodate new uses of AI applications in the coming years. | 22% | 27% | 28%
There are collaborative groups spanning units to work on AI-related strategy. | 12% | 24% | 13%
We have the appropriate technology in place to ensure the privacy and security of data used for AI. | 16% | 25% | 31%
Most of our AI technology is supported through a centralized system. | 14% | 27% | 24%
IT leaders consider AI technology as mission critical in terms of the support provided. | 15% | 26% | 30%
Providing AI support is straining our IT resources and staff. | 16% | 32% | 36%
We have adequate resources and knowledge to effectively provide support for students with disabilities to use AI tools. | 21% | 32% | 19%
We have adequate resources and knowledge to effectively provide support for faculty and staff with disabilities to use AI tools. | 18% | 31% | 19%
Our faculty's interest in incorporating AI into teaching is on the rise. | 13% | 24% | 6%
Our faculty have autonomy to choose which AI technologies are used in their courses. | 15% | 31% | 9%
Table 3. Percentage of Respondents Who Answered "Don't Know," by AI Responsibility

Item | Doesn't Have AI Responsibilities (N = 351–355) | Has AI Responsibilities (N = 440–446)
We view AI as an investment rather than as an added cost. | 36% | 14%
We view AI as a strategic priority. | 21% | 5%
We have appropriate policies and guidelines in place to enable ethical and effective decision-making about AI use. | 19% | 3%
We have an effective, established mechanism in place for AI governance (responsible for policy, quality, etc.). | 21% | 6%
We have instituted sufficient and effectual analytics to ensure that AI use is aligned with our strategic goals. | 32% | 13%
Our AI services, programs, and technologies are scalable; we will be able to handle a growing number of AI applications in the coming years. | 39% | 23%
Our AI services, programs, and technologies are adaptable; we will be able to accommodate new uses of AI applications in the coming years. | 32% | 19%
There are collaborative groups spanning units to work on AI-related strategy. | 25% | 4%
We have the appropriate technology in place to ensure the privacy and security of data used for AI. | 33% | 17%
Most of our AI technology is supported through a centralized system. | 30% | 12%
IT leaders consider AI technology as mission critical in terms of the support provided. | 32% | 16%
Providing AI support is straining our IT resources and staff. | 36% | 20%
We have adequate resources and knowledge to effectively provide support for students with disabilities to use AI tools. | 28% | 16%
We have adequate resources and knowledge to effectively provide support for faculty and staff with disabilities to use AI tools. | 26% | 15%
Our faculty's interest in incorporating AI into teaching is on the rise. | 18% | 5%
Our faculty have autonomy to choose which AI technologies are used in their courses. | 22% | 7%